A fish on the Great Barrier Reef continually acquires new information from its environment—the location of food, the murkiness of the water, and the sounds of distant ships, to name a few examples. But only some of that information is meaningful, in that it actually helps the fish survive. In various disciplines, from biology to artificial intelligence, identifying such meaningful, or “semantic,” information is a key challenge. Yet a broadly applicable, fully formal definition of this kind of information has never been developed.
A new paper by the Santa Fe Institute’s Artemy Kolchinsky, a postdoctoral fellow specializing in information theory, and professor David Wolpert, a mathematician and physicist, proposes one. Taking cues from statistical physics and information theory, they’ve come up with a definition that emphasizes how a particular piece of information contributes to a physical system’s ability to perpetuate itself—which, for familiar biological organisms, means the ability to survive. Semantic information, they write, is “the information that a physical system has about its environment that is causally necessary for the system to maintain its own existence over time.”
For example, the location of food is semantic information for the Great Barrier Reef fish because it is essential to the fish’s survival. But the sound of a distant ship does not contribute to the fish’s viability, so it does not qualify as semantic information.
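The distinction can be made concrete with a toy simulation (this is an illustrative sketch inspired by the article’s fish example, not code from the paper; the four-site world and survival rule are invented for illustration). The paper’s definition turns on a counterfactual intervention: scramble the system’s information about some environmental variable and see whether its viability drops. Here, scrambling the fish’s knowledge of the food location hurts survival, while the ship sound never matters either way:

```python
import random

random.seed(0)

def survival_rate(knows_food: bool, trials: int = 10_000) -> float:
    """Fraction of trials in which a toy 'fish' survives.

    The fish survives iff it moves to the food's location. If it knows
    the location, it goes there; otherwise it guesses among 4 sites.
    A ship sound occurs in the environment but never affects survival,
    so information about it is non-semantic by construction.
    """
    survived = 0
    for _ in range(trials):
        food = random.randrange(4)           # true food location
        ship_sound = random.random() < 0.5   # irrelevant environmental signal
        move = food if knows_food else random.randrange(4)
        if move == food:
            survived += 1
    return survived / trials

# Crude stand-in for the paper's scrambling intervention: sever the
# correlation between the fish and the food location, then compare viability.
intact = survival_rate(knows_food=True)
scrambled = survival_rate(knows_food=False)
print(f"viability with food info intact:    {intact:.2f}")
print(f"viability with food info scrambled: {scrambled:.2f}")
```

With the food information intact the fish always survives; with it scrambled, survival falls to roughly one in four (a random guess among four sites). Scrambling the ship-sound signal would change nothing, which is exactly why it carries no semantic information in this toy world.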
Kolchinsky and Wolpert hope that this new, formal definition of semantic information can help researchers separate the wheat from the chaff when trying to make sense of the information a physical system has about its environment.
“Some information can be extraordinarily meaningful to an organism, while other information can have no meaning,” Wolpert says. “While it seems obvious that this distinction is crucial for analyzing biological organisms, it has never been formalized. Moreover, to avoid the fraught issue of defining a ‘biological organism,’ we wanted our definition of meaningful information to be applicable to both living and non-living physical systems, such as rocks and hurricanes.”
The researchers’ definition fills a hole in information theory left by Claude Shannon, who intentionally omitted the issue of the “meaning” of information in his iconic paper that created the field, “A Mathematical Theory of Communication,” in 1948.
In the realm of biology, understanding the role of semantic information could help answer some of the discipline’s most intriguing questions, such as how the earliest life forms evolved, or how existing ones adapt, says Kolchinsky. “When we talk about fitness and adaptation, does semantic information increase over evolutionary time? Do organisms get better at picking up information that’s meaningful to them?”
Artemy Kolchinsky et al., “Semantic information, autonomous agency and non-equilibrium statistical physics,” Interface Focus (2018). DOI: 10.1098/rsfs.2018.0041