
A New Perspective on Economic Growth: What is Information and Why Does it Grow?

The evolution of prosperity and accumulated knowledge

By Cesar A. Hidalgo

Ludwig was an unhappy man. Did the death of his son push him over the edge? Or was he broken down by his colleagues’ criticisms? Or maybe he loved atoms too much?

While on summer vacation, Ludwig killed himself. Elsa, his youngest daughter, found him dangling from a rope. She refused to talk about this episode throughout her life.

Of course, the Ludwig that I am talking about is Ludwig Boltzmann. Ludwig was a successful scientist, but also an insecure man. Ludwig made important contributions to our understanding of nature. His scientific contributions, however, did not go unchallenged.

Ludwig believed in atoms at a time when many of his colleagues considered atoms to be nothing more than a convenient analogy. Their skepticism troubled him. On the one hand, he knew he was on the right track. He had shown that the empirical behavior of gases could be attributed to the collective motion of molecules, or atoms. This finding gave him indirect evidence of the existence of atoms, but no way to observe these directly.
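The cross-scale argument Boltzmann relied on can be sketched in a line (a standard kinetic-theory result, not a passage from this excerpt). Treating a gas as N molecules of mass m bouncing inside a container, the pressure they exert satisfies

\[ PV = \tfrac{1}{3} N m \langle v^2 \rangle \]

Comparing this with the empirically measured ideal gas law, \( PV = N k_B T \), identifies temperature with the average kinetic energy of the molecules. The observable behavior of the gas thus follows from the motion of particles that, in Ludwig’s day, no one could see.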

The lack of direct evidence left Ludwig vulnerable to the critiques of his colleagues. His nemesis, the physicist turned philosopher Ernst Mach, maintained that science should focus only on relationships among directly observable quantities. Additional theoretical constructs, like Boltzmann’s atoms, were not allowed.

But Ludwig’s troubles were not just social. For decades he had been trying to explain the origins of physical order. His attempts, while scientifically fruitful, were ultimately unsuccessful. Ludwig’s theory predicted the opposite of what he wanted to show. His everyday experience indicated that order was increasing all around him: flowers bloomed, trees sprouted, and the rapidly industrializing society mass-produced new gadgets every day. Ludwig’s theory, however, predicted that order should not grow but disappear. It explained why heat flows from hot to cold, why swirls of milk disappear in coffee, and why whispers vanish in the wind. Ludwig showed that the microstructures of the universe gnaw away at order, making it ephemeral. But he understood that this was not the full story.
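That prediction came from what we now write, in modern notation, as Boltzmann’s entropy (a standard textbook statement, not a line from this excerpt), the formula later carved on his tombstone:

\[ S = k_B \ln W \]

Here W counts the microscopic arrangements of atoms consistent with a given macroscopic state, and k_B is Boltzmann’s constant. Disordered macrostates correspond to vastly more arrangements than ordered ones, so systems overwhelmingly drift toward them; that is why the swirl of milk disperses and never spontaneously reassembles.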

The growth of order troubled Ludwig. It disturbed him in a way that only a scientist can understand. He intuited that something was missing from his theory, but he was unable to identify what that was. At the dusk of his life, Ludwig became tired of battling both people and nature. Using a rope, he decided to take matter into his own hands. What was left was a shell of atoms that began a steady but certain decay, just as his theory predicted.

In 1906 Ludwig ended his life, but not the philosophical problems that troubled him. To explain the origins of physical order, Ludwig connected phenomena occurring at different spatial scales, namely atoms and gases. Although it makes sense today, in Ludwig’s time working across spatial scales was a practice that violated an implicit contract among scientists. Many of Ludwig’s colleagues saw science as a hierarchy of Russian nesting dolls, with new structures emerging at each level. In this hierarchy, transgressing boundaries was thought unnecessary. Economics did not need psychology, just as psychology did not need biology. Biology did not need chemistry, and chemistry did not need physics. Explaining gases in terms of atoms, although not as preposterous as explaining human behavior in terms of biology, was seen as a betrayal of this implicit deal. Boltzmann had “sinned” by trying to explain the macroscopic properties of gases in terms of the motion of atoms.

The twentieth century vindicated Ludwig’s view of atoms, and to a lesser extent his passion for crossing academic boundaries. Quantum mechanics helped connect Ludwig’s atoms with chemistry and materials science. Molecular biology and biochemistry helped connect the biology of the cell with the chemical properties of the proteins that populate it. On a parallel front, biology romanced psychology, as Darwin’s theory became a staple explanation of human behavior. Yet not all of the cross-fertilization took place near known scientific boundaries. Amid these multidisciplinary tangos, there was one concept that was promiscuous enough to play the field. This was the idea of information.

Information was the object of Ludwig’s fascination. It was the thing that eluded him and the thing he sought tirelessly to explain: why order in the universe could deteriorate even as it grew on earth.

In the twentieth century the study of information continued to grow. This time, however, the study of information was inspired not by the beauty of nature but by the horrors of war. During the Second World War competing armies developed a need to communicate using secret codes. These codes motivated efforts to decode intercepted messages, jump-starting the mathematical study of information.

Encoding and decoding messages was a mathematical problem that was too interesting to be abandoned as the war wound down. Mathematicians continued to formalize the idea of information, but they framed their efforts in the context of communication technologies, rather than in terms of deciphering intercepted messages. The mathematicians who triumphed became known as the world’s first information theorists or cyberneticists. These pioneers included Claude Shannon, Warren Weaver, Alan Turing, and Norbert Wiener.

In the 1950s and 1960s the idea of information took science by storm. Information was welcomed in all academic fields as a powerful concept that cut across scientific boundaries. Information was neither microscopic nor macroscopic. It could be inscribed sparsely on clay tablets or packed densely in a strand of DNA. For many practical purposes, the scale at which information was embodied was not crucial. This scale independence made the idea of information attractive to academics from all fields, who adopted the concept and endowed it with their own disciplinary flavor.

Biologists embraced the idea of information as they explored how genes encoded inheritance. Engineers, inspired by the work of Shannon, designed transmitters and receivers as they wired the world with analog and digital networks. Computer scientists, psychologists, and linguists attempted to model the mind by building electronic thinking machines. As the twentieth century outgrew its atomic zeitgeist, information became the new ace in everyone’s hand.

The idea of information also found its way into the social sciences, and in particular into economics. Friedrich Hayek, an Austrian economist and a contemporary of Shannon, famously argued that prices transmitted information about the supply of and demand for goods. This helped reveal the information needed for Adam Smith’s “invisible hand” to work. As Hayek wrote, “In a system in which the knowledge of the relevant facts is dispersed among many people, prices can act to coordinate the separate actions of different people.”

The idea of information also helped economists understand some important market failures. George Akerlof became famous by showing that markets could fail to operate when people had asymmetric information about the quality of the goods they wanted to exchange. On a parallel front, Herbert Simon, a polymath who contributed to economics, organizational theory, and artificial intelligence, introduced the idea of bounded rationality, which focused on the behavior of economic actors who had limited information about the world.

As the twentieth century roared along, the idea of information grew in status to an idea of global importance. Yet as the idea of information became more popular, we slowly began to forget about the physicality of information that had troubled Boltzmann. The word information became a synonym for the ethereal, the unphysical, the digital, the weightless, the immaterial. But information is physical. It is as physical as Boltzmann’s atoms or the energy they carry in their motion. Information is not tangible; it is not a solid or a fluid. It does not have its own particle, but it is as physical as movement and temperature, which do not have particles of their own either. Information is incorporeal, but it is always physically embodied. Information is not a thing; rather, it is the arrangement of physical things. It is physical order, like what distinguishes different shuffles of a deck of cards. What surprises most people, however, is that information is meaningless. Yet the meaningless nature of information, much like its physicality, is often misunderstood.
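To make the deck-of-cards image concrete, here is a back-of-the-envelope calculation (my illustration, not the book’s). A 52-card deck admits 52! possible orderings, so singling out one particular shuffle requires

\[ \log_2(52!) \approx 226 \text{ bits} \]

of information. None of those bits reside in any single card; they live entirely in the arrangement, which is precisely the sense in which information is physical order.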

In 1949 Claude Shannon and Warren Weaver published a short book entitled The Mathematical Theory of Communication. In its first section, Weaver described the conceptual aspects of information. In the second section, Shannon described the mathematics of what we now know as information theory.

For information theory to be properly understood, Shannon and Weaver needed to detach the word information from its colloquial meaning. Weaver made this distinction early in his essay: “The word information, in this theory, is used in a special sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning.”

Shannon also made this point early in his section, albeit invoking engineering arguments instead of semantic distinctions: “The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently, the messages have meaning. These semantic aspects of communication [referring to the meaning of a message] are irrelevant to the engineering problem.”

But why were Shannon and Weaver so eager to divorce information from meaning? Their reasons were both technical and philosophical. On the technical side, Shannon was interested in building machines that could communicate information regardless of the meaning of the message; mixing information and meaning obfuscated the engineering problem. On the philosophical side, Shannon and Weaver understood that the words information and meaning named fundamentally different concepts. Humans, and some machines, have the ability to interpret messages and infuse them with meaning. But what travels through the wires or electromagnetic waves is not that meaning. It is simpler. It is just information.

It is hard for us humans to separate information from meaning because we cannot help interpreting messages. We infuse messages with meaning automatically, fooling ourselves into believing that the meaning of a message is carried in the message. But it is not. This is only an illusion. Meaning is derived from context and prior knowledge. Meaning is the interpretation that a knowledge agent, such as a human, gives to a message, but it is different from the physical order that carries the message, and different from the message itself. Meaning emerges when a message reaches a life-form or a machine with the ability to process information; it is not carried in the blots of ink, sound waves, beams of light, or electric pulses that transmit information.

Think of the phrase “September 11.” When I say that phrase, most Americans automatically think of the 2001 attacks on the Twin Towers. Chileans usually think about the 1973 coup d’état. But maybe when I am saying “September 11” I am just telling my students that I will be back at MIT on that date. As you can see, the meaning of the message is something that you construct. It is not part of the message, even if it seems to be. Meaning is something that we attach seamlessly as we interpret messages, because humans cannot help interpreting incoming bursts of physical order. This seamlessness does not mean that meaning and information are the same.

To create machines that could transmit information regardless of the meaning of the message, Shannon needed a formula to estimate the minimum number of characters required to encode a message. Building on the work of Harry Nyquist and Ralph Hartley, Shannon estimated how much information was needed to transmit a message through a clean or noisy channel. He also estimated the economies of communication brought by correlations in the structure of messages—such as the fact that in English the letter t is more likely to precede h than q. Shannon’s philosophical excursions put him on a mathematical path similar to the one traversed by Boltzmann. At the end of the path, Shannon found a basic formula for encoding an arbitrary message with maximum efficiency. This formula allowed anyone to embody information in a magnetic disk, electromagnetic waves, or ink and paper. Shannon’s formula was identical to the one Boltzmann had put forth almost fifty years earlier. This coincidence was not an accident.
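In modern notation (a standard restatement; the equation itself is omitted from this excerpt), Shannon’s formula gives the minimum average number of bits per symbol needed to encode a message whose symbols occur with probabilities p_i:

\[ H = -\sum_i p_i \log_2 p_i \]

The entropy that Boltzmann, and later Gibbs, derived for a physical system whose microstates occur with probabilities p_i has the same form,

\[ S = -k_B \sum_i p_i \ln p_i \]

differing only in the base of the logarithm and the constant k_B. This shared functional form is the coincidence the text refers to.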

The convergence of Shannon’s formula with Boltzmann’s points to the physical nature of information. That physical reality is critical to seeing how a study of atoms can help us understand the economy. For the most part, the natural sciences have focused on describing our universe from atoms to people, connecting the simplicity of the atom with the complexity of life (geology and astronomy being the obvious exceptions). The social sciences have focused on the links among people, society, and economies, recasting humans as a fundamental unit: a social and economic atom, if I may. Yet this divorce is not lossless, as the mechanisms that allow information to grow transcend the barriers that separate the lifeless from the living, the living from the social, and the social from the economic.

So I will dedicate the following pages to an exploration of the mechanisms that contribute to the growth of information at all scales, from atoms to economies. Not from atoms to people, or from people to economies, as is usually done. This will help us create bridges between the physical, biological, social, and economic factors that contribute to the growth of information and also limit our capacity to process information. That information-processing capacity involves computation, and at the scale of humans it requires the “software” we know colloquially as knowledge and knowhow. The result will be a book about the history of our universe, centered not on the arrow of time but on the arrow of complexity.

And it is the arrow of complexity—the growth of information—that marks the history of our universe and species. Billions of years ago, soon after the Big Bang, our universe did not have the capacity to generate the order that made Boltzmann marvel and which we all take for granted. Since then, our universe has been marching toward disorder, as Boltzmann predicted, but it has also been busy producing pockets that concentrate enormous quantities of physical order, or information. Our planet is a chief example of such a pocket.

The wave of stars that preceded the formation of our solar system synthesized the atomic elements needed for life to form. These elements included carbon, oxygen, calcium, nitrogen, and iron. From the corpses of these stellar ancestors a new generation of stars was formed. This time around, the planets that orbited them had the chemical richness required for life to evolve. Our planet, which is roughly 4.5 billion years old, has since exploited this chemical richness to become a singularity of complexity. For billions of years, information has continued to grow on our planet: first in its chemistry, then in simple life-forms, more recently in us. In a universe characterized mostly by empty space, our planet is an oasis where information, knowledge, and knowhow continue to increase, powered by the sun but also by the self-reinforcing mechanisms that we know as life.

Yet the physics of stars and the life-forms that populate our planet are just two stops along the timeline of complexity and information. The evolution of information cuts across all boundaries, extending even to the information begotten by our economy and society. Information, when understood in its broad meaning as physical order, is what our economy produces. It is the only thing we produce, whether we are biological cells or manufacturing plants. This is because information is not restricted to messages. It is inherent in all the physical objects we produce: bicycles, buildings, streetlamps, blenders, hair dryers, shoes, chandeliers, harvesting machines, and underwear are all made of information. This is not because they are made of ideas but because they embody physical order. Our world is pregnant with information. It is not an amorphous soup of atoms, but a neatly organized collection of structures, shapes, colors, and correlations. Such ordered structures are the manifestations of information, even when these chunks of physical order lack any meaning.

But begetting information is not easy. Our universe struggles to do so. Our ability to beget information, and to produce the items, infrastructures, and institutions we associate with prosperity, requires us to battle the steady march toward disorder that characterizes our universe and which troubled Boltzmann. To battle disorder and allow information to grow, our universe has a few tricks up its sleeve. These tricks involve out-of-equilibrium systems, the accumulation of information in solids, and the ability of matter to compute. Together these three mechanisms contribute to the growth of information in small islands or pockets where information can grow and hide, like the pocket we call our planet.

So it is the accumulation of information, and of our ability to process information, that defines an arrow of growth encompassing the physical, the biological, the social, and the economic, and which extends from the origin of the universe to our modern economy. It is the growth of information that unifies the emergence of life with the growth of economies, and the emergence of complexity with the origins of wealth.

Yet the growth of information is uneven, not just in the universe but on our planet. It takes place in pockets with the capacity to beget and store information. Cities, firms, and teams are the embodiment of the pockets where our species accumulates the capacity to produce information. Of course, the capacity of these cities, firms, and teams to beget information is highly uneven. Some are able to produce packets of information that embody concepts begotten by science fiction. Others are not quite there.

So by asking what information is and why it grows, we will be exploring not only the evolution of physical order but that of economic order as well. We will be connecting basic physical principles with information theory, and also with theories of social capital, economic sociology, theories of knowledge, and the empirics of industrial diversification and economic development. By asking why information grows, we will be asking about the evolution of prosperity, about rich and poor nations, about productive and unproductive teams, about the role of institutions in our capacity to accumulate knowledge, and about the mechanisms that limit people’s capacity to produce packets of physically embodied information. We will be taking a step back from traditional approaches to understanding social and economic phenomena. Instead, we will be generating a description that seeks to integrate physical, biological, social, and economic mechanisms to help explain the continuous growth of something that is not a thing. That something, which fascinates you and me as much as it did Boltzmann, is physical order, or information. It is the high concentration of complexity that we see every time we open our eyes, not because information is everywhere in the universe but because we are born from it, and it is born from us.


Excerpted from Why Information Grows: The Evolution of Order, from Atoms to Economies by Cesar Hidalgo. With permission of the publisher, Basic Books.

References

  1. In this context, the word atom is used to refer mainly to discrete particles, which could be either atoms or molecules.
  2. Two great books describing the interaction between evolution and behavior are Richard Dawkins, The Selfish Gene (Oxford: Oxford University Press, 2006), and Steven Pinker, The Blank Slate: The Modern Denial of Human Nature (New York: Penguin, 2003).
  3. Information theory also has a quantum version, known as quantum information theory. The existence of quantum information theory, however, does not invalidate the claim that classical information is a concept that works at a range of scales that is unusual for other theories.
  4. Friedrich Hayek, “The Use of Knowledge in Society,” American Economic Review 35, no. 4 (1945): 519–530.
  5. George A. Akerlof, “The Market for ‘Lemons’: Quality Uncertainty and the Market Mechanism,” Quarterly Journal of Economics 84, no. 3 (1970): 488–500.
  6. Claude E. Shannon and Warren Weaver, The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1963), 8.
  7. Ibid., 31.
  8. The equation is omitted in this excerpt. Shannon’s formula, H = −∑ᵢ pᵢ log₂ pᵢ, shares its form with the Boltzmann–Gibbs entropy, S = −k_B ∑ᵢ pᵢ ln pᵢ.

4 December 2015

