Informational Landscapes in Art, Science, and Evolution
Irun R. Cohen
Bull Math Biol (2006). DOI: 10.1007/s11538-006-9118-4

An informational landscape refers to an array of information related to a particular theme or function. The Internet is an example of an informational landscape designed by humans for purposes of communication. Once it exists, however, any informational landscape may be exploited to serve a new purpose. Listening Post is the name of a dynamic multimedia work of art that exploits the informational landscape of the Internet to produce a visual and auditory environment. Here, I use Listening Post as a prototypic example for considering the creative role of informational landscapes in the processes that beget evolution and science.

A natural landscape invites us to explore it-physically or in reverie. A landscape of associations weaves a narrative. An informational landscape denotes an array of information that, like a natural landscape, invites exploration; the informational landscape too holds a narrative. Here, I shall use Listening Post as an allegory to explore two other systems that deal with informational landscapes: biologic evolution and human understanding. Waddington used the term epigenetic landscape as a metaphor to describe the interactions between genes and environment that take place during embryonic development (Waddington, 1940). An informational landscape is quite another matter; this landscape represents the maze of information available for potential exploitation by a suitable system. As we shall see below, the informational landscape is a substrate for system-making. Let's start by seeing how Listening Post exploits information to organize a work of art. Listening Post is formed by two components: an overt visual-auditory display designed by artist Rubin and a covert process designed by mathematician Hansen. The display is composed of a suspended rectangular grid of over 200 brick-sized electronic monitors and a set of audio installations (Fig. 1). The monitors pulsate with fragments of text and compositions of light, and the sound tracks pulsate with musical passages and artificial speech. The display triggers living associations: "a sense of cycles in life, day and night, the seasons...the information...lighting up as if the world is awakening from sleep and later changing to large sweeps" (Eleanor Rubin, personal communication; see http://www.ellyrubin.com). The covert process that produces Listening Post's art is an algorithm developed by Hansen, then a mathematician at Bell Laboratories. The algorithm randomly samples, in real time, the many thousands of chats, bulletin boards, and bits of messages that flow dynamically through the cyberspace of the Internet. This simultaneous mélange of signals, in the aggregate, is meaningless noise. The algorithm, by seeking key words and patterns of activity, artfully exploits this raw information to construct patterns of light, sound, and words that please human minds. The substrate of information flowing over the Internet is in constant flux, so the patterns presented by Listening Post are unpredictable at the fine, microscopic scale; but at the macroscopic scale of sensible experience, Listening Post is manifestly pleasing.
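Hansen's actual algorithm is not described in detail here, so the sketch below is only a rough illustration of the kind of process just named: randomly sampling a noisy text stream and keeping fragments that match simple patterns. It is written in Python, and the message source, sampling rate, and keyword patterns are all hypothetical.

```python
# Illustrative sketch only: not Hansen's algorithm.
# Assumes a hypothetical iterable `message_stream` of short text fragments
# (e.g., chat lines); samples it at random and keeps phrases matching
# simple keyword patterns, the way a display might cull showable text
# from the flux of the Internet.

import random
import re

# Hypothetical patterns of the kind a display might select for
PATTERNS = [
    re.compile(r"\bi am\b", re.IGNORECASE),          # first-person statements
    re.compile(r"\bi (love|hate|miss)\b", re.IGNORECASE),
]

def sample_fragments(message_stream, sample_rate=0.01, max_fragments=100):
    """Randomly sample a text stream and keep pattern-matching fragments."""
    kept = []
    for message in message_stream:
        if random.random() > sample_rate:            # ignore most of the flux
            continue
        if any(p.search(message) for p in PATTERNS):
            kept.append(message.strip())
        if len(kept) >= max_fragments:
            break
    return kept

if __name__ == "__main__":
    fake_stream = (line for line in [
        "i am waiting for the rain to stop",
        "buy cheap watches now!!!",
        "i miss the sea",
    ] * 1000)
    for fragment in sample_fragments(fake_stream, sample_rate=0.5):
        print(fragment)
```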
The patterned flow of sights, sounds, and words (seen and heard) arouses associations, memories, and feelings in the mind of the viewer-which is what we call art: art, whether visual, auditory, tactile, gustatory, or verbal, is an artifact made to arouse associations, memories, and feelings. Listening Post is an artful representation of the complexity of information flowing through the Internet; Listening Post transforms the Internet's massive informational landscape into a comprehensible miniature. Two attributes of Listening Post illustrate our theme: the work feeds on information designed for other purposes, and it survives by engaging our minds. Above, I have used four words-information, signal, noise, and meaning-that usually need no definition; we use them every day. But they are important to this discussion, so I will define them. I use the word information in the sense defined by Shannon (1948): information is a just-so arrangement-a defined structure-as opposed to randomness (Cohen, 2000). Information becomes a signal if you respond to the information. The meaning of the signal is how you respond to it (Atlan and Cohen, 1998). For example, a Chinese character bears information (it is arranged in a characteristic form), but the character is not a signal unless you know how to read Chinese. What the Chinese character means is the way the community of Chinese readers uses the character and responds to it. Information, then, is an intrinsic property of form. In biology, form is expressed by the characteristic structures of molecules, cells, organs, organisms, populations, and societies. Informational structures are not limited to material entities: organizations and processes, through their defining regularities, also fashion informational structures. An informational landscape encompasses a totality of information-material and organizational. Information expressed in biological structures is essential to life, but structure alone is not sufficient for living. Biological information is collective and reproducible-the structural information repeats itself at various scales, from molecules through societies (and beyond into human culture). The information encoded in biological structures persists and cycles; we call it development, physiology, and reproduction. But most importantly, biological structures perform functions-biological information bears meaning (Neuman, 2004). Meaning, in contrast to information, is not an intrinsic property of an entity (a word or a molecule, for example); the meaning of an entity emerges from the interactions of the test entity (the word or molecule) with other entities (for example, words move people, and molecules interact with their receptors, ligands, enzymes, etc.). Interactions mark all manifestations of biological structure-molecular through social. Meaning can thus be viewed as the impact of information-what a word means is how people use it; what a molecule means is how the organism uses it; what an organism means is what it does to others and how others respond to it; and so on over the scales of life-meaning is generated by interaction (Cohen, 2000). In summary, we can connect the three concepts-information, signal, and meaning-as follows: a signal is information with meaning. In other words, signalling is the ability of information to elicit a response, which is the meaning of the information.
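For readers who want the quantitative counterpart of "a defined structure as opposed to randomness", Shannon's (1948) measures can be stated in two lines. The notation below is standard information theory added here as a gloss; it does not appear in the original text.

```latex
% Standard Shannon (1948) quantities, added as a gloss.
% A source emits symbols x with probabilities p(x).
\[
  I(x) = -\log_2 p(x) \qquad \text{(information, or surprisal, of one symbol, in bits)}
\]
\[
  H = -\sum_{x} p(x)\,\log_2 p(x) \qquad \text{(average information of the source)}
\]
% H is greatest for a uniformly random source; a patterned, predictable source
% has lower entropy (more redundancy), which is the sense in which information
% here means structure rather than randomness.
```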
Note that a signal, like information in general, is free of intentions; astrophysicists, for example, receive signals from distant galaxies, which have no interest in communicating with an astrophysicist; it is the response of the astrophysicist to the galactic radiation (meaning-making) that transforms the structure of the radiation (its information) into a signal. Noise is more varied than signal. Noise can refer to a number of different situations: randomness as opposed to structured information (a scribble versus a Chinese character); information without meaning (a Greek conversation overheard by one who does not understand Greek); meaningful information in a meaningless context (across-the-room cocktail-party chatter). Noise in the generic sense proposed by Shannon is the unstructured; but noise can also refer to meaningless information that might be meaningful in other circumstances. For example, a DNA sequence composing an immunoglobulin V gene element has little meaning until the sequence is recombined with other DNA sequences to form a functioning antibody gene (Cohen, 2000)-that recombination charges the DNA sequence with meaning. Whether cyberspace transmits noise or signal, as we have defined the terms, depends on who you are and what you seek. If you are the recipient of a specific bit of information-a chat or an email directed to you, for example-then you perceive signal. But if you don't understand the language, or if you do understand the language but see intruding spam, or experience a profusion of many messages at once, then you hear only noise. (Noise too can have meaning if it makes you leave the room.) The difference then between signal and noise, one might claim, is having the right reception. Or, to put it another way, the same information can be either noise or signal, depending on how you perceive it-or transmit it. Combatants may attempt to evade detection by disguising signals as noise; for example, humans encrypt secret transmissions (Kahn, 1996) and infectious agents scramble antigens (Cohen, 2000; Cohen, 2003). The transformation of noise back into signal is part of the game; counter-intelligence can learn to break the enemy's linguistic codes; an immune system can learn to decipher a pathogen's molecular codes. In informational terms, Listening Post is a machine that transforms noisy cyberspace information into a new narrative by selecting and recombining fragments of the flux. Listening Post dynamically self-organizes, similar to a living organism (Atlan, 1987; Weisbuch and Solomon, 2002). The Internet created a new informational landscape, a new niche, that could be sampled and exploited by Hansen and Rubin to enhance their fitness as artists in the wilds of the Manhattan art world (Fig. 2). Biological evolution traditionally is described in terms of fitness: species evolve because the fittest survive to pass on their genes, while the unfit die with their genes (Darwin, 1859). Survival-of-the-fittest thus selects the "best" genetic variants from among the phenotypes in the breeding population (Plotkin, 1994). The process leads, with time, to creatures that are better adapted and improved. Improved fitness is the aim of evolution, so say some experts. But there are at least two problems with the concept of fitness: First, fitness is difficult to define; most definitions involve a circular argument-what survives is fit, by definition (Gould and Lewontin, 1979). This amounts to a tautology: fit is fit.
Attempts have been made to define fitness in terms of reproductive success (Hoffman et al., 2004). But different species survive well despite vastly different numbers of surviving offspring: compare the reproductive strategy of the elephant with that of the gnat that rides on its back; compare the whale with the sardine; each to its own reproductive profligacy or dearth. Second, evolution does not rest with fit creatures; evolution assembles increasingly complex creatures. Accumulating complexity is manifest in the evolutionary tree. Creatures higher in the evolutionary tree-more recent creatures-tend to be more complex than the creatures that preceded them: eukaryotic cells deploy more genes and house more organelles than do prokaryotic cells; mammals have more organs and express more behaviors than do the trees, round worms, or insects that preceded them. Quite simply, evolution generates complexity (Fig. 3). Now, one might argue that the more complex species is the more fit species; if that is true, then the quest for fitness alone should generate increasing complexity. But is that true? Does fitness itself drive complexity onward? What is complexity? Complexity is a relative term: more complex entities incorporate more component parts, and integrate more diverse interactions among those parts, than do simpler entities. Moreover, the single component parts of complex living systems usually participate in a variety of different functions (pleiotropism). Complexity thus can be defined in terms of information; complex systems sense, store, and deploy more information than do simple systems. Complexity presents two aspects: intrinsic and extrinsic. A complex system such as a cell, brain, organism, or society is complex intrinsically because of the way its parts interact and hold the system together. A complex system is also complex extrinsically because we who study it have difficulty understanding the properties that emerge from it (the biological perspective); we also have trouble fixing it (the medical perspective).

Fig. 3 Evolution. The Environment constitutes an Informational Landscape of considerable Complexity (designated by the bundles of arrows) that in the aggregate is not readily useful (note the puzzled face, ?). However, a creature with a suitable Physiology can extract useful information (a limited bundle of information), and the creature's new Phenotype is subject to Selection. Survival can lead to a New Species (New Meaning), and so the Complexity of the Informational Landscape is amplified and enriched through positive feedback. The new species too becomes an Informational Landscape for further exploitation by parasites and yet newer species.

Fitness, unlike complexity, does not relate to the way a system is put together or the way the system is understood by those who study it. Fitness relates to the success of a system in thriving in its own world. We can conclude, therefore, that fitness and complexity describe independent attributes. So there is no necessary correlation between complexity and fitness: no dinosaurs have survived, irrespective of how complex they were (and some were very complex indeed). In fact, it seems that the dinosaurs were unfit because they were too complex to survive the environmental disruptions brought about by the earth's collision with a comet (Morrison, 2003). Primitive bacteria have survived such calamities despite their relative simplicity (probably because of their relative simplicity).
Indeed, the lowest and simplest bacteria will probably prove to be more fit for survival than we more complex humans if we both have to face a really severe change in the world environment. Extremely complex systems, like us, are extremely fragile, and so they are less fit in certain situations. The bottom line is that the quest for fitness cannot explain the rise of complexity. But why then is complexity usually-but not always-more evident the higher one climbs up the evolutionary tree? It has been possible to demonstrate the evolution of complexity mathematically (Chaitin, 1998; Wolfram, 2002; Lenski et al., 2003). But evolution on a computer (in silico) is not evolution in the world of nature (in mundo). If complex systems tend to be fragile, why does evolution persist in devising them? The accumulation of complexity during evolution can be explained, despite fragility, by a principle of self-organization; the principle, formulated by Atlan, is that existing information tends automatically to breed additional new information (Atlan, 1987). Atlan's argument goes like this: Existing information first generates surplus copies of itself, which happens regularly in reproducing biological systems. The surplus copies can then safely undergo mutations, and so create modified (new), added information without destroying the untouched copies of the old information. The system thus becomes enriched; it now contains the new information along with the old information. Indeed, it appears that the complexity of vertebrate evolution was preceded and made possible by a seminal duplication of the ancestral genome (Dehal and Boore, 2005). Atlan's formulation implies that the more structures (the more information) a system encompasses, the greater the likelihood of variation in that system (discussed in Cohen, 2000). Hence, the amount of existing information composing a system (previously evolved structures) breeds a commensurate amount of new information (variant structures). Note that the constant variation of existing information is guaranteed by the Second Law of Thermodynamics-order tends automatically to dissipate into variant structures; hence, any information will mutate over time (Fig. 4). Information, in other words, feeds back on itself in a positive way; a great amount of information, through its variation, leads to even more information. And as information varies it increases, and so does complexity. Once a relatively stable, nonrandom structure (an informational entity) comes into being, it will be subject to variation. All genes, for example, mutate; proteins assume different conformations and functions; minds get new ideas. Repositories of information like genes, proteins, minds, cultures, and so forth, vary to generate new genes, proteins, minds, or cultures that then get exploited for additional uses. A human is manyfold more complex than is a round worm, yet the human bears less than twice the number of genes borne by the worm (about 35,000 human genes compared to about 20,000 analogous worm genes). The human species has accumulated its characteristic complexity by using its set of gene products in much more complicated ways than does the worm. Humans exploit a more complex informational landscape than do round worms. Humans are also more fragile: the Columbia space shuttle that disintegrated upon re-entry into the atmosphere in 2003 carried both humans and round worms; only the round worms survived.
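Atlan's duplicate-then-mutate argument lends itself to a toy simulation. The sketch below (Python; the sequences, mutation rule, and number of rounds are arbitrary choices for illustration only) keeps the original copies intact, mutates only the surplus copies, and counts the distinct variants that accumulate: with duplication the count grows roughly as 2^k after k rounds, whereas without duplication old information is merely replaced.

```python
# Toy illustration of Atlan's principle (duplication + mutation breeds
# new information); parameters and representation are arbitrary, chosen
# only to make the point, not to model any real genome.

import random

ALPHABET = "ACGT"

def mutate(seq: str) -> str:
    """Return a copy of seq with one random position changed."""
    i = random.randrange(len(seq))
    return seq[:i] + random.choice(ALPHABET.replace(seq[i], "")) + seq[i + 1:]

def evolve(pool, rounds, duplicate=True):
    """Each round: optionally duplicate every unit, then mutate the surplus copies."""
    pool = list(pool)
    for _ in range(rounds):
        if duplicate:
            surplus = [mutate(seq) for seq in pool]   # mutate only the copies
            pool = pool + surplus                     # old information is kept
        else:
            pool = [mutate(seq) for seq in pool]      # old information is replaced
    return pool

if __name__ == "__main__":
    start = ["ACGTACGT"]
    with_dup = evolve(start, rounds=4, duplicate=True)
    without_dup = evolve(start, rounds=4, duplicate=False)
    print("with duplication:   ", len(set(with_dup)), "distinct sequences")
    print("without duplication:", len(set(without_dup)), "distinct sequences")
```

The point is not the particular numbers but the feedback: the more information the pool already holds, the more new variants each round produces.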
Fig. 4 According to Atlan (1987), the net amount of information in a system will increase only if two conditions are satisfied: the Original Information Duplicates to produce surplus copies and the surplus copies Mutate to produce variant new Information. System X, for example, has not duplicated its information (triangle); even if the triangle mutates, there will be no net increase in Information, since the Original Information is merely replaced (n = n). System Y, in contrast, has duplicated its original Information (both triangle and box), and so mutation of each surplus copy generates a net increase in Information. Also note that Information compounds by powers of 2 (log2): in System Y, two mutations can give rise to n × 4 Complexity. A system that starts out with more information (System Y) will generate new information faster than does a system that starts with less information (System X).

The difference between classical Darwinian evolution and the idea of evolving informational landscapes is highlighted by the difference between selection and exploitation. Darwin, familiar with the art of animal husbandry, applied to Nature the wisdom of artificial selection; the wise breeder selects for propagation from among the variant household creatures those individuals most fitting market or household needs. Likewise, Nature has selected for propagation those creatures most fit for survival-natural selection recapitulates artificial selection (Darwin, 1859). My point here is that the natural informational landscape, in contrast to the 19th century English manor, does not merely provide grounds for selecting what fits the market or the whims of the landlord; the natural informational landscape provides grounds for extravagant exploitation. Any organism, simple or complex, that manages to mine the landscape for enough energy and information to create meaning (through productive interactions) might manage to survive there. Exploitation, as I use the term here, refers to the modification of information for new uses. Listening Post arose by exploiting the new informational landscape provided by the Internet, which itself arose by exploiting other informational landscapes-language, computers, society, culture. Let us extend the Listening Post metaphor and say that biologic evolution proceeds through the inevitable exploitation of new informational landscapes by new or variant creatures. Evolving and evolved creatures themselves become part of an enriched informational landscape, continuously available for further exploitation by other creatures. The dynamics of evolution are the dynamics of information. Like the algorithm of Listening Post, an evolving species creates new meaning by exploiting information flowing through its environment-its cyberspace. That in a nutshell is the informational view of evolution. Information exploits information and compounds information to generate new meanings; life is constant ferment. The escalation of complexity is evident medically. Consider parasitism: there exists no creature that is not exploited by some other creature. Hosts are informational landscapes for raising parasites. Indeed, the biosphere is one great food chain; each species makes a living by exploiting the information (structures and processes) of other creatures. The generative informational landscape includes even the artifacts of cultural evolution, and not only the natural products of chemical and cellular evolution.
The human brain generated language; language generated culture; and culture is changing the world. The evolution of human air travel-a complex of machines, logistics, economics, and habits of mind designed for anything but parasites-contributed to the informational landscapes that generated the spread of HIV and West Nile Virus; poultry farming gave rise to SARS and avian influenza; air conditioning systems provided a landscape for Legionnaires' Disease. Information, because of positive feedback, breeds more information, and complexity results. Note that positive feedback will accelerate any process; witness the accelerating complexity that marks our world: 100,000 years of hunting and gathering was replaced by 10,000 years of agriculture and animal husbandry that generated a thousand years of accelerated urbanization, 200 years of industrialization, a few decades of informational civilization, and an emerging global economic culture. Unfortunately, positive feedback, unless regulated by negative feedback or other kinds of control, accelerates into instability, chaos, and disaster (Robertson, 1991; Segel and Bar-Or, 1999). Information that breeds ever more complex systems is dangerous-like the dinosaurs, an overly complex system collapses of its own inherent fragility, and complexity has to again re-start its evolution (Fig. 5). Consider an example closer to home: your personal computer crashes when you try to run too many programs simultaneously; the more complex your program, the more often you have to push restart. (Being smarter than dinosaurs, we might use our foresight to prevent global warming, over-population, collective terror, and who knows what else; let us avoid having to restart a world.) In summary, we can say that complexity is managed throughout evolution by a balance between two opposing forces: autocatalysis and fragility. On the one hand, complexity inexorably increases in time through the autocatalytic force of increasing information. On the other hand, catastrophic extinctions mark the fragility of large complex ecosystems; on a lesser scale, complexity may also be held in check or reduced by the day-to-day survival of the fittest creature that may be the less complex creature. Darwin's concept of natural selection, including survival of the fittest, does play a critical role in the process of evolution, but mostly after a new or variant species has begun to exploit an informational landscape. Quite simply, the species has to survive long enough to maintain itself. Indeed, the informational landscape might include other species competing for the same sources of information and energy; in that case, survival of the fittest is an apt description of the conflict between the competitors.

Fig. 5 The accumulation of Complexity (Fig. 4) over evolutionary time (t) leads to intrinsically fragile complex systems (c) that are susceptible to crash when challenged by severe environmental perturbations. Complexity then has to Restart its accumulation from a lower level. The scales and the form of the curve shown here are hypothetical. A species' fitness may be quantified as the measure of time occupied by that species from its origin to its extinction. This formulation avoids the difficulties of identifying factors common to all the various species with their vastly different strategies for survival (reproduction rate, lifespan, efficiency of energy use, etc.); to quantify the fitness of a species, we merely measure its survival time.
In the hypothetical Figure, we compare the fitness of a hypothetical bacterium that continues to survive for some 5 × 10^8 years with that of a hypothetical dinosaur species that survived for some 10^7 years until its extinction about 65 × 10^6 years ago. The human species arose some 10^5 years ago, and who knows how long it will last. The figure suggests that there is no positive correlation between complexity (c) and fitness; the opposite might be the case.

But much of evolution occurs without competition. In fact, it is clear that many products of evolution are neutral; they manifest no selective advantage over other phenotypes (Kimura, 1983). Neutral evolution simply expresses the exploitation of an informational landscape: survival of the fittest is not an explanation for neutral evolution. Fitness, then, is only another word for survival (so, as I said, survival of the fittest is a tautology). The measure of a species' fitness can be assessed most clearly by hindsight, by observing the species' past history of success. A species that manages to last only a short time in its informational landscape is manifestly less fit than is a species that lasts a longer time in its informational landscape. Thus the fitness of a species is commensurate with the amount of time the species has persisted from its inception to its extinction, its crash. Fitness can be measured quantitatively by the amount of time needed by the environment to eradicate the species-by the elapsed time to unfitness, the time to extinction (Fig. 5). This notion of fitness will be elaborated elsewhere; the point I want to make here is that fitness is merely the temporary absence of unfitness. Fitness, in other words, can be reduced to mere survival. Classically, the offspring of fitness is taught to be improvement. Listening Post shows us that the evolution of new arrangements of information (new complexity) may not necessarily lead to improvement. Hansen and Rubin use new information to create a new art; is Listening Post an improvement over Rembrandt's old art? Homo sapiens is certainly more artistically pleasing than is E. coli or C. elegans, but hardly better adapted (Cohen, 2000). Indeed, in the world of art, fitness has always been measured by survival time; it's only natural. Rembrandt's paintings have thrived for hundreds of years and will certainly continue to live with us. Influenced by the beauty of Newtonian mechanics, biologists have long embraced the hope that if we could inspect the ultimate parts of an organism, we would be able to reduce the complexity of the organism to simple principles-the aim of ultimate understanding. The organism just had to be some sort of machine, however complicated. Now we have almost achieved our wish and are close to seeing all the parts of the organism (the genome, the proteome) out on the table. But to our dismay, the organism is not just a complicated clock. Even solving the genome project has not graced us with ultimate understanding (Cohen and Atlan, 2003). The organism clearly is not a collection of wheels and cogs; the organism is more akin to cyberspace. In place of electromagnetic codes generated by computer networks, the information flowing within and through the cell-life's subunit-is encoded in molecules.
But the informational structure of both networks, cell and Internet, is similar: each molecule in a cell, like a chat box signal, emerges from a specific origin, bears an address, and carries a message. Our problem is that the cell's molecules are not addressed to our minds, so we don't understand them. The mind exists on a different scale than does the cell; the mind and the cell live in different informational landscapes. We are unable to directly see molecular information; we have to translate the cell's molecules and processes into abstract representations: words, numbers, and pictures. The cell looks to us like a seething swarm of molecules, large and small, that appear redundant, pleiotropic, and degenerate (Cohen, 2000). Every ligand signals more than one receptor, every receptor binds more than one ligand; every species of molecule has more than one function; and every function seems to be carried out redundantly by different agents. Causes cannot be reduced to simple one-to-one relationships between molecules; biologic causality seems more like a complex pattern or web of interactions. The flowing patterns of subunit molecules in the cell, like aggregates of Internet signals, make no obvious sense to us, the outside observers. We stand before the cyberspace of the cell, and the maelstrom of information that confronts us becomes noise (Fig. 6). The more information we gather, the more confused we become. The flow of information generated by the living cell, viewed from our scale, is turbulence. How can we understand living matter when its complexity exceeds the ability of our minds to remember and process the mass of accumulating information? Intrinsic complexity (the organism) leads to extrinsic complexity (our confusion). The informational landscape of the cell-organism-species-society is like the informational landscape of the Internet; viewed in the aggregate it is incomprehensible noise. True, minds can be helped by computers. But understanding a cell is not to be had merely by reproducing in silico the complexity of a real cell, even if that were possible. Cataloguing the molecules and their connections to simulate the cell on a computer is a useful way to begin. But precise specification is not enough. Human understanding is not mere representation-linguistic, mathematical, visual, or auditory; understanding is the exercise of proficiency (Cohen, 2005; Efroni et al., 2005). We understand a thing when we know how to interact with it and use it well. Thus, we understand complex information by transforming it, as does Listening Post, into a meaningful signal. We understand complexity by learning to respond to it in new and productive ways; information triggers new thoughts. Human understanding is not a static state of knowledge; human understanding is a creative process of interaction with the world (Fig. 6). The information present in the world is a fertile landscape for growing ideas. This functional view of understanding fits the definition of science proposed by the scientist and educator James B. Conant, president of Harvard University from 1933 to 1953: science is "an interconnected series of concepts and conceptual schemes that have developed as the result of experimentation and observation and are fruitful for further experimentation and observation" (Conant, 1951).
Science, according to Conant, is a self-perpetuating process: scientific understanding amounts to doing good experiments that lead to more ideas for better experiments (and, where there are practical needs, to better solutions). So to understand the cell means to turn its complexity into a metaphor that productively stimulates our minds to devise better experiments, ideas, and treatments. Classically, we have been taught that science is driven by the formulation of hypotheses and by experiments designed to discredit them (Popper, 2002). A hypothesis that has been falsified experimentally is then replaced by a modified hypothesis. The new, modified hypothesis too is tested by experimentation, and its falsification leads to a third hypothesis. Thus, science advances ideally toward the truth by continuously adjusting its hypotheses through experimentation. Unfortunately, this description of science may be suitable for certain fundamental aspects of physics and chemistry, but the study of biology and other complex systems doesn't really progress that way. Living systems are just too complex to be described adequately by simple hypotheses (Pennisi, 2003). We seem to learn much more by tinkering with them than we do by hypothesizing about them. But biology aims for more than mere tinkering; we don't want merely to accumulate data; we want to comprehend essential principles about life. How can masses of complex data be transformed into comprehension? Biologists today (and in time to come) are joining forces with information scientists to develop ways to answer that question. Computers are helpful. The most primitive of computers puts our memory to shame; our conscious minds are no match for advanced parallel processing. Nevertheless, we are endowed with a unique gift. Our cognitive advantage over the computer is our ability to see associations, to create and use metaphors. The computer's memory is composed of precisely retrievable lists; our memory is a web of associations-our memory is not the computer's RAM. But we, unlike computers, can create art and science, and we can respond to art and science. In closing, I will describe one example, most familiar to me, of the new synthesis between biology and information science-between mind and computer. My colleagues and I have begun to consider ways we might achieve two objectives: to record and catalogue complex scientific data in a precise format amenable to computer-assisted simulation and testing; and to have the data themselves construct representations that stimulate human minds productively. We have termed this two-tiered approach Reactive Animation (RA). RA emerged from our simulation of the complex development of a key class of cells in the adaptive immune system-T cells (Efroni et al., 2003). In the first tier, we recorded basic information about T-cell development culled from some 400 research papers, using the visual language of Statecharts to convert the data to a precise computer format (Harel, 1987); Statecharts had been developed by David Harel and his colleagues more than 20 years earlier for building and analyzing complex man-made systems (actually, Statecharts grew out of the need to coordinate the planning of a new fighter aircraft).
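The actual Statecharts model of Efroni et al. (2003) encodes hundreds of papers' worth of molecular and cellular detail; purely to give a flavor of what a state-based, executable description looks like, the toy sketch below steps a population of thymocytes through a few canonical maturation states. It is written in Python, and the states, transitions, and probabilities are simplified assumptions, not the published model.

```python
# Illustrative toy only: a state-transition sketch in the spirit of a
# Statecharts-style description of thymocyte (T-cell) maturation.
# The states, transitions, and rates are simplified assumptions, not the
# model of Efroni et al. (2003).

import random

# Simplified maturation states and allowed transitions (with toy weights)
TRANSITIONS = {
    "double_negative": [("double_positive", 0.6), ("apoptosis", 0.4)],
    "double_positive": [("single_positive_CD4", 0.10),
                        ("single_positive_CD8", 0.05),
                        ("apoptosis", 0.85)],    # most thymocytes die
    "single_positive_CD4": [],                   # terminal (exits the thymus)
    "single_positive_CD8": [],
    "apoptosis": [],
}

def step(state: str) -> str:
    """Advance one cell by one developmental step."""
    options = TRANSITIONS[state]
    if not options:
        return state                             # terminal state, no change
    states, weights = zip(*options)
    return random.choices(states, weights=weights)[0]

def simulate(n_cells: int = 10_000, n_steps: int = 5) -> dict:
    """Run a population of cells through the toy state machine."""
    cells = ["double_negative"] * n_cells
    for _ in range(n_steps):
        cells = [step(c) for c in cells]
    counts = {}
    for c in cells:
        counts[c] = counts.get(c, 0) + 1
    return counts

if __name__ == "__main__":
    print(simulate())
```

Blocking a transition and rerunning the toy, then comparing the distribution of end states, is a crude analogue of the in silico experiments that RA supports at full scale.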
Statecharts seems suitable to deal with biologically evolved systems and not only with systems generated by human minds; objects such as molecules, cells, and organs can be described in terms of their component parts, interactions, and transitions from state to state, which is the way we actually do our experiments and record the data. Moreover, the Statecharts formalism is modular; we can easily add new objects, interactions, and states as we accumulate information. And most importantly, we can run the system dynamically on a computer and see, in Statecharts format, the outcomes of any stimulation or modification of the system we care to study. Sol Efroni, at the time a student jointly supervised by David Harel and me, then added a second tier to the representation; he devised a way for the Statecharts simulation to produce and direct an animation of the action, showing the T cells and other cells moving, interacting, multiplying, differentiating, or dying in the course of development in the organ called the thymus (Efroni et al., 2003). RA makes it possible for us to cross scales-to zoom into individual cells and their component molecules and to zoom out to view thousands of cells forming the thymus as an organ, as we please. RA allows us to experiment with the animated system in silico. Best of all, RA shows us the emergence of properties in T-cell development we never dreamed were there; the animation arm of RA reveals to our eyes aspects of the data hidden from intuition. The experiments motivated by RA in silico lead to new laboratory experiments in mundo; the results produce new data for improved Statecharts, and the cycle described by Conant continues (Fig. 7). The details of RA and of T cells and thymus are well beyond the scope of the present article; but know that RA, like Listening Post, provides us with a representational analogue of how we might approach seemingly incomprehensible complexity; the complexity is reduced to representations that engage the mind. RA transforms the seeming noise of a complex system into a useful informational landscape (Fig. 6). Listening Post exemplifies how transformations of informational landscapes are at the roots of the tree of life-a tree whose arborizations include biologic evolution and human understanding. Look again at Figs. 2, 3, and 6; they represent the same processes in essence-only the labels vary. Art and science flourish when they transform old information into a narrative of new ideas that engage ever more human minds. This issue of the Bulletin of Mathematical Biology is composed of a series of papers written in memory of our colleague Lee Segel, Professor of Applied Mathematics at the Weizmann Institute of Science, whose recent death has touched us all. Lee's child-like curiosity led him to apply math as an experimental tool to probe the wonder of living systems. He was always asking questions; he always wanted to see the latest experimental results; he always had an interpretation and a new question. Lee loved to teach as he loved to learn. The pleasure of his company was grounded in curiosity and wonder. For me, Lee was a central influence in a landscape of information that led me to explore ideas beyond the raw data of my particular field of research-cellular immunology. For me, Lee transformed experimental information into a narrative of new ideas that engaged my mind in ever evolving ways. He was my teacher as well as my friend. He lives on in all my work.

Fig. 7 Reactive animation.
The translation of experimental data into the language of Statecharts converts complex masses of basic facts into a format suitable for simulation in silico. Reactive animation (RA) empowers the Statecharts simulation to generate a realistic animation of the system and allows one to experiment with the representation in silico. The animation in silico reveals emergent properties of the system and stimulates the experimenter to undertake new experiments in mundo. The new data enter the cycle of animated representation and improved experimentation. RA represents the data in a way that engages the mind.

References

Self-creation of meaning
Immune information, self-organization and meaning
The Limits of Mathematics, The Unknowable, Exploring Randomness, Conversations with a Mathematician
Tending Adam's Garden: Evolving the Cognitive Immune Self
Regen und Auferstehung: Talmud und Naturwissenschaft im Dialog mit der Welt
Limits to genetic explanations impose limits on the human genome project
HIV. Escape artist par excellence
Science and Common Sense
Two rounds of whole genome duplication in the ancestral vertebrate
Toward rigorous comprehension of biological complexity: modeling, execution, and visualization of thymic T-cell maturation
A theory for complex systems: reactive animation
Poised Between Old and New. The Gallery. The Wall Street Journal
The spandrels of San Marco and the Panglossian paradigm: a critique of the adaptationist programme
Statecharts: a visual formalism for complex systems
Exploring the relationship between parental relatedness and male reproductive success in the Antarctic fur seal Arctocephalus gazella
The Codebreakers: The Comprehensive History of Secret Communication from Ancient Times to the Internet, Rev
The Neutral Theory of Molecular Evolution
The evolutionary origin of complex features
Impacts and evolution: Future prospects
Meaning-making in the immune system
Tracing life's circuitry
Darwin Machines and the Nature of Knowledge
The Logic of Scientific Discovery, 15th edn
On the role of feedback in promoting conflicting goals of the adaptive immune system
A mathematical theory of communication. Bell Syst. Tech. J. 30, 50
Waddington, C.H., 1940. Organizers and Genes
A New Kind of Science

I am the Mauerberger Professor of Immunology at the Weizmann Institute of Science, the Director of the Center for the Study of Emerging Diseases and a member of the Steering Committee of the Center for Complexity Science, Jerusalem, and the Director of the National Institute for Biotechnology in the Negev, Ben-Gurion University of the Negev. This paper has emerged from discussions with Henri Atlan, Yonatan Cohen, Sol Efroni, David Harel, Uri Hershberg, Dan Mishmar, Ohad Parnes, Ben Rubin, David Rubin, Eleanor Rubin, Eitan Rubin, Lee Segel, and Sorin Solomon.