Stone Soup

Use brings overflowing abundance.
— Dan Bricklin (“Cornucopia of the Commons”, 2000)

A weary traveler came upon a small village, asking for a warm meal and shelter for the night. “We’ve had no food for ourselves,” the villagers told him, looking as hungry as they could. “It has been a poor harvest.”

“Well then, seeing that you have nothing, I’ll have to make stone soup,” the traveler said loudly. The villagers stared. A large iron pot was brought to the village square, and it took many buckets of water to fill. A fire was built and the pot was set to boil. With great ceremony, the traveler produced three round, smooth stones and dropped them into the pot. The villagers’ eyes grew round.

“Stones like these generally make good soup.” The traveler smacked his lips in anticipation. “But if there were carrots, it would be much better.” Soon a boy appeared, holding a bunch of carrots retrieved from its hiding place. “A good stone soup should have cabbage,” said the traveler as he sliced the carrots into the pot. “But no use asking for what you don’t have.”

A girl returned with two small cabbages in her hands. “If we only had a bit of onions, this soup would be fit for a prince.” The villagers found a sack of onions, and then some barley, potatoes, and even sides of beef. Soon there was a steaming pot of delicious broth for everyone in the village to share – and all from a few stones. It seemed like magic!

The Stone Soup Paradigm: A Unified Blueprint for How Everything Fits Together (aka the Industry Model Canvas).

A collaborative network has a systematic advantage over markets and firms in matching the best available human capital to the best available information inputs to produce new information goods, according to Yochai Benkler. He posits that the same framework that explains the emergence of property and firms could, in principle, also explain the emergence of information production organized around a collaborative network. In particular, a collaborative network will emerge when the cost of organizing an activity on a peer basis is lower than the cost of using the market or a hierarchical organization. By a similar rationale, one could say that as long as the cost of implementing and enforcing property rights in a given resource exceeds the efficiency gains that a property regime would bring to the resource’s utilization, the resource will operate without property rights, i.e., as a commons.
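
A minimal sketch of this comparative-cost rule may help (our own toy illustration; the function and the numbers are hypothetical, not from Benkler’s paper):

```python
# Toy sketch of Benkler's institutional-choice rule: an activity is
# organized under whichever institutional form -- market, firm, or peer
# network -- carries the lowest cost of organizing it.

def choose_institution(transaction_cost, decision_cost, peer_coordination_cost):
    """Return the cheapest institutional form for organizing an activity.

    transaction_cost: cost of contracting for it on the market
    decision_cost: cost of directing it inside a firm hierarchy
    peer_coordination_cost: cost of coordinating it on a peer basis
    """
    costs = {
        "market": transaction_cost,
        "firm": decision_cost,
        "commons-based peer production": peer_coordination_cost,
    }
    return min(costs, key=costs.get)

# Cheap communication and near-zero-cost information inputs push the
# peer-coordination cost down, tipping the choice toward the network.
print(choose_institution(transaction_cost=10, decision_cost=8,
                         peer_coordination_cost=3))
```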

Examples of successful collaborative networks include those that brought the world Linux, Apache, Mozilla, Perl, Wikipedia, Project Gutenberg, etc. Under certain circumstances, a collaborative network can be a more cost-effective institutional form than either markets or hierarchical organizations. In a networked information economy, the characteristics of the resources required for information production, as well as the cost and efficiency of communication among the human participants in the productive enterprise, naturally favor the institution of the collaborative network over the alternatives of markets or hierarchical organizations. Specifically, Benkler identified four attributes of the networked information economy as contributing factors: (i) the object of production – information – is a public good and feeds into further production as input at almost zero social cost; (ii) the physical capital costs of information production have declined dramatically; (iii) human capital – creative talent – is central to production but highly variable; and (iv) the dramatic decline in communication costs permits more efficient coordination of distributed efforts and aggregation of results. Taken together, these factors allow substantially cheaper movement of information inputs to human beings, human talent to resources, and modular contributions to projects, so that widely dispersed contributions can be integrated into finished information goods.

In a sense, we can think of the different modes of organizing production as information processes with different strategies for reducing the uncertainty that agents face in evaluating different courses of action. For example, markets reduce uncertainty regarding allocation decisions by producing a clear set of price signals; firms or hierarchical organizations resolve uncertainty by instituting an ordered set of organizational commands. A collaborative network, in comparison, permits extensive communication and feedback among participants about what needs to be done, who is doing what, and how people value any given outcome. The substantial information gain from a collaborative network thus lies in its capacity to collect and process information about human capital. After all, given the variability of human creativity, an organizational model that does not require the contractual specification of human intellectual effort but allows individuals to self-identify for tasks will be better at gathering and utilizing information about who should be doing what than a system that does require such specification.

In addition, a collaborative network enjoys an allocation gain made possible by the large sets of available resources, agents, and projects. This gain is cumulative, and there are increasing returns to the size of a collaborative network. In contrast, markets and firms rely on property and contracts to secure access to bounded sets of agents and resources in the pursuit of specific projects. The decision costs in a firm or the transaction costs in a market can thus be a limiting factor, unlike peer production organized through a collaborative network with unbounded availability of all agents to all resources for all projects. In principle, a world in which all agents can act effectively on all resources will be substantially more productive in creating information goods than a world in which firms divide the universe of agents and resources into bounded sets. Furthermore, any redundancy from duplication of efforts will likely lead to an evolutionary model of innovation, where alternative solutions present themselves for natural selection.

In general, one can state that any production organized around a collaborative network is limited not by the total cost or complexity of a project, but by its modularity, granularity, and the cost of integration. Hence, the key to large-scale production lies in the assembly of many fine-grained contributions, i.e., in how a project can be broken down into a large number of small components that can be independently and asynchronously produced before they are combined into a whole. In fact, a project will likely be more efficient if it can accommodate variously sized contributions, so that people with different levels of motivation can easily collaborate by making smaller or larger grained contributions. Approaches to integration include technology embedded in the collaborative network (e.g., NASA Clickworkers), social norms (e.g., Wikipedia), and market or hierarchical mechanisms (e.g., the Linux kernel community). Often, provisioning of the integration function itself presents yet another level of opportunities for innovative use of the collaborative network in a radically complementary way (e.g., Slashdot, Feynman’s “sum over histories”).
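
As a toy model of this constraint (our own sketch; the names and thresholds are hypothetical, not drawn from the cited sources), a project is viable for peer production when its largest indivisible module fits within some contributor’s capacity and the cost of integration stays below the value produced, regardless of total project size:

```python
# Toy viability test for commons-based peer production: what matters is
# granularity (module size vs. volunteer capacity) and integration cost,
# not the total size of the project. (Aggregate effort is ignored here.)

def peer_producible(modules, contributor_capacities, integration_cost, project_value):
    finest_grain_ok = max(modules) <= max(contributor_capacities)
    return finest_grain_ok and integration_cost < project_value

# A huge project made of fine-grained modules is viable...
print(peer_producible([1] * 10_000, [2, 3, 5], integration_cost=50, project_value=1000))  # True
# ...while a much smaller but monolithic project is not.
print(peer_producible([500], [2, 3, 5], integration_cost=5, project_value=1000))          # False
```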

For example, as Dan Bricklin noted in his essay, The Cornucopia of the Commons, a good architecture of participation is one in which every user of a service automatically helps to build the value of the shared database in small increments. This architectural insight may actually explain the runaway success of open source Linux, the Internet, and the World Wide Web better than a spirit of volunteerism, observed Tim O’Reilly in his article, The Architecture of Participation. The relative struggle of GNU HURD after two decades of effort, compared to the early success enjoyed by the horde of Linux developers, highlights the importance of a good architecture of participation; technically competent contributory effort alone does not guarantee success. To wit, a good architecture of participation allows users pursuing their own “selfish” interests to build collective value as an automatic byproduct, as if led by an “invisible hand” in the collaborative network that would have made Adam Smith proud. In other words, a desired network effect can be induced in a new collaborative network simply by good design, or alternatively be overlaid on top of an existing collaborative network by application of consistent effort (e.g., the Amazon Associates program).

The Stone Soup paradigm separates production methodology from ownership concept, as the two easily get mixed up in certain debates. For instance, the descriptive statement “given enough eyeballs, all bugs are shallow” resides in the realm of production methodology, whereas “free speech, not free beer” is a normative statement that properly belongs to the realm of ownership concept. Even when the objects of information production by different groups belong to the same category, e.g., a Unix clone, the mixing of these two ideas by proponents of Open Source vs. Free Software can sometimes be counter-productive to the greater collaboration that one might otherwise contemplate.

The Stone Soup paradigm also decouples value creation from value capture, as these two stages have shown a high degree of interesting separation in the various “Clothesline Paradox” economies described by Tim O’Reilly. He illustrated an instance of value capture by the web-hosting industry based on value created in open source software: ISPs can be viewed as essentially offering the open-source DNS, Apache, MySQL, and WordPress to their customers and charging a monthly service fee. Similarly, companies like Google, Facebook, and Twitter are known to have captured enormous value created by the pioneers of the World Wide Web, but in a roundabout way (e.g., as advertising revenue and stock market capitalization). So what of the hidden economies of value creation without value capture? Do such opportunities exist and, more practically, where do we find them?

To paraphrase Clayton Christensen’s “Law of Conservation of Attractive Profits”: when something that used to be valuable becomes commoditized, something adjacent in the value chain suddenly becomes valuable. For example, when IBM made PC hardware a commodity, Microsoft figured out a way to make PC software proprietary and valuable. As the Internet and the open-source movement commoditized software, companies like Google in turn figured out how to make data and algorithms proprietary and valuable. In short, companies make good profits when they solve the hardest problems of their times. In the case of financial trading, the hard problem could be the provisioning and allocation of information goods that act as proxies for market inefficiencies, which have limited trading capacity and are thus semi-rival in nature. We think new trading opportunities will arise when meaning can be extracted from vast corpuses of data, financial or otherwise. There is thus an incredible opportunity for new financial trading business models to emerge in the world of open data access. What do you think?

One question that I wondered about was why the ant trails look so straight and nice. The ants look as if they know what they’re doing, as if they have a good sense of geometry. I put some sugar on the other end of the bathtub… and behind where the ant went I drew a line so I could tell where his trail was. The ant wandered a little bit wrong to get back to the hole, so the line was quite wiggly, unlike a typical ant trail.

When the next ant to find the sugar began to go back, … he followed the first ant’s return trail back, rather than his own incoming trail. Already it was apparent that the second ant’s return was slightly straighter. With successive ants the same “improvement” of the trail by hurriedly and carelessly “following” it occurred. I followed eight or ten ants with my pencil until their trails became a neat line right along the bathtub.
— Richard Feynman (“Surely You’re Joking, Mr. Feynman!”, 1985)
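
Feynman’s ants are, in effect, running a distributed smoothing algorithm: each careless follower averages away the previous trail’s wiggles. A toy simulation (our own reading of the anecdote; the parameters and function names are hypothetical) makes the convergence visible:

```python
# Each successive ant "hurriedly and carelessly" follows the previous
# trail, acting like an averaging filter that straightens the path.

import random

def wander(n=20, noise=1.0):
    """First ant's wiggly trail: y-offsets from the straight line."""
    return [random.gauss(0, noise) for _ in range(n)]

def follow(trail, care=0.5):
    """Next ant tracks each point imperfectly, cutting corners by
    averaging neighboring points -- a crude smoothing step that also
    shrinks deviations by the factor `care`."""
    smoothed = []
    for i in range(len(trail)):
        left = trail[max(i - 1, 0)]
        right = trail[min(i + 1, len(trail) - 1)]
        smoothed.append(care * (left + trail[i] + right) / 3)
    return smoothed

trail = wander()
for _ in range(10):
    trail = follow(trail)
print(max(abs(y) for y in trail))  # deviations collapse toward zero
```

After a handful of passes the wiggles all but vanish, much as eight or ten ants sufficed in the bathtub.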

References:

  1. Raymond, Eric S. (1997, May). The Cathedral and the Bazaar. Retrieved from: http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/ or http://www.unterstein.net/su/docs/CathBaz.pdf
  2. Raymond, Eric S. (1999, June). The Magic Cauldron. Retrieved from: http://www.catb.org/~esr/writings/magic-cauldron/magic-cauldron.html
  3. Benkler, Yochai (2002, December). Coase’s Penguin, or, Linux and The Nature of the Firm. Yale Law Journal, Vol. 112, No. 3, pp. 369-446. Retrieved from: http://www.yalelawjournal.org/pdf/354_t5aih5i1.pdf
  4. Hardin, Garrett (1968, December). The Tragedy of the Commons. Science, Vol. 162, No. 3859, pp. 1243-1248. Retrieved from: http://www.sciencemag.org/content/162/3859/1243.full
  5. Bricklin, Dan (2000, August). The Cornucopia of the Commons. Retrieved from: http://www.bricklin.com/cornucopia.htm and http://www.bricklin.com/speeches/c-of-c/
  6. O’Reilly, Tim (2004, June). The Architecture of Participation. Retrieved from: http://archive.oreilly.com/pub/a/oreilly/tim/articles/architecture_of_participation.html
  7. Baer, Steve (1975). The Clothesline Paradox. The CoEvolution Quarterly, Winter 1975. Retrieved from: http://www.wholeearth.com/issue/2008/article/358/the.clothesline.paradox
  8. O’Reilly, Tim (2012, July 18). The Clothesline Paradox: How Sharing Economies Create Value. OSCON 2012. Retrieved from: http://www.slideshare.net/timoreilly/the-clothesline-paradox-and-the-sharing-economy-pdf-with-notes-13685423 and http://edge.org/conversation/-39the-clothesline-paradox-39-

Collective Invention

Organizations which design systems are constrained to produce designs which are copies of the communication structures of these organizations.
— Melvin Conway (“How Do Committees Invent?”, 1968)

According to what is now celebrated as Conway’s Law, “if you have four groups working on a compiler, you’ll get a 4-pass compiler.” In other words, process becomes product. In an illustrative example, Melvin Conway describes a contract research organization with eight people who were to produce a COBOL and an ALGOL compiler. After some initial estimates of difficulty and time, five people were assigned to the COBOL job and three to the ALGOL job. Not surprisingly, the resulting COBOL compiler ran in five phases, and the ALGOL compiler ran in three.

In a similar fashion, the communication structure of an organization is shaped by its administrative structure and thus mirrors it. An example in mini-computer hardware design is offered by Tracy Kidder. Excerpted from his classic book, The Soul of a New Machine, is the following narrative: “Looking into the VAX, West had imagined he saw a diagram of DEC’s corporate organization. He felt that VAX was too complicated. He did not like, for instance, the system by which various parts of the machine communicated with each other; for his taste, there was too much protocol involved. He decided that VAX embodied flaws in DEC’s corporate organization. The machine expressed that phenomenally successful company’s cautious, bureaucratic style.” In other words, organization design becomes product design.

Conway advised that a design effort should be organized according to the need for communication. We know from experience that the design which occurs first is almost never the best possible. Since the need for communication changes as the design evolves over time, it is important to keep organizations lean and flexible. Therefore, Conway suggested that one must not naively assume that adding manpower adds to productivity. Instead, with great prescience in 1968, he advised answering basic questions about the value of resources and techniques of communication as a first step towards a technology (i.e., a process innovation) of building systems with confidence.

The process of developing a new technology through open discussion is called collective invention. According to Peter Meyer, “it is a process in which improvements or experimental findings about a production process or tool are regularly shared.” Meyer documented five episodes of collective invention from recent historical experience: (1) the steam engine (1811-1904); (2) the iron blast furnace in Britain’s Cleveland district (1850s-1870s); (3) early steel production in the U.S. (1865-1880); (4) microcomputer clubs (1975-1985); and (5) Linux (1991-present).

In each case, there was no single user or single inventor. Instead, there was a central figure who played a key role in coordinating the success of collective invention: (1) Joel Lean, who edited Lean’s Engine Reporter; (2) Isaac Lowthian Bell and others, who published technical information about blast furnaces in operation; (3) Alexander Holley, who consulted widely and published technical reports for his steel industry clients; (4) Lee Felsenstein, who moderated the Homebrew Computer Club; and (5) Linus Torvalds, who started Linux and guided its development. They offered the valuable service of information brokerage from the center of a star-shaped social network of experimenters, thus reducing the cost of searching for innovations in the network. This enabled innovations to accumulate over time through experimentation and sharing across the collective invention network.
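
The search-cost claim can be made concrete with a little arithmetic. In the sketch below (our own illustration, not from Meyer’s paper), a star-shaped network with a central broker keeps every pair of experimenters within two hops of each other, while a broker-less ring of the same size forces far longer searches:

```python
# Average hop counts: a hub-and-spoke (star) network vs. a ring.

def star_avg_distance(n):
    """Star: one hub plus n-1 leaves. Hub<->leaf is 1 hop; leaf<->leaf is 2."""
    pairs = n * (n - 1) / 2
    hub_leaf = n - 1
    leaf_leaf = (n - 1) * (n - 2) / 2
    return (hub_leaf * 1 + leaf_leaf * 2) / pairs

def ring_avg_distance(n):
    """Ring: shortest path between nodes k apart is min(k, n - k)."""
    dists = [min(k, n - k) for k in range(1, n)]
    return sum(dists) / (n - 1)

print(star_avg_distance(100))  # ~1.98 hops with an information broker
print(ring_avg_distance(100))  # 25.25 hops without one
```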

There are many explanations as to why individuals or firms would want to participate in sharing. A state of technological uncertainty and opportunity contributes to most of them. Sharing and experimentation are in fact complementary activities; it would be inefficient to do one without the other within the collective search process. After all, without a venue in which to share findings or learn from each other, some experiments might not have been carried out in the first place. Among the many diverse motivations that Robert Allen has identified as drivers of information sharing, two bear closer examination: (i) there are advantages in establishing engineering standards by giving away designs or software; and (ii) while firms compete locally against other firms, collectively they compete against other regions, and thus have an incentive to work together to make local production as efficient as possible and remote regions irrelevant. Taken as a whole and reinterpreted in the modern context, one might say that agreeing on common engineering standards is a first step towards building a shared knowledge infrastructure that enables a proximate cluster of emerging firms to build a new and more efficient value network, in the process displacing the incumbents entrenched in the older value network. Economists would recognize this phenomenon and characterize it as follows: experimentation creates productive capital through sharing.

Individuals or firms have diverse resources, opportunities, insights, abilities, interests, skills, and agendas. Each one may have something unique to contribute to the collective search process. Those who want to make progress in the collective search experiment more and find that it is optimal to share their findings; and those who participate in sharing find it optimal to experiment more. This summarizes the self-reinforcing dynamics that drive a collective invention network. In contrast, a hierarchically organized system suffers from what is commonly known as the “Peter Principle,” whereby the selection of a candidate for a position is based on the candidate’s performance in their current role rather than on abilities relevant to the intended role, and as a result “managers rise to the level of their incompetence” throughout the system. Hierarchy impedes performance.

The emergence of the Internet and the Web offers an excellent example of large-scale collective invention in action, where the decentralized nature of its communication structure initially took shape as a “network of networks”, and subsequently evolved into what is now a “network of platforms.” This occurred over the past few decades within the collective invention network nurtured by DARPA, NSF, and eventually the open Internet and the Web, and through exposure to economic experimentation and community feedback from usage.

The structure of the Internet itself was unanticipated; its development started at a time when "packet switching" and "network of networks" were budding theoretical concepts, and nobody knew where implementation would lead. Not only was the early Internet a radical technological departure from existing practice, the geographical dispersal of its diverse research participants represented another major departure from DARPA's typical centralized program administration. Even the governance structure of the Internet was unprecedented in its openness and transparency, led by a surprising set of technological leaders, many of whom were graduate students at the time. Was it any coincidence then that the Internet became the underlying structure for decentralized exploration which created massive market value over time by aggregating innovations from its diverse participants as it continued to evolve?

The accumulated knowledge enabled the further creation of value in myriad applications, e.g., the new, ongoing “Internet of Things”, that continue to shape the world around us. Perhaps in the not-too-distant future we might even see a “networked intelligence” arising through this process of collective invention. “It would take off on its own, and re-design itself at an ever increasing rate,” is one such scenario anticipated by Stephen Hawking. In his recent remark about the future of artificial intelligence, Hawking also predicted that “humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”

Original Plan: The Mermaid (by John Reinhard Weguelin).

Unanticipated Outcome: The Collective Invention (by Rene Magritte).

Interestingly, collective invention is most valuable when there is uncertainty in a large search space along many dimensions, as there is the possibility of truly major innovations on the horizon. Collective invention may simply be an essential phase of technology improvement at its earliest stage. Early automobiles and airplanes seem to have developed along a collective invention path before industries started to form. When searching for market inefficiencies in the financial universe, we often wonder whether sharing across the network – beyond academic publishing – may be a more efficient way of searching, given the accessibility of present cloud-computing infrastructure, and how that might impact the incumbents of today’s financial markets. What do you think?

Everything we see hides another thing, we always want to see what is hidden by what we see.
— Rene Magritte (1898-1967)

References:

  1. Conway, Melvin (1968). How do Committees Invent? Datamation (April, 1968). Retrieved from: http://www.melconway.com/Home/Committees_Paper.html and http://www.melconway.com/Home/pdf/committees.pdf
  2. Kidder, Tracy (1981). The Soul of a New Machine. Little Brown and Company.
  3. Allen, Robert C. (1983). Collective Invention. Journal of Economic Behavior and Organization, Vol. 4, pp. 1-24. Retrieved from: http://www.nuffield.ox.ac.uk/users/allen/collinvent.pdf
  4. Cowan, Robin and Jonard, Nicolas (2000). The Dynamics of Collective Invention. Journal of Economic Behavior and Organization, Vol. 52, No. 4, pp. 513-532. Retrieved from: http://arno.unimaas.nl/show.cgi?fid=279
  5. Meyer, Peter B. (2003). Episodes of Collective Invention. U.S. Bureau of Labor Statistics. Retrieved from: http://www.bls.gov/ore/pdf/ec030050.pdf
  6. Greenstein, Shane (2009). Nurturing the Accumulation of Innovations: Lessons from the Internet. In: Accelerating Energy Innovation: Insights from Multiple Sectors (2011), Rebecca M. Henderson and Richard G. Newell, editors (pp. 189-223). Retrieved from: http://www.nber.org/chapters/c11755.pdf
  7. Raymond, Eric S. (1996). A Brief History of Hackerdom. Retrieved from: http://www.catb.org/~esr/writings/cathedral-bazaar/hacker-history/
  8. Raymond, Eric S. (1999). Revenge of the Hackers. Retrieved from: http://www.catb.org/~esr/writings/cathedral-bazaar/hacker-revenge/
  9. Kessler, Andy (2005). How We Got Here: A Slightly Irreverent History of Technology and Markets. Harper Collins.
  10. Isaacson, Walter (2014). The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution. Simon and Schuster.

Economy of Attention

...in an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.
— Herbert Simon (1971)

‘Seeing’ is not simply a matter of looking at an internal representation of the outside world; and perception, in general, is not a passive matter of light falling on the retina and entering the visual system. Rather, it depends on an active cognitive process of categorizing and classifying various aspects of interest, and paying attention to certain things. According to Kevin O’Regan of the Institut Paris Descartes de Neurosciences et Cognition, the structure of the brain is such that neurons at different visual processing levels compete with each other to move up to higher levels of the brain, and thus determine what we end up paying attention to. However, the potential to turn our attention to different details gives us an impression of seeing everything, providing us with the illusion of a perfect visual world. For example, when we move our eyes or shift our attention, what we are in effect doing is accessing the outside world in an intermittent and selective manner. In other words, our brain is using the external world as a memory store.

Now you see me? (Image Credit: Chris Chabris).

The role of selective attention, as explained by Kia Nobre of Oxford University, is to process and integrate this neuronal competition so as to enable people to take appropriate actions. Certain areas of the brain lie at the intersection of perception and action. For example, the areas involved in moving the eyes are also used to focus attention on something even if the eyes are not moving. Empirical studies suggest that the brain has the ability to insert anticipatory biases into the stream of perceptual information. In other words, the brain constantly constructs a forward-looking model of the world as it processes the different areas of neuronal activities, extracting regularities and building predictions. Long-term memory shapes anticipatory biases, and thus people’s ability to direct their attention may be affected by habit or training.

Attention is evidently a scarce cognitive resource with a veritable neurological basis. In fact, the many demonstrated cases of ‘inattentional blindness’, i.e., not seeing something that is there, and the related phenomenon of ‘change blindness’, i.e., not noticing changes in a scene, show that the human brain’s internal representation of external reality is rather sparse and sketchy. Such cognitive phenomena are the antithesis of ‘insight’, i.e., seeing what others don’t. So, with so much information in the financial markets, how do people pay attention? How do traders generate market insights? And how do we go about measuring the value of attention?

“The closer you look, the less you'll see.” (Image Credit: Lucas Vassallo).

Experiments by neuroscientists and psychologists focus on understanding the mechanisms by which cognitive systems cope with scarce attention. One obvious application area is the design of how financial information is visually presented. Economists, on the other hand, construct models of optimizing competition for attentional resources as a scarce commodity in order to better understand the economics of human attention. One area of economics where attention is a key factor is advertising. In any case, it seems evident that the brain will use energy as efficiently as possible, and so energy efficiency could be the objective function that drives the overall optimization process, given the constraints of the brain architecture. An alternative criterion may be behavioral success, in which case energy efficiency may instead be one of the constraints. It is the difference between optimizing for costs and optimizing for results. It would be interesting to examine this aspect of attentional resources from the viewpoint of ‘causal entropic forces’ as recently proposed by physicists, as there might be hidden connections between the various optimization models.
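
To make the distinction precise, the two formulations can be written side by side. (The notation here is ours, not from the cited conference summary: a denotes an allocation of attention, E(a) its energy cost, and B(a) the behavioral success it yields.)

```latex
% (1) Energy as the objective, behavior as the constraint:
\min_{a} \; E(a) \quad \text{subject to} \quad B(a) \ge B_{\min}

% (2) Behavior as the objective, energy as the constraint:
\max_{a} \; B(a) \quad \text{subject to} \quad E(a) \le E_{\max}
```

Under suitable conditions the two problems trace out the same efficient frontier; they differ only in which side of the trade-off is held fixed.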

Cognitive processing is costly in terms of time and energy. So what does this mean for the organization? What is the effect of the existence of cognitive costs on an organization? How do organizations cope with information overload? Firms involve many people working together for a common purpose. People have to share information, coordinate with one another, make decisions and communicate them, all with limited amounts of time and energy. According to Kenneth Arrow, an organization can hold more information than any individual, but to do so it will need to develop special codes, and to economize on information costs through a hierarchical structure, which may be analogous to a ‘hub-and-spoke’ transport network. While greater access to diverse information can enhance productivity, information overload will reduce it. Specifically, more information will only be beneficial when the ‘gains from trade’ of information exchange outweigh the additional communications and cognitive costs of maintaining a network. This has tremendous implications for how a trading firm is to be organized, e.g., whether horizontally or vertically integrated, taking into consideration economies of scale or scope under the constraints of cognitive processing costs at the organizational level.
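
As a back-of-the-envelope sketch of this trade-off (the functional forms below are hypothetical, chosen only to exhibit the mechanism, and are not from Arrow), suppose the ‘gains from trade’ grow with group size at a diminishing rate while the cost of maintaining a fully connected communication network grows with the number of pairs:

```python
import math

def net_value(n, gain=1.0, cost_per_link=0.3):
    """Toy model: n people share information pairwise. Gains from trade
    grow with n but with diminishing returns (hence the log); maintaining
    a fully connected network costs cost_per_link per pair."""
    pairs = n * (n - 1) / 2
    return gain * n * math.log(n) - cost_per_link * pairs

# Value rises with size at first, then information overload dominates.
for n in (5, 10, 20, 40, 80):
    print(n, round(net_value(n), 1))
```

In this toy model net value peaks at a modest group size and then collapses under communication costs, which is exactly why a hierarchical ‘hub-and-spoke’ structure, needing only n-1 links instead of n(n-1)/2, economizes on information costs.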

It is important to realize that cognitive processing costs constrain the universe of possible outcomes for any organization design. In “Seeing What's Next,” Christensen uses the example of Dell Computer to illustrate the golden rule of his “Value Chain Evolution” theory: integrate to improve what is “not good enough” (i.e., speed, customization, and convenience of online ordering), and outsource what is “more than good enough” (i.e., architectural design of the PC). While this is certainly a helpful insight, it does not consider the cost of cognitive processing, which is especially critical in a highly dynamic and uncertain environment such as that found in financial trading. For that, we will need to layer upon it a telescoping perspective based on cognitive constraints in order to avoid certain catastrophe.

A thousand ships launched on a rising tide...

... crashing into the invisible iceberg.

We can thus picture three concentric rings as representing increasing scope of visibility:

  1. the scope of participation at the level of the firm;
  2. the scope of activity at the level of the value network; and
  3. the scope of vision at the level of the environment encompassing industry landscape, regulatory regime, and the macro economy.

The design of a firm then has to do with where the boundaries of the firm are drawn, how the value network is to be partitioned among the marketplace and the collaborative commons, and how to allocate scarce attention efficiently to the greater surrounding contexts of industry, regulation or macro economy. Despite all that can be automated by institutions or by machines, human attention at the highest level of a firm remains a valuable scarce resource that needs to be managed.

Seeing what's next: A telescoping view that relates vision, activity, and participation.

Economics is the science which studies human behaviour as a relationship between given ends and scarce means which have alternative uses.
— Lionel Robbins (“An Essay on the Nature and Significance of Economic Science”, 1935)

References:

  1. Simon, Herbert A. (1971). Designing Organizations for an Information-Rich World. In: Computers, Communication, and the Public Interest (Edited: Martin Greenberger). The Johns Hopkins Press.
  2. Chabris, Christopher and Simons, Daniel (2011). The Invisible Gorilla: How Our Intuitions Deceive Us. Harmony.
  3. The Invisible Hand Meets the Invisible Gorilla: The Economics and Psychology of Scarce Attention. Summary of Conference at IDEI, Toulouse School of Economics, September 2011. Retrieved from: http://www.idei.fr/doc/conf/psy/2011/summary.pdf
  4. Christensen, Clayton M. and Anthony, Scott D. and Roth, Erik A. (2004). Seeing What's Next: Using Theories of Innovation to Predict Industry Change. Harvard Business Review Press.
  5. Nielsen, Michael (2008, December 29). The Economics of Scientific Collaboration. Retrieved from: http://michaelnielsen.org/blog/the-economics-of-scientific-collaboration/
  6. Useem, Jerry (2015, October). Are Bosses Necessary? The Atlantic. Retrieved from: http://www.theatlantic.com/magazine/archive/2015/10/are-bosses-necessary/403216/

Participatory Universe

What good is a universe without somebody around to look at it?
— Robert Dicke (1916-1997)

John A. Wheeler at Princeton University in 1967. (Image Credit: The NY Times).

John Wheeler, a mentor to many of today’s leading physicists and the man who coined the term “black hole”, suggested that the nature of reality is revealed by the bizarre laws of quantum mechanics. According to quantum theory, before an observation is made, a subatomic particle exists in several states at once, called a superposition (or, as Wheeler called it, a ‘smoky dragon’). Once the particle is observed, it instantaneously collapses into a single position (a process called ‘decoherence’).

Wheeler’s hunch was that the universe is built like an enormous feedback loop, a loop in which we contribute to the ongoing creation of not just the present and the future but the past as well. To illustrate his idea, he devised what he called his “delayed-choice experiment,” which was tested in a laboratory in 1984 (and again in 2007). This experiment was a variation on the famous “double-slit experiment”, in which the dual nature of light was exposed: depending on how the experiment was measured and observed, the light behaved like a particle (i.e., a photon) or like a wave.

Unlike the original “double-slit experiment”, in Wheeler’s version, the method of detection was changed after a photon had passed the double slit. The experiment showed that the path of the photon was not fixed until the physicists made their measurements. In other words, the outcome of the experiment depends on what the physicists try to measure: If they set up detectors beside the slits, the photons act like ordinary particles, always traversing one route or the other, not both at the same time. But if the physicists remove the detectors, each photon seems to travel both routes simultaneously like a tiny wave. When it comes to quantum systems, reality depends on how we interact with it.
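
The quantum arithmetic behind this behavior is compact enough to sketch. The toy calculation below (standard textbook quantum mechanics, not Wheeler’s own numbers; the function and its parameters are our illustration) compares a point on the screen where the two routes differ in phase: without which-path detectors the complex amplitudes add and can cancel, while with detectors only the probabilities add:

```python
import cmath

def detection_probability(phase, which_path_known):
    """Relative probability of finding the photon at a screen point where
    the two routes differ by `phase` radians. (Normalization over the
    whole screen is ignored in this toy comparison.)"""
    a1 = 1 / cmath.sqrt(2)                      # amplitude via route 1
    a2 = cmath.exp(1j * phase) / cmath.sqrt(2)  # amplitude via route 2
    if which_path_known:          # detectors present: probabilities add
        return abs(a1) ** 2 + abs(a2) ** 2
    return abs(a1 + a2) ** 2      # no detectors: amplitudes interfere

print(detection_probability(cmath.pi, which_path_known=True))   # 1.0 (flat)
print(detection_probability(cmath.pi, which_path_known=False))  # ~0  (dark fringe)
```

At a phase difference of pi the fringe goes dark when the path is unknown, yet the detectors restore a flat, particle-like distribution: precisely the change in outcome that the delayed-choice experiment demonstrates.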

Putting down her cup of tea, she asked in a timid voice, “Is light made of waves, or is it made of particles?” “Yes, exactly so,” replied the Mad Hatter. (Source: “Alice’s Adventures in Wonderland”).

These conclusions lead many scientists to speculate that the universe is fine-tuned for life. For example, this is how Robert Dicke, Wheeler’s colleague at Princeton, explained the existence of our universe:

If you want an observer around, you need life, and if you want life, you need heavy elements. To make heavy elements out of hydrogen, you need thermonuclear combustion. To have thermonuclear combustion, you need a time of cooking in a star of several billion years. In order to stretch out several billion years in its time dimension, the universe, according to general relativity, must be several billion years across in its space dimensions. So why is the universe as big as it is? Because we’re here!

Does this mean humans are necessary to the existence of the universe?

While conscious observers certainly partake in the creation of the participatory universe envisioned by Wheeler, they are not the only, or even primary, way by which quantum potentials become real. Ordinary matter and radiation play the dominant roles. Wheeler likes to use the example of a high-energy particle released by a radioactive element like radium in Earth's crust. The particle, as with the photons in the two-slit experiment, exists in many possible states at once, traveling in every possible direction, not quite real and solid until it interacts with something, say a piece of mica in Earth's crust. When that happens, one of those many different probable outcomes becomes real. In this case the mica, not a conscious being, is the object that transforms what might happen into what does happen. The trail of disrupted atoms left in the mica by the high-energy particle becomes part of the real world.

Erwin Schrödinger: “Until you observe the cat, it is both alive and dead at the same time.”

At every moment, in Wheeler's view, the entire universe is filled with such events, where the possible outcomes of countless interactions become real, where the infinite variety inherent in quantum mechanics manifests as a physical cosmos. And we see only a tiny portion of that cosmos. Wheeler suspects that most of the universe consists of huge clouds of uncertainty that have not yet interacted either with a conscious observer or even with some lump of inanimate matter. He sees the universe as a vast arena containing realms where the past is not yet fixed.

Wheeler had come to view quantum measurement, how it creates an actuality out of what was mere potentiality, as the essential building block of reality. Quantum is the “crack in the armor” that covers the secret of existence, a clue that the mystery of creation may lie not in the distant past but in the living present. If the universe is a giant computer, the laws of nature will most likely be coded in a functional programming language based on “lazy evaluation”, or “call-by-need”, an evaluation strategy that delays the evaluation of an expression until its value is needed and also avoids repeated evaluations. The benefits of lazy evaluation include: (i) the ability to construct potentially infinite data structures; (ii) the ability to define control flow structures as abstractions instead of primitives; and (iii) increased performance by avoiding needless calculations. Most importantly, lazy evaluation can reduce the memory footprint, since values are created only when needed. It is thus consistent with Wheeler’s idea that the universe is designed under the advice of an “efficiency expert.”
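
These three benefits are easy to demonstrate concretely. The short sketch below uses Python generators and memoization as a stand-in for true lazy evaluation (Haskell’s call-by-need is the canonical example; the names here are ours):

```python
from itertools import count, islice
from functools import lru_cache

# (i) A potentially infinite data structure: nothing is computed up front.
naturals = count(0)              # 0, 1, 2, ... produced only on demand

# (ii) Values materialize only when demanded...
first_five_squares = [n * n for n in islice(naturals, 5)]
print(first_five_squares)        # [0, 1, 4, 9, 16]

# (iii) ...and, with memoization, a repeated demand is never recomputed,
# matching "call-by-need" rather than "call-by-name".
@lru_cache(maxsize=None)
def expensive(n):
    print(f"computing {n}")      # prints only on the first demand
    return n * n

expensive(3)
expensive(3)                     # "computing 3" appears exactly once
```

Nothing in `naturals` exists until demanded, and the memoized value is computed exactly once no matter how often it is observed: a loose but suggestive parallel to Wheeler’s potentialities that become actual only upon interaction.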

In fact, there is no obvious extravagance of scale in the construction of the universe, according to Wheeler. For the purpose of having somebody around to be aware of the universe, life on one planet only (i.e., the Earth) seems to be a reasonable design goal. The anthropic principle thus provides a new perspective on the question of life elsewhere in space: it is not essential because it is not economical. Put another way, the universe has to be such as to permit awareness of that universe itself, and to do so economically, with life on just one planet. “This point of view is what gives me hope that the question — How come existence? — can be answered,” said Wheeler.

Faith is the number one element. It isn’t something that spreads itself uniformly. Faith is concentrated in a few people at particular times and places. If you can involve young people in an atmosphere of hope and faith, then I think they’ll figure out how to get the answer. Faith and hope are absolutely central to everything one does.
— John Archibald Wheeler (1911-2008)

References:

  1. Interview with John Wheeler: From the Big Bang to the Big Crunch. Cosmic Search, Vol. 1, No. 4. Retrieved from: http://www.bigear.org/vol1no4/wheeler.htm
  2. Folger, Tim (2002, June 1). Does the Universe Exist if We’re Not Looking? Discover. Retrieved from: http://discovermagazine.com/2002/jun/featuniverse
  3. Stenger, Victor J. (2007). The Anthropic Principle. In: The Encyclopedia of Nonbelief. Prometheus Books. Retrieved from: http://www.colorado.edu/philosophy/vstenger/Cosmo/ant_encyc.pdf