singularity

Read here about singularity and its relevance to studies of scale.

=Overview=

In mathematics, singularity refers to a point at which a given mathematical object is not defined, or a point of an exceptional set where the set fails to be well-behaved in some particular way, such as differentiability.

In this sense, singularity is important to studies of scale, where sets of behaviors in scale A cease to apply to scale B, particularly when scales A and B are related by a power law sequence.
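One way to make this precise: power laws are scale-free, so a description valid at one scale carries over to another only up to a constant factor, and singular behavior appears exactly where the power law breaks down. In symbols (our notation, added for illustration):

```latex
f(x) = C\,x^{-\alpha}
\quad\Longrightarrow\quad
f(\lambda x) = C\,(\lambda x)^{-\alpha} = \lambda^{-\alpha}\, f(x)
```

Rescaling $x$ by $\lambda$ rescales $f$ only by the constant $\lambda^{-\alpha}$, which is why scales related by a power law share the same qualitative behavior until the law itself fails.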

=Technological Singularity=

Vernor Vinge and Ray Kurzweil use the term "singularity" to connote a specific confluence of a variety of power law trends resulting in the creation of machine-enabled superintelligence, and they argue that it is difficult or impossible for present-day humans to predict what a post-singularity world would be like, due to the difficulty of imagining the intentions and capabilities of such entities.

The term "technological singularity" was originally coined by Vinge, who made an analogy between the breakdown in our ability to predict what would happen after the development of superintelligence and the breakdown of the predictive ability of modern physics at the space-time singularity beyond the event horizon of a black hole.

Some writers use "the singularity" in a broader way to refer to any radical changes in our society brought about by new technologies such as molecular nanotechnology, although Vinge and other prominent writers specifically state that without superintelligence, such changes would not qualify as a true singularity. Many writers also tie the singularity to observations of exponential growth in various technologies (with Moore's Law being the most prominent example), using such observations as a basis for predicting that the singularity is likely to happen sometime within the 21st century.

A technological singularity includes the concept of an intelligence explosion, a term coined in 1965 by I. J. Good. Although technological progress has been accelerating, it has been limited by the basic intelligence of the human brain, which has not, according to Paul R. Ehrlich, changed significantly for millennia. However, with the increasing power of computers and other technologies, it might eventually be possible to build a machine that is more intelligent than humanity. If a superhuman intelligence were invented, whether through the amplification of human intelligence or through artificial intelligence, it would bring to bear greater problem-solving and inventive skills than humans possess, and it could then design a yet more capable machine or rewrite its own source code to become more intelligent. This more capable machine could in turn design a machine of even greater capability. These iterations could accelerate into recursive self-improvement, potentially allowing enormous qualitative change before any upper limits imposed by the laws of physics or theoretical computation set in.
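A toy model of this feedback loop (entirely illustrative; the gain constant and the cap are invented here, not taken from Good or Kurzweil): let each generation's capability determine how much it can improve its successor, with a hard ceiling standing in for physical limits.

```python
# Toy model of recursive self-improvement: i_{n+1} = i_n * (1 + k * i_n),
# capped at a ceiling that stands in for physical/computational limits.
# The gain constant k and the cap are invented for illustration only.

def intelligence_explosion(i0=1.0, k=0.05, cap=1e6, generations=30):
    """Return the capability level after each design generation."""
    levels = [i0]
    for _ in range(generations):
        nxt = min(levels[-1] * (1 + k * levels[-1]), cap)
        levels.append(nxt)
    return levels

levels = intelligence_explosion()
# Growth is slow and roughly exponential at first, then super-exponential
# once each generation's gains feed back into the next design, until the cap.
```

Under these made-up parameters the curve crawls for about twenty generations and then runs away to the ceiling within a handful more, which is the qualitative shape the intelligence-explosion argument describes.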

Critique
Many prominent technologists and academics dispute the plausibility of a technological singularity, including Jeff Hawkins, John Holland, Jaron Lanier, and Gordon Moore, whose "Moore's Law" is often cited in support of the concept.

=Exponential Trends Associated with Singularity=

A collection of charts graphing exponential trends associated with singularity. Many, but not all, of these charts are reproduced from Kurzweil's book.

Complexity of Organisms
Kurzweil graphs the exponential acceleration of evolutionary complexity, from the evolution of single-celled organisms up to the personal computer.



Computer Capability
Kurzweil wrote: The exponential growth of computing is a marvelous quantitative example of the exponentially growing returns from an evolutionary process. The exponential growth of computing may be expressed in terms of its accelerating pace: it took ninety years to achieve the first MIPS per thousand dollars; today we add one MIPS per thousand dollars every five hours.
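Kurzweil's two figures imply how much the rate of improvement itself has accelerated; a quick check of the arithmetic:

```python
# Rate of adding price-performance, per Kurzweil's figures:
# the first MIPS per $1,000 took ninety years; now one is added every five hours.
HOURS_PER_YEAR = 365.25 * 24

first_mips_hours = 90 * HOURS_PER_YEAR   # ~789,000 hours for the first MIPS/$1,000
current_mips_hours = 5                   # hours per additional MIPS/$1,000

speedup = first_mips_hours / current_mips_hours
print(f"rate of improvement is ~{speedup:,.0f}x faster")  # ≈ 158,000x
```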

Power per Computing Cycle
The energy required per MIPS for computing devices has been falling exponentially, as shown in the following figure.

Bits Shipped Per Year
Despite this massive deflation in the cost of information technologies, demand has more than kept up. The number of bits shipped has doubled every 1.1 years, faster than the halving time in cost per bit, which is 1.5 years. As a result, the semiconductor industry enjoyed 18 percent annual growth in total revenue from 1958 to 2002. The entire information-technology (IT) industry has grown from 4.2 percent of the gross domestic product in 1977 to 8.2 percent in 1998. IT has become increasingly influential in all economic sectors. The share of value contributed by information technology for most categories of products and services is rapidly increasing. Even common manufactured products such as tables and chairs have an information content, represented by their computerized designs and the programming of the inventory-procurement systems and automated-fabrication systems used in their assembly.
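The 18 percent revenue figure is consistent with the two doubling times quoted: revenue per year scales as bits shipped times price per bit, so annual growth is the ratio of the two exponentials (a back-of-the-envelope check, not an exact model of the industry):

```python
# Revenue ≈ bits shipped × price per bit.
# Bits shipped double every 1.1 years; price per bit halves every 1.5 years.
bits_growth = 2 ** (1 / 1.1)     # annual multiplier for bits shipped, ≈ 1.88
price_decline = 2 ** (-1 / 1.5)  # annual multiplier for price per bit, ≈ 0.63

revenue_growth = bits_growth * price_decline - 1
print(f"implied annual revenue growth: {revenue_growth:.1%}")  # ≈ 18%
```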



CPU MIPS
Processor performance in MIPS has doubled every 1.8 years per processor (see the figure below). Again, note that the cost per processor has also declined through this period.

Computer Memory Trends
The following charts combine historical data with the semiconductor-industry road map (International Technology Roadmap for Semiconductors [ITRS] from Sematech), which projects through 2018. The cost of DRAM (dynamic random access memory) per square millimeter has also been coming down. The doubling time for bits of DRAM per dollar has been only 1.5 years.

Electronic Memory in General
Growth in electronic memories such as RAM is exponential. Note how the trend proceeds smoothly through different technology paradigms: vacuum tube to discrete transistor to integrated circuit.

Computer Power Consumption (Koomey's Law)
The following chart, called Koomey's law, focuses on energy use. Koomey’s Law states that the amount of power needed to perform a computing task will fall by half every one and a half years.
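A halving time of one and a half years compounds quickly; a short check of what it implies per decade:

```python
# Koomey's Law: energy per computation halves every ~1.5 years.
halving_time_years = 1.5

per_decade = 2 ** (10 / halving_time_years)  # efficiency gain over ten years
print(f"~{per_decade:.0f}x less energy per computation per decade")  # ≈ 100x
```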



See Koomey's Law.

Computer Medical Informatics
There has been smooth exponential growth in the amount of DNA sequence data collected (see the figure below). A dramatic recent example of this improving capacity was the sequencing of the SARS virus, which took only thirty-one days from the identification of the virus, compared to more than fifteen years for HIV.

Medical Image Processing Time
Our tools for peering into our brains are improving at an exponential pace. The resolution of noninvasive brain-scanning devices is doubling about every twelve months (per unit volume).

Brain Scan Processing Time
The speed of brain scanning image reconstruction is growing exponentially in tandem with the improvement in medical image resolution.

Economic Activity
The overall information-technology sector is rapidly increasing its share of the economy.

IT Share of GDP
The overall information-technology sector is rapidly increasing its share of the economy and is increasingly influential on all other sectors, as noted in the figure below.

eCommerce Revenue
The boom-and-bust cycle in these information technologies was strictly a capital-markets (stock-value) phenomenon. Neither boom nor bust is apparent in the actual business-to-consumer (B2C) and business-to-business (B2B) data. Actual B2C revenues grew smoothly from $1.8 billion in 1997 to $70 billion in 2002. B2B had similarly smooth growth from $56 billion in 1999 to $482 billion in 2002. In 2004 it is approaching $1 trillion. The boom-and-bust business cycles are also not visible in the actual price-performance of the underlying technologies.

GDP and Productivity
The exponential trends underlying productivity growth are just beginning this explosive phase. The U.S. real gross domestic product has grown exponentially, fostered by improving productivity from technology, as seen in the figure below.

GDP Per Capita
Some critics credit population growth with the exponential growth in GDP, but this argument can be ruled out by examining per capita data.

Data Storage
This exponential trend reflects the squeezing of data onto a magnetic substrate, rather than transistors onto an integrated circuit, a completely different technical challenge pursued by different engineers and different companies.

Manufacturing
Smooth exponential growth has been taking place in the value produced by an hour of labor over the last half century. This trend does not take into account the vastly greater value of a dollar's power in purchasing information technologies (which has been doubling about once a year in overall price-performance). Despite these weaknesses in the productivity statistical methods, such gains in productivity are now actually reaching the steep part of the exponential curve. Labor productivity grew at 1.6 percent per year until 1994, then rose at 2.4 percent per year, and is now growing even more rapidly. Manufacturing productivity in output per hour grew at 4.4 percent annually from 1995 to 1999, durables manufacturing at 6.5 percent per year. In the first quarter of 2004, the seasonally adjusted annual rate of productivity change was 4.6 percent in the business sector and 5.9 percent in durable goods manufacturing.

Moore's Law
Kurzweil wrote: Moore's Law is actually not the first paradigm in computational systems. You can see this if you plot the price-performance—measured by instructions per second per thousand constant dollars—of forty-nine famous computational systems and computers spanning the twentieth century (see the figure below). As the figure demonstrates, there were actually four different paradigms—electromechanical, relays, vacuum tubes, and discrete transistors—that showed exponential growth in the price-performance of computing long before integrated circuits were even invented. And Moore's paradigm won't be the last. When Moore's Law reaches the end of its S-curve, now expected before 2020, the exponential growth will continue with three-dimensional molecular computing, which will constitute the sixth paradigm.

Peter Diamandis presented this chart at his TED 2012 talk:



Below, the original chart from Kurzweil's Singularity book:



Intellectual Property
Trends in innovation and thought as reflected in the patent and scientific publication record.

Inventions
Kurzweil writes that the overall rate of adopting new paradigms, which parallels the rate of technological progress, is currently doubling every decade. That is, the time to adopt new paradigms is going down by half each decade. At this rate, technological progress in the twenty-first century will be equivalent (in the linear view) to two hundred centuries of progress (at the rate of progress in 2000).



Patents
Sun Tzu pointed out, "knowledge is power," and another ramification of the law of accelerating returns is the exponential growth of human knowledge, including intellectual property.



Nanotech Patents
As the salient feature size of a wide range of technologies moves inexorably closer to the multi-nanometer range (less than one hundred nanometers—billionths of a meter), it has been accompanied by a rapidly growing interest in nanotechnology. Nanotechnology science citations have been increasing significantly over the past decade, as noted in the figure below. The same phenomenon is visible in nanotechnology-related patents. These charts include the transition to software and business process patents, which might be affecting the numbers.



Sigmoid Patterns
The sigmoid or S curve is a component of these exponential trends.
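Each individual paradigm in these charts follows an S curve: roughly exponential at first, then saturating as the paradigm's limits are reached. A minimal sketch of that shape using the standard logistic function (parameter values are illustrative, not fitted to any of the charts):

```python
import math

def logistic(t, ceiling=1.0, growth_rate=1.0, midpoint=0.0):
    """Standard logistic (sigmoid) curve: exponential early, saturating late."""
    return ceiling / (1 + math.exp(-growth_rate * (t - midpoint)))

# Early in the curve (t well below the midpoint) growth looks exponential;
# past the midpoint it decelerates toward the ceiling.
early = [logistic(t, midpoint=10) for t in range(0, 5)]
late = [logistic(t, midpoint=10) for t in range(15, 20)]

# Successive early ratios are nearly constant (the exponential regime) ...
ratios = [b / a for a, b in zip(early, early[1:])]
# ... while late values crowd up against the ceiling.
```

In Kurzweil's account, continued exponential growth comes from a succession of such S curves, each new paradigm taking over as the previous one saturates.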

Key Historical Events
Kurzweil graphs the occurrence rate of key events over time, combining lists from a variety of sources (for example, the Encyclopaedia Britannica, the American Museum of Natural History, Carl Sagan's "cosmic calendar," and others). He finds that the resulting aggregate graph shows the same smooth acceleration seen in the other charts. The following plot combines fifteen different lists of key events. These lists date from the 1980s and 1990s, with most covering the known history of the universe, while three focus on the narrower period of hominoid evolution. The dates used by some of the older lists are imprecise, but it is the events themselves, and the relative locations of these events in history, that are of primary interest.



Highlights of the list:
 * Theodore Modis, professor at DUXX, Graduate School in Business Leadership in Monterrey, Mexico, attempted to develop a "precise mathematical law that governs the evolution of change and complexity in the Universe." To research the pattern and history of these changes, he required an analytic data set of significant events, where the events equate to major change. He did not want to rely solely on his own list, because of selection bias. Instead, he compiled thirteen independent lists of major events in the history of biology and technology from these sources:
 * Carl Sagan, The Dragons of Eden: Speculations on the Evolution of Human Intelligence (New York: Ballantine Books, 1989). Exact dates provided by Modis.
 * American Museum of Natural History. Exact dates provided by Modis.
 * The data set "important events in the history of life" in the Encyclopaedia Britannica.
 * Educational Resources in Astronomy and Planetary Science (ERAPS), University of Arizona, http://ethel.as.arizona.edu/~collins/astro/subiects/evolve-26.html.
 * Paul D. Boyer, biochemist, winner of the 1997 Nobel Prize, private communication. Exact dates provided by Modis.
 * J. D. Barrow and J. Silk, "The Structure of the Early Universe," Scientific American 242.4 (April 1980): 118–28.
 * J. Heidmann, Cosmic Odyssey: Observatoire de Paris, trans. Simon Mitton (Cambridge, U.K.: Cambridge University Press, 1989).
 * J.W. Schopf, ed., Major Events in the History of Life, symposium convened by the IGPP Center for the Study of Evolution and the Origin of Life, 1991 (Boston: Jones and Bartlett, 1991).
 * Phillip Tobias, "Major Events in the History of Mankind," chap. 6 in Schopf, Major Events in the History of Life.
 * David Nelson, "Lecture on Molecular Evolution I," http://drnelson.utmem.edu/evolution.html, and "Lecture Notes for Evolution II," http://drnelson.utmem.edu/evolution2.html.
 * G. Burenhult, ed., The First Humans: Human Origins and History to 10,000 BC (San Francisco: HarperSanFrancisco, 1993).
 * D. Johanson and B. Edgar, From Lucy to Language (New York: Simon & Schuster, 1996).
 * R. Coren, The Evolutionary Trajectory: The Growth of Information in the History and Future of Earth, World Futures General Evolution Studies (Amsterdam: Gordon and Breach, 1998).

Miniaturization
Kurzweil: Another trend that will have profound implications for the twenty-first century is the pervasive movement toward miniaturization. The key feature sizes of a broad range of technologies, both electronic and mechanical, are decreasing, and at an exponential rate. At present, technology is shrinking by a factor of about four per linear dimension per decade. This miniaturization is a driving force behind Moore's Law, but it's also reflected in the size of all electronic systems—for example, magnetic storage.

Mechanical Device Miniaturization
Kurzweil: The exponential miniaturization trend is also seen in the ongoing decrease in the size of mechanical devices.

Telescopic Observation
Improvements in computation and mechanics combine to exponentially increase humanity's ability to observe the cosmos.

Depth of Observation


The following diagram from Sky & Telescope illustrates the scope of the SETI project by plotting the capability of the varied scanning efforts against three major parameters: distance from Earth, frequency of transmission, and fraction of the sky covered. The chart plots relative depth into space on the vertical axis, fraction of sky on the x axis, and frequency (GHz) on the z axis.
 * Serendip III
 * Serendip IV
 * SETI@Home
 * Project Phoenix
 * Allen telescope array
 * Omni-directional search system
 * Infrared and optical SETI

Transistors
Kurzweil writes: A similar trend can be seen with transistors. You could buy one transistor for a dollar in 1968; in 2002 a dollar purchased about ten million transistors. Since DRAM is a specialized field that has seen its own innovation, the halving time for average transistor price is slightly slower than for DRAM, about 1.6 years (see the figure below). Unlike Gertrude Stein's rose, it is not the case that a transistor is a transistor is a transistor. As they have become smaller and less expensive, transistors have also become faster by a factor of about one thousand over the course of the past thirty years (see the figure below)—again, because the electrons have less distance to travel.
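The two endpoints Kurzweil quotes pin down the trend's doubling time directly; deriving it from those numbers gives a value close to, though slightly faster than, the 1.6-year figure he fits to the full data set:

```python
import math

# One transistor per dollar in 1968; about ten million per dollar in 2002.
transistors_per_dollar_1968 = 1
transistors_per_dollar_2002 = 10_000_000
years = 2002 - 1968

doublings = math.log2(transistors_per_dollar_2002 / transistors_per_dollar_1968)
doubling_time = years / doublings
print(f"{doublings:.1f} doublings in {years} years -> "
      f"doubling time ≈ {doubling_time:.2f} years")
```

The small gap between ~1.46 years from the endpoints and 1.6 years from the fitted trend is what one expects when two points are compared against a regression over the whole series.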

Education Cost
Another implication of the law of accelerating returns is exponential growth in education and learning. Over the past 120 years, the United States has increased its investment in K-12 education (per student and in constant dollars) by a factor of ten. There has been a hundredfold increase in the number of college students. Automation started by amplifying the power of our muscles and in recent times has been amplifying the power of our minds. So for the past two centuries, automation has been eliminating jobs at the bottom of the skill ladder while creating new (and better-paying) jobs at the top of the skill ladder. The ladder has been moving up, and thus the U.S.A. has been exponentially increasing investments in education at all levels (see the figure below).

War Casualties
As weapons have become more intelligent, there has been a dramatic trend toward more precise missions with fewer casualties. It may not seem that way when viewed alongside the tendency toward more detailed, realistic television news coverage. The great battles of World Wars I and II and the Korean War, in which tens of thousands of lives were lost over the course of a few days, were visually recorded only by occasional grainy newsreels. Today, we have a front-row seat for almost every engagement. Each war has its complexities, but the overall movement toward precision intelligent warfare is clear by examining the number of casualties. This trend is similar to one seen in medicine, where smart weapons against disease are able to perform specific missions with far fewer side effects. The trend is similar for collateral casualties, although it may not seem that way from contemporary media coverage (recall that about fifty million civilians died in World War II).

USA War Deaths
U.S. war deaths from 1850 to 2004, spanning the Civil War to the Iraq war, show an exponential drop in both war deaths and battle deaths. The chart includes the Civil War, World War I, World War II, the Korean War, the Vietnam War, the Gulf War, and Iraq II. The conflicts included are not a complete list; see, for example, @http://en.wikipedia.org/wiki/United_States_military_casualties_of_war.

State Warfare
How the number of deaths in a war between states (from 1820 to 1997) varies with the frequency (equivalently, the probability) of such a conflict. The relationship is a mathematically simple one, called a power law. The last two data points are, respectively, World Wars I and II. From Why Society is a Complex Matter by Philip Ball, 2012 Springer p.49.

Terrorist Attack
How the number of deaths in a terrorist attack (between 1968 and 2008) varies with the frequency of such an attack. The relationship is another power law, but with a different exponent from that of inter-state wars. From Why Society is a Complex Matter by Philip Ball, 2012 Springer p.49.
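Both captions describe the same functional form: the frequency of an event falls off as a power of its size, so on log-log axes the data lie on a straight line whose slope is the exponent. A sketch with invented exponents (the actual fitted values appear in Ball's book and are not reproduced here):

```python
# Power law for event sizes: frequency(x) ∝ x ** (-alpha).
# The alpha values below are invented for illustration, not Ball's fitted values.

def relative_frequency(deaths, alpha):
    """Frequency of an event of a given size, relative to a size-1 event."""
    return deaths ** (-alpha)

# A heavier tail (smaller alpha) means large events are less rare:
wars = relative_frequency(1000, alpha=1.5)     # hypothetical inter-state war exponent
attacks = relative_frequency(1000, alpha=2.0)  # hypothetical terrorism exponent

# A tenfold larger event is 10**alpha times rarer, at any scale:
ratio = relative_frequency(100, 1.5) / relative_frequency(1000, 1.5)
print(f"10x larger event is {ratio:.0f}x rarer at alpha=1.5")  # ≈ 32x
```

This scale-free property is the same one noted in the Overview: the ratio of frequencies depends only on the ratio of sizes, not on the sizes themselves.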

Wireless Data Price
Kurzweil: The built environment is moving away from a physical tangle of connected wires, and implementing wireless communication instead. The bandwidth of information conveyed via wireless is doubling every ten to eleven months (see the figure below).
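Doubling every ten to eleven months compounds to more than a factor of two per year; a quick check of the implied multi-year gain (using 10.5 months as the midpoint of Kurzweil's range):

```python
# Wireless bandwidth doubles every 10 to 11 months; take 10.5 as the midpoint.
doubling_time_months = 10.5

annual_growth = 2 ** (12 / doubling_time_months)          # ≈ 2.2x per year
five_year_growth = 2 ** (12 * 5 / doubling_time_months)   # ≈ 50x over five years
print(f"{annual_growth:.1f}x per year, ~{five_year_growth:.0f}x over five years")
```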

=Links=

Kurzweil site: @http://www.singularity.com/
Wikipedia article: []