Digital philosophy

Digital philosophy (also digital ontology) is a direction in philosophy and cosmology advocated by certain mathematicians and theoretical physicists, including Edward Fredkin, Konrad Zuse, Stephen Wolfram, Rudy Rucker, Gregory Chaitin, and Seth Lloyd.


Digital philosophy is a modern re-interpretation of Gottfried Leibniz's monist metaphysics, one that replaces Leibniz's monads with aspects of the theory of cellular automata. Since, following Leibniz, the mind can be given a computational treatment, digital philosophy attempts to address some main issues in the philosophy of mind. The digital approach attempts to deal with the non-deterministic character of quantum theory by assuming that all information must have a finite and discrete means of representation, and that the evolution of a physical state is governed by local and deterministic rules.[1]

In digital physics, existence and thought would consist only of computation. (However, not all computation would necessarily be thought.) Thus computation is the single substance of a monist metaphysics, while subjectivity arises from computational universality. There are many variants of digital philosophy; most of them are digital data theories that view all of physical reality, and cognitive science along with it, within the framework of information theory.[1]
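The kind of system digital philosophy takes as its model can be made concrete with an elementary cellular automaton: a row of cells, each holding one bit, updated by a local, deterministic rule. A minimal sketch (not drawn from any of the authors above; the rule number and row width are arbitrary illustrative choices):

```python
# Elementary cellular automaton: finite, discrete states evolved by a
# local, deterministic rule -- the textbook model behind digital philosophy.

def step(cells, rule=30):
    """Apply a Wolfram-style rule once, with fixed zero boundaries."""
    padded = [0] + cells + [0]
    out = []
    for i in range(len(cells)):
        left, center, right = padded[i], padded[i + 1], padded[i + 2]
        index = (left << 2) | (center << 1) | right   # neighborhood as a 3-bit number
        out.append((rule >> index) & 1)               # look up that bit of the rule table
    return out

row = [0, 0, 1, 0, 0]          # start from a single "on" bit
for _ in range(3):
    print(row)
    row = step(row)
```

Every future state is fully determined by the current one, yet rules as simple as this are known to produce highly complex behavior, which is what makes the model attractive to digital philosophers.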

Digital philosophers

  • Edward Fredkin. Fredkin's digital philosophy rests on two fundamental laws of information:
  1. All information must have a digital means of its representation.
  2. An informational process transforms the digital representation of the state of the system into its future state.
  3. If Fredkin's first fundamental law of information is correct, then Einstein's theory of general relativity is not entirely correct, because the theory does not rely upon digital information.
  4. If Fredkin's second fundamental law is correct, then the Copenhagen interpretation of quantum mechanics is not entirely correct, because quantum randomness lacks a digitally deterministic explanation.

  Fredkin also describes a multiverse automaton built on an informational substrate:
  1. Below the Planck scale, there is an informational substrate that allows the build-up of time, space, and energy by means of an updating parameter.
  2. The updating parameter for the multiverse is analogous to time via a mathematical isomorphism, but the updating parameter involves a decomposition across alternate universes.
  3. The informational substrate consists of network nodes that can simulate random network models and Feynman path integrals.
  4. In physical reality, both energy and spacetime are secondary features. The most fundamental feature of reality is signal propagation caused by an updating parameter acting upon network nodes.
  5. The multiverse automaton has a model consisting of informational substrate, an updating parameter, a few simple rules, and a method for deriving all of quantum field theory and general relativity theory.
  6. The totally finite nature of the model implies the existence of weird, alternate-universe forces that might, or might not, be too small for empirical detection.
  • Rudy Rucker. In his book Mind Tools (1987),[4] mathematician and philosopher Rudy Rucker articulated this concept with the following conclusions about the relationship between mathematics and the universe. Rucker's second conclusion uses the term 'fact-space', his model of reality based on the notion that all that exists is the perceptions of various observers. An entity of any kind is a glob in fact-space. The world – the collection of all thoughts and objects – is a pattern spread out through fact-space. The following conclusions describe the digital philosophy that relates the world to fact-space.
  1. The world can be resolved into digital bits, with each bit made of smaller bits.
  2. These bits form a fractal pattern in fact-space.
  3. The pattern behaves like a cellular automaton.
  4. The pattern is inconceivably large in size and dimensions.
  5. Although the world started simply, its computation is irreducibly complex.

Fredkin's ideas on physics

Fredkin takes a radical approach to explaining the EPR paradox and the double-slit experiment in quantum mechanics. While admitting that quantum mechanics yields accurate predictions, Fredkin sides with Einstein in the Bohr-Einstein debates. In The Meaning of Relativity, Einstein writes, "One can give good reasons why reality cannot at all be represented by a continuous field. From the quantum phenomena it appears to follow with certainty that a finite system of finite energy can be completely described by a finite set of numbers (quantum numbers). This does not seem to be in accordance with a continuum theory, and must lead to attempts to find a purely algebraic theory for the description of reality. However, nobody knows how to find the basis for such a description."

Einstein's hope was for a purely algebraic theory; Fredkin, by contrast, attempts to find a purely informational theory for the description of reality. At the same time, physicists note vagueness, problems of compatibility with Bell's theorem, and a lack of empirical falsifiability in Fredkin's expression of his ideas. In "Digital Philosophy (DP)", Chapter 11,[5] Fredkin raises the question, "Could physics have a strong law of conservation of information?" Fredkin answers his own question, "If so, we have to rethink particle disintegrations, inelastic collisions and Quantum Mechanics to better understand what is happening to the information. The appearance of a single truly random event is absolutely incompatible with a strong law of conservation of information. A great deal of information is obviously associated with the trajectory of every particle and that information must be conserved. This is a very large issue in DP, yet such issues are seldom considered in conventional physics."

Fredkin's "five big questions with pretty simple answers"

According to Fredkin,[6] "Digital mechanics predicts that for every continuous symmetry of physics there will be some microscopic process that violates that symmetry." Therefore, according to Fredkin, at the Planck scale ordinary matter could have spin angular momentum that violates the equivalence principle. There might be weird Fredkin forces that cause a torsion in spacetime.

The Einstein–Cartan theory extends general relativity theory to deal with spin-orbit coupling when matter with spin is present. According to conventional wisdom in physics, torsion is nonpropagating, which means that torsion will appear within a massive body and nowhere else. According to Fredkin, torsion could appear outside and around massive bodies, because alternate universes have anomalous inertial effects.

References

  1. ^ a b Fredkin, Edward (2003). "An Introduction to Digital Philosophy". International Journal of Theoretical Physics. 42 (2): 189–247. doi:10.1023/A:1024443232206.
  2. ^ Fredkin, E. (1992). Finite Nature (PDF). Proceedings of the XXVIIth Rencontre de Moriond. Archived from the original (PDF) on 2013-08-29.
  3. ^ Wolfram, Stephen (2002). A New Kind of Science. Wolfram Media. ISBN 1-57955-008-8.
  4. ^ Rucker, Rudy (1987). Mind Tools: The Five Levels of Mathematical Reality. Houghton Mifflin.
  5. ^ Fredkin, Edward. "Digital Philosophy". Archived from the original on 2014-09-28.
  6. ^ Fredkin, E. (January 2004). "Five big questions with pretty simple answers". IBM Journal of Research and Development. 48 (1). doi:10.1147/rd.481.0031.

Digital physics

In physics and cosmology, digital physics is a collection of theoretical perspectives based on the premise that the universe is describable by information. It is a form of digital ontology about physical reality. According to this theory, the universe can be conceived of as the output of a deterministic or probabilistic computer program, as a vast digital computation device, or as mathematically isomorphic to such a device.

Digital probabilistic physics

Digital probabilistic physics is a branch of digital philosophy which holds that the universe exists as a nondeterministic state machine. The notion of the universe existing as a state machine was first put forward in Konrad Zuse's book Rechnender Raum. Adherents hold that the universe state machine can move between more and less probable states, with the less probable states containing more information. This theory is in contrast to digital physics, which holds that the history of the universe is computable and unfolds deterministically from initial conditions.

The fundamental tenets of digital probabilistic physics were first explored at length by Tom Stonier in a series of books that treat information as a physical phenomenon of the universe. According to Stonier, the arrangement of atoms and molecules that make up physical objects contains information, and high-information objects such as DNA are low-probability physical structures. Within this framework, civilization itself is a low-probability construct maintaining its existence by propagating through communication. Stonier's work is unusual in treating information as a physical phenomenon in its own right, rather than merely as a concept applied to the domain of telecommunications.

To distinguish the probability of the physical state of the molecules from the probability of the energy distribution of thermodynamics, the term extropy was appropriated to denote the probability of the atomic configuration, as opposed to the entropy. Thus, in thermodynamics a 'coarse-grained' set of partitions is defined that groups together macroscopically similar but microscopically different states, whereas in digital probabilistic physics the probability of the specific microscopic state is considered on its own. The extropy is defined to be the self-information of the Markov chain describing the physical system.

The extropy, in bits, of a system associated with a Markov chain configuration whose outcome has probability p is[citation needed]:

I(p) = −log₂ p
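Since the extropy is defined as the self-information, it can be computed directly from the outcome probability; a small sketch (the function name is illustrative, not from the sources above):

```python
import math

def self_information_bits(p):
    """Self-information (here: extropy) in bits of an outcome with probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# A fair coin flip carries 1 bit; a 1-in-1024 outcome carries 10 bits,
# illustrating that less probable states carry more information.
print(self_information_bits(0.5))        # 1.0
print(self_information_bits(1 / 1024))   # 10.0
```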

Within this philosophy, the probability of the physical system does not necessarily change with the deterministic flow of energy through the atomic framework; rather, the system moves into a less probable state when it goes through a bifurcating transition. Examples of this include Bernoulli cell formation, quantum fluctuations in a gravitational field causing gravitational precipitation points, and other systems moving through unstable self-amplifying state transitions.


Digitality

Digitality (also known as digitalism) is used to mean the condition of living in a digital culture, derived from Nicholas Negroponte's book Being Digital in analogy with modernity and post-modernity.

Edward Fredkin

Edward Fredkin (born October 2, 1934) is a distinguished career professor at Carnegie Mellon University (CMU), Pennsylvania, and an early pioneer of digital physics. Fredkin's primary contributions include his work on reversible computing and cellular automata. While Konrad Zuse's book, Calculating Space (1969), mentioned the importance of reversible computation, the Fredkin gate represented the essential breakthrough. In recent work, he uses the term digital philosophy (DP).
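The Fredkin gate mentioned above is a controlled swap: the control bit passes through unchanged, and the two data bits are exchanged when the control is 1. Because no information is destroyed, the gate is its own inverse, and it is logically universal. A minimal sketch (function names are illustrative):

```python
def fredkin(c, a, b):
    """Fredkin (controlled-SWAP) gate: swap a and b iff control bit c is 1."""
    return (c, b, a) if c else (c, a, b)

# Reversible: applying the gate twice restores the inputs exactly.
for bits in [(0, 0, 1), (1, 0, 1), (1, 1, 0)]:
    assert fredkin(*fredkin(*bits)) == bits

# Universal: AND falls out by fixing the second data input to 0,
# since the third output is then y when x is 1, and 0 otherwise.
def and_gate(x, y):
    _, _, out = fredkin(x, y, 0)
    return out

print(and_gate(1, 1), and_gate(1, 0))  # 1 0
```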

During his career, Fredkin served on the faculty of the Massachusetts Institute of Technology in computer science, was a Fairchild Distinguished Scholar at Caltech, and was Research Professor of Physics at Boston University, Massachusetts.

Fredkin finite nature hypothesis

In digital physics, the Fredkin finite nature hypothesis states that ultimately all quantities of physics, including space and time, are discrete and finite. All measurable physical quantities arise from some Planck scale substrate for information processing. Also, the amount of information in any small volume of spacetime will be finite and equal to a small number of possibilities.

Generative science

Generative science is an area of research that explores the natural world and its complex behaviours. It explores ways "to generate apparently unanticipated and infinite behaviour based on deterministic and finite rules and parameters reproducing or resembling the behavior of natural and social phenomena". By modelling such interactions, it can suggest that properties exist in the system that had not been noticed in the real world situation. An example field of study is how unintended consequences arise in social processes.

Generative sciences often explore natural phenomena at several levels of organization. Self-organizing natural systems are a central subject, studied both theoretically and by simulation experiments. The study of complex systems in general has been grouped under the heading of "general systems theory", particularly by Ludwig von Bertalanffy, Anatol Rapoport, Ralph Gerard, and Kenneth Boulding.

These sciences include psychology and cognitive science, cellular automata, generative linguistics, natural language processing, connectionism, self-organization, evolutionary biology, neural network, social network, neuromusicology, quantum cellular automata, information theory, systems theory, genetic algorithms, computational sociology, communication networks, artificial life, chaos theory, complexity theory, network science, epistemology, quantum dot cellular automaton, quantum computer, systems thinking, genetics, economy, philosophy of science, quantum mechanics, cybernetics, digital physics, digital philosophy, bioinformatics, agent-based modeling and catastrophe theory.

Gottfried Wilhelm Leibniz

Gottfried Wilhelm (von) Leibniz (sometimes spelled Leibnitz) (; German: [ˈɡɔtfʁiːt ˈvɪlhɛlm fɔn ˈlaɪbnɪts] or [ˈlaɪpnɪts]; French: Godefroi Guillaume Leibnitz; 1 July 1646 [O.S. 21 June] – 14 November 1716) was a prominent German (of Slavic origin) polymath and philosopher in the history of mathematics and the history of philosophy. His most notable accomplishment was conceiving the ideas of differential and integral calculus, independently of Isaac Newton's contemporaneous developments. Mathematical works have always favored Leibniz's notation as the conventional expression of calculus, while Newton's notation became unused. It was only in the 20th century that Leibniz's law of continuity and transcendental law of homogeneity found mathematical implementation (by means of non-standard analysis). He became one of the most prolific inventors in the field of mechanical calculators. While working on adding automatic multiplication and division to Pascal's calculator, he was the first to describe a pinwheel calculator in 1685 and invented the Leibniz wheel, used in the arithmometer, the first mass-produced mechanical calculator. He also refined the binary number system, which is the foundation of all digital computers.

In philosophy, Leibniz is most noted for his optimism, i.e. his conclusion that our universe is, in a restricted sense, the best possible one that God could have created, an idea that was often lampooned by others such as Voltaire. Leibniz, along with René Descartes and Baruch Spinoza, was one of the three great 17th-century advocates of rationalism. The work of Leibniz anticipated modern logic and analytic philosophy, but his philosophy also looks back to the scholastic tradition, in which conclusions are produced by applying reason to first principles or prior definitions rather than to empirical evidence.

Leibniz made major contributions to physics and technology, and anticipated notions that surfaced much later in philosophy, probability theory, biology, medicine, geology, psychology, linguistics, and computer science. He wrote works on philosophy, politics, law, ethics, theology, history, and philology. Leibniz also contributed to the field of library science. While serving as overseer of the Wolfenbüttel library in Germany, he devised a cataloging system that would serve as a guide for many of Europe's largest libraries. Leibniz's contributions to this vast array of subjects were scattered in various learned journals, in tens of thousands of letters, and in unpublished manuscripts. He wrote in several languages, but primarily in Latin, French, and German. There is no complete gathering of the writings of Leibniz translated into English.

Gregory Chaitin

Gregory John Chaitin (CHY-tin; born 25 June 1947) is an Argentine-American mathematician and computer scientist. Beginning in the late 1960s, Chaitin made contributions to algorithmic information theory and metamathematics, in particular a computer-theoretic result equivalent to Gödel's incompleteness theorem. He is considered to be one of the founders of what is today known as Kolmogorov (or Kolmogorov-Chaitin) complexity together with Andrei Kolmogorov and Ray Solomonoff. Today, algorithmic information theory is a common subject in any computer science curriculum.


Konrad Zuse

Konrad Zuse (German: [ˈkɔnʁat ˈtsuːzə]; 22 June 1910 – 18 December 1995) was a German civil engineer, inventor and computer pioneer. His greatest achievement was the world's first programmable computer; the functional program-controlled Turing-complete Z3 became operational in May 1941. Thanks to this machine and its predecessors, Zuse has often been regarded as the inventor of the modern computer. Zuse was also noted for the S2 computing machine, considered the first process control computer. He founded one of the earliest computer businesses in 1941, producing the Z4, which became the world's first commercial computer. From 1943 to 1945 he designed the first high-level programming language, Plankalkül. In 1969, Zuse suggested the concept of a computation-based universe in his book Rechnender Raum (Calculating Space).

Much of his early work was financed by his family and commerce, but after 1939 he was given resources by the Nazi German government. Due to World War II, Zuse's work went largely unnoticed in the United Kingdom and the United States. Possibly his first documented influence on a US company was IBM's option on his patents in 1946.

There is a replica of the Z3, as well as the original Z4, in the Deutsches Museum in Munich. The Deutsches Technikmuseum in Berlin has an exhibition devoted to Zuse, displaying twelve of his machines, including a replica of the Z1 and several of Zuse's paintings.

List of philosophies

Philosophies: particular schools of thought, styles of philosophy, or descriptions of philosophical ideas attributed to a particular group or culture, listed in alphabetical order.


Mathematicism

Mathematicism is any opinion, viewpoint, school of thought, or philosophy that states that everything can be described/defined/modelled ultimately by mathematics, or that the universe and reality (both material and mental/spiritual) are fundamentally/fully/only mathematical, i.e. that 'everything is mathematics' necessitating the ideas of logic, reason, mind, and spirit.

Mechanism (philosophy)

Mechanism is the belief that natural wholes (principally living things) are like complicated machines or artefacts, composed of parts lacking any intrinsic relationship to each other. Thus, the source of an apparent thing's activities is not the whole itself, but its parts or an external influence on the parts. The doctrine of mechanism in philosophy comes in two different flavors. They are both doctrines of metaphysics, but they are different in scope and ambitions: the first is a global doctrine about nature; the second is a local doctrine about humans and their minds, which is hotly contested. For clarity, we might distinguish these two doctrines as universal mechanism and anthropic mechanism.

There is no constant meaning in the history of philosophy for the word Mechanism. Originally, the term meant that cosmological theory which ascribes the motion and changes of the world to some external force. In this view material things are purely passive, while according to the opposite theory (i.e., Dynamism), they possess certain internal sources of energy which account for the activity of each and for its influence on the course of events. These meanings, however, soon underwent modification. The question as to whether motion is an inherent property of bodies, or has been communicated to them by some external agency, was very often ignored. With a large number of cosmologists the essential feature of Mechanism is the attempt to reduce all the qualities and activities of bodies to quantitative realities, i.e. to mass and motion. But a further modification soon followed. Living bodies, as is well known, present at first sight certain characteristic properties which have no counterpart in lifeless matter. Mechanism aims to go beyond these appearances. It seeks to explain all "vital" phenomena as physical and chemical facts; whether or not these facts are in turn reducible to mass and motion becomes a secondary question, although Mechanists are generally inclined to favour such reduction. The theory opposed to this biological mechanism is no longer Dynamism, but Vitalism or Neo-vitalism, which maintains that vital activities cannot be explained, and never will be explained, by the laws which govern lifeless matter.

Norman Margolus

Norman H. Margolus (born 1955) is a Canadian-American physicist and computer scientist, known for his work on cellular automata and reversible computing. He is a research affiliate with the Computer Science and Artificial Intelligence Laboratory at the Massachusetts Institute of Technology. Margolus was one of the organizers of a seminal research meeting on the connections between physics and computation theory, held on Mosquito Island in 1982. He is known for inventing the block cellular automaton and the Margolus neighborhood for block cellular automata, which he used to develop cellular automaton simulations of billiard-ball computers. In the same work, Margolus also showed that the billiard ball model could be simulated by a second order cellular automaton, a different type of cellular automaton invented by his thesis advisor, Edward Fredkin. These two simulations were among the first cellular automata that were both reversible (able to be run backwards as well as forwards for any number of time steps, without ambiguity) and universal (able to simulate the operations of any computer program); this combination of properties is important in low-energy computing, as it has been shown that the energy dissipation of computing devices may be made arbitrarily small if and only if they are reversible. In connection with this issue, Margolus and his co-author Lev B. Levitin proved the Margolus–Levitin theorem showing that the speed of any computer is limited by the fundamental laws of physics to be at most proportional to its energy use; this implies that ultra-low-energy computers must run more slowly than conventional computers. With Tommaso Toffoli, Margolus developed the CAM-6 cellular automaton simulation hardware, which he extensively described in his book with Toffoli, Cellular Automata Machines (MIT Press, 1987), and with Tom Knight he developed the "Flattop" integrated circuit implementation of billiard-ball computation.
He has also done pioneering research on the reversible quantum gate logic needed to support quantum computers. Margolus received his Ph.D. in physics in 1987 from MIT, under the supervision of Edward Fredkin. He founded and was chief scientist for Permabit, an information storage device company.
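The reversibility of the second-order cellular automata described above has a simple construction: the next row is the rule applied to the current row, XORed with the previous row, which makes any rule exactly invertible by running the same update with the roles of the rows swapped. A minimal sketch (rule number and initial row are arbitrary illustrative choices):

```python
def step2(curr, prev, rule=90):
    """Second-order CA step: next row = rule(current neighborhood) XOR previous row."""
    padded = [0] + curr + [0]
    nxt = []
    for i in range(len(curr)):
        idx = (padded[i] << 2) | (padded[i + 1] << 1) | padded[i + 2]
        nxt.append(((rule >> idx) & 1) ^ prev[i])
    return nxt, curr          # the new (current, previous) pair

a, b = [0, 1, 1, 0, 1, 0, 0, 1], [0] * 8
start = (a, b)

for _ in range(10):           # run forward ten steps
    a, b = step2(a, b)

a, b = b, a                   # swap the pair: time now runs backward
for _ in range(10):
    a, b = step2(a, b)
a, b = b, a                   # swap back into forward order

print((a, b) == start)        # True: the initial state is recovered exactly
```

Because x(t+1) = f(x(t)) XOR x(t-1) implies x(t-1) = f(x(t)) XOR x(t+1), no information is ever lost, regardless of which rule f is chosen.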

Object-oriented ontology

In metaphysics, object-oriented ontology (OOO) is a 21st-century Heidegger-influenced school of thought that rejects the privileging of human existence over the existence of nonhuman objects. This is in contrast to what it calls the "anthropocentrism" of Kant's Copernican Revolution, as accepted by most other current metaphysics, in which phenomenal objects are said to conform to the mind of the subject and, in turn, become products of human cognition. Object-oriented ontology maintains that objects exist independently (as Kantian noumena) of human perception and are not ontologically exhausted by their relations with humans or other objects. Thus, for object-oriented ontologists, all relations, including those between nonhumans, distort their related objects in the same basic manner as human consciousness and exist on an equal footing with one another. Object-oriented ontology is often viewed as a subset of speculative realism, a contemporary school of thought that criticizes the post-Kantian reduction of philosophical enquiry to a correlation between thought and being, such that the reality of anything outside of this correlation is unknowable. Object-oriented ontology predates speculative realism, however, and makes distinct claims about the nature and equality of object relations to which not all speculative realists agree. The term "object-oriented philosophy" was coined by Graham Harman, the movement's founder, in his 1999 doctoral dissertation "Tool-Being: Elements in a Theory of Objects". In 2009, Levi Bryant rephrased Harman's original designation as "object-oriented ontology", giving the movement its current name.


PhilPapers

PhilPapers is an international, interactive academic database of journal articles for professionals and students in philosophy. It is maintained by the Centre for Digital Philosophy at the University of Western Ontario.

As of 2018, the general editors are David Bourget (University of Western Ontario) and David Chalmers (ANU and NYU).

PhilPapers receives financial support from other organizations, including a substantial grant in early 2009 from the Joint Information Systems Committee in the United Kingdom. The archive is praised for its comprehensiveness and organization, and for its regular updates. In addition to archiving papers, the editors engage in surveying academic philosophers.

Philosophy of information

The philosophy of information (PI) is a branch of philosophy that studies topics relevant to computer science, information science and information technology.

It includes:

the critical investigation of the conceptual nature and basic principles of information, including its dynamics, utilisation and sciences

the elaboration and application of information-theoretic and computational methodologies to philosophical problems.

Philosophy of physics

In philosophy, philosophy of physics deals with conceptual and interpretational issues in modern physics, and often overlaps with research done by certain kinds of theoretical physicists. Philosophy of physics can be very broadly lumped into three main areas:

The interpretations of quantum mechanics: Concerning issues, mainly, with how to formulate an adequate response to the measurement problem, and understand what the theory tells us about reality.

The nature of space and time: Are space and time substances, or purely relational? Is simultaneity conventional or just relative? Is temporal asymmetry purely reducible to thermodynamic asymmetry?

Inter-theoretic relations: the relationship between various physical theories, such as thermodynamics and statistical mechanics. This overlaps with the issue of scientific reduction.

Simulated reality

Simulated reality is the hypothesis that reality could be simulated—for example by quantum computer simulation—to a degree indistinguishable from "true" reality. It could contain conscious minds which may or may not be fully aware that they are living inside a simulation. This is quite different from the current, technologically achievable concept of virtual reality. Virtual reality is easily distinguished from the experience of actuality; participants are never in doubt about the nature of what they experience. Simulated reality, by contrast, would be hard or impossible to separate from "true" reality. There has been much debate over this topic, ranging from philosophical discourse to practical applications in computing.

This page is based on a Wikipedia article written by its contributors.
Text is available under the CC BY-SA 3.0 license; additional terms may apply.
Images, videos and audio are available under their respective licenses.