Determinism is the philosophical idea that all events, including moral choices, are determined completely by previously existing causes. Determinism is at times understood to preclude free will because it entails that humans cannot act otherwise than they do; understood in this way, it is called hard determinism, a position on the relationship of determinism to free will. The theory holds that the universe is utterly rational, because complete knowledge of any given situation assures that unerring knowledge of its future is also possible. Some philosophers propose variants of this basic definition. Deterministic theories throughout the history of philosophy have sprung from diverse and sometimes overlapping motives and considerations. The opposite of determinism is some kind of indeterminism (otherwise called nondeterminism). Determinism is often contrasted with free will.
Determinism often is taken to mean causal determinism, which in physics is known as cause-and-effect. It is the concept that events within a given paradigm are bound by causality in such a way that any state (of an object or event) is completely determined by prior states. This meaning can be distinguished from other varieties of determinism mentioned below.
Other debates often concern the scope of determined systems, with some maintaining that the entire universe is a single determinate system and others identifying other more limited determinate systems (or multiverse). Numerous historical debates involve many philosophical positions and varieties of determinism. They include debates concerning determinism and free will, technically denoted as compatibilistic (allowing the two to coexist) and incompatibilistic (denying their coexistence is a possibility). Determinism should not be confused with self-determination of human actions by reasons, motives, and desires. Determinism rarely requires that perfect prediction be practically possible.
"Determinism" may commonly refer to any of the following viewpoints:
Although some of the above forms of determinism concern human behaviors and cognition, others frame themselves as an answer to the debate on nature and nurture, suggesting that a single factor entirely determines behavior. As scientific understanding has grown, however, the strongest versions of these theories have been widely rejected as committing the single-cause fallacy.
Modern deterministic theories instead attempt to explain how the interaction of both nature and nurture is entirely predictable. The concept of heritability has been helpful in making this distinction.
Environmental determinism, also known as climatic or geographical determinism, proposes that the physical environment, rather than social conditions, determines culture. Supporters of environmental determinism often also support behavioral determinism. Key proponents of this notion have included Ellen Churchill Semple, Ellsworth Huntington, Thomas Griffith Taylor and possibly Jared Diamond, although his status as an environmental determinist is debated.
Other 'deterministic' theories actually seek only to highlight the importance of a particular factor in predicting the future. These theories often use the factor as a sort of guide or constraint on the future. They need not suppose that complete knowledge of that one factor would allow us to make perfect predictions.
Psychological determinism can mean that humans must act according to reason, but it can also be synonymous with some sort of psychological egoism. The latter is the view that humans will always act according to their perceived best interest.
Linguistic determinism claims that our language determines (at least limits) the things we can think and say and thus know. The Sapir–Whorf hypothesis argues that individuals experience the world based on the grammatical structures they habitually use.
Technological determinism is a reductionist theory that presumes that a society's technology drives the development of its social structure and cultural values.
Philosophers have debated both the truth of determinism, and the truth of free will. This creates the four possible positions in the figure. Compatibilism refers to the view that free will is, in some sense, compatible with determinism. The three incompatibilist positions, on the other hand, deny this possibility. The hard incompatibilists hold that neither determinism nor free will exists, the libertarians that determinism does not hold and free will may exist, and the hard determinists that determinism does hold and free will does not exist.
The Dutch philosopher Baruch Spinoza was a determinist thinker, and argued that human freedom can be achieved through knowledge of the causes that determine our desires and affections. He defined human servitude as the state of bondage of the man who is aware of his own desires, but ignorant of the causes that determined him. The free or virtuous man, on the other hand, becomes capable, through reason and knowledge, of being genuinely free, even as he is being "determined". For the Dutch philosopher, acting out of our own internal necessity is genuine freedom, while being driven by exterior determinations is akin to bondage. Spinoza's thoughts on human servitude and liberty are detailed, respectively, in the fourth and fifth parts of his Ethics.
The standard argument against free will, according to philosopher J. J. C. Smart, focuses on the implications of determinism for free will. However, he suggests free will is denied whether determinism is true or not. On one hand, if determinism is true, all our actions are predetermined and we are assumed not to be free; on the other hand, if determinism is false, our actions are presumed to be random, and as such we do not seem free because we had no part in controlling what happened.
In his book The Moral Landscape, author and neuroscientist Sam Harris also argues against free will. He offers one thought experiment where a mad scientist represents determinism. In Harris' example, the mad scientist uses a machine to control all the desires, and thus all the behavior, of a particular human. Harris believes that it is no longer as tempting, in this case, to say the victim has "free will". Harris says nothing changes if the machine controls desires at random: the victim still seems to lack free will. Harris then argues that we are also the victims of such unpredictable desires (but due to the unconscious machinations of our brain, rather than those of a mad scientist). Based on this introspection, he writes "This discloses the real mystery of free will: if our experience is compatible with its utter absence, how can we say that we see any evidence for it in the first place?" adding that "Whether they are predictable or not, we do not cause our causes." That is, he believes there is compelling evidence of absence of free will. Harris argues at greater length against both libertarian and compatibilist views in his later book Free Will.
Some determinists argue that materialism does not present a complete understanding of the universe, because while it can describe determinate interactions among material things, it ignores the minds or souls of conscious beings.
A number of positions can be delineated:
Another topic of debate is the implications that determinism has for morality. Hard determinism (a belief in determinism, and not free will) is particularly criticized for seeming to make traditional moral judgments impossible. Some philosophers, however, find this an acceptable conclusion.
Philosopher and incompatibilist Peter van Inwagen introduces this thesis as such:
Argument that Free Will is Required for Moral Judgments
However, a compatibilist might take issue with van Inwagen's argument, because it centers on the impossibility of changing the past. A compatibilist who focuses instead on plans for the future might posit:
Determinism was developed during the 7th and 6th centuries BC by the Greek Pre-Socratic philosophers Heraclitus and Leucippus, later by Aristotle, and mainly by the Stoics. Some of the main philosophers who have dealt with this issue are Marcus Aurelius, Omar Khayyám, Thomas Hobbes, Baruch Spinoza, Gottfried Leibniz, David Hume, Baron d'Holbach (Paul Heinrich Dietrich), Pierre-Simon Laplace, Arthur Schopenhauer, William James, Friedrich Nietzsche, Albert Einstein, Niels Bohr, Ralph Waldo Emerson and, more recently, John Searle, Sam Harris, Ted Honderich, and Daniel Dennett.
Mecca Chiesa notes that the probabilistic or selectionistic determinism of B.F. Skinner comprised a wholly separate conception of determinism that was not mechanistic at all. Mechanistic determinism assumes that every event has an unbroken chain of prior occurrences, but a selectionistic or probabilistic model does not.
In the West, some elements of determinism have been expressed in Greece from the 6th century BC by the Presocratics Heraclitus and Leucippus. The first full-fledged notion of determinism appears to originate with the Stoics, as part of their theory of universal causal determinism. The resulting philosophical debates, which involved the confluence of elements of Aristotelian Ethics with Stoic psychology, led in the 1st-3rd centuries CE in the works of Alexander of Aphrodisias to the first recorded Western debate over determinism and freedom, an issue that is known in theology as the paradox of free will. The writings of Epictetus as well as Middle Platonist and early Christian thought were instrumental in this development. The Jewish philosopher Moses Maimonides said of the deterministic implications of an omniscient god: "Does God know or does He not know that a certain individual will be good or bad? If thou sayest 'He knows', then it necessarily follows that [that] man is compelled to act as God knew beforehand he would act, otherwise God's knowledge would be imperfect.…"
Determinism in the West is often associated with Newtonian physics, which depicts the physical matter of the universe as operating according to a set of fixed, knowable laws. The "billiard ball" hypothesis, a product of Newtonian physics, argues that once the initial conditions of the universe have been established, the rest of the history of the universe follows inevitably. If it were actually possible to have complete knowledge of physical matter and all of the laws governing that matter at any one time, then it would be theoretically possible to compute the time and place of every event that will ever occur (Laplace's demon). In this sense, the basic particles of the universe operate in the same fashion as the rolling balls on a billiard table, moving and striking each other in predictable ways to produce predictable results.
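Laplace's demon can be made concrete with a toy simulation. The dynamics below (a crude projectile under gravity, stepped with Euler's method) are an illustrative choice, not Laplace's own formulation; the point is only that a fixed update rule plus complete knowledge of the initial state determines every later state, so re-running the "universe" from the same state always produces the same future.

```python
# A toy "Laplace's demon": a deterministic dynamical system in which
# complete knowledge of the state at t=0 fixes the state at every later time.
def step(state, dt=0.01, g=-9.81):
    # One Euler step for a projectile; illustrative dynamics only.
    x, y, vx, vy = state
    return (x + vx * dt, y + vy * dt, vx, vy + g * dt)

def evolve(state, steps):
    for _ in range(steps):
        state = step(state)
    return state

initial = (0.0, 0.0, 30.0, 40.0)   # position and velocity, fully known
# Re-running from the same complete state always yields the same future.
assert evolve(initial, 1000) == evolve(initial, 1000)
```

Any uncertainty in the outcome of such a system can only come from uncertainty about the initial state, never from the rule itself.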
Whether or not it is all-encompassing in so doing, Newtonian mechanics deals only with caused events, e.g.: If an object begins in a known position and is hit dead on by an object with some known velocity, then it will be pushed straight toward another predictable point. If it goes somewhere else, the Newtonians argue, one must question one's measurements of the original position of the object, the exact direction of the striking object, gravitational or other fields that were inadvertently ignored, etc. Then, they maintain, repeated experiments and improvements in accuracy will always bring one's observations closer to the theoretically predicted results. When dealing with situations on an ordinary human scale, Newtonian physics has been so enormously successful that it has no competition. But it fails spectacularly as velocities become some substantial fraction of the speed of light and when interactions at the atomic scale are studied. Before the discovery of quantum effects and other challenges to Newtonian physics, "uncertainty" was always a term that applied to the accuracy of human knowledge about causes and effects, and not to the causes and effects themselves.
Newtonian mechanics, as well as the physical theories that followed it, are the results of observations and experiments, and so they describe "how it all works" within a tolerance. However, early Western scientists believed that if any logical connections were found between an observed cause and effect, there must also be some absolute natural laws behind them. Belief in perfect natural laws driving everything, rather than merely describing what we should expect, led to the search for a set of universal simple laws that rule the world. This movement significantly encouraged deterministic views in Western philosophy, as well as the related theological views of classical pantheism.
The idea that the entire universe is a deterministic system has been articulated in both Eastern and non-Eastern religion, philosophy, and literature.
In the philosophical schools of India, the concept of precise and continual effect of laws of karma on the existence of all sentient beings is analogous to the Western concept of determinism. Karma is the concept of "action" or "deed" in Indian religions. It is understood as that which causes the entire cycle of cause and effect (i.e., the cycle called saṃsāra), originating in ancient India and treated in Hindu, Jain, and Sikh philosophies. Karma is considered predetermined and deterministic in the universe and, in combination with the decisions (free will) of living beings, accumulates to determine the future situations that the living being encounters. See Karma in Hinduism.
Although it was once thought by scientists that any indeterminism in quantum mechanics occurred at too small a scale to influence biological or neurological systems, there is indication that nervous systems are influenced by quantum indeterminism due to chaos theory. It is unclear what implications this has for the problem of free will, given the various possible reactions to the problem in the first place. Many biologists do not grant determinism: Christof Koch, for instance, argues against it, and in favour of libertarian free will, by making arguments based on generative processes (emergence). Other proponents of emergentist or generative philosophy, the cognitive sciences, and evolutionary psychology argue that a certain form of determinism (not necessarily causal) is true. They suggest instead that an illusion of free will is experienced due to the generation of infinite behaviour from the interaction of a finite, deterministic set of rules and parameters. Thus the unpredictability of behaviour emerging from deterministic processes leads to a perception of free will, even though free will as an ontological entity does not exist. Certain experiments looking at the neuroscience of free will can be said to support this possibility.
As an illustration, the strategy board games chess and Go have rigorous rules in which no information (such as cards' face values) is hidden from either player and no random events (such as dice rolls) happen within the game. Yet chess, and especially Go with its extremely simple deterministic rules, can still produce an extremely large number of unpredictable moves. When chess is simplified to 7 or fewer pieces, however, endgame tablebases are available that dictate which moves to play to achieve a perfect game. This implies that, given a less complex environment (with the original 32 pieces reduced to 7 or fewer), a perfectly predictable game of chess is possible. In this scenario, the winning player can announce that a checkmate will happen within a given number of moves, assuming a perfect defense by the losing player, or fewer moves if the defending player chooses sub-optimal moves as the game progresses into its inevitable, predicted conclusion. By this analogy, it is suggested, the experience of free will emerges from the interaction of finite rules and deterministic parameters that generate nearly infinite and practically unpredictable behavioural responses. In theory, if all these events could be accounted for, and there were a known way to evaluate them, the seemingly unpredictable behaviour would become predictable. Another hands-on example of generative processes is John Horton Conway's playable Game of Life. Nassim Taleb is wary of such models, and coined the term "ludic fallacy".
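The Game of Life mentioned above makes the generative point concrete. In the minimal sketch below (the coordinate convention and the "glider" pattern are standard illustrative choices), every generation follows from the previous one by a fixed rule, yet the global behaviour is hard to foresee without simply running the system:

```python
from collections import Counter

# Conway's Game of Life: a fully deterministic rule set whose global
# behaviour is nevertheless hard to predict without simulation.
def step(live):
    """Advance one generation; `live` is a set of (x, y) live cells."""
    # Count the live neighbours of every cell adjacent to a live cell.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 live neighbours; survival on 2 or 3.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)

# After 4 fully determined generations, the glider reappears shifted by (1, 1).
assert state == {(x + 1, y + 1) for (x, y) in glider}
```

Nothing in the rule mentions "gliders"; the moving pattern is emergent behaviour of a deterministic system.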
Certain philosophers of science argue that, while causal determinism (in which everything including the brain/mind is subject to the laws of causality) is compatible with minds capable of science, fatalism and predestination are not. These philosophers draw the distinction that causal determinism means that each step is determined by the step before, and therefore allows sensory input from observational data to determine what conclusions the brain reaches, while fatalism, in which the steps between do not connect an initial cause to the results, would make it impossible for observational data to correct false hypotheses. This is often combined with the argument that if the brain had fixed views and arguments were mere after-constructs with no causal effect on the conclusions, science would have been impossible and the use of arguments would have been a meaningless waste of energy with no persuasive effect on brains with fixed views.
Many mathematical models of physical systems are deterministic. This is true of most models involving differential equations (notably, those describing rates of change over time). Mathematical models that are not deterministic because they involve randomness are called stochastic. Because of sensitive dependence on initial conditions, some deterministic models may appear to behave non-deterministically; in such cases, a deterministic interpretation of the model may not be useful due to numerical instability and a finite amount of precision in measurement. Such considerations can motivate the use of a stochastic model even though the underlying system is governed by deterministic equations.
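A standard illustration of sensitive dependence is the logistic map (the starting values below are arbitrary illustrative choices). The model is exact and perfectly reproducible, yet a perturbation of one part in a billion in the initial condition soon produces a completely different trajectory:

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n): deterministic, and chaotic at r = 4.
def trajectory(x0, steps, r=4.0):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.300000000, 50)
b = trajectory(0.300000001, 50)   # initial condition perturbed by 1e-9

# Determinism: re-running the model reproduces the trajectory exactly.
assert trajectory(0.300000000, 50) == a
# Sensitive dependence: the trajectories start indistinguishably close...
assert abs(a[1] - b[1]) < 1e-8
# ...but the tiny perturbation is amplified until they bear no resemblance.
assert max(abs(x - y) for x, y in zip(a, b)) > 0.1
```

This is exactly the situation described above: the equations are deterministic, but any finite precision in measuring the initial state makes long-range prediction impossible in practice.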
Since the beginning of the 20th century, quantum mechanics—the physics of the extremely small—has revealed previously concealed aspects of events. Before that, Newtonian physics—the physics of everyday life—dominated. Taken in isolation (rather than as an approximation to quantum mechanics), Newtonian physics depicts a universe in which objects move in perfectly determined ways. At the scale where humans exist and interact with the universe, Newtonian mechanics remain useful, and make relatively accurate predictions (e.g. calculating the trajectory of a bullet). But whereas in theory, absolute knowledge of the forces accelerating a bullet would produce an absolutely accurate prediction of its path, modern quantum mechanics casts reasonable doubt on this main thesis of determinism.
Also relevant is the fact that certainty is never absolute in practice (and not just because of David Hume's problem of induction). The equations of Newtonian mechanics can exhibit sensitive dependence on initial conditions. This is an example of the butterfly effect, which is one of the subjects of chaos theory. The idea is that something even as small as a butterfly could cause a chain reaction leading to a hurricane years later. Consequently, even a very small error in knowledge of initial conditions can result in arbitrarily large deviations from predicted behavior. Chaos theory thus explains why it may be practically impossible to predict real life, whether determinism is true or false. On the other hand, the issue may not be so much about human abilities to predict or attain certainty as much as it is the nature of reality itself. For that, a closer, scientific look at nature is necessary.
Quantum physics works differently in many ways from Newtonian physics. Physicist Aaron D. O'Connell explains that understanding our universe, at such small scales as atoms, requires a different logic than day-to-day life does. O'Connell does not deny that it is all interconnected: the scale of human existence ultimately does emerge from the quantum scale. O'Connell argues that we must simply use different models and constructs when dealing with the quantum world. Quantum mechanics is the product of a careful application of the scientific method, logic and empiricism. The Heisenberg uncertainty principle is frequently confused with the observer effect. The uncertainty principle actually describes how precisely we may measure the position and momentum of a particle at the same time — if we increase the accuracy in measuring one quantity, we are forced to lose accuracy in measuring the other. "These uncertainty relations give us that measure of freedom from the limitations of classical concepts which is necessary for a consistent description of atomic processes."
This is where statistical mechanics comes into play, and where physicists begin to require rather unintuitive mental models: a particle's path simply cannot be exactly specified in its full quantum description. "Path" is a classical, practical attribute in our everyday life, but one that quantum particles do not meaningfully possess. The probabilities discovered in quantum mechanics do nevertheless arise from measurement (of the perceived path of the particle). As Stephen Hawking explains, the result is not traditional determinism, but rather determined probabilities. In some cases, a quantum particle may indeed trace an exact path, and the probability of finding the particle on that path is one (certain to be true). In fact, as far as prediction goes, the quantum development is at least as predictable as the classical motion, but the key is that it describes wave functions that cannot be easily expressed in ordinary language. As far as the thesis of determinism is concerned, these probabilities, at least, are quite determined. These findings from quantum mechanics have found many applications, and allow us to build transistors and lasers. Put another way: personal computers, Blu-ray players and the internet all work because humankind discovered the determined probabilities of the quantum world. None of that should be taken to imply that other aspects of quantum mechanics are not still up for debate.
On the topic of predictable probabilities, the double-slit experiments are a popular example. Photons are fired one-by-one through a double-slit apparatus at a distant screen. Curiously, they do not arrive at any single point, nor even the two points lined up with the slits (the way you might expect of bullets fired by a fixed gun at a distant target). Instead, the light arrives in varying concentrations at widely separated points, and the distribution of its collisions with the target can be calculated reliably. In that sense the behavior of light in this apparatus is deterministic, but there is no way to predict where in the resulting interference pattern any individual photon will make its contribution (although, there may be ways to use weak measurement to acquire more information without violating the Uncertainty principle).
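This split between a determined distribution and undetermined individual arrivals can be mimicked numerically. In the sketch below, the slit separation, wavelength, and screen distance are made-up illustrative values, and the idealized cos² fringe formula stands in for a full physical treatment; the fringe pattern is fixed by the apparatus while each simulated photon's landing point is drawn at random from it:

```python
import math
import random

# Idealized two-slit fringe intensity at screen position x (metres):
# I(x) = cos^2(pi * d * x / (wavelength * L)), with illustrative parameters.
def intensity(x, d=1e-3, wavelength=500e-9, L=1.0):
    return math.cos(math.pi * d * x / (wavelength * L)) ** 2

def sample_photon(rng, x_range=2e-3):
    # Rejection sampling: the *distribution* is determined,
    # but each individual arrival position is not.
    while True:
        x = rng.uniform(-x_range, x_range)
        if rng.random() < intensity(x):
            return x

rng = random.Random(0)
hits = [sample_photon(rng) for _ in range(10_000)]

# Fringe spacing is wavelength * L / d = 0.5 mm: bright at x = 0,
# dark at x = 0.25 mm.  Hits pile up reliably at the bright fringe,
# even though no single hit can be predicted.
bright = sum(1 for x in hits if abs(x) < 5e-5)
dark = sum(1 for x in hits if abs(x - 2.5e-4) < 5e-5)
assert bright > 3 * dark
```

No matter how many photons are simulated, the counts converge on the calculated pattern while each individual landing point remains unpredictable, which mirrors the experimental situation described above.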
Some (including Albert Einstein) argue that our inability to predict any more than probabilities is simply due to ignorance. The idea is that, beyond the conditions and laws we can observe or deduce, there are also hidden factors or "hidden variables" that determine absolutely in which order photons reach the detector screen. They argue that the course of the universe is absolutely determined, but that humans are screened from knowledge of the determinative factors. So, they say, it only appears that things proceed in a merely probabilistically determinative way. In actuality, they proceed in an absolutely deterministic way.
John S. Bell criticized Einstein's work in his famous Bell's theorem, which proved that quantum mechanics can make statistical predictions that would be violated if local hidden variables really existed. A number of experiments have tried to verify such predictions, and so far they do not appear to be violated. Improved experiments continue to verify the result, including the 2015 "loophole-free test" that plugged all known sources of error and the 2017 "Cosmic Bell Test" that based the experiment on cosmic data streaming from different directions toward the Earth, precluding the possibility that the sources of the data could have had prior interactions. However, it is possible to augment quantum mechanics with non-local hidden variables to achieve a deterministic theory that is in agreement with experiment. An example is the Bohm interpretation of quantum mechanics. Bohm's interpretation, though, violates special relativity, and it is highly controversial whether or not it can be reconciled without giving up on determinism.
More advanced variations on these arguments include quantum contextuality, developed by Bell, Simon B. Kochen, and Ernst Specker, which argues that hidden-variable theories cannot be "sensible": the values of the hidden variables would inherently depend on the devices used to measure them.
This debate is relevant because it is easy to imagine specific situations in which the arrival of an electron at a screen at a certain point and time would trigger one event, whereas its arrival at another point would trigger an entirely different event (e.g. see Schrödinger's cat, a thought experiment used as part of a deeper debate).
Thus, quantum physics casts reasonable doubt on the traditional determinism of classical, Newtonian physics in so far as reality does not seem to be absolutely determined. This was the subject of the famous Bohr–Einstein debates between Einstein and Niels Bohr and there is still no consensus.
All uranium found on earth is thought to have been synthesized during a supernova explosion that occurred roughly 5 billion years ago. Even before the laws of quantum mechanics were developed to their present level, the radioactivity of such elements posed a challenge to determinism, due to its unpredictability. One gram of uranium-238, a commonly occurring radioactive substance, contains some 2.5 × 10²¹ atoms. Each of these atoms is identical and indistinguishable according to all tests known to modern science. Yet about 12,600 times a second, one of the atoms in that gram will decay, giving off an alpha particle. The challenge for determinism is to explain why and when decay occurs, since it does not seem to depend on external stimulus. Indeed, no extant theory of physics makes testable predictions of exactly when any given atom will decay. At best scientists can discover determined probabilities in the form of the element's half-life.
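The quoted rate is itself one of those determined probabilities: it follows from the standard exponential-decay law, decays per second = N · ln 2 / t½. A back-of-the-envelope check, using the accepted uranium-238 half-life of about 4.468 × 10⁹ years:

```python
import math

AVOGADRO = 6.022e23            # atoms per mole
HALF_LIFE_YEARS = 4.468e9      # uranium-238
SECONDS_PER_YEAR = 3.156e7

atoms_per_gram = AVOGADRO / 238.0                      # ~2.5e21, as above
decay_constant = math.log(2) / (HALF_LIFE_YEARS * SECONDS_PER_YEAR)  # per second
decays_per_second = decay_constant * atoms_per_gram

# The *rate* (~1.2e4 decays per second per gram) is fully determined;
# which atom decays, and exactly when, is not.
assert 1.0e4 < decays_per_second < 1.5e4
```

The half-life pins down the aggregate statistics to high precision while leaving the fate of any individual atom unpredicted, which is precisely the challenge to determinism described above.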
So if the wave function itself is reality (rather than a probability distribution over classical coordinates), then the unitary evolution of the wave function in quantum mechanics can be said to be deterministic. But the unitary evolution of the wave function is not the entirety of quantum mechanics.
Asserting that quantum mechanics is deterministic by treating the wave function itself as reality might be thought to imply a single wave function for the entire universe, starting at the origin of the universe. Such a "wave function of everything" would carry the probabilities of not just the world we know, but every other possible world that could have evolved. For example, large voids in the distributions of galaxies are believed by many cosmologists to have originated in quantum fluctuations during the big bang. (See cosmic inflation, primordial fluctuations and large-scale structure of the cosmos.)
However, neither the posited reality nor the proven and extraordinary accuracy of the wave function and quantum mechanics at small scales can imply or reasonably suggest the existence of a single wave function for the entire universe. Quantum mechanics breaks down wherever gravity becomes significant, because nothing in the wave function, or in quantum mechanics, predicts anything at all about gravity. And this is obviously of great importance on larger scales.
Gravity is thought of as a large-scale force, with a longer reach than any other. But gravity becomes significant even at masses that are tiny compared to the mass of the universe.
A wave function the size of the universe might successfully model a universe with no gravity. Our universe, with gravity, is vastly different from what quantum mechanics alone predicts. To forget this is a colossal error.
Objective collapse theories, which involve a dynamic (and non-deterministic) collapse of the wave function (e.g. Ghirardi–Rimini–Weber theory, Penrose interpretation, or causal fermion systems) avoid these absurdities. The theory of causal fermion systems for example, is able to unify quantum mechanics, general relativity and quantum field theory, via a more fundamental theory that is non-linear, but gives rise to the linear behaviour of the wave function and also gives rise to the non-linear, non-deterministic, wave-function collapse. These theories suggest that a deeper understanding of the theory underlying quantum mechanics shows the universe is indeed non-deterministic at a fundamental level.
a theory is deterministic if, and only if, given its state variables for some initial period, the theory logically determines a unique set of values for those variables for any other period.
Predeterminism: the philosophical and theological view that combines God with determinism. On this doctrine events throughout eternity have been foreordained by some supernatural power in a causal sequence.
Predeterminism is here defined by the assumption that the experimenter's 'free will' in deciding what to measure (such as his choice to measure the x- or the y-component of an electron's spin) is in fact limited by deterministic laws, hence not free at all.
Sukumar, C.V. (1996). "A new paradigm for science and architecture". City. 1 (1–2): 181–183. doi:10.1080/13604819608900044:
Quantum Theory provided a beautiful description of the behaviour of isolated atoms and nuclei and small aggregates of elementary particles. Modern science recognized that predisposition rather than predeterminism is what is widely prevalent in nature.
Leibniz presents a clear case of a philosopher who does not think that predeterminism requires universal causal determinism
"Determinism" is, in essence, the position which holds that all behavior is caused by prior behavior. "Predeterminism" is the position which holds that all behavior is caused by conditions which predate behavior altogether (such impersonal boundaries as "the human conditions", instincts, the will of God, inherent knowledge, fate, and such).
The problem of predeterminism is one that involves the factors of heredity and environment, and the point to be debated here is the relation of the present self that chooses to these predetermining agencies.
Garris, M.D.; et al. (1992). "A Platform for Evolving Genetic Automata for Text Segmentation (GNATS)". Science of Artificial Neural Networks. 1710: 714–724. doi:10.1117/12.140132:
However, predeterminism is not completely avoided. If the codes within the genotype are not designed properly, then the organisms being evolved will be fundamentally handicapped.
theological determinism, or the doctrine of predestination: the view that everything which happens has been predestined to happen by an omniscient, omnipotent divinity. A weaker version holds that, though not predestined to happen, everything that happens has been eternally known by virtue of the divine foreknowledge of an omniscient divinity. If this divinity is also omnipotent, as in the case of the Judeo-Christian religions, this weaker version is hard to distinguish from the previous one because, though able to prevent what happens and knowing that it is going to happen, God lets it happen. To this, advocates of free will reply that God permits it to happen in order to make room for the free will of humans.
Theological determinism constitutes a fifth kind of determinism. There are two types of theological determinism, both compatible with scientific and metaphysical determinism. In the first, God determines everything that happens, either in one all-determining single act at the initial creation of the universe or through continuous divine interactions with the world. Either way, the consequence is that everything that happens becomes God's action, and determinism is closely linked to divine action and God's omnipotence. According to the second type of theological determinism, God has perfect knowledge of everything in the universe because God is omniscient. And, as some say, because God is outside of time, God has the capacity of knowing past, present, and future in one instance. This means that God knows what will happen in the future. And because God's omniscience is perfect, what God knows about the future will inevitably happen, which means, consequently, that the future is already fixed.
Theological determinism, on the other hand, claims that all events are determined by God. On this view, God decrees that everything will go thus-and-so and ensures that everything goes that way, so that ultimately God is the cause of everything that happens and everything that happens is part of God's plan. We might think of God here as the all-powerful movie director who writes the script and causes everything to go in accordance with it. We should note, as an aside, that there is some debate over what would be sufficient for theological determinism to be true. Some people claim that God's merely knowing what will happen determines that it will, while others believe that God must not only know but must also cause those events to occur in order for their occurrence to be determined.
The key question is whether to understand the nature of this probability as epistemic or ontic. Along epistemic lines, one possibility is that there is some additional factor (i.e., a hidden mechanism) such that once we discover and understand this factor, we would be able to predict the observed behavior of the quantum stoplight with certainty (physicists call this approach a "hidden variable theory"; see, e.g., Bell 1987, 1–13, 29–39; Bohm 1952a, 1952b; Bohm and Hiley 1993; Bub 1997, 40–114; Holland 1993; see also the preceding essay in this volume by Hodgson). Or perhaps there is an interaction with the broader environment (e.g., neighboring buildings, trees) that we have not taken into account in our observations that explains how these probabilities arise (physicists call this approach decoherence or consistent histories). Under either of these approaches, we would interpret the observed indeterminism in the behavior of stoplights as an expression of our ignorance about the actual workings. Under an ignorance interpretation, indeterminism would not be a fundamental feature of quantum stoplights, but merely epistemic in nature due to our lack of knowledge about the system. Quantum stoplights would turn out to be deterministic after all.
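The "hidden variable" reading can be mimicked with a toy simulation (an illustrative sketch, not from the source; the stoplight and the seed are invented for the example): a sequence that looks random to an observer is fully determined once the hidden mechanism, here a seed, is known.

```python
import random

def stoplight_colours(hidden_seed, n):
    """Colour sequence of a toy 'quantum stoplight'.

    To an observer who does not know hidden_seed, the colours appear
    probabilistic; to one who does, every outcome is predictable, so the
    apparent indeterminism is epistemic rather than ontic.
    """
    rng = random.Random(hidden_seed)   # the hidden mechanism
    return [rng.choice(["red", "green"]) for _ in range(n)]

# Knowing the hidden variable makes the "random" run exactly reproducible:
run1 = stoplight_colours(hidden_seed=42, n=10)
run2 = stoplight_colours(hidden_seed=42, n=10)
assert run1 == run2
```

Whether anything analogous holds for real quantum systems is precisely what the hidden-variable debate is about; the sketch only illustrates what an "ignorance interpretation" of probability would mean.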
So, was Einstein wrong? In the sense that the EPR paper argued in favour of an objective reality for each quantum particle in an entangled pair independent of the other and of the measuring device, the answer must be yes. But if we take a wider view and ask instead if Einstein was wrong to hold to the realist's belief that the physics of the universe should be objective and deterministic, we must acknowledge that we cannot answer such a question. It is in the nature of theoretical science that there can be no such thing as certainty. A theory is only 'true' for as long as the majority of the scientific community maintain a consensus view that the theory is the one best able to explain the observations. And the story of quantum theory is not over yet.
Biological determinism, also known as genetic determinism, is the belief that human behaviour is controlled by an individual's genes or some component of their physiology, generally at the expense of the role of the environment, whether in embryonic development or in learning. Genetic reductionism is a similar concept, but it is distinct from genetic determinism in that the former refers to the level of understanding, while the latter refers to the supposedly causal role of genes. It has been associated with movements in science and society including eugenics, scientific racism, the debate around the heritability of IQ, the biological basis for gender roles, and the sociobiology debate.
In 1892 August Weismann proposed in his germ plasm theory that heritable information is transmitted only via germ cells, which he thought contained determinants (genes). Francis Galton, supposing that undesirable traits such as club foot and criminality were inherited, advocated eugenics, aiming to prevent supposedly defective people from breeding. Samuel George Morton and Paul Broca attempted to relate the cranial capacity (internal skull volume) to skin colour, intending to show that white people were superior. Other workers such as H. H. Goddard and Robert Yerkes attempted to measure people's intelligence and to show that the resulting scores were heritable, again to demonstrate the supposed superiority of people with white skin.
Galton popularized the phrase nature and nurture, later often used to characterize the heated debate over whether genes or the environment determined human behavior. Scientists such as ecologists and behavioural geneticists now see it as obvious that both factors are essential, and that they are intertwined.

Late in the 20th century, the determinism of gender roles was debated by geneticists and others. Biologists such as John Money and Anke Ehrhardt attempted to describe femininity and homosexuality according to then-current social standards; against this, the evolutionary biologist Richard Lewontin and others argued that clothing and other preferences vary in different societies. The biologist E. O. Wilson founded the discipline of sociobiology, basing it on observations of animals such as social insects, controversially suggesting that its explanations of social behaviour might apply to humans.

Compatibilism
Compatibilism is the belief that free will and determinism are mutually compatible and that it is possible to believe in both without being logically inconsistent. Compatibilists believe freedom can be present or absent in situations for reasons that have nothing to do with metaphysics. They define free will as freedom to act according to one's motives without arbitrary hindrance from other individuals or institutions.

Similarly, political liberty is a non-metaphysical concept. Statements of political liberty, such as the United States Bill of Rights, assume moral liberty: the ability to choose to do otherwise than one does.

Cultural determinism
Cultural determinism is the belief that the culture in which we are raised determines who we are at emotional and behavioral levels. It contrasts with genetic determinism, the theory that biologically inherited traits and the environmental influences that affect those traits dominate who we are.
Yet another way of looking at the concept of cultural determinism is to contrast it with the idea of environmental determinism. The latter is the idea that the physical world, with all its constraints and potentially life-altering elements, is responsible for the make-up of each existing culture. Contrast this with the idea that we (humans) create our own situations through the power of thought, socialization, and all forms of information circulation.
It is also used to describe the concept that culture determines economic and political arrangements. It is an idea which has recurred in many cultures over human history, from ancient civilizations through the present.

Destiny
Destiny, sometimes referred to as fate (from Latin fatum – destiny), is a predetermined course of events. It may be conceived as a predetermined future, whether in general or of an individual.

Deterministic algorithm
In computer science, a deterministic algorithm is an algorithm which, given a particular input, will always produce the same output, with the underlying machine always passing through the same sequence of states. Deterministic algorithms are by far the most studied and familiar kind of algorithm, as well as one of the most practical, since they can be run on real machines efficiently.
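The same-input, same-output property described above can be sketched in a few lines of Python (an illustrative example, not drawn from any particular source):

```python
# A minimal illustration of algorithmic determinism: given the same input,
# the function always returns the same output, and the underlying machine
# passes through the same sequence of internal states.

def deterministic_sum(xs):
    total = 0
    for x in xs:        # each intermediate state depends only on the input
        total += x
    return total

# Same input, same output, on every run and on every machine:
assert deterministic_sum([1, 2, 3]) == 6
assert deterministic_sum([1, 2, 3]) == deterministic_sum([1, 2, 3])
```

By contrast, an algorithm that consulted the system clock or an external random source between these steps would no longer compute a mathematical function of its input alone.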
Formally, a deterministic algorithm computes a mathematical function; a function has a unique value for any input in its domain, and the algorithm is a process that produces this particular value as output.

Economic determinism
Economic determinism is a socioeconomic theory that economic relationships (such as being an owner or capitalist, or being a worker or proletarian) are the foundation upon which all other social and political arrangements in society are based. The theory stresses that societies are divided into competing economic classes whose relative political power is determined by the nature of the economic system. In the version associated with Karl Marx, the emphasis is on the proletariat who are considered to be locked in a class struggle with the capitalist class, which will eventually end with the revolutionary overthrow of the capitalist system and the gradual development of socialism. Marxist thinkers have dismissed plain and unilateral economic determinism as a form of "vulgar Marxism", or "economism", nowhere included in Marx's works.
In the writing of American history the term is associated with historian Charles A. Beard (1874–1948), who was not a Marxist but who emphasized the long-term political contest between banking and business interests on the one hand, and agrarian interests on the other.

Environmental determinism
Environmental determinism (also known as climatic determinism or geographical determinism) is the study of how the physical environment predisposes societies and states towards particular development trajectories. Nineteenth-century approaches held that climate and terrain largely determined human activity and psychology, and it was associated with institutionalized racism and eugenics. Many scholars underscore that this approach supported colonialism and eurocentrism, and devalued human agency in non-Western societies. Jared Diamond, Jeffrey Herbst, Ian Morris, and other social scientists sparked a revival of the theory during the late twentieth and early twenty-first centuries. This "neo-environmental determinism" school of thought examines how geographic and ecological forces influence state-building, economic development, and institutions.

Fatalism
Fatalism is a philosophical doctrine that stresses the subjugation of all events or actions to destiny.
Fatalism generally refers to any of the following ideas:
The view that we are powerless to do anything other than what we actually do. This includes the claim that humans have no power to influence the future or, indeed, their own actions. This belief is very similar to predeterminism.
An attitude of resignation in the face of some future event or events which are thought to be inevitable. Friedrich Nietzsche named this idea "Turkish fatalism" in his book The Wanderer and His Shadow.
The view that acceptance, rather than resistance, is the appropriate response to the inevitable. This belief is very similar to defeatism.
Some take it to mean determinism.

Free will
Free will is the ability to choose between different possible courses of action unimpeded.

Free will is closely linked to the concepts of responsibility, praise, guilt, sin, and other judgements which apply only to actions that are freely chosen. It is also connected with the concepts of advice, persuasion, deliberation, and prohibition. Traditionally, only actions that are freely willed are seen as deserving credit or blame. There are numerous different concerns about threats to the possibility of free will, varying by how exactly it is conceived, which is a matter of some debate.
Some conceive free will to be the capacity to make choices in which the outcome has not been determined by past events. Determinism suggests that only one course of events is possible, which is inconsistent with the existence of free will thus conceived. This problem has been identified in ancient Greek philosophy and remains a major focus of philosophical debate. This view that conceives free will to be incompatible with determinism is called incompatibilism and encompasses both metaphysical libertarianism, the claim that determinism is false and thus free will is at least possible, and hard determinism, the claim that determinism is true and thus free will is not possible. It also encompasses hard incompatibilism, which holds not only determinism but also its negation to be incompatible with free will and thus free will to be impossible whatever the case may be regarding determinism.
In contrast, compatibilists hold that free will is compatible with determinism. Some compatibilists even hold that determinism is necessary for free will, arguing that choice involves preference for one course of action over another, requiring a sense of how choices will turn out. Compatibilists thus consider the debate between libertarians and hard determinists over free will vs determinism a false dilemma. Different compatibilists offer very different definitions of what "free will" even means and consequently find different types of constraints to be relevant to the issue. Classical compatibilists considered free will nothing more than freedom of action, considering one free of will simply if, had one counterfactually wanted to do otherwise, one could have done otherwise without physical impediment. Contemporary compatibilists instead identify free will as a psychological capacity, such as to direct one's behavior in a way responsive to reason, and there are still further different conceptions of free will, each with their own concerns, sharing only the common feature of not finding the possibility of determinism a threat to the possibility of free will.

Haematopoiesis
Haematopoiesis (from Greek αἷμα, "blood" and ποιεῖν "to make"; also hematopoiesis in American English; sometimes also haemopoiesis or hemopoiesis) is the formation of blood cellular components. All cellular blood components are derived from haematopoietic stem cells. In a healthy adult person, approximately 10¹¹–10¹² new blood cells are produced daily in order to maintain steady state levels in the peripheral circulation.

Hard determinism
Hard determinism (or metaphysical determinism) is a view on free will which holds that determinism is true, and that it is incompatible with free will, and, therefore, that free will does not exist. Although hard determinism generally refers to nomological determinism, it can also be a position taken with respect to other forms of determinism that necessitate the future in its entirety. Hard determinism is contrasted with soft determinism, which is a compatibilist form of determinism, holding that free will may exist despite determinism. It is also contrasted with metaphysical libertarianism, the other major form of incompatibilism which holds that free will exists and determinism is false.

Historical determinism
Historical determinism is the stance that events are historically predetermined or currently constrained by various forces. Historical determinism can be understood in contrast to its negation, i.e. the rejection of historical determinism.
Some political philosophies (e.g. Early and Stalinist Marxism) assert a historical materialism of either predetermination or constraint, or both.
Used as a pejorative, it is normally meant to designate an overdetermination of present possibilities by historical conditions.

Incompatibilism
Incompatibilism is the view that a deterministic universe is completely at odds with the notion that persons have free will; that there is a dichotomy between determinism and free will where philosophers must choose one or the other. This view is pursued in at least three ways: libertarians deny that the universe is deterministic, hard determinists deny that any free will exists, and pessimistic incompatibilists (hard indeterminists) deny both that the universe is determined and that free will exists.
Incompatibilism is contrasted with compatibilism, which rejects the determinism/free will dichotomy.

Indeterminism
Indeterminism is the idea that events (or certain events, or events of certain types) are not caused, or not caused deterministically.
It is the opposite of determinism and related to chance. It is highly relevant to the philosophical problem of free will, particularly in the form of metaphysical libertarianism. In science, most specifically quantum theory in physics, indeterminism is the belief that no event is certain and the entire outcome of anything is probabilistic. The Heisenberg uncertainty relations and the "Born rule", proposed by Max Born, are often starting points in support of the indeterministic nature of the universe. Indeterminism is also asserted by Sir Arthur Eddington and Murray Gell-Mann. Indeterminism was promoted by the French biologist Jacques Monod in his essay "Chance and Necessity".
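The two starting points just named have standard textbook statements, reproduced here for reference (these are the well-known formulas, not taken from the source text):

```latex
% Heisenberg uncertainty relation: position and momentum cannot both be
% fixed with arbitrary precision.
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}

% Born rule: the probability density for finding the particle at x is the
% squared modulus of the wave function.
P(x) = \lvert \psi(x) \rvert^{2}
```

The Born rule is where probability enters the formalism: the theory predicts only the distribution of outcomes, not any single outcome, which is why it is so often cited in support of indeterminism.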
The physicist-chemist Ilya Prigogine argued for indeterminism in complex systems.

Libertarianism (metaphysics)
Libertarianism is one of the main philosophical positions related to the problems of free will and determinism, which are part of the larger domain of metaphysics. In particular, libertarianism, which is an incompatibilist position, argues that free will is logically incompatible with a deterministic universe and that agents have free will, and that, therefore, determinism is false. In the early modern period, some of the most important metaphysical libertarians were René Descartes, George Berkeley, Immanuel Kant, and Thomas Reid. Roderick Chisholm was a prominent defender of libertarianism in the 20th century, and contemporary libertarians include Robert Kane, Peter van Inwagen and Robert Nozick.

Linguistic determinism
Linguistic determinism is the idea that language and its structures limit and determine human knowledge or thought, as well as thought processes such as categorization, memory, and perception. The term implies that people who speak different languages as their mother tongues have different thought processes.

Linguistic determinism is the strong form of linguistic relativity (popularly known as the Sapir–Whorf hypothesis), which argues that individuals experience the world based on the structure of the language they habitually use.
Though it played a considerable role historically, linguistic determinism is now discredited among mainstream linguists.

Nominative determinism
Nominative determinism is the hypothesis that people tend to gravitate towards areas of work that fit their names. The term was first used in the magazine New Scientist in 1994, after the magazine's humorous Feedback column noted several studies carried out by researchers with remarkably fitting surnames. These included a book on polar explorations by Daniel Snowman and an article on urology by researchers named Splatt and Weedon. These and other examples led to light-hearted speculation that some sort of psychological effect was at work. Since the term appeared, nominative determinism has been an irregularly recurring topic in New Scientist, as readers continue to submit examples. Nominative determinism differs from the related concept aptronym, and its synonyms aptonym, namephreak, and Perfect Fit Last Name, in that it focusses on causality. "Aptronym" merely means the name is fitting, without saying anything about why it has come to fit.
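The core comparison behind such studies can be sketched as follows (a hypothetical illustration; the function and all the numbers are invented, not taken from any actual study): one compares how often an "apt" surname occurs within a profession against how often it occurs in the general population.

```python
# Hypothetical sketch of the comparison used in nominative-determinism
# studies: is an "apt" surname over-represented in the matching profession?

def excess_representation(apt_in_profession, profession_size,
                          apt_in_population, population_size):
    """Ratio of observed to expected apt-surname frequency (1.0 = no effect)."""
    observed = apt_in_profession / profession_size
    expected = apt_in_population / population_size
    return observed / expected

# Invented numbers for illustration: 12 apt surnames among 1,000 members of
# a profession, versus 50 among 10,000 people in a reference population.
ratio = excess_representation(12, 1000, 50, 10000)
print(round(ratio, 2))  # a ratio above 1.0 suggests over-representation
```

Real studies must additionally control for confounds such as regional surname frequency and sample selection, which is exactly where the published methods have been challenged.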
The idea that people are drawn to professions that fit their name was suggested by psychologist Carl Jung, citing as an example Sigmund Freud who studied pleasure and whose surname means "joy". A few recent empirical studies have indicated that certain professions are disproportionately represented by people with appropriate surnames (and sometimes given names), though the methods of these studies have been challenged. One explanation for nominative determinism is implicit egotism, which states that humans have an unconscious preference for things they associate with themselves. An alternative explanation is genetic: a person might be named Smith or Taylor because that was originally their occupation, and they would pass on their genes to their descendants, including an aptitude for activities involving strength in the case of Smith, or dexterity in the case of Taylor.

Social determinism
Social determinism is the theory that social interactions and constructs alone determine individual behavior (as opposed to biological or objective factors).
Consider certain human behaviors, such as committing murder, or writing poetry. A social determinist would look only at social phenomena, such as customs and expectations, education, and interpersonal interactions, to decide whether or not a given person would exhibit any of these behaviors. They would discount biological and other non-social factors, such as genetic makeup, the physical environment, etc. Ideas about nature and biology would be considered to be socially constructed.

Technological determinism
Technological determinism is a reductionist theory that assumes that a society's technology determines the development of its social structure and cultural values. Technological determinism tries to understand how technology has had an impact on human action and thought. On this view, changes in technology are the primary source of changes in society. The term is believed to have originated with Thorstein Veblen (1857–1929), an American sociologist and economist. The most radical technological determinist in the United States in the 20th century was most likely Clarence Ayres, a follower of Thorstein Veblen and John Dewey. William Ogburn was also known for his radical technological determinism.
The first major elaboration of a technological determinist view of socioeconomic development came from the German philosopher and economist Karl Marx, whose theoretical framework was grounded in the perspective that changes in technology, and specifically productive technology, are the primary influence on human social relations and organizational structure, and that social relations and cultural practices ultimately revolve around the technological and economic base of a given society. Marx's position has become embedded in contemporary society, where the idea that fast-changing technologies alter human lives is all-pervasive.
Although many authors attribute a technologically determined view of human history to Marx's insights, not all Marxists are technological determinists, and some authors question the extent to which Marx himself was a determinist. Furthermore, there are multiple forms of technological determinism.