Reductionism is any of several related philosophical ideas regarding the associations between phenomena which can be described in terms of other simpler or more fundamental phenomena.
For the sciences, application of methodological reductionism attempts explanation of entire systems in terms of their individual, constituent parts and their interactions. For example, the temperature of a gas is reduced to nothing beyond the average kinetic energy of its molecules in motion. Thomas Nagel and others speak of 'psychophysical reductionism' (the attempted reduction of psychological phenomena to physics and chemistry) and of 'physico-chemical reductionism' (the attempted reduction of biology to physics and chemistry). In a very simplified and sometimes contested form, such reductionism is said to imply that a system is nothing but the sum of its parts. However, a more nuanced view is that a system is composed entirely of its parts, but the system will have features that none of the parts have. "The point of mechanistic explanations is usually showing how the higher level features arise from the parts."
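The gas example can be made concrete with a short numerical sketch. It applies the equipartition relation T = (2 / 3k_B) · ⟨½mv²⟩; the molecular mass and the uniform speed used here are illustrative assumptions, not measured data:

```python
# Boltzmann constant (J/K) and an assumed molecular mass (N2, in kg)
K_B = 1.380649e-23
M_N2 = 4.65e-26

def temperature_from_speeds(speeds, mass=M_N2):
    """Reduce 'temperature' to molecular motion via equipartition:
    T = (2 / (3 * k_B)) * mean kinetic energy per molecule."""
    mean_kinetic = sum(0.5 * mass * v * v for v in speeds) / len(speeds)
    return (2.0 / 3.0) * mean_kinetic / K_B

# 1000 molecules moving at about the rms speed of N2 at room temperature
speeds = [515.0] * 1000
print(round(temperature_from_speeds(speeds)))  # 298 (kelvin)
```

Nothing over and above the molecular speeds enters the calculation: the macroscopic quantity is recovered entirely from the microscopic description.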
Other definitions are used by other authors. For example, what John Polkinghorne terms 'conceptual' or 'epistemological' reductionism is the definition provided by Simon Blackburn and by Jaegwon Kim: that form of reductionism concerning a program of replacing the facts or entities entering statements claimed to be true in one type of discourse with other facts or entities from another type, thereby providing a relationship between them. Such an association is provided where the same idea can be expressed by "levels" of explanation, with higher levels reducible if need be to lower levels. This use of levels of understanding in part expresses our human limitations in remembering detail. However, "most philosophers would insist that our role in conceptualizing reality [our need for a hierarchy of "levels" of understanding] does not change the fact that different levels of organization in reality do have different 'properties'."
Reductionism strongly represents a certain perspective of causality. In a reductionist framework, the phenomena that can be explained completely in terms of relations between other more fundamental phenomena, are termed epiphenomena. Often there is an implication that the epiphenomenon exerts no causal agency on the fundamental phenomena that explain it. The epiphenomena are sometimes said to be "nothing but" the outcome of the workings of the fundamental phenomena, although the epiphenomena might be more clearly and efficiently described in very different terms. There is a tendency to avoid considering an epiphenomenon as being important in its own right. This attitude may extend to cases where the fundamentals are not obviously able to explain the epiphenomena, but are expected to by the speaker. In this way, for example, morality can be deemed to be "nothing but" evolutionary adaptation, and consciousness can be considered "nothing but" the outcome of neurobiological processes.
Reductionism should be distinguished from eliminativism: reductionists do not deny the existence of phenomena, but explain them in terms of another reality; eliminativists deny the existence of the phenomena themselves. For example, eliminativists deny the existence of life itself, holding that there is nothing beyond physical and chemical processes.
Reductionism also does not preclude the existence of what might be termed emergent phenomena, but it does imply the ability to understand those phenomena completely in terms of the processes from which they are composed. This reductionist understanding is very different from emergentism, which intends that what emerges in "emergence" is more than the sum of the processes from which it emerges.
Most philosophers delineate three types of reductionism and anti-reductionism: ontological, methodological, and theoretical.
Ontological reductionism is the belief that reality is composed of a minimum number of kinds of entities or substances. This claim is usually metaphysical, and is most commonly a form of monism, in effect claiming that all objects, properties and events are reducible to a single substance. (A dualist who is an ontological reductionist would believe that everything is reducible to two substances—as one possible example, a dualist might claim that reality is composed of "matter" and "spirit".)
Richard Jones divides ontological reductionism into two: the reductionism of substances (e.g., the reduction of mind to matter) and the reduction of the number of structures operating in nature (e.g., the reduction of one physical force to another). This permits scientists and philosophers to affirm the former while being anti-reductionists regarding the latter.
Nancey Murphy has claimed that there are two species of ontological reductionism: one that denies that wholes are anything more than their parts; and the stronger thesis of atomist reductionism that wholes are not "really real". She admits that the phrase "really real" is apparently senseless but nonetheless has tried to explicate the supposed difference between the two.
Ontological reductionism denies the idea of ontological emergence, and claims that emergence is an epistemological phenomenon that only exists through analysis or description of a system, and does not exist fundamentally.
Ontological reductionism takes two different forms: token ontological reductionism and type ontological reductionism.
Token ontological reductionism is the idea that every item that exists is a sum item. For perceivable items, it affirms that every perceivable item is a sum of items with a lesser degree of complexity. Token ontological reduction of biological things to chemical things is generally accepted.
Type ontological reductionism is the idea that every type of item is a sum type of item, and that every perceivable type of item is a sum of types of items with a lesser degree of complexity. Type ontological reduction of biological things to chemical things is often rejected.
Methodological reductionism is the position that the best scientific strategy is to attempt to reduce explanations to the smallest possible entities. Methodological reductionism would thus include the claim that the atomic explanation of a substance's boiling point is preferable to the chemical explanation, and that an explanation based on even smaller particles (quarks and leptons, perhaps) would be even better. Methodological reductionism, therefore, is the opinion that all scientific theories either can or should be reduced to a single super-theory through the process of theoretical reduction.
Theory reduction is the process by which one theory absorbs another. For example, both Kepler's laws of the motion of the planets and Galileo's theories of motion formulated for terrestrial objects are reducible to Newtonian theories of mechanics because all the explanatory power of the former is contained within the latter. Furthermore, the reduction is considered to be beneficial because Newtonian mechanics is a more general theory—that is, it explains more events than Galileo's or Kepler's. Theoretical reduction, therefore, is the reduction of one explanation or theory to another—that is, it is the absorption of one of our ideas about a particular item into another idea.
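As a numerical illustration of this absorption, Kepler's third law (T² proportional to a³) falls out of the Newtonian expression for an orbital period. The constants are standard textbook values, and the two-planet comparison is only a sketch:

```python
import math

G = 6.674e-11      # gravitational constant (m^3 kg^-1 s^-2)
M_SUN = 1.989e30   # mass of the Sun (kg)

def period(a):
    """Orbital period from Newtonian mechanics: T = 2*pi*sqrt(a^3 / (G*M))."""
    return 2 * math.pi * math.sqrt(a ** 3 / (G * M_SUN))

# Semi-major axes of Earth's and Mars's orbits (m)
earth, mars = 1.496e11, 2.279e11

# Kepler's third law emerges: T^2 / a^3 is the same constant for both planets
k_earth = period(earth) ** 2 / earth ** 3
k_mars = period(mars) ** 2 / mars ** 3
print(abs(k_earth - k_mars) / k_earth < 1e-12)  # True
```

Kepler's law appears here not as an independent postulate but as a consequence of the more general theory, which is the sense in which Newtonian mechanics "absorbs" it.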
Reductionist thinking and methods form the basis for many of the well-developed topics of modern science, including much of physics, chemistry and cell biology. Classical mechanics in particular is seen as a reductionist framework, and statistical mechanics can be considered as a reconciliation of macroscopic thermodynamic laws with the reductionist method of explaining macroscopic properties in terms of microscopic components.
In science, reductionism implies that certain topics of study are based on areas that study smaller spatial scales or organizational units. While it is commonly accepted that the foundations of chemistry are based in physics, and molecular biology is based on chemistry, similar statements become controversial when one considers less rigorously defined intellectual pursuits. For example, claims that sociology is based on psychology, or that economics is based on sociology and psychology would be met with reservations. These claims are difficult to substantiate even though there are obvious associations between these topics (for instance, most would agree that psychology can affect and inform economics). The limit of reductionism's usefulness stems from emergent properties of complex systems, which are more common at certain levels of organization. For example, certain aspects of evolutionary psychology and sociobiology are rejected by some who claim that complex systems are inherently irreducible and that a holistic method is needed to understand them.
Some strong reductionists believe that the behavioral sciences should become "genuine" scientific disciplines based on genetic biology, and on the systematic study of culture (see Richard Dawkins's concept of memes). In his book The Blind Watchmaker, Dawkins introduced the term "hierarchical reductionism" to describe the opinion that complex systems can be described with a hierarchy of organizations, each of which is only described in terms of objects one level down in the hierarchy. He provides the example of a computer, which using hierarchical reductionism is explained in terms of the operation of hard drives, processors, and memory, but not on the level of logic gates, or on the even simpler level of electrons in a semiconductor medium.
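Dawkins's computer example can be mimicked in miniature: each level below is described only in terms of the level one step down (an adder in terms of half adders, a half adder in terms of gates). The function names are illustrative, not taken from the book:

```python
# Level 0: primitive logic gates
def AND(a, b): return a & b
def OR(a, b): return a | b
def XOR(a, b): return a ^ b

# Level 1: a half adder, described only in terms of gates
def half_adder(a, b):
    return XOR(a, b), AND(a, b)  # (sum bit, carry bit)

# Level 2: a full adder, described only in terms of half adders and OR
def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

print(full_adder(1, 1, 1))  # (1, 1), i.e. 1 + 1 + 1 = binary 11
```

In the hierarchical-reductionist spirit, `full_adder` is explained entirely by `half_adder`, without skipping down to transistors or electrons.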
Others argue that inappropriate use of reductionism limits our understanding of complex systems. In particular, ecologist Robert Ulanowicz says that science must develop techniques to study ways in which larger scales of organization influence smaller ones, and also ways in which feedback loops create structure at a given level, independently of details at a lower level of organization. He advocates (and uses) information theory as a framework to study propensities in natural systems. Ulanowicz attributes these criticisms of reductionism to the philosopher Karl Popper and biologist Robert Rosen.
The idea that phenomena such as emergence, together with work in complex systems theory, pose limits to reductionism has been advocated by Stuart Kauffman. Emergence is especially relevant when systems exhibit historicity. Emergence is strongly related to nonlinearity. The limits of the application of reductionism are claimed to be especially evident at levels of organization with higher amounts of complexity, including living cells, neural networks, ecosystems, society, and other systems formed from assemblies of large numbers of diverse components linked by multiple feedback loops.
Nobel laureate Philip Warren Anderson used the idea that symmetry breaking is an example of an emergent phenomenon in his 1972 Science paper "More Is Different" to make an argument about the limitations of reductionism. One observation he made was that the sciences can be arranged roughly in a linear hierarchy—particle physics, solid state physics, chemistry, molecular biology, cellular biology, physiology, psychology, social sciences—in that the elementary entities of one science obey the principles of the science that precedes it in the hierarchy; yet this does not imply that one science is just an applied version of the science that precedes it. He writes that "At each stage, entirely new laws, concepts and generalizations are necessary, requiring inspiration and creativity to just as great a degree as in the previous one. Psychology is not applied biology nor is biology applied chemistry."
Disciplines such as cybernetics and systems theory imply non-reductionism, sometimes to the extent of explaining phenomena at a given level of hierarchy in terms of phenomena at a higher level, in a sense, the opposite of reductionism.
In mathematics, reductionism can be interpreted as the philosophy that all mathematics can (or ought to) be based on a common foundation, which for modern mathematics is usually axiomatic set theory. Ernst Zermelo was one of the major advocates of such an opinion; he also developed much of axiomatic set theory. It has been argued that the generally accepted method of justifying mathematical axioms by their usefulness in common practice can potentially weaken Zermelo's reductionist claim.
Jouko Väänänen has argued for second-order logic as a foundation for mathematics instead of set theory, whereas others have argued for category theory as a foundation for certain aspects of mathematics.
The incompleteness theorems of Kurt Gödel, published in 1931, cast doubt on the attainability of an axiomatic foundation for all of mathematics. Any such foundation would have to include axioms powerful enough to describe the arithmetic of the natural numbers (a subset of all mathematics). Yet Gödel proved that for any consistent, recursively axiomatized system powerful enough to describe the arithmetic of the natural numbers, there are true propositions about the natural numbers that cannot be proved from the axioms. Such propositions are known as formally undecidable propositions. For example, the continuum hypothesis is undecidable in Zermelo–Fraenkel set theory, as shown by Cohen.
Religious reductionism generally attempts to explain religion by explaining it in terms of nonreligious causes. A few examples of reductionistic explanations for the presence of religion are: that religion can be reduced to humanity's conceptions of right and wrong, that religion is fundamentally a primitive attempt at controlling our environments, that religion is a way to explain the existence of a physical world, and that religion confers an enhanced survivability for members of a group and so is reinforced by natural selection. Anthropologists Edward Burnett Tylor and James George Frazer employed some religious reductionist arguments. Sigmund Freud held that religion is nothing more than an illusion, or even a mental illness, and Marx claimed that religion is "the sigh of the oppressed," and the opium of the people providing only "the illusory happiness of the people," thus providing two influential examples of reductionistic views against the idea of religion.
Linguistic reductionism is the idea that everything can be described or explained by a language with a limited number of concepts, and combinations of those concepts. An example is the language Toki Pona.
The concept of downward causation poses an alternative to reductionism within philosophy. This opinion is developed by Peter Bøgh Andersen, Claus Emmeche, Niels Ole Finnemann, and Peder Voetmann Christiansen, among others. These philosophers explore ways in which one can talk about phenomena at a larger-scale level of organization exerting causal influence on a smaller-scale level, and find that some, but not all proposed types of downward causation are compatible with science. In particular, they find that constraint is one way in which downward causation can operate. The notion of causality as constraint has also been explored as a way to shed light on scientific concepts such as self-organization, natural selection, adaptation, and control.
Philosophers of the Enlightenment worked to insulate human free will from reductionism. Descartes separated the material world of mechanical necessity from the world of mental free will. German philosophers introduced the concept of the "noumenal" realm that is not governed by the deterministic laws of "phenomenal" nature, where every event is completely determined by chains of causality. The most influential formulation was by Immanuel Kant, who distinguished between the causal deterministic framework the mind imposes on the world—the phenomenal realm—and the world as it exists for itself, the noumenal realm, which, as he believed, included free will. To insulate theology from reductionism, 19th century post-Enlightenment German theologians, especially Friedrich Schleiermacher and Albrecht Ritschl, used the Romantic method of basing religion on the human spirit, so that it is a person's feeling or sensibility about spiritual matters that comprises religion.
Anti-reductionists place at least one minimum requirement on the reductionist: "At the very least the anti-reductionist is owed an account of why the intuitions arise if they are not accurate."
A contrast to reductionism is holism or emergentism. Holism is the idea that items can have properties as a whole (emergent properties) that are not explainable from the sum of their parts. The principle of holism was summarized concisely by Aristotle in the Metaphysics: "The whole is more than the sum of its parts".
The development of systems thinking has provided methods for describing issues in a holistic rather than a reductionist way, and many scientists use a holistic paradigm. When the terms are used in a scientific context, holism and reductionism refer primarily to what sorts of models or theories offer valid explanations of the natural world; the scientific method of falsifying hypotheses, checking empirical data against theory, is largely unchanged, but the method guides which theories are considered. The conflict between reductionism and holism in science is not universal—it usually concerns whether or not a holistic or reductionist method is appropriate in the context of studying a specific system or phenomenon.
In many cases (such as the kinetic theory of gases), given a good understanding of the components of the system, one can predict all the important properties of the system as a whole. In other systems, emergent properties of the system are said to be almost impossible to predict from knowledge of the parts of the system. Complexity theory studies systems and properties of the latter type.
Alfred North Whitehead's metaphysics opposed reductionism. He referred to reductionist thinking as committing the "fallacy of misplaced concreteness". His scheme was to frame a rational, general understanding of phenomena, derived from our reality.
Sven Erik Jorgensen, an ecologist, states both theoretical and practical arguments for a holistic method in certain topics of science, especially ecology. He argues that many systems are so complex that it will not ever be possible to describe all their details. Making an analogy to the Heisenberg uncertainty principle in physics, he argues that many interesting and relevant ecological phenomena cannot be replicated in laboratory conditions, and thus cannot be measured or observed without influencing and changing the system in some way. He also indicates the importance of interconnectedness in biological systems. His opinion is that science can only progress by outlining what questions are unanswerable and by using models that do not attempt to explain everything in terms of smaller hierarchical levels of organization, but instead model them on the scale of the system itself, taking into account some (but not all) factors from levels both higher and lower in the hierarchy.
In cognitive psychology, George Kelly developed "constructive alternativism" as a form of personal construct psychology; this provided an alternative to what he considered "accumulative fragmentalism". In this theory, knowledge is seen as the construction of successful mental models of the exterior world, rather than the accumulation of independent "nuggets of truth".
Fragmentalism is an alternative term for ontological reductionism, although fragmentalism is frequently used in a pejorative sense. Anti-realists use the term fragmentalism in arguments that the world does not consist of separable entities, instead consisting of wholes. For example, advocates of this idea claim that:
The linear deterministic approach to nature and technology promoted a fragmented perception of reality, and a loss of the ability to foresee, to adequately evaluate, in all their complexity, global crises in ecology, civilization and education.
The term "fragmentalism" is usually applied to reductionist modes of thought, frequently with the related pejorative term of scientism. This usage is popular amongst some ecological activists.
Such opinions also motivate many criticisms of the scientific method:
The scientific method only acknowledges monophasic consciousness. The method is a specialized system that emphasizes studying small and distinctive parts in isolation, which results in fragmented knowledge.
A New Philosophy of Society: Assemblage Theory and Social Complexity is a 2006 book by Manuel DeLanda. The book is an attempt to loosely define a new ontology for use by social theorists — one that challenges the existing paradigm of meaningful social analyses being possible only on the level of either individuals (micro-reductionism) or "society as a whole" (macro-reductionism). Instead, the book employs Gilles Deleuze's theory of assemblages from A Thousand Plateaus (1980) to posit social entities on all scales (from sub-individual to transnational) that are best analysed through their components (themselves assemblages).

Anti-realism
In analytic philosophy, anti-realism is an epistemological position first articulated by British philosopher Michael Dummett. The term was coined as an argument against a form of realism Dummett saw as 'colorless reductionism'. In anti-realism, the truth of a statement rests on its demonstrability through internal logic mechanisms, such as the context principle or intuitionistic logic, in direct opposition to the realist notion that the truth of a statement rests on its correspondence to an external, independent reality. In anti-realism, this external reality is hypothetical and is not assumed. Because it encompasses statements containing abstract ideal objects (i.e. mathematical objects), anti-realism may apply to a wide range of philosophic topics, from material objects to the theoretical entities of science, mathematical statements, mental states, events and processes, the past and the future.

Antireductionism
Antireductionism is the position in science and metaphysics that stands in contrast to reductionism (anti-holism) by advocating that not all properties of a system can be explained in terms of its constituent parts and their interactions.

Antiscience
Antiscience is a position that rejects science and the scientific method. People holding antiscientific views do not accept science as an objective method that can generate universal knowledge. They also contend that scientific reductionism in particular is an inherently limited means to reach understanding of a complex world.

Emergentism
In philosophy, emergentism is the belief in emergence, particularly as it involves consciousness and the philosophy of mind, and as it contrasts (or not) with reductionism. A property of a system is said to be emergent if it is a new outcome of some other properties of the system and their interaction, while it is itself different from them. Emergent properties are not identical with, reducible to, or deducible from the other properties. The different ways in which this independence requirement can be satisfied lead to variant types of emergence.

Epistemological pluralism
Epistemological pluralism is a term used in philosophy, economics, and virtually any field of study to refer to different ways of knowing things, different epistemological methodologies for attaining a fuller description of a particular field. A particular form of epistemological pluralism is dualism, for example, the separation of methods for investigating mind from those appropriate to matter (see mind–body problem). By contrast, monism is the restriction to a single approach, for example, reductionism, which asserts the study of all phenomena can be seen as finding relations to some few basic entities.

Epistemological pluralism is to be distinguished from ontological pluralism, the study of different modes of being, for example, the contrast in the mode of existence exhibited by "numbers" with that of "people" or "cars".

In the philosophy of science, epistemological pluralism arose in opposition to reductionism to express the contrary view that at least some natural phenomena cannot be fully explained by a single theory or fully investigated using a single approach.

In mathematics, the variety of possible epistemological approaches includes platonism ("mathematics as an objective study of abstract reality, no more created by human thought than the galaxies"), radical constructivism (with restriction upon logic, banning proof by reductio ad absurdum and other limitations), and many other schools of thought.

In economics, controversy exists between a single epistemological approach to economics and a variety of approaches. "At midcentury, the neoclassical approach achieved near-hegemonic status (at least in the United States), and its proponents sought to bring all kinds of social phenomena under its uniform explanatory umbrella. The resistance of some phenomena to neoclassical treatment has led a number of economists to think that alternative approaches are necessary for at least some phenomena and thus also to advocate pluralism."
An extensive history of these attempts is provided by Sent.

Fallacy of the single cause
The fallacy of the single cause, also known as complex cause, causal oversimplification, causal reductionism, and reduction fallacy, is a fallacy of questionable cause that occurs when it is assumed that there is a single, simple cause of an outcome when in reality it may have been produced by a number of causes that are only jointly sufficient.
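The structure of jointly sufficient causes can be sketched with a hypothetical example (the fire scenario and the names are illustrative):

```python
def fire(heat, fuel, oxygen):
    # The outcome occurs only when all three causes are jointly present
    return heat and fuel and oxygen

# "The spark (heat) caused the fire" is true only in conjunction with the
# other causes; singling out heat as *the* cause is the oversimplification.
print(fire(True, True, True))   # True
print(fire(True, False, True))  # False: heat alone was never sufficient
```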
It can be logically reduced to: "X caused Y; therefore, X was the only cause of Y" (although A, B, C, etc. also contributed to Y). Causal oversimplification is a specific kind of false dilemma where conjoint possibilities are ignored. In other words, the possible causes are assumed to be "A or B or C" when "A and B and C" or "A and B and not C" (etc.) are not taken into consideration.

Greedy reductionism
Greedy reductionism, identified by Daniel Dennett in his 1995 book Darwin's Dangerous Idea, is a kind of erroneous reductionism. Whereas "good" reductionism means explaining a thing in terms of what it reduces to (for example, its parts and their interactions), greedy reductionism occurs when "in their eagerness for a bargain, in their zeal to explain too much too fast, scientists and philosophers ... underestimate the complexities, trying to skip whole layers or levels of theory in their rush to fasten everything securely and neatly to the foundation". Using the terminology of "cranes" (legitimate, mechanistic explanations) and "skyhooks" (essentially, fake—e.g. supernaturalistic—explanations) built up earlier in the chapter, Dennett recapitulates his initial definition of the term in the chapter summary on p. 83: "Good reductionists suppose that all Design can be explained without skyhooks; greedy reductionists suppose it can all be explained without cranes."

Hitchens's razor
Hitchens's razor is an epistemological razor asserting that the burden of proof regarding the truthfulness of a claim lies with the one who makes the claim, and if this burden is not met, the claim is unfounded, and its opponents need not argue further in order to dismiss it.

Holism in science
Holism in science, or holistic science, is an approach to research that emphasizes the study of complex systems. Systems are approached as coherent wholes whose component parts are best understood in context and in relation to one another and to the whole. This practice is in contrast to a purely analytic tradition (sometimes called reductionism) which aims to gain understanding of systems by dividing them into smaller composing elements and gaining understanding of the system through understanding their elemental properties. The holism–reductionism dichotomy is often evident in conflicting interpretations of experimental findings and in setting priorities for future research.

Immigration reduction in the United States
Immigration reduction refers to a movement in the United States that advocates a reduction in the amount of immigration allowed into the country. Steps advocated for reducing the numbers of immigrants include stronger action to prevent illegal entry and illegal immigration, and reductions in non-immigrant temporary work visas (such as H-1B and L-1). Some advocate a tightening of the requirements for legal immigration to reduce total numbers, or moving the proportions of legal immigrants away from those on family reunification programs toward skills-based criteria. What separates it from other calls for immigration reform is that reductionists see immigration, or one of its forms, as a significant source of social, economic, and environmental problems, and wish to cut current immigration levels.
Many immigration reformists support continued legal immigration, only opposing illegal immigration. Some immigration reductionists want legal immigration to be set at a percentage of current levels until fewer adverse effects are created by legal immigration. The related terminology "self-deportation" or "to self-deport" refers to the viewpoint that illegal immigration to the U.S. can be reduced through social policy that causes unauthorized residents to leave the U.S. on their own, thus creating a reduction.

Methodological individualism
In the social sciences, methodological individualism is the principle that subjective individual motivation explains social phenomena, rather than class or group dynamics which are (according to proponents of individualistic principles) illusory or artificial and therefore cannot truly explain market or social phenomena. Methodological individualism is often contrasted with methodological holism.

Occam's razor
Occam's razor (also Ockham's razor or Ocham's razor (Latin: novacula Occami); further known as the law of parsimony (Latin: lex parsimoniae)) is the problem-solving principle that essentially states that "simpler solutions are more likely to be correct than complex ones." When presented with competing hypotheses to solve a problem, one should select the solution with the fewest assumptions. The idea is attributed to English Franciscan friar William of Ockham (c. 1287–1347), a scholastic philosopher and theologian.
In science, Occam's razor is used as an abductive heuristic in the development of theoretical models, rather than as a rigorous arbiter between candidate models. In the scientific method, Occam's razor is not considered an irrefutable principle of logic or a scientific result; the preference for simplicity in the scientific method is based on the falsifiability criterion. For each accepted explanation of a phenomenon, there may be an extremely large, perhaps even incomprehensible, number of possible and more complex alternatives. Since one can always burden failing explanations with ad hoc hypotheses to prevent them from being falsified, simpler theories are preferable to more complex ones because they are more testable.

Physicalism
In philosophy, physicalism is the metaphysical thesis that "everything is physical", that there is "nothing over and above" the physical, or that everything supervenes on the physical. Physicalism is a form of ontological monism—a "one substance" view of the nature of reality as opposed to a "two-substance" (dualism) or "many-substance" (pluralism) view. Both the definition of "physical" and the meaning of physicalism have been debated.
Physicalism is closely related to materialism. Physicalism grew out of materialism with advancements of the physical sciences in explaining observed phenomena. The terms are often used interchangeably, although they are sometimes distinguished, for example on the basis of physics describing more than just matter (including energy and physical law). Common arguments against physicalism include both the philosophical zombie argument and the multiple observers argument, that the existence of a physical being may imply zero or more distinct conscious entities.

Separation of concerns
In computer science, separation of concerns (SoC) is a design principle for separating a computer program into distinct sections, so that each section addresses a separate concern. A concern is a set of information that affects the code of a computer program. A concern can be as general as the details of the hardware the code is being optimized for, or as specific as the name of a class to instantiate. A program that embodies SoC well is called a modular program. Modularity, and hence separation of concerns, is achieved by encapsulating information inside a section of code that has a well-defined interface. Encapsulation is a means of information hiding. Layered designs in information systems are another embodiment of separation of concerns (e.g., presentation layer, business logic layer, data access layer, persistence layer).

Separation of concerns results in higher degrees of freedom for some aspect of the program's design, deployment, or usage. Common among these is a higher degree of freedom for simplification and maintenance of code. When concerns are well separated, there are higher degrees of freedom for module reuse as well as independent development and upgrade. Because modules hide the details of their concerns behind interfaces, there is increased freedom to later improve or modify a single concern's section of code without having to know the details of other sections, and without having to make corresponding changes to those sections. Modules can also expose different versions of an interface, which increases the freedom to upgrade a complex system in piecemeal fashion without interim loss of functionality.
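A minimal sketch of the layered design described above (the layer names, classes, and example data here are illustrative inventions, not from any particular system): each layer handles one concern and depends only on the interface of the layer below it, so storage details can change without touching business rules or display code.

```python
class DataAccessLayer:
    """Data-access concern: how records are stored is hidden behind save/load."""
    def __init__(self):
        self._records = {}  # storage detail, invisible to the other layers

    def save(self, key, value):
        self._records[key] = dict(value)

    def load(self, key):
        return dict(self._records[key])


class BusinessLogicLayer:
    """Business concern: rules about users, not storage or display."""
    def __init__(self, store):
        self._store = store  # depends only on the store's save/load interface

    def register_user(self, name):
        self._store.save(name, {"name": name, "active": True})

    def deactivate_user(self, name):
        user = self._store.load(name)
        user["active"] = False
        self._store.save(name, user)


class PresentationLayer:
    """Presentation concern: formatting for display, nothing else."""
    def __init__(self, store):
        self._store = store

    def show_user(self, name):
        user = self._store.load(name)
        status = "active" if user["active"] else "inactive"
        return "{} ({})".format(user["name"], status)


store = DataAccessLayer()
logic = BusinessLogicLayer(store)
view = PresentationLayer(store)
logic.register_user("ada")
logic.deactivate_user("ada")
print(view.show_user("ada"))  # -> ada (inactive)
```

Replacing `DataAccessLayer` with, say, a file- or database-backed implementation exposing the same `save`/`load` interface would require no changes to the other two layers, which is the freedom the paragraph above describes.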
Separation of concerns is a form of abstraction. As with most abstractions, interfaces must be added and there is generally more net code to be executed. So despite the many benefits of well-separated concerns, there is often an associated execution penalty.

Special sciences
Special sciences are those sciences, other than fundamental physics, that are presumed to be reducible to fundamental physics, at least in principle. In this view, chemistry, biology, and neuroscience—indeed, all sciences except fundamental physics—are special sciences. The status of the special sciences, and their relation to physics, is unresolved in the philosophy of science. Jerry Fodor, for instance, has argued for strong autonomy, concluding that the special sciences are not even in principle reducible to physics.

Symmetry breaking
In physics, symmetry breaking is a phenomenon in which (infinitesimally) small fluctuations acting on a system crossing a critical point decide the system's fate by determining which branch of a bifurcation is taken. To an outside observer unaware of the fluctuations (or "noise"), the choice will appear arbitrary. This process is called symmetry "breaking" because such transitions usually bring the system from a symmetric but disorderly state into one or more definite states. Symmetry breaking is thought to play a major role in pattern formation.
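The bifurcation picture above can be illustrated numerically. The sketch below is a hypothetical toy model (not from the source): overdamped motion in the symmetric double-well potential V(x) = −x² + x⁴, where the symmetric state x = 0 is unstable and an infinitesimal random kick decides which of the two minima, x = ±1/√2, the system settles into.

```python
import random

def settle(seed, steps=10_000, dt=0.01, noise=1e-6):
    """Overdamped motion in the double-well potential V(x) = -x**2 + x**4.

    x = 0 is a symmetric but unstable state; an infinitesimally small
    random kick decides which minimum (x = +/- 1/sqrt(2)) is reached.
    """
    rng = random.Random(seed)
    x = 0.0
    for _ in range(steps):
        # dx/dt = -dV/dx = 2x - 4x**3, plus infinitesimally small noise
        x += dt * (2 * x - 4 * x ** 3) + noise * rng.gauss(0.0, 1.0)
    return x

# Different noise histories pick different branches of the bifurcation;
# an observer unaware of the noise would call the choice arbitrary.
branches = [1 if settle(seed) > 0 else -1 for seed in range(10)]
print(branches)
```

Although the potential (and hence the dynamics) is perfectly symmetric under x → −x, every run ends in one definite, asymmetric state, which is the sense in which the symmetry is "broken."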
In 1972, Nobel laureate P. W. Anderson used the idea of symmetry breaking to show some of the drawbacks of reductionism in his paper titled "More Is Different" in Science.

Symmetry breaking can be distinguished into two types, explicit symmetry breaking and spontaneous symmetry breaking, characterized by whether the equations of motion fail to be invariant or the ground state fails to be invariant.

Two Dogmas of Empiricism
"Two Dogmas of Empiricism" is a paper by analytic philosopher Willard Van Orman Quine published in 1951. According to University of Sydney professor of philosophy Peter Godfrey-Smith, this "paper [is] sometimes regarded as the most important in all of twentieth-century philosophy". The paper is an attack on two central aspects of the logical positivists' philosophy. One is the analytic–synthetic distinction between analytic truths and synthetic truths, explained by Quine as truths grounded only in meanings and independent of facts, and truths grounded in facts. The other is reductionism, the theory that each meaningful statement gets its meaning from some logical construction of terms that refers exclusively to immediate experience.
"Two Dogmas" has six sections. The first four focus on analyticity, the last two on reductionism. There, Quine turns the focus to the logical positivists' theory of meaning. He also presents his own holistic theory of meaning.Type physicalism
Type physicalism (also known as reductive materialism, type identity theory, mind–brain identity theory, and identity theory of mind) is a physicalist theory in the philosophy of mind. It asserts that mental events can be grouped into types, which can then be correlated with types of physical events in the brain. For example, one type of mental event, such as "mental pain", will presumably turn out to describe one type of physical event (such as C-fiber firings).
Type physicalism is contrasted with token identity physicalism, which argues that mental events are unlikely to have "steady" or categorical biological correlates. These positions make use of the philosophical type–token distinction (e.g., two persons having the same "type" of car need not mean that they share a "token", a single vehicle). Type physicalism can thus be understood to argue for identity between types, whereas token identity physicalism says one can only describe a particular, unique brain event.
There are other ways a physicalist might criticize type physicalism; eliminative materialism and revisionary materialism question whether science is currently using the best categorisations. In the same way talk of demonic possession was questioned with scientific advance, categorisations like "pain" may need to be revised.