Instrumentalism

In philosophy of science and in epistemology, instrumentalism is a methodological view that ideas are useful instruments, and that the worth of an idea is measured by how effective it is in explaining and predicting phenomena.[1] In John Dewey's pragmatism, instrumentalism is the doctrine that thought is an instrument for solving practical problems, and that truth is not fixed but changes as problems change. In the philosophy of science, instrumentalism is the view that scientific theories are useful tools for predicting phenomena rather than true or approximately true descriptions of the world.[2]

The truth of an idea is determined by its success in the active solution of a problem.[3] A successful scientific theory reveals nothing known to be either true or false about nature's unobservable objects, properties, or processes.[4] Scientific theories are assessed on their usefulness in generating predictions and in having those predictions confirmed by data and observations, not on their ability to reveal the truth about some unobservable phenomenon; the question of "truth" is simply not taken into account one way or the other. According to instrumentalists, a scientific theory is merely a tool whereby humans predict observations in a particular domain of nature by formulating laws, which state or summarize regularities, while theories themselves do not reveal supposedly hidden aspects of nature that somehow explain these laws.[5] Initially a novel perspective introduced by Pierre Duhem in 1906, instrumentalism is largely the prevailing view that underpins the practice of physicists today.[5]
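For illustration of this predictive reading of laws (the gas-law case is a stock example, not one drawn from the sources cited here), an instrumentalist can treat Boyle's law purely as a summary of regularities among observable pressure and volume readings, usable for prediction without any commitment to unobservable molecules:

\[
P_1 V_1 = P_2 V_2 \qquad\Longrightarrow\qquad P_2 = \frac{P_1 V_1}{V_2}.
\]

For a fixed quantity of gas at constant temperature, measured values P_1 = 100 kPa, V_1 = 2 L, and V_2 = 1 L yield the prediction P_2 = 200 kPa, which is then checked against observation; on the instrumentalist reading, nothing more is claimed for the law than this kind of predictive service.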

Rejecting scientific realism's ambitions to uncover metaphysical truth about nature,[5] instrumentalism is usually categorized as a form of antirealism, although its mere lack of commitment to the realism of scientific theory can also be termed nonrealism. Instrumentalism merely bypasses debate concerning whether, for example, a particle spoken about in particle physics is a discrete entity enjoying individual existence, or is an excitation mode of a region of a field, or is something else altogether.[6][7][8] Instrumentalism holds that theoretical terms need only be useful to predict the phenomena, the observed outcomes.[6]

There are multiple versions of instrumentalism, all of which are varieties of scientific anti-realism.

Recent developments

Kyle Stanford, a philosopher at the University of California, Irvine, has written about recent developments in instrumentalism in Physical Theory: Method and Interpretation.[9]

In philosophy of mind

In the philosophy of mind, instrumentalism is the view that propositional attitudes such as beliefs are not concepts on which we can base scientific investigations of mind and brain, but that treating other beings as having beliefs is nevertheless useful for predicting their behavior.[10]

Relation to pragmatism

Instrumentalism is closely related to pragmatism, the view that practical consequences are an essential basis for determining meaning, truth, or value.[11]

History

British empiricism

Newton's theory of motion, whereby any object instantly interacts with all other objects across the universe, motivated the founder of British empiricism, John Locke, to speculate that matter is capable of thought.[12] The next leading British empiricist, George Berkeley, argued that an object's putative primary qualities as recognized by scientists, such as shape, extension, and impenetrability, are inconceivable without the putative secondary qualities of color, hardness, warmth, and so on. He also posed the question how or why an object could be properly conceived to exist independently of any perception of it.[13] Berkeley did not object to everyday talk about the reality of objects, but instead took issue with philosophers' talk, who spoke as if they knew something beyond sensory impressions that ordinary folk did not.[14]

For Berkeley, a scientific theory does not state causes or explanations, but simply identifies perceived types of objects and traces their typical regularities.[14] Berkeley thus anticipated the basis of what Auguste Comte in the 1830s called positivism,[14] although Comtean positivism added other principles concerning the scope, method, and uses of science that Berkeley would have disavowed. Berkeley also noted the usefulness of a scientific theory having terms that merely serve to aid calculations without having to refer to anything in particular, so long as they proved useful in practice.[14] Berkeley thus foreshadowed the insight that the logical positivists, who originated in the late 1920s but by the 1950s had softened into logical empiricists, would be compelled to accept: theoretical terms in science do not always translate into observational terms.[15]

The last great British empiricist, David Hume, posed a number of challenges to Bacon's inductivism, which had been the prevailing, or at least the professed, view concerning the attainment of scientific knowledge. Regarding himself as having placed his own theory of knowledge on par with Newton's theory of motion, Hume supposed that he had championed inductivism over scientific realism. Upon reading Hume's work, Immanuel Kant was "awakened from dogmatic slumber" and thus sought to neutralise any threat to science posed by Humean empiricism. Kant would develop the first full-blown philosophy of physics.[16]

German idealism

To save Newton's law of universal gravitation, Immanuel Kant reasoned that the mind is the precondition of experience and thus the bridge from the noumena, which are how the world's things exist in themselves, to the phenomena, which are humans' recognized experiences. The mind itself contains the structure that determines space, time, and substance, and the mind's own categorization of noumena renders space Euclidean, time constant, and objects' motions as exhibiting the very determinism predicted by Newtonian physics. Kant apparently presumed that the human mind, rather than being a phenomenon that had itself evolved, had been predetermined and set forth upon the formation of humankind. In any event, the mind also was the veil of appearance that scientific methods could never lift. And yet the mind could ponder itself and discover such truths, although not on a theoretical level, but only by means of ethics. Kant's metaphysics, transcendental idealism, thus secured science from doubt, in that it was a case of "synthetic a priori" knowledge ("universal, necessary and informative"), and yet discarded hope of scientific realism. Meanwhile, it was a watershed for idealist metaphysics, and it launched German idealism, most influentially Hegel's absolute idealism or objective idealism, or at least interpretations, often misinterpretations, and political misuses, of it.

Logical empiricism

Holding that the mind has virtually no power to know anything beyond direct sensory experience, Ernst Mach's early version of logical positivism (empirio-criticism) verged on idealism. It was even alleged to be a surreptitious solipsism, whereby all that exists is one's own mind. Mach's positivism also strongly asserted the ultimate unity of the empirical sciences, and it asserted phenomenalism as the new basis of scientific theory: all scientific terms were to refer to either actual or potential sensations, thus eliminating hypotheses while permitting such seemingly disparate scientific theories as the physical and the psychological to share terms and forms. Phenomenalism proved insuperably difficult to implement, yet it heavily influenced a new generation of philosophers of science, who emerged in the 1920s, termed themselves logical positivists, and pursued a program called verificationism. Logical positivists aimed not to instruct or restrict scientists, but to enlighten and structure philosophical discourse so as to render a scientific philosophy that would verify philosophical statements as well as scientific theories and align all human knowledge into a scientific worldview, freeing humankind from so many of its problems due to confused or unclear language.

The verificationists expected a strict divide between theory and observation, mirrored by a divide between a theory's theoretical terms and its observational terms. Believing a theory's posited unobservables always to correspond to observations, the verificationists viewed a scientific theory's theoretical terms, such as electron, as metaphorical or elliptical for observations, such as white streak in a cloud chamber. They believed that scientific terms lacked meanings unto themselves, but acquired meanings from the logical structure that was the entire theory, which in turn matched patterns of experience. So by translating theoretical terms into observational terms and then decoding the theory's mathematical/logical structure, one could check whether the statement indeed matched patterns of experience, and thereby verify the scientific theory as false or true. Such verification would be possible, as never before in science, since translating theoretical terms into observational terms would make the scientific theory purely empirical, not metaphysical. Yet the logical positivists ran into insuperable difficulties. Moritz Schlick debated with Otto Neurath over foundationalism—the traditional view traced to Descartes as founder of modern Western philosophy—whereupon only nonfoundationalism was found tenable. Science, then, could not find a secure foundation of indubitable truth.
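As a toy rendering of such a translation (a schematic correspondence rule in the spirit the verificationists envisaged, not a formalization taken from any particular positivist text), the theoretical predicate E(x), "x is an electron traversing the chamber", might be tied to the observational predicate O(x), "a white streak appears in the cloud chamber at x":

\[
\forall x\,\bigl(E(x) \leftrightarrow O(x)\bigr),
\]

so that any statement containing the theoretical term could, in principle, be rewritten using only observational vocabulary and then checked directly against experience.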

And since science aims to reveal not private but public truths, the verificationists switched from phenomenalism to physicalism, whereby scientific theory refers to objects observable in space and at least in principle already recognizable by physicists. Finding strict empiricism untenable, verificationism underwent a "liberalization of empiricism". Rudolf Carnap even suggested that empiricism's basis was pragmatic. Recognizing that verification—proving a theory false or true—was unattainable, they discarded that demand and focused on confirmation theory. Carnap sought simply to quantify a universal law's degree of confirmation—its probable truth—but, despite his great mathematical and logical skill, found that the resulting equations could never yield a degree of confirmation greater than zero for a universal law. Carl Hempel identified the paradox of confirmation. By the 1950s, the verificationists had established philosophy of science as a subdiscipline within academia's philosophy departments. By 1962, they had asked, and endeavored to answer, seemingly all the great questions about scientific theory, and their findings showed that the idealized scientific worldview was naively mistaken. By then the leader of the legendary venture, Hempel raised the white flag that signaled verificationism's demise. Suddenly striking Western society, then, was Kuhn's landmark thesis, introduced by none other than Carnap, verificationism's greatest firebrand. The instrumentalism exhibited by scientists often does not even distinguish unobservable from observable entities.[6]
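The trouble with universal laws can be sketched as follows (a standard illustration of the zero-confirmation problem, not a reproduction of Carnap's own system of inductive logic). If a universal law h ranges over an unbounded domain of instances e_1, e_2, ..., each of which it entails, and each untested instance is assigned, independently, some prior probability p < 1 of conforming to the law, then

\[
P(h) \;\le\; P(e_1 \wedge e_2 \wedge \cdots \wedge e_n) \;\le\; p^{\,n} \longrightarrow 0 \quad\text{as } n \to \infty,
\]

and since the degree of confirmation of h on evidence e is the conditional probability P(h | e) = P(h ∧ e)/P(e), a prior of zero leaves the law's confirmation at zero however much favorable evidence accumulates.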

Historical turn

From the 1930s until Thomas Kuhn's 1962 The Structure of Scientific Revolutions, there were roughly two prevailing views about the nature of science. The popular view was scientific realism, which usually involved a belief that science was progressively unveiling a truer view, and building a better understanding, of nature. The professional approach was logical empiricism, wherein a scientific theory was held to be a logical structure whose terms all ultimately refer to some form of observation, while an objective process neutrally arbitrated theory choice, compelling scientists to decide which scientific theory was superior. Physicists knew better but, busy developing the Standard Model, were so steeped in quantum field theory that their talk, largely metaphorical, perhaps even metaphysical, was unintelligible to the public, while the steep mathematics warded off philosophers of physics.[7] By the 1980s, physicists regarded fields, not particles, as the more fundamental, and no longer even hoped to discover what entities and processes might be truly fundamental to nature, perhaps not even the field.[7][8] Kuhn had not claimed to have developed a novel thesis, but instead hoped to synthesize recent developments in the philosophy of science more usefully.

In 1906, Duhem had introduced the problem of the underdetermination of theory by data: since any dataset can be consistent with several different explanations, the success of a prediction does not logically confirm the truth of the theory in question; to infer the theory from its successful prediction is to affirm the consequent, a deductive fallacy. In the 1930s, Ludwik Fleck had explained the role of perspectivism (logology) in science, whereby scientists are trained in thought collectives to adopt particular thought styles that set expectations for what counts as a proper scientific question, scientific experiment, and scientific datum. Scientists manipulate experimental conditions to obtain results that cohere with their own expectations—with what the scientists presuppose is realistic—and as a result might be tempted to invoke the experimenter's regress in order to reject unexpected results, redoing the experiments under what are supposedly better and more conducive conditions. By the 1960s, physicists recognized two differing roles of physical theory: formalism and interpretation. Formalism involves the mathematical equations and axioms that, upon input of physical data, yield certain predictions; interpretation seeks to explain why the formalism succeeds.
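Schematically (a textbook rendering of the fallacy, not a formulation taken from Duhem), confirming a prediction and refuting one have very different logical force:

\[
\frac{T \rightarrow P \qquad \neg P}{\neg T}\ \text{(valid: modus tollens, the basis of refutation)}
\qquad
\frac{T \rightarrow P \qquad P}{T}\ \text{(invalid: affirming the consequent)}
\]

And since a rival theory T' may entail the very same prediction P, observing P cannot by itself single out T over T'.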

Widely read, Kuhn's 1962 thesis seemed to shatter logical empiricism, whose paradigmatic science was physics and which championed instrumentalism. Yet scientific realists, who were far more tenacious, responded by attacking Kuhn's thesis, perennially depicted thereafter as either illuminating or infamous. Kuhn later indicated that his thesis had been so widely misunderstood that he himself was not a Kuhnian. With logical empiricism's demise, Karl Popper's falsificationism was in the ascendancy, and Popper was knighted in 1965. Yet in 1961 the molecular biology research program had made its first major empirical breakthrough in cracking the genetic code, and by the 1970s molecular genetics' research tools could also be used for genetic engineering. In 1975, philosopher of science Hilary Putnam famously resurrected scientific realism with his no miracles argument, whereby the best scientific theories' predictive successes would appear miraculous if those theories were not at least approximately true about reality as it exists in and of itself beyond human perception. Antirealist arguments were formulated in response.

Karl Popper's scientific realism

Rejecting all variants of positivism for their focus on sensations rather than on realism, Karl Popper asserted his commitment to scientific realism, qualified only by the necessary uncertainty of his own falsificationism. Popper alleged that instrumentalism reduces basic science to what is merely applied science.[17]

Constructive empiricism as a form of instrumentalism

Bas van Fraassen's (1980)[18] project of constructive empiricism restricts belief to the domain of the observable, and for this reason it is described as a form of instrumentalism.[19]

See also

  • Anti-realism
  • Constructive empiricism
  • Pragmatism
  • Scientific realism
  • Unobservable

Notable proponents of instrumentalism

  • John Dewey
  • Pierre Duhem

Notes

  1. ^ https://www.philosophybasics.com/branch_instrumentalism.html
  2. ^ http://www.newworldencyclopedia.org/entry/Instrumentalism
  3. ^ https://en.wikipedia.org/wiki/Instrumentalism#cite_note-1
  4. ^
    • Anjan Chakravartty, "Scientific realism", §4 "Antirealism: Foils for scientific realism: §4.1: "Empiricism", in Edward N. Zalta, ed, The Stanford Encyclopedia of Philosophy, Summer 2013 edn: "Traditionally, instrumentalists maintain that terms for unobservables, by themselves, have no meaning; construed literally, statements involving them are not even candidates for truth or falsity. The most influential advocates of instrumentalism were the logical empiricists (or logical positivists), including Carnap and Hempel, famously associated with the Vienna Circle group of philosophers and scientists as well as important contributors elsewhere. In order to rationalize the ubiquitous use of terms which might otherwise be taken to refer to unobservables in scientific discourse, they adopted a non-literal semantics according to which these terms acquire meaning by being associated with terms for observables (for example, 'electron' might mean 'white streak in a cloud chamber'), or with demonstrable laboratory procedures (a view called 'operationalism'). Insuperable difficulties with this semantics led ultimately (in large measure) to the demise of logical empiricism and the growth of realism. The contrast here is not merely in semantics and epistemology: a number of logical empiricists also held the neo-Kantian view that ontological questions 'external' to the frameworks for knowledge represented by theories are also meaningless (the choice of a framework is made solely on pragmatic grounds), thereby rejecting the metaphysical dimension of realism (as in Carnap 1950)".
    • Samir Okasha, Philosophy of Science: A Very Short Introduction (New York: Oxford University Press, 2002), p. 62: "Strictly we should distinguish two sorts of anti-realism. According to the first sort, talk of unobservable entities is not to be understood literally at all. So when a scientist puts forward a theory about electrons, for example, we should not take him to be asserting the existence of entities called 'electrons'. Rather, his talk of electrons is metaphorical. This form of anti-realism was popular in the first half of the 20th century, but few people advocate it today. It was motivated largely by a doctrine in the philosophy of language, according to which it is not possible to make meaningful assertions about things that cannot in principle be observed, a doctrine that few contemporary philosophers accept. The second sort of anti-realism accepts that talk of unobservable entities should be taken at face value: if a theory says that electrons are negatively charged, it is true if electrons do exist and are negatively charged, but false otherwise. But we will never know which, says the anti-realist. So the correct attitude towards the claims that scientists make about unobservable reality is one of total agnosticism. They are either true or false, but we are incapable of finding out which. Most modern anti-realism is of this second sort".
  5. ^ a b c Roberto Torretti, The Philosophy of Physics (Cambridge: Cambridge University Press, 1999), pp. 242–43: "Like Whewell and Mach, Duhem was a practicing scientist who devoted an important part of his adult life to the history and philosophy of physics. ... His philosophy is contained in La théorie physique: son objet, sa structure [The Aim and Structure of Physical Theory] (1906), which may well be, to this day, the best overall book on the subject. Its main theses, although quite novel when first put forward, have in the meantime become commonplace, so I shall review them summarily without detailed argument, just to associate them with his name. But first I ought to say that neither in the first nor in the second (1914) edition of his book did Duhem take into account—or even so much as mention—the deep changes that were then taking place in physics. Still, the subsequent success and current entrenchment of Duhem's ideas are due above all to their remarkable agreement with—and the light they throw on—the practice of mathematical physics in the twentieth century. In the first part of La théorie physique, Duhem contrasts two opinions concerning the aim of physical theory. For some authors, it ought to furnish 'the explanation of a set of experimentally established laws', while for others it is 'an abstract system whose aim is to summarize and logically classify a set of experimental laws, without pretending to explain these laws' (Duhem 1914, p. 3). Duhem resolutely sides with the latter. His rejection of the former rests on his understanding of 'explanation' ('explication' in French), which he expresses as follows: 'To explain, explicare, is to divest reality from the appearances which enfold it like veils, in order to see the reality face to face' (pp 3–4). Authors in the first group expect from physics the true vision of things-in-themselves that religious myth and philosophical speculation have hitherto been unable to supply. Their explanation makes no sense unless (i) there is, 'beneath the sense appearances revealed to us by our perceptions, [...] a reality different from these appearances' and (ii) we know 'the nature of the elements which constitute' that reality (p 7). Thus, physical theory cannot explain—in the stated sense—the laws established by experiment unless it depends on metaphysics and thus remains subject to the interminable disputes of metaphysicians. Worse still, the teachings of no metaphysical school are sufficiently detailed and precise to account for all of the elements of physical theory (p 18). Duhem instead assigns to physical theories a more modest but autonomous and readily attainable aim: 'A physical theory is not an explanation. It is a system of mathematical propositions, derived from a small number of principles, whose purpose is to represent a set of experimental laws as simply, as completely, and as exactly as possible (Duhem 1914, p. 24)".
  6. ^ a b c P Kyle Stanford, Exceeding Our Grasp: Science, History, and the Problem of Unconceived Alternatives (New York: Oxford University Press, 2006), p. 198.
  7. ^ a b c Roberto Torretti, The Philosophy of Physics (Cambridge: Cambridge University Press, 1999), pp. 396–97, including quote: "First, quantum field theories have been the working theories at the frontline of physics for over 30 years. Second, these theories appear to do away with the familiar conception of physical systems as aggregates of substantive individual particles. This conception was already undermined by Bose–Einstein and Fermi–Dirac statistics (§6.1.4), according to which the so-called particles cannot be assigned a definite trajectory in ordinary space. But quantum field theories go a long step further and—or so it would seem—conceive 'particles' as excitation modes of the field. This, I presume, motivated Howard Stein's saying that 'the quantum theory of fields is the contemporary locus of metaphysical research' (1970, p. 285). Finally, the very fact that physicists conspicuously and fruitfully resort to unperspicacious theories can teach us something about the aim and reach of science. Here is how physicists work, dirty-handed, in their everyday practice, a far cry from what is taught at the Sunday school of the 'scientific worldview' ".
  8. ^ a b Meinard Kuhlmann, "Physicists debate whether the world is made of particles or fields—or something else entirely", Scientific American, 2013 Aug;309(2).
  9. ^ https://www.patheos.com/blogs/driventoabstraction/2018/09/scientific-realist-instrumentalist-theories/
  10. ^ https://www.philosophybasics.com/branch_instrumentalism.html
  11. ^ https://www.philosophybasics.com/branch_instrumentalism.html
  12. ^ Torretti 1999 p. 75.
  13. ^ Torretti 1999 pp. 101–02.
  14. ^ a b c d Torretti 1999 p. 102.
  15. ^ Torretti 1999 p. 103.
  16. ^ Torretti 1999 p. 98: "I shall dwell at some length on Kant's conception of the sources and scope of Newton's conceptual frame, for it was the first full-blown philosophy of physics and remains to this day the most significant".
  17. ^ Karl R Popper, Conjectures and Refutations: The Growth of Scientific Knowledge (London: Routledge, 2003 [1963]), ISBN 0-415-28594-1, quote: "Instrumentalism can be formulated as the thesis that scientific theories—the theories of the so-called 'pure' sciences—are nothing but computational rules (or inference rules); of the same character, fundamentally, as the computation rules of the so-called 'applied' sciences. (One might even formulate it as the thesis that "pure" science is a misnomer, and that all science is 'applied'.) Now my reply to instrumentalism consists in showing that there are profound differences between "pure" theories and technological computation rules, and that instrumentalism can give a perfect description of these rules but is quite unable to account for the difference between them and the theories".
  18. ^ van Fraassen, Bas C., 1980, The Scientific Image, Oxford: Oxford University Press.
  19. ^ Scientific Realism (Stanford Encyclopedia of Philosophy)
  20. ^ a b Gouinlock, James, "What is the Legacy of Instrumentalism? Rorty's Interpretation of Dewey." In Herman J. Saatkamp, ed., Rorty and Pragmatism. Nashville, TN: Vanderbilt University Press, 1995.

Sources

  • Torretti, Roberto, The Philosophy of Physics (Cambridge: Cambridge University Press, 1999), on Berkeley, pp. 98, 101–04.
