Knowledge is a familiarity, awareness, or understanding of someone or something, such as facts, information, descriptions, or skills, which is acquired through experience or education by perceiving, discovering, or learning.

Knowledge can refer to a theoretical or practical understanding of a subject. It can be implicit (as with practical skill or expertise) or explicit (as with the theoretical understanding of a subject); it can be more or less formal or systematic.[1] In philosophy, the study of knowledge is called epistemology; the philosopher Plato famously defined knowledge as "justified true belief", though some analytic philosophers now consider this definition problematic because of the Gettier problems, while others defend the Platonic definition.[2] Several other definitions of knowledge, and theories to explain it, have been proposed.

Knowledge acquisition involves complex cognitive processes: perception, communication, and reasoning.[3] Knowledge is also said to be related to the capacity of acknowledgment in human beings.[4]

Theories of knowledge

Robert Reid, Knowledge (1896). Thomas Jefferson Building, Washington, D.C.

The eventual demarcation of philosophy from science was made possible by the notion that philosophy's core was "theory of knowledge," a theory distinct from the sciences because it was their foundation... Without this idea of a "theory of knowledge," it is hard to imagine what "philosophy" could have been in the age of modern science.

The definition of knowledge is a matter of ongoing debate among philosophers in the field of epistemology. The classical definition, described but not ultimately endorsed by Plato,[5] specifies that a statement must meet three criteria in order to be considered knowledge: it must be justified, true, and believed. Some claim that these conditions are not sufficient, as Gettier case examples allegedly demonstrate. A number of alternatives have been proposed, including Robert Nozick's arguments for a requirement that knowledge 'tracks the truth' and Simon Blackburn's additional requirement that we do not want to say that those who meet any of these conditions 'through a defect, flaw, or failure' have knowledge. Richard Kirkham suggests that our definition of knowledge requires that the evidence for the belief necessitates its truth.[6]
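The tripartite analysis described above is often written as a schematic biconditional; the rendering below is a common textbook shorthand, not Plato's own formulation:

```latex
% Classical ("justified true belief") analysis of knowledge:
% a subject S knows a proposition p iff all three conditions hold.
K_S(p) \iff
  \underbrace{p}_{p \text{ is true}}
  \;\land\;
  \underbrace{B_S(p)}_{S \text{ believes } p}
  \;\land\;
  \underbrace{J_S(p)}_{S \text{ is justified in believing } p}
```

Gettier cases are then read as situations in which the right-hand side holds but, intuitively, K_S(p) does not.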

In contrast to this approach, Ludwig Wittgenstein observed, following Moore's paradox, that one can say "He believes it, but it isn't so," but not "He knows it, but it isn't so."[7] He goes on to argue that these do not correspond to distinct mental states, but rather to distinct ways of talking about conviction. What is different here is not the mental state of the speaker, but the activity in which they are engaged. For example, on this account, to know that the kettle is boiling is not to be in a particular state of mind, but to perform a particular task with the statement that the kettle is boiling. Wittgenstein sought to bypass the difficulty of definition by looking to the way "knowledge" is used in natural languages. He saw knowledge as a case of a family resemblance. Following this idea, "knowledge" has been reconstructed as a cluster concept that points out relevant features but that is not adequately captured by any definition.[8]

Communicating knowledge

Los portadores de la antorcha (The Torch-Bearers) – Sculpture by Anna Hyatt Huntington symbolizing the transmission of knowledge from one generation to the next (Ciudad Universitaria, Madrid, Spain)

Symbolic representations can be used to indicate meaning and can be thought of as a dynamic process. Hence the transfer of the symbolic representation can be viewed as one ascription process whereby knowledge can be transferred. Other forms of communication include observation and imitation, verbal exchange, and audio and video recordings. Philosophers of language and semioticians construct and analyze theories of knowledge transfer or communication.

While many would agree that one of the most universal and significant tools for the transfer of knowledge is writing and reading (of many kinds), argument over the usefulness of the written word exists nonetheless, with some scholars skeptical of its impact on societies. In his collection of essays Technopoly, Neil Postman presents the argument against writing through an excerpt from Plato's Phaedrus (Postman 1992, p. 73). In this excerpt, Socrates recounts the story of Thamus, the Egyptian king, and Theuth, the inventor of the written word. Theuth presents his new invention, writing, to King Thamus, telling him that it "will improve both the wisdom and memory of the Egyptians" (Postman 1992, p. 74). King Thamus is skeptical of the new invention and rejects it as a tool of recollection rather than of retained knowledge. He argues that the written word will infect the Egyptian people with false knowledge, since they will be able to obtain facts and stories from an external source and will no longer be forced to mentally retain large quantities of knowledge themselves (Postman 1992, p. 74).

Classical early modern theories of knowledge, especially those advancing the influential empiricism of the philosopher John Locke, were based implicitly or explicitly on a model of the mind which likened ideas to words.[9] This analogy between language and thought laid the foundation for a graphic conception of knowledge in which the mind was treated as a table, a container of content, that had to be stocked with facts reduced to letters, numbers or symbols. This created a situation in which the spatial alignment of words on the page carried great cognitive weight, so much so that educators paid very close attention to the visual structure of information on the page and in notebooks.[10]

Major libraries today can have millions of books of knowledge (in addition to works of fiction). It is only recently that audio and video technologies for recording knowledge have become available, and their use still requires replay equipment and electricity. Verbal teaching and the handing down of knowledge are limited to those who would have contact with the transmitter or someone who could interpret written work. Writing is still the most available and most universal of all forms of recording and transmitting knowledge. It stands unchallenged as mankind's primary technology of knowledge transfer down through the ages and to all cultures and languages of the world.

Haraway on situated knowledge

Situated knowledge is knowledge specific to a particular situation. The term was used by Donna Haraway as an extension of the feminist approaches of "successor science" suggested by Sandra Harding, one which "offers a more adequate, richer, better account of a world, in order to live in it well and in critical, reflexive relation to our own as well as others' practices of domination and the unequal parts of privilege and oppression that make up all positions."[11] This situation partially transforms science into a narrative, which Arturo Escobar explains as "neither fictions nor supposed facts." Such narratives of situation are historical textures woven of fact and fiction, and, as Escobar explains further, "even the most neutral scientific domains are narratives in this sense." He insists that the purpose is not to dismiss science as a trivial matter of contingency, but "to treat (this narrative) in the most serious way, without succumbing to its mystification as 'the truth' or to the ironic skepticism common to many critiques."[12]

Haraway's argument stems from the limitations of human perception, as well as the overemphasis on the sense of vision in science. According to Haraway, vision in science has been "used to signify a leap out of the marked body and into a conquering gaze from nowhere." This is the "gaze that mythically inscribes all the marked bodies, that makes the unmarked category claim the power to see and not be seen, to represent while escaping representation."[11] This limits the view of science itself as a potential player in the creation of knowledge, resulting in a position of "modest witness". This is what Haraway terms a "god trick", or the aforementioned representation while escaping representation.[13] In order to avoid this, "Haraway perpetuates a tradition of thought which emphasizes the importance of the subject in terms of both ethical and political accountability".[14]

Some methods of generating knowledge, such as trial and error or learning from experience, tend to create highly situational knowledge. Situational knowledge is often embedded in language, culture, or traditions. This integration of situational knowledge alludes to the community and its attempts at collecting subjective perspectives into an embodiment of "views from somewhere."[11]

Even though Haraway's arguments are largely based on feminist studies,[11] this idea of different worlds, as well as the skeptical stance of situated knowledge, is present in the main arguments of post-structuralism. Fundamentally, both argue that knowledge is contingent on history, power, and geography; both reject universal rules, laws, or elementary structures; and both hold the idea of power as an inherited trait of objectification.[15]

Partial knowledge

The parable of the blind men and an elephant suggests that people tend to project their partial experiences as the whole truth

One discipline of epistemology focuses on partial knowledge. In most cases, it is not possible to understand an information domain exhaustively; our knowledge is always incomplete or partial. Most real problems must be solved by taking advantage of a partial understanding of the problem context and problem data, unlike the typical mathematics problems solved at school, where all the data is given and a complete understanding of the necessary formulas is assumed.

This idea is also present in the concept of bounded rationality, which assumes that in real-life situations people often have a limited amount of information and make decisions accordingly.
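Herbert Simon's classic illustration of bounded rationality is "satisficing": accepting the first option that clears an aspiration level rather than searching for the optimum. The sketch below is a minimal illustration; the names, data, and threshold are invented for the example.

```python
# Satisficing under bounded rationality: instead of evaluating every
# option to find the optimum, accept the first option that clears an
# "aspiration level". All names and values here are illustrative.

def satisfice(options, utility, aspiration):
    """Return the first option whose utility meets the aspiration level,
    or the best one seen if none does (search budget exhausted)."""
    best = None
    for option in options:
        u = utility(option)
        if u >= aspiration:
            return option          # good enough: stop searching
        if best is None or u > utility(best):
            best = option
    return best

# A decision-maker with limited information views apartments in order
# and takes the first one scoring at least 7 out of 10.
apartments = [("A", 5), ("B", 8), ("C", 10)]
choice = satisfice(apartments, lambda a: a[1], aspiration=7)
print(choice)  # ('B', 8): chosen before the optimal "C" is ever examined
```

Because the search stops at "good enough", the decision-maker here never even examines the best option, which is the behavior bounded rationality predicts for agents with limited information and time.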

Intuition is the ability to acquire partial knowledge without inference or the use of reason.[16] An individual may "know" about a situation and be unable to explain the process that led to their knowledge.

Scientific knowledge

The development of the scientific method has made a significant contribution to how knowledge of the physical world and its phenomena is acquired.[17] To be termed scientific, a method of inquiry must be based on gathering observable and measurable evidence subject to specific principles of reasoning and experimentation.[18] The scientific method consists of the collection of data through observation and experimentation, and the formulation and testing of hypotheses.[19] Science, and the nature of scientific knowledge, have also become a subject of philosophy. As science itself has developed, the notion of scientific knowledge has broadened in usage[20] into the soft sciences such as biology and the social sciences – discussed elsewhere as meta-epistemology or genetic epistemology, and to some extent related to the theory of cognitive development. Note that epistemology is the study of knowledge and how it is acquired; science has been described as "the process used every day to logically complete thoughts through inference of facts determined by calculated experiments."

Sir Francis Bacon was critical in the historical development of the scientific method; his works established and popularized an inductive methodology for scientific inquiry. His famous aphorism, "knowledge is power", is found in the Meditations Sacrae (1597).[21]

Until recent times, at least in the Western tradition, it was simply taken for granted that knowledge was something possessed only by humans – and probably only adult humans at that. Sometimes the notion might stretch to society as such, as in "the knowledge possessed by the Coptic culture" (as opposed to that of its individual members), but that was not assured either. Nor was it usual to consider unconscious knowledge in any systematic way until this approach was popularized by Freud.[22]

Other biological domains in which "knowledge" might be said to reside include the immune system and the DNA of the genetic code – the third and fourth of the four "epistemological domains" listed by Popper (1975)[23] and by Traill (2008: Table S, p. 31),[24] both of whom also refer to Niels Jerne.

Such considerations seem to call for a separate definition of "knowledge" to cover the biological systems. For biologists, knowledge must be usefully available to the system, though that system need not be conscious. Thus the criteria seem to be:

  • The system should be dynamic and self-organizing (unlike a mere book on its own).
  • The knowledge must constitute some sort of representation of "the outside world",[25] or ways of dealing with it (directly or indirectly).
  • Some way must exist for the system to access this information quickly enough for it to be useful.

Scientific knowledge may not involve a claim to certainty; maintaining skepticism means that a scientist will never be absolutely certain when they are correct and when they are not. It is thus an irony of proper scientific method that one must doubt even when correct, in the hope that this practice will lead to greater convergence on the truth in general.[26]

Religious meaning of knowledge

In many expressions of Christianity, such as Catholicism and Anglicanism, knowledge is one of the seven gifts of the Holy Spirit.[27]

The Old Testament's tree of the knowledge of good and evil contained the knowledge that separated Man from God: "And the LORD God said, Behold, the man is become as one of us, to know good and evil..." (Genesis 3:22)

In Gnosticism, the adherent hopes to attain divine knowledge, or gnosis.

विद्या दान (Vidya Daan), i.e. knowledge sharing, is a major part of Daan, a tenet of all Dharmic religions.[28] Hindu scriptures present two kinds of knowledge, Paroksh Gyan and Pratyaksh Gyan. Paroksh Gyan (also spelled Paroksha-Jnana) is secondhand knowledge: knowledge obtained from books, hearsay, etc. Pratyaksh Gyan (also spelled Pratyaksha-Jnana) is the knowledge borne of direct experience, i.e., knowledge that one discovers for oneself.[29] Jnana yoga ("path of knowledge") is one of three main types of yoga expounded by Krishna in the Bhagavad Gita. (It is compared and contrasted with Bhakti yoga and Karma yoga.)

In Islam, knowledge (Arabic: علم, ʿilm) is given great significance. "The Knowing" (al-ʿAlīm) is one of the 99 names reflecting distinct attributes of God. The Qur'an asserts that knowledge comes from God (2:239) and various hadith encourage the acquisition of knowledge. Muhammad is reported to have said "Seek knowledge from the cradle to the grave" and "Verily the men of knowledge are the inheritors of the prophets". Islamic scholars, theologians and jurists are often given the title alim, meaning "knowledgeable".

In Jewish tradition, knowledge (Hebrew: דעת da'ath) is considered one of the most valuable traits a person can acquire. Observant Jews recite three times a day in the Amidah "Favor us with knowledge, understanding and discretion that come from you. Exalted are you, Existent-One, the gracious giver of knowledge." The Tanakh states, "A wise man gains power, and a man of knowledge maintains power", and "knowledge is chosen above gold".

As a measure of religiosity in sociology of religion

According to the sociologist Mervin F. Verbit, knowledge may be understood as one of the key components of religiosity. Religious knowledge itself may be broken down into four dimensions:

  • content
  • frequency
  • intensity
  • centrality

The content of one's religious knowledge may vary from person to person, as will the degree to which it may occupy the person's mind (frequency), the intensity of the knowledge, and the centrality of the information (in that religious tradition, or to that individual).[30][31][32]

References

  1. ^ "knowledge: definition of knowledge in Oxford dictionary (American English) (US)". Archived from the original on 2010-07-14.
  2. ^ Paul Boghossian (2007), Fear of Knowledge: Against relativism and constructivism, Oxford: Clarendon Press, ISBN 978-0199230419, Chapter 7, pp. 95–101.
  3. ^ Dekel, Gil. "Methodology". Retrieved 3 July 2006.
  4. ^ Stanley Cavell, "Knowing and Acknowledging", Must We Mean What We Say? (Cambridge University Press, 2002), 238–266.
  5. ^ In Plato's Theaetetus, Socrates and Theaetetus discuss three definitions of knowledge: knowledge as nothing but perception, knowledge as true judgment, and, finally, knowledge as a true judgment with an account. Each of these definitions is shown to be unsatisfactory.
  6. ^ Kirkham, Richard L. (October 1984). "Does the Gettier Problem Rest on a Mistake?". Mind. New Series. 93 (372): 501–513. JSTOR 2254258 (subscription required).
  7. ^ Ludwig Wittgenstein, On Certainty, remark 42
  8. ^ Gottschalk-Mazouz, N. (2008): "Internet and the Flow of Knowledge", in: Hrachovec, H.; Pichler, A. (eds.): Philosophy of the Information Society. Proceedings of the 30th International Ludwig Wittgenstein Symposium, Kirchberg am Wechsel, Austria 2007. Volume 2, Frankfurt, Paris, Lancaster, New Brunswick: Ontos, pp. 215–232. Archived from the original (PDF) on 2015-05-24. Retrieved 2015-05-24.
  9. ^ Hacking, Ian (1975). Why Does Language Matter to Philosophy?. Cambridge: Cambridge University Press. ISBN 978-0521099981.
  10. ^ Eddy, Matthew Daniel (2013). "The Shape of Knowledge: Children and the Visual Culture of Literacy and Numeracy". Science in Context. 26 (2): 215–245. doi:10.1017/s0269889713000045.
  11. ^ a b c d Haraway, Donna (1988). "Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective". Feminist Studies. 14 (3): 575–599.
  12. ^ Escobar, Arturo. "Introduction: Development and the Anthropology of Modernity". Encountering Development: The Making and Unmaking of the Third World.
  13. ^ Haraway, Donna (1997). Modest_Witness@Second_Millennium.FemaleMan©_Meets_OncoMouse™: Feminism and Technoscience. Chapter 1.
  14. ^ Braidotti, Rosi (2006). "Posthuman, All Too Human: Towards a New Process Ontology". Theory, Culture & Society. 23: 197–208.
  15. ^ Foucault, Michel (1982). "The Subject and Power". Critical Inquiry. 9 (4): 777–795.
  16. ^ Oxford English Dictionary
  17. ^ "Science – Definition of science by Merriam-Webster".
  18. ^ "[4] Rules for the study of natural philosophy", Newton 1999, pp. 794–796, from the General Scholium, which follows Book 3, The System of the World.
  19. ^ scientific method, Merriam-Webster Dictionary.
  20. ^ Wilson, Timothy D. (12 July 2012). "Stop bullying the 'soft' sciences". Los Angeles Times.
  21. ^ "Sir Francis Bacon –". Retrieved 2009-07-08.
  22. ^ There is quite a good case for this exclusive specialization used by philosophers, in that it allows for in-depth study of logic procedures and other abstractions which are not found elsewhere. However, this may lead to problems whenever the topic spills over into those excluded domains – e.g. when Kant (following Newton) dismissed Space and Time as axiomatically "transcendental" and "a priori" – a claim later disproved by Piaget's clinical studies. It also seems likely that the vexed problem of "infinite regress" can be largely (but not completely) solved by proper attention to how unconscious concepts are actually developed, both during infantile learning and as "pseudo-transcendentals" inherited from the trial-and-error of previous generations. See also "Tacit knowledge".
    • Piaget, J., and B.Inhelder (1927/1969). The child's conception of time. Routledge & Kegan Paul: London.
    • Piaget, J., and B. Inhelder (1948/1956). The child's conception of space. Routledge & Kegan Paul: London.
  23. ^ Popper, K.R. (1975). "The rationality of scientific revolutions"; in Rom Harré (ed.), Problems of Scientific Revolution: Scientific Progress and Obstacles to Progress in the Sciences. Clarendon Press: Oxford.
  24. ^ Robert R. Traill. "Thinking by Molecule, Synapse, or both? : From Piaget's Schema, to the Selecting/Editing of ncRNA : Ondwelle short-monograph, No. 2" (PDF). Retrieved 3 February 2019.
  25. ^ This "outside world" could include other subsystems within the same organism – e. g. different "mental levels" corresponding to different Piagetian stages. See Theory of cognitive development.
  26. ^ "philosophy bites".
  27. ^ "Part Three, No. 1831". Catechism of the Catholic Church. Archived from the original on 2007-05-04. Retrieved 2007-04-20.
  28. ^ "विद्या दान ही सबसे बडा दान : विहिप" ["The gift of knowledge is the greatest gift: VHP"] – Vishva Hindu Parishad – Official Website. Archived from the original on 2011-08-20.
  29. ^ Swami Krishnananda. "Chapter 7". The Philosophy of the Panchadasi. The Divine Life Society. Retrieved 2008-07-05.
  30. ^ Verbit, M.F. (1970). The components and dimensions of religious behavior: Toward a reconceptualization of religiosity. American mosaic, 24, 39.
  31. ^ Küçükcan, T. (2010). Multidimensional Approach to Religion: a way of looking at religious phenomena. Journal for the Study of Religions and Ideologies, 4(10), 60–70.
  32. ^ Talip Küçükcan. "CAN RELIGIOSITY BE MEASURED? DIMENSIONS OF RELIGIOUS COMMITMENT : Theories Revisited" (PDF). Retrieved 3 February 2019.

A priori and a posteriori

The Latin phrases a priori (lit. "from the earlier") and a posteriori (lit. "from the later") are philosophical terms popularized by Immanuel Kant's Critique of Pure Reason (first published in 1781, second edition in 1787), one of the most influential works in the history of philosophy. However, in their Latin forms they appear in Latin translations of Euclid's Elements (c. 300 BC), a work widely regarded during the early modern period in Europe as a model of precise thinking.

These terms are used with respect to reasoning (epistemology) to distinguish "necessary conclusions from first premises" (i.e., what must come before sense observation) from "conclusions based on sense observation" which must follow it. Thus, the two kinds of knowledge, justification, or argument, may be glossed:

A priori knowledge or justification is independent of experience, as with mathematics (3 + 2 = 5), tautologies ("All bachelors are unmarried"), and deduction from pure reason (e.g., ontological proofs).

A posteriori knowledge or justification depends on experience or empirical evidence, as with most aspects of science and personal knowledge.

There are many points of view on these two types of knowledge, and their relationship gives rise to one of the oldest problems in modern philosophy.

The terms a priori and a posteriori are primarily used as adjectives to modify the noun "knowledge" (for example, "a priori knowledge"). However, "a priori" is sometimes used to modify other nouns, such as "truth". Philosophers also may use "apriority" and "aprioricity" as nouns to refer (approximately) to the quality of being "a priori".

Although definitions and use of the terms have varied in the history of philosophy, they have consistently labeled two separate epistemological notions. See also the related distinctions: deductive/inductive, analytic/synthetic, necessary/contingent.

Adam and Eve

Adam and Eve, according to the creation myth of the Abrahamic religions, were the first man and woman. They are central to the belief that humanity is in essence a single family, with everyone descended from a single pair of original ancestors. The story also provides the basis for the doctrines of the fall of man and original sin, which are important beliefs in Christianity, although not held in Judaism or Islam.

In the Book of Genesis of the Hebrew Bible, chapters one through five, there are two creation narratives with two distinct perspectives. In the first, Adam and Eve are not named. Instead, God created humankind in God's image and instructed them to multiply and to be stewards over everything else that God had made. In the second narrative, God fashions Adam from dust and places him in the Garden of Eden. Adam is told that he can eat freely of all the trees in the garden, except for a tree of the knowledge of good and evil. Subsequently, Eve is created from one of Adam's ribs to be Adam's companion. They are innocent and unembarrassed about their nakedness. However, a serpent deceives Eve into eating fruit from the forbidden tree, and she gives some of the fruit to Adam. These acts give them additional knowledge, but also the ability to conjure negative and destructive concepts such as shame and evil. God later curses the serpent and the ground. God prophetically tells the woman and the man what will be the consequences of their sin of disobeying God. Then he banishes them from the Garden of Eden.

The story underwent extensive elaboration in later Abrahamic traditions, and it has been extensively analyzed by modern biblical scholars. Interpretations and beliefs regarding Adam and Eve and the story revolving around them vary across religions and sects; for example, the Islamic version of the story holds that Adam and Eve were equally responsible for their sins of hubris, instead of Eve being the first one to be unfaithful. The story of Adam and Eve is often depicted in art, and it has had an important influence in literature and poetry.

The story of the fall of Adam is often considered to be an allegory. Physical evidence indicates that Adam and Eve never existed as a literal pair: findings in genetics are incompatible with humanity descending from a single first pair of human beings.

Artificial intelligence

In computer science, artificial intelligence (AI), sometimes called machine intelligence, is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans and animals. Computer science defines AI research as the study of "intelligent agents": any device that perceives its environment and takes actions that maximize its chance of successfully achieving its goals. Colloquially, the term "artificial intelligence" is used to describe machines that mimic "cognitive" functions that humans associate with other human minds, such as "learning" and "problem solving".

As machines become increasingly capable, tasks considered to require "intelligence" are often removed from the definition of AI, a phenomenon known as the AI effect. A quip in Tesler's Theorem says "AI is whatever hasn't been done yet." For instance, optical character recognition is frequently excluded from things considered to be AI, having become a routine technology. Modern machine capabilities generally classified as AI include successfully understanding human speech, competing at the highest level in strategic game systems (such as chess and Go), autonomously operating cars, intelligent routing in content delivery networks, and military simulations.
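The "intelligent agent" definition above (a device that perceives its environment and acts to maximize its chance of achieving its goals) can be sketched in a few lines; the thermostat-style environment and utility function below are invented for illustration.

```python
# A minimal "intelligent agent" in the textbook sense: map a percept to
# the action with the highest expected utility toward a goal.
# The environment, actions, and utility function are illustrative.

def agent_act(percept, actions, expected_utility):
    """Choose the action with the highest expected utility given a percept."""
    return max(actions, key=lambda a: expected_utility(percept, a))

# A thermostat-like agent: the percept is the room temperature (°C),
# and the goal is to be as close to 20 °C as possible.
def utility(temp, action):
    effect = {"heat": +2, "cool": -2, "idle": 0}[action]
    return -abs((temp + effect) - 20)   # closer to the goal scores higher

print(agent_act(17, ["heat", "cool", "idle"], utility))  # heat
```

Even this toy agent exhibits the defining loop of the formal definition: perceive, evaluate the available actions against a goal, and act.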

Artificial intelligence can be classified into three different types of systems: analytical, human-inspired, and humanized artificial intelligence. Analytical AI has only characteristics consistent with cognitive intelligence; generating a cognitive representation of the world and using learning based on past experience to inform future decisions. Human-inspired AI has elements from cognitive and emotional intelligence; understanding human emotions, in addition to cognitive elements, and considering them in their decision making. Humanized AI shows characteristics of all types of competencies (i.e., cognitive, emotional, and social intelligence), is able to be self-conscious and is self-aware in interactions with others.

Artificial intelligence was founded as an academic discipline in 1956, and in the years since has experienced several waves of optimism, followed by disappointment and the loss of funding (known as an "AI winter"), followed by new approaches, success and renewed funding. For most of its history, AI research has been divided into subfields that often fail to communicate with each other. These sub-fields are based on technical considerations, such as particular goals (e.g. "robotics" or "machine learning"), the use of particular tools ("logic" or artificial neural networks), or deep philosophical differences. Subfields have also been based on social factors (particular institutions or the work of particular researchers).

The traditional problems (or goals) of AI research include reasoning, knowledge representation, planning, learning, natural language processing, perception and the ability to move and manipulate objects. General intelligence is among the field's long-term goals. Approaches include statistical methods, computational intelligence, and traditional symbolic AI. Many tools are used in AI, including versions of search and mathematical optimization, artificial neural networks, and methods based on statistics, probability and economics. The AI field draws upon computer science, information engineering, mathematics, psychology, linguistics, philosophy, and many other fields.

The field was founded on the claim that human intelligence "can be so precisely described that a machine can be made to simulate it". This raises philosophical arguments about the nature of the mind and the ethics of creating artificial beings endowed with human-like intelligence, issues that have been explored by myth, fiction and philosophy since antiquity. Some people also consider AI to be a danger to humanity if it progresses unabated. Others believe that AI, unlike previous technological revolutions, will create a risk of mass unemployment.

In the twenty-first century, AI techniques have experienced a resurgence following concurrent advances in computer power, large amounts of data, and theoretical understanding; and AI techniques have become an essential part of the technology industry, helping to solve many challenging problems in computer science, software engineering and operations research.


Cognition is "the mental action or process of acquiring knowledge and understanding through thought, experience, and the senses". It encompasses many aspects of intellectual functions and processes such as attention, the formation of knowledge, memory and working memory, judgment and evaluation, reasoning and "computation", problem solving and decision making, comprehension and production of language. Cognitive processes use existing knowledge and generate new knowledge.

The processes are analyzed from different perspectives within different contexts, notably in the fields of linguistics, anesthesia, neuroscience, psychiatry, psychology, education, philosophy, anthropology, biology, systemics, logic, and computer science. These and other different approaches to the analysis of cognition are synthesised in the developing field of cognitive science, a progressively autonomous academic discipline.


Data (DAY-tə, DAT-ə, or DAH-tə) is a set of values of subjects with respect to qualitative or quantitative variables.

Data and information or knowledge are often used interchangeably; however, data becomes information when it is viewed in context or in post-analysis.

While the concept of data is commonly associated with scientific research, data is collected by a huge range of organizations and institutions, including businesses (e.g., sales data, revenue, profits, stock price), governments (e.g., crime rates, unemployment rates, literacy rates) and non-governmental organizations (e.g., censuses of the number of homeless people by non-profit organizations).

Data is measured, collected and reported, and analyzed, whereupon it can be visualized using graphs, images or other analysis tools. Data as a general concept refers to the fact that some existing information or knowledge is represented or coded in some form suitable for better usage or processing. Raw data ("unprocessed data") is a collection of numbers or characters before it has been "cleaned" and corrected by researchers. Raw data needs to be corrected to remove outliers or obvious instrument or data entry errors (e.g., a thermometer reading from an outdoor Arctic location recording a tropical temperature). Data processing commonly occurs by stages, and the "processed data" from one stage may be considered the "raw data" of the next stage. Field data is raw data that is collected in an uncontrolled "in situ" environment. Experimental data is data that is generated within the context of a scientific investigation by observation and recording. Data has been described as the new oil of the digital economy.
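The correction of raw data described above can be sketched as a simple range check; the Arctic-thermometer bounds and readings below are invented for the example.

```python
# Cleaning raw data: flag readings outside a plausible range before
# analysis, as in the Arctic-thermometer example above.
# The plausibility bounds and readings are illustrative.

def clean(raw_readings, lo=-60.0, hi=10.0):
    """Separate plausible readings (°C) from obvious instrument or
    data-entry errors, returning (kept, rejected)."""
    kept = [r for r in raw_readings if lo <= r <= hi]
    rejected = [r for r in raw_readings if not (lo <= r <= hi)]
    return kept, rejected

raw = [-31.2, -28.9, 35.4, -30.1]      # 35.4 °C is a "tropical" reading
kept, rejected = clean(raw)
print(kept)      # [-31.2, -28.9, -30.1]
print(rejected)  # [35.4]
```

In a staged pipeline, the `kept` list would become the "raw data" of the next processing stage, as the paragraph above describes.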

Data mining

Data mining is the process of discovering patterns in large data sets, involving methods at the intersection of machine learning, statistics, and database systems. It is an interdisciplinary subfield of computer science and statistics with the overall goal of extracting information (with intelligent methods) from a data set and transforming it into a comprehensible structure for further use. Data mining is the analysis step of the "knowledge discovery in databases" (KDD) process. Aside from the raw analysis step, it also involves database and data management aspects, data pre-processing, model and inference considerations, interestingness metrics, complexity considerations, post-processing of discovered structures, visualization, and online updating.

The difference between data analysis and data mining is that data analysis is used to test models and hypotheses on the dataset, e.g., analyzing the effectiveness of a marketing campaign, regardless of the amount of data; in contrast, data mining uses machine learning and statistical models to uncover hidden patterns in a large volume of data.

The term "data mining" is in fact a misnomer, because the goal is the extraction of patterns and knowledge from large amounts of data, not the extraction (mining) of data itself. It is also a buzzword and is frequently applied to any form of large-scale data or information processing (collection, extraction, warehousing, analysis, and statistics) as well as any application of computer decision support systems, including artificial intelligence (e.g., machine learning) and business intelligence. The book Data Mining: Practical Machine Learning Tools and Techniques with Java (which covers mostly machine learning material) was originally to be named simply Practical Machine Learning, and the term data mining was added only for marketing reasons. Often the more general terms (large-scale) data analysis and analytics – or, when referring to actual methods, artificial intelligence and machine learning – are more appropriate.

The actual data mining task is the semi-automatic or automatic analysis of large quantities of data to extract previously unknown, interesting patterns such as groups of data records (cluster analysis), unusual records (anomaly detection), and dependencies (association rule mining, sequential pattern mining). This usually involves using database techniques such as spatial indices. These patterns can then be seen as a kind of summary of the input data, and may be used in further analysis or, for example, in machine learning and predictive analytics. For example, the data mining step might identify multiple groups in the data, which can then be used to obtain more accurate prediction results by a decision support system. Neither the data collection and data preparation, nor the interpretation and reporting of results, is part of the data mining step, but they do belong to the overall KDD process as additional steps.
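The cluster-analysis task mentioned above can be illustrated with a minimal sketch. The data, the `two_means` helper, and its parameters are hypothetical constructions for this example, not a real library API; production work would use an established implementation:

```python
import statistics

# Hypothetical one-dimensional records; a clustering step might group
# them into "low" and "high" clusters without being told those labels.
data = [1.0, 1.2, 0.9, 8.0, 8.3, 7.9]

def two_means(points, iters=10):
    """Minimal 1-D k-means with k = 2 (illustrative only)."""
    c1, c2 = min(points), max(points)   # crude initial centroids
    g1, g2 = [], []
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        g1 = [p for p in points if abs(p - c1) <= abs(p - c2)]
        g2 = [p for p in points if abs(p - c1) > abs(p - c2)]
        # Move each centroid to the mean of its assigned points.
        c1, c2 = statistics.mean(g1), statistics.mean(g2)
    return sorted(g1), sorted(g2)

low, high = two_means(data)
print(low)   # cluster around 1
print(high)  # cluster around 8
```

The two groups returned here are exactly the kind of "summary of the input data" a later predictive-analytics step could consume.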

The related terms data dredging, data fishing, and data snooping refer to the use of data mining methods to sample parts of a larger population data set that are (or may be) too small for reliable statistical inferences to be made about the validity of any patterns discovered. These methods can, however, be used in creating new hypotheses to test against the larger data populations.
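A rough sketch of why data dredging is unreliable: repeatedly sampling tiny subsets of a patternless population and keeping the most extreme-looking one will "discover" an apparent pattern where none exists. The population size, sample size, and seed below are arbitrary assumptions for the demonstration:

```python
import random

random.seed(0)
# A population with no real pattern: 10,000 draws from a standard normal.
population = [random.gauss(0, 1) for _ in range(10_000)]

# "Dredge": examine 1,000 tiny subsamples and keep the most extreme one.
extreme = max(
    (random.sample(population, 5) for _ in range(1000)),
    key=lambda s: abs(sum(s) / len(s)),
)

# The selected subsample's mean looks far from 0, despite the null
# population: an artifact of selection, not a genuine pattern.
print(sum(extreme) / len(extreme))
```

Treating such a dredged-up result as a hypothesis to test against fresh, larger data (rather than as a conclusion) is the legitimate use the paragraph above describes.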

Empirical evidence

Empirical evidence is the information received by means of the senses, particularly by observation and documentation of patterns and behavior through experimentation. The term comes from the Greek word for experience, ἐμπειρία (empeiría).

After Immanuel Kant, in philosophy it is common to call knowledge thus gained a posteriori knowledge (in contrast to a priori knowledge).


Epistemology (from Greek ἐπιστήμη, epistēmē, meaning 'knowledge', and λόγος, logos, meaning 'the study of [a certain subject]') is the branch of philosophy concerned with the theory of knowledge. Epistemology is the study of the nature of knowledge, justification, and the rationality of belief. Much debate in epistemology centers on four areas: (1) the philosophical analysis of the nature of knowledge and how it relates to such concepts as truth, belief, and justification, (2) various problems of skepticism, (3) the sources and scope of knowledge and justified belief, and (4) the criteria for knowledge and justification. Epistemology addresses such questions as: "What makes justified beliefs justified?", "What does it mean to say that we know something?", and fundamentally "How do we know that we know?"

Fellow of the Royal Society

Fellowship of the Royal Society (FRS, ForMemRS and HonFRS) is an award granted to individuals whom the Royal Society of London judges to have made a 'substantial contribution to the improvement of natural knowledge, including mathematics, engineering science and medical science'.

Fellowship of the Society, the oldest scientific academy in continuous existence, is a significant honour which has been awarded to many eminent scientists from history including Isaac Newton (1672), Charles Darwin (1839), Michael Faraday (1824), Ernest Rutherford (1903), Srinivasa Ramanujan (1918), Albert Einstein (1921), Winston Churchill (1941), Subrahmanyan Chandrasekhar (1944), Dorothy Hodgkin (1947), Alan Turing (1951) and Francis Crick (1959). More recently, fellowship has been awarded to Stephen Hawking (1974), Tim Hunt (1991), Elizabeth Blackburn (1992), Tim Berners-Lee (2001), Venkatraman Ramakrishnan (2003), Atta-ur Rahman (2006), Andre Geim (2007), James Dyson (2015), Ajay Kumar Sood (2015), Subhash Khot (2017), Elon Musk (2018) and around 8,000 others in total, including over 280 Nobel Laureates since 1900. As of October 2018, there are approximately 1689 living Fellows, Foreign and Honorary Members, of which over 60 are Nobel Laureates.

Fellowship of the Royal Society has been described by The Guardian newspaper as "the equivalent of a lifetime achievement Oscar", with several institutions celebrating their announcement each year.

Free content

Free content, libre content, or free information, is any kind of functional work, work of art, or other creative content that meets the definition of a free cultural work.


A how-to is an informal, often short, video, writing, or description of how to accomplish a specific task. A how-to is usually meant to help non-experts, may leave out details that are only important to experts, and may also be greatly simplified from an overall discussion of the topic.

One of the earliest how-to books was published in 1569 and entitled, A booke of the arte and maner, how to plant and graffe all sortes of trees: With divers other new practise, by one of the Abbey of Saint Vincent in Fraunce by Leonard Mascall.

Perhaps the best known full-length book in the genre is How to Win Friends and Influence People, written by Dale Carnegie in 1936.

A similar concept can be seen in many of the [topic] For Dummies series of tutorials and also in many other introductory surveys entitled with the suffix "101" (based on academic numberings of entry-level courses).

Knowledge management

Knowledge management (KM) is the process of creating, sharing, using and managing the knowledge and information of an organisation. It refers to a multidisciplinary approach to achieving organisational objectives by making the best use of knowledge.

An established discipline since 1991, KM includes courses taught in the fields of business administration, information systems, management, and library and information science. Other fields may contribute to KM research, including information and media, computer science, public health and public policy. Several universities offer dedicated master's degrees in knowledge management.

Many large companies, public institutions and non-profit organisations have resources dedicated to internal KM efforts, often as a part of their business strategy, IT, or human resource management departments. Several consulting companies provide advice regarding KM to these organisations.

Knowledge management efforts typically focus on organisational objectives such as improved performance, competitive advantage, innovation, the sharing of lessons learned, integration and continuous improvement of the organisation. These efforts overlap with organisational learning and may be distinguished from it by a greater focus on the management of knowledge as a strategic asset and on encouraging the sharing of knowledge. KM is an enabler of organisational learning.


The occult (from the Latin word occultus "clandestine, hidden, secret") is "knowledge of the hidden" or "knowledge of the paranormal", as opposed to facts and "knowledge of the measurable", usually referred to as science. The term is sometimes taken to mean knowledge that "is meant only for certain people" or that "must be kept hidden", but for most practicing occultists it is simply the study of a deeper spiritual reality that extends beyond pure reason and the physical sciences. The terms esoteric and arcane can also be used to describe the occult, in addition to their meanings unrelated to the supernatural.

The term occult sciences was used in the 16th century to refer to astrology, alchemy, and natural magic. The term occultism emerged in 19th-century France, where it came to be associated with various French esoteric groups connected to Éliphas Lévi and Papus, and in 1875 was introduced into the English language by the esotericist Helena Blavatsky. Throughout the 20th century, the term was used idiosyncratically by a range of different authors, but by the 21st century was commonly employed – including by academic scholars of esotericism – to refer to a range of esoteric currents that developed in the mid-19th century and their descendants. Occultism is thus often used to categorise such esoteric traditions as Spiritualism, Theosophy, Anthroposophy, the Hermetic Order of the Golden Dawn, and New Age.

Particularly since the late twentieth century, various authors have used the occult as a substantivized adjective. In this usage, "the occult" is a category into which varied beliefs and practices are placed if they are considered to fit into neither religion nor science. "The occult" in this sense is very broad, encompassing such phenomena as beliefs in vampires or fairies and movements like ufology and parapsychology. In that same period, occult and culture were combined to form the neologism occulture. Initially used in the industrial music scene, it was later given scholarly applications.


Philosophy (from Greek φιλοσοφία, philosophia, literally "love of wisdom") is the study of general and fundamental questions about existence, knowledge, values, reason, mind, and language. Such questions are often posed as problems to be studied or resolved. The term was probably coined by Pythagoras (c. 570 – 495 BCE). Philosophical methods include questioning, critical discussion, rational argument, and systematic presentation. Classic philosophical questions include: Is it possible to know anything and to prove it? What is most real? Philosophers also pose more practical and concrete questions such as: Is there a best way to live? Is it better to be just or unjust (if one can get away with it)? Do humans have free will?

Historically, "philosophy" encompassed any body of knowledge. From the time of Ancient Greek philosopher Aristotle to the 19th century, "natural philosophy" encompassed astronomy, medicine, and physics. For example, Newton's 1687 Mathematical Principles of Natural Philosophy later became classified as a book of physics. In the 19th century, the growth of modern research universities led academic philosophy and other disciplines to professionalize and specialize. In the modern era, some investigations that were traditionally part of philosophy became separate academic disciplines, including psychology, sociology, linguistics, and economics.

Other investigations closely related to art, science, politics, or other pursuits remained part of philosophy. For example, is beauty objective or subjective? Are there many scientific methods or just one? Is political utopia a hopeful dream or hopeless fantasy? Major sub-fields of academic philosophy include metaphysics ("concerned with the fundamental nature of reality and being"), epistemology (about the "nature and grounds of knowledge [and]...its limits and validity"), ethics, aesthetics, political philosophy, logic and philosophy of science.


A polymath (Greek: πολυμαθής, polymathēs, "having learned much"; Latin: homo universalis, "universal man") is a person whose expertise spans a significant number of subject areas, and who is known to draw on complex bodies of knowledge to solve specific problems.

In Western Europe, the first work to use polymathy in its title (De Polymathia tractatio: integri operis de studiis veterum) was published in 1603 by Johann von Wowern, a Hamburg philosopher. Von Wowern defined polymathy as "knowledge of various matters, drawn from all kinds of studies [...] ranging freely through all the fields of the disciplines, as far as the human mind, with unwearied industry, is able to pursue them". Von Wowern lists erudition, literature, philology, philomathy and polyhistory as synonyms. The related term polyhistor is an ancient term with a similar meaning.

Polymaths include the great thinkers of the Renaissance and the Enlightenment, who excelled at several fields in science, technology, engineering, mathematics, and the arts. In the Italian Renaissance, the idea of the polymath was expressed by Leon Battista Alberti (1404–1472) in the statement that "a man can do all things if he will". Embodying a basic tenet of Renaissance humanism that humans are limitless in their capacity for development, the concept led to the notion that people should embrace all knowledge and develop their capacities as fully as possible. This is expressed in the term "Renaissance man", often applied to the gifted people of that age who sought to develop their abilities in all areas of accomplishment: intellectual, artistic, social and physical.

The term entered the lexicon in the 20th century and has now been applied to great thinkers living before and after the Renaissance.


Research comprises "creative and systematic work undertaken to increase the stock of knowledge, including knowledge of humans, culture and society, and the use of this stock of knowledge to devise new applications." It is used to establish or confirm facts, reaffirm the results of previous work, solve new or existing problems, support theorems, or develop new theories. A research project may also be an expansion on past work in the field. Research projects can be used to develop further knowledge on a topic, or in the example of a school research project, they can be used to further a student's research prowess to prepare them for future jobs or reports. To test the validity of instruments, procedures, or experiments, research may replicate elements of prior projects or the project as a whole. The primary purposes of basic research (as opposed to applied research) are documentation, discovery, interpretation, or the research and development (R&D) of methods and systems for the advancement of human knowledge. Approaches to research depend on epistemologies, which vary considerably both within and between humanities and sciences. There are several forms of research: scientific, humanities, artistic, economic, social, business, marketing, practitioner research, life, technological, etc.


Science (from the Latin word scientia, meaning "knowledge") is a systematic enterprise that builds and organizes knowledge in the form of testable explanations and predictions about the universe.

The earliest roots of science can be traced to Ancient Egypt and Mesopotamia in around 3500 to 3000 BCE. Their contributions to mathematics, astronomy, and medicine entered and shaped Greek natural philosophy of classical antiquity, whereby formal attempts were made to explain events of the physical world based on natural causes. After the fall of the Western Roman Empire, knowledge of Greek conceptions of the world deteriorated in Western Europe during the early centuries (400 to 1000 CE) of the Middle Ages but was preserved in the Muslim world during the Islamic Golden Age. The recovery and assimilation of Greek works and Islamic inquiries into Western Europe from the 10th to 13th century revived natural philosophy, which was later transformed by the Scientific Revolution that began in the 16th century as new ideas and discoveries departed from previous Greek conceptions and traditions. The scientific method soon played a greater role in knowledge creation, and it was not until the 19th century that many of the institutional and professional features of science began to take shape.

Modern science is typically divided into three major branches that consist of the natural sciences (e.g., biology, chemistry, and physics), which study nature in the broadest sense; the social sciences (e.g., economics, psychology, and sociology), which study individuals and societies; and the formal sciences (e.g., logic, mathematics, and theoretical computer science), which study abstract concepts. There is disagreement, however, on whether the formal sciences actually constitute a science as they do not rely on empirical evidence. Disciplines that use existing scientific knowledge for practical purposes, such as engineering and medicine, are described as applied sciences.

Science is based on research, which is commonly conducted in academic and research institutions as well as in government agencies and companies. The practical impact of scientific research has led to the emergence of science policies that seek to influence the scientific enterprise by prioritizing the development of commercial products, armaments, health care, and environmental protection.


Skepticism (American English) or scepticism (British English, Australian English, and Canadian English) is generally any questioning attitude or doubt towards one or more items of putative knowledge or belief. It is often directed at domains, such as the supernatural, morality (moral skepticism), religion (skepticism about the existence of God), or knowledge (skepticism about the possibility of knowledge, or of certainty). Formally, skepticism as a topic occurs in the context of philosophy, particularly epistemology, although it can be applied to any topic such as politics, religion, and pseudoscience.

Philosophical skepticism comes in various forms. Radical forms of skepticism deny that knowledge or rational belief is possible and urge us to suspend judgment on many or all controversial matters. More moderate forms of skepticism claim only that nothing can be known with certainty, or that we can know little or nothing about the "big questions" in life, such as whether God exists or whether there is an afterlife. Religious skepticism is "doubt concerning basic religious principles (such as immortality, providence, and revelation)". Scientific skepticism concerns testing beliefs for reliability, by subjecting them to systematic investigation using the scientific method, to discover empirical evidence for them.

Web of Science

Web of Science (previously known as Web of Knowledge) is an online subscription-based scientific citation indexing service originally produced by the Institute for Scientific Information (ISI), later maintained by Clarivate Analytics (previously the Intellectual Property and Science business of Thomson Reuters), that provides a comprehensive citation search. It gives access to multiple databases that reference cross-disciplinary research, which allows for in-depth exploration of specialized sub-fields within an academic or scientific discipline.


This page is based on a Wikipedia article written by its contributors.
Text is available under the CC BY-SA 3.0 license; additional terms may apply.
Images, videos and audio are available under their respective licenses.