Semantics (from Ancient Greek: σημαντικός sēmantikós, "significant") is the linguistic and philosophical study of meaning in language, programming languages, formal logics, and semiotics. It is concerned with the relationship between signifiers—like words, phrases, signs, and symbols—and what they stand for in reality, their denotation.
In the international scientific vocabulary, semantics is also called semasiology. The word semantics was first used by Michel Bréal, a French philologist. It denotes a range of ideas, from the popular to the highly technical. It is often used in ordinary language to denote a problem of understanding that comes down to word selection or connotation. This problem of understanding has been the subject of many formal enquiries over a long period of time, especially in the field of formal semantics. In linguistics, it is the study of the interpretation of signs or symbols used by agents or communities within particular circumstances and contexts. Within this view, sounds, facial expressions, body language, and proxemics have semantic (meaningful) content, and each comprises several branches of study. In written language, things like paragraph structure and punctuation bear semantic content; other forms of language bear other semantic content.
The formal study of semantics intersects with many other fields of inquiry, including lexicology, syntax, pragmatics, etymology and others. Independently, semantics is also a well-defined field in its own right, often with synthetic properties. In the philosophy of language, semantics and reference are closely connected. Further related fields include philology, communication, and semiotics. The formal study of semantics can therefore be manifold and complex.
Semantics contrasts with syntax, the study of the combinatorics of units of a language (without reference to their meaning), and pragmatics, the study of the relationships between the symbols of a language, their meaning, and the users of the language. Semantics as a field of study also has significant ties to various representational theories of meaning, including truth theories of meaning, coherence theories of meaning, and correspondence theories of meaning. Each of these is related to the general philosophical study of reality and the representation of meaning.

In the 1960s, psychosemantic studies became popular after Osgood's massive cross-cultural studies using his semantic differential (SD) method, which used thousands of nouns and bipolar adjective scales. A specific form of the SD, the Projective Semantics method, uses only the most common and neutral nouns that correspond to the seven groups (factors) of adjective scales most consistently found in cross-cultural studies (Evaluation, Potency, and Activity, as found by Osgood, and Reality, Organization, Complexity, and Limitation, as found in other studies). In this method, seven groups of bipolar adjective scales correspond to seven types of nouns, so the method was thought to have object-scale symmetry (OSS) between the scales and the nouns evaluated on them. For example, the nouns corresponding to the seven listed factors would be: Beauty, Power, Motion, Life, Work, Chaos, Law. Beauty was expected to be assessed unequivocally as "very good" on adjectives of Evaluation-related scales, Life as "very real" on Reality-related scales, and so on. However, deviations in this symmetric and very basic matrix might show underlying biases of two types: scale-related bias and object-related bias. This OSS design was meant to increase the sensitivity of the SD method to any semantic biases in the responses of people within the same culture and educational background.
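The object-scale symmetry design can be sketched as a small noun-by-scale rating matrix. All numbers and names below are invented for illustration; real studies aggregate ratings from many respondents:

```python
# A hypothetical sketch of the object-scale symmetry (OSS) design:
# rows are the 7 nouns, columns the 7 matching adjective-scale groups,
# and entries are mean ratings on a -3..+3 bipolar scale.

nouns = ["Beauty", "Power", "Motion", "Life", "Work", "Chaos", "Law"]
ratings = [
    [3, 1, 0, 1, 0, -1, 0],    # Beauty, rated on all 7 scale groups
    [1, 3, 1, 0, 1, 0, 1],     # Power
    [0, 1, 3, 0, 1, 0, 0],     # Motion
    [1, 0, 0, 3, 1, 0, 0],     # Life
    [0, 1, 1, 1, 3, -1, 1],    # Work
    [-1, 0, 1, 0, -1, 3, -2],  # Chaos
    [0, 1, 0, 0, 1, -2, 3],    # Law
]

# Symmetry predicts the largest rating on the diagonal (each noun on its
# own scale group); systematic row or column deviations would suggest
# object-related or scale-related bias, respectively.
object_bias = {n: sum(row) / 7 for n, row in zip(nouns, ratings)}
print(round(object_bias["Beauty"], 2))  # 0.57
```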
In linguistics, semantics is the subfield that is devoted to the study of meaning, as inherent at the levels of words, phrases, sentences, and larger units of discourse (termed texts, or narratives). The study of semantics is also closely linked to the subjects of representation, reference and denotation. The basic study of semantics is oriented to the examination of the meaning of signs, and the study of relations between different linguistic units and compounds: homonymy, synonymy, antonymy, hypernymy, hyponymy, meronymy, metonymy, holonymy, paronyms. A key concern is how meaning attaches to larger chunks of text, possibly as a result of the composition from smaller units of meaning. Traditionally, semantics has included the study of sense and denotative reference, truth conditions, argument structure, thematic roles, discourse analysis, and the linkage of all of these to syntax.
In the late 1960s, Richard Montague proposed a system for defining semantic entries in the lexicon in terms of the lambda calculus. In these terms, the syntactic parse of the sentence John ate every bagel would consist of a subject (John) and a predicate (ate every bagel); Montague demonstrated that the meaning of the sentence as a whole could be decomposed into the meanings of its parts, combined by relatively few rules of combination. The logical predicate thus obtained would be elaborated further, e.g. using truth theory models, which ultimately relate meanings to a set of Tarskian universals, which may lie outside the logic. The notion of such meaning atoms or primitives is basic to the language of thought hypothesis from the 1970s.
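As an illustrative sketch (not Montague's actual fragment), lexical meanings can be written as functions and composed; the toy model, lexicon, and encoding below are all invented for illustration:

```python
# Lexical meanings as functions over a tiny invented model.
entities = {"john", "b1", "b2"}            # the domain of individuals
bagels = {"b1", "b2"}                      # individuals that are bagels
ate = {("john", "b1"), ("john", "b2")}     # pairs (eater, eaten)

john = "john"                                # type e (an individual)
bagel = lambda x: x in bagels                # type <e,t> (a property)
ate_rel = lambda y: lambda x: (x, y) in ate  # type <e,<e,t>>

# 'every' maps a restrictor P and a scope Q to a truth value:
every = lambda P: lambda Q: all(Q(x) for x in entities if P(x))

# Compose the sentence: [[every bagel]] applied to "was eaten by John".
sentence = every(bagel)(lambda y: ate_rel(y)(john))
print(sentence)  # True in this model
```

Here the quantifier denotes a function from properties to truth values, mirroring the compositional treatment of quantified noun phrases.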
Despite its elegance, Montague grammar was limited by the context-dependent variability in word sense, which led to several attempts to incorporate context.
In Chomskyan linguistics there was no mechanism for the learning of semantic relations, and the nativist view considered all semantic notions as inborn. Thus, even novel concepts were proposed to have been dormant in some sense. This view was also thought unable to address many issues such as metaphor or associative meanings, and semantic change, where meanings within a linguistic community change over time, and qualia or subjective experience. Another issue not addressed by the nativist model was how perceptual cues are combined in thought, e.g. in mental rotation.
This view of semantics, as an innate finite meaning inherent in a lexical unit that can be composed to generate meanings for larger chunks of discourse, is now being fiercely debated in the emerging domain of cognitive linguistics and also in the non-Fodorian camp in the philosophy of language. The main challenge to it is the context-dependence of meaning.
A concrete example of this phenomenon is semantic underspecification: meanings are not complete without some elements of context. To take the single word red, its meaning in a phrase such as red book is similar to many other usages and can be viewed as compositional. However, the colours implied in phrases such as red wine (very dark), red hair (coppery), red soil, or red skin are very different. Indeed, these colours by themselves would not be called red by native speakers. These instances are contrastive, so red wine is so called only in comparison with the other kind of wine (which also is not white for the same reasons). This view goes back to de Saussure:
Each of a set of synonyms like redouter ('to dread'), craindre ('to fear'), avoir peur ('to be afraid') has its particular value only because they stand in contrast with one another. No word has a value that can be identified independently of what else is in its vicinity.
An attempt to defend a system based on propositional meaning for semantic underspecification can be found in the generative lexicon model of James Pustejovsky, who extends contextual operations (based on type shifting) into the lexicon. Thus meanings are generated "on the fly", based on finite context.
Another set of concepts related to fuzziness in semantics is based on prototypes. The work of Eleanor Rosch in the 1970s led to a view that natural categories are not characterizable in terms of necessary and sufficient conditions, but are graded (fuzzy at their boundaries) and inconsistent as to the status of their constituent members. One may compare this with Jung's archetype, though the concept of archetype remains static. Some post-structuralists reject the notion of fixed or static word meanings; Derrida, following Nietzsche, talked about slippages in fixed meanings.
Systems of categories are not objectively out there in the world but are rooted in people's experience. These categories evolve as learned concepts of the world – meaning is not an objective truth, but a subjective construct, learned from experience, and language arises out of the "grounding of our conceptual systems in shared embodiment and bodily experience". A corollary of this is that the conceptual categories (i.e. the lexicon) will not be identical for different cultures, or indeed, for every individual in the same culture. This leads to another debate (see the Sapir–Whorf hypothesis or Eskimo words for snow).
Model-theoretic semantics originates from Montague's work (see above). It is a highly formalized theory of natural language semantics in which expressions are assigned denotations (meanings) such as individuals, truth values, or functions from one of these to another. The truth of a sentence, and its logical relation to other sentences, is then evaluated relative to a model.
Truth-conditional semantics, pioneered by the philosopher Donald Davidson, is another formalized theory, which aims to associate each natural language sentence with a meta-language description of the conditions under which it is true, for example: 'Snow is white' is true if and only if snow is white. The challenge is to arrive at the truth conditions for any sentence from fixed meanings assigned to the individual words and fixed rules for how to combine them. In practice, truth-conditional semantics is similar to model-theoretic semantics; conceptually, however, they differ in that truth-conditional semantics seeks to connect language with statements about the real world (in the form of meta-language statements), rather than with abstract models.
This theory is an effort to explain properties of argument structure. Its underlying assumption is that the syntactic properties of phrases reflect the meanings of the words that head them. With this theory, linguists can better deal with the fact that subtle differences in word meaning correlate with other differences in the syntactic structure in which the word appears. This is done by examining the internal structure of words; the small parts that make up this internal structure are termed semantic primitives.
Lexical semantics is a linguistic theory that investigates word meaning. It holds that the meaning of a word is fully reflected by its context: the meaning of a word is constituted by its contextual relations. Therefore, a distinction between degrees of participation and modes of participation is made. To accomplish this distinction, any part of a sentence that bears a meaning and combines with the meanings of other constituents is labeled a semantic constituent. Semantic constituents that cannot be broken down into more elementary constituents are labeled minimal semantic constituents.
Various fields and disciplines have long contributed to cross-cultural semantics. Are words like ‘love’, ‘truth’, and ‘hate’ universals (see Underhill 2012)? Is even the word ‘sense’ – so central to semantics – a universal, or a concept entrenched in a long-standing but culture-specific tradition (see Wierzbicka 2010)? These are the kinds of crucial questions discussed in cross-cultural semantics. Translation theory, ethnolinguistics, linguistic anthropology, and cultural linguistics specialize in comparing, contrasting, and translating words, terms, and meanings from one language to another (see Herder, W. von Humboldt, Boas, Sapir, and Whorf). But philosophy, sociology, and anthropology also have long-established traditions of contrasting the different nuances of the terms and concepts we use. And online encyclopaedias such as the Stanford Encyclopedia of Philosophy (https://plato.stanford.edu), and increasingly Wikipedia itself, have greatly facilitated the possibilities of comparing the background and usages of key cultural terms. In recent years the question of whether key terms are translatable or untranslatable has increasingly come to the fore of global discussions, especially since the publication of Barbara Cassin’s Dictionary of Untranslatables: A Philosophical Lexicon in 2014.
Cassin, Barbara. Dictionary of Untranslatables: A Philosophical Lexicon. Princeton University Press, 2014.
Sadow, Lauren, ed. In Conversation with Anna Wierzbicka. https://www.youtube.com/watch?v=jCw3dfmgP-0, 2014.
Underhill, James W. Ethnolinguistics and Cultural Concepts: Truth, Love, Hate and War. Cambridge University Press, 2012.
Wierzbicka, Anna. Experience, Evidence, and Sense: The Hidden Cultural Legacy of English. Oxford University Press, 2010.
Computational semantics is focused on the processing of linguistic meaning. To do this, concrete algorithms and architectures are described, and are analyzed in terms of decidability, time and space complexity, the data structures they require, and the communication protocols involved.
In computer science, the term semantics refers to the meaning of language constructs, as opposed to their form (syntax). According to Euzenat, semantics "provides the rules for interpreting the syntax which do not provide the meaning directly but constrains the possible interpretations of what is declared." In ontology engineering, the term semantics refers to the meaning of concepts, properties, and relationships that formally represent real-world entities, events, and scenes in a logical underpinning, such as a description logic, and typically implemented in the Web Ontology Language. The meaning of description logic concepts and roles is defined by their model-theoretic semantics, which are based on interpretations. The concepts, properties, and relationships defined in OWL ontologies can be deployed directly in the web site markup as RDFa, HTML5 Microdata, or JSON-LD, in graph databases as RDF triples or quads, and dereferenced in LOD datasets.
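A minimal sketch of the triple-based data model behind RDF (the ex: names are hypothetical, and a production system would use a library such as rdflib with a SPARQL engine rather than raw tuples):

```python
# RDF-style (subject, predicate, object) triples as plain tuples.
triples = {
    ("ex:Alice", "rdf:type", "ex:Person"),
    ("ex:Alice", "ex:knows", "ex:Bob"),
    ("ex:Bob",   "rdf:type", "ex:Person"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Who is a Person?  (analogous to SPARQL:  ?s rdf:type ex:Person)
people = sorted(ts for (ts, _, _) in match(p="rdf:type", o="ex:Person"))
print(people)  # ['ex:Alice', 'ex:Bob']
```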
For instance, the following statements use different syntaxes but cause the same instructions to be executed, namely, perform an arithmetical addition of 'y' to 'x' and store the result in a variable called 'x' (the assembly-language rows are illustrative sketches that assume x and y have already been loaded into the named registers):

|x := x + y|Ada, ALGOL, ALGOL 68, BCPL, Dylan, Eiffel, Modula-2, Oberon, OCaml, Object Pascal (Delphi), Pascal, SETL, Simula, Smalltalk, Standard ML, VHDL, etc.|
|ADD AX, BX|Assembly languages: Intel 8086 (x in AX, y in BX)|
|ADD R0, R0, R1|Assembly languages: ARM (x in R0, y in R1)|
|x = x + y|BASIC: most dialects; Fortran, MATLAB, Lua|
The Semantic Web refers to the extension of the World Wide Web by embedding additional semantic metadata, using semantic data modeling techniques such as the Resource Description Framework (RDF) and the Web Ontology Language (OWL). On the Semantic Web, terms such as semantic network and semantic data model are used to describe particular types of data model characterized by the use of directed graphs, in which the vertices denote concepts or entities in the world and their properties, and the arcs denote relationships between them. These can formally be described as description logic concepts and roles, which correspond to OWL classes and properties.
In psychology, semantic memory is memory for meaning – in other words, the aspect of memory that preserves only the gist, the general significance, of remembered experience – while episodic memory is memory for the ephemeral details – the individual features, or the unique particulars of experience. The term 'episodic memory' was introduced by Tulving and Schacter in the context of 'declarative memory', which involved simple association of factual or objective information concerning its object. Word meanings are measured by the company they keep, i.e., the relationships among words themselves in a semantic network. Memories may be transferred between generations or isolated in one generation due to a cultural disruption. Different generations may have different experiences at similar points in their own time-lines, which may then create a vertically heterogeneous semantic net for certain words in an otherwise homogeneous culture. In a network created by people analyzing their understanding of a word (such as WordNet), the links and decomposition structures of the network are few in number and kind, and include part of, kind of, and similar links. In automated ontologies, the links are computed vectors without explicit meaning. Various automated technologies are being developed to compute the meaning of words: latent semantic indexing and support vector machines, as well as natural language processing, artificial neural networks, and predicate calculus techniques.
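A toy illustration of such a network, using only kind of (hypernym) links and invented entries rather than actual WordNet data:

```python
# A tiny WordNet-like semantic network; edges point from a word to its
# immediate "kind of" parents.  Entries are invented for illustration.
from collections import deque

kind_of = {
    "poodle": ["dog"],
    "dog": ["canine"],
    "canine": ["mammal"],
    "mammal": ["animal"],
}

def hypernyms(word):
    """All ancestors reachable via 'kind of' links (breadth-first)."""
    seen, queue = set(), deque([word])
    while queue:
        for parent in kind_of.get(queue.popleft(), []):
            if parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return seen

print(sorted(hypernyms("poodle")))  # ['animal', 'canine', 'dog', 'mammal']
```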
Ideasthesia is a psychological phenomenon in which activation of concepts evokes sensory experiences. For example, in synesthesia, activation of a concept of a letter (e.g., that of the letter A) evokes sensory-like experiences (e.g., of red color).
A connotation is a commonly understood cultural or emotional association that some word or phrase carries, in addition to its explicit or literal meaning, which is its denotation.
A connotation is frequently described as either positive or negative, with regard to its pleasing or displeasing emotional connection. For example, a stubborn person may be described as being either strong-willed or pig-headed; although these have the same literal meaning (stubborn), strong-willed connotes admiration for the level of someone's will (a positive connotation), while pig-headed connotes frustration in dealing with someone (a negative connotation).

Denotational semantics
In computer science, denotational semantics (initially known as mathematical semantics or Scott–Strachey semantics) is an approach of formalizing the meanings of programming languages by constructing mathematical objects (called denotations) that describe the meanings of expressions from the languages. Other approaches provide formal semantics of programming languages including axiomatic semantics and operational semantics.
Broadly speaking, denotational semantics is concerned with finding mathematical objects called domains that represent what programs do. For example, programs (or program phrases) might be represented by partial functions or by games between the environment and the system.
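As a minimal sketch of the idea (ignoring the domain theory needed for loops and recursion), each phrase of a toy expression language can denote a function from variable environments to numbers, built from the denotations of its subphrases:

```python
# Denotations of a toy expression language: each constructor returns a
# function from an environment (variable name -> value) to a number.

def num(n):        return lambda env: n                    # literals
def var(name):     return lambda env: env[name]            # variables
def add(d1, d2):   return lambda env: d1(env) + d2(env)    # e1 + e2
def mul(d1, d2):   return lambda env: d1(env) * d2(env)    # e1 * e2

# Denotation of the phrase  (x + 2) * y , built compositionally:
phrase = mul(add(var("x"), num(2)), var("y"))
print(phrase({"x": 3, "y": 4}))  # 20
```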
An important tenet of denotational semantics is that semantics should be compositional: the denotation of a program phrase should be built out of the denotations of its subphrases.

Discourse
Discourse (from Latin discursus, "running to and fro") denotes written and spoken communications:
In semantics and discourse analysis: Discourse is a conceptual generalization of conversation within each modality and context of communication.
The totality of codified language (vocabulary) used in a given field of intellectual enquiry and of social practice, such as legal discourse, medical discourse, religious discourse, et cetera.
In the work of Michel Foucault, and that of the social theoreticians he inspired, discourse describes "an entity of sequences, of signs, in that they are enouncements (énoncés)", that is, statements in conversation. As discourse, an "enouncement" (statement) is not a unit of semiotic signs, but an abstract construct that allows the semiotic signs to assign meaning, and so communicate specific, repeatable communications to, between, and among objects, subjects, and statements. Therefore, a discourse is composed of semiotic sequences (relations among signs that communicate meaning) between and among objects, subjects, and statements.
The term "discursive formation" (French: formation discursive) conceptually describes the regular communications (written and spoken) that produce such discourses, such as informal conversations. As a philosopher, Michel Foucault applied the concept of the discursive formation in his analyses of large bodies of knowledge, such as political economy and natural history.

In the first sense-usage (semantics and discourse analysis), the term discourse is studied in corpus linguistics, the study of language expressed in corpora (samples) of "real world" text. In the second sense (the codified language of a field of enquiry) and in the third sense (a statement, un énoncé), the analysis of a discourse examines and determines the connections among language, structure, and agency.
Moreover, because a discourse is a body of text meant to communicate specific data, information, and knowledge, there exist internal relations in the content of a given discourse, as well as external relations among discourses. As such, a discourse does not exist per se (in itself), but is related to other discourses by way of inter-discursivity, and discourses continually differentiate from one another over time. Therefore, in the course of intellectual enquiry, the discourse among researchers features the questions and answers of What is ...? and What is not ...?, conducted according to the meanings (denotation and connotation) of the concepts (statements) used in the given field of enquiry, such as anthropology, ethnography, and sociology; cultural studies and literary theory; the philosophy of science and feminism.

Document Style Semantics and Specification Language
The Document Style Semantics and Specification Language (DSSSL) is an international standard developed to provide stylesheets for SGML documents.

DSSSL consists of two parts: a tree transformation process that can be used to manipulate the tree structure of documents prior to presentation, and a formatting process that associates the elements in the source document with specific nodes in the target representation, the flow object tree. DSSSL specifications are device-independent pieces of information that can be interchanged between different platforms. DSSSL does not standardize the back-end formatters that generate the language's output. Such formatters may render the output for on-screen display, or write it to a computer file in a specific format (such as PostScript or Rich Text Format).

Based on a subset of the Scheme programming language, DSSSL is specified by the standard ISO/IEC 10179:1996. It was developed by ISO/IEC JTC1/SC34 (ISO/IEC Joint Technical Committee 1, Subcommittee 34 - Document description and processing languages).

SGML stores information in a machine-readable but not very human-readable format. A "stylesheet" is used to present the information stored in SGML in a more pleasing or accessible way. DSSSL can convert to a wide range of formats, including RTF, HTML, and LaTeX.
DSSSL is compatible with any SGML-based document type, but it has been used most often with DocBook. In 1997, software engineer Geir Ove Grønmo published a syntax highlighting language definition for KEDIT.

With the appearance of XML as an alternative to SGML, XML's associated stylesheet language XSL was widely and rapidly adopted from around 1999. Although DSSSL continued in use within the shrinking SGML field, XSL was soon in use more widely, and by more coders, than DSSSL had ever been. This shift was emphasised when former SGML strongholds such as DocBook converted from SGML to XML, and also converted their favoured stylesheet language from DSSSL to XSL.
Sometime in or before 1994, Opera Software began drafting a "DSSSL Lite" specification for the consideration of the World Wide Web Consortium, since DSSSL was thought to be too complex for the World Wide Web.

Exonym and endonym
An exonym or xenonym is an external name for a geographical place, a group of people, an individual person, or a language or dialect. It is a common name used only outside the place, group, or linguistic community in question. An endonym or autonym is an internal name for a geographical place, a group of people, or a language or dialect. It is a common name used only inside the place, group, or linguistic community in question; it is their name for themselves, their homeland, or their language.
For instance, Germany is the English language exonym, Allemagne is the French language exonym, and Deutschland is the endonym for the same country in Europe.
Marcel Aurousseau, an Australian geographer, first used the term exonym in his work The Rendering of Geographical Names (1957). The term endonym was devised subsequently as an antonym for the term exonym.

Formal semantics (linguistics)
In linguistics, formal semantics seeks to understand linguistic meaning by constructing precise mathematical models of the principles that speakers use to define relations between expressions in a natural language and the world that supports meaningful discourse. The mathematical tools used are the confluence of formal logic and formal language theory, especially typed lambda calculi.

General semantics
General semantics is a self-improvement and therapy program begun in the 1920s that seeks to regulate human mental habits and behaviors. After partial launches under the names human engineering and humanology, Polish-American originator Alfred Korzybski (1879–1950) fully launched the program as general semantics in 1933 with the publication of Science and Sanity: An Introduction to Non-Aristotelian Systems and General Semantics.
In Science and Sanity, general semantics is presented as both a theoretical and a practical system whose adoption can reliably alter human behavior in the direction of greater sanity. In the 1947 preface to the third edition of Science and Sanity, Korzybski wrote: "We need not blind ourselves with the old dogma that 'human nature cannot be changed', for we find that it can be changed." However, in the opinion of a majority of psychiatrists, the tenets and practices of general semantics are not an effective way of treating patients with psychological or mental illnesses. While Korzybski considered his program to be empirically based and to strictly follow the scientific method, general semantics has been described as veering into the domain of pseudoscience.

Starting around 1940, university English professor S. I. Hayakawa (1906–1992), speech professor Wendell Johnson, speech professor Irving J. Lee, and others assembled elements of general semantics into a package suitable for incorporation into mainstream communications curricula. The Institute of General Semantics, which Korzybski and co-workers founded in 1938, continues today. General semantics as a movement has waned considerably since the 1950s, although many of its ideas live on in other movements, such as neuro-linguistic programming and rational emotive behavior therapy.

Meaning (linguistics)
In linguistics, meaning is the information or concepts that a sender intends to convey, or does convey, in communication with a receiver.

Opposite (semantics)
In lexical semantics, opposites are words lying in an inherently incompatible binary relationship, like the opposite pairs big : small, long : short, and precede : follow. The notion of incompatibility here refers to the fact that one word in an opposite pair entails that it is not the other pair member. For example, something that is long entails that it is not short. It is referred to as a 'binary' relationship because there are two members in a set of opposites. The relationship between opposites is known as opposition. A member of a pair of opposites can generally be determined by the question What is the opposite of X ?
The term antonym (and the related antonymy) is commonly taken to be synonymous with opposite, but antonym also has other, more restricted meanings. Graded (or gradable) antonyms are word pairs whose meanings are opposite and which lie on a continuous spectrum (hot, cold). Complementary antonyms are word pairs whose meanings are opposite but do not lie on a continuous spectrum (push, pull). Relational antonyms are word pairs where the opposition makes sense only in the context of the relationship between the two meanings (teacher, pupil). These more restricted meanings may not apply in all scholarly contexts, with Lyons (1968, 1977) defining antonym to mean gradable antonyms, and Crystal (2003) warning that antonymy and antonym should be regarded with care.

Pragmatics
Pragmatics is a subfield of linguistics and semiotics that studies the ways in which context contributes to meaning. Pragmatics encompasses speech act theory, conversational implicature, talk in interaction and other approaches to language behavior in philosophy, sociology, linguistics and anthropology. Unlike semantics, which examines meaning that is conventional or "coded" in a given language, pragmatics studies how the transmission of meaning depends not only on structural and linguistic knowledge (e.g., grammar, lexicon, etc.) of the speaker and listener, but also on the context of the utterance, any pre-existing knowledge about those involved, the inferred intent of the speaker, and other factors. In this respect, pragmatics explains how language users are able to overcome apparent ambiguity, since meaning relies on the manner, place, time, etc. of an utterance. The ability to understand another speaker's intended meaning is called pragmatic competence.

Programming language
A programming language is a formal language, which comprises a set of instructions that produce various kinds of output. Programming languages are used in computer programming to implement algorithms.
Most programming languages consist of instructions for computers. There are programmable machines that use a set of specific instructions, rather than general programming languages. Early ones preceded the invention of the digital computer, the first probably being the automatic flute player described in the 9th century by the brothers Musa in Baghdad, during the Islamic Golden Age. Since the early 1800s, programs have been used to direct the behavior of machines such as Jacquard looms, music boxes and player pianos. The programs for these machines (such as a player piano's scrolls) did not produce different behavior in response to different inputs or conditions.
Thousands of different programming languages have been created, and more are being created every year. Many programming languages are written in an imperative form (i.e., as a sequence of operations to perform) while other languages use the declarative form (i.e. the desired result is specified, not how to achieve it).
The description of a programming language is usually split into the two components of syntax (form) and semantics (meaning). Some languages are defined by a specification document (for example, the C programming language is specified by an ISO standard) while other languages (such as Perl) have a dominant implementation that is treated as a reference. Some languages have both, with the basic language defined by a standard and extensions taken from the dominant implementation being common.

Programming language theory
Programming language theory (PLT) is a branch of computer science that deals with the design, implementation, analysis, characterization, and classification of programming languages and their individual features. It falls within the discipline of computer science, both depending on and affecting mathematics, software engineering, linguistics and even cognitive science. It is a well-recognized branch of computer science, and an active research area, with results published in numerous journals dedicated to PLT, as well as in general computer science and engineering publications.

Reference
Reference is a relation between objects in which one object designates, or acts as a means by which to connect to or link to, another object. The first object in this relation is said to refer to the second object. It is called a name for the second object. The second object, the one to which the first object refers, is called the referent of the first object. A name is usually a phrase or expression, or some other symbolic representation. Its referent may be anything – a material object, a person, an event, an activity, or an abstract concept.
References can take many forms, including: a thought; a sensory perception that is auditory (onomatopoeia), visual (text), olfactory, or tactile; an emotional state; a relationship with another object; a spacetime coordinate; a symbolic or alphanumeric identifier; a physical object; or an energy projection. In some cases, methods are used that intentionally hide the reference from some observers, as in cryptography.
References feature in many spheres of human activity and knowledge, and the term adopts shades of meaning particular to the contexts in which it is used. Some of them are described in the sections below.
Second-order logic
In logic and mathematics second-order logic is an extension of first-order logic, which itself is an extension of propositional logic. Second-order logic is in turn extended by higher-order logic and type theory.
First-order logic quantifies only variables that range over individuals (elements of the domain of discourse); second-order logic, in addition, also quantifies over relations. For example, the second-order sentence ∀P ∀x (Px ∨ ¬Px) says that for every unary relation (or set) P of individuals, and every individual x, either x is in P or it is not (this is the principle of bivalence). Second-order logic also includes quantification over sets, functions, and other variables, as explained in the section Syntax and fragments. Both first-order and second-order logic use the idea of a domain of discourse (often called simply the "domain" or the "universe"). The domain is a set over which individual elements may be quantified.
Semantics (computer science)
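The contrast between the two quantification regimes can be made explicit. The following is a sketch: the first sentence quantifies only over individuals, the second also quantifies over a unary relation P (the bivalence sentence above), and the third is the classic second-order induction axiom, which quantifies over all properties of natural numbers.

```latex
% First-order: quantification only over individuals x of the domain.
\forall x\, (x = x)

% Second-order: additional quantification over a unary relation P.
\forall P\, \forall x\, (Px \lor \lnot Px)

% Second-order induction axiom: quantifies over every property P.
\forall P\, \bigl( P0 \land \forall n\, (Pn \rightarrow P(n+1))
  \rightarrow \forall n\, Pn \bigr)
```

No first-order sentence can express the induction axiom in this form, which is why induction over arbitrary properties is a standard example of genuinely second-order expressive power.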
In programming language theory, semantics is the field concerned with the rigorous mathematical study of the meaning of programming languages. It does so by evaluating the meaning of syntactically valid strings defined by a specific programming language, showing the computation involved; syntactically invalid strings have no meaning to evaluate and thus denote no computation. Semantics describes the processes a computer follows when executing a program in that language. This can be shown by describing the relationship between the input and output of a program, or by explaining how the program will execute on a certain platform, thereby creating a model of computation.
Formal semantics, for instance, helps to write compilers, to better understand what a program is doing, and to prove, for example, that the following if statement
if 1 == 1 then S1 else S2
has the same effect as S1 alone.
Semantics of logic
In logic, the semantics of logic is the study of the semantics, or interpretations, of formal and (idealizations of) natural languages, usually with the aim of capturing the pre-theoretic notion of entailment.
Sobriquet
A sobriquet (SOH-bri-kay) or soubriquet is a nickname, sometimes assumed, but often given by another, and descriptive in nature. Distinct from a pseudonym, it is typically a familiar name used in place of a real name without the need of explanation, often becoming more familiar than the original name.
The term sobriquet may apply to the nickname of a specific person, group of people, or place. Examples are Emiye Menelik, a name of Emperor Menelik II of Ethiopia, who was popularly and affectionately recognized for his kindness ('emiye' means mother in Amharic); Genghis Khan, who is now rarely recognized by his original name, Temüjin; and Mohandas Gandhi, who is better known as Mahatma Gandhi. Well-known places often have sobriquets, such as New York City, often referred to as the Big Apple.
Synonym
A synonym is a word or phrase that means exactly or nearly the same as another lexeme (word or phrase) in the same language. Words that are synonyms are said to be synonymous, and the state of being a synonym is called synonymy. For example, the words begin, start, commence, and initiate are all synonyms of one another. Words are typically synonymous in one particular sense: for example, long and extended in the context long time or extended time are synonymous, but long cannot be used in the phrase extended family. Synonyms with exactly the same meaning share a seme or denotational sememe, whereas those with inexactly similar meanings share a broader denotational or connotational sememe and thus overlap within a semantic field. The former are sometimes called cognitive synonyms and the latter, near-synonyms, plesionyms or poecilonyms.
Thesaurus
In general usage, a thesaurus is a reference work that lists words grouped together according to similarity of meaning (containing synonyms and sometimes antonyms), in contrast to a dictionary, which provides definitions for words and generally lists them in alphabetical order. The main purpose of such reference works is for users "to find the word, or words, by which [an] idea may be most fitly and aptly expressed," in the words of Peter Mark Roget, author of Roget's Thesaurus. Although it includes synonyms, a thesaurus should not be taken as a complete list of all the synonyms for a particular word. The entries are also designed to draw distinctions between similar words and to assist in choosing exactly the right word. Unlike a dictionary, a thesaurus entry does not give the definition of words.
In library science and information science, thesauri have been widely used to specify domain models. Recently, thesauri have been implemented with Simple Knowledge Organization System (SKOS).
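A SKOS-style concept scheme of the kind mentioned above can be sketched with plain data structures. The following Python example is an illustration only: the field names mirror SKOS vocabulary terms (prefLabel, altLabel, broader, related), but this is hand-rolled dictionary code, not a real RDF or SKOS library.

```python
# A minimal thesaurus-as-domain-model sketch. Each concept records a
# preferred label, alternative (near-synonymous) labels, a broader
# concept, and related concepts, echoing the SKOS vocabulary.
thesaurus = {
    "begin": {
        "prefLabel": "begin",
        "altLabel": ["start", "commence", "initiate"],  # near-synonyms
        "broader": "change of state",
        "related": ["end"],
    },
    "end": {
        "prefLabel": "end",
        "altLabel": ["finish", "conclude"],
        "broader": "change of state",
        "related": ["begin"],
    },
}

def synonyms(term):
    """Return the alternative labels recorded for a preferred term."""
    entry = thesaurus.get(term)
    return entry["altLabel"] if entry else []

assert "start" in synonyms("begin")
```

Unlike an alphabetical dictionary, the structure groups terms by meaning and records the distinctions (broader, related) between them, which is what makes such models useful for retrieval and domain modelling.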