Stochastic

The word stochastic is an adjective in English that describes something that was randomly determined.[1] The word first appeared in English to describe a mathematical object called a stochastic process, but now in mathematics the terms stochastic process and random process are considered interchangeable.[2][3][4][5][6] The word, with its current definition meaning random, came from German, but it originally came from Greek στόχος (stókhos), meaning 'aim, guess'.[1]

The term stochastic is used in many different fields, particularly where stochastic or random processes are used to represent systems or phenomena that seem to change in a random way. Examples of such fields include the physical sciences such as biology,[7] chemistry,[8] ecology,[9] neuroscience,[10] and physics[11] as well as technology and engineering fields such as image processing, signal processing,[12] information theory,[13] computer science,[14] cryptography[15] and telecommunications.[16] It is also used in finance, due to seemingly random changes in financial markets.[17][18][19]

Etymology

The word stochastic in English was originally used as an adjective with the definition "pertaining to conjecturing", and stemming from a Greek word meaning "to aim at a mark, guess", and the Oxford English Dictionary gives the year 1662 as its earliest occurrence.[1] In his work on probability Ars Conjectandi, originally published in Latin in 1713, Jakob Bernoulli used the phrase "Ars Conjectandi sive Stochastice", which has been translated to "the art of conjecturing or stochastics".[20] This phrase was used, with reference to Bernoulli, by Ladislaus Bortkiewicz[21] who in 1917 wrote in German the word stochastik with a sense meaning random. The term stochastic process first appeared in English in a 1934 paper by Joseph Doob.[1] For the term and a specific mathematical definition, Doob cited another 1934 paper, where the term stochastischer Prozeß was used in German by Aleksandr Khinchin,[22][23] though the German term had been used earlier in 1931 by Andrey Kolmogorov.[24]

Mathematics

In the early 1930s, Aleksandr Khinchin gave the first mathematical definition of a stochastic process as a set of random variables indexed by the real line.[25][22][a] Further fundamental work on probability theory and stochastic processes was done by Khinchin as well as other mathematicians such as Andrey Kolmogorov, Joseph Doob, William Feller, Maurice Fréchet, Paul Lévy, Wolfgang Doeblin, and Harald Cramér.[27][28] Decades later Cramér referred to the 1930s as the "heroic period of mathematical probability theory".[28]

In mathematics, specifically probability theory, the theory of stochastic processes is considered to be an important contribution to mathematics[29] and it continues to be an active topic of research for both theoretical reasons and applications.[30][31][32]

The word stochastic is used to describe other terms and objects in mathematics. Examples include a stochastic matrix, which describes a stochastic process known as a Markov process, and stochastic calculus, which involves differential equations and integrals based on stochastic processes such as the Wiener process, also called the Brownian motion process.
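
A stochastic matrix can be illustrated with a short sketch; the two-state chain and its transition probabilities below are invented for illustration. Each row sums to 1, and repeatedly applying the matrix carries an initial distribution toward the chain's stationary distribution.

```python
import numpy as np

# A (right) stochastic matrix: each row sums to 1 and gives the
# transition probabilities of a two-state Markov process.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

assert np.allclose(P.sum(axis=1), 1.0)  # the defining property

# Evolving an initial distribution: after n steps it is pi0 @ P^n.
pi0 = np.array([1.0, 0.0])
pi = pi0 @ np.linalg.matrix_power(P, 50)  # close to the stationary distribution
```

For this matrix the stationary distribution solves pi = pi @ P, giving (5/6, 1/6); fifty steps is more than enough for convergence since the second eigenvalue is 0.4.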

Artificial intelligence

In artificial intelligence, stochastic programs work by using probabilistic methods to solve problems, as in simulated annealing, stochastic neural networks, stochastic optimization, genetic algorithms, and genetic programming. A problem itself may be stochastic as well, as in planning under uncertainty.

Natural science

One of the simplest continuous-time stochastic processes is Brownian motion. This was first observed by botanist Robert Brown while looking through a microscope at pollen grains in water.
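
Brownian motion can be approximated numerically by summing independent Gaussian increments; a minimal sketch (the step count, horizon, and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 1000, 1.0
dt = T / n

# Independent Gaussian increments with variance dt, summed into a path.
increments = rng.normal(0.0, np.sqrt(dt), size=n)
W = np.concatenate(([0.0], np.cumsum(increments)))  # W[0] = 0 by convention
```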

Physics

The name "Monte Carlo" for the stochastic Monte Carlo method was popularized by physics researchers Stanisław Ulam, Enrico Fermi, John von Neumann, and Nicholas Metropolis, among others. The name is a reference to the Monte Carlo Casino in Monaco, where Ulam's uncle would borrow money to gamble.[33] The use of randomness and the repetitive nature of the process are analogous to the activities conducted at a casino. Earlier methods of simulation and statistical sampling generally worked in the opposite direction: simulation was used to test a previously understood deterministic problem. Though historical examples of an "inverted" approach do exist, they were not considered a general method until the popularity of the Monte Carlo method spread.

Perhaps the most famous early use was by Enrico Fermi in the 1930s, when he used a random method to calculate the properties of the newly discovered neutron. Monte Carlo methods were central to the simulations required for the Manhattan Project, though they were severely limited by the computational tools of the time. It was therefore only after electronic computers were first built (from 1945 on) that Monte Carlo methods began to be studied in depth. In the 1950s they were used at Los Alamos for early work relating to the development of the hydrogen bomb, and they became popularized in the fields of physics, physical chemistry, and operations research. The RAND Corporation and the U.S. Air Force were two of the major organizations responsible for funding and disseminating information on Monte Carlo methods during this time, and the methods began to find wide application in many different fields.

Uses of Monte Carlo methods require large amounts of random numbers, and it was their use that spurred the development of pseudorandom number generators, which were far quicker to use than the tables of random numbers which had been previously used for statistical sampling.
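
The classic textbook illustration of both points, heavy consumption of random numbers and reliance on a pseudorandom generator, is estimating π by sampling random points in the unit square (Python's built-in `random` module serves as the generator here):

```python
import random

random.seed(42)  # a pseudorandom generator: deterministic but random-looking
n = 100_000

# Count points of the unit square that fall inside the quarter circle.
inside = sum(1 for _ in range(n)
             if random.random() ** 2 + random.random() ** 2 <= 1.0)
pi_estimate = 4 * inside / n  # area ratio times 4 approximates pi
```

With 100,000 samples the standard error is roughly 0.005, so the estimate lands near 3.14; tables of random numbers would be hopeless at this scale.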

Biology

Stochastic resonance: In biological systems, introducing stochastic "noise" has been found to help improve the signal strength of the internal feedback loops for balance and other vestibular communication.[34] It has been found to help diabetic and stroke patients with balance control.[35] Many biochemical events also lend themselves to stochastic analysis. Gene expression, for example, has a stochastic component through the molecular collisions—as during binding and unbinding of RNA polymerase to a gene promoter—via the solution's Brownian motion.

Medicine

The stochastic effect, or "chance effect", is one classification of radiation effects, referring to the random, statistical nature of the damage. In contrast to the deterministic effect, severity is independent of dose; only the probability of an effect increases with dose.

Geomorphology

The formation of river meanders has been analyzed as a stochastic process.

Creativity

Simonton (2003, Psychological Bulletin) argues that scientific creativity is a constrained stochastic behaviour, such that new theories in all sciences are, at least in part, the product of a stochastic process.

Computer science

Stochastic ray tracing is the application of Monte Carlo simulation to the computer graphics ray tracing algorithm. "Distributed ray tracing samples the integrand at many randomly chosen points and averages the results to obtain a better approximation. It is essentially an application of the Monte Carlo method to 3D computer graphics, and for this reason is also called Stochastic ray tracing."
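
The quoted idea, sampling an integrand at many random points and averaging, can be sketched in one dimension; the test function below is an arbitrary choice for illustration, not part of any ray tracer:

```python
import random

random.seed(1)

def monte_carlo_integral(f, a, b, n=50_000):
    """Average f at uniformly random points in [a, b], scaled by (b - a)."""
    total = sum(f(random.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

# The integral of x^2 over [0, 1] is exactly 1/3; the Monte Carlo
# estimate converges at rate O(1/sqrt(n)) regardless of dimension.
estimate = monte_carlo_integral(lambda x: x * x, 0.0, 1.0)
```

A ray tracer does the same averaging over randomly chosen rays per pixel; the dimension-independent convergence rate is what makes the method practical there.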

Stochastic forensics analyzes computer crime by viewing computers as stochastic processes.

Music

In music, mathematical processes based on probability can generate stochastic elements.

Stochastic processes may be used in music to compose a fixed piece or may be produced in performance. Stochastic music was pioneered by Iannis Xenakis, who coined the term. Specific examples of mathematics, statistics, and physics applied to music composition are the use of the statistical mechanics of gases in Pithoprakta, statistical distribution of points on a plane in Diamorphoses, minimal constraints in Achorripsis, the normal distribution in ST/10 and Atrées, Markov chains in Analogiques, game theory in Duel and Stratégie, group theory in Nomos Alpha (for Siegfried Palm), set theory in Herma and Eonta,[36] and Brownian motion in N'Shima. Xenakis frequently used computers to produce his scores, such as the ST series including Morsima-Amorsima and Atrées, and founded CEMAMu.

Earlier, John Cage and others had composed aleatoric or indeterminate music, which is created by chance processes but lacks a strict mathematical basis (Cage's Music of Changes, for example, uses a system of charts based on the I Ching). Lejaren Hiller and Leonard Isaacson used generative grammars and Markov chains in their 1957 Illiac Suite. Modern electronic music production techniques make these processes relatively simple to implement, and many hardware devices such as synthesizers and drum machines incorporate randomization features. Generative music techniques are therefore readily accessible to composers, performers, and producers.
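
A toy sketch of the Markov-chain approach mentioned above (as in Analogiques): each note is drawn from a probability distribution conditioned on the previous note. The pitch set and transition probabilities here are invented for illustration, not drawn from any Xenakis score.

```python
import random

random.seed(0)

# First-order Markov chain over pitch names: for each current note,
# a list of (next note, probability) pairs. Probabilities are invented.
transitions = {
    "C": [("D", 0.5), ("E", 0.3), ("G", 0.2)],
    "D": [("C", 0.4), ("E", 0.6)],
    "E": [("C", 0.3), ("D", 0.3), ("G", 0.4)],
    "G": [("C", 0.7), ("E", 0.3)],
}

def next_note(note):
    notes, weights = zip(*transitions[note])
    return random.choices(notes, weights=weights)[0]

melody = ["C"]
for _ in range(15):
    melody.append(next_note(melody[-1]))
```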

Subtractive color reproduction

When color reproductions are made, the image is separated into its component colors by taking multiple photographs filtered for each color. One resultant film or plate represents each of the cyan, magenta, yellow, and black data. Color printing is a binary system, in which ink is either present or not, so all color separations to be printed must be translated into dots at some stage of the workflow. Traditional line screens, which are amplitude modulated, had problems with moiré but were used until stochastic screening became available. A stochastic (or frequency-modulated) dot pattern creates a sharper image.
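
A minimal sketch of the frequency-modulated idea: each pixel's tone is compared against a random threshold, so ink coverage is carried by dot frequency rather than dot size. Production stochastic screens use carefully shaped (e.g. blue) noise rather than the plain white noise used here.

```python
import random

random.seed(3)

def stochastic_screen(tones):
    """Binarize a grid of tones (0..1 ink coverage) with random thresholds."""
    return [[1 if random.random() < t else 0 for t in row] for row in tones]

flat_tone = [[0.3] * 100 for _ in range(100)]    # a uniform 30% patch
dots = stochastic_screen(flat_tone)
coverage = sum(map(sum, dots)) / 10_000          # fraction of pixels inked
```

On a uniform 30% patch, about 30% of the pixels end up inked, with the dots scattered irregularly, which is what avoids the periodic interference (moiré) of amplitude-modulated screens.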

Language and linguistics

Non-deterministic approaches in language studies are largely inspired by the work of Ferdinand de Saussure, for example, in functionalist linguistic theory, which argues that competence is based on performance.[37][38] This distinction in functional theories of grammar should be carefully distinguished from the langue and parole distinction. To the extent that linguistic knowledge is constituted by experience with language, grammar is argued to be probabilistic and variable rather than fixed and absolute. This conception of grammar as probabilistic and variable follows from the idea that one's competence changes in accordance with one's experience with language. Though this conception has been contested,[39] it has also provided the foundation for modern statistical natural language processing[40] and for theories of language learning and change.[41]

Social sciences

Stochastic social science theory is similar to systems theory in that events are interactions of systems, although with a marked emphasis on unconscious processes. The event creates its own conditions of possibility, rendering it unpredictable if only for the number of variables involved. Stochastic social science theory can be seen as an elaboration of a kind of 'third axis' in which to situate human behavior alongside the traditional 'nature vs. nurture' opposition. See Julia Kristeva on her usage of the 'semiotic', Luce Irigaray on reverse Heideggerian epistemology, and Pierre Bourdieu on polythetic space for examples of stochastic social science theory.

Business

Manufacturing

Manufacturing processes are assumed to be stochastic processes. This assumption is largely valid for both continuous and batch manufacturing processes. Testing and monitoring of the process are recorded using a process control chart, which plots a given process control parameter over time. Typically a dozen or many more parameters are tracked simultaneously. Statistical models are used to define limit lines that determine when corrective actions must be taken to bring the process back to its intended operational window.
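
A minimal sketch of the limit-line idea, using hypothetical measurements of one tracked parameter and Shewhart-style three-sigma control limits:

```python
import statistics

# Hypothetical in-control measurements of a single process parameter.
samples = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.0, 10.2]

mean = statistics.mean(samples)
sigma = statistics.stdev(samples)
ucl = mean + 3 * sigma   # upper control limit
lcl = mean - 3 * sigma   # lower control limit

# Points outside the limit lines trigger corrective action.
out_of_control = [x for x in samples if not (lcl <= x <= ucl)]
```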

This same approach is used in the service industry where parameters are replaced by processes related to service level agreements.

Finance

The financial markets use stochastic models to represent the seemingly random behaviour of assets such as stocks, commodities, relative currency prices (i.e., the price of one currency compared to that of another, such as the price of the US Dollar compared to that of the Euro), and interest rates. These models are then used by quantitative analysts to value options on stock prices, bond prices, and interest rates (see Markov models). Moreover, stochastic modelling is at the heart of the insurance industry.
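
A common concrete instance of such a model is geometric Brownian motion, the price dynamic underlying Black–Scholes-style option valuation; the sketch below simulates one price path under illustrative parameters (drift, volatility, and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(7)

def gbm_path(s0, mu, sigma, T=1.0, n=252):
    """One geometric Brownian motion path: dS = mu*S dt + sigma*S dW."""
    dt = T / n
    z = rng.normal(size=n)
    # Exact log-return increments of GBM over each step of length dt.
    log_returns = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
    return s0 * np.exp(np.concatenate(([0.0], np.cumsum(log_returns))))

path = gbm_path(s0=100.0, mu=0.05, sigma=0.2)  # illustrative parameters
```

Averaging the discounted payoff of an option over many such simulated paths is the basic Monte Carlo approach to valuation.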

Media

Audience tastes and preferences, their changing movement, and the marketing and scientific appeal of certain film and television debuts (i.e., their opening weekends, word-of-mouth, top-of-mind knowledge among surveyed groups, star name recognition, and other elements of social media outreach and advertising) are determined in part by stochastic modeling. A recent attempt at repeat-business analysis was made by Japanese scholars and is part of the Cinematic Contagion Systems patented by Geneva Media Holdings; such modeling has been used in data collection from the time of the original Nielsen ratings to modern studio and television test audiences.

Notes

  1. ^ Doob, when citing Khinchin, uses the term 'chance variable', which used to be an alternative term for 'random variable'.[26]

References

  1. ^ a b c d "Stochastic". Oxford Dictionaries. Oxford University Press.
  2. ^ Robert J. Adler; Jonathan E. Taylor (29 January 2009). Random Fields and Geometry. Springer Science & Business Media. pp. 7–8. ISBN 978-0-387-48116-6.
  3. ^ David Stirzaker (2005). Stochastic Processes and Models. Oxford University Press. p. 45. ISBN 978-0-19-856814-8.
  4. ^ Loïc Chaumont; Marc Yor (19 July 2012). Exercises in Probability: A Guided Tour from Measure Theory to Random Processes, Via Conditioning. Cambridge University Press. p. 175. ISBN 978-1-107-60655-5.
  5. ^ Murray Rosenblatt (1962). Random Processes. Oxford University Press. p. 91.
  6. ^ Olav Kallenberg (8 January 2002). Foundations of Modern Probability. Springer Science & Business Media. pp. 24 and 25. ISBN 978-0-387-95313-7.
  7. ^ Paul C. Bressloff (22 August 2014). Stochastic Processes in Cell Biology. Springer. ISBN 978-3-319-08488-6.
  8. ^ N.G. Van Kampen (30 August 2011). Stochastic Processes in Physics and Chemistry. Elsevier. ISBN 978-0-08-047536-3.
  9. ^ Russell Lande; Steinar Engen; Bernt-Erik Sæther (2003). Stochastic Population Dynamics in Ecology and Conservation. Oxford University Press. ISBN 978-0-19-852525-7.
  10. ^ Carlo Laing; Gabriel J Lord (2010). Stochastic Methods in Neuroscience. OUP Oxford. ISBN 978-0-19-923507-0.
  11. ^ Wolfgang Paul; Jörg Baschnagel (11 July 2013). Stochastic Processes: From Physics to Finance. Springer Science & Business Media. ISBN 978-3-319-00327-6.
  12. ^ Edward R. Dougherty (1999). Random processes for image and signal processing. SPIE Optical Engineering Press. ISBN 978-0-8194-2513-3.
  13. ^ Thomas M. Cover; Joy A. Thomas (28 November 2012). Elements of Information Theory. John Wiley & Sons. p. 71. ISBN 978-1-118-58577-1.
  14. ^ Michael Baron (15 September 2015). Probability and Statistics for Computer Scientists, Second Edition. CRC Press. p. 131. ISBN 978-1-4987-6060-7.
  15. ^ Jonathan Katz; Yehuda Lindell (2007-08-31). Introduction to Modern Cryptography: Principles and Protocols. CRC Press. p. 26. ISBN 978-1-58488-586-3.
  16. ^ François Baccelli; Bartlomiej Blaszczyszyn (2009). Stochastic Geometry and Wireless Networks. Now Publishers Inc. pp. 200–. ISBN 978-1-60198-264-3.
  17. ^ J. Michael Steele (2001). Stochastic Calculus and Financial Applications. Springer Science & Business Media. ISBN 978-0-387-95016-7.
  18. ^ Marek Musiela; Marek Rutkowski (21 January 2006). Martingale Methods in Financial Modelling. Springer Science & Business Media. ISBN 978-3-540-26653-2.
  19. ^ Steven E. Shreve (3 June 2004). Stochastic Calculus for Finance II: Continuous-Time Models. Springer Science & Business Media. ISBN 978-0-387-40101-0.
  20. ^ O. B. Sheĭnin (2006). Theory of probability and statistics as exemplified in short dictums. NG Verlag. p. 5. ISBN 978-3-938417-40-9.
  21. ^ Oscar Sheynin; Heinrich Strecker (2011). Alexandr A. Chuprov: Life, Work, Correspondence. V&R unipress GmbH. p. 136. ISBN 978-3-89971-812-6.
  22. ^ a b Doob, Joseph (1934). "Stochastic Processes and Statistics". Proceedings of the National Academy of Sciences of the United States of America. 20 (6): 376–379. doi:10.1073/pnas.20.6.376. PMC 1076423.
  23. ^ Khintchine, A. (1934). "Korrelationstheorie der stationären stochastischen Prozesse". Mathematische Annalen. 109 (1): 604–615. doi:10.1007/BF01449156. ISSN 0025-5831.
  24. ^ Kolmogoroff, A. (1931). "Über die analytischen Methoden in der Wahrscheinlichkeitsrechnung". Mathematische Annalen. 104 (1): 1. doi:10.1007/BF01457949. ISSN 0025-5831.
  25. ^ Vere-Jones, David (2006). "Khinchin, Aleksandr Yakovlevich": 4. doi:10.1002/0471667196.ess6027.pub2.
  26. ^ Snell, J. Laurie (2005). "Obituary: Joseph Leonard Doob". Journal of Applied Probability. 42 (1): 251. doi:10.1239/jap/1110381384. ISSN 0021-9002.
  27. ^ Bingham, N. (2000). "Studies in the history of probability and statistics XLVI. Measure into probability: from Lebesgue to Kolmogorov". Biometrika. 87 (1): 145–156. doi:10.1093/biomet/87.1.145. ISSN 0006-3444.
  28. ^ a b Cramer, Harald (1976). "Half a Century with Probability Theory: Some Personal Recollections". The Annals of Probability. 4 (4): 509–546. doi:10.1214/aop/1176996025. ISSN 0091-1798.
  29. ^ Applebaum, David (2004). "Lévy processes: From probability to finance and quantum groups". Notices of the AMS. 51 (11): 1336–1347.
  30. ^ Jochen Blath; Peter Imkeller; Sylvie Rœlly (2011). Surveys in Stochastic Processes. European Mathematical Society. pp. 5–. ISBN 978-3-03719-072-2.
  31. ^ Michel Talagrand (12 February 2014). Upper and Lower Bounds for Stochastic Processes: Modern Methods and Classical Problems. Springer Science & Business Media. pp. 4–. ISBN 978-3-642-54075-2.
  32. ^ Paul C. Bressloff (22 August 2014). Stochastic Processes in Cell Biology. Springer. pp. vii–ix. ISBN 978-3-319-08488-6.
  33. ^ Douglas Hubbard "How to Measure Anything: Finding the Value of Intangibles in Business" p. 46, John Wiley & Sons, 2007
  34. ^ Hänggi, P. (2002). "Stochastic Resonance in Biology How Noise Can Enhance Detection of Weak Signals and Help Improve Biological Information Processing". ChemPhysChem. 3 (3): 285–90. doi:10.1002/1439-7641(20020315)3:3<285::AID-CPHC285>3.0.CO;2-A. PMID 12503175.
  35. ^ Priplata, A.; et al. (2006). "Noise-Enhanced Balance Control in Patients with Diabetes and Patients with Stroke" (PDF). Ann Neurol. 59: 4–12. doi:10.1002/ana.20670. PMID 16287079.
  36. ^ Ilias Chrissochoidis, Stavros Houliaras, and Christos Mitsakis, "Set theory in Xenakis' EONTA", in International Symposium Iannis Xenakis, ed. Anastasia Georgaki and Makis Solomos (Athens: The National and Kapodistrian University, 2005), 241–249.
  37. ^ Newmeyer, Frederick. 2001. "The Prague School and North American functionalist approaches to syntax" Journal of Linguistics 37, pp. 101–126. "Since most American functionalists adhere to this trend, I will refer to it and its practitioners with the initials 'USF'. Some of the more prominent USFs are Joan Bybee, William Croft, Talmy Givon, John Haiman, Paul Hopper, Marianne Mithun and Sandra Thompson. In its most extreme form (Hopper 1987, 1988), USF rejects the Saussurean dichotomies such as langue vs. parole and synchrony vs. diachrony. All adherents of this tendency feel that the Chomskyan advocacy of a sharp distinction between competence and performance is at best unproductive and obscurantist; at worst theoretically unmotivated."
  38. ^ Bybee, Joan. "Usage-based phonology." p. 213 in Darnel, Mike (ed). 1999. Functionalism and Formalism in Linguistics: General papers. John Benjamins Publishing Company
  39. ^ Chomsky (1959). Review of Skinner's Verbal Behavior, Language, 35: 26–58
  40. ^ Manning and Schütze, (1999) Foundations of Statistical Natural Language Processing, MIT Press. Cambridge, MA
  41. ^ Bybee (2007) Frequency of use and the organization of language. Oxford: Oxford University Press

External links

  • The dictionary definition of stochastic at Wiktionary
Algebra

Algebra (from Arabic "al-jabr", literally meaning "reunion of broken parts") is one of the broad parts of mathematics, together with number theory, geometry and analysis. In its most general form, algebra is the study of mathematical symbols and the rules for manipulating these symbols; it is a unifying thread of almost all of mathematics. It includes everything from elementary equation solving to the study of abstractions such as groups, rings, and fields. The more basic parts of algebra are called elementary algebra; the more abstract parts are called abstract algebra or modern algebra. Elementary algebra is generally considered to be essential for any study of mathematics, science, or engineering, as well as such applications as medicine and economics. Abstract algebra is a major area in advanced mathematics, studied primarily by professional mathematicians.

Elementary algebra differs from arithmetic in the use of abstractions, such as using letters to stand for numbers that are either unknown or allowed to take on many values. For example, in x + 2 = 5 the letter x is unknown, but the law of inverses can be used to discover its value: x = 3. In E = mc2, the letters E and m are variables, and the letter c is a constant, the speed of light in a vacuum. Algebra gives methods for writing formulas and solving equations that are much clearer and easier than the older method of writing everything out in words.

The word algebra is also used in certain specialized ways. A special kind of mathematical object in abstract algebra is called an "algebra", and the word is used, for example, in the phrases linear algebra and algebraic topology.

A mathematician who does research in algebra is called an algebraist.

Gaussian process

In probability theory and statistics, a Gaussian process is a stochastic process (a collection of random variables indexed by time or space), such that every finite collection of those random variables has a multivariate normal distribution, i.e. every finite linear combination of them is normally distributed. The distribution of a Gaussian process is the joint distribution of all those (infinitely many) random variables, and as such, it is a distribution over functions with a continuous domain, e.g. time or space.

A machine-learning algorithm that involves a Gaussian process uses lazy learning and a measure of the similarity between points (the kernel function) to predict the value for an unseen point from training data. The prediction is not just an estimate for that point, but also carries uncertainty information: it is a one-dimensional Gaussian distribution (the marginal distribution at that point). For some kernel functions, matrix algebra can be used to calculate the predictions using the technique of kriging. When a parameterised kernel is used, optimisation software is typically used to fit a Gaussian process model.
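
The prediction step can be sketched directly from the standard Gaussian process regression equations; the kernel choice, training data, and jitter value below are illustrative assumptions.

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential (RBF) kernel matrix between point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

# Noise-free training data from a known function, for illustration.
x_train = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y_train = np.sin(x_train)
x_test = np.array([0.5])

K = rbf(x_train, x_train) + 1e-8 * np.eye(len(x_train))  # jitter for stability
K_s = rbf(x_test, x_train)

# Posterior mean and variance at the test point: the kriging prediction.
alpha = np.linalg.solve(K, y_train)
mean = K_s @ alpha
var = rbf(x_test, x_test) - K_s @ np.linalg.solve(K, K_s.T)
```

The posterior mean interpolates the training points and tracks sin(x) closely between them, while the variance quantifies the uncertainty mentioned above.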

The concept of Gaussian processes is named after Carl Friedrich Gauss because it is based on the notion of the Gaussian distribution (normal distribution). Gaussian processes can be seen as an infinite-dimensional generalization of multivariate normal distributions.

Gaussian processes are useful in statistical modelling, benefiting from properties inherited from the normal. For example, if a random process is modelled as a Gaussian process, the distributions of various derived quantities can be obtained explicitly. Such quantities include the average value of the process over a range of times and the error in estimating the average using sample values at a small set of times.

Haematopoiesis

Haematopoiesis (from Greek αἷμα, "blood" and ποιεῖν "to make"; also hematopoiesis in American English; sometimes also haemopoiesis or hemopoiesis) is the formation of blood cellular components. All cellular blood components are derived from haematopoietic stem cells. In a healthy adult person, approximately 10¹¹–10¹² new blood cells are produced daily in order to maintain steady state levels in the peripheral circulation.

Independence (probability theory)

In probability theory, two events are independent, statistically independent, or stochastically independent if the occurrence of one does not affect the probability of occurrence of the other. Similarly, two random variables are independent if the realization of one does not affect the probability distribution of the other.

The concept of independence extends to collections of more than two events or random variables. In that case, the events are pairwise independent if each pair is independent, and mutually independent if each event is independent of every combination of the other events.
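
The defining product rule, P(A and B) = P(A)·P(B), can be checked exactly on a toy sample space of two fair coin tosses:

```python
from itertools import product
from fractions import Fraction

# Sample space of two fair coin tosses: 4 equally likely outcomes.
outcomes = list(product("HT", repeat=2))
p = Fraction(1, len(outcomes))

# A = "first toss is heads", B = "second toss is heads".
P_A = sum(p for o in outcomes if o[0] == "H")
P_B = sum(p for o in outcomes if o[1] == "H")
P_AB = sum(p for o in outcomes if o[0] == "H" and o[1] == "H")

# Independence: the joint probability factors into the product.
```

Here P(A) = P(B) = 1/2 and P(A and B) = 1/4 = 1/2 · 1/2, so the events are independent, exactly as the intuition about one occurrence not affecting the other suggests.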

Itô calculus

Itô calculus, named after Kiyoshi Itô, extends the methods of calculus to stochastic processes such as Brownian motion (see Wiener process). It has important applications in mathematical finance and stochastic differential equations.

The central concept is the Itô stochastic integral, a stochastic generalization of the Riemann–Stieltjes integral in analysis. The integrands and the integrators are now stochastic processes:

    Y_t = ∫₀ᵗ H_s dX_s

where H is a locally square-integrable process adapted to the filtration generated by X (Revuz & Yor 1999, Chapter IV), which is a Brownian motion or, more generally, a semimartingale. The result of the integration is then another stochastic process. Concretely, the integral from 0 to any particular t is a random variable, defined as a limit of a certain sequence of random variables. The paths of Brownian motion fail to satisfy the requirements needed to apply the standard techniques of calculus. So with the integrand a stochastic process, the Itô stochastic integral amounts to an integral with respect to a function which is not differentiable at any point and has infinite variation over every time interval. The main insight is that the integral can be defined as long as the integrand H is adapted, which loosely speaking means that its value at time t can only depend on information available up until this time. Roughly speaking, one chooses a sequence of partitions of the interval from 0 to t and constructs Riemann sums. Each Riemann sum uses a particular instantiation of the integrator, and it is crucial which point in each of the small intervals is used to compute the value of the function. The limit is then taken in probability as the mesh of the partition goes to zero. Numerous technical details have to be taken care of to show that this limit exists and is independent of the particular sequence of partitions. Typically, the left end of the interval is used.
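
The left-endpoint construction can be checked numerically. For the Itô integral of Brownian motion against itself, Itô's lemma gives the exact value (W_T² − T)/2, which differs from the classical answer W_T²/2 by the quadratic-variation term T/2:

```python
import numpy as np

rng = np.random.default_rng(11)

n, T = 200_000, 1.0
dt = T / n
dW = rng.normal(0.0, np.sqrt(dt), size=n)     # Brownian increments
W = np.concatenate(([0.0], np.cumsum(dW)))    # Brownian path, W[0] = 0

# Itô Riemann sum: the integrand is evaluated at the LEFT endpoint
# of each small interval, as the construction above requires.
ito_sum = np.sum(W[:-1] * dW)

# Value predicted by Itô's lemma for ∫ W dW over [0, T].
exact = (W[-1] ** 2 - T) / 2
```

Evaluating at the right endpoint instead would shift the sum by roughly the quadratic variation T, which is why the choice of evaluation point matters here, unlike in ordinary calculus.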

Important results of Itô calculus include the integration by parts formula and Itô's lemma, which is a change of variables formula. These differ from the formulas of standard calculus, due to quadratic variation terms.

In mathematical finance, this evaluation strategy of the integral is conceptualized as first deciding what to do, then observing the change in prices. The integrand is how much stock we hold, the integrator represents the movement of the prices, and the integral is how much money we have in total, including what our stock is worth, at any given moment. The prices of stocks and other traded financial assets can be modeled by stochastic processes such as Brownian motion or, more often, geometric Brownian motion (see Black–Scholes). Then, the Itô stochastic integral represents the payoff of a continuous-time trading strategy consisting of holding an amount Ht of the stock at time t. In this situation, the condition that H is adapted corresponds to the necessary restriction that the trading strategy can only make use of the available information at any time. This prevents the possibility of unlimited gains through high-frequency trading: buying the stock just before each uptick in the market and selling before each downtick. Similarly, the condition that H is adapted implies that the stochastic integral will not diverge when calculated as a limit of Riemann sums (Revuz & Yor 1999, Chapter IV).

List of stochastic processes topics

In the mathematics of probability, a stochastic process is a random function. In practical applications, the domain over which the function is defined is a time interval (time series) or a region of space (random field).

Familiar examples of time series include stock market and exchange rate fluctuations, signals such as speech, audio and video; medical data such as a patient's EKG, EEG, blood pressure or temperature; and random movement such as Brownian motion or random walks.

Examples of random fields include static images, random topographies (landscapes), or composition variations of an inhomogeneous material.

Lone wolf (terrorism)

A lone actor, lone-actor terrorist, or lone wolf, is someone who prepares and commits violent acts alone, outside of any command structure and without material assistance from any group. He or she may be influenced or motivated by the ideology and beliefs of an external group and may act in support of such a group. In its original sense, a "lone wolf" is an animal or person that generally lives or spends time alone instead of with a group. Observers note that such attacks are a relatively rare type of terrorist attack but have been increasing in number, and that it is sometimes difficult to tell whether an actor has received outside help; what appears to be a lone wolf attack may actually have been carefully orchestrated from outside.

Malliavin calculus

In probability theory and related fields, Malliavin calculus is a set of mathematical techniques and ideas that extend the mathematical field of calculus of variations from deterministic functions to stochastic processes. In particular, it allows the computation of derivatives of random variables. Malliavin calculus is also called the stochastic calculus of variations.

Malliavin calculus is named after Paul Malliavin whose ideas led to a proof that Hörmander's condition implies the existence and smoothness of a density for the solution of a stochastic differential equation; Hörmander's original proof was based on the theory of partial differential equations. The calculus has been applied to stochastic partial differential equations as well.

The calculus allows integration by parts with random variables; this operation is used in mathematical finance to compute the sensitivities of financial derivatives. The calculus has applications in, for example, stochastic filtering.

Network scheduler

A network scheduler, also called packet scheduler, queueing discipline, qdisc or queueing algorithm, is an arbiter on a node in a packet-switching communication network. It manages the sequence of network packets in the transmit and receive queues of the network interface controller. Several network schedulers are available for different operating systems, implementing many of the existing network scheduling algorithms.

The network scheduler logic decides which network packet to forward next. The network scheduler is associated with a queuing system, storing the network packets temporarily until they are transmitted. Systems may have a single or multiple queues in which case each may hold the packets of one flow, classification, or priority.

In some cases it may not be possible to schedule all transmissions within the constraints of the system. In these cases the network scheduler is responsible for deciding which traffic to forward and what gets dropped.

Observational error

Observational error (or measurement error) is the difference between a measured value of a quantity and its true value. In statistics, an error is not a "mistake". Variability is an inherent part of the results of measurements and of the measurement process.

Measurement errors can be divided into two components: random error and systematic error. Random errors are errors in measurement that lead to measurable values being inconsistent when repeated measurements of a constant attribute or quantity are taken. Systematic errors are errors that are not determined by chance but are introduced by an inaccuracy (involving either the observation or measurement process) inherent to the system. Systematic error may also refer to an error with a non-zero mean, the effect of which is not reduced when observations are averaged.
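The distinction can be illustrated with a small simulation: averaging many readings shrinks the random component toward zero, while a systematic bias survives in the mean. All numbers here are made up for illustration:

```python
import random
import statistics

random.seed(0)

TRUE_VALUE = 10.0
BIAS = 0.5          # systematic error: a constant offset in every reading

def measure():
    # Each reading = true value + systematic bias + random error
    return TRUE_VALUE + BIAS + random.gauss(0, 0.3)

readings = [measure() for _ in range(10000)]
mean = statistics.mean(readings)

# Averaging reduces the random component (std of the mean ~ 0.3/sqrt(10000)),
# but the systematic bias of +0.5 remains in the averaged result.
print(round(mean - TRUE_VALUE, 2))   # close to BIAS, not to 0
```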

Queueing theory

Queueing theory is the mathematical study of waiting lines, or queues. A queueing model is constructed so that queue lengths and waiting time can be predicted. Queueing theory is generally considered a branch of operations research because the results are often used when making business decisions about the resources needed to provide a service.

Queueing theory has its origins in research by Agner Krarup Erlang when he created models to describe the Copenhagen telephone exchange. The ideas have since seen applications including telecommunication, traffic engineering, computing and, particularly in industrial engineering, in the design of factories, shops, offices and hospitals, as well as in project management.
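For the simplest single-server model, the M/M/1 queue (Poisson arrivals, exponential service, one server), the key performance measures have closed forms. The sketch below computes them and checks Little's law; the arrival and service rates are illustrative:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Closed-form results for the M/M/1 queue."""
    rho = arrival_rate / service_rate          # server utilization
    assert rho < 1, "queue is unstable when arrivals outpace service"
    L = rho / (1 - rho)                        # mean number in system
    W = 1 / (service_rate - arrival_rate)      # mean time in system
    return rho, L, W

# e.g. 8 calls/min arriving at an exchange that serves 10 calls/min
rho, L, W = mm1_metrics(8.0, 10.0)
print(rho, L, W)                 # utilization 0.8, about 4 in system, 0.5 min in system

# Little's law: L = arrival_rate * W
assert abs(L - 8.0 * W) < 1e-9
```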

Stationary process

In mathematics and statistics, a stationary process (a.k.a. a strict/strictly stationary process or strong/strongly stationary process) is a stochastic process whose unconditional joint probability distribution does not change when shifted in time. Consequently, parameters such as mean and variance also do not change over time.

Since stationarity is an assumption underlying many statistical procedures used in time series analysis, non-stationary data is often transformed to become stationary. The most common cause of violation of stationarity is a trend in the mean, which can be due either to the presence of a unit root or of a deterministic trend. In the former case of a unit root, stochastic shocks have permanent effects, and the process is not mean-reverting. In the latter case of a deterministic trend, the process is called a trend stationary process, and stochastic shocks have only transitory effects after which the variable tends toward a deterministically evolving (non-constant) mean.

A trend stationary process is not strictly stationary, but can easily be transformed into a stationary process by removing the underlying trend, which is solely a function of time. Similarly, processes with one or more unit roots can be made stationary through differencing. An important type of non-stationary process that does not include a trend-like behavior is a cyclostationary process, which is a stochastic process that varies cyclically with time.
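The effect of differencing a unit-root process can be seen in a short simulation: a random walk's variance grows with time, but its first differences recover the i.i.d. increments, whose variance is time-invariant. A minimal sketch:

```python
import random
import statistics

random.seed(1)

# A random walk has a unit root: shocks are permanent and its
# variance grows with time, so the process is not stationary.
steps = [random.gauss(0, 1) for _ in range(5000)]
walk = [0.0]
for s in steps:
    walk.append(walk[-1] + s)

# First differencing recovers the i.i.d. increments, which form
# a strictly stationary process.
diffs = [walk[t] - walk[t - 1] for t in range(1, len(walk))]

early = statistics.variance(diffs[:2500])   # variance of the first half
late = statistics.variance(diffs[2500:])    # variance of the second half
print(round(early, 2), round(late, 2))      # both near 1: time-invariant
```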

For many applications strict-sense stationarity is too restrictive. Other forms of stationarity such as wide-sense stationarity or Nth-order stationarity are then employed. The definitions for different kinds of stationarity are not consistent among different authors (see Other terminology).

Stochastic calculus

Stochastic calculus is a branch of mathematics that operates on stochastic processes. It allows a consistent theory of integration to be defined for integrals of stochastic processes with respect to stochastic processes. It is used to model systems that behave randomly.

The best-known stochastic process to which stochastic calculus is applied is the Wiener process (named in honor of Norbert Wiener), which is used for modeling Brownian motion, as described by Louis Bachelier in 1900 and by Albert Einstein in 1905, and other physical diffusion processes of particles subject to random forces. Since the 1970s, the Wiener process has been widely applied in financial mathematics and economics to model the evolution in time of stock prices and bond interest rates.

The main flavours of stochastic calculus are the Itô calculus and its variational relative, the Malliavin calculus. For technical reasons the Itô integral is the most useful for general classes of processes, but the related Stratonovich integral is frequently useful in problem formulation (particularly in engineering disciplines). The Stratonovich integral can readily be expressed in terms of the Itô integral. The main benefit of the Stratonovich integral is that it obeys the usual chain rule and therefore does not require Itô's lemma. This enables problems to be expressed in a coordinate-system-invariant form, which is invaluable when developing stochastic calculus on manifolds other than R^n.

The dominated convergence theorem does not hold for the Stratonovich integral; consequently, it is very difficult to prove results without re-expressing the integrals in Itô form.
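The difference between the two integrals can be seen numerically. Approximating the integral of W dW along a simulated Wiener path with left-point sums (Itô) versus midpoint sums (Stratonovich) gives answers that differ by T/2 in the limit: the Itô value tends to (W_T^2 - T)/2, the Stratonovich value to W_T^2/2. A sketch:

```python
import random

random.seed(2)

# Simulate one Wiener path on [0, T] and approximate the integral of
# W dW with left-point (Ito) and midpoint (Stratonovich) Riemann sums.
N, T = 200_000, 1.0
dt = T / N
W = [0.0]
for _ in range(N):
    W.append(W[-1] + random.gauss(0, dt ** 0.5))

ito = sum(W[i] * (W[i + 1] - W[i]) for i in range(N))
strat = sum(0.5 * (W[i] + W[i + 1]) * (W[i + 1] - W[i]) for i in range(N))

print(round(ito - (W[N] ** 2 - T) / 2, 2))    # near 0
print(round(strat - W[N] ** 2 / 2, 2))        # near 0 (the midpoint sum telescopes)
```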

Stochastic differential equation

A stochastic differential equation (SDE) is a differential equation in which one or more of the terms is a stochastic process, resulting in a solution which is also a stochastic process. SDEs are used to model various phenomena such as unstable stock prices or physical systems subject to thermal fluctuations. Typically, SDEs contain a variable which represents random white noise calculated as the derivative of Brownian motion or the Wiener process. However, other types of random behaviour are possible, such as jump processes.
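A minimal sketch of simulating such an SDE with the Euler–Maruyama scheme, here for geometric Brownian motion dS = mu*S dt + sigma*S dW, a standard model for stock prices; all parameter values are illustrative:

```python
import random

random.seed(3)

def euler_maruyama_gbm(s0, mu, sigma, T, n_steps):
    """Euler-Maruyama discretization of the geometric Brownian motion SDE
    dS = mu*S dt + sigma*S dW: replace dW with a Gaussian increment of
    variance dt at each step."""
    dt = T / n_steps
    s = s0
    for _ in range(n_steps):
        dW = random.gauss(0, dt ** 0.5)     # Brownian increment over dt
        s += mu * s * dt + sigma * s * dW
    return s

# Monte Carlo check: for GBM, E[S_T] = s0 * exp(mu * T)
paths = [euler_maruyama_gbm(100.0, 0.05, 0.2, 1.0, 100) for _ in range(5000)]
est = sum(paths) / len(paths)
print(round(est, 1))    # near 100 * exp(0.05), about 105.1
```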

Stochastic game

In game theory, a stochastic game, introduced by Lloyd Shapley in the early 1950s, is a dynamic game with probabilistic transitions played by one or more players. The game is played in a sequence of stages. At the beginning of each stage the game is in some state. The players select actions and each player receives a payoff that depends on the current state and the chosen actions. The game then moves to a new random state whose distribution depends on the previous state and the actions chosen by the players. The procedure is repeated at the new state and play continues for a finite or infinite number of stages. The total payoff to a player is often taken to be the discounted sum of the stage payoffs or the limit inferior of the averages of the stage payoffs.

Stochastic games generalize both Markov decision processes and repeated games.
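Since a one-player stochastic game is a Markov decision process, the discounted sum of stage payoffs in that special case can be computed by standard value iteration. The states, actions, and numbers below are made up for illustration:

```python
def value_iteration(transitions, rewards, discount=0.9, tol=1e-9):
    """One-player stochastic game (i.e. a Markov decision process),
    solved by value iteration on the discounted sum of stage payoffs."""
    states = list(transitions)
    V = {s: 0.0 for s in states}
    while True:
        V_new = {}
        for s in states:
            V_new[s] = max(
                rewards[s][a]
                + discount * sum(p * V[s2] for s2, p in transitions[s][a].items())
                for a in transitions[s]
            )
        if max(abs(V_new[s] - V[s]) for s in states) < tol:
            return V_new
        V = V_new

# Two states: from A, "go" pays 1 and moves to the rewarding state B
# with probability 0.8; from B, "stay" pays 2 forever.
transitions = {
    "A": {"stay": {"A": 1.0}, "go": {"B": 0.8, "A": 0.2}},
    "B": {"stay": {"B": 1.0}},
}
rewards = {"A": {"stay": 0.0, "go": 1.0}, "B": {"stay": 2.0}}
V = value_iteration(transitions, rewards)
print(round(V["B"], 1))   # 2 / (1 - 0.9) = 20.0
```

In the general multi-player game, each iteration instead solves a matrix game at every state, as in Shapley's original construction.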

Stochastic gradient descent

Stochastic gradient descent (often shortened to SGD), also known as incremental gradient descent, is an iterative method for optimizing a differentiable objective function, a stochastic approximation of gradient descent optimization. A 2018 article implicitly credits Herbert Robbins and Sutton Monro for developing SGD in their 1951 article titled "A Stochastic Approximation Method"; see Stochastic approximation for more information. It is called stochastic because samples are selected randomly (or shuffled) instead of as a single group (as in standard gradient descent) or in the order they appear in the training set.
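A minimal sketch of SGD on a toy least-squares problem, updating on one randomly chosen sample per step rather than the full dataset; the data and learning rate are illustrative:

```python
import random

random.seed(4)

# Fit y = w*x + b by SGD on squared error, for the noiseless
# toy data y = 2x + 1.
data = [(x, 2.0 * x + 1.0) for x in [i / 10 for i in range(-20, 21)]]
w, b, lr = 0.0, 0.0, 0.05

for step in range(20000):
    x, y = random.choice(data)          # stochastic: one random sample
    err = (w * x + b) - y
    w -= lr * err * x                   # gradient of 0.5*err^2 w.r.t. w
    b -= lr * err                       # gradient of 0.5*err^2 w.r.t. b

print(round(w, 2), round(b, 2))         # approaches w = 2, b = 1
```

Standard gradient descent would instead average the gradient over all samples before each update.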

Stochastic matrix

In mathematics, a stochastic matrix is a square matrix used to describe the transitions of a Markov chain. Each of its entries is a nonnegative real number representing a probability. It is also called a probability matrix, transition matrix, substitution matrix, or Markov matrix.

The stochastic matrix was first developed by Andrey Markov at the beginning of the 20th century, and has found use throughout a wide variety of scientific fields, including probability theory, statistics, mathematical finance and linear algebra, as well as computer science and population genetics.

There are several different definitions and types of stochastic matrices:

A right stochastic matrix is a real square matrix, with each row summing to 1.

A left stochastic matrix is a real square matrix, with each column summing to 1.

A doubly stochastic matrix is a square matrix of nonnegative real numbers with each row and column summing to 1.

In the same vein, one may define a stochastic vector (also called probability vector) as a vector whose elements are nonnegative real numbers which sum to 1. Thus, each row of a right stochastic matrix (or column of a left stochastic matrix) is a stochastic vector.

A common convention in English language mathematics literature is to use row vectors of probabilities and right stochastic matrices rather than column vectors of probabilities and left stochastic matrices; this article follows that convention.
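These definitions can be checked directly. The sketch below verifies the row sums of a small made-up two-state chain and iterates the update v' = vP (row vector times right stochastic matrix) toward the stationary distribution:

```python
# A right stochastic matrix: each row is nonnegative and sums to 1.
P = [
    [0.9, 0.1],   # sunny -> sunny / rainy
    [0.5, 0.5],   # rainy -> sunny / rainy
]
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)

def step(v, P):
    """Advance a probability (row) vector one Markov step: v' = vP."""
    n = len(P)
    return [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]

v = [1.0, 0.0]            # start surely in the first state
for _ in range(50):
    v = step(v, P)
print([round(p, 4) for p in v])   # converges to the stationary distribution [5/6, 1/6]
```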

Stochastic process

In probability theory and related fields, a stochastic or random process is a mathematical object usually defined as a collection of random variables. Historically, the random variables were associated with or indexed by a set of numbers, usually viewed as points in time, giving the interpretation of a stochastic process representing numerical values of some system randomly changing over time, such as the growth of a bacterial population, an electrical current fluctuating due to thermal noise, or the movement of a gas molecule. Stochastic processes are widely used as mathematical models of systems and phenomena that appear to vary in a random manner. They have applications in many disciplines including sciences such as biology, chemistry, ecology, neuroscience, and physics as well as technology and engineering fields such as image processing, signal processing, information theory, computer science, cryptography and telecommunications. Furthermore, seemingly random changes in financial markets have motivated the extensive use of stochastic processes in finance.

Applications and the study of phenomena have in turn inspired the proposal of new stochastic processes. Examples of such stochastic processes include the Wiener process or Brownian motion process, used by Louis Bachelier to study price changes on the Paris Bourse, and the Poisson process, used by A. K. Erlang to study the number of phone calls occurring in a certain period of time. These two stochastic processes are considered the most important and central in the theory of stochastic processes, and were discovered repeatedly and independently, both before and after Bachelier and Erlang, in different settings and countries.

The term random function is also used to refer to a stochastic or random process, because a stochastic process can also be interpreted as a random element in a function space.
The terms stochastic process and random process are used interchangeably, often with no specific mathematical space for the set that indexes the random variables. But often these two terms are used when the random variables are indexed by the integers or an interval of the real line. If the random variables are indexed by the Cartesian plane or some higher-dimensional Euclidean space, then the collection of random variables is usually called a random field instead. The values of a stochastic process are not always numbers and can be vectors or other mathematical objects.

Based on their mathematical properties, stochastic processes can be divided into various categories, which include random walks, martingales, Markov processes, Lévy processes, Gaussian processes, random fields, renewal processes, and branching processes. The study of stochastic processes uses mathematical knowledge and techniques from probability, calculus, linear algebra, set theory, and topology as well as branches of mathematical analysis such as real analysis, measure theory, Fourier analysis, and functional analysis. The theory of stochastic processes is considered to be an important contribution to mathematics and it continues to be an active topic of research for both theoretical reasons and applications.
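As a concrete example, Erlang's call-arrival model is a Poisson process, which can be simulated from its exponential inter-arrival times; the rate and time horizon below are illustrative:

```python
import random

random.seed(5)

def arrivals_in(T, lam):
    """Count arrivals of a rate-lam Poisson process in [0, T] by summing
    exponential inter-arrival times -- the model Erlang used for phone
    calls reaching an exchange."""
    t, count = 0.0, 0
    while True:
        t += random.expovariate(lam)    # exponential gap to the next call
        if t > T:
            return count
        count += 1

# The count in [0, T] is Poisson-distributed with mean lam * T.
counts = [arrivals_in(10.0, 3.0) for _ in range(2000)]
mean = sum(counts) / len(counts)
print(round(mean, 1))   # near lam * T = 30
```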

Stochastic quantum mechanics

Stochastic quantum mechanics (or the stochastic interpretation) is an interpretation of quantum mechanics.

The modern application of stochastics to quantum mechanics involves the assumption of spacetime stochasticity, the idea that the small-scale structure of spacetime is undergoing both metric and topological fluctuations (John Archibald Wheeler's "quantum foam"), and that the averaged result of these fluctuations recreates a more conventional-looking metric at larger scales that can be described using classical physics, along with an element of nonlocality that can be described using quantum mechanics. One such stochastic interpretation attributes quantum mechanics to persistent vacuum fluctuations. The main idea is that vacuum or spacetime fluctuations are the cause of quantum mechanics, not a result of it, as is usually assumed.

This page is based on a Wikipedia article.
Text is available under the CC BY-SA 3.0 license; additional terms may apply.
Images, videos and audio are available under their respective licenses.