Information Age

The Information Age (also known as the Computer Age, Digital Age, or New Media Age) is a historic period that began in the mid-20th century, characterized by a rapid shift from the traditional industry established by the Industrial Revolution to an economy based on information technology. The onset of the Information Age is associated with the Digital Revolution, just as the Industrial Revolution marked the onset of the Industrial Age.[1] The definition of what "digital" means (or what "information" means) continues to change over time as new technologies, user devices, and methods of interacting with other humans and devices enter the domain of research, development, and market launch.

During the Information Age, the digital industry shapes a knowledge-based society surrounded by a high-tech global economy that influences how the manufacturing and service sectors operate efficiently and conveniently. In a commercialized society, the information industry allows individuals to explore their personalized needs, simplifying the decisions involved in transactions and significantly lowering costs for both producers and buyers. This efficacy has been embraced by participants throughout economic activity, encouraging new economic incentives such as the knowledge economy.[2]

The Information Age was formed by capitalizing on advances in the microminiaturization of computers.[3] This evolution of technology in daily life and social organization has led to the modernization of information and communication processes becoming the driving force of social evolution.[4]

Concept of origin

The Information Age can be divided into the Primary, Secondary, and Tertiary Information Ages. Information in the Primary Information Age was handled by newspapers, radio, and television. The Secondary Information Age was developed by the Internet, satellite television, and mobile phones. The Tertiary Information Age emerged as the media of the Primary Information Age became interconnected with the media of the Secondary Information Age, and it is the stage we are experiencing today.[5]

Three stages of the Information Age

Progression

Rings of time showing some important dates in the Information Age (Digital Revolution) from 1968 to 2017

Library expansion

Library expansion was calculated in 1945 by Fremont Rider to double in capacity every 16 years, if sufficient space were made available.[6] He advocated replacing bulky, decaying printed works with miniaturized microform analog photographs, which could be duplicated on demand for library patrons or other institutions. He did not foresee the digital technology that would follow decades later to replace analog microform with digital imaging, storage, and transmission media. Automated, potentially lossless digital technologies allowed vast increases in the speed of information growth. Moore's law, formulated around 1965, predicted that the number of transistors in a dense integrated circuit doubles approximately every two years.[7]
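
The compound growth implied by such doubling laws can be illustrated with a short calculation (a minimal sketch; the 1971 starting point of 2,300 transistors is an assumed example, not a figure cited in this article):

  # Illustrative sketch: project transistor counts under Moore's law
  # (doubling every two years) from an assumed 1971 baseline of 2,300
  # transistors.
  def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
      """Count predicted by a fixed doubling period."""
      return base_count * 2 ** ((year - base_year) / doubling_years)

  for year in (1971, 1981, 1991, 2001, 2011):
      print(year, f"{transistors(year):,.0f}")

The same arithmetic, with a 16-year period instead of two, applies to Rider's estimate of library expansion.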

The proliferation of smaller, less expensive personal computers and improvements in computing power by the early 1980s gave increasing numbers of workers sudden access to information and the ability to share and store it. Connectivity between computers within companies enabled workers at different levels to access greater amounts of information.

Information storage

The world's technological capacity to store information grew from 2.6 (optimally compressed) exabytes in 1986 to 15.8 in 1993, over 54.5 in 2000, and to 295 (optimally compressed) exabytes in 2007. This is the informational equivalent of less than one 730-MB CD-ROM per person in 1986 (539 MB per person), roughly 4 CD-ROMs per person in 1993, 12 CD-ROMs per person in 2000, and almost 61 CD-ROMs per person in 2007.[8] It is estimated that the world's capacity to store information reached 5 zettabytes by 2014.[9] This is the informational equivalent of 4,500 stacks of printed books reaching from the Earth to the Sun.
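
The per-person figures are simple divisions of total capacity by world population, as a quick check shows (a sketch; the 1986 world population of roughly 4.8 billion is an assumed input, not a figure from this article):

  # Rough check of the 1986 per-capita storage figure.
  EXABYTE = 10**18
  MEGABYTE = 10**6

  total_1986 = 2.6 * EXABYTE     # optimally compressed bytes stored worldwide
  population_1986 = 4.8e9        # assumed world population in 1986
  per_person = total_1986 / population_1986

  print(f"{per_person / MEGABYTE:.0f} MB per person")    # ~540 MB, close to the 539 MB cited
  print(f"{per_person / (730 * MEGABYTE):.2f} CD-ROMs")  # under one 730-MB CD-ROM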

Information transmission

The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of (optimally compressed) information in 1986, 715 (optimally compressed) exabytes in 1993, 1.2 (optimally compressed) zettabytes in 2000, and 1.9 zettabytes in 2007 (this is the information equivalent of 174 newspapers per person per day).[8] The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of (optimally compressed) information in 1986, 471 petabytes in 1993, 2.2 (optimally compressed) exabytes in 2000, and 65 (optimally compressed) exabytes in 2007 (this is the information equivalent of 6 newspapers per person per day).[8] In the 1990s, the spread of the Internet caused a sudden leap in access to and ability to share information in businesses and homes globally. Technology was developing so quickly that a computer costing $3000 in 1997 would cost $2000 two years later and $1000 the following year.

Computation

The world's technological capacity to compute information with humanly guided general-purpose computers grew from 3.0 × 10^8 MIPS in 1986, to 4.4 × 10^9 MIPS in 1993, 2.9 × 10^11 MIPS in 2000, and 6.4 × 10^12 MIPS in 2007.[8] An article in the journal Trends in Ecology and Evolution reported that by now digital technology "has vastly exceeded the cognitive capacity of any single human being and has done so a decade earlier than predicted. In terms of capacity, there are two measures of importance: the number of operations a system can perform and the amount of information that can be stored. The number of synaptic operations per second in a human brain has been estimated to lie between 10^15 and 10^17. While this number is impressive, even in 2007 humanity's general-purpose computers were capable of performing well over 10^18 instructions per second. Estimates suggest that the storage capacity of an individual human brain is about 10^12 bytes. On a per capita basis, this is matched by current digital storage (5x10^21 bytes per 7.2x10^9 people)".[9]
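
The per-capita comparison at the end of the quoted passage can be reproduced directly from the figures it gives (a minimal sketch):

  # Reproduce the brain-versus-storage comparison from the quoted passage.
  digital_storage = 5e21   # bytes of digital storage worldwide
  population = 7.2e9       # people
  brain_capacity = 1e12    # estimated bytes stored by one human brain

  per_capita = digital_storage / population
  print(f"digital storage per person: {per_capita:.2e} bytes")                   # ~6.94e11
  print(f"fraction of one brain's capacity: {per_capita / brain_capacity:.2f}")  # ~0.69

That is, per-capita digital storage is of the same order of magnitude as the estimated capacity of a single human brain.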

Relation to economics

Eventually, Information and Communication Technology—computers, computerized machinery, fiber optics, communication satellites, internet, and other ICT tools—became a significant part of the economy. Microcomputers were developed and many businesses and industries were greatly changed by ICT.

Nicholas Negroponte captured the essence of these changes in his 1995 book, Being Digital.[10] His book discusses similarities and differences between products made of atoms and products made of bits. In essence, a copy of a product made of bits can be made cheaply and quickly, and shipped across the country or internationally quickly and at very low cost.

Impact on jobs and income distribution

The Information Age has affected the workforce in several ways. It has created a situation in which workers who perform easily automated tasks are forced to find work that is not easily automated.[11] Workers are also being forced to compete in a global job market. Lastly, workers are being replaced by computers that can do their jobs faster and more effectively. This poses problems for workers in industrial societies that have yet to be solved. However, solutions that involve reducing working hours usually meet strong resistance.

Jobs traditionally associated with the middle class (assembly line workers, data processors, foremen and supervisors) are beginning to disappear, either through outsourcing or automation. Individuals who lose their jobs must either move up, joining a group of "mind workers" (engineers, doctors, attorneys, teachers, scientists, professors, executives, journalists, consultants), or settle for low-skill, low-wage service jobs.

The "mind workers" are able to compete successfully in the world market and receive (relatively) high wages. Conversely, production workers and service workers in industrialized nations are unable to compete with workers in developing countries and either lose their jobs through outsourcing or are forced to accept wage cuts.[12] In addition, the internet makes it possible for workers in developing countries to provide in-person services and compete directly with their counterparts in other nations.

This has had several major consequences, including increased opportunity in developing countries and the globalisation of the workforce.

Workers in developing countries have a competitive advantage that translates into increased opportunities and higher wages.[13] The full impact on the workforce in developing countries is complex, however, and has downsides.

In the past, the economic fate of workers was tied to the fate of national economies. For example, workers in the United States were once well paid in comparison to the workers in other countries. With the advent of the Information Age and improvements in communication, this is no longer the case. Because workers are forced to compete in a global job market, wages are less dependent on the success or failure of individual economies.[12]

Automation, productivity and job gain

The Information Age has affected the workforce in that automation and computerisation have resulted in higher productivity coupled with net job loss in manufacturing. In the United States, for example, from January 1972 to August 2010, the number of people employed in manufacturing jobs fell from 17,500,000 to 11,500,000 while manufacturing value rose 270%.[14]

Although it initially appeared that job loss in the industrial sector might be partially offset by the rapid growth of jobs in the IT sector, the recession of March 2001 foreshadowed a sharp drop in the number of jobs in the IT sector. This pattern of decrease in jobs continued until 2003.[15]

Data has shown that overall, technology creates more jobs than it destroys even in the short run.[16]

Rise of information-intensive industry

Industry is becoming more information-intensive and less labor- and capital-intensive (see Information industry). This trend has important implications for the workforce; workers are becoming increasingly productive as the value of their labor decreases. However, there are also important implications for capitalism itself; not only is the value of labor decreased, the value of capital is also diminished. In the classical model, investments in human capital and financial capital are important predictors of the performance of a new venture.[17] However, as demonstrated by Mark Zuckerberg and Facebook, it now seems possible for a group of relatively inexperienced people with limited capital to succeed on a large scale.[18]

Innovations

The Information Age was enabled by technology developed in the Digital Revolution, which was itself enabled by building on the developments in the Technological Revolution.

Computers

Before the advent of electronics, mechanical computers, like the Analytical Engine designed in 1837, were built to provide routine mathematical calculation and simple decision-making capabilities. Military needs during World War II drove the development of the first electronic and electromechanical computers, including the relay-based Z3 and the vacuum-tube-based Atanasoff–Berry Computer, Colossus, and ENIAC.

The invention of the transistor in 1947 enabled the era of mainframe computers (1950s – 1970s), typified by the IBM 360. These large, room-sized computers provided data calculation and manipulation that was much faster than humanly possible, but were expensive to buy and maintain, so were initially limited to a few scientific institutions, large corporations, and government agencies. As transistor technology rapidly improved, the ratio of computing power to size increased dramatically, giving direct access to computers to ever smaller groups of people.

Along with electronic arcade machines and home video game consoles in the 1970s, the development of personal computers like the Commodore PET and Apple II (both in 1977) gave individuals access to the computer. But data sharing between individual computers was either non-existent or largely manual, at first using punched cards and magnetic tape, and later floppy disks.

Data

Early developments for storing data were based on photographs, starting with microphotography in 1851 and then microform in the 1920s, with the ability to store documents on film making them much more compact. In the 1970s, electronic paper allowed digital information to appear as paper documents.

Early information theory and Hamming codes were developed around 1950, but awaited technical innovations in data transmission and storage to be put to full use. While cables transmitting digital data between computer terminals, peripherals, and mainframes were common, and special message-sharing systems leading to email were first developed in the 1960s, independent computer-to-computer networking began with ARPANET in 1969. This expanded to become the Internet (a term coined in 1974), and then the World Wide Web in 1989.
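
To illustrate what a Hamming code accomplishes, the sketch below implements the classic Hamming(7,4) scheme, which protects four data bits with three parity bits and can locate and correct any single flipped bit (a textbook construction, included here only as an illustration):

  # Hamming(7,4): 4 data bits -> 7-bit codeword; any single bit error
  # can be located by the parity checks and flipped back.
  def encode(d1, d2, d3, d4):
      p1 = d1 ^ d2 ^ d4
      p2 = d1 ^ d3 ^ d4
      p3 = d2 ^ d3 ^ d4
      return [p1, p2, d1, p3, d2, d3, d4]   # bit positions 1..7

  def correct(bits):
      """Return a corrected copy of a received 7-bit codeword."""
      b = list(bits)
      s1 = b[0] ^ b[2] ^ b[4] ^ b[6]    # parity over positions 1,3,5,7
      s2 = b[1] ^ b[2] ^ b[5] ^ b[6]    # parity over positions 2,3,6,7
      s4 = b[3] ^ b[4] ^ b[5] ^ b[6]    # parity over positions 4,5,6,7
      error_pos = s1 + 2 * s2 + 4 * s4  # 0 means no error detected
      if error_pos:
          b[error_pos - 1] ^= 1         # flip the offending bit back
      return b

  word = encode(1, 0, 1, 1)
  word[4] ^= 1                          # simulate one bit flipped in transit
  assert correct(word) == encode(1, 0, 1, 1)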

Public digital data transmission first utilized existing phone lines using dial-up, starting in the 1950s, and this was the mainstay of the Internet until broadband in the 2000s. The introduction of wireless networking in the 1990s combined with the proliferation of communications satellites in the 2000s allowed for public digital transmission without the need for cables. This technology led to digital television, GPS, and satellite radio through the 1990s and 2000s.

Computers continued to become smaller and more powerful, to the point where they could be carried. In the 1980s and 1990s, laptops were developed as a form of portable computer, and PDAs could be used while standing or walking. Pagers, which had existed since the 1950s, were largely replaced by mobile phones beginning in the late 1990s, providing mobile networking features to some computers. Now commonplace, this technology was extended to digital cameras and other wearable devices. Starting in the late 1990s, tablets and then smartphones combined and extended these abilities of computing, mobility, and information sharing.

Optics

Optical communication has played an important role in communication networks.[19] Optical communication provided the hardware basis for internet technology, laying the foundations for the Digital Revolution and Information Age.[20]

While working at Tohoku University, Japanese engineer Jun-ichi Nishizawa proposed fiber-optic communication, the use of optical fibers for optical communication, in 1963.[21] Nishizawa invented other technologies that contributed to the development of optical fiber communications, such as the graded-index optical fiber as a channel for transmitting light from semiconductor lasers.[22][23] He patented the graded-index optical fiber in 1964,[20] the same year he invented the solid-state optical fiber.[24]

The three essential elements of optical communication were invented by Jun-ichi Nishizawa: the semiconductor laser (1957) as the light source, the graded-index optical fiber (1964) as the transmission line, and the PIN photodiode (1950) as the optical receiver.[20] Izuo Hayashi's invention of the continuous wave semiconductor laser in 1970 led directly to the light sources in fiber-optic communication, laser printers, barcode readers, and optical disc drives, which were commercialized by Japanese entrepreneurs[25] and opened up the field of optical communications.[19]

References

  1. ^ Castells, Manuel (1996). The Information Age: Economy, Society and Culture. Oxford: Blackwell. ISBN 978-0631215943. OCLC 43092627.
  2. ^ "Technology and Workforce: Comparison between the Information Revolution and the Industrial Revolution" by Mathias Humbert, University of California, Berkeley
  3. ^ Kluver, Randy. "Globalization, Informatization, and Intercultural Communication". United Nations Public Administration Network. Retrieved 18 April 2013.
  4. ^ Hilbert, M. (2015). Digital Technology and Social Change [Open Online Course at the University of California] (freely available). Retrieved from https://canvas.instructure.com/courses/949415
  5. ^ Iranga, Suroshana (2016). Social Media Culture. Colombo: S. Godage and Brothers. ISBN 978-9553067432.
  6. ^ Rider (1944). The Scholar and the Future of the Research Library. New York City: Hadham Press.
  7. ^ "Moore's Law to roll on for another decade". Retrieved 2011-11-27. Moore also affirmed he never said transistor count would double every 18 months, as is commonly said. Initially, he said transistors on a chip would double every year. He then recalibrated it to every two years in 1975. David House, an Intel executive at the time, noted that the changes would cause computer performance to double every 18 months.
  8. ^ a b c d Hilbert, Martin; López, Priscila (2011). "The World's Technological Capacity to Store, Communicate, and Compute Information". Science. 332 (6025): 60–65. Bibcode:2011Sci...332...60H. doi:10.1126/science.1200970. ISSN 0036-8075. PMID 21310967.
  9. ^ a b Gillings, Michael R.; Hilbert, Martin; Kemp, Darrell J. (2016). "Information in the Biosphere: Biological and Digital Worlds". Trends in Ecology & Evolution. 31 (3): 180–189. doi:10.1016/j.tree.2015.12.013. PMID 26777788.
  10. ^ "Negroponte's articles". Archives.obs-us.com. 1996-12-30. Retrieved 2012-06-11.
  11. ^ Porter, Michael. "How Information Gives You Competitive Advantage". Harvard Business Review. Retrieved 9 September 2015.
  12. ^ a b McGowan, Robert (1991). "The work of nations: Preparing ourselves for the 21st century capitalism, by Robert Reich. New York: Knopf Publishing, 1991". Human Resource Management. 30 (4): 535–538. doi:10.1002/hrm.3930300407. ISSN 1099-050X.
  13. ^ Bhagwati, Jagdish N. (2005). In defense of Globalization. New York: Oxford University Press.
  14. ^ "U.S. Manufacturing : Output vs. Jobs, January 1972 to August 2010 ". BLS and Fed Reserve graphic, reproduced in Smith, Fran. "Job Losses and Productivity Gains", OpenMarket.org, Oct 05, 2010.
  15. ^ Cooke, Sandra D. "Information Technology Workers in the Digital Economy", in Digital Economy 2003. 2003: Economics and Statistics Administration, Department of Commerce.
  16. ^ Chang, Yongsung; Hong, Jay H. (July 2013). SERI Quarterly. 6 (3): 44–53. Retrieved 29 April 2014.
  17. ^ Cooper, Arnold C.; Gimeno-Gascon, F. Javier; Woo, Carolyn Y. (1994). "Initial human and financial capital as predictors of new venture performance". Journal of Business Venturing. 9 (5): 371–395. doi:10.1016/0883-9026(94)90013-2.
  18. ^ Carr, David (2010-10-03). "Film Version of Zuckerberg Divides the Generations". The New York Times. ISSN 0362-4331. Retrieved 2016-12-20.
  19. ^ a b S. Millman (1983), A History of Engineering and Science in the Bell System, page 10, AT&T Bell Laboratories
  20. ^ a b c The Third Industrial Revolution Occurred in Sendai, Soh-VEHE International Patent Office, Japan Patent Attorneys Association
  21. ^ Nishizawa, Jun-ichi & Suto, Ken (2004). "Terahertz wave generation and light amplification using Raman effect". In Bhat, K. N. & DasGupta, Amitava. Physics of semiconductor devices. New Delhi, India: Narosa Publishing House. p. 27. ISBN 978-81-7319-567-9.
  22. ^ "Optical Fiber". Sendai New. Archived from the original on September 29, 2009. Retrieved April 5, 2009.
  23. ^ "New Medal Honors Japanese Microelectrics Industry Leader". Institute of Electrical and Electronics Engineers.
  24. ^ Semiconductor Technologies, page 338, Ohmsha, 1982
  25. ^ Johnstone, Bob (2000). We were burning : Japanese entrepreneurs and the forging of the electronic age. New York: BasicBooks. p. 252. ISBN 9780465091188.
  26. ^ "Newspapers News and News Archive Resources: Computer and Technology Sources". Temple University. Retrieved 9 September 2015.

Further reading

  • Oliver Stengel et al. (2017). Digitalzeitalter - Digitalgesellschaft, Springer ISBN 978-3658117580
  • Mendelson, Edward (June 2016). In the Depths of the Digital Age, The New York Review of Books
  • Bollacker, Kurt D. (2010) Avoiding a Digital Dark Age, American Scientist, March–April 2010, Volume 98, Number 2, p. 106ff
  • Castells, Manuel. (1996-98). The Information Age: Economy, Society and Culture, 3 vols. Oxford: Blackwell.
  • Gelbstein, E. (2006) Crossing the Executive Digital Divide. ISBN 99932-53-17-0

Claude Shannon

Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, and cryptographer known as "the father of information theory". Shannon is noted for having founded information theory with a landmark paper, A Mathematical Theory of Communication, that he published in 1948.

He is also well known for founding digital circuit design theory in 1937, when—as a 21-year-old master's degree student at the Massachusetts Institute of Technology (MIT)—he wrote his thesis demonstrating that electrical applications of Boolean algebra could construct any logical numerical relationship. Shannon contributed to the field of cryptanalysis for national defense during World War II, including his fundamental work on codebreaking and secure telecommunications.
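
The core idea of that thesis, that Boolean operations realized by switches can express logical and numerical relationships, can be seen in a half-adder, the circuit that adds two one-bit numbers (a standard textbook example, sketched here for illustration; it is not drawn from Shannon's text):

  # A half-adder expressed with Boolean operations: XOR gives the sum
  # bit and AND gives the carry bit, so switching logic performs
  # binary addition.
  def half_adder(a, b):
      return a ^ b, a & b   # (sum, carry)

  for a in (0, 1):
      for b in (0, 1):
          s, c = half_adder(a, b)
          assert a + b == 2 * c + s   # the logic really computes a + b
          print(f"{a} + {b} -> carry={c}, sum={s}")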

Contemporary history

Contemporary history, in English-language historiography, is a subset of modern history which describes the historical period from approximately 1945 to the present. The term "contemporary history" has been in use at least since the early 19th century.

Contemporary history is politically dominated by the Cold War (1945–91) between the United States and the Soviet Union, whose effects were felt across the world. The confrontation, which was mainly fought through proxy wars and through intervention in the internal politics of smaller nations, ultimately ended with the dissolution of the Soviet Union and the Warsaw Pact in 1991, following the Revolutions of 1989. The latter stages and aftermath of the Cold War enabled the democratisation of much of Europe, Africa, and Latin America. In the Middle East, the period after 1945 was dominated by conflict involving the new state of Israel and the rise of petroleum politics, as well as the growth of Islamism after the 1980s. The first supranational organisations of government, such as the United Nations and the European Union, emerged during the period after 1945, while the European colonial empires in Africa and Asia collapsed, gone by 1975.

Countercultures rose and the sexual revolution transformed social relations in western countries between the 1960s and 1980s, epitomised by the Protests of 1968. Living standards rose sharply across the developed world because of the post-war economic boom, whereby such major economies as Japan and West Germany emerged. The culture of the United States, especially consumerism, spread widely. By the 1960s, many western countries had begun deindustrializing; in their place, globalization led to the emergence of new industrial centres, such as Japan, Taiwan, and later China, which exported consumer goods to developed countries.

Science began transforming after 1945: spaceflight, nuclear technology, laser and semiconductor technology were developed alongside molecular biology and genetics, particle physics, and the Standard Model of quantum field theory. Meanwhile, the first computers were created, followed by the Internet, beginning the Information Age.

Cyberspace

Cyberspace is widespread, interconnected digital technology. The term entered popular culture from science fiction and the arts but is now used by technology strategists, security professionals, government, military and industry leaders, and entrepreneurs to describe the domain of the global technology environment. Others consider cyberspace to be just a notional environment in which communication over computer networks occurs. The word became popular in the 1990s, when the uses of the Internet, networking, and digital communication were all growing dramatically and the term "cyberspace" was able to represent the many new ideas and phenomena that were emerging. It has been called the largest unregulated and uncontrolled domain in the history of mankind, and is also unique because it is a domain created by people rather than one of the traditional physical domains.

The parent term of cyberspace is "cybernetic", derived from the Ancient Greek κυβερνήτης (kybernētēs, steersman, governor, pilot, or rudder), a word introduced by Norbert Wiener for his pioneering work in electronic communication and control science. The word cyberspace first appeared in an art installation of the same name by Danish artist Susanne Ussing (1968).

As a social experience, individuals can interact, exchange ideas, share information, provide social support, conduct business, direct actions, create artistic media, play games, engage in political discussion, and so on, using this global network. They are sometimes referred to as cybernauts. The term cyberspace has become a conventional means to describe anything associated with the Internet and the diverse Internet culture. The United States government recognizes the interconnected information technology, and the interdependent network of information technology infrastructures operating across this medium, as part of the US national critical infrastructure. Amongst individuals in cyberspace, there is believed to be a code of shared rules and ethics mutually beneficial for all to follow, referred to as cyberethics. Many view the right to privacy as most important to a functional code of cyberethics. Such moral responsibilities go hand in hand when working online with global networks, specifically when opinions are involved with online social experiences.

According to Chip Morningstar and F. Randall Farmer, cyberspace is defined more by the social interactions involved than by its technical implementation. In their view, the computational medium in cyberspace is an augmentation of the communication channel between real people; the core characteristic of cyberspace is that it offers an environment that consists of many participants with the ability to affect and influence each other. They derive this concept from the observation that people seek richness, complexity, and depth within a virtual world.

Digital Revolution

The Digital Revolution, also known as the Third Industrial Revolution, is the shift from mechanical and analogue electronic technology to digital electronics which began anywhere from the late 1950s to the late 1970s with the adoption and proliferation of digital computers and digital record keeping that continues to the present day. Implicitly, the term also refers to the sweeping changes brought about by digital computing and communication technology during (and after) the latter half of the 20th century. Analogous to the Agricultural Revolution and the Industrial Revolution, the Digital Revolution marked the beginning of the Information Age.

Central to this revolution is the mass production and widespread use of digital logic circuits and their derived technologies, including the computer, the digital cellular phone, and the Internet. These technological innovations have transformed traditional production and business techniques.

Ennis

Ennis (Irish: Inis, meaning "island") is the county town of County Clare, Ireland. The Irish name is short for Inis Cluana Rámhfhada ("island of the long rowing meadow"). The town is on the River Fergus, north of where it enters the Shannon Estuary, 19 km (12 mi) from Shannon Airport. In 2016, Ennis had a population of 25,276, making it the largest town in Clare and the 12th largest in Ireland.

Human factors and ergonomics

Human factors and ergonomics (commonly referred to as human factors) is the application of psychological and physiological principles to the (engineering and) design of products, processes, and systems. The goal of human factors is to reduce human error, increase productivity, and enhance safety and comfort with a specific focus on the interaction between the human and the thing of interest.

The field is a combination of numerous disciplines, such as psychology, sociology, engineering, biomechanics, industrial design, physiology, anthropometry, interaction design, visual design, user experience, and user interface design. In research, human factors employs the scientific method to study human behavior so that the resultant data may be applied to the four primary goals. In essence, it is the study of designing equipment, devices, and processes that fit the human body and its cognitive abilities. The two terms "human factors" and "ergonomics" are essentially synonymous.

The International Ergonomics Association defines ergonomics or human factors as follows:

Ergonomics (or human factors) is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design to optimize human well-being and overall system performance.

Human factors is employed to fulfill the goals of occupational health and safety and productivity. It is relevant in the design of such things as safe furniture and easy-to-use interfaces to machines and equipment.

Proper ergonomic design is necessary to prevent repetitive strain injuries and other musculoskeletal disorders, which can develop over time and can lead to long-term disability.

Human factors and ergonomics is concerned with the "fit" between the user, equipment, and environment. It accounts for the user's capabilities and limitations in seeking to ensure that tasks, functions, information, and the environment suit that user.

To assess the fit between a person and the used technology, human factors specialists or ergonomists consider the job (activity) being done and the demands on the user; the equipment used (its size, shape, and how appropriate it is for the task), and the information used (how it is presented, accessed, and changed). Ergonomics draws on many disciplines in its study of humans and their environments, including anthropometry, biomechanics, mechanical engineering, industrial engineering, industrial design, information design, kinesiology, physiology, cognitive psychology, industrial and organizational psychology, and space psychology.

Industrial Age

The Industrial Age is a period of history that encompasses the changes in economic and social organization that began around 1760 in Great Britain and later in other countries, characterized chiefly by the replacement of hand tools with power-driven machines such as the power loom and the steam engine, and by the concentration of industry in large establishments.

While it is commonly believed that the Industrial Age was supplanted by the Information Age in the late 20th century, a view that has become common since the Revolutions of 1989, as of 2013 electric power generation is still based mostly on fossil fuels and much of the Third World economy is still based on manufacturing. Thus it is debatable whether we have left the Industrial Age already or are still in it and in the process of reaching the Information Age.

Information and communications technology

Information and communications technology (ICT) is an extensional term for information technology (IT) that stresses the role of unified communications and the integration of telecommunications (telephone lines and wireless signals) and computers, as well as necessary enterprise software, middleware, storage, and audiovisual systems, that enable users to access, store, transmit, and manipulate information.

The term ICT is also used to refer to the convergence of audiovisual and telephone networks with computer networks through a single cabling or link system. There are large economic incentives (huge cost savings due to the elimination of the telephone network) to merge the telephone network with the computer network system using a single unified system of cabling, signal distribution, and management.

ICT is a broad subject and the concepts are evolving. It covers any product that will store, retrieve, manipulate, transmit, or receive information electronically in a digital form (e.g., personal computers, digital television, email, or robots). For clarity, Zuppo provided an ICT hierarchy where all levels of the hierarchy "contain some degree of commonality in that they are related to technologies that facilitate the transfer of information and various types of electronically mediated communications". Theoretical differences between interpersonal-communication technologies and mass-communication technologies have been identified by the philosopher Piyush Mathur. Skills Framework for the Information Age is one of many models for describing and managing competencies for ICT professionals for the 21st century.

Information theory

Information theory studies the quantification, storage, and communication of information. It was originally proposed by Claude E. Shannon in 1948 to find fundamental limits on signal processing and communication operations such as data compression, in a landmark paper entitled "A Mathematical Theory of Communication". Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL). Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields.

A key measure in information theory is "entropy". Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy.
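
The coin and die comparison follows directly from Shannon's formula H = -Σ p(x) log2 p(x); the sketch below evaluates it for both cases (a minimal illustration of the standard definition):

  # Shannon entropy in bits: H = -sum(p * log2(p)).
  import math

  def entropy(probabilities):
      return -sum(p * math.log2(p) for p in probabilities if p > 0)

  coin = [1/2] * 2   # fair coin: two equally likely outcomes
  die = [1/6] * 6    # fair die: six equally likely outcomes

  print(f"coin: {entropy(coin):.3f} bits")  # 1.000
  print(f"die:  {entropy(die):.3f} bits")   # 2.585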

The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, information engineering, and electrical engineering. The theory has also found applications in other areas, including statistical inference, natural language processing, cryptography, neurobiology, human vision, the evolution and function of molecular codes (bioinformatics), model selection in statistics, thermal physics, quantum computing, linguistics, plagiarism detection, pattern recognition, and anomaly detection. Important sub-fields of information theory include source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information.

Knowledge economy

The knowledge economy is the use of knowledge (savoir, savoir-faire, savoir-être) to generate tangible and intangible values. Technology, and in particular knowledge technology, helps to incorporate part of human knowledge into machines. This knowledge can be used by decision support systems in various fields to generate economic value. A knowledge economy is also possible without technology.

The term was popularized by Peter Drucker as the title of Chapter 12 in his book The Age of Discontinuity (1969); Drucker attributed it to economist Fritz Machlup, with origins in the idea of "scientific management" developed by Frederick Winslow Taylor.

Other than the agricultural-intensive economies and labor-intensive economies, the global economy is in transition to a "knowledge economy", as an extension of an "information society" in the Information Age led by innovation. The transition requires that the rules and practices that determined success in the industrial economy be rewritten for an interconnected, globalized economy where knowledge resources such as trade secrets and expertise are as critical as other economic resources.

Lorem ipsum

In publishing and graphic design, lorem ipsum is a placeholder text commonly used to demonstrate the visual form of a document without relying on meaningful content (also called greeking). Replacing the actual content with placeholder text allows designers to design the form of the content before the content itself has been produced.

The lorem ipsum text is typically a scrambled section of De finibus bonorum et malorum, a 1st-century BC Latin text by Cicero, with words altered, added, and removed to make it nonsensical, improper Latin.

A variation of the ordinary lorem ipsum text has been used in typesetting since the 1960s or earlier, when it was popularized by advertisements for Letraset transfer sheets. It was introduced to the Information Age in the mid-1980s by Aldus Corporation, which employed it in graphics and word-processing templates for its desktop publishing program PageMaker. Many popular word processors, including Pages and Microsoft Word, use lorem ipsum as placeholder text.

Manuel Castells

Manuel Castells Oliván (Spanish: [kasˈtels], Catalan: [kəsˈteʎs]; born 9 February 1942) is a Spanish sociologist especially associated with research on the information society, communication and globalization.

The 2000–2014 research survey of the Social Sciences Citation Index ranks him as the world's fifth most-cited social science scholar, and the foremost-cited communication scholar. He was awarded the 2012 Holberg Prize for having "shaped our understanding of the political dynamics of urban and global economies in the network society." In 2013 he was awarded the Balzan Prize for Sociology.

Network-centric warfare

Network-centric warfare, also called network-centric operations or net-centric warfare, is a military doctrine or theory of war pioneered by the United States Department of Defense in the 1990s.

It seeks to translate an information advantage, enabled in part by information technology, into a competitive advantage through the robust computer networking of well-informed, geographically dispersed forces.

News aggregator

In computing, a news aggregator, also termed a feed aggregator, feed reader, news reader, RSS reader or simply aggregator, is client software or a web application which aggregates syndicated web content such as online newspapers, blogs, podcasts, and video blogs (vlogs) in one location for easy viewing. RSS is a synchronized subscription system. RSS uses extensible markup language (XML) to structure pieces of information to be aggregated in a feed reader that displays the information in a user-friendly interface. The updates distributed may include journal tables of contents, podcasts, videos, and news items.
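
At its core, a feed reader parses the feed's XML and presents its items; the sketch below does this for a minimal RSS 2.0 document using Python's standard library (the feed content is invented for illustration):

  # Minimal feed-reader core: parse an RSS 2.0 document and list its items.
  import xml.etree.ElementTree as ET

  FEED = """
  <rss version="2.0">
    <channel>
      <title>Example News</title>
      <item><title>First story</title><link>https://example.com/1</link></item>
      <item><title>Second story</title><link>https://example.com/2</link></item>
    </channel>
  </rss>"""

  root = ET.fromstring(FEED)        # the <rss> root element
  for item in root.iter("item"):    # every <item> in the channel
      print(item.findtext("title"), "-", item.findtext("link"))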

Noosphere

The noosphere (sometimes noösphere) is the sphere of human thought. The word derives from the Greek νοῦς (nous, "mind") and σφαῖρα (sphaira, "sphere"), in lexical analogy to "atmosphere" and "biosphere". It was introduced by Pierre Teilhard de Chardin in 1922 in his Cosmogenesis. Another possibility is the first use of the term by Édouard Le Roy (1870–1954), who together with Teilhard was listening to lectures of Vladimir Ivanovich Vernadsky at the Sorbonne. In 1936, Vernadsky accepted the idea of the noosphere in a letter to Boris Leonidovich Lichkov (though he stated that the concept derived from Le Roy). Citing the work of Teilhard's biographer, Rene Cuenot, Sampson and Pitt stated that although the concept was jointly developed by all three men (Vernadsky, Le Roy, and Teilhard), Teilhard believed that he actually invented the word: "I believe, so far as one can ever tell, that the word 'noosphere' was my invention: but it was he [Le Roy] who launched it."

Open source

Open source is a term denoting that a product includes permission to use its source code, design documents, or content. It most commonly refers to the open-source model, in which open-source software or other products are released under an open-source license as part of the open-source-software movement. Use of the term originated with software, but has expanded beyond the software sector to cover other open content and forms of open collaboration.

Peace education

Peace education is the process of acquiring the values, the knowledge and developing the attitudes, skills, and behaviors to live in harmony with oneself, with others, and with the natural environment.

There are numerous United Nations declarations on the importance of peace education (see, for example, James S. Page, Peace Education: Exploring Ethical and Philosophical Foundations, Charlotte: Information Age Publishing, 2008, ISBN 978-1-59311-889-1; and James S. Page, "Chapter 9: The United Nations and Peace Education", in Monisha Bajaj (ed.), Encyclopedia of Peace Education, Charlotte: Information Age Publishing, 2008, pp. 75–83, ISBN 978-1-59311-898-3). Ban Ki-moon, U.N. Secretary-General, dedicated the International Day of Peace 2013 to peace education in an effort to refocus minds and financing on the preeminence of peace education as the means to bring about a culture of peace. Koichiro Matsuura, the immediate past Director-General of UNESCO, has written of peace education as being of "fundamental importance to the mission of UNESCO and the United Nations". Peace education as a right is something which is now increasingly emphasized by peace researchers such as Betty Reardon and Douglas Roche. There has also been a recent meshing of peace education and human rights education.

Science Museum, London

The Science Museum is a major museum on Exhibition Road in South Kensington, London. It was founded in 1857 and today is one of the city's major tourist attractions, attracting 3.3 million visitors annually. Like other publicly funded national museums in the United Kingdom, the Science Museum does not charge visitors for admission. Temporary exhibitions, however, may incur an admission fee. It is part of the Science Museum Group, having merged with the Museum of Science and Industry in Manchester in 2012.

World Wide Web

The World Wide Web, commonly known as the WWW and the Web, is an information space where documents and other web resources are identified by Uniform Resource Locators (URLs, such as https://www.example.com/), which may be interlinked by hypertext, and are accessible via the Internet. The resources of the WWW may be accessed by users via a software application called a web browser.
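
The structure of a URL such as the one above can be seen by splitting it into its components (a minimal sketch using Python's standard library; the path is an invented example):

  # Split a URL into the components a browser uses to locate a resource.
  from urllib.parse import urlparse

  parts = urlparse("https://www.example.com/path/page.html")
  print(parts.scheme)   # 'https' -> protocol to use
  print(parts.netloc)   # 'www.example.com' -> host to contact
  print(parts.path)     # '/path/page.html' -> resource on that host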

Web resources may be any type of downloadable media, but web pages are hypertext media which have been formatted in Hypertext Markup Language (HTML). Such formatting allows for embedded hyperlinks which contain URLs and permit users to easily navigate to other web resources. In addition to text, web pages may contain images, video, audio, and software components that are rendered in the user's web browser as coherent pages of multimedia content.

Multiple web resources with a common theme, a common domain name, or both, make up a website. Websites are stored on computers running a program called a web server, which responds to requests made over the Internet from web browsers running on users' computers. Website content can be largely provided by a publisher, or interactively, where users contribute content or the content depends upon the users or their actions. Websites may be provided for myriad informative, entertainment, commercial, governmental, or non-governmental reasons.

English scientist Tim Berners-Lee invented the World Wide Web in 1989. He wrote the first web browser in 1990 while employed at CERN near Geneva, Switzerland. The browser was released outside CERN in 1991, first to other research institutions starting in January 1991 and to the general public on the Internet in August 1991. Since that time the World Wide Web has been central to the development of the Information Age and is a primary tool billions of people use to interact over the Internet.

This page is based on a Wikipedia article written by its contributors.
Text is available under the CC BY-SA 3.0 license; additional terms may apply.
Images, videos and audio are available under their respective licenses.