Watson (computer)

Watson is a question-answering computer system capable of answering questions posed in natural language,[2] developed in IBM's DeepQA project by a research team led by principal investigator David Ferrucci.[3] Watson was named after IBM's first CEO, industrialist Thomas J. Watson.[4][5]

The computer system was initially developed to answer questions on the quiz show Jeopardy![6] and, in 2011, it competed on Jeopardy! against legendary champions Brad Rutter and Ken Jennings,[4][7] winning the first-place prize of $1 million.[8]

In February 2013, IBM announced that the Watson software system's first commercial application would be for utilization management decisions in lung cancer treatment at Memorial Sloan Kettering Cancer Center, New York City, in conjunction with health insurance company WellPoint.[9] IBM Watson's former business chief, Manoj Saxena, says that 90% of nurses in the field who use Watson now follow its guidance.[10]

[Image] Watson's avatar, inspired by the IBM "Smarter Planet" logo[1]

Description

[Image] The high-level architecture of IBM's DeepQA used in Watson[11]

Watson was created as a question answering (QA) computing system that IBM built to apply advanced natural language processing, information retrieval, knowledge representation, automated reasoning, and machine learning technologies to the field of open domain question answering.[2]

The key difference between QA technology and document search is that document search takes a keyword query and returns a list of documents, ranked in order of relevance to the query (often based on popularity and page ranking), while QA technology takes a question expressed in natural language, seeks to understand it in much greater detail, and returns a precise answer to the question.[12]
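
The contrast can be made concrete with a small sketch; the corpus, scoring, and answer heuristic below are invented for illustration and are not IBM's implementation. Document search returns a ranked list of documents, while QA returns one candidate answer with a confidence estimate.

```python
# Toy contrast between the two interfaces (illustrative only; the corpus,
# scoring, and answer heuristic are invented, not IBM's implementation).

def document_search(query: str, corpus: dict[str, str]) -> list[str]:
    """Keyword search: return document IDs ranked by query-term overlap."""
    keywords = query.lower().replace("?", "").split()
    scores = {doc_id: sum(kw in text.lower() for kw in keywords)
              for doc_id, text in corpus.items()}
    return sorted(scores, key=scores.get, reverse=True)

def question_answering(query: str, corpus: dict[str, str]) -> tuple[str, float]:
    """QA: return one candidate answer plus a rough confidence score.
    Here the 'answer' is just the corpus sentence sharing the most words
    with the question; real systems extract a precise entity instead."""
    keywords = set(query.lower().replace("?", "").split())
    sentences = [s.strip() for text in corpus.values()
                 for s in text.split(".") if s.strip()]
    best = max(sentences, key=lambda s: len(keywords & set(s.lower().split())))
    confidence = len(keywords & set(best.lower().split())) / len(keywords)
    return best, confidence

corpus = {"doc1": "Toronto is in Canada. Chicago is in the United States."}
print(document_search("United States city", corpus))   # ranked list: ['doc1']
print(question_answering("Which city is in the United States?", corpus))
```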

When created, IBM stated that, "more than 100 different techniques are used to analyze natural language, identify sources, find and generate hypotheses, find and score evidence, and merge and rank hypotheses."[13]

In recent years, Watson's capabilities have been extended, and the way in which Watson works has been changed to take advantage of new deployment models (Watson on IBM Cloud), evolved machine learning capabilities, and optimized hardware available to developers and researchers. It is no longer purely a question answering (QA) computing system designed from Q&A pairs; it can now 'see', 'hear', 'read', 'talk', 'taste', 'interpret', 'learn' and 'recommend'.

Software

Watson uses IBM's DeepQA software and the Apache UIMA (Unstructured Information Management Architecture) framework implementation. The system was written in various languages, including Java, C++, and Prolog, and runs on the SUSE Linux Enterprise Server 11 operating system using the Apache Hadoop framework to provide distributed computing.[14][15][16]

Hardware

The system is workload-optimized, integrating massively parallel POWER7 processors and built on IBM's DeepQA technology,[17] which it uses to generate hypotheses, gather massive evidence, and analyze data.[2] Watson employs a cluster of ninety IBM Power 750 servers, each of which uses a 3.5 GHz POWER7 eight-core processor, with four threads per core. In total, the system has 2,880 POWER7 processor threads and 16 terabytes of RAM.[17]
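
The quoted totals follow directly from the per-server figures; a quick arithmetic check using only the numbers in this section:

```python
# Sanity check of the totals quoted above, using only figures from the text.
servers = 90                 # IBM Power 750 servers
cores_per_server = 8         # one eight-core POWER7 per server
threads_per_core = 4         # POWER7 simultaneous multithreading (SMT4)

print(servers * cores_per_server * threads_per_core)  # 2880 processor threads

ram_total_tb = 16
print(ram_total_tb * 1024 // servers)  # ~182 GiB of RAM per server on average
```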

According to John Rennie, Watson can process 500 gigabytes, the equivalent of a million books, per second.[18] IBM's master inventor and senior consultant, Tony Pearson, estimated Watson's hardware cost at about three million dollars.[19] Its Linpack performance stands at 80 teraFLOPS, about half as fast as the cut-off line for the Top 500 supercomputers list.[20] According to Rennie, all content was stored in Watson's RAM for the Jeopardy! game because data stored on hard drives would have been too slow to be competitive with human Jeopardy! champions.[18]

Data

The sources of information for Watson include encyclopedias, dictionaries, thesauri, newswire articles and literary works. Watson also used databases, taxonomies and ontologies, specifically DBpedia, WordNet and Yago.[21] The IBM team provided Watson with millions of documents, including dictionaries, encyclopedias and other reference material, that it could use to build its knowledge.[22]

Operation

The computer's techniques for unravelling Jeopardy! clues sounded just like mine. That machine zeroes in on keywords in a clue then combs its memory (in Watson's case, a 15-terabyte databank of human knowledge) for clusters of associations with those words. It rigorously checks the top hits against all the contextual information it can muster: the category name; the kind of answer being sought; the time, place, and gender hinted at in the clue; and so on. And when it feels "sure" enough, it decides to buzz. This is all an instant, intuitive process for a human Jeopardy! player, but I felt convinced that under the hood my brain was doing more or less the same thing.
— Ken Jennings[23]

Watson parses questions into different keywords and sentence fragments in order to find statistically related phrases.[22] Watson's main innovation was not in the creation of a new algorithm for this operation but rather its ability to quickly execute hundreds of proven language analysis algorithms simultaneously.[22][24] The more algorithms that find the same answer independently, the more likely Watson is to be correct.[22] Once Watson has a small number of potential solutions, it is able to check against its database to ascertain whether the solution makes sense.[22]
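
A minimal sketch of this ensemble idea follows; DeepQA's real candidate generation and evidence scoring are far richer, but the voting principle (independent agreement raises confidence) can be shown in a few lines.

```python
# Illustrative sketch of the ensemble idea (not DeepQA's actual scoring):
# many independent analysis algorithms each propose an answer, and
# agreement across them raises the confidence in a candidate.
from collections import Counter
from typing import Callable

def ensemble_answer(question: str,
                    algorithms: list[Callable[[str], str]]) -> tuple[str, float]:
    """Run every algorithm on the question; the most common answer wins,
    with confidence proportional to how many algorithms agree."""
    votes = Counter(alg(question) for alg in algorithms)
    answer, count = votes.most_common(1)[0]
    return answer, count / len(algorithms)

# Three toy "algorithms" standing in for hundreds of real analyzers.
algorithms = [
    lambda q: "Chicago",   # e.g., a geographic gazetteer lookup
    lambda q: "Chicago",   # e.g., a passage-retrieval scorer
    lambda q: "Toronto",   # e.g., a weaker statistical associator
]
print(ensemble_answer("Its largest airport is named for a WWII hero...", algorithms))
# ('Chicago', 0.666...) – more independent agreement, higher confidence
```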

Comparison with human players

[Image] Ken Jennings, Watson, and Brad Rutter in their Jeopardy! exhibition match.

Watson's basic working principle is to parse keywords in a clue while searching for related terms as responses. This gives Watson some advantages and disadvantages compared with human Jeopardy! players.[25] Watson has deficiencies in understanding the contexts of the clues. As a result, human players usually generate responses faster than Watson, especially to short clues.[22] Watson's programming prevents it from using the popular tactic of buzzing before it is sure of its response.[22] Watson has consistently better reaction time on the buzzer once it has generated a response, and is immune to human players' psychological tactics, such as jumping between categories on every clue.[22][26]

In a sequence of 20 mock games of Jeopardy!, human participants were able to use the six to seven seconds that Watson needed, on average, to hear the clue and decide whether to signal for responding.[22] During that time, Watson also had to evaluate the response and determine whether it was sufficiently confident in the result to signal.[22] Part of the system used to win the Jeopardy! contest was the electronic circuitry that received the "ready" signal and then examined whether Watson's confidence level was great enough to activate the buzzer. Given the speed of this circuitry compared to the speed of human reaction times, Watson's reaction time was faster than the human contestants' except when the human anticipated (instead of reacted to) the ready signal.[27] After signaling, Watson speaks with an electronic voice and gives the responses in Jeopardy!'s question format.[22] Watson's voice was synthesized from recordings that actor Jeff Woodman made for an IBM text-to-speech program in 2004.[28]

The Jeopardy! staff used different means to notify Watson and the human players when to buzz,[27] which was critical in many rounds.[26] The humans were notified by a light, which took them tenths of a second to perceive.[29][30] Watson was notified by an electronic signal and could activate the buzzer within about eight milliseconds.[31] The humans tried to compensate for the perception delay by anticipating the light,[32] but the variation in the anticipation time was generally too great to fall within Watson's response time.[26] Watson did not attempt to anticipate the notification signal.[30][32]
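
A toy simulation makes this timing argument concrete. The 8 ms reaction and the tenths-of-a-second perception figure come from the text above; the 50 ms spread in human anticipation is an assumed value for illustration.

```python
# Toy model of the buzzer race (illustrative; the 50 ms anticipation spread
# is an assumption, the other figures come from the text).
import random

WATSON_REACTION_S = 0.008      # ~8 ms after the electronic "ready" signal
HUMAN_PERCEPTION_S = 0.2       # tenths of a second to perceive the light

def anticipating_human_beats_watson() -> bool:
    # The human tries to guess the moment of the signal. Buzzing early causes
    # a lockout in real play, so only a buzz landing inside the tiny window
    # [0 ms, 8 ms) wins the race against Watson.
    guess_error_s = random.gauss(0.0, 0.05)
    return 0.0 <= guess_error_s < WATSON_REACTION_S

trials = 100_000
wins = sum(anticipating_human_beats_watson() for _ in range(trials))
print(f"Reacting human ({HUMAN_PERCEPTION_S * 1000:.0f} ms) always loses the race")
print(f"Anticipating human wins about {100 * wins / trials:.1f}% of trials")
```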

History

Development

Since Deep Blue's victory over Garry Kasparov in chess in 1997, IBM had been on the hunt for a new challenge. In 2004, IBM Research manager Charles Lickel, over dinner with coworkers, noticed that the restaurant they were in had fallen silent. He soon discovered the cause of this evening hiatus: Ken Jennings, who was then in the middle of his successful 74-game run on Jeopardy!. Nearly the entire restaurant had piled toward the televisions, mid-meal, to watch the phenomenon. Intrigued by the quiz show as a possible challenge for IBM, Lickel passed the idea on, and in 2005, IBM Research executive Paul Horn backed Lickel up, pushing for someone in his department to take up the challenge of playing Jeopardy! with an IBM system. Though he initially had trouble finding any research staff willing to take on what looked to be a much more complex challenge than the wordless game of chess, eventually David Ferrucci took him up on the offer.[33] In competitions managed by the United States government, Watson's predecessor, a system named Piquant, was usually able to respond correctly to only about 35% of clues and often required several minutes to respond.[34][35][36] To compete successfully on Jeopardy!, Watson would need to respond in no more than a few seconds, and at that time, the problems posed by the game show were deemed to be impossible to solve.[22]

In initial tests run during 2006 by David Ferrucci, the senior manager of IBM's Semantic Analysis and Integration department, Watson was given 500 clues from past Jeopardy! programs. While the best real-life competitors buzzed in half the time and responded correctly to as many as 95% of clues, Watson's first pass could get only about 15% correct. During 2007, the IBM team was given three to five years and a staff of 15 people to solve the problems.[22] By 2008, the developers had advanced Watson such that it could compete with Jeopardy! champions.[22] By February 2010, Watson could beat human Jeopardy! contestants on a regular basis.[37]

During the game, Watson had access to 200 million pages of structured and unstructured content consuming four terabytes of disk storage,[14] including the full text of the 2011 edition of Wikipedia,[38] but was not connected to the Internet.[22][39] For each clue, Watson's three most probable responses were displayed on the television screen. Watson consistently outperformed its human opponents on the game's signaling device, but had trouble in a few categories, notably those with short clues containing only a few words.

Although the system is primarily an IBM effort, Watson's development involved faculty and graduate students from Rensselaer Polytechnic Institute, Carnegie Mellon University, University of Massachusetts Amherst, the University of Southern California's Information Sciences Institute, the University of Texas at Austin, the Massachusetts Institute of Technology, and the University of Trento,[11] as well as students from New York Medical College.[40]

Jeopardy!

Preparation

[Image] Watson demo at an IBM booth at a trade show

In 2008, IBM representatives communicated with Jeopardy! executive producer Harry Friedman about the possibility of having Watson compete against Ken Jennings and Brad Rutter, two of the most successful contestants on the show, and the program's producers agreed.[22][41] Watson's differences with human players had generated conflicts between IBM and Jeopardy! staff during the planning of the competition.[25] IBM repeatedly expressed concerns that the show's writers would exploit Watson's cognitive deficiencies when writing the clues, thereby turning the game into a Turing test. To address that concern, a third party randomly picked the clues from previously written shows that were never broadcast.[25] Jeopardy! staff also showed concerns over Watson's reaction time on the buzzer. Originally Watson signaled electronically, but show staff requested that it press a button physically, as the human contestants would.[42] Even with a robotic "finger" pressing the buzzer, Watson remained faster than its human competitors. Ken Jennings noted, "If you're trying to win on the show, the buzzer is all", and that Watson "can knock out a microsecond-precise buzz every single time with little or no variation. Human reflexes can't compete with computer circuits in this regard."[26][32][43] Stephen Baker, a journalist who recorded Watson's development in his book Final Jeopardy, reported that the conflict between IBM and Jeopardy! became so serious in May 2010 that the competition was almost canceled.[25]

As part of the preparation, IBM constructed a mock set in a conference room at one of its technology sites to model the one used on Jeopardy!. Human players, including former Jeopardy! contestants, also participated in mock games against Watson, with Todd Alan Crain of The Onion playing host.[22] About 100 test matches were conducted, with Watson winning 65% of the games.[44]

To provide a physical presence in the televised games, Watson was represented by an "avatar" of a globe, inspired by the IBM "smarter planet" symbol. Jennings described the computer's avatar as a "glowing blue ball criss-crossed by 'threads' of thought—42 threads, to be precise",[23] and stated that the number of thought threads in the avatar was an in-joke referencing the significance of the number 42 in Douglas Adams' Hitchhiker's Guide to the Galaxy.[23] Joshua Davis, the artist who designed the avatar for the project, explained to Stephen Baker that there were 36 triggerable states that Watson could use throughout the game to show its confidence in responding to a clue correctly; he had hoped to find forty-two, to add another level to the Hitchhiker's Guide reference, but was unable to pinpoint enough game states.[45]

A practice match was recorded on January 13, 2011, and the official matches were recorded on January 14, 2011. All participants maintained secrecy about the outcome until the match was broadcast in February.[46]

Practice match

In a practice match before the press on January 13, 2011, Watson won a 15-question round against Ken Jennings and Brad Rutter with a score of $4,400 to Jennings's $3,400 and Rutter's $1,200, though Jennings and Watson were tied before the final $1,000 question. None of the three players responded incorrectly to a clue.[47]

First match

The first round was broadcast February 14, 2011, and the second round, on February 15, 2011. The right to choose the first category had been determined by a draw won by Rutter.[48] Watson, represented by a computer monitor display and artificial voice, responded correctly to the second clue and then selected the fourth clue of the first category, a deliberate strategy to find the Daily Double as quickly as possible.[49] Watson's guess at the Daily Double location was correct. At the end of the first round, Watson was tied with Rutter at $5,000; Jennings had $2,000.[48]

Watson's performance was characterized by some quirks. In one instance, Watson repeated a reworded version of an incorrect response offered by Jennings. (Jennings said "What are the '20s?" in reference to the 1920s. Then Watson said "What is 1920s?") Because Watson could not recognize other contestants' responses, it did not know that Jennings had already given the same response. In another instance, Watson was initially given credit for a response of "What is a leg?" after Jennings incorrectly responded "What is: he only had one hand?" to a clue about George Eyser (the correct response was, "What is: he's missing a leg?"). Because Watson, unlike a human, could not have been responding to Jennings's mistake, it was decided that this response was incorrect. The broadcast version of the episode was edited to omit Trebek's original acceptance of Watson's response.[50] Watson also demonstrated complex wagering strategies on the Daily Doubles, with one bet at $6,435 and another at $1,246.[51] Gerald Tesauro, one of the IBM researchers who worked on Watson, explained that Watson's wagers were based on its confidence level for the category and a complex regression model called the Game State Evaluator.[52]
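
IBM has not published the Game State Evaluator itself, so the following is only a hedged sketch of confidence-dependent wagering; the functional form and weights are invented for illustration, but it shows why model-derived bets come out as odd-looking amounts rather than round numbers.

```python
# A hedged toy wagering rule, NOT IBM's Game State Evaluator: bet more when
# category confidence is high and when trailing the leader. Weights invented.

def daily_double_wager(confidence: float, my_score: int,
                       leader_score: int) -> int:
    max_wager = max(my_score, 1000)          # round-one Daily Double limit
    deficit = max(leader_score - my_score, 0)
    urgency = min(deficit / max(leader_score, 1), 0.5)
    return max(5, round(max_wager * confidence * (0.5 + urgency)))

# A trailing player with strong category confidence bets an odd-looking,
# model-derived amount rather than a round number:
print(daily_double_wager(confidence=0.87, my_score=4800, leader_score=9000))
```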

Watson took a commanding lead in Double Jeopardy!, correctly responding to both Daily Doubles. Watson responded to the second Daily Double correctly with a 32% confidence score.[51]

Although it wagered only $947 on the clue, Watson was the only contestant to miss the Final Jeopardy! response in the category U.S. CITIES ("Its largest airport was named for a World War II hero; its second largest, for a World War II battle"). Rutter and Jennings gave the correct response of Chicago, but Watson's response was "What is Toronto?????"[51][53][54] Ferrucci offered reasons why Watson would appear to have guessed a Canadian city: categories only weakly suggest the type of response desired, the phrase "U.S. city" did not appear in the question, there are cities named Toronto in the U.S., and Toronto in Ontario has an American League baseball team.[55] Chris Welty, who also worked on Watson, suggested that it may not have been able to correctly parse the second part of the clue, "its second largest, for a World War II battle" (which was not a standalone clause despite following a semicolon, and required context to understand that it referred to a second-largest airport).[56] Eric Nyberg, a professor at Carnegie Mellon University and a member of the development team, stated that the error occurred because Watson does not possess the comparative knowledge needed to discard that potential response as not viable.[54] Although not displayed to the audience as with non-Final Jeopardy! questions, Watson's second choice was Chicago. Both Toronto and Chicago were well below Watson's confidence threshold, at 14% and 11% respectively; this lack of confidence was the reason for the multiple question marks in Watson's response.
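
The question marks were a display convention tied to confidence. A minimal sketch of such a rendering rule (the exact threshold and formatting are assumptions, not IBM's documented behavior):

```python
# Sketch of the on-screen convention described above (the threshold and exact
# rendering are assumptions): low-confidence responses gain question marks.
def render_response(answer: str, confidence: float, threshold: float = 0.5) -> str:
    suffix = "?" if confidence >= threshold else "?????"
    return f"What is {answer}{suffix}"

print(render_response("Toronto", 0.14))  # What is Toronto?????
print(render_response("Chicago", 0.97))  # What is Chicago?
```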

The game ended with Jennings with $4,800, Rutter with $10,400, and Watson with $35,734.[51]

Second match

During the introduction, Trebek (a Canadian native) joked that he had learned Toronto was a U.S. city, and Watson's error in the first match prompted an IBM engineer to wear a Toronto Blue Jays jacket to the recording of the second match.[57]

In the first round, Jennings was finally able to choose a Daily Double clue,[58] while Watson responded to one Daily Double clue incorrectly for the first time in the Double Jeopardy! round.[59] After the first round, Watson placed second for the first time in the competition, after Rutter and Jennings were briefly successful in increasing their dollar values before Watson could respond.[59][60] Nonetheless, the match ended in victory for Watson, which scored $77,147 to Jennings's $24,000 and Rutter's $21,600.[61]

Final outcome

The prizes for the competition were $1 million for first place (Watson), $300,000 for second place (Jennings), and $200,000 for third place (Rutter). As promised, IBM donated 100% of Watson's winnings to charity, with 50% of those winnings going to World Vision and 50% going to World Community Grid.[62] Similarly, Jennings and Rutter donated 50% of their winnings to their respective charities.[63]

In acknowledgment of IBM and Watson's achievements, Jennings made an additional remark in his Final Jeopardy! response: "I for one welcome our new computer overlords", echoing a similar memetic reference to the episode "Deep Space Homer" on The Simpsons, in which TV news presenter Kent Brockman speaks of welcoming "our new insect overlords".[64][65] Jennings later wrote an article for Slate, in which he stated:

IBM has bragged to the media that Watson's question-answering skills are good for more than annoying Alex Trebek. The company sees a future in which fields like medical diagnosis, business analytics, and tech support are automated by question-answering software like Watson. Just as factory jobs were eliminated in the 20th century by new assembly-line robots, Brad and I were the first knowledge-industry workers put out of work by the new generation of 'thinking' machines. 'Quiz show contestant' may be the first job made redundant by Watson, but I'm sure it won't be the last.[23]

Philosophy

Philosopher John Searle argues that Watson—despite impressive capabilities—cannot actually think.[66] Drawing on his Chinese room thought experiment, Searle claims that Watson, like other computational machines, is capable only of manipulating symbols, but has no ability to understand the meaning of those symbols; however, Searle's experiment has its detractors.[67]

Match against members of the United States Congress

On February 28, 2011, Watson played an untelevised exhibition match of Jeopardy! against members of the United States House of Representatives. In the first round, Rush D. Holt, Jr. (D-NJ, a former Jeopardy! contestant), who was challenging the computer alongside Bill Cassidy (R-LA, later Senator from Louisiana), led with Watson in second place. However, with the scores combined across all matches, the final total was $40,300 for Watson and $30,000 for the congressional players.[68]

IBM's Christopher Padilla said of the match, "The technology behind Watson represents a major advancement in computing. In the data-intensive environment of government, this type of technology can help organizations make better decisions and improve how government helps its citizens."[68]

Current and future applications

According to IBM, "The goal is to have computers start to interact in natural human terms across a range of applications and processes, understanding the questions that humans ask and providing answers that humans can understand and justify."[37] It has been suggested by Robert C. Weber, IBM's general counsel, that Watson may be used for legal research.[69] The company also intends to use Watson in other information-intensive fields, such as telecommunications, financial services and government.[70]

Watson is based on commercially available IBM Power 750 servers that have been marketed since February 2010. IBM also intends to market the DeepQA software to large corporations, with a price in the millions of dollars, reflecting the $1 million needed to acquire a server that meets the minimum system requirement to operate Watson. IBM expects the price to decrease substantially within a decade as the technology improves.[22]

Commentator Rick Merritt said that "there's another really important reason why it is strategic for IBM to be seen very broadly by the American public as a company that can tackle tough computer problems. A big slice of [IBM's profit] comes from selling to the U.S. government some of the biggest, most expensive systems in the world."[71]

In 2013, it was reported that three companies were working with IBM to create apps embedded with Watson technology. Fluid was developing an app for retailers, piloted with The North Face, designed to provide advice to online shoppers. Welltok was developing an app designed to give people advice on ways to engage in activities to improve their health. MD Buyline was developing an app for the purpose of advising medical institutions on equipment procurement decisions.[72][73]

In November 2013, IBM announced it would make Watson's API available to software application providers, enabling them to build apps and services that embed Watson's capabilities. To build out its base of partners who create applications on the Watson platform, IBM consults with a network of venture capital firms, which advise IBM on which of their portfolio companies may be a logical fit for what IBM calls the Watson Ecosystem. At the time, roughly 800 organizations and individuals had signed up with IBM, with interest in creating applications that could use the Watson platform.[74]
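
What "making the API available" means for a partner can be sketched as a simple HTTPS call. The endpoint URL, payload shape, and field names below are hypothetical stand-ins, not IBM's documented API.

```python
# How a partner app might call a cloud QA service of this kind; the endpoint,
# payload schema, and response fields are hypothetical, not IBM's actual API.
import requests

def ask_watson(question: str) -> list[dict]:
    resp = requests.post(
        "https://api.example.com/watson/v1/question",  # hypothetical URL
        json={"questionText": question, "items": 3},   # hypothetical schema
        headers={"Authorization": "Bearer <token>"},
        timeout=30,
    )
    resp.raise_for_status()
    # Hypothetical response body: candidate answers with confidence scores.
    return resp.json()["answers"]

for ans in ask_watson("Which U.S. city has the largest airport?"):
    print(ans["text"], ans["confidence"])
```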

On January 30, 2013, it was announced that Rensselaer Polytechnic Institute would receive a successor version of Watson, which would be housed at the Institute's technology park and be available to researchers and students.[75] By summer 2013, Rensselaer had become the first university to receive a Watson computer.[76]

On February 6, 2014, it was reported that IBM plans to invest $100 million in a 10-year initiative to use Watson and other IBM technologies to help countries in Africa address development problems, beginning with healthcare and education.[77]

On June 3, 2014, three new Watson Ecosystem partners were chosen from more than 400 business concepts submitted by teams spanning 18 industries from 43 countries. "These bright and enterprising organizations have discovered innovative ways to apply Watson that can deliver demonstrable business benefits", said Steve Gold, vice president, IBM Watson Group. The winners were Majestyk Apps with their adaptive educational platform, FANG (Friendly Anthropomorphic Networked Genome);[78][79] Red Ant with their retail sales trainer;[80] and GenieMD[81] with their medical recommendation service.[82]

On July 9, 2014, Genesys Telecommunications Laboratories announced plans to integrate Watson to improve its customer experience platform, citing the staggering volume of customer data to be analyzed.[83]

Watson has been integrated with databases, including that of Bon Appétit magazine, to power a recipe-generating platform.[84]

Watson is being used by Decibel, a music discovery startup, in its app MusicGeek, which uses the supercomputer to provide music recommendations to its users. Watson's artificial intelligence has also found use in the hospitality industry: GoMoment uses Watson for its Rev1 app, which gives hotel staff a way to quickly respond to questions from guests.[85] Arria NLG has built an app that helps energy companies stay within regulatory guidelines, making it easier for managers to make sense of thousands of pages of legal and technical jargon.

OmniEarth, Inc. uses Watson computer vision services to analyze satellite and aerial imagery, along with other municipal data, to infer water usage on a property-by-property basis, helping water districts in drought-stricken California improve water conservation efforts.[86]

In September 2016, Condé Nast started using IBM's Watson to help build and strategize social influencer campaigns for brands. Using software built by IBM and Influential, Condé Nast's clients are able to identify which influencers' demographics, personality traits and other attributes best align with a marketer and the audience it is targeting.[87]

In February 2017, Rare Carat, a New York City-based startup and e-commerce platform for buying diamonds and diamond rings, introduced an IBM Watson-powered artificial intelligence chatbot called "Rocky" to assist novice diamond buyers through the daunting process of purchasing a diamond. As part of the IBM Global Entrepreneur Program, Rare Carat received IBM's assistance in developing the Rocky chatbot.[88][89][90] In May 2017, IBM partnered with the Pebble Beach Company to use Watson as a concierge.[91] Watson's artificial intelligence was added to an app developed by Pebble Beach and was used to guide visitors around the resort. The mobile app was designed by IBM iX and hosted on the IBM Cloud. It uses Watson's Conversation application programming interface.

In November 2017, the experience "Voices of Another Time" opened at the National Museum of Anthropology in Mexico City, using IBM Watson as an alternative to a traditional museum visit.[92]

Healthcare

In healthcare, Watson's natural language, hypothesis generation, and evidence-based learning capabilities are being investigated to see how Watson may contribute to clinical decision support systems and the growing use of artificial intelligence in healthcare by medical professionals.[93] To aid physicians in the treatment of their patients, once a physician has posed a query to the system describing symptoms and other related factors, Watson first parses the input to identify the most important pieces of information; then mines patient data to find facts relevant to the patient's medical and hereditary history; then examines available data sources to form and test hypotheses;[93] and finally provides a list of individualized, confidence-scored recommendations.[94] The sources of data that Watson uses for analysis can include treatment guidelines, electronic medical record data, notes from healthcare providers, research materials, clinical studies, journal articles and patient information.[93] Despite being developed and marketed as a "diagnosis and treatment advisor", Watson has never actually been involved in the medical diagnosis process, only in assisting with identifying treatment options for patients who have already been diagnosed.[95]
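
The four-step flow just described maps naturally onto a pipeline skeleton. The sketch below is schematic only, with trivial stand-in stage implementations rather than IBM's clinical logic.

```python
# Schematic of the described flow: parse the query, mine the patient record,
# form and test hypotheses against sources, return confidence-scored options.
# The stage bodies are toy stand-ins, not IBM's clinical implementation.
from dataclasses import dataclass

@dataclass
class Recommendation:
    treatment: str
    confidence: float

def parse_query(query: str) -> set[str]:
    """1. Identify the most important pieces of information (toy: long words)."""
    return {w.strip(",.").lower() for w in query.split() if len(w) > 3}

def mine_patient_data(record: dict) -> set[str]:
    """2. Pull facts from the patient's medical and hereditary history."""
    return {str(v).lower() for v in record.values()}

def form_and_test_hypotheses(facts: set[str], history: set[str],
                             sources: dict[str, str]) -> list[Recommendation]:
    """3. Score each candidate treatment by its overlap with the evidence."""
    evidence = facts | history
    out = []
    for treatment, literature in sources.items():
        words = set(literature.lower().split())
        support = len(words & evidence) / max(len(words), 1)
        if support > 0:
            out.append(Recommendation(treatment, round(support, 2)))
    return out

def recommend(query: str, record: dict,
              sources: dict[str, str]) -> list[Recommendation]:
    """4. Return an individualized, confidence-ranked list."""
    hyps = form_and_test_hypotheses(parse_query(query),
                                    mine_patient_data(record), sources)
    return sorted(hyps, key=lambda r: r.confidence, reverse=True)

print(recommend("persistent cough, weight loss",
                {"history": "smoker"},
                {"treatment A": "trial for cough in smoker cohorts",
                 "treatment B": "unrelated dermatology study"}))
```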

In February 2011, it was announced that IBM would be partnering with Nuance Communications for a research project to develop a commercial product during the next 18 to 24 months, designed to exploit Watson's clinical decision support capabilities. Physicians at Columbia University would help to identify critical issues in the practice of medicine where the system's technology may be able to contribute, and physicians at the University of Maryland would work to identify the best way that a technology like Watson could interact with medical practitioners to provide the maximum assistance.[96]

In September 2011, IBM and WellPoint announced a partnership to utilize Watson's data crunching capability to help suggest treatment options to physicians.[97] Then, in February 2013, IBM and WellPoint gave Watson its first commercial application, for utilization management decisions in lung cancer treatment at Memorial Sloan–Kettering Cancer Center.[9]

IBM announced a partnership with Cleveland Clinic in October 2012. The company has sent Watson to the Cleveland Clinic Lerner College of Medicine of Case Western Reserve University, where it will increase its health expertise and assist medical professionals in treating patients. The medical facility will utilize Watson's ability to store and process large quantities of information to help speed up and increase the accuracy of the treatment process. "Cleveland Clinic's collaboration with IBM is exciting because it offers us the opportunity to teach Watson to 'think' in ways that have the potential to make it a powerful tool in medicine", said C. Martin Harris, MD, chief information officer of Cleveland Clinic.[98]

In 2013, IBM and MD Anderson Cancer Center began a pilot program to further the center's "mission to eradicate cancer".[99][100] However, after spending $62 million, the project did not meet its goals and was stopped.[101]

On February 8, 2013, IBM announced that oncologists at the Maine Center for Cancer Medicine and Westmed Medical Group in New York had started to test the Watson supercomputer system in an effort to recommend treatment for lung cancer.[102]

On July 29, 2016, IBM and Manipal Hospitals,[103][104][105] a leading hospital chain in India, announced the launch of IBM Watson for Oncology for cancer patients. The product provides information and insights to physicians and cancer patients to help them identify personalized, evidence-based cancer care options. Manipal Hospitals was the second hospital in the world[106] to adopt the technology and the first to offer it to patients online as an expert second opinion through its website.[103][107] Manipal discontinued the contract in December 2018.

On January 7, 2017, IBM and Fukoku Mutual Life Insurance entered into a contract for IBM to deliver analysis of compensation payouts via its IBM Watson Explorer AI, which resulted in the loss of 34 jobs. The company said it would speed up compensation payout analysis by analyzing claims and medical records, and increase productivity by 30%. The company also said it would save ¥140m in running costs.[108]

IBM Watson has been described as carrying the knowledge base of 1,000 cancer specialists, a capability predicted to bring a revolution to the field of healthcare, and IBM has been regarded as a disruptive innovator. However, its application in oncology is still at a nascent stage.[109]

Several startups in the healthcare space have been effectively using seven business model archetypes to take solutions based on IBM Watson to the marketplace. These archetypes depend on the value generated for the target user (e.g. patient focus vs. healthcare provider and payer focus) and on the value-capturing mechanisms (e.g. providing information or connecting stakeholders).[110]

In 2019, Eliza Strickland called "the Watson Health story [...] a cautionary tale of hubris and hype" and provided a "representative sample of projects" with their status.[111]

IBM Watson Group

On January 9, 2014, IBM announced it was creating a business unit around Watson, led by senior vice president Michael Rhodin.[112] IBM Watson Group will have headquarters in New York's Silicon Alley and will employ 2,000 people. IBM has invested $1 billion to get the division going. Watson Group will develop three new cloud-delivered services: Watson Discovery Advisor, Watson Engagement Advisor, and Watson Explorer. Watson Discovery Advisor will focus on research and development projects in the pharmaceutical, publishing, and biotechnology industries; Watson Engagement Advisor will focus on self-service applications using insights based on natural language questions posed by business users; and Watson Explorer will focus on helping enterprise users more easily uncover and share data-driven insights through federated search.[112] The company is also launching a $100 million venture fund to spur application development for "cognitive" applications. According to IBM, the cloud-delivered, enterprise-ready Watson has seen its speed increase 24-fold (a 2,300 percent improvement in performance), and its physical size has shrunk by 90 percent, from the size of a master bedroom to three stacked pizza boxes.[112] IBM CEO Virginia Rometty said she wants Watson to generate $10 billion in annual revenue within ten years.[113]

On 20 September 2017, Anantha Chandrakasan, dean of the MIT School of Engineering, announced Antonio Torralba as the MIT director of the MIT-IBM Watson AI Lab.[114] In March 2018, IBM CEO Ginni Rometty proposed "Watson's Law," the "use of and application of business, smart cities, consumer applications and life in general."[115]

Chatterbot

Watson is being used via the IBM partner program as a chatterbot to provide conversation for children's toys.[116]

Building codes

In 2015, the engineering firm ENGEO created an online service via the IBM partner program named GoFetchCode. GoFetchCode applies Watson's natural language processing and question-answering capabilities to the International Code Council's model building codes.[117]

Teaching assistant

IBM Watson is being used for several projects relating to education, and has entered partnerships with Pearson Education, Blackboard, Sesame Workshop and Apple.[118][119]

In its partnership with Pearson, Watson is being made available inside electronic text books to provide natural language, one-on-one tutoring to students on the reading material.[120]

As an individual using the free Watson APIs available to the public, Ashok Goel, a professor at Georgia Tech, used Watson to create a virtual teaching assistant to assist students in his class.[121] Initially, Goel did not reveal the nature of "Jill", which was created with the help of a few students and IBM. Jill answered questions where it had a 97% certainty of an accurate answer, with the remainder being answered by human assistants.[122]
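
The reported routing rule is simple: answer automatically only above the 97% certainty bar, and defer to a human otherwise. A minimal sketch (the threshold comes from the text; the interface is assumed):

```python
# Sketch of the routing rule reported for "Jill" (the 97% threshold is from
# the text; the function interface is an illustrative assumption).
def route_question(question: str, model_answer: str,
                   certainty: float) -> tuple[str, str | None]:
    if certainty >= 0.97:
        return ("AI", model_answer)        # confident enough to answer directly
    return ("human-TA", None)              # otherwise defer to a human assistant

print(route_question("When is HW3 due?", "Friday at 5pm EST", 0.99))
print(route_question("Can I get an extension?", "Yes", 0.42))
```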

The research group of Sabri Pllana developed an assistant for learning parallel programming using IBM Watson.[123] A survey of novice parallel programmers at Linnaeus University indicated that such an assistant would be welcomed by students learning parallel programming.

Weather forecasting

In August 2016, IBM announced it would be using Watson for weather forecasting.[124] Specifically, the company announced they would use Watson to analyze data from over 200,000 Weather Underground personal weather stations, and data from other sources, as a part of project Deep Thunder.[125]

Fashion

IBM Watson, together with Marchesa, designed a dress that changed the color of its fabric depending on the mood of the audience. The dress lit up in different colors based on the sentiment of tweets about the dress. Tweets were passed through a Watson tone analyzer and the result was then sent to a small computer inside the waist of the dress.[126]
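
The tweet-to-fabric loop can be sketched as follows; the tone categories, the color mapping, and the stand-in analyzer are assumptions, as the article does not specify them.

```python
# Sketch of the tweet-to-fabric loop (tone categories, color choices, and the
# stand-in analyzer are assumptions; the article does not specify the mapping).
TONE_TO_COLOR = {
    "joy": (255, 200, 0),        # warm yellow
    "excitement": (255, 0, 90),  # bright pink
    "sadness": (0, 80, 200),     # cool blue
}

def dress_color(tweets: list[str], analyze_tone) -> tuple[int, int, int]:
    """Map the dominant tone of recent tweets to an RGB fabric color."""
    tones = [analyze_tone(t) for t in tweets]
    dominant = max(set(tones), key=tones.count)
    return TONE_TO_COLOR.get(dominant, (255, 255, 255))

def toy_analyzer(tweet: str) -> str:
    """Stand-in for the Watson tone service."""
    return "joy" if "love" in tweet.lower() else "sadness"

print(dress_color(["Love this dress!", "love it", "meh"], toy_analyzer))
```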

Tax preparation

On February 5–6, 2017, tax preparation company H&R Block began nationwide use of a Watson-based program.[127]

Advertising

In September 2017, IBM announced that with its acquisition of The Weather Company's advertising sales division, and a partnership with advertising neural network Cognitiv, Watson will provide AI-powered advertising solutions.[128][129]

References

  1. ^ IBM Watson: The Face of Watson on YouTube
  2. ^ a b c "DeepQA Project: FAQ". IBM. Retrieved February 11, 2011.
  3. ^ Ferrucci, David; Levas, Anthony; Bagchi, Sugato; Gondek, David; Mueller, Erik T. (2013-06-01). "Watson: Beyond Jeopardy!". Artificial Intelligence. 199: 93–105. doi:10.1016/j.artint.2012.06.009.
  4. ^ a b Hale, Mike (February 8, 2011). "Actors and Their Roles for $300, HAL? HAL!". The New York Times. Retrieved February 11, 2011.
  5. ^ "The DeepQA Project". IBM Research. Retrieved February 18, 2011.
  6. ^ "Dave Ferrucci at Computer History Museum – How It All Began and What's Next". IBM Research. December 1, 2011. Retrieved February 11, 2012. In 2007, when IBM executive Charles Lickel challenged Dave and his team to revolutionize Deep QA and put an IBM computer against Jeopardy!'s human champions, he was off to the races.
  7. ^ Loftus, Jack (April 26, 2009). "IBM Prepping 'Watson' Computer to Compete on Jeopardy!". Gizmodo. Retrieved April 27, 2009.
  8. ^ "IBM's "Watson" Computing System to Challenge All Time Henry Lambert Jeopardy! Champions". Sony Pictures Television. December 14, 2010. Archived from the original on June 16, 2013. Retrieved November 11, 2013.
  9. ^ a b Upbin, Bruce (February 8, 2013). "IBM's Watson Gets Its First Piece Of Business In Healthcare". Forbes.
  10. ^ Upbin, Bruce (February 8, 2013). "IBM's Watson Gets Its First Piece Of Business In Healthcare". Forbes. Retrieved March 10, 2013.
  11. ^ a b Ferrucci, D.; et al. (2010). "Building Watson: An Overview of the DeepQA Project". AI Magazine. 31 (3): 59. doi:10.1609/aimag.v31i3.2303. Retrieved February 19, 2011.
  12. ^ Rhinehart, Craig (January 17, 2011). "10 Things You Need to Know About the Technology Behind Watson". Entrepreneurial and Intrapreneurial Insights. Retrieved January 10, 2016.
  13. ^ "Watson, A System Designed for Answers: The Future of Workload Optimized Systems Design". IBM Systems and Technology. February 2011. p. 3. Retrieved September 9, 2015.
  14. ^ a b Jackson, Joab (February 17, 2011). "IBM Watson Vanquishes Human Jeopardy Foes". PC World. IDG News. Retrieved February 17, 2011.
  15. ^ Takahashi, Dean (February 17, 2011). "IBM researcher explains what Watson gets right and wrong". VentureBeat. Retrieved February 18, 2011.
  16. ^ Novell (February 2, 2011). "Watson Supercomputer to Compete on 'Jeopardy!' – Powered by SUSE Linux Enterprise Server on IBM POWER7". The Wall Street Journal. Archived from the original on April 21, 2011. Retrieved February 21, 2011.
  17. ^ a b "Is Watson the smartest machine on earth?". Computer Science and Electrical Engineering Department, University of Maryland Baltimore County. February 10, 2011. Retrieved February 11, 2011.
  18. ^ a b Rennie, John (February 14, 2011). "How IBM's Watson Computer Excels at Jeopardy!". PLoS blogs. Retrieved February 19, 2011.
  19. ^ Lucas, Mearian (February 21, 2011). "Can anyone afford an IBM Watson supercomputer? (Yes)". Computerworld. Retrieved February 21, 2011.
  20. ^ "Top500 List – November 2013". Top500.org.
  21. ^ Ferrucci, David; et al. "The AI Behind Watson – The Technical Article". AI Magazine (Fall 2010). Retrieved November 11, 2013.
  22. ^ a b c d e f g h i j k l m n o p q r Thompson, Clive (June 16, 2010). "Smarter Than You Think: What Is I.B.M.'s Watson?". The New York Times Magazine. Retrieved February 18, 2011.
  23. ^ a b c d Jennings, Ken (February 16, 2011). "My Puny Human Brain". Slate. Newsweek Interactive Co. LLC. Retrieved February 17, 2011.
  24. ^ "Will Watson Win On Jeopardy!?". Nova ScienceNOW. Public Broadcasting Service. January 20, 2011. Archived from the original on April 14, 2011. Retrieved January 27, 2011.
  25. ^ a b c d Needleman, Rafe (February 18, 2011). "Reporters' Roundtable: Debating the robobrains". CNET. Retrieved February 18, 2011.
  26. ^ a b c d "Jeopardy! Champ Ken Jennings". The Washington Post. February 15, 2011. Retrieved February 15, 2011.
  27. ^ a b Gondek, David (January 10, 2011). "How Watson "sees," "hears," and "speaks" to play Jeopardy!". IBM Research News. Retrieved February 21, 2011.
  28. ^ Avery, Lise (February 14, 2011). "Interview with Actor Jeff Woodman, Voice of IBM's Watson Computer" (MP3). Anything Goes!!. Retrieved February 15, 2011.
  29. ^ Kosinski, Robert J. (2008). "A Literature Review on Reaction Time". Clemson University. Retrieved January 10, 2016.
  30. ^ a b Baker (2011), p. 174.
  31. ^ Baker (2011), p. 178.
  32. ^ a b c Strachan, Alex (February 12, 2011). "For Jennings, it's a man vs. man competition". The Vancouver Sun. Archived from the original on February 21, 2011. Retrieved February 15, 2011.
  33. ^ Baker (2011), pp. 6–8.
  34. ^ Baker (2011), p. 30.
  35. ^ Radev, Dragomir R.; Prager, John; Samn, Valerie (2000). "Ranking potential answers to natural language questions" (PDF). Proceedings of the 6th Conference on Applied Natural Language Processing.
  36. ^ Prager, John; Brown, Eric; Coden, Annie; Radev, Dragomir R. (July 2000). "Question-answering by predictive annotation" (PDF). Proceedings, 23rd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval.
  37. ^ a b Brodkin, Jon (February 10, 2010). "IBM's Jeopardy-playing machine can now beat human contestants". Network World. Archived from the original on June 3, 2013. Retrieved February 19, 2011.
  38. ^ Zimmer, Ben (February 17, 2011). "Is It Time to Welcome Our New Computer Overlords?". The Atlantic. Retrieved February 17, 2011.
  39. ^ Raz, Guy (January 28, 2011). "Can a Computer Become a Jeopardy! Champ?". National Public Radio. Retrieved February 18, 2011.
  40. ^ "Medical Students Offer Expertise to IBM's Jeopardy!-Winning Computer Watson as It Pursues a New Career in Medicine" (PDF). InTouch. New York Medical College. 18: 4. June 2012.
  41. ^ Stelter, Brian (December 14, 2010). "I.B.M. Supercomputer 'Watson' to Challenge 'Jeopardy' Stars". The New York Times. Retrieved December 14, 2010. An I.B.M. supercomputer system named after the company's founder, Thomas J. Watson Sr., is almost ready for a televised test: a bout of questioning on the quiz show "Jeopardy." I.B.M. and the producers of "Jeopardy" will announce on Tuesday that the computer, "Watson," will face the two most successful players in "Jeopardy" history, Ken Jennings and Brad Rutter, in three episodes that will be broadcast Feb. 14–16, 2011.
  42. ^ Baker (2011), p. 171.
  43. ^ Flatow, Ira (February 11, 2011). "IBM Computer Faces Off Against 'Jeopardy' Champs". Talk of the Nation. National Public Radio. Retrieved February 15, 2011.
  44. ^ Sostek, Anya (February 13, 2011). "Human champs of 'Jeopardy!' vs. Watson the IBM computer: a close match". Pittsburgh Post Gazette. Retrieved February 19, 2011.
  45. ^ Baker (2011), p. 117.
  46. ^ Baker (2011), pp. 232–258.
  47. ^ Dignan, Larry (January 13, 2011). "IBM's Watson wins Jeopardy practice round: Can humans hang?". ZDnet. Retrieved January 13, 2011.
  48. ^ a b "The IBM Challenge Day 1". Jeopardy. Season 27. Episode 23. February 14, 2011.
  49. ^ Lenchner, Jon (February 3, 2011). "Knowing what it knows: selected nuances of Watson's strategy". IBM Research News. IBM. Retrieved February 16, 2011.
  50. ^ Johnston, Casey (February 15, 2011). "Jeopardy: IBM's Watson almost sneaks wrong answer by Trebek". Ars Technica. Retrieved February 15, 2011.
  51. ^ a b c d "Computer crushes the competition on 'Jeopardy!'". Associated Press. February 15, 2011. Archived from the original on February 19, 2011. Retrieved February 19, 2011.
  52. ^ Tesauro, Gerald (February 13, 2011). "Watson's wagering strategies". IBM Research News. IBM. Retrieved February 18, 2011.
  53. ^ Staff (February 15, 2011). "IBM's computer wins 'Jeopardy!' but... Toronto?". CTV News. Retrieved February 15, 2011. Watson, IBM's quiz-master computer with the strangely serene voice, beat the humans on "Jeopardy!" tonight. But it got the final question on U.S. cities wrong, answering: Toronto.
  54. ^ a b Robertson, Jordan; Borenstein, Seth (February 16, 2011). "For Watson, Jeopardy! victory was elementary". The Globe and Mail. The Associated Press. Archived from the original on February 20, 2011. Retrieved February 17, 2011. A human would have considered Toronto and discarded it because it is a Canadian city, not a U.S. one, but that's not the type of comparative knowledge Watson has, Prof. Nyberg said.
  55. ^ Hamm, Steve (February 15, 2011). "Watson on Jeopardy! Day Two: The Confusion over and Airport Clue". A Smart Planet Blog. Retrieved February 21, 2011.
  56. ^ Johnston, Casey (February 15, 2011). "Creators: Watson has no speed advantage as it crushes humans in Jeopardy". Ars Technica. Retrieved February 21, 2011.
  57. ^ Oberman, Mira (February 17, 2011). "Computer creams human Jeopardy! champions". Vancouver Sun. Agence France-Presse. Archived from the original on February 20, 2011. Retrieved February 17, 2011. But a Final Jeopardy flub prompted one IBM engineer to wear a Toronto Blue Jays jacket to the second day of taping and Trebek to joke that he'd learned Toronto was a U.S. city.
  58. ^ Johnston, Casey (February 17, 2011). "Bug lets humans grab Daily Double as Watson triumphs on Jeopardy". Ars Technica. Retrieved February 21, 2011.
  59. ^ a b Upbin, Bruce (February 17, 2011). "IBM's Supercomputer Watson Wins It All With $367 Bet". Forbes. Retrieved February 21, 2011.
  60. ^ Oldenburg, Ann (February 17, 2011). "Ken Jennings: 'My puny brain' did just fine on 'Jeopardy!'". USA Today. Retrieved February 21, 2011.
  61. ^ "Show 6088 – The IBM Challenge, Day 2". Jeopardy!. February 16, 2011. Syndicated.
  62. ^ "World Community Grid to benefit from Jeopardy! competition". World Community Grid. February 4, 2011. Retrieved February 19, 2011.
  63. ^ "Jeopardy! And IBM Announce Charities To Benefit From Watson Competition". IBM Corporation. January 13, 2011. Retrieved February 19, 2011.
  64. ^ "IBM's Watson supercomputer crowned Jeopardy king". BBC News. February 17, 2011. Retrieved February 17, 2011.
  65. ^ Markoff, John (February 16, 2011). "Computer Wins on 'Jeopardy!': Trivial, It's Not". The New York Times. Yorktown Heights, New York. Retrieved February 17, 2011.
  66. ^ Searle, John (February 23, 2011). "Watson Doesn't Know It Won on 'Jeopardy!'". The Wall Street Journal. Retrieved July 26, 2011.
  67. ^ Lohr, Steve (December 5, 2011). "Creating AI based on the real thing". The New York Times.
  68. ^ a b "NJ congressman tops 'Jeopardy' computer Watson". Associated Press. March 2, 2011. Archived from the original on March 7, 2011. Retrieved March 2, 2011.
  69. ^ Weber, Robert C. (February 14, 2011). "Why 'Watson' matters to lawyers". The National Law Journal. Retrieved February 18, 2011.
  70. ^ Nay, Chris (September 6, 2011). "Putting Watson to work: Interview with GM of Watson Solutions Manoj Saxena". Smarter Planet Blog. IBM. Retrieved November 12, 2013.
  71. ^ Merritt, Rick (February 14, 2011). "IBM playing Jeopardy with tax dollars". EE Times. Retrieved February 19, 2011.
  72. ^ Dusto, Amy (December 3, 2013). "IBM's Watson computer helps shoppers via a new app". Internet Retailer. Retrieved January 10, 2016.
  73. ^ Comstock, Jonah (November 15, 2013). "With Watson API launch, IBM turns to WellTok for patients, MD Buyline for docs". MobiHealthNews. Retrieved January 10, 2016.
  74. ^ Upbin, Bruce (November 14, 2013). "IBM Opens Up Its Watson Cognitive Computer For Developers Everywhere". Forbes. Retrieved January 10, 2016.
  75. ^ "IBM's Watson to Join Research Team at Rensselaer". Rensselaer Polytechnic Institute. January 30, 2013. Retrieved October 1, 2013.
  76. ^ "The Independent Sector: Cultural, Economic and Social Contributions of New York's 100+, Not-for-Profit Colleges and Universities" (PDF). Commission on Independent Colleges and Universities. Summer 2013. p. 12. Retrieved October 1, 2013.
  77. ^ Cocks, Tim (February 6, 2014). "IBM starts rolling out Watson supercomputer in Africa". Reuters. Retrieved January 10, 2016.
  78. ^ Coolidge, Donald (May 29, 2014). "IBM Watson Mobile Developers Challenge Finalists: Majestyk". Majestyk Apps. Retrieved January 10, 2016.
  79. ^ "Majestyk Apps – An IBM Watson Mobile Developer Challenge Winner". Flickr. June 3, 2014. Retrieved January 10, 2016.
  80. ^ "Red Ant – An IBM Watson Mobile Developer Challenge Winner". Flickr. June 3, 2014. Retrieved January 10, 2016.
  81. ^ "GenieMD – An IBM Watson Mobile Developer Challenge Winner". Flickr. June 3, 2014. Retrieved January 10, 2016.
  82. ^ "IBM Announces Watson Mobile Developer Challenge Winners". IBM News. June 3, 2014. Retrieved January 10, 2016.
  83. ^ All, Ann (July 9, 2014). "Genesys to Put IBM's Watson to Work". Enterprise Apps Today. Retrieved January 10, 2016.
  84. ^ Wilson, Mark (June 30, 2014). "IBM's Watson Is Now A Cooking App With Infinite Recipes". fastcodesign.com. Retrieved January 10, 2016.
  85. ^ Hardawar, Devindra. "IBM's big bet on Watson is paying off with more apps and DNA analysis". Engadget. Retrieved July 2, 2015.
  86. ^ Griggs, Mary Beth (May 20, 2016). "IBM Watson can help find water wasters in drought-stricken California". Popular Science. Retrieved July 4, 2016.
  87. ^ Marty Swant (6 September 2016). "Condé Nast Has Started Using IBM's Watson to Find Influencers for Brands". Adweek. Retrieved 8 September 2016.
  88. ^ "Rare Carat Releases World's First Artificial Intelligence Jeweler Using IBM Watson Technology". PRNewswire. February 28, 2017.
  89. ^ "Rare Carat's Watson-powered chatbot will help you put a diamond ring on it". TechCrunch. February 15, 2017.
  90. ^ "10 ways you may have already used IBM Watson". VentureBeat. March 10, 2017.
  91. ^ "IBM Watson to help Pebble Beach create a virtual concierge for guests". VentureBeat. 2017-05-09. Retrieved 2017-05-10.
  92. ^ IBM Watson: artificial intelligence arrives at the Museum of Anthropology. Aban Tech. October 31, 2017. Retrieved May 11, 2018 – via YouTube.
  93. ^ a b c "Putting Watson to Work: Watson in Healthcare". IBM. Retrieved November 11, 2013.
  94. ^ "IBM Watson Helps Fight Cancer with Evidence-Based Diagnosis and Treatment Suggestions" (PDF). IBM. Retrieved November 12, 2013.
  95. ^ Saxena, Manoj (February 13, 2013). "IBM Watson Progress and 2013 Roadmap (Slide 7)". IBM. Retrieved November 12, 2013.
  96. ^ Wakeman, Nick (February 17, 2011). "IBM's Watson heads to medical school". Washington Technology. Retrieved February 19, 2011.
  97. ^ Mathews, Anna Wilde (September 12, 2011). "Wellpoint's New Hire: What is Watson?". The Wall Street Journal.
  98. ^ Miliard, Mike (October 30, 2012). "Watson Heads to Medical School: Cleveland Clinic, IBM Send Supercomputer to College". Healthcare IT News. Retrieved November 11, 2013.
  99. ^ "MD Anderson Taps IBM Watson to Power "Moon Shots" Mission Aimed at Ending Cancer, Starting with Leukemia" (Press release). IBM.
  100. ^ "IBM's Watson Now Tackles Clinical Trials At MD Anderson Cancer Center". Forbes.
  101. ^ "MD Anderson Benches IBM Watson In Setback For Artificial Intelligence In Medicine". Forbes.
  102. ^ Leske, Nikola (February 9, 2013). "Doctors Seek Help on Cancer Treatment from IBM Supercomputer". Reuters. Retrieved November 11, 2013.
  103. ^ a b "Manipal Hospitals | Watson for Oncology | Cancer Treatment". watsononcology.manipalhospitals.com. Retrieved 2017-01-17.
  104. ^ "MANIPAL HOSPITALS ANNOUNCES NATIONAL LAUNCH OF IBM WATSON FOR ONCOLOGY". www-03.ibm.com. 2016-07-29. Retrieved 2017-01-17.
  105. ^ "Manipal Hospitals is first adopter of IBM Watson in India". www-03.ibm.com. 2015-12-02. Retrieved 2017-01-17.
  106. ^ ANI (2016-10-28). "Manipal Hospitals to adopt IBM's 'Watson for Oncology' supercomputer for cancer treatment". Business Standard India. Retrieved 2017-01-17.
  107. ^ "Hospitals in Asia use Watson supercomputer for cancer treatment". STAT. 2016-08-19. Retrieved 2017-01-17.
  108. ^ McCurry, Justin (2017-01-05). "Japanese company replaces office workers with artificial intelligence". The Guardian. ISSN 0261-3077. Retrieved 2017-01-29.
  109. ^ Satell, Greg. "How IBM's Watson Will Change The Way We Work". Forbes. Retrieved 2017-08-08.
  110. ^ Garbuio, Massimo; Lin, Nidthida (2019). "Artificial Intelligence as a Growth Engine for Health Care Startups: Emerging Business Models". California Management Review. 61 (2): 59–83. doi:10.1177/0008125618811931.
  111. ^ Strickland, Eliza (2019-04-02). "How IBM Watson Overpromised and Underdelivered on AI Health Care". IEEE Spectrum: Technology, Engineering, and Science News. Retrieved 2019-04-04.
  112. ^ a b c "IBM Watson Group Unveils Cloud-Delivered Watson Services to Transform Industrial R&D, Visualize Big Data Insights and Fuel Analytics Exploration". IBM News. January 9, 2014. Retrieved January 10, 2016.
  113. ^ Ante, Spencer E. (January 9, 2014). "IBM Set to Expand Watson's Reach". The Wall Street Journal. Retrieved January 9, 2014.
  114. ^ "New leadership for MIT-IBM Watson AI Lab". news.mit.edu. September 21, 2017. Retrieved September 21, 2017.
  115. ^ "IBM CEO Rometty Proposes 'Watson's Law': AI In Everything", Adrian Bridgewater, Forbes, March 20, 2018
  116. ^ Takahashi, Dean. "Elemental's smart connected toy CogniToys taps IBM's Watson supercomputer for its brains". Venture Beat. Retrieved May 15, 2015.
  117. ^ "About Us GoFetchCode". GoFetchCode. 2015-10-21. Retrieved 2017-07-04.
  118. ^ Leopold, Todd. "A professor built an AI teaching assistant for his courses — and it could shape the future of education". Business Insider. Business Insider. Retrieved 26 September 2017.
  119. ^ Straumsheim, Carl. "'Augmented Intelligence' for Higher Ed". Inside Higher Ed. Inside Higher Ed. Retrieved 26 September 2017.
  120. ^ Plenty, Rebecca (October 25, 2016). "Pearson Taps IBM's Watson as a Virtual Tutor for College Students" (October 25, 2016). Bloomberg. Bloomberg. Retrieved 26 September 2017.
  121. ^ Maderer, Jason. "Artificial Intelligence Course Creates AI Teaching Assistant". Georgia Tech News. Georgia Tech News. Retrieved 26 September 2017.
  122. ^ McFarlane, Matt. "Professor reveals to students that his assistant was an AI all along". Sydney Morning Herald. Retrieved May 14, 2016.
  123. ^ Memeti, Suejb; Pllana, Sabri (January 2018). "PAPA: A parallel programming assistant powered by IBM Watson cognitive computing technology". Journal of Computational Science. 26: 275–284. doi:10.1016/j.jocs.2018.01.001.
  124. ^ Jancer, Matt (26 August 2016). "IBM's Watson Takes On Yet Another Job, as a Weather Forecaster". Smithsonian. Retrieved 29 August 2016.
  125. ^ Booton, Jennifer (15 June 2016). "IBM finally reveals why it bought The Weather Company". Market Watch. Retrieved 29 August 2016.
  126. ^ "Cognitive Marchesa dress". IBM Internet of Things Blog. https://www.ibm.com/blogs/internet-of-things/cognitive-marchesa-dress/
  127. ^ Moscaritolo, Angela (2 February 2017). "H&R Block Enlists IBM Watson to Find Tax Deductions". PC Magazine. Retrieved 14 February 2017.
  128. ^ Swant, Marty (September 24, 2017). "As IBM Ramps Up Its AI-Powered Advertising, Can Watson Crack the Code of Digital Marketing?". www.adweek.com. Retrieved 2019-03-18.
  129. ^ "AI is A Rocket About to Launch - Here's How to Get On Board". www.ibm.com. Retrieved 2019-03-18.
Bibliography
  • Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. ISBN 978-0-547-48316-0.

Further reading

  • Baker, Stephen (2012). Final Jeopardy: The Story of Watson, the Computer That Will Transform Our World. Mariner Books.
  • Jackson, Joab (January 9, 2014). "IBM bets big on Watson-branded cognitive computing". PCWorld.
  • Greenemeier, Larry (November 13, 2013). "Will IBM's Watson Usher in a New Era of Cognitive Computing?". Scientific American.
  • Kelly, J.E.; Hamm, S. (2013). Smart Machines: IBM's Watson and the Era of Cognitive Computing. Columbia Business School Publishing.

External links

J! Archive

AlchemyAPI

AlchemyAPI is an IBM-owned company that uses machine learning (specifically, deep learning) to perform natural language processing (notably semantic text analysis, including sentiment analysis) and computer vision (face detection and recognition) for its clients, both in the cloud and on-premises.

Ask The Doctor

Ask The Doctor is a medical information website founded in Toronto, Canada. The website allows users to browse previous medical questions and answers, or to ask their own personalized medical question of either a general physician or a specialist for a fee.

The platform was co-founded by former NFL player Israel Idonije, Dr. Patrick A. Golden, and serial entrepreneurs Prakash Chand and Q. Dhalla.

Automation

Automation is the technology by which a process or procedure is performed with minimal human assistance. Automation or automatic control is the use of various control systems for operating equipment such as machinery, processes in factories, boilers and heat treating ovens, switching on telephone networks, steering and stabilization of ships, aircraft and other applications and vehicles with minimal or reduced human intervention.

Automation covers applications ranging from a household thermostat controlling a boiler to a large industrial control system with tens of thousands of input measurements and output control signals. In control complexity, it can range from simple on-off control to multi-variable high-level algorithms.

In the simplest type of an automatic control loop, a controller compares a measured value of a process with a desired set value, and processes the resulting error signal to change some input to the process, in such a way that the process stays at its set point despite disturbances. This closed-loop control is an application of negative feedback to a system. The mathematical basis of control theory was begun in the 18th century and advanced rapidly in the 20th.
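
A minimal version of such a loop in Python might look like the sketch below; the toy "plant", the gain, and the set point are invented for illustration and do not model any real process.

    # A toy closed-loop (negative feedback) controller. All constants are
    # invented for illustration; no real process is modeled here.
    SET_POINT = 70.0  # desired process value, e.g. a target temperature
    GAIN = 0.5        # proportional gain: how strongly to react to the error

    def plant(value, control_input):
        """Toy process: drifts downward each step and responds to control."""
        disturbance = -0.8
        return value + disturbance + control_input

    value = 65.0
    for step in range(10):
        error = SET_POINT - value      # compare measurement to set point
        control = GAIN * error         # negative feedback against the error
        value = plant(value, control)  # feed the correction back in
        print(f"step {step}: value={value:5.2f}, error={error:5.2f}")

Note that this purely proportional loop settles slightly below its set point; removing that steady-state offset is what the integral term of a full PID controller is for.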

Automation has been achieved by various means, including mechanical, hydraulic, pneumatic, electrical, electronic devices and computers, usually in combination. Complicated systems, such as modern factories, airplanes, and ships, typically use all of these combined techniques. The benefits of automation include labor savings, savings in electricity and material costs, and improvements to quality, accuracy, and precision.

The World Bank's World Development Report 2019 shows evidence that the new industries and jobs in the technology sector outweigh the economic effects of workers being displaced by automation.

The term automation, inspired by the earlier word automatic (coming from automaton), was not widely used before 1947, when Ford established an automation department. It was during this time that industry was rapidly adopting feedback controllers, which had been introduced in the 1930s.

Bob Dylan

Bob Dylan (born Robert Allen Zimmerman; May 24, 1941) is an American singer-songwriter, author, and visual artist who has been a major figure in popular culture for six decades. Much of his most celebrated work dates from the 1960s, when songs such as "Blowin' in the Wind" (1963) and "The Times They Are a-Changin'" (1964) became anthems for the Civil Rights Movement and the anti-war movement. His lyrics during this period incorporated a wide range of political, social, philosophical, and literary influences, defied pop-music conventions, and appealed to the burgeoning counterculture.

Following his self-titled debut album in 1962, which mainly comprised traditional folk songs, Dylan made his breakthrough as a songwriter with the release of The Freewheelin' Bob Dylan the following year. The album featured "Blowin' in the Wind" and the thematically complex "A Hard Rain's a-Gonna Fall". For many of these songs he adapted the tunes and sometimes phraseology of older folk songs. He went on to release the politically charged The Times They Are a-Changin' and the more lyrically abstract and introspective Another Side of Bob Dylan in 1964. In 1965 and 1966, Dylan encountered controversy when he adopted electrically amplified rock instrumentation, and in the space of 15 months recorded three of the most important and influential rock albums of the 1960s: Bringing It All Back Home (1965), Highway 61 Revisited (1965) and Blonde on Blonde (1966). The six-minute single "Like a Rolling Stone" (1965) radically expanded what a pop song could convey.

In July 1966, Dylan withdrew from touring after being injured in a motorcycle accident. During this period he recorded a large body of songs with members of the Band, who had previously backed him on tour. These recordings were released as the collaborative album The Basement Tapes, in 1975. In the late 1960s and early 1970s, Dylan explored country music and rural themes in John Wesley Harding (1967), Nashville Skyline (1969), and New Morning (1970). In 1975, he released Blood on the Tracks, which many saw as a return to form. In the late 1970s, he became a born-again Christian and released a series of albums of contemporary gospel music before returning to his more familiar rock-based idiom in the early 1980s. The major works of his later career include Time Out of Mind (1997), "Love and Theft" (2001), Modern Times (2006) and Tempest (2012). His most recent recordings have comprised versions of traditional American standards, especially songs recorded by Frank Sinatra. Backed by a changing lineup of musicians, he has toured steadily since the late 1980s on what has been dubbed the Never Ending Tour.

Since 1994, Dylan has published eight books of drawings and paintings, and his work has been exhibited in major art galleries. He has sold more than 100 million records, making him one of the best-selling music artists of all time. He has also received numerous awards including ten Grammy Awards, a Golden Globe Award, and an Academy Award. Dylan has been inducted into the Rock and Roll Hall of Fame, Minnesota Music Hall of Fame, Nashville Songwriters Hall of Fame, and the Songwriters Hall of Fame. The Pulitzer Prize jury in 2008 awarded him a special citation for "his profound impact on popular music and American culture, marked by lyrical compositions of extraordinary poetic power". In 2012, Dylan received the Presidential Medal of Freedom, and in 2016, he was awarded the Nobel Prize in Literature "for having created new poetic expressions within the great American song tradition".

Chatbot

A chatbot (also known as a smartbot, conversational bot, chatterbot, interactive agent, conversational interface, conversational AI, talkbot or artificial conversational entity) is a computer program or an artificial intelligence which conducts a conversation via auditory or textual methods. Such programs are often designed to convincingly simulate how a human would behave as a conversational partner, thereby passing the Turing test. Chatbots are typically used in dialog systems for various practical purposes, including customer service or information acquisition. Some chatbots use sophisticated natural language processing systems, but many simpler ones scan for keywords within the input and then pull a reply with the most matching keywords, or the most similar wording pattern, from a database.
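
A rough sketch of that simpler keyword approach in Python is shown below; the reply "database" and the scoring rule are invented for illustration and are not drawn from any real chatbot.

    import re

    # Toy keyword-matching chatbot. The reply "database" is invented for
    # illustration; a real system would hold far more patterns.
    REPLIES = {
        ("hours", "open", "close"): "We are open 9am-5pm, Monday to Friday.",
        ("price", "cost", "fee"): "Pricing details are listed on our website.",
        ("hello", "hi", "hey"): "Hello! How can I help you today?",
    }

    def respond(user_input):
        words = set(re.findall(r"[a-z]+", user_input.lower()))
        best_reply, best_score = "Sorry, I didn't understand that.", 0
        for keywords, reply in REPLIES.items():
            score = len(words & set(keywords))  # count keyword overlaps
            if score > best_score:
                best_reply, best_score = reply, score
        return best_reply

    print(respond("Hi!"))                          # greeting reply
    print(respond("When do you open and close?"))  # hours reply

Everything beyond this pattern lookup, such as recognizing intent or carrying context across turns, is what separates such toys from the sophisticated natural language processing systems mentioned above.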

The term "ChatterBot" was originally coined by Michael Mauldin (creator of the first Verbot, Julia) in 1994 to describe these conversational programs. Today, most chatbots are accessed via virtual assistants such as Google Assistant and Amazon Alexa, via messaging apps such as Facebook Messenger or WeChat, or via individual organizations' apps and websites. Chatbots can be classified into usage categories such as conversational commerce (e-commerce via chat), analytics, communication, customer support, design, developer tools, education, entertainment, finance, food, games, health, HR, marketing, news, personal, productivity, shopping, social, sports, travel and utilities.Beyond chatbots, Conversational AI refers to the use of messaging apps, speech-based assistants and chatbots to automate communication and create personalized customer experiences at scale.

David Ferrucci

David Ferrucci was the principal investigator who, from 2007 to 2011, led a team of IBM and academic researchers and engineers in developing the Watson computer system, which went on to win the television quiz show Jeopardy!

Ferrucci graduated from Manhattan College with a B.S. in biology, and from Rensselaer Polytechnic Institute in 1994 with a Ph.D. in computer science, specializing in knowledge representation and reasoning.

He joined IBM's Thomas J. Watson Research Center in 1995 and left in 2012 to join Bridgewater Associates. He is also the founder, CEO, and chief scientist of Elemental Cognition, a venture exploring a new field of study called natural learning, which Ferrucci describes as "artificial intelligence that understands the world the way people do". Ferrucci is interviewed in the 2018 documentary on artificial intelligence, Do You Trust This Computer?

Deep Blue (chess computer)

Deep Blue was a chess-playing computer developed by IBM. It is known for being the first computer chess-playing system to win both a chess game and a chess match against a reigning world champion under regular time controls.

Deep Blue won its first game against a world champion on 10 February 1996, when it defeated Garry Kasparov in game one of a six-game match. However, Kasparov won three and drew two of the following five games, defeating Deep Blue by a score of 4–2. Deep Blue was then heavily upgraded, and played Kasparov again in May 1997. Deep Blue won game six, therefore winning the six-game rematch 3½–2½ and becoming the first computer system to defeat a reigning world champion in a match under standard chess tournament time controls. Kasparov accused IBM of cheating and demanded a rematch. IBM refused and dismantled Deep Blue.

Development for Deep Blue began in 1985 with the ChipTest project at Carnegie Mellon University. This project eventually evolved into Deep Thought, at which point the development team was hired by IBM. The project evolved once more with the new name Deep Blue in 1989. Grandmaster Joel Benjamin was also part of the development team.

IBM Deep Thunder

Deep Thunder is a research project by IBM that aims to improve short-term local weather forecasting through the use of high-performance computing. It is part of IBM's Deep Computing initiative that also produced the Deep Blue chess computer.

Deep Thunder is intended to provide local, high-resolution weather predictions customized to specific weather-sensitive business operations. For example, it could be used to predict the wind velocity at an Olympic diving platform or the path of destructive thunderstorms, and, combined with other physical models, to predict where there will be flooding, damaged power lines, and algal blooms. The project is now headquartered at IBM's Thomas J. Watson Research Center in Yorktown Heights, New York.

Jeopardy!

Jeopardy! is an American television game show created by Merv Griffin. The show features a quiz competition in which contestants are presented with general knowledge clues in the form of answers, and must phrase their responses in the form of questions. The original daytime version debuted on NBC on March 30, 1964, and aired until January 3, 1975. A weekly nighttime syndicated edition aired from September 1974 to September 1975, and a revival, The All-New Jeopardy!, ran on NBC from October 1978 to March 1979. The current version, a daily syndicated show produced by Sony Pictures Television, premiered on September 10, 1984.

Both NBC versions and the weekly syndicated version were hosted by Art Fleming. Don Pardo served as announcer until 1975, and John Harlan announced for the 1978–1979 show. Since its inception, the daily syndicated version has featured Alex Trebek as host and Johnny Gilbert as announcer.

With over 8,000 episodes aired, the daily syndicated version of Jeopardy! has won a record 33 Daytime Emmy Awards as well as a Peabody Award. In 2013, the program was ranked No. 45 on TV Guide's list of the 60 greatest shows in American television history. Jeopardy! has also gained a worldwide following with regional adaptations in many other countries. The daily syndicated series' 35th season premiered on September 10, 2018.

John McCarthy (computer scientist)

John McCarthy (September 4, 1927 – October 24, 2011) was an American computer scientist and cognitive scientist. McCarthy was one of the founders of the discipline of artificial intelligence. He coined the term "artificial intelligence" (AI), developed the Lisp programming language family, significantly influenced the design of the ALGOL programming language, popularized timesharing, and was very influential in the early development of AI.

McCarthy spent most of his career at Stanford University. He received many accolades and honors, such as the 1971 Turing Award for his contributions to the topic of AI, the United States National Medal of Science, and the Kyoto Prize.

Ken Jennings

Kenneth Wayne Jennings III (born May 23, 1974) is an American game show contestant, computer scientist, and author. He is the second highest-earning American game show contestant of all time. Jennings holds the record for the longest winning streak on the U.S. game show Jeopardy! with 74 wins. He also holds the record for the highest average correct responses per game in Jeopardy! history (for those contestants with at least 300 correct responses) with 35.9 during his original run (no other contestant has exceeded 30) and 33.1 overall including tournaments and special events. In 2004, Jennings won 74 consecutive Jeopardy! games before he was defeated by challenger Nancy Zerg on his 75th appearance. His total earnings on Jeopardy! are $3,522,700, consisting of: $2,520,700 over his 74 wins; a $2,000 second-place prize in his 75th appearance; a $500,000 second-place prize in the Jeopardy! Ultimate Tournament of Champions (2005); a $300,000 second-place prize in the Jeopardy! IBM Challenge (2011), when he lost to the Watson computer but became the first human to best third-place finisher Brad Rutter; a $100,000 second-place prize in the Jeopardy! Battle of the Decades (2014); and a $100,000 second-place prize (his share of his team's $300,000 prize) in the Jeopardy! All-Star Games (2019).

During his first run of Jeopardy! appearances, Jennings earned the record for the highest American game show winnings. His total was surpassed by Rutter, who defeated Jennings in the finals of the Jeopardy! Ultimate Tournament of Champions, adding $2,000,000 to Rutter's existing Jeopardy! winnings. Jennings regained the record after appearing on several other game shows, culminating with his results on an October 2008 appearance on Are You Smarter Than a 5th Grader?, though Rutter retained the record for highest Jeopardy! winnings and once again passed Jennings' total after his victory in the Jeopardy! Battle of the Decades tournament.

After his success on Jeopardy!, Jennings wrote about his experience and explored American trivia history and culture in his book Brainiac: Adventures in the Curious, Competitive, Compulsive World of Trivia Buffs, published in 2006.

Phil Mason

Philip E. Mason is a British chemist and YouTuber with the online pseudonym Thunderf00t. He is known for criticising religion, pseudoscience (including creationism) and feminism. He works at the Institute of Organic Chemistry and Biochemistry of the Academy of Sciences of the Czech Republic.

Rich Skrenta

Richard "Rich" Skrenta (born 1967 (age 51–52) in Pittsburgh, Pennsylvania) is a computer programmer and Silicon Valley entrepreneur who created the web search engine blekko.

Robert Watson (computer scientist)

Robert Nicholas Maxwell Watson (born 3 May 1977) is a FreeBSD developer, and founder of the TrustedBSD Project. He is currently employed as a University Lecturer in Systems, Security, and Architecture in the Security Research Group at the University of Cambridge Computer Laboratory.

Scientific American

Scientific American (informally abbreviated SciAm or sometimes SA) is an American popular science magazine. Many famous scientists, including Albert Einstein, have contributed articles to it. It is the oldest continuously published monthly magazine in the United States (though it only became monthly in 1921).

Wolfram Alpha

Wolfram|Alpha (also styled WolframAlpha or Wolfram Alpha) is a computational knowledge engine or answer engine developed by Wolfram Alpha LLC, a subsidiary of Wolfram Research. It is an online service that answers factual queries directly by computing the answer from externally sourced "curated data", rather than providing a list of documents or web pages that might contain the answer, as a search engine might.

Wolfram|Alpha, which was released on May 18, 2009, is based on Wolfram's earlier flagship product Wolfram Mathematica, a computational platform or toolkit that encompasses computer algebra, symbolic and numerical computation, visualization, and statistics capabilities. Additional data is gathered from both academic and commercial websites such as the CIA's The World Factbook, the United States Geological Survey, a Cornell University Library publication called All About Birds, Chambers Biographical Dictionary, Dow Jones, the Catalogue of Life, CrunchBase, Best Buy, the FAA, and, optionally, a user's Facebook account.
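
The contrast with a document-returning search engine can be sketched in a few lines of Python; the "curated" facts below are invented placeholders and say nothing about Wolfram|Alpha's actual data or implementation.

    # Toy "answer engine": computes a direct answer from curated data rather
    # than returning a list of documents. The facts are invented placeholders.
    CURATED_POPULATIONS = {
        "france": 67_000_000,
        "germany": 83_000_000,
    }

    def answer(query):
        q = query.lower()
        matched = [pop for country, pop in CURATED_POPULATIONS.items()
                   if country in q]
        if "population" in q and matched:
            # Compute the answer (here, a sum) instead of pointing at pages.
            return f"{sum(matched):,}"
        return "No curated data matches that query."

    print(answer("population of France plus Germany"))  # -> 150,000,000

A search engine given the same query would instead rank pages likely to contain the relevant numbers, leaving the arithmetic to the reader.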

Yorktown, New York

Yorktown is a town on the northern border of Westchester County, New York. A suburb of the New York City metropolitan area, it is approximately 38 miles (61 km) north of midtown Manhattan. The population was 36,081 at the 2010 U.S. Census.

This page is based on a Wikipedia article written by its contributors.
Text is available under the CC BY-SA 3.0 license; additional terms may apply.
Images, videos and audio are available under their respective licenses.