IBM Advances Watson’s Ability to Understand the Language of Business

– Announces first commercial availability of key technologies from Project Debater

– Integrated into IBM Watson, the new capabilities help enable businesses to begin mining and analyzing some of the most challenging aspects of human language

NEW YORK, March 11, 2020 — IBM, the leader in artificial intelligence for business1, is announcing several new IBM Watson technologies designed to help organizations begin identifying, understanding and analyzing some of the most challenging aspects of the English language with greater clarity, for greater insights.

The new technologies represent the first commercialization of key Natural Language Processing (NLP) capabilities to come from IBM Research’s Project Debater, the only AI system capable of debating humans on complex topics. For example, a new advanced sentiment analysis feature is designed to identify and analyze idioms and colloquialisms for the first time. Phrases like ‘hardly helpful’ or ‘hot under the collar’ have been challenging for AI systems because they are difficult for algorithms to spot. With advanced sentiment analysis, businesses can begin analyzing such language data with Watson APIs for a more holistic understanding of their operations. Further, IBM is bringing technology from IBM Research for understanding business documents, such as PDFs and contracts, so that clients can add it to their AI models.
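For illustration, the minimal sketch below shows how a developer might call the sentiment feature of Watson Natural Language Understanding from Python using the ibm-watson SDK. The API key, service URL and sample text are placeholders, and the idiom-aware analysis described above happens inside the service, not in this client code.

```python
# Minimal sketch: document-level sentiment via Watson Natural Language
# Understanding. The credentials and URL below are placeholders.
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator
from ibm_watson import NaturalLanguageUnderstandingV1
from ibm_watson.natural_language_understanding_v1 import Features, SentimentOptions

authenticator = IAMAuthenticator("YOUR_API_KEY")  # placeholder credential
nlu = NaturalLanguageUnderstandingV1(version="2019-07-12", authenticator=authenticator)
nlu.set_service_url("https://api.us-south.natural-language-understanding.watson.cloud.ibm.com")  # placeholder

response = nlu.analyze(
    text="The new reporting tool was hardly helpful during the audit.",
    features=Features(sentiment=SentimentOptions()),
).get_result()

print(response["sentiment"]["document"])  # e.g. {'label': 'negative', 'score': -0.6}
```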

“Language is a tool for expressing thought and opinion, as much as it is a tool for information,” said Rob Thomas, General Manager, IBM Data and AI. “This is why we’re harvesting technology from Project Debater and integrating it into Watson – to enable businesses to capture, analyze, and understand more from human language and start to transform how they utilize intellectual capital that’s codified in data.”

Today IBM is announcing that it plans to integrate Project Debater technologies into Watson throughout the year, with a focus on advancing clients’ ability to exploit natural language:

A.    Analysis – Advanced Sentiment Analysis. IBM has enhanced sentiment analysis to better identify and understand complicated word schemes, like idioms (phrases and expressions) and so-called sentiment shifters – combinations of words that, together, take on new meaning, such as ‘hardly helpful.’ This technology will be integrated into Watson Natural Language Understanding this month. In addition, IBM is announcing a new classification technology that will enable clients to create AI models that can more easily classify clauses that occur in business documents, like procurement contracts. Based on Project Debater’s deep learning-based classification technology, the new capability can learn from as few as several hundred samples to perform new classifications quickly and easily. It is planned to be added to Watson Discovery later this year.

B.    Briefs – Summarization. This technology pulls textual data from a variety of sources to provide users with a summary of what is being said and written about a particular topic. An early version of Summarization was leveraged at The GRAMMYS this year to analyze over 18 million articles, blogs and bios to produce bite-sized insights on hundreds of GRAMMY artists and celebrities. The data was then infused into the red carpet live stream, on-demand videos and photos across www.grammy.com to give fans deeper context about the leading topics of the night. It is planned to be added to IBM Watson Natural Language Understanding later in the year.

C.    Clustering – Advanced Topic Clustering. Building on insights gained from Project Debater, new topic clustering techniques will enable users to “cluster” incoming data to create meaningful “topics” of related information, which can then be analyzed. The technique, which is planned to be integrated into Watson Discovery later this year, will also allow subject matter experts to customize and fine-tune the topics to reflect the language of specific businesses or industries, like insurance, healthcare and manufacturing.
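Project Debater’s clustering internals are not described in this announcement; purely to illustrate the general workflow – grouping incoming documents into topics that an analyst can then label and refine – the sketch below uses scikit-learn’s TF-IDF vectorizer and k-means as generic stand-ins.

```python
# Illustrative topic clustering with generic tools (TF-IDF + k-means);
# not Project Debater's technique, just the overall shape of the task.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

documents = [
    "Claim filed for water damage to insured property",
    "Policy renewal quote requested for home coverage",
    "Patient follow-up scheduled after cardiology consult",
    "New MRI protocol approved for outpatient imaging",
]

X = TfidfVectorizer(stop_words="english").fit_transform(documents)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

for topic, doc in zip(labels, documents):
    print(topic, doc)  # documents grouped into two candidate topics
```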

IBM has long been a leader in NLP, developing technologies that enable computer systems to learn, analyze and understand human language – including sentiment, dialects, intonations, and more – with increasing accuracy and speed. IBM has brought its NLP technology, much of which was born in IBM Research, to market via Watson. Products such as Watson Discovery for document understanding, IBM Watson Assistant for virtual agents, and Watson Natural Language Understanding for advanced sentiment analysis are all infused with NLP.

ESPN Fantasy Football uses Watson Discovery and Watson Knowledge Studio to analyze millions of football data sources each day during the season to offer millions of fantasy football players real-time insights. By processing natural language, Watson identifies the tone and sentiment of news articles, blogs, forums, rankings, projections, podcasts and tweets that cover everything from locker room insights to injury analysis. ESPN Fantasy Football surfaces these insights in player cards that snapshot the “boom” and “bust” potential of each player, as well as a “Player Buzz” section that summarizes the positive or negative commentary about a player.

KPMG, a multinational professional services network, and one of the Big Four accounting organizations, worked with IBM to create an AI solution based on a variety of Watson services, including Watson Natural Language Understanding. This technology makes it more effective for companies to identify, claim and retain potential R&D income tax credits. Developed by KPMG, the solution can help clients increase the amount of R&D income tax credits they capture because the Watson technology is able to review more documentation quickly while minimizing disruption to the client’s business.

In the past year, KPMG clients have seen more potential for R&D tax credits, with some projects seeing more than a 1,000% increase in the number of documents reviewed. The solution helps clients uncover more potential activities that qualify for additional income tax credits, while reducing business disruption. As a result, engineers and scientists can stay focused on innovative R&D work by spending less time on income tax compliance activities.

IBM Watson Health and EBSCO Information Services Collaborate to Launch Integrated Clinical Decision Support Solution

Solution to combine real-world drug and disease content, natural language processing and cloud-based tools with the goal of streamlining clinical decision-making at the point-of-care

CAMBRIDGE, Mass., March 11, 2020 — IBM Watson Health and EBSCO Information Services (EBSCO) today announced a strategic collaboration aimed at enhancing clinical decision support (CDS) and operations for healthcare providers and health systems. The companies are combining their respective solution suites – DynaMed® and IBM® Micromedex® with Watson™ – into a single, high-value global solution called “DynaMed and Micromedex with Watson.” The combined solution suite will be designed to bring together drug and disease content into a single source for evidence-based insights to help inform clinical decisions.

DynaMed provides peer-reviewed clinical content, including systematic literature reviews in 28 specialties, ranging from comprehensive disease topics, health conditions, and abnormal findings to highly focused topics on evaluation, differential diagnosis, and management. The content undergoes a rigorous, seven-step process, giving clinicians access to current, evidence-based diagnostic and therapeutic recommendations. IBM Micromedex is one of the largest online reference databases for medication information. It is used by more than 4,500 hospitals and health systems worldwide to support decision-making in medication therapy management, disease and condition management, toxicology, alternative medicine and patient education.

“Research shows that healthcare provider confidence in clinical decision support comes from knowing the evidence-based methodologies that helped build the content foundation1,” said Anil Jain, Vice President, Chief Health Information Officer, IBM Watson Health. “When clinicians are confident that their clinical decision support is drawing recommendations from accurate and timely information, they are more likely to use the technology to support their care decisions.2”

IBM Micromedex with Watson is designed to use artificial intelligence (AI) and natural language processing (NLP) to bypass keyword searches in favor of a more conversational approach to searching drug content. DynaMed and Micromedex with Watson will be designed to provide clinicians direct access to drug and disease content at the point-of-care to support clinical decision making.

“The synergy of our world class, evidence-based content is expected to drive efficiencies for more healthcare systems using data-driven insights for decision making,” said Betsy Jones, Executive Vice President, EBSCO Clinical Decisions. “Built on the latest cloud-based technology, we will harness world class content from DynaMed and IBM Micromedex onto a seamless and personalized solution. A terrific alternative for the care team.”

“When it comes to clinical decision support, content is king. Nearly nine out of 10 physicians in the US currently implement electronic health record technology3,” said Todd Nolen, General Manager, IBM Micromedex Solutions, IBM Watson Health. “We believe DynaMed and Micromedex with Watson can deliver value and innovation to healthcare organizations, to help enable rapid access to high-quality medical evidence that is essential for clinicians as they work to provide safe and effective patient care within their clinical workflow.”

DynaMed and Micromedex with Watson is expected to be available for general adoption in April 2020.4 IBM Watson Health and EBSCO Information Services will also continue to sell the IBM Micromedex with Watson and DynaMed solution suites separately in order to offer flexible options to help meet customers’ needs.

Sea Trials Begin for Mayflower Autonomous Ship’s ‘AI Captain’

Promare and IBM engineers develop new class of marine AI to advance $90BN autonomous shipping market

IBM Edge, AI technologies and over a million images at core of ship’s ability to sense, think and act autonomously at sea

PLYMOUTH, England, March 5, 2020 — IBM and marine research organization Promare have announced that a new ‘AI Captain’, which will enable the Mayflower Autonomous Ship (MAS) to self-navigate across the Atlantic later this year, is to go to sea this month for testing. The trial, which will take place on a manned research vessel off the coast of Plymouth in the UK, will evaluate how the AI Captain uses cameras, AI and edge computing systems to safely navigate around ships, buoys and other ocean hazards that it is expected to meet during its transatlantic voyage in September 2020.

MAS will trace the route of the original 1620 Mayflower to commemorate the 400th anniversary of the famous voyage. Sailing from Plymouth, UK to Plymouth, Massachusetts with no human captain or onboard crew, it will become one of the first full-sized, fully autonomous vessels to cross the Atlantic. The mission will further the development of commercial autonomous ships and help transform the future of marine research.

“While the autonomous shipping market is set to grow from $90BN today to over $130BN by 2030*, many of today’s autonomous ships are really just automated – robots which do not dynamically adapt to new situations and rely heavily on operator override,” said Don Scott, CTO of the Mayflower Autonomous Ship.  “Using an integrated set of IBM’s AI, cloud, and edge technologies, we are aiming to give the Mayflower the ability to operate independently in some of the most challenging circumstances on the planet.”

The trial begins
MAS will rely on IBM’s advanced AI and edge computing systems to sense, think and make decisions at sea, even with no human intervention. With the three hulls of the trimaran MAS currently reaching the final phase of construction in Gdansk, Poland, a prototype of the AI Captain will first take to the water on a manned vessel – the Plymouth Quest – a research ship owned and operated by the Plymouth Marine Laboratory in the UK. The March sea trials, which will be conducted in the waters of Smart Sound, Plymouth, under the watchful eye of the Plymouth Quest’s human crew, will help determine how the Mayflower’s AI Captain performs in real-world maritime scenarios, and provide valuable feedback to help refine the ship’s machine learning models.

Two years of training and a million nautical images
Over the past two years, the Mayflower team has been training the ship’s AI models using over a million nautical images collected from cameras in the Plymouth Sound in the UK as well as open source databases. To meet the processing demands of machine learning, the team used an IBM Power AC922 fuelled by IBM Power9 CPUs and NVIDIA V100 Tensor Core GPUs, the same technologies behind the world’s smartest AI supercomputers. Now, using IBM’s computer vision technology, the Mayflower’s AI Captain should be able to independently detect and classify ships, buoys and other hazards such as land, breakwaters and debris.
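The Mayflower’s hazard-detection models themselves are not published; as a rough illustration of the camera-frame classification step, the sketch below runs a single image through an off-the-shelf, pretrained torchvision classifier. The image path is hypothetical, and a stock ImageNet model stands in for the purpose-trained maritime models described above.

```python
# Rough illustration of classifying one camera frame with a pretrained
# vision model; a stand-in for the Mayflower's purpose-trained models.
import torch
from PIL import Image
from torchvision import models, transforms

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

model = models.resnet50(pretrained=True).eval()

frame = Image.open("camera_frame.jpg")  # hypothetical onboard camera image
batch = preprocess(frame).unsqueeze(0)
with torch.no_grad():
    logits = model(batch)
print(logits.argmax(dim=1).item())  # index of the most likely class
```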

Keeping things local
As the Mayflower will not have access to high-bandwidth connectivity throughout its transatlantic voyage, it will use a fully autonomous IBM edge computing system powered by several onboard NVIDIA Jetson AGX Xavier devices. While at sea, the Mayflower will process data locally on NVIDIA Jetson, increasing the speed of decision making and reducing the amount of data flow and storage on the ship.

“Edge computing is critical to making an autonomous ship like the Mayflower possible. The Mayflower needs to sense its environment, make smart decisions about its situation and then act on these insights in the minimum amount of time – even in the presence of intermittent connectivity, and all while keeping data secure from cyber threats,” said Rob High, VP and CTO for Edge Computing, IBM. “IBM’s edge computing solutions are designed to support mission-critical workloads like the Mayflower Autonomous Ship, extending the power of the cloud and the security and flexibility of Red Hat Enterprise Linux all the way out to the edge of the network, even in the middle of the ocean.”

Getting there (safely)
As well as following the overall mission objectives to reach Plymouth, Massachusetts in the shortest amount of time, the AI Captain will draw on IBM’s rule management system (Operational Decision Manager – ODM) to follow the International Regulations for Preventing Collisions at Sea (COLREGs) as well as recommendations from the International Convention for the Safety of Life at Sea (SOLAS). Used widely across the financial services industry, ODM is particularly suited to the Mayflower project as it provides a completely transparent record of its decision-making process, avoiding ‘black box’ scenarios.
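ODM rules are authored in IBM’s business-rule language rather than a general-purpose one, so the sketch below is only an illustration of the kind of codified, auditable rule involved: a simplified check in the spirit of COLREGs Rule 14 (head-on situation), under which both vessels alter course to starboard. All names and thresholds here are hypothetical.

```python
# Simplified, hypothetical head-on check in the spirit of COLREGs Rule 14;
# real COLREGs logic in ODM is far more nuanced than this sketch.
from dataclasses import dataclass

@dataclass
class Vessel:
    heading_deg: float  # compass heading of the vessel
    bearing_deg: float  # bearing from our vessel to this one

def head_on(own: Vessel, other: Vessel, tol: float = 6.0) -> bool:
    """Roughly reciprocal courses with the other vessel near dead ahead."""
    # Angular distance between the two headings, in [0, 180].
    separation = abs((own.heading_deg - other.heading_deg + 180) % 360 - 180)
    dead_ahead = min(other.bearing_deg, 360 - other.bearing_deg) <= tol
    return separation >= 180 - tol and dead_ahead

own = Vessel(heading_deg=90.0, bearing_deg=0.0)
approaching = Vessel(heading_deg=272.0, bearing_deg=2.0)
if head_on(own, approaching):
    print("Rule 14: alter course to starboard")
```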

As the weather is one of the most significant factors impacting the success of the voyage, the AI Captain will use forecast data from The Weather Company to help make navigation decisions. A Safety Manager function (running on RHEL) will review all of the AI Captain’s decisions to ensure they are safe – for the Mayflower, and for other vessels in its vicinity.

A real-world scenario – how the Mayflower senses, thinks and acts at sea
For example, let’s assume that the Mayflower is in the open ocean, approaching Cape Cod, with no current satellite connectivity. In its path ahead is a cargo ship which has had a collision with a fishing vessel and spilt some of its load. In this hypothetical scenario, the Mayflower’s AI Captain will use the following technologies and processes to independently assess the situation, and decide what action to take:

Senses (assesses current environment & identifies hazards)

  • Radar detects multiple hazards in MAS’s path, 2.5 nautical miles ahead
  • Onboard cameras provide visual input to IBM computer vision system which identifies hazards as: a cargo ship, a fishing vessel and three partially submerged shipping containers floating in the water
  • Automatic Identification System (AIS) provides specific information about the cargo ship’s class, weight, speed, cargo, etc.
  • GPS Navigation System – provides MAS’s current location, heading, speed and course
  • MAS’s nautical chart server provides geospatial information about its chosen route
  • Weather data provided by The Weather Company
  • Attitude Sensors – assess local sea state (how MAS pitches and rolls due to waves)
  • Fathometer – provides water depth measurements
  • Vehicle Management System – provides operational data such as MAS’s battery charge level, power consumption, communications, science payloads etc.

Thinks (evaluates options)

  • IBM Operational Decision Manager (ODM) evaluates COLREGs with respect to the other vessels in the vicinity and generates a risk map indicating an “unsafe” situation ahead
  • MAS’s AI Captain ingests the ODM recommendation, computer vision input, current and forecasted weather and assesses several options to avoid hazard

Acts (chooses best actions and instructs vessel)

  • AI Captain determines the best action for MAS, in this hypothetical scenario, is to steer to starboard to avoid the unexpected navigation hazard
  • MAS’s Safety Manager verifies the decision as safe
  • AI Captain instructs MAS’s Vehicle Management system to change course and speed.

As the ocean is an ever-changing environment, the AI Captain will constantly re-evaluate the situation and update the Mayflower’s course as conditions evolve.
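Conceptually, that continuous re-evaluation is a sense-think-act loop. The sketch below is schematic only: every class, method and timing choice is a hypothetical illustration, not the Mayflower’s actual software interface.

```python
# Schematic sense-think-act loop mirroring the scenario above.
# All interfaces here are hypothetical illustrations.
import time

def voyage_loop(sensors, rules_engine, ai_captain, safety_manager, vehicle):
    while not vehicle.arrived():
        # Sense: fuse radar, camera, AIS, GPS, chart and weather inputs.
        situation = sensors.read_all()
        # Think: the rules engine flags COLREGs risks; the AI Captain
        # weighs options against mission objectives and forecasts.
        risk_map = rules_engine.evaluate(situation)
        action = ai_captain.decide(situation, risk_map)
        # Act: execute only actions the Safety Manager verifies as safe.
        if safety_manager.verify(action, situation):
            vehicle.execute(action)
        time.sleep(1)  # re-evaluate continuously as conditions change
```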

The March sea trials will take place over approximately two months on the Plymouth Quest with the ship’s human captain and crew at the helm. In the first stage of testing, the Mayflower AI Captain’s inference engine will receive input from the Quest’s radar, AIS, GPS and navigation system, as well as data about visibility. Cameras, computer vision, edge and autonomy capabilities will be added in the next phase of testing, from April.