AI and automation are driving business transformation by empowering individuals to do work without expert knowledge of business processes and applications. Whether it's an employee who knows what they need but not how to get it, a knowledge worker who knows how to do a task but needs help doing it more efficiently, or a customer who wants to resolve an issue but struggles with self-help tools, artificial intelligence (AI) unlocks new levels of productivity by helping people work smarter with less effort.
IBM Watson Assistant is a market-leading conversational AI platform that transforms fragmented and inconsistent experiences into fast, friendly and personalized customer and employee care. Powered by AI you can trust and an intuitive user interface, Watson Assistant gives users the tools to build intelligent virtual agents and deliver automated self-service support across every channel and touchpoint. We are now taking a major step to unlock new levels of productivity by bringing advanced generative AI capabilities to a variety of new use cases.
Last week during Think 2023, IBM introduced watsonx, a new platform that enables businesses to build AI models at scale, and announced that its Digital Labor offerings, Watson Assistant and Watson Orchestrate, will be infused with large language models (LLMs) trained and tuned on watsonx to enhance employee productivity and customer service experiences.
Building on that announcement, we are introducing new accelerated authoring and conversational search capabilities for Watson Assistant. Watson Assistant uses these LLMs to drive user outcomes, including:
Accelerated authoring fast-tracks the delivery of new conversational flows. With this generative functionality, Watson Assistant makes it easy for your organization to shorten build time and deliver outcome-oriented conversational flows. The new "generate" feature for building actions gives users the option to automate the build process: it identifies multiple intents in context and automatically generates the steps needed to accomplish those tasks at once, whether that means helping a telecommunications customer add family members to their account or assisting a bank branch manager as they hire a new customer service representative.
Conversational search generates accurate, contextual conversational answers grounded in enterprise-specific content. Not only does conversational search give the customer a response to their question, but the customer can also trace that response back to its source content and understand where the answer comes from. The new generative AI capabilities and custom extension blend this experience with "Actions", our conversation flow builder, to both answer the customer's question and move them toward their actual goal.
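To make the idea concrete, here is a minimal, illustrative Python sketch of a retrieval-grounded question-answering flow with source attribution. It is not Watson Assistant's implementation or API: the Document class, the retrieve and answer helpers, and the knowledge base are hypothetical stand-ins for an enterprise content store and for the LLM that would generate the final conversational answer.

```python
# Illustrative sketch only: a toy retrieval-grounded Q&A flow that mirrors the idea
# behind conversational search (answers drawn from enterprise content and traceable
# to their source). It is not the Watson Assistant implementation or API.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Document:
    doc_id: str   # identifier used for source attribution
    title: str
    text: str


# Hypothetical enterprise knowledge base standing in for indexed content.
KNOWLEDGE_BASE: List[Document] = [
    Document("kb-101", "Adding family members to a plan",
             "Account owners can add up to five family members from the account settings page."),
    Document("kb-202", "Hiring checklist for branch managers",
             "New customer service representatives must complete compliance training before onboarding."),
]


def retrieve(query: str, docs: List[Document]) -> Optional[Document]:
    """Naive keyword-overlap retrieval standing in for a real enterprise search component."""
    query_terms = set(query.lower().split())
    best, best_score = None, 0
    for doc in docs:
        score = len(query_terms & set(doc.text.lower().split()))
        if score > best_score:
            best, best_score = doc, score
    return best


def answer(query: str) -> str:
    """Return an answer grounded in retrieved content, with a citation to its source."""
    doc = retrieve(query, KNOWLEDGE_BASE)
    if doc is None:
        return "I could not find an answer in the knowledge base."
    # In a production system, an LLM would generate a conversational answer
    # conditioned on the retrieved passage; here we simply quote the passage.
    return f"{doc.text} (source: {doc.title}, {doc.doc_id})"


if __name__ == "__main__":
    print(answer("How do I add family members to my account?"))
```

The keyword-overlap scorer is only a placeholder; any vector or enterprise search index could fill the same role, and the citation attached to the answer is what lets a customer trace a response back to its source.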
IBM Watson Assistant offers an intuitive build experience and powerful AI capabilities that empower anyone in your organization to build and scale intelligent virtual agents faster, without writing a line of code. Powered by watsonx LLMs and generative AI, organizations will be able to shorten build time, automate operational tasks, boost productivity, and deliver exceptional customer and employee experiences. The new Watson Assistant capabilities are planned for release this year. Stay tuned for more announcements.
Learn more about IBM Watson Assistant