
GenAI Chat-Enabled Document Search for Efficient Patient Support

Client

A US-based healthcare organization runs a custom internal platform, the Care Management Platform (CMP), that tracks and manages patient care data and provides company operators with the information they need to guide patients and process patient requests from the document database.

Challenge

CMP operators must constantly reference documents that guide them through patient management. However, the complexity, volume, and variety of information formats within the CMP made it difficult for operators to quickly find and use relevant data, impacting decision-making and real-time operational efficiency.

The client needed a solution that lets operators quickly find and refer to the necessary documents with high accuracy and ask questions in natural language (without any templates). Rather than generating free-form answers, the system should point operators to the source documents containing the relevant information.

Solution

To address the challenge, DataArt’s team created a generative AI chatbot-style solution leveraging AWS Bedrock for the LLM and AWS Kendra for knowledge management. It operates as a conversational tool that lets operators ask natural language questions and receive precise answers referencing the source documents (operators can also view contextual information, which can be expanded if needed).
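As a rough illustration, a Kendra-retrieval-plus-Bedrock flow of the kind described above could be sketched as follows. This is a minimal sketch, not the project’s implementation: the index ID, model ID, prompt wording, and helper names are all assumptions for illustration.

```python
import json


def format_sources(result_items, max_items=3):
    """Turn Kendra Retrieve result items into a short source-reference list
    that can be shown to the operator alongside the answer."""
    lines = []
    for item in result_items[:max_items]:
        title = item.get("DocumentTitle", "Untitled document")
        uri = item.get("DocumentURI", "")
        lines.append(f"- {title} ({uri})" if uri else f"- {title}")
    return "\n".join(lines)


def answer_with_sources(question, index_id, model_id="anthropic.claude-v2"):
    """Retrieve relevant passages from an AWS Kendra index, then ask a Bedrock
    model to answer strictly from them; returns (answer, source references).
    Requires configured AWS credentials; IDs here are illustrative."""
    import boto3  # AWS SDK; imported here so the pure helpers work without it

    kendra = boto3.client("kendra")
    bedrock = boto3.client("bedrock-runtime")

    retrieved = kendra.retrieve(IndexId=index_id, QueryText=question)
    passages = "\n\n".join(i["Content"] for i in retrieved["ResultItems"])

    prompt = (
        "Answer the question using ONLY the passages below. "
        "If they do not contain the answer, say so.\n\n"
        f"Passages:\n{passages}\n\nQuestion: {question}"
    )
    body = json.dumps({
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": 512,
    })
    response = bedrock.invoke_model(modelId=model_id, body=body)
    answer = json.loads(response["body"].read())["completion"]
    return answer, format_sources(retrieved["ResultItems"])
```

Keeping the model pinned to retrieved passages is what allows the tool to surface source documents rather than free-form generated text.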

The main challenge was keeping the chatbot within the request context, providing relevant answers, and avoiding hallucinations. Prompt guardrails were added, and the chatbot was tested with irrelevant questions as part of a responsible AI implementation strategy. The assistant’s personality was also fine-tuned, and its answers were edited to sound friendly and to present a helpful, professional persona.
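A prompt guardrail of the kind mentioned above can be as simple as wrapping every operator question in instructions that pin the model to the retrieved context and give it an explicit refusal for out-of-scope questions. The wording, refusal message, and function name below are illustrative assumptions, not the project’s actual prompts.

```python
# Illustrative refusal text for off-topic questions (an assumption, not the
# project's wording).
REFUSAL = "I can only help with questions about the CMP patient-care documents."


def build_guarded_prompt(question: str, context_passages: list) -> str:
    """Wrap an operator question in guardrail instructions so the model answers
    only from the supplied passages and declines anything out of scope."""
    context = "\n\n".join(context_passages)
    return (
        "You are a helpful, professional assistant for CMP operators.\n"
        "Answer ONLY from the context below. If the question is unrelated "
        f"to the context, reply exactly: {REFUSAL}\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\n"
    )
```

Testing such a prompt with deliberately irrelevant questions, as described above, then becomes a matter of checking that the model returns the fixed refusal instead of an invented answer.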

Technologies

AWS Bedrock
AWS Kendra
AWS ECS
AWS ECR
LangChain

Outcomes

  • DataArt designed a GenAI conversational chatbot-style solution integrated directly into the CMP.
  • The solution provides real-time support for the CMP operators who can ask natural language questions.
  • Query processing time was reduced by up to 30%, increasing operators’ efficiency.
  • 90% accuracy score (for the references matching the request).
  • The solution features auto-scaling capabilities that allow it to handle an unlimited number of queries, adapting to increased demand effortlessly.
  • The intuitive AI tool simplifies the training process for new CMP operators, enabling quicker onboarding and proficiency.