Project Overview
Extract, a workforce optimization and payment solutions provider, needed a fast, intelligent way to generate on-demand insights from a variety of internal documents and online resources. Manual report creation was slow and labor-intensive, delaying critical decisions. Avahi developed an AI-driven chatbot leveraging Amazon Bedrock and AWS Lambda, delivering quick, data-backed responses to user queries while ensuring a secure, scalable foundation on AWS.
About the Customer
Extract operates in the business services industry, focusing on software solutions that streamline HR, payroll, and other workforce-related processes. The company serves organizations looking to simplify financial transactions and improve overall operational efficiency.
The Problem
Extract’s internal teams frequently needed to compile data and produce analytics from disparate sources, including manuals and websites. Without a unified system, they spent significant time consolidating information, which slowed responses to employee and customer inquiries. This manual approach risked inaccuracies, created bottlenecks, and limited Extract’s ability to provide real-time insights.
Why AWS
Extract chose AWS for its robust ecosystem of AI and serverless services, along with proven scalability and reliability. Amazon Bedrock provided an easy path to adopt advanced language models, while AWS Lambda, Amazon S3, and Amazon SageMaker allowed rapid experimentation and efficient data handling. With AWS, Extract could quickly prototype and refine a modern chatbot solution without managing complex infrastructure.
Why Extract Chose Avahi
Avahi’s specialized expertise in AWS AI/ML services and serverless architectures made it the ideal partner for this initiative. The team’s collaborative approach ensured that Extract’s requirements were fully understood from the outset. By focusing on efficient development and clear milestones, Avahi delivered a tailored solution that integrated seamlessly with Extract’s data sources.
Solution
Avahi began by setting up Amazon S3 for secure data storage, gathering Extract’s manuals and website content in a centralized bucket. AWS Lambda functions then processed and stored embeddings in Pinecone, a vector database that accelerated real-time retrieval of relevant documents. Queries from users triggered another AWS Lambda function, which applied Amazon Bedrock language models to generate charts or textual reports based on the retrieved content.
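The ingestion side of this pipeline could look something like the sketch below. It is a minimal illustration, not Extract's actual implementation: the bucket, the Pinecone index name, and the use of Amazon Titan Text Embeddings are assumed placeholders.

```python
# Illustrative ingestion Lambda: read a document from S3, embed it with a
# Bedrock embedding model, and upsert the vector into Pinecone.
# Bucket, index name, and model ID are hypothetical placeholders.
import json
import boto3
from pinecone import Pinecone

s3 = boto3.client("s3")
bedrock = boto3.client("bedrock-runtime")
index = Pinecone(api_key="PINECONE_API_KEY").Index("extract-docs")  # assumed index name

def handler(event, context):
    bucket, key = event["bucket"], event["key"]  # e.g. populated by an S3 trigger
    text = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

    # Embed the document text (Titan Text Embeddings assumed for illustration)
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v2:0",
        body=json.dumps({"inputText": text[:8000]}),  # truncate to respect the model's input limit
    )
    embedding = json.loads(response["body"].read())["embedding"]

    # Store the vector with metadata so retrieval can surface the source text later
    index.upsert(vectors=[{
        "id": key,
        "values": embedding,
        "metadata": {"source": key, "text": text[:2000]},
    }])
    return {"status": "indexed", "document": key}
```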
To demonstrate feasibility, Avahi built a lightweight interface that allowed users to submit questions and quickly receive AI-driven responses. All data transformations and model interactions were orchestrated through AWS Lambda and Amazon SageMaker, ensuring minimal latency and easy scalability. Avahi also conducted quality checks to confirm that the chatbot performed reliably across the limited dataset provided by Extract.
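At query time, the flow might resemble the following sketch. The retrieval depth, prompt wording, and the choice of an Anthropic Claude model on Bedrock are assumptions made for illustration; the case study does not specify which foundation model Extract used.

```python
# Illustrative query Lambda: embed the user's question, retrieve the most
# relevant chunks from Pinecone, and ask a Bedrock model to draft the answer.
# Model IDs and the index name are hypothetical placeholders.
import json
import boto3
from pinecone import Pinecone

bedrock = boto3.client("bedrock-runtime")
index = Pinecone(api_key="PINECONE_API_KEY").Index("extract-docs")

def handler(event, context):
    question = event["question"]

    # Embed the query with the same embedding model assumed at ingestion time
    emb = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v2:0",
        body=json.dumps({"inputText": question}),
    )
    vector = json.loads(emb["body"].read())["embedding"]

    # Retrieve the top matches and assemble them as grounding context
    matches = index.query(vector=vector, top_k=5, include_metadata=True).matches
    context_text = "\n\n".join(m.metadata["text"] for m in matches)

    # Generate a grounded response with an assumed Claude model on Bedrock
    reply = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 1024,
            "messages": [{
                "role": "user",
                "content": f"Using only the context below, answer the question.\n\n"
                           f"Context:\n{context_text}\n\nQuestion: {question}",
            }],
        }),
    )
    answer = json.loads(reply["body"].read())["content"][0]["text"]
    return {"answer": answer, "sources": [m.metadata["source"] for m in matches]}
```

A lightweight interface, such as the one Avahi built for validation, would simply pass each user question into this handler and render the returned answer and source references.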
Key Deliverables
- Centralized document storage in Amazon S3
- Lambda-based data processing and embedding pipeline
- Vector database integration with Pinecone
- Amazon Bedrock model orchestration for chatbot responses
- Basic user interface for chatbot testing and validation
Project Impact
The new AI-driven chatbot automated the generation of charts and reports, reducing manual effort and turnaround times for Extract’s internal stakeholders. This solution demonstrated the viability of modern AI tools in transforming Extract’s data into actionable insights.
Metrics:
- Onboarded up to 7 manuals and 3 websites for the chatbot’s knowledge base
- Completed project delivery within a strict three-week timeline