Project Overview
Everest AI, a leading provider of advanced AI solutions for the trucking industry, needed an automated way to extract key data from invoices and documents. Manually collecting this information caused delays, introduced errors, and limited the scalability of their services. Avahi developed a Generative AI-driven data extraction pipeline leveraging AWS services, streamlining Everest AI’s processes and enabling faster, more accurate retrieval of critical logistics information.
About the Customer
Everest AI operates in the logistics technology space, focusing on automating workflows for carriers and brokers. The company’s AI-driven solutions help customers optimize routes, manage invoices, and ensure compliance by extracting and analyzing vast amounts of transportation-related data.
The Problem
Everest AI needed a consistent, efficient approach to capture essential information such as invoice numbers, pricing, and dates from multiple document formats. Manual data entry slowed their operations and introduced errors and compliance risk. If left unaddressed, Everest AI’s customers would face escalating labor costs and delayed billing cycles, hampering overall operational effectiveness and the company’s growth potential.
Why AWS
Everest AI chose AWS for its proven reliability, breadth of services, and strong AI/ML capabilities. Amazon Bedrock and Amazon SageMaker offered a robust framework to train and deploy large language models, while Amazon S3 and Amazon RDS provided scalable, secure data storage. By adopting AWS, Everest AI could rapidly iterate on its data extraction models and integrate new features without worrying about infrastructure constraints.
Why Everest AI Chose Avahi
Avahi brought extensive experience in building AI solutions on AWS, including deep expertise in Amazon Bedrock, Amazon SageMaker, and serverless architectures. This track record gave Everest AI confidence that Avahi would deliver a tailored, reliable solution aligned with the company’s logistics focus. Avahi’s collaborative approach—providing transparent communication and thorough documentation—ensured that Everest AI’s needs were understood and addressed from start to finish.
Solution
Avahi designed a fully automated data extraction pipeline centered on AWS Lambda and Amazon S3. Whenever new documents are uploaded, AWS Lambda functions trigger a workflow that ingests the data, transforms it, and sends it to large language models in Amazon Bedrock and Amazon SageMaker for analysis. Relevant information—like invoice IDs and amounts—is then stored in Amazon RDS, enabling Everest AI to access clean, structured data quickly.
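The flow above can be sketched in a short Lambda handler. This is a minimal illustration, not Everest AI's production code: the model ID, prompt, and field names are assumptions, and the response-parsing helper is hypothetical.

```python
import json
import re

# Hypothetical field set the pipeline persists to Amazon RDS.
REQUIRED_FIELDS = {"invoice_id", "amount", "invoice_date"}


def parse_invoice_fields(model_output: str) -> dict:
    """Pull the first JSON object out of an LLM response and confirm
    the fields the pipeline stores downstream are all present."""
    match = re.search(r"\{.*\}", model_output, re.DOTALL)
    if match is None:
        raise ValueError("model response contained no JSON object")
    fields = json.loads(match.group(0))
    missing = REQUIRED_FIELDS - fields.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return fields


def handler(event, context):
    """S3-triggered Lambda entry point (sketch; names are illustrative)."""
    import boto3  # provided by the Lambda runtime

    s3 = boto3.client("s3")
    bedrock = boto3.client("bedrock-runtime")

    # Read the newly uploaded document from the triggering S3 event.
    record = event["Records"][0]["s3"]
    document = s3.get_object(
        Bucket=record["bucket"]["name"], Key=record["object"]["key"]
    )["Body"].read().decode("utf-8")

    # Ask a Bedrock-hosted model to extract the key fields as JSON.
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 512,
            "messages": [{
                "role": "user",
                "content": (
                    "Extract invoice_id, amount, and invoice_date as a "
                    f"JSON object from this document:\n{document}"
                ),
            }],
        }),
    )
    result = json.loads(response["body"].read())
    return parse_invoice_fields(result["content"][0]["text"])
```

Validating the model's output before it reaches the database keeps malformed extractions from silently corrupting downstream records.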
To ensure security and accuracy, Avahi configured event-driven processes to handle document updates, re-check sensitive fields, and mask or encrypt personally identifiable information when necessary. Amazon API Gateway was implemented to provide a secure interface for retrieving processed data, while Amazon CloudWatch metrics gave Everest AI visibility into performance and usage patterns. The final solution delivered a scalable, serverless architecture that grows with Everest AI’s expanding customer base.
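A masking step of the kind described can be sketched as a small pure function applied before a record is persisted. The patterns and tokens below are illustrative assumptions; a production deployment would typically rely on a managed PII-detection service or a much fuller pattern set.

```python
import re

# Illustrative PII-shaped patterns (assumed, not exhaustive).
PII_PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),       # email addresses
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),           # SSN-like IDs
    (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "[CARD]"),         # card-like numbers
]


def mask_pii(text: str) -> str:
    """Replace PII-shaped substrings with tokens before storage."""
    for pattern, token in PII_PATTERNS:
        text = pattern.sub(token, text)
    return text
```

Running masking in the event-driven path means re-processed or updated documents are re-checked automatically, which matches the continuous accuracy checks described above.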
Key Deliverables
- Automated data extraction pipeline using AWS Lambda and Amazon S3
- Large language model integration through Amazon Bedrock and Amazon SageMaker
- Secure, high-availability database storage in Amazon RDS
- Event-driven workflows and monitoring for continuous accuracy checks
- Documentation and knowledge transfer sessions
Project Impact
By automating the capture of key information from logistics documents, Everest AI significantly reduced manual data entry and associated errors. The new solution provided near real-time insights into invoices and other crucial records, speeding up billing cycles and improving customer satisfaction.
- Validated the model’s accuracy, bias, and scalability using standard industry tools.
- Verified that sensitive data was masked in line with internal compliance requirements.