Building an Intelligent Life Insurance Assistant: LangChain and Local LLMs with MLX

Tirta Adi Gunawan
Jan 8, 2025


The Problem with Insurance Consultations

Let’s face it — getting life insurance quotes is about as fun as watching paint dry. You typically:

  1. Call an agent (during business hours only)
  2. Explain your needs (repeatedly)
  3. Wait for a human to prepare illustrations
  4. Repeat the whole process until you find the right plan

Our Solution: An AI Assistant That Gets Insurance

We built an AI assistant that:

  • Chats naturally about insurance products
  • Works 24/7
  • Runs completely locally (your company data stays private)
  • Understands all your insurance products, precisely

The Secret Sauce: Local AI + Domain Knowledge

The Brain: Llama 3.1 on MLX

Instead of using ChatGPT, we’re running Llama 3.1 locally. Why? It’s:

  • Private: No data leaves your system
  • Fast: MLX is optimized for Apple silicon
  • Cost-effective: No API fees
  • Customizable: We control everything

The cool part? We squeezed an 8B-parameter model into just 4GB of memory using MLX’s 4-bit quantization. It runs smoothly even on a MacBook Air.
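The 4GB figure is simple arithmetic, not magic — at 4 bits per weight, the model’s parameters alone work out to about 4GB:

```python
# Back-of-the-envelope check: why an 8B-parameter model fits in ~4 GB.
params = 8e9          # 8 billion parameters
bits_per_param = 4    # 4-bit quantization
weights_gb = params * bits_per_param / 8 / 1e9  # bits -> bytes -> GB
print(f"{weights_gb:.1f} GB")  # ~4.0 GB for the weights alone
```

The KV cache and activations add some overhead on top of this at inference time, which is why headroom beyond 4GB still helps.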

python3 -m mlx_lm.server --model mlx-community/Meta-Llama-3.1-8B-Instruct-4bit --log-level DEBUG
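Once the server is up, it speaks an OpenAI-compatible chat API over HTTP. A minimal client sketch (the default host/port and the exact endpoint path are assumptions — adjust them if you launched the server with different flags):

```python
import json
import urllib.request

# Assumed default address for mlx_lm.server; change if you pass --host/--port.
SERVER_URL = "http://127.0.0.1:8080/v1/chat/completions"

def build_payload(user_message: str) -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": "mlx-community/Meta-Llama-3.1-8B-Instruct-4bit",
        "messages": [
            {"role": "system", "content": "You are a helpful insurance assistant."},
            {"role": "user", "content": user_message},
        ],
        "max_tokens": 256,
        "temperature": 0.7,
    }

def ask(user_message: str) -> str:
    """POST the request to the local server and return the reply text."""
    req = urllib.request.Request(
        SERVER_URL,
        data=json.dumps(build_payload(user_message)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

LangChain’s OpenAI-compatible chat wrappers can point at the same URL, which is how the rest of the stack talks to the local model.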

The Smarts: LangChain’s Magic

LangChain helps our AI:

  • Remember conversation context (“Oh, you mentioned critical illness coverage earlier…”)
  • Validate inputs (No more “I’m 999 years old”)
  • Generate proper insurance illustrations
  • Switch contexts naturally (“Let’s talk about education savings instead”)
def _create_tools(self) -> List[Tool]:
    validation = ValidationService()

    def create_validation_tool(name: str, description: str, validator) -> Tool:
        def tool_func(value: str) -> str:
            valid, message = validator(value)
            return message

        return Tool(
            name=name,
            description=description,
            func=tool_func
        )

    return [
        create_validation_tool(
            "validate_sum_assured",
            "Validates if a number is a valid sum assured amount. Input should be a string representing a number.",
            validation.validate_sum_assured
        ),
        create_validation_tool(
            "validate_payment_type",
            "Validates if a string is either 'yearly' or 'monthly'. Input should be a string.",
            validation.validate_payment_type
        ),
        create_validation_tool(
            "validate_date_of_birth",
            "Validates if a string is a valid date in DD-MM-YYYY format. Input should be a string.",
            validation.validate_date_of_birth
        ),
        create_validation_tool(
            "validate_gender",
            "Validates if a string is either 'M' or 'F'. Input should be a string.",
            validation.validate_gender
        )
    ]
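Each validator returns a `(valid, message)` pair, which is what the tool factory above expects. A minimal sketch of what `ValidationService` might look like — the class and method names come from the tool definitions, but the specific rules (positivity check, date format parsing) are illustrative assumptions:

```python
from datetime import datetime

class ValidationService:
    """Sketch of the validators wired into the LangChain tools above.
    Exact business rules are assumptions for illustration."""

    def validate_sum_assured(self, value: str):
        try:
            amount = float(value)
        except ValueError:
            return False, "Sum assured must be a number."
        if amount <= 0:
            return False, "Sum assured must be a positive amount."
        return True, f"Sum assured of {amount:,.0f} accepted."

    def validate_payment_type(self, value: str):
        if value.strip().lower() in ("yearly", "monthly"):
            return True, f"Payment type '{value.strip().lower()}' accepted."
        return False, "Payment type must be 'yearly' or 'monthly'."

    def validate_date_of_birth(self, value: str):
        try:
            datetime.strptime(value.strip(), "%d-%m-%Y")
        except ValueError:
            return False, "Date of birth must be in DD-MM-YYYY format."
        return True, "Date of birth accepted."

    def validate_gender(self, value: str):
        if value.strip().upper() in ("M", "F"):
            return True, "Gender accepted."
        return False, "Gender must be 'M' or 'F'."
```

Returning a human-readable message in both the pass and fail cases matters here: the agent feeds the message straight back into the conversation, so “I’m 999 years old” gets a polite correction instead of a stack trace.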

Real Features That Matter

  1. Smart Product Recommendations
User: "I need coverage for my family"
Bot: "Let's look at our Family Shield plan. It covers:
- Critical illness
- Accident protection
- Children's education
Which aspect interests you most?"

  2. Instant Illustrations

  • Input your details
  • Get PDF illustrations immediately
  • Compare different coverage options
  • Adjust and regenerate instantly
        """Submit collected insurance data"""
logger.info(f"Submitting insurance data: {self.insurance_data}")

# Show loading message
msg = cl.Message(content="⏳ Processing your request...")
await msg.send()

# Add fake loading delay
await asyncio.sleep(2)

success, message = await self.api_service.submit_insurance_data(self.insurance_data)

# Attach the PDF file
elements = [
cl.File(name="All_The_Best_Life_Insurance.pdf",
path="/Users/tirtagunawan/Documents/RESEARCH2/localLLM/mlx-chainlit-llama3/All_The_Best_Life_Insurance.pdf",
display="inline")
]
await cl.Message(
content="📄 Here's your insurance illustration document:",
elements=elements

  3. Education Fund Planning

  • Calculate required savings
  • Project education costs
  • Plan for multiple children
  • Factor in inflation
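The math behind those bullets is straightforward compound growth. A minimal sketch (the formulas — annual compounding for inflation, future value of an annuity for savings — are standard, but the function names and rates are illustrative, not the project’s actual code):

```python
def projected_education_cost(current_cost: float, inflation_rate: float, years: int) -> float:
    """Project today's tuition forward by `years` at a fixed annual inflation rate."""
    return current_cost * (1 + inflation_rate) ** years

def required_monthly_saving(target: float, years: int, annual_return: float = 0.0) -> float:
    """Monthly amount needed to reach `target`.
    With a nonzero return, uses the future-value-of-annuity formula."""
    months = years * 12
    if annual_return == 0.0:
        return target / months
    r = annual_return / 12  # monthly rate
    return target * r / ((1 + r) ** months - 1)

# Example: tuition of 100,000 today, child starts college in 10 years,
# 5% assumed education inflation.
target = projected_education_cost(100_000, 0.05, 10)
print(f"Projected cost: {target:,.0f}")
print(f"Save per month (0% return): {required_monthly_saving(target, 10):,.0f}")
```

Planning for multiple children is then just running the projection per child with their respective years-to-enrollment and summing the monthly amounts.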

The Tech Stack

  • MLX: Apple’s ML framework, running Llama locally on Apple silicon
  • LangChain: The agent logic, tools, and memory around the LLM
  • Chainlit: A smooth chat interface

This project is open-source and available on GitHub. Feel free to adapt it for your use case.

#AI #InsurTech #Python #MachineLearning #LangChain #MLX #LLM #Llama #Ollama #RAG #AIAgent
