🧠 AI with Java & Spring Boot – Part 3: Building a Memory-Aware Chatbot with LangChain4j
Hey devs! 👋
Welcome back to the third part of our AI-with-Java journey. So far, we've set up Spring Boot and made our first calls to the OpenAI API.
Today we'll go a step further and introduce LangChain4j, the Java adaptation of LangChain, to build a chatbot with memory and context.
Imagine asking your bot:
"Who’s the CEO of Tesla?"
"What year was he born?"
And the bot remembers who “he” refers to. That’s what we’re building today. 🧠
🧰 Tools We’ll Use
- LangChain4j – Java library for building LLM apps
- Spring Boot
- OpenAI API
- Session-scoped chat memory (one conversation history per user)
🔧 Step-by-Step: AI Chatbot with Memory (LangChain4j + Spring Boot)
1. Add Dependencies (Maven)
Add this to your pom.xml:
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j</artifactId>
    <version>0.25.0</version>
</dependency>
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-openai</artifactId>
    <version>0.25.0</version>
</dependency>
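If you're on Gradle instead of Maven, the same two artifacts can be declared like this (Groovy DSL):
dependencies {
    implementation 'dev.langchain4j:langchain4j:0.25.0'
    implementation 'dev.langchain4j:langchain4j-openai:0.25.0'
}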
2. Configure OpenAI
In application.yml:
openai:
  api-key: YOUR_OPENAI_API_KEY
  model: gpt-3.5-turbo
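Tip: instead of pasting a real key into the file, you can point the property at an environment variable and let Spring Boot resolve the placeholder at startup (the variable name OPENAI_API_KEY is just a convention you'd pick yourself):
openai:
  api-key: ${OPENAI_API_KEY}
  model: gpt-3.5-turbo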
3. Create the MemoryChatService
import dev.langchain4j.chain.ConversationalChain;
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Scope;
import org.springframework.context.annotation.ScopedProxyMode;
import org.springframework.stereotype.Component;

@Component
@Scope(value = "session", proxyMode = ScopedProxyMode.TARGET_CLASS) // one memory per user session
public class MemoryChatService {
    private final ConversationalChain chain;

    public MemoryChatService(@Value("${openai.api-key}") String apiKey,
                             @Value("${openai.model}") String modelName) {
        // The LLM that generates the replies
        ChatLanguageModel model = OpenAiChatModel.builder()
                .apiKey(apiKey)
                .modelName(modelName)
                .temperature(0.7)
                .build();
        // Keep the last 10 messages so the bot can resolve references like "he"
        ChatMemory memory = MessageWindowChatMemory.withMaxMessages(10);
        // ConversationalChain feeds the memory plus the new message to the model
        this.chain = ConversationalChain.builder()
                .chatLanguageModel(model)
                .chatMemory(memory)
                .build();
    }

    public String chat(String userInput) {
        return chain.execute(userInput);
    }
}
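Side note: LangChain4j also offers a more declarative way to get the same behavior through its AiServices API, where you define an interface and the library generates the implementation. A minimal sketch, assuming the model is built exactly as in the service above (the Assistant interface name is just an example):
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.service.AiServices;

// Example interface; LangChain4j creates the implementation at runtime
interface Assistant {
    String chat(String userMessage);
}

// Wiring, e.g. inside a constructor or @Bean method, reusing the OpenAiChatModel built above
Assistant assistant = AiServices.builder(Assistant.class)
        .chatLanguageModel(model)
        .chatMemory(MessageWindowChatMemory.withMaxMessages(10))
        .build();

String answer = assistant.chat("Who is the CEO of Tesla?");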
4. Expose It via REST Controller
import java.util.Map;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/langchain")
public class LangChainController {

    // Injected as a session-scoped proxy, so each user session gets its own memory
    private final MemoryChatService memoryChatService;

    public LangChainController(MemoryChatService memoryChatService) {
        this.memoryChatService = memoryChatService;
    }

    @PostMapping("/chat")
    public ResponseEntity<String> chat(@RequestBody Map<String, String> request) {
        String prompt = request.get("prompt");
        String response = memoryChatService.chat(prompt);
        return ResponseEntity.ok(response);
    }
}
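If you prefer a typed request body over a raw Map, a small Java record does the job; ChatRequest below is just a hypothetical name, and Spring binds the JSON {"prompt": "..."} to it automatically:
// Hypothetical DTO replacing Map<String, String>
public record ChatRequest(String prompt) {}

// The controller method would then look like this:
@PostMapping("/chat")
public ResponseEntity<String> chat(@RequestBody ChatRequest request) {
    return ResponseEntity.ok(memoryChatService.chat(request.prompt()));
}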
5. Try It Out 🚀
Make a POST request. Because the memory lives in your HTTP session, keep a cookie jar so both requests share the same session:
curl -X POST http://localhost:8080/api/langchain/chat \
  -H "Content-Type: application/json" \
  -c cookies.txt -b cookies.txt \
  -d '{"prompt": "Who is the CEO of Google?"}'
Then ask:
curl -X POST http://localhost:8080/api/langchain/chat \
  -H "Content-Type: application/json" \
  -c cookies.txt -b cookies.txt \
  -d '{"prompt": "Where was he born?"}'
The bot remembers that "he" refers to the CEO from the first question. 🧠
🔚 Wrapping Up
Now you have an intelligent chatbot that remembers your conversation! 🎉
You can also:
- Store memory in a DB or Redis (see the sketch below)
- Customize personas
- Add tools like calculators or file readers (LangChain4j supports them too)
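For the first item on that list, LangChain4j exposes a ChatMemoryStore interface that MessageWindowChatMemory can delegate to. Here's a minimal sketch backed by an in-memory map; in a real setup you'd replace the map with Redis or a database table, and SimpleChatMemoryStore is just an example name:
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.store.memory.chat.ChatMemoryStore;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Example store; swap the map for Redis/JPA calls to persist conversations
public class SimpleChatMemoryStore implements ChatMemoryStore {

    private final Map<Object, List<ChatMessage>> storage = new ConcurrentHashMap<>();

    @Override
    public List<ChatMessage> getMessages(Object memoryId) {
        return storage.getOrDefault(memoryId, new ArrayList<>());
    }

    @Override
    public void updateMessages(Object memoryId, List<ChatMessage> messages) {
        storage.put(memoryId, messages);
    }

    @Override
    public void deleteMessages(Object memoryId) {
        storage.remove(memoryId);
    }
}

// Plugging it in (e.g., inside MemoryChatService):
ChatMemory memory = MessageWindowChatMemory.builder()
        .id("some-stable-conversation-id")   // one id per user/session
        .maxMessages(10)
        .chatMemoryStore(new SimpleChatMemoryStore())
        .build();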
🔜 Coming in Part 4...
- File-based Q&A
- Upload PDFs and ask questions
- Embedding + vector search (e.g., using ChromaDB or Pinecone with Java!)
If you're enjoying the series, make sure to:
❤️ React
🗨️ Comment
📌 Follow for Part 4!
Stay sharp!
— RF 👨💻