A chatbot framework is the set of tools used to build, test, and deploy conversational AI. Compare Rasa, Botpress, LangChain, and no-code options.
More About Chatbot Frameworks
A chatbot framework is the underlying software that developers use to build conversational agents. It typically handles the mechanics of message ingestion, intent recognition, state management, response generation, and channel integration, so engineering teams can focus on the content and logic of the bot itself rather than on plumbing.
The term has shifted meaning over time. A decade ago a chatbot framework meant something like Microsoft Bot Framework or IBM Watson Assistant, built around rule-based dialog trees. Today it can also mean an LLM orchestration library like LangChain or LlamaIndex, or a hosted platform like SiteSpeak that wraps the whole stack and abstracts the code away.
What a Chatbot Framework Provides
Most frameworks bundle some combination of:
- Message handling: routing inbound messages from one or more channels into the bot.
- Dialog management: tracking conversation state, slots, and follow-ups.
- NLU and intent classification: interpreting what the user wants.
- Integration points: function calling, webhooks, and APIs to do real work outside the bot.
- Channel adapters: ready-made connectors for web widgets, Slack, WhatsApp, Messenger, SMS, and voice.
- Testing and evaluation: tools to run conversations through regression suites.
Modern frameworks also ship with, or integrate cleanly with, a large language model and a vector database for grounded responses.
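The core loop behind the first three bullets can be made concrete. The sketch below is illustrative, not any particular framework's API: a toy keyword-based intent classifier stands in for a trained NLU model, and an in-memory dataclass stands in for a persistent session store.

```python
import re
from dataclasses import dataclass, field

# Toy sketch of the loop a chatbot framework runs for you: classify
# intent, update dialog state, fill slots, pick a response. All names
# here are hypothetical; real frameworks swap in trained models.

INTENT_KEYWORDS = {
    "greet": {"hi", "hello", "hey"},
    "order_status": {"order", "tracking", "shipped"},
}

@dataclass
class DialogState:
    """Per-conversation state the dialog manager tracks."""
    slots: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

def classify_intent(text: str) -> str:
    words = set(re.findall(r"[a-z]+", text.lower()))
    for intent, keywords in INTENT_KEYWORDS.items():
        if words & keywords:
            return intent
    return "fallback"

def handle_message(state: DialogState, text: str) -> str:
    intent = classify_intent(text)
    state.history.append((text, intent))
    if intent == "greet":
        return "Hello! How can I help?"
    if intent == "order_status":
        # Slot filling: ask for missing information before acting.
        if "order_id" not in state.slots:
            return "Sure - what's your order number?"
        return f"Order {state.slots['order_id']} is on its way."
    return "Sorry, I didn't catch that."
```

The point of a framework is that everything above, plus persistence, channels, and retries, comes prebuilt; you supply only the intents and responses.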
Categories of Chatbot Framework
Open-source developer frameworks
Rasa, Botpress (self-hosted), and the older Microsoft Bot Framework sit here. You write code, control the data pipeline end to end, and get full flexibility in exchange for taking on the operational burden. Good choice for teams with specialised requirements or strict data residency needs.
LLM orchestration libraries
LangChain, LlamaIndex, Semantic Kernel, and the Vercel AI SDK are the modern alternative. They do not ship a UI or a channel layer, but they handle the hard parts of wiring an LLM together with a retrieval-augmented generation (RAG) pipeline, tool use, and agent memory. Best for developer teams that want a custom bot experience.
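What these libraries automate can be shown framework-free. The sketch below uses naive word-overlap scoring where a real pipeline would use embeddings and a vector store, and it stops at prompt assembly rather than inventing an LLM API; it is a sketch of the RAG pattern, not of any library's actual interface.

```python
import re

# Minimal retrieval-augmented generation skeleton: retrieve relevant
# documents, then assemble a grounded prompt for the LLM. Scoring is
# naive word overlap purely for illustration; production pipelines use
# embedding similarity against a vector store.

DOCS = [
    "Refunds are processed within 5 business days.",
    "Our support team is available 9am-5pm on weekdays.",
]

def tokenize(text: str) -> set:
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(question: str, docs: list, k: int = 1) -> list:
    """Return the k documents sharing the most words with the question."""
    scored = sorted(
        docs,
        key=lambda d: len(tokenize(d) & tokenize(question)),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question: str, docs: list) -> str:
    context = "\n".join(retrieve(question, docs))
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )
```

An orchestration library adds the pieces this sketch omits: chunking, embedding, tool calling, memory, and streaming, behind a consistent interface.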
Hosted chatbot platforms
SiteSpeak, Intercom Fin, Ada, and many others. These bundle a trained LLM, a knowledge base, a chat widget, analytics, and team tools into a single product. You hand over configuration rather than writing code. The right choice when the goal is shipping an AI chatbot for a website or a support team quickly, without hiring ML engineers.
How to Choose a Chatbot Framework
A few questions cut through the marketing:
- Who is going to maintain this? If the answer is non-developers, a hosted platform almost always wins.
- Where does the data need to live? Regulated industries often rule out multi-tenant SaaS.
- How custom is the behaviour? If most answers come from a website and a FAQ, a hosted platform is faster. If the bot has to orchestrate half a dozen internal APIs with complex logic, a developer framework or an orchestration library gives more control.
- What is the support and observability story? Production chatbots fail silently all the time. The framework needs good logging, a way to replay conversations, and useful metrics on AI hallucination, escalation, and resolution.
- Lock-in and portability: can you export conversations and training data if you decide to switch?
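The observability question above is easier to evaluate with a concrete record in mind. This is a hypothetical schema, not any vendor's format: the minimum per-exchange data needed to replay conversations and compute escalation and resolution metrics later.

```python
import json
from dataclasses import dataclass, asdict

# A hypothetical per-exchange log record for a production chatbot.
# Field names are illustrative; the point is that each exchange is
# captured with enough context to replay it and aggregate metrics.

@dataclass
class ConversationEvent:
    conversation_id: str
    user_message: str
    bot_reply: str
    retrieved_sources: list   # what grounded the answer (hallucination audits)
    escalated_to_human: bool
    resolved: bool

def log_event(event: ConversationEvent) -> str:
    # Emit structured JSON so a metrics pipeline can aggregate it later.
    return json.dumps(asdict(event))
```

If a framework cannot produce something like this per exchange, debugging silent failures in production becomes guesswork.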
Typical Architecture
Regardless of which framework you pick, the working architecture looks roughly the same:
- Channel layer (widget, messaging apps)
- Orchestration layer (framework)
- LLM and embedding models
- Vector store and knowledge base
- Integration layer for business systems
- Analytics and observability
SiteSpeak provides the last five pieces out of the box, tightly integrated. Developer-focused frameworks give you the building blocks and expect you to wire the stack together.
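The layered stack above can be expressed as a handful of small interfaces that a framework either implements for you or expects you to supply. This is an illustrative mapping, not any product's API; the interface and method names are invented for the sketch.

```python
from typing import Protocol

# Illustrative interfaces for the layers above. A hosted platform
# implements all of them; a developer framework hands you some and
# expects you to wire the rest together.

class Channel(Protocol):            # channel layer (widget, messaging apps)
    def receive(self) -> str: ...
    def send(self, reply: str) -> None: ...

class Retriever(Protocol):          # vector store and knowledge base
    def search(self, query: str) -> list: ...

class Model(Protocol):              # LLM and embedding models
    def generate(self, prompt: str) -> str: ...

def orchestrate(channel: Channel, retriever: Retriever, model: Model) -> None:
    """The orchestration layer: tie the other layers into one turn."""
    message = channel.receive()
    context = "\n".join(retriever.search(message))
    channel.send(model.generate(f"{context}\n\nUser: {message}"))
```

Analytics and the business-system integration layer would hook into `orchestrate` as additional dependencies; they are omitted here to keep the sketch short.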