System Overview
The Jeen backend is a distributed system of 13 microservices plus an MCP tool ecosystem. Services are grouped into four layers.
Layer 1: Client-Facing (BFF)
These services receive requests from the frontend.
| Service | Role |
|---|---|
| user-service | Aggregates data from downstream services. Handles user profiles, sharing, tags, pins, favorites, admin settings, Langflow flows, documents, and activity logs. Owns no database; all persistence is delegated to downstream services. |
| auth-service | Authenticates users via Firebase or ZITADEL, issues JWT tokens, manages sessions in Redis. |
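
A minimal sketch of the auth-service flow described above, assuming a session is a signed JWT plus a Redis record; the claim names, TTL, and key layout are illustrative guesses, not the actual implementation.

```python
# Hypothetical sketch of the auth-service flow: issue a JWT and record the
# session in Redis. Claim names, TTL, and key layout are assumptions.
import time
import uuid

import jwt    # PyJWT
import redis

SESSION_TTL_SECONDS = 60 * 60            # assumed 1-hour session
JWT_SECRET = "replace-with-real-secret"  # assumed HS256 signing key

r = redis.Redis(host="redis", port=6379, decode_responses=True)

def issue_session(user_id: str, org_id: str) -> str:
    """Create a signed JWT and register the session in Redis."""
    session_id = str(uuid.uuid4())
    claims = {
        "sub": user_id,
        "org": org_id,
        "sid": session_id,
        "iat": int(time.time()),
        "exp": int(time.time()) + SESSION_TTL_SECONDS,
    }
    token = jwt.encode(claims, JWT_SECRET, algorithm="HS256")
    # The session record lets auth-service revoke a token before it expires.
    r.setex(f"session:{session_id}", SESSION_TTL_SECONDS, user_id)
    return token
```
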
Layer 2: Core Business Logic
These services implement the platform's AI capabilities.
| Service | Role |
|---|---|
| llm-core | Central orchestration. Manages conversations, agents, model/provider catalog, templates, canvas, text conversions, and LLM usage tracking. Calls completion-service for LLM responses and agent-service for agentic execution. |
| completion-service | LLM proxy. Routes completion requests to the correct provider (OpenAI, Azure, Anthropic, Google, Mistral, Jamba, Ollama, vLLM, or custom). Handles streaming. Emits token usage events to RabbitMQ. |
| agent-service | Runs agentic loops. Receives a message plus a set of allowed tools, calls completion-service for LLM reasoning, executes tools via MCP, feeds results back, and repeats until done (max 15 iterations); a loop sketch follows this table. |
| document-service | Orchestrates the full document lifecycle: upload to Azure Blob/S3, trigger parsing via RabbitMQ, trigger embedding via RabbitMQ, track status, serve downloads, manage folders. |
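
The agent-service loop referenced above is sketched below. It assumes plain JSON HTTP endpoints on completion-service and mcp-client-service; the paths, payload fields, and stop condition are assumptions, while the 15-iteration cap comes from the table.

```python
# Illustrative agentic loop. Endpoint paths and message fields are guesses;
# only the iteration cap is taken from the service description.
import requests

COMPLETION_URL = "http://completion-service/v1/complete"   # assumed endpoint
MCP_CALL_TOOL_URL = "http://mcp-client-service/call-tool"  # assumed endpoint
MAX_ITERATIONS = 15  # cap stated in the table above

def run_agent(message: str, allowed_tools: list[str]) -> str:
    history = [{"role": "user", "content": message}]
    for _ in range(MAX_ITERATIONS):
        # Ask the LLM (via completion-service) what to do next.
        step = requests.post(
            COMPLETION_URL,
            json={"messages": history, "tools": allowed_tools},
            timeout=60,
        ).json()

        if step.get("tool_call") is None:
            return step["content"]  # model answered directly; loop is done

        # Execute the requested tool through the MCP gateway and feed the
        # result back into the conversation for the next iteration.
        result = requests.post(
            MCP_CALL_TOOL_URL,
            json={"tool": step["tool_call"]["name"],
                  "arguments": step["tool_call"]["arguments"]},
            timeout=120,
        ).json()
        history.append({"role": "assistant", "tool_call": step["tool_call"]})
        history.append({"role": "tool", "content": result})

    return "Agent stopped after reaching the iteration limit."
```
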
Layer 3: Processing Services
These services do the heavy computational work.
| Service | Role |
|---|---|
| parser-service | Converts documents to markdown. Supports 4 parser backends: Azure Document Intelligence, PyMuPDF, MinerU, Marker. Runs an HTTP API and a RabbitMQ worker. |
| embedding-service | Chunks text, generates vector embeddings (OpenAI/Azure OpenAI), optionally translates to English. Runs an HTTP API and a RabbitMQ worker. |
| rag-service | Queries the vector store. Embeds the user query, runs pgvector cosine similarity search, optionally reranks results (BGE model or LLM-based). |
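
A rough sketch of the rag-service query path just described: embed the query, then run a pgvector cosine-similarity search. The schema (a `chunks(content, embedding)` table), embedding model, and connection string are assumptions; the optional rerank step (BGE or LLM-based) would reorder these results before they are returned.

```python
# Hypothetical rag-service query path: embed the question, then search
# pgvector by cosine distance. Schema, model, and DSN are assumptions.
import psycopg2
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

def search(query: str, top_k: int = 5) -> list[tuple[str, float]]:
    # 1. Embed the user query (model name is an assumption).
    embedding = client.embeddings.create(
        model="text-embedding-3-small",
        input=query,
    ).data[0].embedding

    # 2. Cosine-distance search using pgvector's <=> operator.
    with psycopg2.connect("postgresql://rag@db/documents") as conn:
        with conn.cursor() as cur:
            cur.execute(
                """
                SELECT content, 1 - (embedding <=> %s::vector) AS score
                FROM chunks
                ORDER BY embedding <=> %s::vector
                LIMIT %s
                """,
                (str(embedding), str(embedding), top_k),
            )
            return cur.fetchall()
```
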
Layer 4: Data and Administration
These services manage persistent data and configuration.
| Service | Role |
|---|---|
| user-base-ms | User data persistence. Stores users, roles, tags, favorites, pins, activity logs, features, shares, locks, integration tools, Langflow accounts, connectors, languages. |
| admin-base-ms | Organization/tenant management. Stores organizations, RBAC (roles, permissions, modules, features, actions), resource units, and per-org configuration (models, agents, connectors, templates, workflows, parsing techniques, languages). A sketch of the RBAC shape follows this table. |
| integration-service | Third-party integration gateway. Manages account provisioning, token lifecycle, and API key management for external tools (currently Langflow). |
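
One possible reading of the RBAC vocabulary stored by admin-base-ms (roles, permissions, modules, features, actions), sketched as a containment hierarchy with a permission check; the field names and check logic are illustrative assumptions, not the actual schema.

```python
# One possible shape for the RBAC data admin-base-ms stores. Field names
# and check logic are assumptions for illustration only.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Permission:
    module: str    # e.g. "documents"
    feature: str   # e.g. "folders"
    action: str    # e.g. "delete"

@dataclass
class Role:
    name: str
    permissions: set[Permission] = field(default_factory=set)

def is_allowed(roles: list[Role], module: str, feature: str, action: str) -> bool:
    """True if any of the user's roles grants the requested action."""
    wanted = Permission(module, feature, action)
    return any(wanted in role.permissions for role in roles)
```
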
MCP Ecosystem
The MCP (Model Context Protocol) layer extends the agent's capabilities with external tools.
| Component | Role |
|---|---|
| mcp-client-service | Gateway that discovers all MCP tool servers at startup, maintains a tool registry, and routes call-tool requests to the correct server (a routing sketch follows this table). |
| 7 MCP tool servers | Individual servers exposing tools: document search (RAG), spreadsheet analysis, web search, video/image generation, Python code execution, Atlassian (Jira/Confluence), and interactive UI components. |
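
A simplified sketch of the mcp-client-service gateway behavior described above: build a tool registry at startup, then route call-tool requests by tool name. The HTTP endpoints and server names are assumptions, and the real MCP wire protocol is deliberately glossed over.

```python
# Simplified gateway sketch: discover tools once, then route by tool name.
# The /list-tools and /call-tool endpoints and server names are assumptions,
# not the actual MCP protocol.
import requests

TOOL_SERVERS = [
    "http://mcp-rag-tool", "http://mcp-spreadsheet", "http://mcp-web-search",
    "http://mcp-media-gen", "http://mcp-code-exec", "http://mcp-atlassian",
    "http://mcp-ui-components",
]  # assumed names for the 7 tool servers

def build_registry() -> dict[str, str]:
    """Map each advertised tool name to the server that exposes it."""
    registry: dict[str, str] = {}
    for server in TOOL_SERVERS:
        for tool in requests.get(f"{server}/list-tools", timeout=10).json():
            registry[tool["name"]] = server
    return registry

def call_tool(registry: dict[str, str], name: str, arguments: dict) -> dict:
    """Route a call-tool request to whichever server registered the tool."""
    server = registry[name]
    resp = requests.post(f"{server}/call-tool",
                         json={"name": name, "arguments": arguments},
                         timeout=120)
    resp.raise_for_status()
    return resp.json()
```
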
High-Level Architecture

```
              Frontend / Client Apps
                         |
        +----------------+----------------+
        |                                 |
  auth-service                      user-service
(login/register)                 (BFF / aggregator)
        |                                 |
        |           +----------+----------+----------+-----------+-----------+
        |           |          |          |          |           |           |
        |       llm-core    admin-      user-    document   integration  langflow
        |           |       base-ms    base-ms   -service    -service
        |           |
        |       +---+------------+
        |       |                |
        |  completion      agent-service
        |   -service             |
        |       |       mcp-client-service
        |       |                |
        |   LLM APIs        +----+----+----+----+----+----+----+
        |                   |rag |sprd|web |veo2|code|atl.|ui  |
        |                   |tool|sht |srch|    |exec|    |comp|
        |                   +----+----+----+----+----+----+----+
        |
 +------+------+
 | Firebase    |
 | ZITADEL     |
 | Redis       |
 +-------------+

[document-service]   --RabbitMQ-->  [parser-service worker]
[document-service]   --RabbitMQ-->  [embedding-service worker]
[completion-service] --RabbitMQ-->  [llm-core]  (transactions)
```

Shared DB (document database): document-service, parser-service, embedding-service, rag-service.
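
A small sketch of the publish side of the document-service to parser-service flow shown above; the queue name and message shape are assumptions.

```python
# Hypothetical publish side of the document-service -> parser-service flow.
# Queue name, routing key, and message fields are assumptions.
import json

import pika

def enqueue_parse_job(document_id: str, blob_url: str) -> None:
    connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
    channel = connection.channel()
    # Durable queue so parse jobs survive a broker restart.
    channel.queue_declare(queue="document.parse", durable=True)
    channel.basic_publish(
        exchange="",
        routing_key="document.parse",
        body=json.dumps({"document_id": document_id, "blob_url": blob_url}),
        properties=pika.BasicProperties(delivery_mode=2),  # persistent message
    )
    connection.close()
```

The embedding flow would publish to a second queue in the same way, and the parser-service and embedding-service RabbitMQ workers listed above consume from those queues.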