LangMate is a modular and extensible AI chat application and SDK platform built with .NET 9 and fully compatible with .NET Aspire.
It provides a Blazor-powered Web UI, Ollama model integrations, persistent chat history via MongoDB, and a flexible SDK for .NET developers to integrate and use local LLMs (like Gemma, LLaMA2, Mistral) easily and securely.
LangMate/
├── Apps/
│   ├── LangMate.AppHost.AppHost - .NET Aspire host for orchestrating and deploying the apps on Docker, Kubernetes, or any other cloud platform
│   ├── LangMate.AppHost.BlazorUI - Blazor ChatBot UI sample project using the LangMate SDK
│   └── LangMate.AppHost.ApiService - Web API sample project using the LangMate SDK
└── SDK/
    ├── LangMate.Core - Reusable .NET SDK to interact with Ollama
    ├── LangMate.Middleware - Polly-based resiliency middleware (retry, timeout, circuit breaker)
    ├── LangMate.Extensions - Utilities, helpers, and extension methods
    └── LangMate.Persistence - MongoDB chat history, caching layer, repositories, and configuration
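LangMate.Middleware is described above as Polly-based. As a rough sketch of what such a resilience pipeline can look like with Polly v8 (the specific option values below are illustrative assumptions, not LangMate's actual configuration):

```csharp
// Illustrative only: a Polly v8 pipeline combining the three policies the
// middleware advertises (retry, timeout, circuit breaker). The option
// values here are assumptions, not LangMate's real settings.
using Polly;
using Polly.CircuitBreaker;
using Polly.Retry;

var pipeline = new ResiliencePipelineBuilder()
    .AddRetry(new RetryStrategyOptions
    {
        MaxRetryAttempts = 3,
        Delay = TimeSpan.FromMilliseconds(200),
        BackoffType = DelayBackoffType.Exponential, // back off between attempts
    })
    .AddTimeout(TimeSpan.FromSeconds(30))           // cap each Ollama call
    .AddCircuitBreaker(new CircuitBreakerStrategyOptions
    {
        FailureRatio = 0.5,        // open the circuit at 50% failures...
        MinimumThroughput = 10,    // ...once at least 10 calls were sampled
        BreakDuration = TimeSpan.FromSeconds(15),
    })
    .Build();

// Wrap an outbound call (the callee is a placeholder):
// await pipeline.ExecuteAsync(async ct => await CallOllamaAsync(ct));
```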
git clone https://github.com/raminesfahani/LangMate.git
cd LangMate/src/apps/LangMate.AppHost.AppHost
# Restore dependencies and build
dotnet restore
dotnet build
# Run the .NET Aspire Dashboard
dotnet run
Then open https://localhost:17198/ in your browser to view the .NET Aspire dashboard, where you can launch the Blazor ChatBot and Web API apps from their respective links.
LangMate.Core is a lightweight, extensible .NET SDK designed to make working with Ollama-powered local AI models seamless and developer-friendly. It abstracts away the complexity of managing conversations, interacting with Ollama endpoints, and persisting chat history, all while offering resiliency, caching, and extensibility.
To use LangMate.Core, install the required NuGet package, or include the project reference in your solution.
dotnet add package LangMate.Core
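As a rough sketch of what SDK usage can look like after installation (the registration method, client interface, and option names below are illustrative assumptions for this README, not the verified LangMate.Core API; consult the package documentation for the real surface):

```csharp
// Hypothetical usage sketch: names like AddLangMate and IOllamaChatClient
// are assumptions for illustration, not the confirmed LangMate.Core API.
using Microsoft.Extensions.DependencyInjection;

var services = new ServiceCollection();

// Register the SDK against a local Ollama instance (11434 is Ollama's
// default port).
services.AddLangMate(options =>
{
    options.OllamaEndpoint = "http://localhost:11434";
    options.DefaultModel = "gemma";
});

var provider = services.BuildServiceProvider();

// Resolve a chat client and stream a completion token by token.
var chat = provider.GetRequiredService<IOllamaChatClient>();
await foreach (var token in chat.StreamAsync("Explain .NET Aspire in one sentence."))
{
    Console.Write(token);
}
```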
See the full documentation for further details and sample usage.
The LangMate Blazor App is an intelligent, real-time chat UI built with Blazor Server, integrated with local AI models served by Ollama via the LangMate.Core SDK.
It provides a complete frontend experience for interacting with AI models, managing chat history, uploading files, and dynamically updating chat state.
- Chat with Ollama Models: Seamlessly send and stream messages from local Ollama instances.
- Persistent Conversations: Every chat session is stored in MongoDB and can be resumed anytime.
- File Uploads: Upload image files and pass them to models like llava for multimodal interactions.
- Sidebar Navigation: Access previous chats and start new ones from a clean sidebar UI.
- Model Switching: Easily switch between available Ollama models.
- Streaming Responses: Uses async streaming to display tokens as they're generated.
- Resilient Middleware: Protected with timeout, retry, and circuit breaker policies.
- Global Error Toasts: All unhandled exceptions surface as toast notifications.
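The streaming behavior above can be sketched roughly as follows (the `Chat` client and component members are hypothetical names used for illustration; the real UI code lives in LangMate.AppHost.BlazorUI):

```csharp
// Illustrative Blazor Server component fragment: append streamed tokens
// to the UI as they arrive. The injected Chat client is an assumption.
@code {
    private string _answer = "";

    private async Task SendAsync(string prompt)
    {
        _answer = "";
        await foreach (var token in Chat.StreamAsync(prompt))
        {
            _answer += token;
            StateHasChanged(); // re-render after each token for live output
        }
    }
}
```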
The LangMate API Service is the backend layer of the LangMate system, exposing RESTful HTTP APIs for external integration, orchestration, and automation.
It serves as a stateless gateway for interacting with the LangMate core functionalities, such as chat sessions, file uploads, model management, and streaming chat completions, powered by the LangMate.Core SDK and Ollama.
- Chat Completion API: Start or continue chat sessions, with streaming support, against local Ollama models.
- Model Discovery: Query available and pulled models from the Ollama runtime.
- Conversation APIs: Read, delete, and manage persistent chat history.
- File Upload: Upload image files to be used with multimodal models (e.g., gemma).
- Middleware-Enhanced Resilience: Protected by retry, timeout, and circuit breaker policies via LangMate.Middleware.
- Scalar Integration: Auto-generated OpenAPI documentation (easily added).
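For a feel of what calling the service might look like, here is a purely illustrative request (the route, port, and payload shape are assumptions, not the actual API contract; check the generated OpenAPI/Scalar documentation for the real endpoints):

```
# Hypothetical endpoint shown for illustration only.
curl -N https://localhost:PORT/api/chat/stream \
  -H "Content-Type: application/json" \
  -d '{ "model": "gemma", "message": "Hello!" }'
```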
dotnet build --configuration Release
dotnet test
Contributions are welcome!
Licensed under the MIT License.
Created and maintained by @raminesfahani.
For issues and features, open a GitHub Issue.
This project uses the Ollama repository for local AI model integration. Thanks to the maintainers and contributors of Ollama for making this technology available.