20250525-litellm.jpeg

<aside> 💡

Instructions for installing LiteLLM using Docker Compose with a database to persist information, and for using its WebUI to set up providers, models, and a budget. Also covers integrating the tool with OpenWebUI.

</aside>

Revision: 20250525-0 (init: 20250525)

LiteLLM is a lightweight Python library that streamlines Large Language Model (LLM) integration, offering a unified OpenAI-compatible API proxy to access popular LLM providers. A key feature is its built-in cost tracking and budget management, which helps monitor and control expenses, allowing teams to set spending limits.

https://docs.litellm.ai/docs/simple_proxy

The LiteLLM Proxy Server acts as an OpenAI API-compatible gateway, enabling seamless access to multiple LLMs. Teams can use the proxy to centralize model management, enforce usage policies, and generate virtual API keys with rate limits and budget controls for different users or projects. Its proxy UI further simplifies spend tracking and team management, making it easy to invite users, allocate resources, and monitor consumption in real time.

https://docs.litellm.ai/docs/proxy/ui

In the following, we will not focus on creating a team or setting a budget, as the official documentation covers those. Instead, we will discuss LiteLLM's integration with Dockge and OpenWebUI, both of which were set up in previous posts:

Dockge (Rev: 20251129-0)

Ollama with OpenWebUI (Rev: 20240730-0)

LiteLLM setup

The compose.yaml and example .env files for this Docker Compose deployment are available at:

geekierblog-artifacts/20250524-litellm at main · mmartial/geekierblog-artifacts

Deploy this compose.yaml and its corresponding .env file, noting that:

Traefik Proxy (Rev: 20251213-0)

HomePage: Services Dashboard (Rev: 20251129-0)
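As a rough sketch of what such a deployment looks like (refer to the repository above for the exact files; the image tag, port mapping, and variable names here are illustrative placeholders):

```yaml
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-stable
    restart: unless-stopped
    ports:
      - "4000:4000"            # LiteLLM proxy's default port
    environment:
      LITELLM_MASTER_KEY: ${LITELLM_MASTER_KEY}   # admin key; also the WebUI password
      LITELLM_SALT_KEY: ${LITELLM_SALT_KEY}       # used to encrypt provider keys stored in the DB
      DATABASE_URL: postgresql://litellm:${POSTGRES_PASSWORD}@db:5432/litellm
    depends_on:
      - db

  db:
    image: postgres:16
    restart: unless-stopped
    environment:
      POSTGRES_USER: litellm
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: litellm
    volumes:
      - ./db:/var/lib/postgresql/data
```

The secrets (LITELLM_MASTER_KEY, LITELLM_SALT_KEY, POSTGRES_PASSWORD) belong in the .env file next to the compose.yaml, which keeps them out of the stack definition itself.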

CleanShot 2025-05-25 at 16.18.24.png

To access the WebUI, go to the base URL ending in /ui (or use the link on the page) to reach the configuration frontend. The admin password is the LITELLM_MASTER_KEY value.

CleanShot 2025-05-25 at 16.20.50.png

In the following steps, configure access to providers and models first, then set up a team and a budget:
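As an alternative to the WebUI, models can also be declared in a config.yaml passed to the proxy; the `os.environ/` prefix tells LiteLLM to read the key from the container's environment. A minimal sketch (the model names are examples only):

```yaml
model_list:
  - model_name: gpt-4o                      # name exposed to clients of the proxy
    litellm_params:
      model: openai/gpt-4o                  # provider/model in LiteLLM's notation
      api_key: os.environ/OPENAI_API_KEY    # read from the environment, not stored in the file
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-latest
      api_key: os.environ/ANTHROPIC_API_KEY
```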

CleanShot 2025-05-25 at 16.51.59.png

CleanShot 2025-05-25 at 16.56.20.png

OpenWebUI integration

Installation of OpenWebUI was covered in the post linked above.

From the Admin Panel, in Settings -> Connections -> Manage OpenAI API Connections, add (+) a new entry:

CleanShot 2025-05-25 at 17.06.17.png
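The entry follows the usual OpenAI connection convention; as a sketch (the hostname is an assumption, adapt it to your deployment):

```
API Base URL: https://litellm.example.com/v1
API Key:      <a virtual key generated in the LiteLLM UI>
```

Using a virtual key rather than the master key means OpenWebUI only sees the models, rate limits, and budget attached to that key.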

Once added (with the Ollama API disabled), the Settings -> Models list will show the full list of models this API key is allowed to use.

After this, the added models will appear in the External Models list when starting a New Chat. After a few prompts, usage will be reflected on the LiteLLM dashboard, including the team's Cost and Model Activity.
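Usage can also be generated outside OpenWebUI by talking to the proxy's OpenAI-compatible endpoint directly. A minimal stdlib sketch, assuming the default port 4000 and a placeholder virtual key:

```python
import json
import urllib.request

LITELLM_BASE = "http://localhost:4000"   # assumption: LiteLLM's default port
API_KEY = "sk-virtual-key"               # placeholder: a virtual key from the LiteLLM UI

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible /v1/chat/completions request for the proxy."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{LITELLM_BASE}/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

# To actually send it (requires the proxy to be running):
# with urllib.request.urlopen(build_chat_request("gpt-4o", "Hello")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Each such request is attributed to the virtual key, so it shows up in the same Usage views as the OpenWebUI traffic.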
