An extensible, self-hosted web interface for interacting with a variety of LLMs (such as ChatGPT, Claude, and Ollama-hosted models) in one unified platform.

Frequently asked questions
The minimum system requirements for deploying Open WebUI depend largely on whether you plan to run a local Large Language Model (LLM) through the bundled Ollama runtime or connect to an external API. The Open WebUI interface itself has very modest requirements.
For the Open WebUI interface only, using an external API, the requirements are:
RAM: 1 GB for the container when idle.
CPU: 1 CPU core.
Storage: 10 GB of free space for the Docker installation and container.
To run models locally with the integrated Ollama system, requirements increase with the size of the models.
For more detailed information, please refer to the installation guide.
Our 30-day free trial includes one server worth up to $100. You can set up your free VPS server, install an operating system, and select a location from one of our 20+ data centers worldwide.
If you choose monthly billing, you will receive your first invoice in the month after the free trial expires. For example, if you start your free trial on November 20, the trial period runs until December 20. If you continue using our services and don’t terminate your server, your first invoice will be sent out after January 1. That invoice will include a prorated charge for December 20–31 as well as the full month of January.
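To illustrate the proration described above, here is a minimal sketch of the first-invoice arithmetic. The monthly rate is a hypothetical figure chosen for illustration; your actual rate depends on the server configuration you select.

```python
from datetime import date

# Hypothetical monthly rate for illustration only (USD).
MONTHLY_RATE = 50.00

trial_end = date(2024, 12, 20)   # trial started November 20
month_end = date(2024, 12, 31)

# Billable days in December: December 20 through 31, inclusive.
prorated_days = (month_end - trial_end).days + 1        # 12 days
december_charge = MONTHLY_RATE * prorated_days / 31     # pro rata share
january_charge = MONTHLY_RATE                           # full month

first_invoice = december_charge + january_charge
print(f"First invoice total: ${first_invoice:.2f}")
```

With these assumed numbers, the prorated December portion covers 12 of 31 days, and the invoice total is that fraction of the monthly rate plus one full month.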
Our flexible monthly and hourly pricing models allow you to keep your costs under control. If you choose an hourly server, we bill for the resources you use. You’re only charged for the time your server is running. You can see real-time usage in your dashboard, and there are no surprise charges or hidden fees.
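For hourly billing, the charge is simply the hours the server was running multiplied by the hourly rate. The rate and usage below are assumed values for illustration:

```python
# Hypothetical hourly rate for illustration only (USD).
HOURLY_RATE = 0.05

hours_running = 300  # server was powered off the rest of the month
monthly_cost = hours_running * HOURLY_RATE
print(f"Cost for {hours_running} running hours: ${monthly_cost:.2f}")
```

A server that is stopped accrues no hourly compute charges, which is what makes this model useful for intermittent workloads.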
