
Open WebUI explained simply – Data protection-compliant ChatGPT for businesses in 5 minutes

Jérémie Constant, co-founder of novalutions

In this article, we show you how to set up a data protection-compliant, ChatGPT-like tool in your company in just a few minutes – completely on your own (for example, German) servers, without API access or data transfer to third parties. With Open WebUI and AI models run locally via Ollama, you can provide your workforce with a powerful, secure AI solution that is simple, flexible and GDPR-compliant.

Even two years after the start of the generative AI wave, we continue to see the same picture in companies: employees have long been using tools such as ChatGPT, Claude and other AI chatbots – but often with private accounts, on their own initiative and without clear guidelines.

The reason for this?
In many cases, it is the uncertainty surrounding data protection. Companies are reluctant to release official solutions for fear that sensitive data could fall into the wrong hands or violate internal compliance rules.
This is understandable – but it is precisely because of this that a lot of potential remains untapped:
AI could have long since been used to assist with writing emails, structuring knowledge, or quickly answering internal questions. However, without a secure, data protection-compliant solution, many places continue to work manually.

The good news: there are now simple ways to operate ChatGPT alternatives within your own company in compliance with data protection regulations – without API access to US servers or legal grey areas. In this article, we show you how to set up your own internal "ChatGPT" with Open WebUI in just a few minutes – secure, local, flexible and intuitive for your teams.


What is Open WebUI – and why is it exciting for businesses?

Open WebUI is an open source interface that allows companies to use generative AI securely and in compliance with data protection regulations in their own network - without any external API access or cloud connection. The interface is modelled on ChatGPT and is so simple that no prior technical knowledge is required. In combination with a local model runtime such as Ollama and models such as Mistral, the entire application runs directly on the company's own servers, ideally in Germany or the EU. In this way, companies retain full control over their data and create a legally secure basis for the use of AI. We have summarised the advantages and disadvantages of local AI solutions in this article.

Open WebUI offers a flexible and customisable solution, especially for organisations that want to make AI available internally, for example for text creation, knowledge management or internal assistance functions. It can be seamlessly integrated into existing processes and forms the basis for the confident, future-proof use of generative AI within the organisation.

The most important advantages of Open WebUI at a glance:

Open WebUI offers companies a flexible and secure solution for the use of AI technologies. The most important advantages at a glance:

Data protection and security first and foremost

Complete data control through own hosting
You can run Open WebUI on your own servers – for example, in German data centres, or even on a Mac Mini (we also use this solution internally). This means that all sensitive company data remains under your control and never leaves your IT infrastructure.

Maximum flexibility for AI integration

Free choice of AI providers
Open WebUI gives you complete freedom of choice: use external services such as OpenAI, rely on local AI models, or combine different models with a hybrid approach. An API connection is optional, not mandatory.
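As a sketch of what this hybrid approach can look like in practice, here is a compose-file excerpt. OLLAMA_BASE_URL, OPENAI_API_BASE_URL and OPENAI_API_KEY are environment variables documented by Open WebUI; the key value is a placeholder.

```yaml
# Excerpt from a docker-compose.yml: hybrid model access.
# Leave out the OPENAI_* variables for a purely local, fully data-sovereign setup.
services:
  open-webui:
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434            # local models via Ollama
      - OPENAI_API_BASE_URL=https://api.openai.com/v1  # optional external provider
      - OPENAI_API_KEY=your-api-key-here               # placeholder
```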

User-friendliness without technical hurdles

Easy access for all employees
The intuitive web interface enables your teams to be productive immediately. No technical expertise is required – simply log in to use the AI tools.

Precise access control

Granular rights management for administrators
As an administrator, you retain complete control over which employees have access to which functions and AI models. This detailed rights management ensures both security and needs-based utilisation.

Efficient teamwork

Joint development and utilisation of AI resources
Teams can create, share and continuously develop prompts, AI assistants and knowledge databases together. This promotes knowledge sharing and increases productivity throughout the entire company.

Uncomplicated implementation

Setup in a few minutes
Thanks to platforms such as elestio, you can put Open WebUI into operation without any complex technical configuration. The solution is ready for use in just a few minutes and requires no special IT set-up.


We configure Open WebUI and local AI solutions for companies

Open WebUI brings order to AI usage – one central, secure user interface that makes employees more productive right away. We will show you how to establish Open WebUI in your company.

info@novalutions.de

0221 - 29245920

To the digital appointment booking

Setting up Open WebUI with Docker and Ollama - step by step

The self-hosted solution with Docker and Ollama offers maximum control over your AI infrastructure and guarantees complete data protection. This guide will take you through the entire setup process.

Prerequisites

Before you start, make sure that the following components are available on your server:

  • Docker and Docker Compose (latest version)
  • At least 8 GB RAM (16 GB recommended for larger models)
  • 50+ GB free storage space for models
  • GPU support (optional, but recommended for better performance)
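If you want to check these prerequisites quickly, a small shell snippet can help. This is a rough sketch for Linux; the thresholds mirror the recommendations above.

```shell
# Rough pre-flight check (Linux; reads /proc/meminfo and df)
DOCKER_OK=$(command -v docker >/dev/null 2>&1 && echo yes || echo no)
RAM_GB=$(awk '/MemTotal/ {printf "%d", $2/1024/1024}' /proc/meminfo)
DISK_GB=$(df -Pk / | awk 'NR==2 {printf "%d", $4/1024/1024}')
echo "Docker installed: $DOCKER_OK"
echo "Total RAM:        ${RAM_GB} GB (8 GB minimum, 16 GB recommended)"
echo "Free disk on /:   ${DISK_GB} GB (50+ GB recommended for models)"
```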

Step 1: Install Ollama with Docker

First create a working directory and the necessary Docker Compose configuration:

mkdir open-webui-setup
cd open-webui-setup

Create a docker-compose.yml file:

version: '3.8'

services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    volumes:
      - ollama_data:/root/.ollama
    ports:
      - "11434:11434"
    environment:
      - OLLAMA_ORIGINS=http://localhost:3000,http://127.0.0.1:3000
    restart: unless-stopped

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    depends_on:
      - ollama
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
      - WEBUI_SECRET_KEY=your-secret-key-here
    volumes:
      - open_webui_data:/app/backend/data
    restart: unless-stopped

volumes:
  ollama_data:
  open_webui_data:
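The WEBUI_SECRET_KEY value in the compose file is only a placeholder. One way to generate a strong random key, assuming openssl is installed:

```shell
# Generate a random 256-bit key, hex-encoded, for WEBUI_SECRET_KEY
SECRET_KEY=$(openssl rand -hex 32)
echo "WEBUI_SECRET_KEY=${SECRET_KEY}"
```

Paste the printed value into the docker-compose.yml before starting the services.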

Step 2: Start services

Start both containers with a single command:

docker-compose up -d

The services are now available: Ollama at http://localhost:11434 and Open WebUI at http://localhost:3000.

Step 3: Download the first AI model

Download a first language model. The compact but powerful Llama 3.2 is ideal for getting started:

docker exec -it ollama ollama pull llama3.2:3b

For better performance with sufficient RAM, you can also install larger models (note: the 8B and 70B variants belong to the Llama 3.1 family):

# For advanced applications
docker exec -it ollama ollama pull llama3.1:8b

# For maximum performance (requires a lot of RAM)
docker exec -it ollama ollama pull llama3.1:70b

Step 4: Configure Open WebUI

  1. Create admin account: Open http://localhost:3000 in the browser and create the first user account (it is automatically created as administrator).
  2. Check model availability: In Open WebUI, the downloaded models should be recognised automatically and displayed in the model selection.
  3. Manage users and authorisations: You can create additional user accounts and define access rights via the admin settings.

Step 5: Set up network access for the team

Customise the Docker Compose configuration for access from the local network:

# In the docker-compose.yml under open-webui:
ports:
  - "0.0.0.0:3000:8080" # Access from the entire network

Add a firewall rule (example for Ubuntu):

sudo ufw allow 3000/tcp

Your teams can now access the interface via the server IP address: http://[Server-IP]:3000
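To find out which address to share with the team, you can query the server's primary IP. This is a sketch: hostname -I is Linux-specific, and SERVER_IP_HERE is just a placeholder fallback.

```shell
# Determine the server's primary IPv4 address (Linux)
SERVER_IP=$(hostname -I 2>/dev/null | awk '{print $1}')
URL="http://${SERVER_IP:-SERVER_IP_HERE}:3000"
echo "Open WebUI address for the team: $URL"
```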

Step 6: Further models and customisations

Install additional models:

# Coding-specialised models
docker exec -it ollama ollama pull codellama

# Multilingual models (also suitable for German)
docker exec -it ollama ollama pull mistral

# Specialised models for different tasks
docker exec -it ollama ollama pull deepseek-coder

Performance optimisation:

  • For GPU availability: add deploy.resources.reservations.devices to the Docker Compose configuration
  • Increase the model cache by adjusting the Ollama environment variables
  • Regular updates via docker-compose pull && docker-compose up -d
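The GPU reservation mentioned above can look like this in the docker-compose.yml; this is the standard Docker Compose syntax for NVIDIA GPUs:

```yaml
# Excerpt: reserve all available NVIDIA GPUs for the ollama service
services:
  ollama:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```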

Step 7: Backup and maintenance

Backup of the data:

# Back up the volumes
docker run --rm -v open-webui-setup_open_webui_data:/data -v $(pwd):/backup alpine tar czf /backup/webui-backup.tar.gz /data
docker run --rm -v open-webui-setup_ollama_data:/data -v $(pwd):/backup alpine tar czf /backup/ollama-backup.tar.gz /data
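A matching restore step is not shown above; here is a sketch for the archives created by the backup commands. It is guarded so that it only runs where Docker and the archive files actually exist.

```shell
# Restore sketch: unpack a backup archive back into a named Docker volume.
# The volume and archive names match the backup commands above.
restore_volume() {
  volume="$1"; archive="$2"
  if command -v docker >/dev/null 2>&1 && [ -f "$archive" ]; then
    docker run --rm -v "${volume}:/data" -v "$(pwd)":/backup alpine \
      sh -c "rm -rf /data/* && tar xzf /backup/${archive} -C /"
    echo "restored $volume from $archive"
  else
    echo "skipped $volume (Docker or $archive not available)"
  fi
}
restore_volume open-webui-setup_open_webui_data webui-backup.tar.gz
restore_volume open-webui-setup_ollama_data ollama-backup.tar.gz
```

Stop the containers before restoring so that no service writes to the volumes during extraction.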

Perform updates:

docker-compose pull
docker-compose up -d

Once you have completed these steps, you will have a fully functional, self-hosted AI solution that fulfils the highest data protection standards while offering maximum flexibility.

Conclusion: Using AI in the company in compliance with data protection regulations - easier than expected

The use of AI in companies does not have to involve legal grey areas or loss of control. With solutions like Open WebUI, a powerful, intuitive AI chatbot can be integrated into your own infrastructure quickly and in compliance with data protection regulations - either via elestio with just a few clicks or by hosting it on your own company server.

Particularly in sensitive sectors such as law, finance, healthcare or industry, a locally hosted system can offer enormous advantages: text creation, internal knowledge databases, code support or the automation of routine tasks - all of this is possible without sensitive data leaving the company. But even with data protection-compliant tools such as Open WebUI, the following applies: a good solution is not created by the technology alone, but by the right implementation.

Which language model suits your team? Is the existing infrastructure sufficient? How can internal processes be efficiently combined with local AI - without increasing complexity? This is exactly where we come in.

We support you in developing the right AI architecture for your company - pragmatically, with data protection in mind and with a clear focus on your specific use case. Whether it's choosing the right models, setting up on German servers or training your teams - together we will bring AI into your company in a meaningful and sustainable way.
