Open WebUI : Install (2025/02/13)
Install Open WebUI, which lets you run LLMs from a web UI. Open WebUI can be installed easily with pip3; in this example, however, we run it in a container. |
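For reference, the pip3-based installation mentioned above looks roughly like this. This is a sketch that is not used in the rest of this article; it assumes a suitable Python environment is available on the host.

```shell
# Alternative install with pip3 (not used in the container-based steps below).
pip3 install open-webui

# Start the server; it listens on port 8080 by default.
open-webui serve
```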
[3] | Pull and start the Open WebUI container image. |
[root@dlp ~]# podman pull ghcr.io/open-webui/open-webui:main
[root@dlp ~]# podman images
REPOSITORY                      TAG    IMAGE ID       CREATED      SIZE
ghcr.io/open-webui/open-webui   main   a088eea70396   7 days ago   4.33 GB

[root@dlp ~]# podman run -d -p 3000:8080 --add-host=host.containers.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
[root@dlp ~]# podman ps
CONTAINER ID  IMAGE                               COMMAND        CREATED        STATUS        PORTS                   NAMES
7983190066ff  ghcr.io/open-webui/open-webui:main  bash start.sh  4 seconds ago  Up 4 seconds  0.0.0.0:3000->8080/tcp  open-webui
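Before switching to a browser, you can verify from the server itself that the container answers on the published port. This is a sketch assuming the 3000:8080 port mapping used above:

```shell
# Probe the published port; prints a status line either way.
if curl -s -o /dev/null --max-time 5 http://localhost:3000/; then
    echo "Open WebUI is reachable on port 3000"
else
    echo "no response on port 3000 (the container may still be starting)"
fi
```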
[4] | Start a web browser on a client computer and access Open WebUI on port 3000, the port published above. When you access the application, the following screen appears. Click [Get started] to proceed. |
[5] | On first access, you need to create an administrator account. Enter the required information and click [Create Admin Account]. |
[6] | Once your admin account is created, the Open WebUI default page will be displayed. |
[7] | Next time, you can log in with your registered email address and password. |
[8] | You can add more users from the account management screen while logged in as the administrator. |
[9] | To chat, select a model you have loaded into Ollama from the menu at the top, enter a message in the box below, and you will receive a reply. |
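The same chat can also be driven without a browser, since Open WebUI exposes an OpenAI-compatible HTTP API. The sketch below illustrates the call shape only: the endpoint path, the placeholder API key, and the model name "llama3" are assumptions not taken from this article, and you would first generate a real API key from your account settings in Open WebUI.

```shell
# Send one chat message to a model through Open WebUI's API.
# YOUR_API_KEY and "llama3" are placeholders, not values from this article.
curl -s http://localhost:3000/api/chat/completions \
     -H "Authorization: Bearer YOUR_API_KEY" \
     -H "Content-Type: application/json" \
     -d '{"model": "llama3", "messages": [{"role": "user", "content": "Hello"}]}'
```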