Deploying Ollama + Open WebUI App
Let's deploy a "Use Case": multiple server app templates deployed together. For this guide we will deploy the Ollama LLM framework together with the Open WebUI frontend.
This is a powerful combination of two apps that lets you access and interact with powerful LLMs directly from your web browser, and switch between industry-leading open source models such as Llama 3.1 and Mistral. This template streamlines the process of getting an LLM and a GenAI chatbot frontend running on your bare metal server!
Make sure to connect your web3 wallet and create a session, then visit the App Store from within your Xnode Studio dashboard and select the "Use-Cases" tab at the top of the page.
Once you have selected the Ollama + Open WebUI Use Case, you need to choose your deployment target. For this guide we will use a bare metal Xnode we have previously redeemed.
Selecting the DVM will then deploy it, with an estimated time of 2 minutes. You can also select the "Go to Deployment" button to view its live deployment status.
Going back to the bare metal page, you can see which apps are deployed on your server.
Because we selected a Use Case, two app templates have been deployed: Ollama and Open WebUI.
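Besides the Open WebUI frontend, you can also talk to the Ollama backend directly over its REST API. The sketch below is a minimal example, assuming Ollama listens on its default port 11434 and that you have a model such as `llama3.1` available; substitute your Xnode's address and model name as appropriate.

```python
import json
import urllib.request

# Assumption: Ollama is reachable at its default port on the server.
# Replace localhost with your Xnode's address if querying remotely.
OLLAMA_URL = "http://localhost:11434"

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a token stream
    }).encode()
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example (requires a running Ollama instance with the model pulled):
# with urllib.request.urlopen(build_generate_request("llama3.1", "Hello!")) as resp:
#     print(json.loads(resp.read())["response"])
```

A quick way to list the models your instance has pulled is a GET request to `/api/tags` on the same port.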
Selecting the arrow icon will take you to your chatbot frontend's URL.