The UI for the inference-gateway, providing a user-friendly interface for interacting with the gateway, visualizing inference results, and managing models.
```sh
# Install dependencies
npm install

# Run development server
npm run dev
```
This application can be containerized using Docker:
```sh
# Build the Docker image locally (dev stage)
docker build -t inference-gateway-ui --target dev .

# Run the container, mounting the current directory into /app
docker run -v "$(pwd)":/app -w /app \
  -e INFERENCE_GATEWAY_URL=http://localhost:8080/v1 \
  -p 3000:3000 inference-gateway-ui
```
The application is automatically packaged as a Docker image and published to GitHub Container Registry (ghcr.io) when a new release is created.
To pull the latest release:
```sh
docker pull ghcr.io/inference-gateway/ui:latest
```
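A minimal sketch for running the pulled image, assuming the published image accepts the same `INFERENCE_GATEWAY_URL` variable and port as the local development build shown above:

```sh
# Run the published image; point it at a running inference-gateway instance
docker run \
  -e INFERENCE_GATEWAY_URL=http://localhost:8080/v1 \
  -p 3000:3000 \
  ghcr.io/inference-gateway/ui:latest
```

The UI should then be reachable at http://localhost:3000, provided the gateway is listening on port 8080.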