
# ABI Deployment Thesis - Docker Engine

Welcome to the docker-engine repository! This project is a key part of a master's thesis at the University of Minho. It's a Proof of Concept for a proposed architecture designed to deploy and integrate intelligent models within Adaptive Business Intelligence (ABI) systems.

This repository provides a microservice for running intelligent models through the Docker daemon.

For a detailed explanation of the proposed architecture and its deployment strategy, please refer to the published article: *Architecture proposal for deploying and integrating intelligent models in ABI*.

## Quick Start

### Networking

- **Ingress**
  - Consumes data from a RabbitMQ queue.
  - Retrieves data from storage.
- **Egress**
  - Communicates with the model-runner microservice to send data about intelligent model runs.
- **Both**
  - Communicates with the Docker daemon to build images and retrieve information about them.
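The ingress path above can be sketched as a small consumer callback. This is a hedged illustration, not the repository's actual code: the queue name `model-runs`, the message fields, and the `ModelRunJob` structure are assumptions; the pika wiring appears only in comments so the parsing logic stays runnable on its own.

```python
import json
from dataclasses import dataclass


@dataclass
class ModelRunJob:
    """Assumed shape of a queued model-run request (illustrative only)."""
    model_name: str
    image_tag: str
    storage_key: str  # where the input data lives in storage


def parse_job(body: bytes) -> ModelRunJob:
    """Decode a RabbitMQ message body into a ModelRunJob.

    In the real service this parsing would run inside the consumer
    callback registered with pika, e.g.:

        channel.basic_consume(
            queue="model-runs",
            on_message_callback=lambda ch, method, props, body: handle(parse_job(body)),
        )
    """
    payload = json.loads(body)
    return ModelRunJob(
        model_name=payload["model_name"],
        image_tag=payload["image_tag"],
        storage_key=payload["storage_key"],
    )
```

Keeping the decode step as a pure function like this makes it easy to unit-test without a running broker.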

### Development

```shell
# Generate gRPC client and server code
python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. ./client/protos/ModelRunnerService.proto

# Start RabbitMQ
docker run -it --rm --name rabbitmq -p 5672:5672 -p 15672:15672 rabbitmq:management-alpine
```
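After a job is consumed, the engine asks the Docker daemon to build the model's image. A minimal sketch of that step, shelling out to the `docker` CLI; the tag scheme and build-context layout here are assumptions, not the repository's actual conventions:

```python
import subprocess


def build_command(model_name: str, version: str, context_dir: str) -> list[str]:
    """Compose the `docker build` argv for a model image.

    Docker image tags must be lowercase, so this (assumed) scheme
    normalises the model name before appending the version.
    """
    tag = f"{model_name.lower()}:{version}"
    return ["docker", "build", "-t", tag, context_dir]


def build_image(model_name: str, version: str, context_dir: str) -> None:
    """Run the build against the local Docker daemon (requires a running daemon)."""
    subprocess.run(build_command(model_name, version, context_dir), check=True)
```

Separating command construction from execution keeps the tagging logic testable without a Docker daemon available.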

## Author

## License