Welcome to the docker-engine repository! This project is part of a master's thesis at the University of Minho: a proof of concept for a proposed architecture for deploying and integrating intelligent models within Adaptive Business Intelligence (ABI) systems.
This repository provides a microservice for running intelligent models through the Docker daemon.
For a detailed explanation of the proposed architecture and its deployment strategy, please refer to the published article: Architecture proposal for deploying and integrating intelligent models in ABI.
- For setup instructions and initial configuration, please follow the guidelines provided in the infrastructure repository.
- Ingress
  - Consumes data from a RabbitMQ queue.
  - Retrieves data from storage.
- Egress
  - Communicates with the model-runner microservice to send data about intelligent model runs.
- Both
  - Communicates with the Docker daemon to build and retrieve information about images.
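The Docker daemon exposes its Engine API over a Unix socket, so the image-related calls above can be sketched with the standard library alone. This is a minimal illustration, assuming the default socket path `/var/run/docker.sock` and the Engine API's `GET /images/json` endpoint; the repository itself may use a client library instead.

```python
import http.client
import json
import socket

DOCKER_SOCK = "/var/run/docker.sock"  # default daemon socket on Linux (assumption)


class UnixHTTPConnection(http.client.HTTPConnection):
    """HTTPConnection that talks HTTP over the daemon's Unix socket."""

    def __init__(self, sock_path: str):
        # The host name is ignored; routing happens via the socket path.
        super().__init__("localhost")
        self.sock_path = sock_path

    def connect(self):
        s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        s.connect(self.sock_path)
        self.sock = s


def list_images(sock_path: str = DOCKER_SOCK) -> list:
    """Retrieve image metadata from the daemon (Engine API: GET /images/json)."""
    conn = UnixHTTPConnection(sock_path)
    conn.request("GET", "/images/json")
    response = conn.getresponse()
    images = json.loads(response.read())
    conn.close()
    return images


if __name__ == "__main__":
    # Requires a running Docker daemon; prints the tags of each local image.
    for image in list_images():
        print(image.get("RepoTags"))
```

Building images works the same way (`POST /build` with a tar context), just with a larger request body.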
```shell
# Generate gRPC client and server code
python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. ./client/protos/ModelRunnerService.proto
```
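Once the stubs are generated, communicating with the model-runner microservice follows the usual gRPC client pattern. The sketch below is illustrative only: the stub class name is inferred from the proto filename, and the default host and port are assumptions, not values taken from the repository.

```python
def grpc_target(host: str, port: int = 50051) -> str:
    """Build the 'host:port' target string that gRPC channels expect."""
    return f"{host}:{port}"


def connect_model_runner(host: str = "model-runner", port: int = 50051):
    """Open an insecure channel to model-runner and return a stub.

    grpc and the generated module are imported lazily so this file can be
    read without them installed; the stub class name mirrors the proto
    filename and is an assumption about the actual service definition.
    """
    import grpc
    from client.protos import ModelRunnerService_pb2_grpc

    channel = grpc.insecure_channel(grpc_target(host, port))
    return ModelRunnerService_pb2_grpc.ModelRunnerServiceStub(channel)
```

From there, calling a service method on the stub sends the model-run data over the channel.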
```shell
# Start RabbitMQ
docker run -it --rm --name rabbitmq -p 5672:5672 -p 15672:15672 rabbitmq:management-alpine
```
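With the broker running, the ingress side (consuming from a RabbitMQ queue) can be sketched with the `pika` client. The queue name and message schema below are hypothetical, chosen only to illustrate the flow; check the repository's configuration for the real values.

```python
import json

QUEUE_NAME = "model-input"  # hypothetical queue name


def decode_run_message(body: bytes) -> dict:
    """Decode a JSON message describing a model run (hypothetical schema)."""
    return json.loads(body)


def consume(host: str = "localhost") -> None:
    """Block and process messages from the queue until interrupted."""
    # pika is imported lazily so the pure helper above stays importable
    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters(host=host))
    channel = connection.channel()
    channel.queue_declare(queue=QUEUE_NAME, durable=True)

    def on_message(ch, method, properties, body):
        run = decode_run_message(body)
        print("received run request:", run)
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue=QUEUE_NAME, on_message_callback=on_message)
    channel.start_consuming()


if __name__ == "__main__":
    consume()
```

The management image also serves a web UI on port 15672 (default credentials `guest`/`guest`), which is useful for inspecting queues while developing.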
- Rui Gomes (LinkedIn)