MLflow Server Docker Deployment
An MLflow Docker deployment accompanied by a PostgreSQL database for storing logged parameters and metrics, and MinIO object storage for storing artifacts.
MLflow is an open source platform for managing the end-to-end machine learning lifecycle. It tackles four primary functions:
- Tracking experiments to record and compare parameters and results (MLflow Tracking).
- Packaging ML code in a reusable, reproducible form in order to share with other data scientists or transfer to production (MLflow Projects).
- Managing and deploying models from a variety of ML libraries to a variety of model serving and inference platforms (MLflow Models).
- Providing a central model store to collaboratively manage the full lifecycle of an MLflow Model, including model versioning, stage transitions, and annotations (MLflow Model Registry).
MLflow is library-agnostic. You can use it with any machine learning library, and in any programming language, since all functions are accessible through a REST API and CLI. For convenience, the project also includes a Python API, R API, and Java API.
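Because every function is reachable over the REST API, a client in any language can talk to the tracking server. As a sketch, the snippet below builds (but does not send) a request to MLflow's documented experiment-creation endpoint; the server address is an assumption and should match your deployment.

```python
import json
import urllib.request

# Hypothetical server address; replace with your deployment's host and port.
TRACKING_SERVER = "http://localhost:5000"

def create_experiment_request(name: str) -> urllib.request.Request:
    """Build (but do not send) a call to MLflow's REST endpoint for
    creating an experiment (/api/2.0/mlflow/experiments/create)."""
    payload = json.dumps({"name": name}).encode()
    return urllib.request.Request(
        f"{TRACKING_SERVER}/api/2.0/mlflow/experiments/create",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = create_experiment_request("demo-experiment")
# Once the server is up, urllib.request.urlopen(req) would send it;
# the Python API (mlflow.create_experiment) wraps the same endpoint.
```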
The artifact store of the deployment is MinIO. MinIO offers high-performance, S3-compatible object storage. Native to Kubernetes, MinIO runs on every public cloud, every Kubernetes distribution, the private cloud and the edge. MinIO is software-defined and is 100% open source under the GNU AGPL v3.
Developed by the Decision Support Systems Laboratory of the Institute of Communication and Computer Systems in the context of the I-NERGY H2020 project.
Instructions:
Run docker-compose up -d on the VM where the MLflow server is to be deployed.
A PostgreSQL database (parameter and metric logging) and a MinIO server (artifact/model store) are deployed automatically alongside it.
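Once the stack is up, a client needs to know where the tracking server and the MinIO endpoint live. A minimal sketch of that client-side configuration follows; the hosts, ports, and credentials are assumptions for a typical docker-compose setup, so take the real values from your docker-compose.yml or .env file.

```python
import os

# Assumed endpoints and credentials; adjust to your docker-compose.yml.
os.environ["MLFLOW_TRACKING_URI"] = "http://localhost:5000"    # MLflow server
os.environ["MLFLOW_S3_ENDPOINT_URL"] = "http://localhost:9000" # MinIO
os.environ["AWS_ACCESS_KEY_ID"] = "minio"          # MinIO access key
os.environ["AWS_SECRET_ACCESS_KEY"] = "minio123"   # MinIO secret key

# With these set, the MLflow client logs runs to the tracking server
# (backed by PostgreSQL) and uploads artifacts to MinIO, e.g.:
#   import mlflow
#   with mlflow.start_run():
#       mlflow.log_param("alpha", 0.5)
#       mlflow.log_metric("rmse", 0.72)
```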