This is a simple end-to-end MLOps project built on the Capital Bikeshare dataset. It takes the raw data through machine learning pipelines covering training, experiment tracking and model management with MLflow, orchestration with Prefect as the workflow tool, and deployment of the model as a web service.
The project runs locally and uses an AWS S3 bucket to store model artifacts during experiment tracking with MLflow.
The chosen dataset for this project is the Capital Bikeshare data
In the future I hope to improve the project by moving the entire infrastructure to AWS (managed with IaC tools such as Terraform), adding batch or streaming model deployment with AWS Lambda and Kinesis streams, and adding comprehensive model monitoring.
Clone the project from the repository
Change to the mlops-project directory
Set up and install the project dependencies
Add your current directory to the Python path (PYTHONPATH)
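The setup steps above can be sketched as the following shell session. The repository URL, the dependency file, and the use of plain pip are assumptions; substitute the actual repository URL and whichever dependency manager the project ships with (e.g. Pipenv).

```shell
# Clone the project (placeholder URL; use the actual repository URL)
git clone https://github.com/<your-username>/mlops-project.git
cd mlops-project

# Install project dependencies (assumes a requirements.txt; use pipenv install if a Pipfile is present)
pip install -r requirements.txt

# Make the project's modules importable from anywhere
export PYTHONPATH="$PYTHONPATH:$(pwd)"
```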
In a new terminal window or tab, run the command below to start the Prefect Orion server
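A minimal sketch of starting the orchestration server, assuming an Orion-era Prefect 2.x install (in later Prefect 2.x releases the same server is started with `prefect server start`):

```shell
# Start the Prefect Orion API server and UI (blocks the terminal; keep it running)
prefect orion start
```

By default the UI is served at http://127.0.0.1:4200.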
MLflow points to an S3 bucket for storing model artifacts and uses a SQLite database as the backend store
Create an S3 bucket and export the bucket name as an environment variable as shown below
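A sketch of exporting the bucket name; both the variable name `MODEL_BUCKET` and the bucket name itself are hypothetical placeholders, so match them to whatever the project's scripts actually read (S3 bucket names must be globally unique):

```shell
# Hypothetical variable and bucket name; use the names your scripts expect
export MODEL_BUCKET="my-mlflow-artifacts-bucket"
echo "Using artifact bucket: ${MODEL_BUCKET}"
```

The bucket itself can be created with the AWS CLI, e.g. `aws s3 mb "s3://${MODEL_BUCKET}"` (requires configured AWS credentials).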
In a new terminal window or tab, run the following commands
Start the MLflow server
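A sketch of the server invocation, wiring together the SQLite backend store and the S3 artifact store described above; the database path, bucket variable, host, and port are assumptions:

```shell
# SQLite as backend store, S3 bucket as artifact store (paths/names are placeholders)
mlflow server \
  --backend-store-uri sqlite:///mlflow.db \
  --default-artifact-root "s3://${MODEL_BUCKET}" \
  --host 0.0.0.0 \
  --port 5000
```

With this running, point your training code at it via `mlflow.set_tracking_uri("http://127.0.0.1:5000")`.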
Create work queues
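A sketch using the Prefect 2.x CLI; the queue name `main-queue` is a placeholder and should match whatever name the project's deployments target:

```shell
# Create a work queue for scheduled flow runs (name is a placeholder)
prefect work-queue create main-queue
```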
Run deployments locally to schedule pipeline flows
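One way this step might look with the Prefect 2.x CLI (the flow file, flow function, deployment name, and generated YAML filename below are all assumptions, and the exact deployment commands changed across early Prefect 2.x releases):

```shell
# Build and apply a deployment for a flow (hypothetical file/flow/queue names)
prefect deployment build train.py:main_flow -n "training-deployment" -q main-queue
prefect deployment apply main_flow-deployment.yaml

# Start an agent so scheduled runs on the queue actually execute (blocks the terminal)
prefect agent start -q main-queue
```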
Change to the webservice directory and follow the instructions here
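The last step is just a directory change from the project root; the web service's own instructions take over from there:

```shell
cd webservice
```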