
[FR] Migrate MLflow SageMaker deployment functionality to deployments interface


MLflow Roadmap Item

This is an MLflow Roadmap item that has been prioritized by the MLflow maintainers. We’re seeking help with the implementation of roadmap items tagged with the help wanted label.

For requirements clarifications and implementation questions, or to request a PR review, please tag @dbczumar in your communications related to this issue.

Proposal Summary

MLflow provides a deployments API / CLI for creating, reading, updating, and deleting real-time scoring deployments from MLflow models. This is a pluggable, extendable interface, and plugins currently exist for deployment to Azure ML, TorchServe, Redis, and more. MLflow’s tooling for deploying models to SageMaker isn’t currently exposed through the deployments API / CLI; instead, for legacy reasons, it lives in a separate mlflow sagemaker CLI and mlflow.sagemaker API. For consistency, we propose moving SageMaker deployment functionality into the mlflow deployments CLI / API, as sketched below.
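For orientation, here is a rough sketch of how SageMaker deployment could look through the unified deployments API once this proposal lands. The get_deploy_client, create_deployment, predict, and delete_deployment calls are part of the existing mlflow.deployments interface; the "sagemaker" target name, the endpoint name, and the config keys shown are illustrative assumptions, not the API that was eventually merged.

```python
import pandas as pd
from mlflow.deployments import get_deploy_client

# Assumed: a "sagemaker" target registered with the deployments interface.
# The config keys below are illustrative, not the merged plugin's options.
client = get_deploy_client("sagemaker")

client.create_deployment(
    name="my-endpoint",
    model_uri="models:/my-model/1",
    config={"region_name": "us-west-2", "instance_type": "ml.m5.large"},
)

# Score a small batch against the deployed endpoint, then tear it down.
preds = client.predict("my-endpoint", pd.DataFrame({"x": [1.0, 2.0]}))
client.delete_deployment("my-endpoint")
```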

Motivation

  • What is the use case for this feature? This feature unifies MLflow’s real-time scoring deployment integrations behind a single API.
  • Why is this use case valuable to support for MLflow users in general? It’s difficult for users to reason about separate APIs for different deployment environments. A unified API will reduce friction.
  • Why is this use case valuable to support for your project(s) or organization? Databricks customers deploy models to a variety of locations. It’s difficult for them to reason about separate APIs for SageMaker deployment.
  • Why is it currently difficult to achieve this use case? There is no SageMaker deployment functionality conforming to the MLflow plugin specification.

What component(s), interfaces, languages, and integrations does this feature affect?

Components

  • area/artifacts: Artifact stores and artifact logging
  • area/build: Build and test infrastructure for MLflow
  • area/docs: MLflow documentation pages
  • area/examples: Example code
  • area/model-registry: Model Registry service, APIs, and the fluent client calls for Model Registry
  • area/models: MLmodel format, model serialization/deserialization, flavors
  • area/projects: MLproject format, project running backends
  • area/scoring: MLflow Model server, model deployment tools, Spark UDFs
  • area/server-infra: MLflow Tracking server backend
  • area/tracking: Tracking Service, tracking client APIs, autologging

Interfaces

  • area/uiux: Front-end, user experience, plotting, JavaScript, JavaScript dev server
  • area/docker: Docker use across MLflow’s components, such as MLflow Projects and MLflow Models
  • area/sqlalchemy: Use of SQLAlchemy in the Tracking Service or Model Registry
  • area/windows: Windows support

Languages

  • language/r: R APIs and clients
  • language/java: Java APIs and clients
  • language/new: Proposals for new client languages

Integrations

  • integrations/azure: Azure and Azure ML integrations
  • integrations/sagemaker: SageMaker integrations
  • integrations/databricks: Databricks integrations

Details

We propose to migrate the mlflow.sagemaker module to a plugin that continues to live within the main MLflow repository and implements the deployment plugin interface defined in https://github.com/mlflow/mlflow/blob/master/mlflow/deployments/base.py.
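As a reference point, below is a minimal sketch of the plugin surface such a module would need to implement. The method names follow the BaseDeploymentClient interface in mlflow/deployments/base.py; the class name and placeholder bodies are hypothetical and do not reflect the implementation that was ultimately merged for this issue.

```python
from mlflow.deployments import BaseDeploymentClient

# Hypothetical sketch of a SageMaker deployment plugin conforming to the
# interface in mlflow/deployments/base.py. Bodies are placeholders only.
class SageMakerDeploymentClient(BaseDeploymentClient):
    def create_deployment(self, name, model_uri, flavor=None, config=None):
        # Would wrap the existing mlflow.sagemaker deployment logic.
        raise NotImplementedError

    def update_deployment(self, name, model_uri=None, flavor=None, config=None):
        raise NotImplementedError

    def delete_deployment(self, name):
        raise NotImplementedError

    def list_deployments(self):
        raise NotImplementedError

    def get_deployment(self, name):
        raise NotImplementedError

    def predict(self, deployment_name, df):
        # Endpoint invocation; noted as a follow-up in the closing comment below.
        raise NotImplementedError
```

Plugin modules for other targets typically also expose run_local and target_help helpers alongside the client class; whether this in-repo plugin follows that convention is an implementation detail left to the PR.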

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 9 (5 by maintainers)

Top GitHub Comments

1 reaction
dbczumar commented, Feb 18, 2022

Closing this out now that https://github.com/mlflow/mlflow/pull/4971 is merged. Looking forward to following on with an implementation for predict()! Thanks @jamestran201 !

0 reactions
jamestran201 commented, Nov 2, 2021

@dbczumar No worries at all! That PR is quite long 😅. Both of your suggestions look very interesting. I’ll try them out, thanks!


