
How does the MLflow artifact proxy server configure AWS credentials?

See original GitHub issue

I am trying to use an MLflow server with proxied artifact storage backed by S3.
https://www.mlflow.org/docs/latest/tracking.html#logging-to-a-tracking-server

The MLflow server runs in its own container on an EC2 instance. So far it has either been unable to access the S3 bucket (invalid access token) or I have had to configure client-side credentials. With client-side credentials configured, I am able to load a model into the registry, and I can use the MLflow client to list registered models just fine.

Since the MLflow server is running in a container, do you need to mount a folder so that it can access AWS credentials? Do the credentials need to be set as environment variables when the server is started? How else would the MLflow server in the container get the AWS credentials it needs to access the S3 bucket?

This is what I’m running to start the MLflow server:

    CMD mlflow server --backend-store-uri ${BACKEND_URI} --default-artifact-root ${ARTIFACT_ROOT} \
        --host 0.0.0.0 --port 5000 --artifacts-destination ${ARTIFACT_ROOT} \
        --serve-artifacts
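In case it helps others with the same setup, these are the usual ways a containerized MLflow server ends up with working AWS credentials; the image name, placeholder keys, region, and paths below are assumptions, so treat this as a sketch of the options rather than a confirmed fix:

    # Option 1: pass credentials as environment variables when the container is started
    docker run -d -p 5000:5000 \
        -e AWS_ACCESS_KEY_ID=<access-key> \
        -e AWS_SECRET_ACCESS_KEY=<secret-key> \
        -e AWS_DEFAULT_REGION=<region> \
        my-mlflow-image

    # Option 2: mount the host's AWS profile into the container (path assumes the server runs as root)
    docker run -d -p 5000:5000 -v $HOME/.aws:/root/.aws:ro my-mlflow-image

    # Option 3: on EC2, attach an instance profile (IAM role) with access to the artifact bucket;
    # boto3 inside the container can usually fetch temporary credentials from the instance
    # metadata service, so no keys need to be passed in (reachability of the metadata service
    # from containers depends on the instance's IMDS settings).

With --serve-artifacts it is the server process, not the client, that talks to S3, so whichever source you pick has to be visible inside that container.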

Issue Analytics

  • State: closed
  • Created a year ago
  • Comments: 7 (4 by maintainers)

Top GitHub Comments

1 reaction
zstern commented, Jul 19, 2022

Created a new experiment through the GUI and tried to upload artifacts, and also created a new experiment via the code snippet you listed above. Tried to log artifacts and got NoCredentialsError: Unable to locate credentials, though in doing some other tests there may be some other admin settings preventing access. For some reason the boto3 access, which was working, does not work now. Feel free to close this ticket and I will reopen it if necessary if mlflow appears to be the source of the error again. Thank you.
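One way to narrow down a NoCredentialsError like the one above is to check what the server container actually sees; the container name and bucket here are assumptions:

    # which AWS-related variables actually reached the container?
    docker exec mlflow-server env | grep '^AWS'
    # which identity would boto3 resolve, and can it reach the bucket? (requires the aws CLI in the image)
    docker exec mlflow-server aws sts get-caller-identity
    docker exec mlflow-server aws s3 ls s3://my-artifact-bucket/

If the identity check fails inside the container but works on the host, the credentials never made it into the container, which would match the error reported here.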

0 reactions
harupy commented, Jul 19, 2022

@zstern Any updates here?

Read more comments on GitHub >

Top Results From Across the Web

  • MLflow Tracking — MLflow 2.0.1 documentation
    The MLflow server can be configured with an artifacts HTTP proxy, passing artifact requests through the tracking server to store and retrieve artifacts...
  • MLflow proxied artifact access: Unable to locate credentials
    By default, data will be logged to the mlflow-artifacts:/ uri proxy if the --serve-artifacts option is enabled. Otherwise, the default location ...
  • Building Blocks of MLOps —Model Tracking with AWS and ...
    This article will explore how you can set up a password-protected MLflow server for your entire team. The main focus will be model...
  • MLFlow on GCP for Experiment Tracking | by Isaac Kargar
    Here is the explanation from MLFLow documentation: MLflow's Tracking Server supports utilizing the host as a proxy server for operations involving artifacts.
  • Deploy MLflow with docker compose - Towards Data Science
    MLflow obtains credentials to access S3 from your machine's IAM role, a profile in ~/.aws/credentials, or the environment variables AWS_ACCESS_KEY_ID and ...
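Taken together, the results above describe the same split this issue is about: with --serve-artifacts, clients only need to reach the tracking server over HTTP, while the server's own environment is where boto3 looks for S3 credentials. A rough client-side sanity check, with the host and run ID as placeholders:

    # the client only needs the tracking URI; no AWS keys are required on this side when the
    # server proxies artifacts
    export MLFLOW_TRACKING_URI=http://<ec2-host>:5000
    mlflow artifacts list --run-id <run-id>
    # a NoCredentialsError here would suggest the run's artifact root is a plain s3:// URI
    # (not proxied), so the client is trying to reach S3 directly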
