Support loading model from local storage
See original GitHub issue (kind/feature)
Describe the solution you’d like
kfserving should support on-prem clusters. There are use cases provisioned for an on-premise cluster where the trained model is stored in local storage. It would be better to have an easy way to configure local storage such as a PV/PVC, and then allow modelUri to point to a local path mounted by the PVC.
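What the request describes can be sketched as a PVC that holds the trained model plus an InferenceService whose storageUri points into it. The `pvc://` scheme is how KFServing/KServe later exposed this; the API version, PVC name, model path, and framework below are illustrative assumptions, not part of the original issue:

```yaml
# Claim storage on the on-prem cluster (size and access mode are assumptions).
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: model-pvc
spec:
  accessModes:
    - ReadOnlyMany
  resources:
    requests:
      storage: 10Gi
---
# Point the predictor at a path inside the mounted PVC.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-local
spec:
  predictor:
    sklearn:
      storageUri: pvc://model-pvc/models/sklearn/iris
```

With this layout, the model files only need to be copied into the PVC (for example from an upstream pipeline step); the serving pod mounts the claim and loads the model from the local path.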
Issue Analytics
- State:
- Created 4 years ago
- Comments: 13 (11 by maintainers)
Top GitHub Comments
@rakelkar In our on-prem cluster, we have similar pipelines where the serving component loads from mounted PVCs backed by NFS. This is a likely scenario in on-prem Kubeflow environments (though the underlying storage can differ).
@rakelkar Thanks. We have a real pipeline use case on an on-premise cluster: the model is trained in the upstream steps of a Kubeflow Pipeline and stored in local storage (such as NFS), and we just want to use kfserving to start the service. Not all users have a policy that allows access to cloud storage. I strongly suggest implementing this; I think it is a common way to resolve similar problems. I tried a webhook-based approach and it works as expected. Will create a PR once tests are finished. Thanks.
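For the NFS scenario mentioned in the comments, the model directory could be surfaced to the cluster as a statically provisioned PV bound to a PVC that the serving component then mounts. The server address, export path, and names here are placeholders, not values from the issue:

```yaml
# Statically provisioned PV backed by the NFS export holding the model.
apiVersion: v1
kind: PersistentVolume
metadata:
  name: model-nfs-pv
spec:
  capacity:
    storage: 10Gi
  accessModes:
    - ReadOnlyMany
  nfs:
    server: 10.0.0.5        # placeholder NFS server address
    path: /exports/models   # placeholder export containing the trained model
---
# Claim that binds to the PV above by name.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: model-nfs-pvc
spec:
  accessModes:
    - ReadOnlyMany
  resources:
    requests:
      storage: 10Gi
  volumeName: model-nfs-pv
  storageClassName: ""
```

Setting `storageClassName: ""` keeps the claim from being intercepted by a default dynamic provisioner, so it binds to the hand-created PV.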