pvc problem with custom inferenceservice
/kind bug
What steps did you take and what happened:
I'm trying to deploy a "custom" InferenceService (just using the base tensorflow/serving image). While I am able to declare the volumeMounts within spec/default/predictor/custom/container, there does not seem to be a place to specify the volumes themselves. Looking at the base InferenceService spec, only default and canary seem to be valid at the top level of the spec. Here I'm using a volume I know to work when creating a standalone pod (and it works in general with a tensorflow predictor when referenced with `storageUri: pvc://…`).
```yaml
apiVersion: serving.kubeflow.org/v1alpha2
kind: InferenceService
metadata:
  labels:
    controller-tools.k8s.io: "1.0"
  name: nlu-confidence-exam
spec:
  volumes:
  - name: azure-file-volume
    persistentVolumeClaim:
      claimName: azurefile
  default:
    predictor:
      custom:
        container:
          image: tensorflow/serving
          env:
          - name: MODEL_NAME
            value: "nlu-confidence-exam"
          - name: MODEL_BASE_PATH
            value: "/data/nlu-models/confidence/exam"
          ports:
          - containerPort: 8500
          volumeMounts:
          - mountPath: /data
            name: azure-file-volume
            readOnly: true
```
What did you expect to happen:
Expected the volume to be mounted, but instead I get this error:

```
Error from server: error when creating "nlu-confidence-exam.yaml": admission webhook "inferenceservice.kfserving-webhook-server.validator" denied the request: Custom container validation error: volumeMount has no matching volume: volumeMounts[0].name
```
Anything else you would like to add:
Environment: Running with tag 0.2.1 on top of KF 0.7 base install.
Issue Analytics
- Created: 4 years ago
- Comments: 9 (6 by maintainers)
Top GitHub Comments
Is there a way to write to the PVC? The PVC mounted here is read-only. But I confirmed this method works for read-only PVCs.
Can you try this? There is a well-known environment variable, `STORAGE_URI`, for custom containers, which KFServing understands and uses to create the PVC volume for you.
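A minimal sketch of what that suggestion could look like, adapted from the manifest in the report: the top-level `volumes` and the `volumeMounts` are dropped, and a `STORAGE_URI` env var with a `pvc://` URI is added instead. Note the claim name and sub-path below are assumptions carried over from the original manifest, and the exact mount location inside the container (KFServing conventionally places models under `/mnt/models`) should be verified against your KFServing version.

```yaml
apiVersion: serving.kubeflow.org/v1alpha2
kind: InferenceService
metadata:
  name: nlu-confidence-exam
spec:
  default:
    predictor:
      custom:
        container:
          image: tensorflow/serving
          env:
          - name: MODEL_NAME
            value: "nlu-confidence-exam"
          # Well-known variable: KFServing sees this and mounts the PVC itself,
          # so no explicit volumes/volumeMounts are needed (or allowed here).
          - name: STORAGE_URI
            value: "pvc://azurefile/nlu-models/confidence/exam"
          # Assumption: with STORAGE_URI, the model data is made available at
          # the KFServing convention path rather than the original /data mount.
          - name: MODEL_BASE_PATH
            value: "/mnt/models"
          ports:
          - containerPort: 8500
```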