Import of exported Vertex AI AutoML model in production fails
I want to deploy a Vertex AI AutoML model in a production project after it has been trained in a separate training project.
----TRAINING PRJ-----     --------PRODUCTION PRJ---------
Train > test > export  >  import > deploy | batch predict
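For context, the export step on the training side looks roughly like this with the Vertex AI Python SDK (a minimal sketch; the project, region, model ID, bucket and the tf-saved-model export format are placeholders/assumptions, not my exact values):

    # Sketch: export the trained AutoML model from the TRAINING project.
    # Project, region, model ID, bucket and export format are placeholders.
    from google.cloud import aiplatform

    aiplatform.init(project="training-prj", location="europe-west4")

    trained_model = aiplatform.Model(model_name="1234567890123456789")
    trained_model.export_model(
        export_format_id="tf-saved-model",  # assumed export format for this AutoML model
        artifact_destination="gs://training-prj-bucket/exported-model/",
    )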
I followed these instructions and received a success email:
Vertex AI finished uploading model "xxxxxx".
Operation State: Succeeded
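The import in the production project boils down to uploading the exported artifacts as a new model, along these lines (again a sketch; bucket, display name and the serving container image are placeholders):

    # Sketch: import (upload) the exported artifacts in the PRODUCTION project.
    # Bucket, display name and serving container image are placeholders.
    from google.cloud import aiplatform

    aiplatform.init(project="production-prj", location="europe-west4")

    model = aiplatform.Model.upload(
        display_name="TEST",
        artifact_uri="gs://production-prj-bucket/exported-model/",
        # image taken from container_uri in the exported environment.json
        serving_container_image_uri="<container_uri from environment.json>",
    )
    model.wait()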
but when I try to test the model with a batch prediction, I always get a failure message:
Due to an error, Vertex AI was unable to batch predict using model "TEST".
Operation State: Failed with errors
Error Messages: INTERNAL
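The batch prediction is started roughly like this (a sketch; the model ID, GCS paths, formats and machine type are placeholders):

    # Sketch: the batch prediction request that always ends with INTERNAL.
    # Model ID, GCS paths, formats and machine type are placeholders.
    from google.cloud import aiplatform

    aiplatform.init(project="production-prj", location="europe-west4")

    model = aiplatform.Model(model_name="<imported model ID>")
    batch_job = model.batch_predict(
        job_display_name="test-batch-prediction",
        gcs_source="gs://production-prj-bucket/batch-input.jsonl",
        gcs_destination_prefix="gs://production-prj-bucket/batch-output/",
        instances_format="jsonl",
        predictions_format="jsonl",
        machine_type="n1-standard-4",
        sync=True,
    )
    print(batch_job.state)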
Please note that deploying the model to an endpoint and testing it with a JSON request does provide the expected response.
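That online path looks roughly as follows and returns the expected prediction (a sketch; machine type and the instance payload are placeholders for my actual schema):

    # Sketch: the online prediction path that works as expected.
    # Machine type and the instance payload are placeholders.
    from google.cloud import aiplatform

    aiplatform.init(project="production-prj", location="europe-west4")

    model = aiplatform.Model(model_name="<imported model ID>")
    endpoint = model.deploy(machine_type="n1-standard-2")
    response = endpoint.predict(instances=[{"feature_1": "value", "feature_2": 42}])
    print(response.predictions)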
I tried several container images besides the one suggested here, including the one stated under container_uri in the exported model's environment.json: the batch prediction always fails with the INTERNAL error message.
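Concretely, each attempt is just a re-upload of the same artifacts with a different serving container, for example (a sketch; the first image is only an example of a prebuilt TF2 prediction container, the second stands in for the value from environment.json):

    # Sketch: re-uploading the exported artifacts with different serving containers.
    # Image URIs are examples/placeholders, not a definitive list.
    from google.cloud import aiplatform

    aiplatform.init(project="production-prj", location="europe-west4")

    candidate_images = [
        "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-8:latest",  # example prebuilt image
        "<container_uri from the exported environment.json>",         # value from the export
    ]

    for image in candidate_images:
        model = aiplatform.Model.upload(
            display_name="TEST",
            artifact_uri="gs://production-prj-bucket/exported-model/",
            serving_container_image_uri=image,
        )
        model.wait()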
Any clue?