Uncaught Error: Based on the provided shape, [1024,3], the tensor should have 3072 values but has 30

I am trying to build a TensorFlow.js app. It works fine locally, but when I host it (I tried Netlify and Vercel, on both Firefox and Chrome) I get this error: Uncaught Error: Based on the provided shape, [1024,3], the tensor should have 3072 values but has 30. Any ideas? My only guess was that it might have something to do with CORS, but I'm not sure.

Hosted: (screenshot)

Localhost: (screenshot)

JavaScript code: (linked in the original question)



Solution 1:[1]

You will get such error messages if the bin file is corrupt. This can happen, for example, if you downloaded a repo from GitHub as a ZIP file when the repo uses Git LFS: the ZIP then contains LFS pointer files instead of the real binaries. To avoid that, clone the repo or download the bin file separately. In your case, I assume your website host does not allow the .bin file extension, and you are getting an error message back as the content of the file. The download times of the json and bin files are also a bit surprising: both are almost identical, although the bin file should be much bigger and therefore take longer to download. Best regards, Sascha
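A quick way to catch a corrupted or substituted weights file is to compare its actual byte length against what the model.json manifest promises. The sketch below uses the standard TensorFlow.js weightsManifest layout; the manifest object shown is a made-up example standing in for a real fetched file.

```javascript
// Sketch: compute the byte count a tf.js weights shard should have,
// based on the shapes and dtypes declared in model.json's weightsManifest.
const BYTES_PER_DTYPE = { float32: 4, int32: 4, bool: 1 };

function expectedBytes(weightsManifest) {
  let total = 0;
  for (const group of weightsManifest) {
    for (const w of group.weights) {
      const values = w.shape.reduce((a, b) => a * b, 1);
      total += values * (BYTES_PER_DTYPE[w.dtype] || 4);
    }
  }
  return total;
}

// Example: a [1024, 3] float32 tensor -> 3072 values -> 12288 bytes.
const manifest = [{
  paths: ['group1-shard1of1.bin'],
  weights: [{ name: 'dense/kernel', shape: [1024, 3], dtype: 'float32' }],
}];

console.log(expectedBytes(manifest)); // 12288
// If the server returned an HTML error page instead of the raw binary,
// the fetched byte length will not match this number, and tf.js fails
// with "the tensor should have N values but has M".
```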

Solution 2:[2]

I too faced this issue, while running face-api.js models integrated into a React app with a Node.js server. It worked fine in the local dev environment, but after building and deploying, the model promises would not resolve and failed with the same error. While investigating, I realized I had been using "serve" (npm i -g serve) to host the built app, and that was where the error occurred. After switching the hosting server to Nginx, the application ran with the issue resolved. Switch to Nginx to host it; it will work.
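For reference, a minimal Nginx server block for a React build might look like the sketch below. The paths are placeholders, and the explicit MIME override for .bin files is belt-and-braces (Nginx's stock mime.types already maps unknown extensions sensibly); the key point is that the static server returns the shard files verbatim instead of rewriting them to HTML.

```nginx
# Minimal sketch (paths are placeholders): serve the React build,
# fall back to index.html for SPA routes, and return model shards
# as raw binary.
server {
    listen 80;
    root /var/www/app/build;

    location / {
        try_files $uri /index.html;
    }

    location ~* \.bin$ {
        default_type application/octet-stream;
    }
}
```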

Solution 3:[3]

Thank you so much for the reply. I was downloading the model files from the GitHub repository and had the same error.

Workaround [Solution]:

I opened the demo web application provided in the same GitHub repo, downloaded the model shard files and model.json from the Network tab of the developer tools, then ran the local demo again, and it worked.

Thanks for the help.

Solution 4:[4]

I got a similar error using TensorFlow.js in a React application, even though I had neither downloaded the bin file from GitHub nor extracted it from a ZIP file. I could serve model.json and the binary file locally perfectly fine, but when I deployed, I first got a runtime error and then, on a second try, a similar error with different numbers.

I was tearing my hair out until I came here and saw that I should check the checksums of the binary files. That was indeed the issue: they were different, and Git hadn't noticed. Even after removing the binary and creating a .gitattributes configuration to stop Git from modifying the bin file, the problem remained. As a temporary fix I uploaded the binary file to our repo manually, and it works. I will update this answer once I have met with our DevOps engineer and have a better understanding of this issue.
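For anyone trying the .gitattributes route mentioned above, a minimal version that tells Git to treat weight shards as opaque binary could look like this (the built-in `binary` macro expands to `-diff -merge -text`, so no line-ending normalization is applied):

```gitattributes
# Treat model weight shards as binary: no diff, no merge,
# and crucially no CRLF/LF text normalization.
*.bin binary
```

Note that this only prevents future corruption; shards already committed in mangled form must be re-added from a known-good copy.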

Solution 5:[5]

I was getting a similar error using IIS Express, even though I could run the exact same files in VS Code using Live Server with no issues.

After a few days I found that this is because the extensionless shard files have no MIME mapping and so may be served incorrectly. I ended up serving these files from a public folder in local IIS and adding a MIME mapping for "." (extensionless files) to "application/octet-stream" (the MIME type for binary files).

Later I found a comment on GitHub saying you can instead just add the ".bin" extension to the shard files, but you then have to update both paths in the corresponding manifest.json.

I think either solution should work. The former worked on my Blazor Server project.
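The IIS-side fix described above is a web.config change; a sketch might look like the fragment below. The `fileExtension="."` entry covers extensionless shard files. (Adding a mapping for an extension IIS already knows, such as ".bin", can cause a duplicate-entry error, so only the extensionless mapping is shown.)

```xml
<!-- Sketch: map extensionless files to the binary MIME type so IIS
     serves tf.js weight shards instead of refusing or mangling them. -->
<configuration>
  <system.webServer>
    <staticContent>
      <mimeMap fileExtension="." mimeType="application/octet-stream" />
    </staticContent>
  </system.webServer>
</configuration>
```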

Solution 6:[6]

I am using the Live Server extension for the Visual Studio Code IDE, and found that it gives me this error: GET http://127.0.0.1:5500/model.weights.bin 404 (Not Found) Uncaught (in promise) Error: Based on the provided shape, [3,3,3,8], the tensor should have 216 values but has 39

So I created a new folder named model, moved the two files, model.json and model.weights.bin, into it, and changed the file path to the new location: classifier.load("./model/model.json", customModelReady);

Then the error was gone.
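A pre-flight check like the one above can be automated: verify that each model file actually resolves (and is not an HTML error page) before handing the path to the classifier. In the sketch below, `checkModelFiles` is a hypothetical helper; the fetch function is injectable so the logic can be exercised with a mock, and in the browser you would pass the global fetch.

```javascript
// Sketch: report model files that 404 or come back as HTML,
// the two usual causes of the shape/value-count mismatch error.
async function checkModelFiles(paths, fetchFn) {
  const problems = [];
  for (const p of paths) {
    const res = await fetchFn(p, { method: 'HEAD' });
    const type = (res.headers && res.headers.get('content-type')) || '';
    if (!res.ok) problems.push(`${p} -> HTTP ${res.status}`);
    else if (type.includes('text/html')) problems.push(`${p} returned HTML, not binary/JSON`);
  }
  return problems;
}

// Mock standing in for a server that 404s the weights file:
const fakeFetch = async (url) => ({
  ok: !url.endsWith('.bin'),
  status: url.endsWith('.bin') ? 404 : 200,
  headers: { get: () => 'application/json' },
});

checkModelFiles(['./model/model.json', './model/model.weights.bin'], fakeFetch)
  .then((problems) => console.log(problems)); // reports the weights file as a 404
```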

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Sascha Dittmann
Solution 2 Akarsh Srivastava
Solution 3
Solution 4 osmancakirio
Solution 5
Solution 6 wcb1