Using a local model file fails with "Not allowed to load local resource: file:///D:/model/gemma-2b-it-gpu-int4.bin" #5597
Labels
- platform:javascript (MediaPipe Javascript issues)
- stat:awaiting googler (Waiting for Google engineer's response)
- task:LLM inference (Issues related to MediaPipe LLM Inference Gen AI setup)
- type:support (General questions)
Have I written custom code (as opposed to using a stock example script provided in MediaPipe)?
None
OS Platform and Distribution
Firebase Hosting
MediaPipe Tasks SDK version
No response
Task name (e.g. Image classification, Gesture recognition etc.)
LLM Inference (JavaScript)
Programming Language and version (e.g. C++, Python, Java)
HTML, JavaScript
Describe the actual behavior
Cannot access the local model file; the browser blocks the load with "Not allowed to load local resource: file:///D:/model/gemma-2b-it-gpu-int4.bin".
Describe the expected behaviour
The model file can be loaded and used for LLM inference.
Standalone code/steps you may have used to try to get what you need
Other info / Complete Logs
No response
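For context, a minimal sketch (an assumption, not code from this report): browsers refuse to fetch `file:///` URLs from a hosted page, so the model would need to be deployed alongside the site (or on another HTTP(S) origin) and referenced by a web URL rather than a local path. The `toHostedModelUrl` helper and the `/model/` deploy location below are hypothetical.

```javascript
// Hypothetical helper (not from the report): map a local file URL or
// Windows path to the site-relative URL the model would have once the
// .bin file is deployed under /model/ with the hosted site.
function toHostedModelUrl(localPath) {
  // Keep only the file name, e.g. "gemma-2b-it-gpu-int4.bin".
  const fileName = localPath.split(/[\\/]/).pop();
  return `/model/${fileName}`;
}

// The resulting URL could then be passed to the MediaPipe Tasks GenAI
// API, e.g. (sketch only; requires @mediapipe/tasks-genai in a browser):
//
//   const genai = await FilesetResolver.forGenAiTasks(wasmPath);
//   const llm = await LlmInference.createFromOptions(genai, {
//     baseOptions: {
//       modelAssetPath: toHostedModelUrl(
//         "file:///D:/model/gemma-2b-it-gpu-int4.bin"
//       ),
//     },
//   });

console.log(toHostedModelUrl("file:///D:/model/gemma-2b-it-gpu-int4.bin"));
// → /model/gemma-2b-it-gpu-int4.bin
```

With this layout, the page and the model share an origin, so the same-origin fetch that MediaPipe performs internally is allowed.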