[Mobile] Error: Can't load a model: Error Code - ORT_INVALID_PROTOBUF #22927

rohit-monere opened this issue Nov 22, 2024 · 1 comment

rohit-monere commented Nov 22, 2024

Describe the issue

Error loading ONNX model : Error: Can't load a model: Error Code - ORT_INVALID_PROTOBUF - message: Load model from /data/user/0/ai/monere.niada/files/ failed: Protobuf parsing failed

To reproduce

I am trying to load a model that detects the eyelid. The model is placed at android/app/src/main/assets/model/model.onnx, but I am unable to load it and the code isn't working. A local JavaScript import doesn't work either, since this package doesn't support importing the model file directly.

Error: Error loading ONNX model : Error: Can't load a model: Error Code - ORT_INVALID_PROTOBUF - message: Load model from /data/user/0/ai/example/files/ failed: Protobuf parsing failed

Please help me figure out how to load the model so that I can use it for eyelid detection.

In package.json:

"onnxruntime-react-native": "^1.20.0",
"onnxruntimeExtensionsEnabled": "true"

Code:

import * as ort from 'onnxruntime-react-native';
import RNFS from 'react-native-fs';
import {Buffer} from 'buffer'; // Buffer polyfill for React Native

 React.useEffect(() => {
    loadONNXModel();
  }, []);

  const loadONNXModel = async () => {
    try {
      const modelPath = 'models/model.onnx'; // readFileAssets resolves this relative to android/app/src/main/assets/
      console.log('modelPath :', modelPath);
      const fileData = await RNFS.readFileAssets(modelPath, 'base64');
      console.log('fileData :', fileData);
      if (!fileData) {
        console.error('Model file not found in assets.');
        return;
      }

      // Save the file to the document directory
      const localPath = `${RNFS.DocumentDirectoryPath}/model.onnx`;
      console.log('localPath :', localPath);
      await RNFS.writeFile(localPath, fileData, 'base64');

      // Load the ONNX model
      const session = await ort.InferenceSession.create(localPath);
      console.log('ONNX model loaded successfully:', session);
      // This session will be used to check whether an eyelid is in frame.
      // Note: to use it in onCaptureBtnPress below, it must also be stored
      // somewhere accessible, e.g. in component state.
    } catch (error) {
      console.error('Error loading ONNX model:', error);
    }
  };
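
As an aside, the base64 read/write round-trip can be avoided by copying the asset as binary data. A minimal sketch, assuming react-native-fs's copyFileAssets (Android only) and the assets/model/model.onnx location described above; loadONNXModelViaCopy is a hypothetical name:

  const loadONNXModelViaCopy = async () => {
    try {
      const localPath = `${RNFS.DocumentDirectoryPath}/model.onnx`;
      // Copies straight from android/app/src/main/assets/ with no base64 step.
      await RNFS.copyFileAssets('model/model.onnx', localPath);
      const session = await ort.InferenceSession.create(localPath);
      console.log('ONNX model loaded successfully:', session);
      return session;
    } catch (error) {
      console.error('Error loading ONNX model:', error);
      return null;
    }
  };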



const onCaptureBtnPress = async () => {
    if (!camera.current || !session) {
      Alert.alert('Error', 'Camera or ONNX model is not ready.');
      return;
    }

    try {
      const photo = await camera.current.takePhoto({
        quality: 1,
        skipMetadata: true,
      });

      const base64Image = photo.base64; // assumes the camera library exposes base64 on the photo object
      const isEyelidPresent = await checkEyelid(session, base64Image);

      if (isEyelidPresent) {
        Alert.alert('Success', 'Eyelid detected. Capturing image...');
        setCameraClicked(true);
        const imageFile = await camera.current.takePhoto({
          qualityPrioritization: 'speed',
        });
        console.log(imageFile);
        setImageData(imageFile);
      } else {
        Alert.alert(
          'Error',
          'Eyelid not detected. Please adjust and try again.',
        );
      }
    } catch (error) {
      console.error('Error capturing photo:', error);
    }
  };

  const checkEyelid = async (localSession, base64Image) => {
    try {
      const imageBuffer = Buffer.from(base64Image, 'base64');
      const inputTensor = new ort.Tensor(
        'float32',
        imageBuffer,
        [1, 3, 224, 224],
      ); // Adjust dimensions as per your model
      const results = await localSession.run({input: inputTensor});
      const output = results.output; // Replace with your model's output node name

      return output.data[0] > 0.5; // Assuming the model returns a confidence score
    } catch (error) {
      console.error('ONNX model inference failed:', error);
      return false;
    }
  };
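
Note that checkEyelid builds the float32 tensor directly from the still-encoded image bytes; a model with a [1, 3, 224, 224] input generally expects decoded, normalized pixel values instead. A minimal sketch of that conversion, assuming the photo has already been decoded and resized to 224x224 RGBA pixels in a Uint8Array (the decode step depends on whatever image library is in use; rgbaToTensor is a hypothetical helper):

  const rgbaToTensor = (rgba) => {
    const height = 224;
    const width = 224;
    // NCHW layout: one contiguous plane per channel.
    const data = new Float32Array(3 * height * width);
    for (let i = 0; i < height * width; i++) {
      data[i] = rgba[i * 4] / 255;                          // R plane
      data[height * width + i] = rgba[i * 4 + 1] / 255;     // G plane
      data[2 * height * width + i] = rgba[i * 4 + 2] / 255; // B plane
    }
    return new ort.Tensor('float32', data, [1, 3, height, width]);
  };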

Urgency

Yes, this is highly urgent: I have already spent two days on it and have a deadline to finish as soon as possible. Please help me understand how to load the .onnx file correctly so that I can use it.

Platform

Android

OS Version

Android 13.0

ONNX Runtime Installation

Built from Source

Compiler Version (if 'Built from Source')

No response

Package Name (if 'Released Package')

onnxruntime-react-native

ONNX Runtime Version or Commit ID

^1.20.0

ONNX Runtime API

JavaScript

Architecture

X64

Execution Provider

Other / Unknown

Execution Provider Library Version

No response

skottmckay (Contributor) commented:

Have you tried putting the onnx file in the raw folder so it's not compressed and can be used directly?
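
A sketch of that approach, assuming react-native-fs's copyFileRes (Android only) and a copy of the model placed at android/app/src/main/res/raw/model.onnx (both the helper and the file location are assumptions, not the reporter's current setup):

  // Inside an async function:
  const localPath = `${RNFS.DocumentDirectoryPath}/model.onnx`;
  // Per the suggestion above, res/raw avoids the compression applied to assets/.
  await RNFS.copyFileRes('model.onnx', localPath);
  const session = await ort.InferenceSession.create(localPath);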

Have you validated the model outside of React Native to ensure you're starting with a good onnx file? e.g. use the onnx Python module to load it.
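
For example, a minimal validation with the onnx Python module (the file name is a placeholder):

  import onnx

  model = onnx.load('model.onnx')  # fails here if the protobuf is malformed
  onnx.checker.check_model(model)  # extra structural validation
  print('model parsed and validated')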
