Conversion to ONNX
Conversion to ONNX is not supported by ORTModelxxx.
Can support for this architecture be added for Gemma2 models?
Please tell me how I can alternatively convert to ONNX so I can run the model on-device.
Hi @Parma7876, sorry for the late response. If you want the converted ONNX model to be compatible with a certain ONNX version, please specify the `target_opset` parameter when invoking the convert function. The Gemma2 model can be converted to TensorFlow's SavedModel format; you can then use tf2onnx to convert it to ONNX. Kindly refer to this link for more information. Thank you.
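For reference, a SavedModel-to-ONNX conversion with tf2onnx is typically run from the command line like this (the SavedModel path, output filename, and opset below are placeholder assumptions, not values from this thread):

```shell
# Convert a TensorFlow SavedModel directory to ONNX using tf2onnx.
# "./gemma2_saved_model" and opset 17 are placeholders; adjust to your setup.
python -m tf2onnx.convert \
    --saved-model ./gemma2_saved_model \
    --output gemma2.onnx \
    --opset 17
```

This assumes you already have the model in SavedModel format on disk; the `--opset` flag here plays the same role as the `target_opset` parameter mentioned above.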
@lkv
Thank you for the response. I was trying to export Gemma2 2B to ONNX using the Transformers library's ORTModelxxx, but it did not convert; then I tried using torch.onnx.export, and it also failed to export to ONNX.
Your suggestion is to take the Hugging Face Gemma2ForCausalLM, which is a PyTorch model, convert it to a TensorFlow SavedModel, and then convert that to ONNX. I will definitely try, but could you please tell me how I can convert Gemma2ForCausalLM to a TensorFlow model first? As far as I can see, a TensorFlow model is not supported according to the Hugging Face documentation.
Thank you in advance.