I have a very small custom model built with TensorFlow 2.0 that I can convert to a model.tflite and run on a Coral (my conversion step is sketched at the end of this post), but when I load this model onto the JeVois and run the JeVois TensorFlow saliency demo, I get the following error on the console:
INF TensorFlow::operator(): Loaded model /jevois/share/tensorflow/fma224/model.tflite
ERR TensorFlow::Report: Op builtin_code out of range: 74. Are you using old TFLite binary with newer model?
ERR TensorFlow::Report: Op builtin_code out of range: 82. Are you using old TFLite binary with newer model?
ERR TensorFlow::Report: Registration failed.
FTL TensorFlow::operator(): Failed to construct interpreter
Where does the TFLite binary live so that I can replace it with a new one?
What version of TensorFlow does the JeVois use by default?
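For reference, this is roughly how I produce the model.tflite file (a minimal sketch of my conversion step; the "saved_model_dir" path is a placeholder for my actual model directory):

```python
import tensorflow as tf

# Convert the TF 2.0 SavedModel to a TFLite flatbuffer
# ("saved_model_dir" is a placeholder for my real model directory).
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
tflite_model = converter.convert()

# Write out model.tflite; this file runs fine on the Coral.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```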