There are basically three limitations:
- sensor limits (see the max rates at http://jevois.org/doc/Hardware.html -- for example, for 1280x1024 the max is 15fps). You cannot speed this up: it is set by the sensor's clock rate, minimum required exposure time, etc.
- CPU speed and algorithm complexity: a non-optimal or complex algorithm can certainly be sped up. As long as the CPU reading is below 400%, you are under-utilizing the quad-core CPU. TensorFlowEasy is currently the module that gets closest to 400% CPU usage.
- the USB 2.0 bandwidth limit for camera data (about 24 MB/s): if you are streaming YUYV, which is 2 bytes/pixel, you can get at most about 9.3 fps at 1280x1024. See the discussion of this in the videomappings.cfg file, documented at http://jevois.org/doc/VideoMapping.html:
```
# Note on USB transfer rate: the maximum actual pixel data transfer rate is 3070*8000 = 23.9 Mbytes/s (which is 3kb/USB
# microframe, max "high bandwidth" setting). Although USB 2.0 has a maximum theoretical rate of 480 Mbit/s, this
# includes protocol overhead and not all of the bandwidth is available for isochronous (real-time stream) transfers,
# which we use. This means that SXGA YUYV (2 bytes/pixel) can only transfer at a max rate of ~9.3 fps over the USB
# link, although the camera can grab SXGA YUYV at 15 fps. SXGA in Bayer can achieve 15 fps transfer over USB since it
# only uses 1 byte/pixel.
```
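To make the arithmetic above concrete, here is a small sketch that computes the USB-limited frame rate for a given resolution and pixel format, using the per-microframe numbers quoted from videomappings.cfg (the function name and the bytes/pixel table are mine, for illustration only):

```python
# Sketch: estimate the maximum USB-2.0-limited frame rate for a video mode.
# Budget follows the videomappings.cfg comment quoted above:
# 3070 bytes of pixel data per USB microframe * 8000 microframes per second.

USB_BYTES_PER_SEC = 3070 * 8000  # ~24 MB/s of actual isochronous pixel data

BYTES_PER_PIXEL = {
    "YUYV": 2,   # 2 bytes/pixel
    "BAYER": 1,  # raw Bayer, 1 byte/pixel
}

def max_usb_fps(width, height, pixfmt):
    """Frames per second the USB link can sustain for this mode."""
    frame_bytes = width * height * BYTES_PER_PIXEL[pixfmt]
    return USB_BYTES_PER_SEC / frame_bytes

# SXGA (1280x1024) YUYV: USB caps the rate below the sensor's 15 fps
print(round(max_usb_fps(1280, 1024, "YUYV"), 1))   # -> 9.4
# SXGA Bayer: USB would allow ~18.7 fps, so the sensor's 15 fps is the limit
print(round(max_usb_fps(1280, 1024, "BAYER"), 1))  # -> 18.7
```

So for SXGA the bottleneck is USB in YUYV but the sensor itself in Bayer, which is exactly the ~9.3 fps vs 15 fps distinction made in the config comment.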
But keep in mind that, ultimately, the goal of JeVois is not really to transmit video, since video is usually intended for human consumption. In many cases an autonomous machine vision system should ideally just send short serial messages telling the robot what it sees. Then only the first two limitations apply, and the third one drops out entirely.
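As a hedged illustration of that serial-only approach, here is a host-side sketch that parses a JeVois-style 2D detection message. The "N2 id x y w h" layout assumed here is one of the standardized message styles described in the JeVois docs; check http://jevois.org/doc/ for the exact styles your module emits, and note that the parser function itself is hypothetical:

```python
# Hedged sketch: parse a JeVois-style "Normal" 2D serial message on the host.
# Assumed line format: "N2 id x y w h" (token, object id, then 4 floats).

def parse_n2(line):
    """Return a dict for a valid N2 message, or None if it does not match."""
    parts = line.split()
    if len(parts) != 6 or parts[0] != "N2":
        return None
    x, y, w, h = (float(v) for v in parts[2:])
    return {"id": parts[1], "x": x, "y": y, "w": w, "h": h}

msg = parse_n2("N2 face 0.1 -0.2 0.3 0.4")
print(msg)  # -> {'id': 'face', 'x': 0.1, 'y': -0.2, 'w': 0.3, 'h': 0.4}
```

A few such messages per frame take negligible bandwidth on the serial link, which is why dropping the video stream removes the USB limitation.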
Heat is not a limit thanks to our extra fast fan!