Maybe your video mapping is wrong? You would need something like
YUYV 320 240 30.0 YUYV 640 480 30.0 JeVois MyPythonModule
which will output 320x240 video over USB while capturing 640x480 from the camera sensor.
Then you would just resize your image to 320x240 and call outframe.sendCvBGR(). Beware that OpenCV mixes conventions: numpy image shapes are (rows, columns), i.e. (height, width), but size arguments such as cv::Size and the dsize parameter of the resize function are (width, height). Check the docs of the OpenCV resize function you use to make sure width and height are not swapped.
If the problem persists, please post your mapping definition and Python code so we can assist you further.