
ArUco detection in Python

0 votes
Hi.

Thanks for the amazing work you do with the JeVois module.
I would like to use it for vision-based aerial landing on a moving ground vehicle that has an ArUco tag on the landing platform. I have been working on this with a normal camera, doing blob detection in Python. So I would like to know whether I can use the current ArUco detection on JeVois, but in Python.

Thanks!
asked Dec 19, 2017 in User questions by Mzahana (170 points)

1 Answer

+1 vote

We have not tried it, but a quick web search suggests that the OpenCV ArUco port (which we use in our C++ module) has Python bindings:

http://answers.opencv.org/question/71208/aruco-python-bindings/

JeVois already includes a full OpenCV 3.3 with the Python bindings, so you should be pretty much ready to use it. It may just be a matter of taking a look at ArUco.H, ArUco.C, and DemoArUco.C in jevoisbase and translating the sequence of calls we make to the OpenCV ArUco module into Python.

Can you tell us more about why you would want a Python-based ArUco detector in JeVois? Do you want to further process the detections so that you can directly control your navigation system?

answered Dec 19, 2017 by JeVois (46,580 points)
Being able to make calls from Python to methods that are already available in JeVois would be a huge benefit.
I'm sure it is possible, but I have no clue where to find out which methods are available, let alone what to call and how.
Yes, we already provide Python bindings for:

1) all of OpenCV
2) many of the low-level JeVois functions (to deal with raw camera images, convert them to OpenCV, etc.)

There is some documentation here: http://jevois.org/doc/ProgrammerPython.html and http://jevois.org/doc/ModulePythonTutorial.html
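As a rough idea of what those bindings look like, here is a minimal sketch based on the tutorials above (the class name is just a placeholder, and the exact calls on inframe/outframe should be double-checked against the docs):

import libjevois as jevois
import cv2

class PythonSketch:
    # Minimal JeVois Python module: grab the camera frame as an OpenCV image,
    # run any OpenCV processing on it, and stream the result back over USB.
    def process(self, inframe, outframe):
        img = inframe.getCvBGR()                      # raw camera frame as an OpenCV BGR image
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # any OpenCV call goes here
        outframe.sendCvGRAY(gray)                     # send the result to the USB video stream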

Now, ArUco is a full vision algorithm which is not part of the core; think of it as an app for the JeVois operating system. Currently it is implemented as a jevois::Component, which allows for easy re-use by C++ vision modules. In fact, we already use the same ArUco component in several modules (DemoArUco, MarkersCombo). We are looking into providing Python bindings for Components and their Parameters, so that one could instantiate components and set their parameters directly in Python, but it will take a bit of time.

How about this in the meantime, for ArUco specifically: http://www.philipzucker.com/aruco-in-opencv/

I think you might get away with a few lines of code:

import libjevois as jevois   # JeVois Python bindings
import cv2
import cv2.aruco as aruco

class PythonArUco:
    # define JeVois module: process() is called once per camera frame
    # (the class/module name here is just a placeholder)
    def __init__(self):
        self.aruco_dict = aruco.Dictionary_get(aruco.DICT_6X6_250)
        self.parameters = aruco.DetectorParameters_create()

    def process(self, inframe, outframe):
        # get a grayscale input frame from JeVois
        gray = inframe.getCvGRAY()

        # detect the markers and draw them onto the frame
        corners, ids, rejectedImgPoints = aruco.detectMarkers(
            gray, self.aruco_dict, parameters=self.parameters)
        gray = aruco.drawDetectedMarkers(gray, corners)

        # send gray frame over to USB
        outframe.sendCvGRAY(gray)
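If the end goal is to drive your navigation system rather than just display the result, the same process() function could also report each detection over the serial port. A hedged sketch, to go right after detectMarkers() above (the message format here is made up, not a standardized JeVois message):

# inside process(), right after detectMarkers(): one serial line per marker
if ids is not None:
    for i, c in zip(ids.ravel(), corners):
        cx, cy = c[0].mean(axis=0)   # center of the 4 marker corners, in pixels
        jevois.sendSerial("marker {} {:.1f} {:.1f}".format(int(i), cx, cy))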
Thank you very much for the thorough answer. I just wanted to add velocity estimation for the detected markers, to be output from JeVois directly. I will look into your recommendation to use the OpenCV ArUco module. It would be very nice to have Python bindings for JeVois components. The DemoArUco module is already amazing and I like its accurate 3D pose estimation. It would be super to have velocity estimation as well!
Thanks.
Maybe we can add that in the base module, as it sounds useful! How do you plan to get velocity? Just a Kalman filter over successive positions, then getting filtered velocity out of it? I have not thought much about it, so any pointers would be welcome.
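For what it's worth, here is a rough sketch of that idea on the Python side, using OpenCV's cv2.KalmanFilter with a constant-velocity model over the marker's 3D translation. Everything below is illustrative rather than part of JeVois; the measurement would come from whatever pose estimation you run (e.g. the tvec of the marker):

import numpy as np
import cv2

class MarkerVelocityFilter:
    # Constant-velocity Kalman filter over successive 3D marker positions.
    # State: [x, y, z, vx, vy, vz]; measurement: [x, y, z]
    def __init__(self, dt):
        self.kf = cv2.KalmanFilter(6, 3)
        self.kf.transitionMatrix = np.eye(6, dtype=np.float32)
        for i in range(3):
            self.kf.transitionMatrix[i, i + 3] = dt   # x += vx * dt, etc.
        self.kf.measurementMatrix = np.hstack(
            [np.eye(3, dtype=np.float32), np.zeros((3, 3), np.float32)])
        self.kf.processNoiseCov = np.eye(6, dtype=np.float32) * 1e-3
        self.kf.measurementNoiseCov = np.eye(3, dtype=np.float32) * 1e-2

    def update(self, pos):
        # pos: (x, y, z) of the marker in the camera frame
        self.kf.predict()
        state = self.kf.correct(np.float32(pos).reshape(3, 1))
        return state[:3].ravel(), state[3:].ravel()   # filtered position, velocity

Feeding it the marker position once per frame (dt = 1 / framerate) would give a smoothed velocity that could then be appended to the serial messages.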
...