OpenCV Charuco-Board Pose Estimation

The OpenCV Charuco-board pose estimation tools can provide a localization estimate relative to your camera. The red, green, and blue axes drawn at the upper right of the charuco board taped to my wall indicate the pose estimate.

These instructions are not particularly user-friendly, but they got me started. I’m not really sure that my distortion coefficients are set up correctly. Hopefully I’ll have a chance to clean that up and post an update if I make improvements.

Please comment if you’ve got ideas for improvement or if you found this post useful.

Overview

  1. Obtain accurate configuration settings
  2. Compile the pose estimation program
  3. Run and verify configuration settings

Step 1: Obtain accurate configuration settings

For my program to work, you should have a board configuration file called defaultConfig.xml and a camera calibration file called cameraParameters.xml. See my previous post on opencv charuco calibration for details.
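For reference, defaultConfig.xml is an OpenCV FileStorage document. The nodes my script reads look roughly like this — the values here are illustrative, so use the ones from your own calibration run (and note that the misspelled “lenght” key is what the tool actually writes):

```xml
<?xml version="1.0"?>
<opencv_storage>
  <!-- index into the predefined aruco dictionaries -->
  <charuco_dict>0</charuco_dict>
  <!-- "lenght" spelling matches the key written by the OpenCV tool -->
  <charuco_square_lenght>200</charuco_square_lenght>
  <charuco_marker_size>100</charuco_marker_size>
</opencv_storage>
```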

Step 2: Compile the included pose estimation program

Download the following Python 3 program and save it in the same folder as the two XML files from Step 1.

# charuco_pose.py
#
# Peter F. Klemperer
# October 29, 2017
import time

import cv2
import cv2.aruco as aruco

cap = cv2.VideoCapture(0)

def read_node_real(reader, name):
    node = reader.getNode(name)
    return node.real()

def read_node_string(reader, name):
    node = reader.getNode(name)
    return node.string()

def read_node_matrix(reader, name):
    node = reader.getNode(name)
    return node.mat()

# read the defaultConfig.xml presented to
# opencv_interactive-calibration
config_reader = cv2.FileStorage()
config_reader.open("defaultConfig.xml", cv2.FileStorage_READ)

aruco_parameters = aruco.DetectorParameters_create()
aruco_dict_num = int(read_node_real(config_reader, "charuco_dict"))
aruco_dict = aruco.Dictionary_get(aruco_dict_num)
# "charuco_square_lenght" (sic) is the key as written by the OpenCV tool
charuco_square_length = int(read_node_real(config_reader, "charuco_square_lenght"))
charuco_marker_size = int(read_node_real(config_reader, "charuco_marker_size"))
config_reader.release()

# TODO clean this up and use the config parameters above?
# 5 was the board width (x) passed to interactive-calibration
# 7 was the board height (y) passed to interactive-calibration
# 0.034 is the square length in meters
# 0.02 is the marker size in meters
charuco_board = aruco.CharucoBoard_create(5, 7, 0.034, 0.02, aruco_dict)

# read the cameraParameters.xml file generated by
# opencv_interactive-calibration
camera_reader = cv2.FileStorage()
camera_reader.open("cameraParameters.xml", cv2.FileStorage_READ)
camera_matrix = read_node_matrix(camera_reader, "cameraMatrix")
dist_coeffs = read_node_matrix(camera_reader, "dist_coeffs")
camera_reader.release()

while True:
    time.sleep(0.1)
    # read a frame from the camera and convert it to grayscale
    ret, frame = cap.read()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    corners, ids, rejectedImgPoints = aruco.detectMarkers(
        gray, aruco_dict, parameters=aruco_parameters)

    # if any markers were detected, then process the board
    if ids is not None:
        # gray = aruco.drawDetectedMarkers(gray, corners)
        ret, ch_corners, ch_ids = aruco.interpolateCornersCharuco(
            corners, ids, gray, charuco_board)

        # if there are enough corners to get a reasonable result
        if ret > 5:
            aruco.drawDetectedCornersCharuco(frame, ch_corners, ch_ids, (0, 0, 255))
            retval, rvec, tvec = aruco.estimatePoseCharucoBoard(
                ch_corners, ch_ids, charuco_board, camera_matrix, dist_coeffs)

            # if a pose could be estimated, draw the axes
            if retval:
                frame = aruco.drawAxis(frame, camera_matrix, dist_coeffs,
                                       rvec, tvec, 0.032)

    # imshow and waitKey are required for the window
    # to open on a mac
    cv2.imshow('frame', frame)

    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
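The drawAxis call at the end projects the board-frame axis endpoints into the image using the estimated pose and the camera intrinsics. Here is a minimal sketch of that pinhole projection with the distortion terms ignored — the K, R, and t values below are made up for illustration, not from my calibration:

```python
import numpy as np

def project_point(p_board, R, tvec, K):
    """Project a board-frame 3D point to pixel coordinates.
    Pinhole model only; lens distortion is ignored for simplicity."""
    # transform the point from the board frame into the camera frame
    p_cam = R @ np.asarray(p_board, dtype=float) + np.asarray(tvec, dtype=float).reshape(3)
    # apply the intrinsic matrix and divide by depth
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]

# illustrative intrinsics and pose (not from a real calibration)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                 # board facing the camera squarely
t = np.array([0.0, 0.0, 0.5])  # half a meter in front of the lens

origin = project_point([0.0, 0.0, 0.0], R, t, K)    # board origin
x_tip = project_point([0.032, 0.0, 0.0], R, t, K)   # x-axis tip, 0.032 m
```

With this fronto-parallel pose the origin lands at the image center (320, 240) and the x-axis tip is offset to the right, which is the same geometry drawAxis renders on the frame.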

Step 3: Run and Verify

Run the program and verify. I made the axis length the same as the black square edge to try to confirm I’d chosen the distortion coefficients accurately, but I haven’t done any testing beyond that.
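Another sanity check beyond eyeballing the axes would be to convert rvec to a rotation matrix and print the board’s distance and heading, then compare against a tape measure. The helpers below are my own sketch (cv2.Rodrigues does the same conversion; I’ve written it out in NumPy so the math is visible):

```python
import numpy as np

def rodrigues_to_matrix(rvec):
    """Convert an axis-angle rotation vector (like the rvec from
    estimatePoseCharucoBoard) into a 3x3 rotation matrix."""
    rvec = np.asarray(rvec, dtype=float).reshape(3)
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    # cross-product (skew-symmetric) matrix of the unit axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    # Rodrigues' rotation formula
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def describe_pose(rvec, tvec):
    """Return (distance in meters, in-plane yaw in degrees) of the
    board relative to the camera."""
    R = rodrigues_to_matrix(rvec)
    distance = float(np.linalg.norm(tvec))
    yaw = float(np.degrees(np.arctan2(R[1, 0], R[0, 0])))
    return distance, yaw

# example: board rotated 90 degrees about the camera's z-axis, 1 m away
dist, yaw = describe_pose([0.0, 0.0, np.pi / 2], [0.0, 0.0, 1.0])
```

Inside the script’s main loop you could call describe_pose(rvec, tvec) after a successful estimatePoseCharucoBoard and print the result each frame.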

Summary

There are many of these tutorials on the web, and this one is mine. I hope you find it useful.

Resources

  • https://dspace.mit.edu/bitstream/handle/1721.1/83725/864440992-MIT.pdf?sequence=2