Sunday, February 24, 2013

Live Position Updates in Maya through the Leap Motion Controller

I postponed updating my research last week because I was not able to make progress in my implementation. This update, however, serves to highlight the results I have obtained solely during this weekend. 

My work has covered two components: the first uses the Python Leap SDK to filter frame data, isolate a particular finger, and extract motion information from that finger (specifically, the tip position of the effector after a motion); the second integrates that data into Maya to update finger positions in real time.

Leap Motion Controller Coordinate System
I managed to implement a quick filtering algorithm that does basic sorting of fingers by x-position (relative to the Leap's coordinate system). In this manner, it is possible to order the fingers on the hand such that we can easily determine each of them. I selected the index finger as my finger of focus for this preliminary implementation. After obtaining the index finger information, I used the direction vector of motion to determine the finger's positioning in relation to the base of the finger. From the Leap perspective, the palm position was used as the base position. When mapping this information into Maya, the base position is taken to be the knuckle position.
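The sort-and-offset idea above can be sketched in plain Python. This is a hypothetical stand-in, not the Leap SDK itself: fingers are represented as bare (x, y, z) tip positions and the palm as a single base point, and the sample coordinates are made up.

```python
def sort_fingers_by_x(finger_tips):
    """Order finger tips along the Leap's x-axis (left to right)."""
    return sorted(finger_tips, key=lambda tip: tip[0])

def tip_relative_to_base(tip, base):
    """Express a finger tip position relative to a base (palm/knuckle) point."""
    return tuple(t - b for t, b in zip(tip, base))

if __name__ == "__main__":
    # Five made-up tip positions for a right hand; the thumb has the smallest x.
    tips = [(40.0, 210.0, -20.0), (-80.0, 180.0, 10.0),
            (0.0, 220.0, -30.0), (75.0, 200.0, -15.0),
            (-45.0, 215.0, -25.0)]
    ordered = sort_fingers_by_x(tips)
    index_tip = ordered[1]            # second finger from the thumb side
    palm = (-10.0, 150.0, 0.0)        # stand-in for the Leap palm position
    print(tip_relative_to_base(index_tip, palm))
```

A real implementation would pull the tips from the SDK's frame data, but the ordering and relative-offset logic is the same.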

Initially, I wrote my program as a Maya plug-in using the Maya Python API. The idea was that it would be portable, effective, and fully integrated. I soon discovered that this implementation performed terribly. To elaborate, the Python Leap Motion SDK provides a base class that allows a Python program to listen for 'frame' events. On each new frame captured by the Leap Motion controller, a callback handles the frame data (this is where I integrated my filtering and sorting algorithm). The actual capture of frame data runs on a thread that terminates only on user input.
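The listener pattern in question looks roughly like the sketch below. The real SDK supplies the base class and drives the capture loop on its own thread; this stand-in uses a plain loop over pre-recorded frame dictionaries so only the control flow is shown, and the class and field names are my own, not the SDK's.

```python
class FrameListener:
    """Stand-in for the SDK's listener base class; subclasses override on_frame."""
    def on_frame(self, frame):
        raise NotImplementedError

class TipRecorder(FrameListener):
    """Records the index-finger tip from each frame (where filtering would go)."""
    def __init__(self):
        self.tips = []

    def on_frame(self, frame):
        # 'frame' here is just a dict standing in for Leap frame data.
        self.tips.append(frame["index_tip"])

def run_capture(listener, frames):
    # The real SDK runs this loop on a background thread until user input
    # stops it; here it is an ordinary loop over canned frames.
    for frame in frames:
        listener.on_frame(frame)

listener = TipRecorder()
run_capture(listener, [{"index_tip": (0.0, 0.0, 0.0)},
                       {"index_tip": (1.0, 2.0, 3.0)}])
print(listener.tips[-1])
```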

When I ran my script in Maya, the rapid data capture combined with the blocking wait for user input froze the user interface (including the Script Editor). This is because scripts in Maya run on the main thread. When I terminated the program through user input, all of the captured Leap data was popped off the call stack. Each command was then processed, but the fingertip only snapped to the position from the most recent frame. In other words, the update was neither real time nor animated.

I approached the problem by attempting to launch multiple threads for the processes in Maya and by trying different means of obtaining user input. Each attempt only resulted in Maya crashing or a complete UI freeze. With the help of a friend, who found this resource for opening socket connections in Maya, I was able to open a command port and drive Maya from my Python program running in Bash (Terminal on OS X).

The advantages of this approach are:
  1. The open socket permits commands to run continuously without halting the UI in Maya.
  2. There is no need to install Python modules/packages inside Maya.
  3. Any version of Python can be used from Bash, with no restrictions on which modules you use. This is unlike Maya's bundled Python, which limits the available modules and their functionality.
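For reference, the external side of this setup can be sketched as below. It assumes a command port has already been opened inside Maya (e.g. with `cmds.commandPort(name=":7002", sourceType="mel")`); the port number and the joint name `index_joint1` are placeholders, and the socket send is shown but not exercised here.

```python
import socket

MAYA_HOST, MAYA_PORT = "localhost", 7002  # assumed to match the open command port

def move_command(node, pos):
    """Build a MEL 'move' command that sets a node's absolute position."""
    x, y, z = pos
    return "move -absolute %f %f %f %s;" % (x, y, z, node)

def send_to_maya(command, host=MAYA_HOST, port=MAYA_PORT):
    """Send one command string over the command port (requires Maya listening)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((host, port))
    s.send(command.encode())
    s.close()

if __name__ == "__main__":
    # Per frame, the Leap program would build and send a command like this:
    print(move_command("index_joint1", (1.5, 0.0, -2.0)))
```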
To demonstrate my progress, I have included a video showing the live updating of a finger joint position in Maya. The left pane is the Maya Editor window. The top-right pane is the Maya Script Editor. The bottom right pane is the Bash window that has my Python program running. It currently prints out the effector tip position for each frame capture.



My next goal is to implement the Cyclic Coordinate Descent algorithm to perform inverse kinematics and estimate the phalangeal joint angles.
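As a preview of that goal, here is a minimal 2-D sketch of Cyclic Coordinate Descent for a planar three-link chain. The link lengths and target are made-up values; the eventual solver will operate on the actual finger joints in Maya.

```python
import math

def forward(angles, lengths):
    """Joint positions of a planar chain, accumulating relative joint angles."""
    pts = [(0.0, 0.0)]
    total = 0.0
    for a, l in zip(angles, lengths):
        total += a
        x, y = pts[-1]
        pts.append((x + l * math.cos(total), y + l * math.sin(total)))
    return pts

def ccd(angles, lengths, target, iterations=100):
    """CCD: repeatedly rotate each joint to point the end effector at the target."""
    angles = list(angles)
    for _ in range(iterations):
        # Sweep joints from the tip back to the base.
        for i in reversed(range(len(angles))):
            pts = forward(angles, lengths)
            end, joint = pts[-1], pts[i]
            # Rotate joint i so the joint->end vector aligns with joint->target.
            to_end = math.atan2(end[1] - joint[1], end[0] - joint[0])
            to_target = math.atan2(target[1] - joint[1], target[0] - joint[0])
            angles[i] += to_target - to_end
    return angles

lengths = [2.0, 1.5, 1.0]   # stand-ins for proximal, middle, distal phalanges
solved = ccd([0.1, 0.1, 0.1], lengths, target=(2.5, 2.0))
tip = forward(solved, lengths)[-1]
print(tip)
```

Mapping this back to the hand means constraining the joint angles to anatomically plausible ranges, which the sketch above does not attempt.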
