
Friday, February 8, 2013

Ideas and Some Progress

Currently, I am very far behind on implementation for both projects, largely because I have not had the devices to work with. Fortunately, for the next few days I do have a Leap Motion (I borrowed it from a friend as a temporary fix). Hopefully, I will get to do some basic testing.

Some ideas I have for moving forward with the Leap Motion are:
  • Using multiple Leaps - the idea is to have a Leap-Box. Essentially, a user puts their hand into this box, and multiple Leaps positioned around it in a circle are used to detect individual finger motions. The hope is that with the extra spatial information, isolating finger motions will be easier, and so will identifying the bends of the fingers.
  • Splitting an angle of motion across the finger joints - as an early prototype (or as a method for dealing with unknowns like finger segment lengths and rates of angle change), it would be useful to assume that all finger joints bend by the same amount as the finger is bent (a rough sketch of this assumption follows this list). Obviously, this is not the case: segment lengths differ and change the angles involved, and bending a finger is not a single motion. Each segment is potentially independently bendable, and they do not normally bend all at once: the base of the finger (where it meets the knuckle) bends first, followed by the middle segment, then the tip segment. I will not get into the thumb at this point because its range of movement is much larger, posing more problems that will be dealt with later.
  • Building a few prototypes or examples would be very helpful. 
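Here is a minimal sketch of that even-split assumption. Everything in it is invented for illustration: the segment lengths, the 2D side-view framing, and the rule that each joint gets exactly one third of the total bend.

import math

# Hypothetical segment lengths (in mm) for one finger: proximal, middle, distal.
SEGMENTS = [40.0, 25.0, 20.0]

def fingertip_position(total_bend_deg):
    # Fingertip position in a 2D side view, assuming the total bend is split
    # evenly across the three joints (the simplification above; real fingers
    # bend the base joint first).
    per_joint = math.radians(total_bend_deg) / len(SEGMENTS)
    x, y, heading = 0.0, 0.0, 0.0
    for length in SEGMENTS:
        heading += per_joint                 # each joint adds the same bend
        x += length * math.cos(heading)
        y -= length * math.sin(heading)      # bending curls the finger downward
    return x, y

print(fingertip_position(0))    # fully extended: (85.0, 0.0)
print(fingertip_position(90))   # 90 degrees of total bend, curled toward the palm
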
This weekend, I am going to get started on a new proposal for the Leap project. It may be wishful thinking, but hopefully something will come of it.

For my Nuisical project, I was able to get in touch with David Yang, a former Digital Media Design student who also did his senior design on NUIs. He also used the Kinect, but his implementation dealt with the physical manipulation of shapes in 3D space. The chat between the two of us was pretty long, but the takeaway points are:
  • Break down movements into small actions - by using velocity, relative change in position, etc., it is possible to identify specific movements and store them as 'recognizable gestures'.
  • Use hip position as a standard marker for gesture detection - the Kinect is very good at identifying the location of the hips in space. David suggests using this as the basis for comparing my gestures/movements (a sketch of this idea follows the list).
  • Establish a database of gestures - put simply, we need a database of previously recorded gestures to compare against when recognizing new ones.
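To make the hip-relative suggestion concrete, here is a minimal sketch of how comparison against a stored gesture database could work. None of this is actual Kinect SDK code: the joint names, the stored gesture, and the distance threshold are all placeholders, and a pose is just a dict of joint positions standing in for whatever the skeleton stream really returns.

import math

def hip_relative(pose):
    # Re-center every joint on the hip center so gestures compare the same
    # no matter where the user stands.
    hx, hy, hz = pose["hip_center"]
    return {j: (x - hx, y - hy, z - hz) for j, (x, y, z) in pose.items()}

def pose_distance(a, b):
    # Sum of joint-to-joint distances after both poses are made hip-relative.
    a, b = hip_relative(a), hip_relative(b)
    return sum(math.dist(a[j], b[j]) for j in a if j in b)

def recognize(live_pose, gesture_db, threshold=0.25):
    # Return the closest stored gesture, or None if nothing is close enough.
    name, stored = min(gesture_db.items(),
                       key=lambda kv: pose_distance(live_pose, kv[1]))
    return name if pose_distance(live_pose, stored) < threshold else None

# Tiny made-up database with a single "right hand raised" gesture.
gesture_db = {"hand_raised": {"hip_center": (0, 0, 2), "right_hand": (0.2, 0.6, 2)}}
live = {"hip_center": (0.5, 0, 2.1), "right_hand": (0.68, 0.62, 2.1)}
print(recognize(live, gesture_db))   # -> hand_raised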

Friday, February 1, 2013

In the midst-

I am currently in the midst of two project ideas.

The first is the one for which this research blog is named- Nuisical (pronounced like 'musical'). This was my first idea, focusing on producing a Natural User Interface for Sound Interaction and Composition. I have been pondering it for quite some time, but after finding that a great deal has already been done along similar lines using the Microsoft Kinect (see the V Motion Project), I am not certain the endeavor would (1) contribute much to the field, or (2) be interesting enough to propel me through the semester-long project.

That said, I have still been doing research on the topic, exploring Application Programming Interfaces (APIs) for the Kinect and audio platforms, as well as documentation on NUIs. I intend to attempt basic prototypes once I gain access to the Center for Human Modeling and Simulation, but that may not begin until some time next week (ID-card issues).

My second project idea is an extension of a multi-user American Sign Language gesture-translator prototype, called Social Sign, that I built with two other people during the 40-hour PennApps Spring 2013 hackathon. Social Sign used the new Leap Motion controller, a new gesture-recognition technology with unparalleled detection of nuanced hand and finger movements. There is great promise for building responsive, easy-to-use NUIs with the device.

The extension, for which I am currently working out the details, focuses on using the Leap Motion controller as a device to detect joint angles in hand and finger movements. To a degree, this information can be captured with wearable gloves, but the gloves do not provide enough information and restrict hand movement. The feat of the Leap Motion device is that it is entirely free of hand obstruction and is roughly the size of two stacked packs of gum. That is, it permits a user to move their hand freely. With information such as joint angles, rigging hand models for character animation could be as simple as waving a hand over the Leap Motion device.

I have been combing through the Leap Motion API to find out what sorts of data the device can return so that I can determine if it will be useful for obtaining joint angles. So far I have come up with the following:
  • List of fingers/pointables visible - objects are declared as tools or fingers
    • Finger: has a reference frame, id, and associated hand. The ID is unique while the hand is still in view, but it is lost once the object disappears from the Leap's visual field.
    • Frame: contains references to all the hands, fingers, and pointables in the scene. It can also report the rotation matrix/angle/axis, scale factor, and translation vector between two frames.
  • List of Hands visible
    • Hands have direction, fingers, an id, a frame. 
    • Palm-normal, palm velocity, palm position
    • Rotation angle/axis/matrix
    • Sphere Center, Sphere Radius (as if the hand were holding an invisible ball).
    • Translation
One of the difficult tasks in dealing with this data from the Leap Motion, at least from what I understand, is that fingers are reported as straight vectors, regardless of how they bend. I need to do a bit more research into this; it may be that joint angles simply cannot be recovered from the information provided. Tomorrow I will be meeting with one of my teammates to discuss the project further. Based on that conversation, my decision about switching may be finalized. (If that's the case, my blog will be re-locating to a more appropriate domain name.)
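For example, the most I can see getting out of those straight vectors is something like the angle between the hand's overall direction and a finger's direction, which is only a coarse proxy for an actual joint angle. The vectors below are made up; with the real SDK I assume they would come from the hand and finger direction data listed above.

import math

def angle_between(u, v):
    # Angle in degrees between two 3D direction vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Invented vectors: hand pointing straight ahead (toward the screen), finger
# reported tilted downward mid-bend, even though the real finger is curled.
hand_direction = (0.0, 0.0, -1.0)
finger_direction = (0.0, -0.5, -0.87)

print(angle_between(hand_direction, finger_direction))   # roughly 30 degrees of apparent bend
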

For now, I have put together some of the materials I will be reviewing over the next week for both projects, until I have a clearer sense of which one is more feasible for the semester and, more importantly, more interesting to me.

Nuisical Resources
  • Music & Computers - an online resource book that digs into the aspects of digital sound and computation.
  • Brave NUI World - a book by a designer and researcher of the Microsoft Surface that describes methods for developing NUIs (just received my copy).
  • KinectSpace - a new discovery which allows a user to train gestures into the Kinect for later recognition.
Gesture-Recognition and Translation for Sign-Language