Also, I am very happy that our first woGue interview is a reality after all, as it is quite hard for a new blog to get people talking. I can assure you that there will be more to come soon, so stay tuned everyone!
First of all, tell us a few things about you and your relation with the FOSS world.
I got seriously into the FOSS world when I started my degree in Computer Science at the University of Évora, Portugal. That university is very fond of Free Software, so most of the CS students get into Linux at some point. As part of the CS Students Association, I helped organize conferences related to FOSS, and I also created my first FOSS projects at the university, like BluePad or Rancho, among others. I am a member of the GNOME Foundation and I maintain a project called OCRFeeder, which I originally created for my MSc thesis and kept developing using the GNOME Project’s infrastructure. What motivates me most is creating new solutions to tackle problems that are not yet solved or for which there is no FOSS alternative yet.
What exactly is Igalia, and how can someone be a part of it?
Igalia is actually a company, but a very special one! It is run by a flat structure, which means no bosses and no hierarchies. All decisions are made by an assembly, and workers can become part of it once they have been in the company for more than a year. It has been running like this, working on Free Software, for more than 10 years now. If you have a Nokia N900 or N9, or use a WebKit-based browser, you’re running our code. The company’s main business is offering consultancy on Free Software technologies, as we happen to have very skilled individuals working on them.
How was Skeltrack born as an idea and as a project?
Well, we had recently created a new team, called the Igalia Interactivity team, to explore new ways of interacting with users and to improve the state of the Free Software technologies that power that. Many of these so-called interactive applications use depth cameras, like the Kinect. We sadly discovered that the only solutions for skeleton tracking using those cameras were proprietary and closed, which is not the way we want to do our work, so we decided to solve this problem ourselves and developed Skeltrack.
Skeltrack is briefly described as a library for human skeleton tracking from depth buffers. How does that work really?
Skeleton tracking is the process of determining the positions of a skeleton’s joints (in this case, the human skeleton) so we can tell, for example, where a certain joint is spatially and what it is: a shoulder, an elbow, etc.
The part about “depth buffers” is because Skeltrack is device agnostic, that is, it does not connect directly to the Kinect or any other device. Instead, programmers retrieve the depth information from those devices using other libraries and feed it to Skeltrack to obtain the skeleton information. This information can then be used to implement many ideas.
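To make this device-agnostic flow concrete, here is a toy sketch in Python. It is not Skeltrack’s actual C API (the function and joint names here are invented for illustration, and the “tracking” is a deliberately naive placeholder): the point is simply that a depth buffer is a 2D grid of distances coming from some device library, and a tracker turns it into named joint positions.

```python
# Conceptual sketch (hypothetical names, not Skeltrack's real API):
# a depth buffer is a 2D grid of distances; the tracker maps it to joints.

def track_joints(depth_buffer):
    """Toy 'tracker': report the closest valid point as a head candidate."""
    best = None  # (depth, row, col)
    for y, row in enumerate(depth_buffer):
        for x, depth in enumerate(row):
            if depth > 0 and (best is None or depth < best[0]):
                best = (depth, y, x)
    if best is None:
        return {}  # nobody in front of the camera
    depth, y, x = best
    return {"head": (x, y, depth)}

# The depth buffer would normally come from a device library such as
# GFreenect; here we fake a tiny 4x4 frame (0 = no reading, values in mm).
frame = [
    [0,    0,    0,    0],
    [0, 1200, 1100,    0],
    [0, 1300, 1250,    0],
    [0,    0,    0,    0],
]
print(track_joints(frame))  # → {'head': (2, 1, 1100)}
```

The real library does far more sophisticated analysis, but the shape of the interface is the same: depth buffer in, joint positions out, with the device-specific capture kept entirely outside.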
Are you currently the only developer, or is there a team of developers who collaborate to make Skeltrack better? Also, how can someone who is interested in your project get involved in its development, or just contribute an idea?
I am currently the only developer. The project is hosted on GitHub and licensed under the LGPL, and like any project Igalia has created, there are no funny copyright transfer agreements or other weird deals: just hack on it and send us patches. For ideas or bug reports, people can also use GitHub’s issue tracker or email me directly.
What do we need in order to test Skeltrack, and what can we do with it right now?
Skeltrack comes with an example that shows its use with the Kinect. This example uses the GFreenect library, which the Igalia Interactivity team also developed, to connect directly to the Kinect. It shows a window with two areas: one shows a drawing of the skeleton, connecting the joints that Skeltrack is detecting, and the other shows a video of what the Kinect is “seeing”. It also contains instructions on how to control the Kinect and Skeltrack’s features. This is the way to quickly test Skeltrack.
As for what can be done: since it is a library, it can be used for any use case that needs to know about the user’s skeleton joints. It currently gives 7 joints: the head, shoulders, elbows, and hands. With these joints, a world of ideas can be developed and, since it is Open Source, anyone can tweak it to their needs.
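As a rough illustration of consuming those joints, the toy Python sketch below (again with invented names, not a real Skeltrack binding) maps a hand joint from depth-image coordinates to screen coordinates, the kind of step a cursor-control application would perform.

```python
# Toy sketch (assumed joint names, not Skeltrack's API): map a tracked
# hand from depth-camera coordinates to screen coordinates.

DEPTH_W, DEPTH_H = 640, 480      # assumed depth camera resolution
SCREEN_W, SCREEN_H = 1920, 1080  # assumed target screen resolution

def hand_to_cursor(joints):
    """Scale the right-hand joint from depth-image to screen coordinates."""
    hand = joints.get("right_hand")
    if hand is None:
        return None  # joint not detected in this frame
    x, y, _depth = hand
    return (x * SCREEN_W // DEPTH_W, y * SCREEN_H // DEPTH_H)

# A hand at the centre of the depth image maps to the screen centre.
print(hand_to_cursor({"right_hand": (320, 240, 1500)}))  # → (960, 540)
```

A real application would also smooth the positions over time and handle frames where a joint is missing, but the mapping itself is this simple.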
Are there any projects that you know of, or collaborate with, aiming to create open source hardware for use with Skeltrack?
Since Skeltrack is device agnostic, there is no need for a specific device, but of course I would be very happy if someone produced an open hardware version of the Kinect or the Asus Xtion Pro.
Although Skeltrack can work on any desktop environment, you chose to showcase it on GNOME Shell. Is there a particular reason for this?
We need to distinguish between Skeltrack and the desktop control application I wrote. Skeltrack is the library used to get the positions of the user’s skeleton joints; on the application’s side, those are interpreted as gestures and turned into mouse and key events. Skeltrack is developed with GNOME technologies, particularly GLib, but it is in no way tied to the GNOME desktop, so the reason for choosing it is that I have been a GNOME user for many years and I am also a member of the GNOME Foundation. GNOME 3 and the GNOME Shell also provide a user experience that is getting very touch oriented, and while we do not use touch events in the demo, several of its particularities make it easier to control using gestures.
What is the biggest difficulty you have in the development of Skeltrack?
Skeltrack involves very interesting challenges. It uses mathematics to figure out where a user is and to infer the skeleton joints, as opposed to other approaches where, for example, a database of poses might be used for comparison. Making this fast, simple, and robust enough is very challenging, so this might be the biggest difficulty, but it is also what makes working on it so interesting.
Now that the smoothing is done, what will be the focus of development for the next few months?
One of Skeltrack’s current limitations is that it tracks only one user, who must be alone (no objects or other people in the depth field). I want it to detect more than one user at a time and to be robust enough to discard nearby objects. Another goal is to infer the rest of the relevant joints that are not currently tracked, like the hips, knees, and feet.
Can you tell us anything about when Skeltrack will be relatively stable and usable for “serious” work?
Skeltrack is only a few months old and I think there are plenty of interesting things that can be done now if other people want to join the project. The recent smoothing implementation already gives good stability to the tracked joints, so it depends of course on the “seriousness” of the work we’re talking about :) but I would say that we might have interesting improvements coming throughout the year.
Do you believe that interacting with desktop environments using only your bare hands is the future, or is it something that will find use only in some certain sectors?
I personally think that the way we currently interact with the computers at our desks is not going away for a long time, but we will surely see gesture or “bare hands” based control getting more common in our lives.
The living room might be one example: why use a remote if one can just wave at the TV? Advertising is also something that works very well with these new technologies; for example, passing by a store window and realizing that a display reacts to one’s gestures is engaging and attracts people’s attention. There is a never-ending number of ideas that these technologies make possible, and that’s what Igalia Interactivity is currently working on, while making sure that the technologies used are Open and available for all.
Thanks Joaquim! It was great getting to know more about this magnificent project of yours. I am sure that we will be getting more positive news from your side, and I hope we’ll see Skeltrack become a part of the GNOME DE, thus greatly expanding its potential!