
Leap Motion

Our first word:

Give Us A Hand

For the better part of the last century, interfacing with a computer has been based almost exclusively on the idea of touch. You hold a mouse. You type on a keyboard. You tap on a screen. Computers have been driven by haptics, the idea that you interface with the digital world through a physical medium.

But what if you could cut out the interface middleman? What if your hands alone, without a device to hold or touch, could interface with your computer to check email, launch programs, or play music and games?

This is the promise that the Leap Motion brings with it.

What? How?

The Leap, which is a small rectangular device about 6 inches in length, uses cameras and infrared tracking to monitor the position of your hands in space. If you imagine assuming a normal typing stance a few inches above your keyboard, the Leap would typically be positioned on a flat surface between your thumbs. After a little setup, it translates the movement of your hands (like swiping, opening and closing your fists, pointing, etc.) into meaningful actions that can be used to control your computer, allowing you to conduct basic operations without ever touching anything.

According to the folks at Leap, the sensor is capable of tracking movements in a rectangular space up to 2 feet above the sensor and 4 feet to the right or left of it. In our own experience, the range is a bit tighter, but you get the general idea.

The first time you flip through your music playlist by waving your hands like a wizard, you’ll be sure you’ve time travelled into some sort of amazing future of computing.

Where It Fits

While the Leap is capable of some basic control of your computer out of the box, it is ultimately an imperfect primary control device. In large part, this is because we have integrated so much typing into current computing practices, and training yourself to type without the force feedback of a keyboard or touch screen is incredibly difficult (jumping between a keyboard and the Leap is also an awkward proposition).

Fortunately, the Leap has its own app store where you can search hundreds of free and paid apps designed specifically to take advantage of a motion control device. While there are certainly a myriad of productivity and computer control apps, the true gems are those that design interactions specifically around the hand as a control device. I spent some time working in a sculpting app, rotating 3D models of the human skeletal structure, and exploring the world with a customized Google Earth app. There is also a developer SDK available, in case you are interested in leveraging the device for your own purposes.
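To give a flavor of what building on a motion controller involves, here is a toy sketch of turning a stream of raw palm positions into a discrete "swipe" gesture. The function name, thresholds, and sample data are illustrative assumptions for this post, not the actual Leap Motion SDK (which supplies its own frame and gesture APIs):

```python
# Toy illustration: classify a horizontal swipe from a short window of
# palm x-coordinates (in millimeters). The real Leap SDK does far more
# (per-finger tracking, built-in gesture recognition), but the basic
# idea is the same: raw positions in, meaningful actions out.

def detect_swipe(x_positions, min_distance=80.0):
    """Return 'left', 'right', or None for a sequence of palm
    x-coordinates sampled over a short time window."""
    if len(x_positions) < 2:
        return None
    displacement = x_positions[-1] - x_positions[0]
    if displacement >= min_distance:
        return "right"
    if displacement <= -min_distance:
        return "left"
    return None

# A hand moving steadily rightward across the sensor:
print(detect_swipe([0.0, 30.0, 65.0, 110.0]))  # right
# Small jitter around a resting position is ignored:
print(detect_swipe([0.0, 5.0, -3.0, 2.0]))     # None
```

An app might map "right" to skipping a track and "left" to going back, which is roughly the kind of wiring the Leap's app-store titles do for you.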



Step Into A World:  Zac Zidik explores a virtual classroom environment using the Oculus

The Leap brings the future we saw in Minority Report to life.

The VR Connection

When most people put on a VR headset, one of the first things they do is reach out, looking for their hands in the virtual space. Natively, VR headsets do not support hand tracking (at least not yet), which has created a new opportunity for the Leap to shine. By mounting the Leap to a VR headset (specifically the Oculus Rift, with whom Leap has a formal partnership), its hand tracking capabilities can be applied to VR space.

Somewhat ironically, the Leap might be even more useful in this context than it is as an interface device to more traditional, or even Leap-specific, programs. In our own experience, in an entirely virtual setting it’s a bit easier to let go of the need to “feel” something when you “touch” it, as compared to when you pair the Leap with a traditional computer and monitor. The brain is a strange beast.

The Opportunity

As we think about ways to virtualize more and more educational experiences (for example, to facilitate online learning), the Leap offers an intriguing option as a means to facilitate what we’ll call “semi-natural” experiences. True, when you throw pottery in a Leap-facilitated studio, or dissect an organ in a Leap-facilitated lab, you don’t feel the clay or the scalpel in your hands, and important nuances like pressure are lost. But as an introductory experience, or for someone who lacks any other alternative, the Leap has the potential to serve as the next best thing to using your real hands.

As an interface device to more traditional computing, there is ample potential, but leveraging it requires something of a paradigm shift in how we think about interacting with devices, and no small amount of personal training to get used to the idea of doing something with your hands without feeling a response.

The Challenge

Arguably the biggest challenge in an academic environment is scalability. The Leap is a fairly personalized device in terms of setup and calibration – not the kind of thing that is ideally suited to being set up in a public space for general consumption. Using it effectively also takes some practice, for both technical and sensory reasons. The Leap is a little like skiing – it’s easy to get started but difficult to get good at.

Summary


Intrigued?

Are you interested in trying out the Leap Motion? Are there questions you have about the device or naturalistic input devices? Have you used it and would like to share your experiences?