Virtual Reality allows us to see new worlds as if we were inside them, but without the ability to touch and interact with them, it will never be complete. I've written down some of my thoughts on finger tracking for virtual reality.
One of the most common reactions to putting on the Oculus Rift for the first time is holding your hands in front of your eyes and asking why you can't see them. Right now, you're invisible in many VR experiences. Sure, you can look down and see a body, but it doesn't do anything that your body does – it's not really yours. The easy workaround is to position yourself the same way as your VR counterpart to increase immersion.
But doing VR right isn't going to be easy. The head mounted display takes care of only the visual part, and there's a lot more to experiencing a world than just seeing it. We have a very long road ahead to cover all the other senses, but even now some technologies are exciting and quite useful for VR – namely finger tracking.
Technologies like Leap Motion, ControlVR or Nimble Sense resolve the common question “Where are my hands?!” Sure, you could've used a Hydra/STEM, but I don't grab an object and move it in front of my face just to see my hand... that's what I do when I want to see an object I've grabbed, which is what devices like the STEM are good for – holding things in VR, like mech control handles or guns.
Don't get me wrong, they have amazing applications as well and I have great plans for such controllers in VR, involving some over-the-top fast-paced action with unique hi-tech virtual “guns” that make you feel like you're holding a portable nuclear bomb. Using them for various precise tool tracking applications, such as 3D drawing or modelling, makes perfect sense as well, since you're actually holding some tool, instead of using your bare hands.
However, what I want is to see my hands as I would see them in real reality (can we call it RR? :3). What's more, I want to use my fingers – and this is where devices like the STEM come up short. They can track the overall position of my hand, but not my fingers – it's effectively like having stubs in place of your hands.
A very early version of World of Comenius using the Hydra controllers.
When Leap Motion introduced VR tracking, I quickly glued my Leap Motion to the Oculus and started playing around and integrating it with my own applications, namely World of Comenius. At the time, it was sadly glitching to the point of being unusable. Luckily, the people at Leap Motion were very helpful, listened to the feedback and criticism, and over time improved the tracking software significantly – and they're likely to continue down that road.
An early version of the Leap Motion integration, with quite painful glitches that made it mostly unusable.
Fingers are extremely versatile appendages. Just think about how many things we do with them. We write, we paint, we type, we play games, interact with a variety of devices, or even gesticulate. The article you're reading is the result of a series of fine movements of my own fingers, and so was the virtual reality software I have created.
That software is built on top of the work of a lot of other people – from Leap Motion, Oculus and Unity – running on hardware designed by thousands of engineers, input into computers with their own fingers. Much of the technology, software, art and entertainment in our society was expressed and built through long series of fine finger motions.
Interacting with virtual reality using your fingers is therefore crucial, as it opens up a plethora of options. If you want to grab an object in front of you... well, you just go and grab it. That sounds better than moving a controller thingy near it and pressing and holding a button to express the same action. No, not that one! Move your finger higher. No, not that high – ugh, let me just position it for you, since you can't see through the Oculus...
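A natural grab gesture can be approximated surprisingly simply from tracked fingertip positions. As an illustrative sketch (not the actual World of Comenius code – the thresholds and names here are my own assumptions), a pinch-grab can be detected from the thumb–index fingertip distance, with two thresholds (hysteresis) so tracking jitter doesn't make the grab flicker on and off:

```python
import math

# Hypothetical thresholds in millimetres – tune for your tracker.
PINCH_START_MM = 25.0  # fingertips closer than this: start grabbing
PINCH_END_MM = 40.0    # fingertips farther than this: release


def distance(a, b):
    """Euclidean distance between two 3D fingertip positions."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))


class PinchDetector:
    """Turns noisy per-frame thumb/index positions into a stable grab state.

    Using two thresholds instead of one means a fingertip hovering right
    at the boundary won't rapidly toggle the grab on and off.
    """

    def __init__(self):
        self.grabbing = False

    def update(self, thumb_tip, index_tip):
        d = distance(thumb_tip, index_tip)
        if not self.grabbing and d < PINCH_START_MM:
            self.grabbing = True
        elif self.grabbing and d > PINCH_END_MM:
            self.grabbing = False
        return self.grabbing
```

Each frame, you'd feed in the fingertip positions from whatever tracker you use, and attach or release the nearest virtual object when the grab state changes.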
Of course, things are still far from perfect. Not only are the finger tracking solutions glitchy, high-latency or expensive, but we're also completely missing tactile sensation. We can interact with the world, but we can't feel it yet. It's like interacting with intangible holograms – which is the narrative I've actually chosen for World of Comenius, including a hologram-intersection effect when you put your virtual hand through a virtual object.
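An intersection effect like that can be driven by something as simple as penetration depth. As a hedged sketch (not the actual implementation – the sphere shape and the falloff constant are assumptions for illustration), here's how far a fingertip has pushed inside a spherical object could map to a 0–1 effect intensity:

```python
import math


def penetration_depth(fingertip, sphere_center, sphere_radius):
    """How far the fingertip is inside the sphere (0.0 if outside)."""
    d = math.dist(fingertip, sphere_center)
    return max(0.0, sphere_radius - d)


def intersection_glow(fingertip, sphere_center, sphere_radius, falloff=0.02):
    """Map penetration depth to a clamped 0..1 effect intensity.

    `falloff` (metres) controls how quickly the effect saturates as the
    finger pushes deeper; the default here is an arbitrary assumption.
    """
    depth = penetration_depth(fingertip, sphere_center, sphere_radius)
    return min(1.0, depth / falloff)
```

In an engine like Unity, this intensity would typically be passed to a shader that brightens or distorts the object's surface near the intersecting hand.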
World of Comenius in its current stage; the Leap Motion tracking improvements have made it quite usable in most cases.
Fortunately, solutions are already appearing, such as using an ultrasound array to give a tingly feeling to our fingers in mid-air. Now we've got some semi-tangible holograms that we can interact with, which is a lot better.
But even with mid-air haptics, an important element is missing: our sense of proprioception. When the fingers get near an object, sure, it starts tingling, but we can still put our hand right through it. We can't press our fingers against its surface, or grab it and have the sense of not being able to push through – our fingers staying in place despite the pressure we're exerting.
We're not interacting with tangible holograms yet – still just holograms. I'm not sure what the solution to proprioception is going to be. Maybe a simple one could involve pads on fast robotic arms, combined with the finger tracking, quickly moving to wherever the fingers are about to touch and pushing back against them?
I don't know, but I certainly hope it gets resolved before we go full VR and just stimulate our brains directly and read their outputs – because that's probably going to take a while.
Meanwhile, we'll have to make do with what we have. It may be glitchy and far from perfect, but it's still fun to play with. No technology was ever perfect from the beginning, and ambitious goals take a lot of work to achieve, so we can at least enjoy the increasingly convincing glimpses of perfect VR as we progress.
Coming next: Behind the scenes of SightLine.