SAN FRANCISCO – I love watching people experience virtual reality for the first time. The cumbersome headsets exaggerate their movements as they scan left to right, and then nod up and down. And every time they look in another direction and spot something new, you can almost feel their amazement through the goggles.
Lately, I feel as though I’ve put on a VR headset and never taken it off. Because every time I turn, it seems, I spy something truly inspirational. There’s usually plenty to see this time of year, because it’s the season when tech giants like Apple, Google and Microsoft showcase the latest technology, and coax developers to incorporate it into their applications. In fact, the Samsung Developer Conference just wrapped up here.
But this is no ordinary spring, ladies and gents. It is shaping up to be the High-Tech Spring, a rare period of hyper-innovation made possible by a perfect storm of computational horsepower, portability, communication and cognitive ability that’s amassing in the cloud, on the street, around the house, inside our pockets, and on our wrists. Oh yeah, and over our eyes.
All this pie-in-the-sky innovation is at odds with the past month’s string of layoffs and weak earnings reports in the sector, I know. But understand this: my “virtual” VR goggles aren’t rose-colored. It’s true that many high-tech companies are having difficulty maintaining investments in the future as they cope with today’s slow-growth business climate. But they’ll find a way. They have to. Because in this spring of springs, early sprigs of tech’s Next Big Things are just starting to poke through the ground.
In addition to VR, which takes us to new worlds, the season also features innovations in artificial intelligence and computer vision – two technologies that combine to help machines negotiate and interact with our world. All three were featured prominently at Samsung’s event. In fact, the company announced during the keynote that the Gear 360, Samsung’s first VR camera for consumers to make their own 3D content, would go on sale Friday.
On Thursday, Google CEO Sundar Pichai wrote in the tech company's annual founders' letter that the next big evolution for technology is artificial intelligence, and the very concept of the device will likely "fade away."
Late last month, Saqib Shaikh, a blind software engineer at Microsoft, unveiled his experimental “Seeing AI” glasses at the company’s Build event. In a video the company debuted during the keynote, Shaikh is seen sliding a finger along the glasses rim in the direction of a noise he hears on the street, and Seeing AI says, “I think it’s a man jumping in the air doing a trick on a skateboard.” It was.
The Seeing AI app marries both computer vision and artificial intelligence to help people with visual impairments navigate the world, though developers are pairing the technologies to solve myriad problems.
A video that Nvidia CEO Jen-Hsun Huang introduced at the company’s Graphics Technology Conference earlier this month offers a glimpse of its ad-hoc driving school for autonomous cars. In it, Nvidia engineers teach DaveNet, a deep learning engine in an experimental car, by example. At first, the engineers intervened often as DaveNet swerved off the road, brushed trees and smashed cones. But after a month and 3,000 miles of learning, DaveNet was able to traverse both paved and dirt roads, in traffic, rain or shine.
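For readers curious how "teaching by example" works under the hood, here is a deliberately tiny sketch of the general idea, known as behavioral cloning: record a human driver's corrections alongside what the car's sensors see, then fit a model to imitate them. Everything here is illustrative, not Nvidia's actual DaveNet code; the single "curvature" sensor, the one-weight model and the numbers are invented for the sake of the demonstration.

```python
import random

random.seed(0)

# The "human demonstrator": for this toy, the driver steers in proportion
# to the road's curvature. The model never sees this rule directly; it only
# sees recorded (sensor reading, steering angle) pairs.
def human_steering(curvature):
    return 2.0 * curvature

# Stand-in for 3,000 miles of recorded driving.
demos = []
for _ in range(3000):
    c = random.uniform(-1.0, 1.0)
    demos.append((c, human_steering(c)))

# A one-weight "network" trained by stochastic gradient descent to
# minimize the squared difference from the human's recorded steering.
w = 0.0
learning_rate = 0.1
for curvature, target in demos:
    prediction = w * curvature
    w -= learning_rate * 2 * (prediction - target) * curvature

print(round(w, 2))  # settles near the human policy's gain of 2.0
```

Early in the loop the model's steering is badly wrong, much as DaveNet swerved and clipped cones at first; with enough demonstrations, its predictions converge toward the human's behavior. Real systems replace the single weight with a deep neural network and the single curvature number with camera images, but the teach-by-demonstration principle is the same.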
During his GTC keynote, IBM Watson’s Chief Technology Officer Rob High said his teams focus not only on building computers that can think and learn, but also on how those computers interact with us. “We want them to be right at least as often as we are,” High said. “But more than that, we want humans to trust them.”
So rather than forcing humans to adapt to how computers operate, High said, IBM is trying to teach computers to communicate with us on our terms. Watson is becoming something of a pop psychologist, which is a big challenge, he said, because human interaction is far more imprecise and circuitous than computer language.
TEACHING THE COMPUTERS
Humans also pepper their speech with innuendo, slang, and inference. And our tone, facial expression and posture give clues as to what we’re really saying. Imagine how much insight a computer would need to be able to recognize that when your partner tells you something is “fine,” it’s so not.
IBM's High presented a video of an experimental robot powered by Watson’s human-like communication abilities, and the visual cues it uses to convey that it’s listening intently to questions and thinking about how to answer are impressive.
The current business difficulties are real and painful. But in the grand scheme of things, it’s a bump in the road. I’m not overly concerned.
More than anything, what worries me is whether the ongoing battle between Apple and the feds over our fading First Amendment right to privacy is going to stunt prospects for all this innovation. IBM’s High is right: trust is critical. And communicating naturally with machines is an important piece of that. More than that, though, if we don’t have assurances that these devices will keep secure and private what they learn about us, then we’re not going to let them in.
That would be a shame. Because this stuff has so much potential to make us safer, healthier and more effective. But only if it makes it out of the labs and into the world. The real world, that is.
You know, the place we find ourselves after we’re done playing with the VR goggles.
Mike Feibus is principal analyst at FeibusTech, a Scottsdale, Ariz., market strategy and analysis firm focusing on mobile ecosystems and client technologies. Reach him at firstname.lastname@example.org. Follow him on Twitter @MikeFeibus.