George Matus was still in high school when he began raising millions for his startup, Teal. The former quad drone racer's pitch to investors was a wish list of what he thought a drone should be. More than just an aerial camera, his quad would be freaky fast and easy to use — even fly in the rain. And, most challenging of all, Teal would think and learn. It would be a platform that developers might use for all kinds of complex applications, from counting a farmer's cows to following a target without using GPS. To do all that, Teal would need a tiny supercomputer…and a digital brain. That would have been impossible just a couple years ago.
But a handful of new technologies — sprung from research labs, small startups, and major tech companies — have converged to make this kind of innovation possible. They're paving the way for quadcopters and self-driving cars that can navigate by themselves, recognize what they're seeing, and make independent decisions accordingly, freeing them from the old need for an internet connection.

Breakthroughs in artificial intelligence (AI) lie at the root of this advancement. AI, the scientific shorthand for a machine's ability to copy human traits like thinking and learning, has transformed how we use technology. It now permeates our lives through Apple's Siri, Google search, and Facebook newsfeeds. But that tech taps into the cloud. Ask Siri for help splitting the dinner tab, and your voice is sent off to Apple servers for some speedy calculations. It doesn't work without the web, or often even with it.

"Robots and UAVs can't depend on that connection back to the data center," says Jesse Clayton, Nvidia's senior manager of product for intelligent machines. Imagine the delay if your quadcopter's live feed had to bounce off the cloud before a computer could calculate the safest route. You'd be better off flying manual. That bottleneck has companies racing to build tiny, AI-capable supercomputers.
If I Only Had a Brain
When Max Versace started working on AI algorithms 25 years ago, computers weren't advanced enough to achieve his vision of an artificial brain. But by 2006, he and a colleague had cooked up a method for computing AI algorithms much faster. They patented it and formed a company, Neurala, around their equations. Then DARPA, the U.S. government's secretive military research agency, asked Neurala to build a software system that could emulate a fully functional brain. The physical part of that brain is made from computer processors built by Hewlett Packard and IBM. Neurala wrote the software. "In a sense, we build minds, which are algorithms," Versace says.

Neurala took its inspiration from a rat brain. With just half a gram of gray matter, a rodent can navigate obstacles, forage for food, and evade predators using complex and efficient senses. Yet its brain is far simpler to model than a human brain.

Once Neurala built DARPA this fake brain, or neural network, NASA asked the company to make it work in a Mars rover. It can take half an hour to bounce signals off the Red Planet and hear back, which makes it somewhat tough to steer a robot. NASA wanted the rover to be able to make more decisions on its own. Neurala's brain never flew to Mars, but that request pushed the company to start working on autonomous robots with artificial brains. And that brain will soon power Teal's drones.
Three major advances have made the fusion of drones and AI possible. In recent years, researchers have amassed staggering amounts of data — mainly, vast image sets. This data is the proving ground for training new and complicated AI algorithms, the second major advance. This progress in AI allows self-driving cars to recognize and track obstacles on the road. But that skill doesn't matter much if you can't free it from a supercomputer. So the third major advance had to come from new computer processors.

"We are really at the invention of the wheel in terms of AI," Versace says. "This is just the beginning." Versace adds that many current AI algorithms are trained on a supercomputer and then immediately stop learning. He compares it to graduating from college at 25 years old and never getting any smarter. "You go to work every day, perform your duties, wake up tomorrow and you don't know anything new," he says. "You just know what you learned the last day of school." But he believes AI shouldn't stop learning. "We have come up with a different solution, which relies on how the brain works, how the cerebral cortex works," he adds. "It enables machines to learn a little bit every day, every time they're used."

Versace and other scientists are now working on what's called deep learning: You show a computer thousands of pictures of pedestrians, and eventually it will spot a little old lady in a crosswalk that it's never seen before. Today, that kind of processing usually happens in the cloud.
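The "show it thousands of pictures" idea can be sketched in miniature. Below, a single artificial neuron (logistic regression, the simplest building block of the deep networks Versace describes) learns to tell "pedestrian" from "not pedestrian" by gradient descent. The two-number "image features" and their names are invented for illustration; this is not anything Neurala actually ships.

```python
import math
import random

# Toy sketch of supervised learning: one artificial neuron learns
# "pedestrian vs. not" from fabricated 2-D features. Real deep
# learning stacks many such units and trains on millions of images.
random.seed(0)

# Fake examples: (height_ratio, motion_score) -> 1 = pedestrian, 0 = not
train = [((2.0 + random.random(), 0.8 + 0.2 * random.random()), 1) for _ in range(50)]
train += [((0.5 + 0.5 * random.random(), 0.3 * random.random()), 0) for _ in range(50)]

w, b, lr = [0.0, 0.0], 0.0, 0.5

def predict(x):
    # Weighted sum squashed to a 0-1 "pedestrian probability"
    z = w[0] * x[0] + w[1] * x[1] + b
    z = max(-60.0, min(60.0, z))        # guard against overflow
    return 1.0 / (1.0 + math.exp(-z))

for _ in range(200):                    # "show it the pictures" many times
    for x, y in train:
        err = predict(x) - y            # gradient of the log-loss
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

# Points the model has never seen before:
print(round(predict((2.5, 0.9))))       # a "pedestrian" -> 1
print(round(predict((0.6, 0.1))))       # not one        -> 0
```

The little old lady in the crosswalk is the same trick at scale: the trained weights generalize to inputs that never appeared in the training set.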
The Body to Match
Enter Nvidia. The company is best known for graphics processors, or GPUs, for video games; it invented the tech. But in recent years, Nvidia and other companies have shown GPUs are great for more than playing Halo 5. They're also well suited for parallel computing, in which a large problem is broken into many smaller ones that are calculated all at once. It's much faster than using standard microprocessors.
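That decomposition can be sketched in a few lines. Here Python threads stand in for the thousands of GPU cores that would run the pieces in hardware: one big image-brightening job is cut into independent chunks, each chunk is processed concurrently, and the results are stitched back together. The data and chunk size are arbitrary, chosen only for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

# Parallel computing in miniature: split one big job into many small
# independent pieces, compute them concurrently, combine the results.
def brighten_chunk(pixels):
    # Each worker handles its own slice of the "image" independently.
    return [min(255, p + 40) for p in pixels]

image = list(range(256)) * 4            # a fake 1-D "image" of 1024 pixels
chunks = [image[i:i + 256] for i in range(0, len(image), 256)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = pool.map(brighten_chunk, chunks)

brightened = [p for chunk in results for p in chunk]
print(len(brightened), brightened[0], brightened[-1])   # 1024 40 255
```

The key property is that no chunk depends on any other, so all of them can run at the same time; that independence is exactly what makes a problem GPU-friendly.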
So, at the same time algorithms like Neurala's were advancing enough to become practical outside a lab, Nvidia was turning its GPUs into tiny supercomputers. In 2014, Nvidia launched Jetson, an AI-enabling, credit-card-sized brain that functions "on the edge," which is just a jargonized way of saying "no cloud needed." You can use it to build a robot brain that learns. "Practically, what that means is that researchers can train a neural network in hours or days on what would once take months or years," says Nvidia's Clayton.

It may seem like this tech is still a little ambitious, but Nvidia's financials say otherwise. The company's move into AI-driven robotics caused its stock to skyrocket from under $30 per share at the start of 2016 to well over $200 this fall. And this tech is already in use: One company using Nvidia's Jetson is commercial drone maker Kespry. You might have seen President Donald Trump holding Kespry's drone during his meeting with UAV industry leaders earlier this year — the first drone ever inside the White House.
'No shrinking violets' in room with Trump as drone CEOs lobbied hard for looser regulations @Kespry... https://t.co/sxf276IBXS
— UAS Insurance (@UASInsurance) June 23, 2017
Kespry bills itself as an automated "end-to-end" solution. The company's founders aren't "drone people," and neither are its customers. Instead, Kespry is after high-quality data that's easy to capture. "It's an irony that all these companies that talk about automation also provide manual controls with joysticks to their customers that, if it's fully automated, it really shouldn't need," says David Shearer, the company's VP of marketing.

Jetson — and Kespry's homebuilt hardware — enables Kespry's drone to fly with minimal user input. And once it's done, it automatically uploads its data for processing by another AI algorithm. The tech has allowed the startup to grow fast. Kespry entered into a partnership with John Deere earlier this year that will put its drones in dealerships around the country. And the company now has thousands of drones deployed, with plans to target the insurance and energy industries by taking on previously labor-intensive tasks like roof and drill-rig inspections. Jim Alison, Kespry's VP of engineering, says the technology now exists to enter those markets. But first Kespry has to collect enough images of hail-damaged roofs and oil derricks to train its AI what to look for.
Your Own Flying Brain
Teal is already selling its dumbed-down but ultra-fast Sport version (at a suggested retail price of $799). And in 2018, the company says, its Teal 2 will hit the market powered by Nvidia's Jetson GPU and Neurala's artificial brain. "That's the real differentiator of our drone," says Bob Miles, Teal's head engineer and product manager. The prototypes already have the tech. "So many of the drones out there perform computational work at the controller or in the cloud," Miles says. So when you use your drone's follow-me function, the number crunching happens on your cell phone. Give the drone a brain instead, and it can process the information itself, without the delay of beaming that data somewhere else. That could cure problems other drones have had with non-GPS-based tracking software.
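The latency argument can be put in rough numbers. All of the figures below are illustrative assumptions, not measurements from Teal or anyone else: a 30 fps camera gives each frame roughly a 33-millisecond budget, which a hypothetical on-board inference pass can meet and a cloud round trip cannot.

```python
# Back-of-the-envelope frame-budget arithmetic. Every latency number
# here is an assumption for illustration, not a measured value.
FPS = 30
frame_budget_ms = 1000 / FPS             # time available per video frame

onboard_inference_ms = 20                # assumed on-device neural-net pass
cloud_round_trip_ms = 150                # assumed uplink + server + downlink

print(round(frame_budget_ms, 1))                 # 33.3
print(onboard_inference_ms <= frame_budget_ms)   # True: tracker keeps up
print(cloud_round_trip_ms <= frame_budget_ms)    # False: always frames behind
```

Under these assumptions a cloud-based tracker is permanently several frames behind a moving subject, which is the bottleneck on-board brains like Jetson are meant to remove.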
However, despite Teal's lofty goals, the company is keeping things pretty simple at Teal 2's launch. The drone will reach early adopters with just a command-and-control app and a follow-me function. But by building an open-source platform and working with outside developers, Teal is forecasting a not-too-distant future in which its customers bring the truly unique capabilities. "Without even putting out the call, we've received a lot of emails from developers about when we're going to be able to release the app to them," Miles says. Several of those inquiries have come from farmers interested in counting their cows and taking stock of water levels on large properties. Another app is already in the works with a search-and-rescue company hoping to make use of Neurala's brain. That company wants a Teal drone that can fly search-and-rescue routes, imaging constantly until it finds a match for the lost subject, then return home and downlink the pertinent images and GPS coordinates. And if none of those smart skills suit you as a consumer, you can still reap the benefits. Just be thankful for a future of smarter, safer drones.