The road ahead for AI
Artificial intelligence (AI) is everywhere.
Even in the unlikely event that you have not heard of AI, you have probably already used it, according to Mr Marc Hamilton, vice president of solutions architecture and engineering at NVIDIA, one of the world's leading manufacturers of graphics processing units (GPUs).
"If you do a search on Google, use Siri or Cortana, or shop on Amazon, all of those systems have deep neural networks that are trained on GPUs," explained Mr Hamilton.
He was speaking at the EmTech Asia conference, organised by the MIT Technology Review and held in Singapore on 14 and 15 February 2017. What is even more impressive is that GPU-enabled AI and deep learning have found their way into a diverse range of industries in a relatively short time.
In contrast, previous advances such as mobile and cloud computing took more than ten years to mature from theory to applications, he said.
"To give you one metric of the speed of adoption, consider the ImageNet contest, a classic computer vision contest. In 2011, there were no entries that used GPUs; they were all writing 'if-then-else' code."
Then in 2012, a team won the contest not by writing 'if-then-else' code, but by training a deep neural network.
Mr Hamilton added: "The following year about 80 percent of the participants used GPUs, and by 2014 there was no one left writing traditional computer vision code."
Swimming in the data deluge
Despite how rapidly AI and deep learning algorithms have come to dominate nearly every field of computer science in recent years, the technology is actually not new.
In fact, it has been around in academia for at least 30 to 40 years, but never quite caught on beyond it, Mr Hamilton said.
But with the rise of the web over the past two decades, computers suddenly began generating far more data than could be handled at first.
"Many consumer web companies at the time really struggled just to store the data. The difference with AI is that rather than being deluged by all that data, AI actually works better the more data you have," Mr Hamilton said.
To build a deep neural network, researchers 'train' computers by running billions or even trillions of calculations known as floating point operations, or flops.
This training is conducted on GPUs, which can do in a few hours what would take a central processing unit (CPU) 30 days.
"Once that deep neural network model is trained, for example to recognise a picture of a car or a cat, you can move it over to an end user application such as a self-driving car. Then you can pass the system an image that it hasn't seen before, and it can infer what's in the image."
The power of AI and deep learning has captured the attention of young computer and software engineers, he added. Nine out of ten job applicants he interviews today are interested in working in AI, up from just a year or two ago, when people were only starting to talk about the field.
"It's an amazing time for the computer science industry; we're seeing a very rapid change in the research coming out from the universities and industry. Institutions from Google to hospitals are now all paying attention to deep learning."
AI in the driver's seat
"It's hard to think of an industry that isn't using AI," he continued.
"However, there's no field that is as close to providing valuable results as the world of autonomous vehicles."
NVIDIA itself has been at the forefront of testing and developing self-driving cars, Mr Hamilton said, even though it is not an automotive company.
Working with cars provided by the Ford Motor Company, NVIDIA engineers have developed a self-driving car that was trained to drive using steering wheel angles applied by human drivers rather than lane markings.
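Training from steering angles rather than lane markings is a form of supervised regression, sometimes called behavioural cloning. Here is a minimal sketch, with random tensors standing in for logged camera frames and the angles the human driver applied at the same instant (the architecture and numbers are illustrative assumptions, not NVIDIA's actual system):

import torch
import torch.nn as nn

# Hypothetical driving log: camera frames paired with the steering wheel
# angle the human driver applied when each frame was captured.
frames = torch.randn(256, 3, 66, 200)  # stand-in camera images
angles = torch.randn(256, 1)           # stand-in steering angles

# A small convolutional regressor from a frame to a single steering angle.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 1),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()  # penalise deviation from the human's angle

for epoch in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(frames), angles)
    loss.backward()
    optimizer.step()

At drive time the same network reads the live camera feed and emits a steering angle directly, rather than first detecting lane markings.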
Tesla Motors has built an NVIDIA supercomputer into every car sold since October last year. Small enough to fit behind the glovebox, it delivers forty times the processing power of the previous system.
All that computing horsepower is required because self-driving cars make use of many different types of neural networks to drive, Mr Hamilton explained.
In addition to the 20 teraflops used for traditional AI processing, more computing power is needed to process other types of data that self-driving cars use to make sense of the roads, such as high definition maps.
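To see why the compute bill grows, imagine several specialist networks all inspecting every camera frame. The decomposition below (obstacles, lane position, traffic signs) is a common textbook split used purely for illustration, not a description of NVIDIA's stack:

import torch
import torch.nn as nn

def make_net(out_features):
    # Placeholder backbone; production perception networks are far larger.
    return nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(16, out_features),
    )

detector = make_net(4)   # e.g. a bounding box for the nearest obstacle
lane_net = make_net(2)   # e.g. lateral offset and heading within the lane
sign_net = make_net(10)  # e.g. scores over ten traffic sign classes

frame = torch.randn(1, 3, 224, 224)  # one frame; a car sees dozens per second
with torch.no_grad():  # inference only, yet every network runs on every frame
    obstacles = detector(frame)
    lane_pose = lane_net(frame)
    sign_scores = sign_net(frame)

Multiply each network's cost by the frame rate and the number of cameras, and the teraflops add up quickly, before any map processing is counted.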
"Auto accidents are one of the leading causes of death around the world, so being able to make vehicles safer would have a huge impact. In addition, here in Singapore, 26 percent of the landmass is covered by roads. If you could switch to autonomous vehicles, there would be fewer vehicles on the road, letting you potentially re-use some of that landmass," Mr Hamilton said.
"Transformational applications like these are why deep learning is really bringing back a new revolution in computing."