NVIDIA - Researchers Deploy GPUs to Build World's Largest Artificial Neural Network
Posted by: Abhinav Jha at 18-06-2013 23:50:05
Tags : NVIDIA GPUs
GPU-Accelerated Machine Learning and Data Mining Poised to Dramatically Improve Object, Speech, Audio, Image and Video Recognition Capabilities
NVIDIA today announced that it has collaborated with a research team at Stanford University to create the world’s largest artificial neural network built to model how the human brain learns. The network is 6.5 times bigger than the previous record-setting network developed by Google in 2012.
Computer-based neural networks are capable of “learning” how to model the behavior of the brain, recognizing objects, characters, voices and audio in the same way that humans do.
Yet creating large-scale neural networks is extremely computationally expensive. For example, Google used approximately 1,000 CPU-based servers, or 16,000 CPU cores, to develop its neural network, which taught itself to recognize cats in a series of YouTube videos. The network included 1.7 billion parameters, the virtual representations of connections between neurons.
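To put those figures in context, a network’s parameter count is simply the number of learnable weights and biases connecting its layers. A minimal Python sketch with made-up layer sizes (not the architecture of the Google or Stanford networks) shows how quickly the count grows:

    # Rough illustration: counting parameters in a fully connected network.
    # Layer sizes are hypothetical, chosen only to show the arithmetic.
    layer_sizes = [10000, 8000, 8000, 4000]

    total_params = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        # Each pair of adjacent layers contributes a weight matrix
        # (n_in * n_out connections) plus one bias per output neuron.
        total_params += n_in * n_out + n_out

    print(f"{total_params:,} parameters")  # ~176 million for these sizes

Even this modest four-layer example lands in the hundreds of millions of parameters, which hints at the compute required to reach billions.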
In contrast, the Stanford team, led by Andrew Ng, director of the university’s Artificial Intelligence Lab, created an equally large network with only three servers using NVIDIA® GPUs to accelerate the processing of the big data generated by the network. With 16 NVIDIA GPU-accelerated servers, the team then created an 11.2 billion-parameter neural network, 6.5 times bigger than the network Google announced in 2012.
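The speedup GPUs offer comes from offloading the dense matrix arithmetic at the heart of neural network training. A minimal sketch using PyTorch, a modern framework chosen here purely for illustration (it is not the software the Stanford team used), with hypothetical sizes:

    import torch

    # Run on the GPU if one is available, otherwise fall back to the CPU.
    device = "cuda" if torch.cuda.is_available() else "cpu"

    # Hypothetical layer and batch sizes; the point is that the same
    # matrix multiply runs unchanged on either device.
    weights = torch.randn(8000, 10000, device=device)
    batch = torch.randn(10000, 256, device=device)

    activations = weights @ batch  # dense matmul, the core cost of training

At billions of parameters this arithmetic dominates training time, which is where a handful of GPU-accelerated servers can stand in for a thousand CPU-based ones.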
The bigger and more powerful the neural network, the more accurate it is likely to be in tasks such as object recognition, enabling computers to model more human-like behavior. A paper on the Stanford research was published yesterday at the International Conference on Machine Learning.
“Delivering significantly higher levels of computational performance than CPUs, GPU accelerators bring large-scale neural network modeling to the masses,” said Sumit Gupta, general manager of the Tesla Accelerated Computing Business Unit at NVIDIA. “Any researcher or company can now use machine learning to solve all kinds of real-life problems with just a few GPU-accelerated servers.”
GPU Accelerators Power Machine Learning
Machine learning, a fast-growing branch of the artificial intelligence (AI) field, is the science of getting computers to act without being explicitly programmed. In the past decade, machine learning has given us self-driving cars, effective web search and a vastly improved understanding of the human genome. Many researchers believe that it is the best way to make progress towards human-level AI.
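In code terms, “without being explicitly programmed” means fitting a model to examples rather than hand-writing rules. A minimal sketch using scikit-learn (an illustrative library choice, not one mentioned in the article):

    from sklearn.linear_model import LogisticRegression

    # Toy examples: hours studied -> passed exam. No rule is written;
    # the decision boundary is learned from the data.
    X = [[1], [2], [3], [4], [5], [6]]
    y = [0, 0, 0, 1, 1, 1]

    model = LogisticRegression().fit(X, y)
    print(model.predict([[3.5]]))  # prediction is learned, not programmed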
One of the companies using GPUs in this area is Nuance, a leader in the development of speech recognition and natural language technologies. Nuance trains its neural network models to understand users’ speech using terabytes of audio data. Once the models are trained, they can recognize spoken words by matching them against the patterns learned during training.
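That two-phase workflow, train on labeled audio and then match new input against the learned patterns, looks roughly like the sketch below. The features and model here are placeholders for illustration, not Nuance’s actual pipeline:

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # Placeholder features standing in for audio frames (e.g., spectral
    # coefficients); real systems extract these from recorded speech.
    rng = np.random.default_rng(0)
    train_features = rng.normal(size=(1000, 40))
    train_words = rng.integers(0, 10, size=1000)  # 10 hypothetical word labels

    # Training phase: the network adjusts its parameters to fit the data.
    model = MLPClassifier(hidden_layer_sizes=(128,), max_iter=300)
    model.fit(train_features, train_words)

    # Recognition phase: new frames are scored against the learned patterns.
    new_utterance = rng.normal(size=(1, 40))
    print(model.predict(new_utterance))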
“GPUs significantly accelerate the training of our neural networks on very large amounts of data, allowing us to rapidly explore novel algorithms and training techniques,” said Vlad Sejnoha, chief technology officer at Nuance. “The resulting models improve accuracy across all of Nuance’s core technologies in healthcare, enterprise and mobile-consumer markets.”
NVIDIA will be exhibiting at the 2013 International Supercomputing Conference (ISC) in Leipzig, Germany, this week, June 16-20, at booth #220.