The NVIDIA Blog
A Dive Into Deep Learning, and How GTC Dives Deeper Still

Posted: 19 Feb 2015 11:25 AM PST

Cheap PCs can generate lush virtual worlds. Supercomputers can simulate the formation of galaxies. Even the phone in your hand is more capable than the world's most powerful computers of just a few decades ago. But until recently, ask a computer to identify a bird in a picture, a task a toddler can do, and even the most advanced systems would stumble.

No longer. New neural network algorithms, access to vast troves of data and powerful GPUs have converged. The result is a revolution called "deep learning."

Great Deep Forward

The potential is vast. Researchers use deep learning to spot cancerous cells. To categorize corals in endangered reefs. To map the connections between neurons in the brain.

That's why deep learning is the focus of this year's GPU Technology Conference. Our annual conference showcases how GPU technology is transforming science and industry. And no technology is as transformative right now as this.

"Deep learning technology is getting really good, and it's happened very fast," says Jonathan Cohen, an engineering director at NVIDIA. "Problems people assumed weren't ever going to be solved, or wouldn't be solved anytime soon, are being solved every day."

Deep learning refers to algorithms, step-by-step data-crunching recipes, for teaching machines to see patterns. That gives computers uncanny capabilities, such as the ability to recognize speech and translate it to another language on the fly.

Better Than Human

This is a major shift. Just a few years ago, computers struggled with tasks that, to people, are simple. One key benchmark is the annual ImageNet Large Scale Visual Recognition Challenge. To compete, teams build systems that assign one of a thousand possible descriptive labels to each image in a set of 100,000 images. In 2011, participants misidentified objects about a quarter of the time.
A year later, a GPU-equipped team from the University of Toronto led by Geoffrey Hinton halved that error rate. And recently, Microsoft Research brought the error rate down to just under 5 percent. That's better than humans can do.

So how does deep learning work? It starts by "training" an artificial neural network. That involves feeding powerful computers many examples of unstructured data, like images, video and speech.

Why Is It Called "Deep" Learning?

With deep learning, a neural network learns many levels of abstraction, ranging from simple concepts to complex ones. This is what puts the "deep" in deep learning. Each layer categorizes some kind of information, refines it and passes it along to the next. Deep learning lets a machine use this process to build a hierarchical representation.

So, the first layer might look for simple edges (computer vision researchers call these detectors "Gabor filters"). The next might look for collections of edges that form simple shapes, like rectangles or circles. The third might identify features like eyes and noses. After five or six layers, the neural network can put these features together. The result: a machine that can recognize faces.

GPUs are ideal for this, speeding a process that could otherwise take a year or more to just weeks or days. That's because GPUs perform many calculations at once, or in parallel. And once a system is "trained," scientists and researchers can use GPUs to put that learning to work.

That work involves tasks once thought impossible. Speech recognition is one application. So is real-time voice translation from one language to another. Other researchers are building systems that analyze the sentiment in social media conversations.

We're just scratching the surface. That's why researchers at top universities worldwide are rushing to put deep learning to work. So are Facebook, Microsoft, Twitter and Yahoo, and a host of startups. And many of these pioneers attend GTC to share their latest advances.
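The layer-by-layer idea described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not any researcher's actual model: a tiny feed-forward network in plain Python, where each layer transforms the previous layer's output into a higher-level representation, the way edges become shapes and shapes become facial features.

```python
import random

random.seed(0)

def dense(inputs, weights, biases):
    """One fully connected layer: a weighted sum plus bias per unit."""
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

def relu(values):
    """Non-linearity; without it, stacked layers collapse into one."""
    return [max(0.0, v) for v in values]

def make_layer(n_in, n_out):
    """Random untrained weights; training would adjust these."""
    weights = [[random.uniform(-1, 1) for _ in range(n_in)]
               for _ in range(n_out)]
    return weights, [0.0] * n_out

# A toy three-layer network. Each layer re-describes the previous
# layer's output at a higher level of abstraction (edges -> shapes
# -> parts), which is what puts the "deep" in deep learning.
layers = [make_layer(4, 8), make_layer(8, 6), make_layer(6, 2)]

activations = [0.5, -0.2, 0.1, 0.9]  # stand-in for raw pixel features
for weights, biases in layers:
    activations = relu(dense(activations, weights, biases))

print(len(activations))  # two output units, e.g. "face" vs. "not face"
```

Real networks differ mainly in scale: millions of learned weights instead of random ones, and the matrix arithmetic inside `dense` running in parallel on a GPU, which is exactly the workload GPUs accelerate.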
Deep Dive: Join Us at GTC to Learn More About Deep Learning

Because of the GPU's central role, GTC is one of the best places to learn more. Facebook will talk about how it's using deep learning for object recognition. Baidu, Microsoft and Twitter will be there, too. So will experts from New York University, U.C. Berkeley and the University of Montreal.

So join us at our GPU Technology Conference, in Silicon Valley, March 17-20, and learn how to put learning machines to work for you. To register for the conference, visit the deep learning GTC page.

Related reading: Deep Learning Portal at NVIDIA Developer Zone; Learn more about NVIDIA Solutions for Machine Learning

The post A Dive Into Deep Learning, and How GTC Dives Deeper Still appeared first on The Official NVIDIA Blog.
How GPUs Are Helping Pinpoint Toxins in Everyday Household Items

Posted: 19 Feb 2015 10:21 AM PST

Editor's note: This is one of a series of five posts profiling finalists for NVIDIA's 2015 Global Impact Award, which provides $150,000 to researchers using NVIDIA technology for groundbreaking work that addresses social, humanitarian and environmental problems.

Breakfast cereal. Baby shampoo. Cough syrup. Dishwasher detergent. This grocery list is also a list of products that can make you sick. Very sick, in fact, if they aren't thoroughly tested before hitting supermarket shelves. Some contain chemical compounds people are exposed to every day that could be toxic.

Toxins can come from just about anywhere: environmental pollution, ingredients in cosmetics or cleaning products, medicines, food additives and pesticides, to name just a few sources. Before a box is picked off a shelf and dropped into a shopping cart, months of testing go into the contents listed on the label.

With new products reaching the market every day, drug makers, food manufacturers and retailers can no longer rely on laborious manual tests to assess data on thousands of chemicals. They rely on researchers using GPUs to help tell friend from foe.

Big Data, With Computational Power to Match

The Institute of Bioinformatics at the Johannes Kepler University in Linz, Austria, is the first organization to successfully apply a deep network approach to toxicity prediction. This placed it among five finalists for NVIDIA's 2015 Global Impact Award, an annual grant of $150,000 given to researchers using NVIDIA technology for groundbreaking work that addresses social, humanitarian and environmental problems.

Researchers found the high computational costs of this approach can only be tackled with GPUs. "High-throughput biotechnology gave us the data, big data, but how do you crack that data? Now, we have the hardware to process the data," said Dr. Sepp Hochreiter, who heads the institute.
"We can use compute GPUs and standard graphics cards to open up the neural networks, and we have access to vast computational power."

"It's infeasible for government agencies or big pharmaceutical companies to test for chemical toxicity and undesired side effects using biological methods," he said. That would mean long test periods and live test subjects. "Using computational models makes it much easier," Hochreiter said.

Tesla GPUs Crunch Deep Network Numbers

Deep networks are a type of artificial neural network characterized by a large number of layers and hidden coding units. To tackle the deep networks on toxicity prediction data sets, the institute uses four NVIDIA Tesla K40 GPUs, part of the Tesla accelerated computing platform of GPU accelerators and enabling software. The GPUs are mounted in a Dell R920 server with four octa-core CPUs and 512GB of main memory. The programs for training the deep networks on the Tesla K40 GPUs were written in CUDA.

"At first we started out with small-scale computer graphics cards, the kind that you play games on. But then we got the Tesla K40 cards with huge memory, so we could store the data on the card itself," said Thomas Unterthiner, who is working on the research at the institute while studying for his Ph.D. "The neural network runs on the GPU; the CPU hardly does anything. It's the GPU that does all the number crunching."

Now, using powerful computational models, it's becoming faster and easier to predict from a substance's chemical structure whether it is likely to disrupt certain biological pathways, and so may be toxic. This helps scientists better determine which chemicals need further testing.

Among the institute's achievements is a high accuracy rate in toxicity prediction, which this January helped it win the Tox21 Data Challenge, organized by a group that included the U.S. National Institute of Environmental Health Sciences, U.S. Environmental Protection Agency and U.S. Food and Drug Administration.
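The core task, predicting a toxicity label from a compound's chemical structure, can be sketched with a deliberately simplified stand-in model. The data, feature encoding and single-layer classifier below are all hypothetical illustrations; the institute's actual models are multi-layer deep networks trained in CUDA on far larger Tox21 data sets.

```python
import math
import random

random.seed(1)

def sigmoid(z):
    """Squash a score into a 0-to-1 toxicity probability."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical training data: each compound is a binary "fingerprint"
# marking which chemical substructures it contains; the label records
# whether it activated a toxicity assay (1) or not (0).
compounds = [([1, 0, 1, 0], 1), ([1, 1, 1, 0], 1),
             ([0, 1, 0, 1], 0), ([0, 0, 0, 1], 0)]

weights = [0.0] * 4
bias = 0.0
lr = 0.5  # learning rate

# Logistic regression trained by stochastic gradient descent; a deep
# network would stack many such layers with non-linearities between.
for _ in range(200):
    for features, label in compounds:
        pred = sigmoid(sum(w * x for w, x in zip(weights, features)) + bias)
        err = pred - label  # gradient of the cross-entropy loss
        weights = [w - lr * err * x for w, x in zip(weights, features)]
        bias -= lr * err

# Score an unseen compound sharing substructures with the toxic ones.
probe = [1, 0, 1, 0]
score = sigmoid(sum(w * x for w, x in zip(weights, probe)) + bias)
print(score > 0.5)
```

Because the update loop is just repeated multiply-accumulate arithmetic over every weight and every compound, it parallelizes naturally, which is why scaling this idea up to thousands of compounds and features makes GPUs the practical choice.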
For more information, visit NVIDIA's Global Impact Award page.

The post How GPUs Are Helping Pinpoint Toxins in Everyday Household Items appeared first on The Official NVIDIA Blog.