How Faster GPUs are Accelerating Advances in Virtual Reality


The NVIDIA Blog


Posted: 18 Mar 2015 09:41 AM PDT
Outsmarting charging dinosaurs, floating in space and slaying dragons can all be done safely in virtual reality. With the right gear, you can be fully immersed in the experience – really there, wherever there is – unless wires tether you to a computer.
Facebook's acquisition last year of Oculus VR, maker of the Oculus Rift virtual reality headset, "changed everything," said  Tony Parisi, founder of Third Eye, a San Francisco startup developing publishing software for the web, mobile and virtual reality systems.
"We've reached the promised land, or are at least getting close," he said. "We're seeing an explosion in hardware development and the Oculus acquisition released a burst of investment and funding."
While Oculus hasn't yet released a retail product, it's shipped two developer kits.
The headset "provides the most immersive VR experience, and Oculus Rift is the gold standard," Parisi said. Oculus Rift developer kits sell for about $350, and the company may ship its retail product toward the end of 2015, or early in 2016, he said.
Competitors, including Google, have followed with their own versions of VR gear. Google's Cardboard puts virtual reality on your smartphone supported inside a folded cardboard viewer.
NVIDIA has its own play in the VR game, offering "crazy fast GPUs" including the GeForce GTX 980 and GTX 970, Parisi said. The Maxwell architecture in the 980 and 970 includes the integration of VR Direct, making NVIDIA the first hardware company to put virtual reality software into its GPUs.
"The first attempt at a comprehensive GPU approach to virtual reality is with NVIDIA VR Direct," Parisi said. "NVIDIA's 970 kicks up what you can do."
NVIDIA's GPUs have gone some way toward solving what VR experts call presence – the challenge of rendering a scene that responds to a user turning their head at 75 frames per second (FPS). Higher latency breaks the VR illusion and puts users at risk of motion sickness.
Desktop and mobile gear "can't hit this, and even classic targeting of 60 FPS makes rendering hit a bottleneck," Parisi said. "With software techniques now, you can push as much as you can to GPUs."
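Parisi's point about frame rates is easy to quantify: raising the refresh target shrinks the time the GPU has to finish each frame. A minimal sketch (the 90 FPS entry is a common headset target we've added for comparison; the function name is ours):

```python
# Per-frame render budget at common VR refresh targets.
# Real pipelines also lose time to tracking, compositing and scan-out,
# so the usable budget is even tighter than these figures.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given rate."""
    return 1000.0 / fps

for fps in (60, 75, 90):
    print(f"{fps} FPS -> {frame_budget_ms(fps):.1f} ms per frame")
```

At 75 FPS the renderer has roughly 13.3 ms per frame, about 3 ms less than at the classic 60 FPS target – which is why Parisi describes rendering as the bottleneck.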
The post VR Pioneer Parisi on How Faster GPUs are Accelerating Advances in Virtual Reality appeared first on The Official NVIDIA Blog.
Posted: 18 Mar 2015 07:00 AM PDT
OTOY Inc., a Los Angeles-based cloud graphics specialist, is doubling down on its bet on virtual reality. It's a move that could help content creators focus on hyper-realistic images and move past the constraints of traditional CGI and filmed media. But it won't cost customers a dime.
That's because OctaneVR, OTOY's new tool for cinematic virtual reality and augmented reality, will be released free on Windows, OS X and Linux from April through July.
It allows artists on any budget to experiment and create hyper-realistic, movie-like experiences.
"This is like going to color from black and white in movies, or sound after silent – that's how big it is," said Jules Urbach, chief executive officer and founder of OTOY, after the announcement at NVIDIA's GTC.
"It's an experiment; it's a bet on VR to push awareness," he said. "We want it free, even for commercial use and if it works, we'll keep it free."
OTOY also announced upgrades for other tools it offers, including GPU-powered OctaneRender 3 and Octane Render Cloud services. The enhancements include rendering that "opens the door" to snow, mist and gas, and more support for smoke and fire, Urbach said.
OctaneRender 3 transforms the production-ready GPU renderer with more power and introduces features not previously seen in production rendering, including volumetric light-field primitives, which improve how light looks and reflects.
While working on updates for the rendering aspects, Urbach traveled to Barcelona to demo the technology at BLR VFX's studio. He asked BLR VFX's director J.J. Palermo to test the rendering abilities.
"It's like telling Yoda 900 years into Star Wars to switch light sabers," Urbach said.
Deploying two NVIDIA GTX TITAN GPUs, Urbach was able to show Palermo that instead of taking three days to render the kind of scene you'd see in BLR VFX's movie short "Keloid," "it could be switched to a much, much shorter time," Urbach said.
The post OTOY Doubles Down on Its Virtual Reality Bet for Production-Ready Renders appeared first on The Official NVIDIA Blog.
Posted: 17 Mar 2015 08:34 PM PDT
Some at this week’s GPU Technology Conference dream of building self-driving cars. Others want to build machines that can teach themselves to drive.
Bingcai Zhang, a technical fellow with defense contractor BAE Systems, just wishes he could have done more to help his brother, who still works on the 20-acre rice farm in China's Hubei province where Zhang grew up.
Zhang helped build Socet GXP, widely used software that creates 3D maps from ultra-high-definition satellite images.
Now Zhang wants to pair this hyper-sophisticated satellite mapping software with cheap drones and GPUs to slash the cost of generating detailed 3D maps that can help the world's poorest use their resources more productively.
"There are hundreds of millions of people like my brother," says Zhang, speaking to more than 150 people who crowded into a small room to hear him speak Tuesday. "He has only six years of education, but it's not because he's dumb."

Bingcai Zhang says GPUs can help bring the benefits of sophisticated mapping technology to the farmers who need it most.

Zhang holds five degrees, including a PhD from the University of Wisconsin. His brother, however, came of age during the Cultural Revolution, the anti-intellectual ferment that crushed the dreams of some of China's best and brightest. Amidst the tumult, he only got six years of schooling.
Now Zhang's brother is dying of cancer. He couldn't afford to see a doctor who might have diagnosed his cancer in time to save his life. "I feel guilty about this," Zhang says, tearing up for a moment during a conversation after his presentation. "I wish I could have helped him more."
Zhang's goal now: to help the world’s 100 million poorest people – many of whom are farmers – make the most of their meager resources. To do that, Zhang has to slash the cost of creating detailed maps.
These maps can do more than just help farmers, however. Reducing the use of fertilizers and pesticides can limit runoff from farms, while helping produce more food for a world whose population continues to grow.
"With GPU processing we can tell farmers what they need to do, when they need to do it, and how they can do it scientifically, precisely, and economically," Zhang says. "It has to be cheap, otherwise my brother cannot afford it."
Drones can play a big role here because they can provide images much more cheaply than satellites. So can GPUs, which can turn still images gathered by drones into point clouds 10 times faster than CPUs alone.
Such maps can help farmers cut the cost of the fertilizer they use by more than 25%. They can also help farmers spot deadly pests that can wipe out crops – a serious challenge for those who rely on the naked eye to detect them.
Zhang doesn't have to look at the studies to know these things, of course. All he has to do is give his brother a call.
The post How One Scientist Wants to Use Drones and GPUs to Help Some of the World's Poorest Farmers appeared first on The Official NVIDIA Blog.
Posted: 17 Mar 2015 08:05 PM PDT
SpaceX, the other company led by Tesla founder and CEO Elon Musk, has an ambitious goal. It wants to put a human colony on Mars.
If that sounds hard, it's actually even harder to pull off. You don't just have to put people on Mars. You have to move the infrastructure needed to support them. Big rockets are needed. Really big ones.
During a packed session at GTC 2015, Adam Lichtl, SpaceX’s research director, noted that when the early pioneers crossed North America, they had to build shelter and find food. But, he quipped, “They had air.”
Plus, any expedition to Mars, which is much colder than Earth, would need facilities to generate power to provide heat and melt ice. And it turns out that designing a powerful enough rocket to accommodate the extra payload is a major conundrum.

It's called rocket science for a reason.
Going to Mars will be anything but easy. SpaceX explained how GPUs can help.

The complex physics computations necessary to complete a mere simulation have proven too much for even the world’s largest supercomputers. But by using GPUs, SpaceX has come up with a workaround. It's not just helping the company approximate computations of unfathomable scale. It may represent an advance that can be sold to the automotive industry.
“This is really a transformational technology that allows us to tackle problems that have never been tackled before in computational dynamics,” Lichtl told a packed conference room. “Without GPU acceleration, it takes months on thousands of cores to run even the simplest simulation. What the GPUs are enabling here is exponential acceleration.”
Yottabytes of data
Here’s the problem: The computations related to what is known as “turbulent non-premixed combustion” – which occurs when fuel and oxidizers are introduced in distinct streams – are dizzying in their scale.
The reactions yield data measured in yottabytes – that's a 1 followed by 24 zeros. And, as Stephen Jones, SpaceX lead software engineer, told GTC attendees, “There’s no machine in the world that has that kind of memory and can manage that amount of data.”
Yet, to create a useful simulation that provides the needed insight into factors such as the turbulence created by combustion, Jones said, “You want to do all the mathematics without decompressing the image.”
Analogy to MP3s
By running its code on GPUs, SpaceX has generated adaptive grids that enable it to extract the details it needs while also preserving the compression that’s needed to keep the data manageable. It’s like the technology behind MP3s, which achieves compression by discarding the frequencies the ear can’t perceive while retaining those that let us hear musical tones and chords.
Jones said the company is also using wavelets that allow it to focus its computing efforts. Essentially, GPUs are allowing SpaceX scientists to calculate what would otherwise be massive amounts of data by zeroing in on the critical parts of a grid depicting the combustion reaction, eliminating anomalies, and extrapolating a larger data set.
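SpaceX hasn't published the details of its wavelet scheme, but the general idea – transform the data, zero out small coefficients, and reconstruct from what's left – can be sketched with a one-level Haar wavelet transform. Everything below (function names, the thresholding rule) is a simplified illustration, not SpaceX's code:

```python
import numpy as np

def haar_step(signal):
    """One level of the Haar transform: pairwise averages + details."""
    pairs = np.asarray(signal, dtype=float).reshape(-1, 2)
    averages = pairs.mean(axis=1)               # coarse trend
    details = (pairs[:, 0] - pairs[:, 1]) / 2.0  # local variation
    return averages, details

def compress(signal, threshold):
    """Zero out detail coefficients too small to matter."""
    averages, details = haar_step(signal)
    details[np.abs(details) < threshold] = 0.0
    return averages, details

def reconstruct(averages, details):
    """Invert the transform from (possibly thresholded) coefficients."""
    out = np.empty(averages.size * 2)
    out[0::2] = averages + details
    out[1::2] = averages - details
    return out
```

Where the signal is locally smooth, the detail coefficients are near zero and can be dropped without losing the features that matter – which is how an adaptive grid can concentrate computation on the turbulent regions of a combustion simulation while compressing the rest.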
The post How SpaceX Is Using GPUs to Build a Better Mars Rocket appeared first on The Official NVIDIA Blog.
Posted: 17 Mar 2015 04:57 PM PDT
Filmmakers flock each year to the GPU Technology Conference to share tales of how GPUs are transforming their art.
The latest is from Kevin Margo, who talked at a breakout session about how the extensive renderings in his new short film, "Construct"—an ambitious tale of a future ruled by robots—were accelerated by the latest NVIDIA GPUs.
"I can't keep up with feeding content into GPUs to render," Margo said. "It's usually the other way around, and you’re waiting for renders to get back to you."
For "Construct," the added compute power has meant a smaller render farm, one that can fit into his L.A. apartment. It's also led to a smaller team. Despite that shrunken footprint, Margo, who works days as a visual effects supervisor at Blur Studios, hasn't sacrificed speed. Quite the opposite.
Gone are the days when filmmakers had to wait for renders. Now, as Margo and his crew work on the film, they're able to see CGI renderings of the live-action, motion-capture scenes they're shooting, in real time.

And, as Margo put it, "Faster rendering means faster development."
But it's not just speed that Margo gains. It's quality. Before, poor visuals and latency frustrated Margo and his crew when attempting to view real-time renderings during motion-capture shoots.
Margo craved being able to view photorealistic renderings that tracked with the actor's motions. In other words, as actors perform stunts in front of him, he wanted to see something as close to a finished product as possible—with robots battling in a CGI environment with full lighting and shading.
Mission accomplished. And the payoff has been a breakthrough in creative freedom.
"There's so much spontaneity that comes when all these elements are working at the same time," Margo said.
Now Margo and his crew have added yet another wrinkle by uploading footage to a cloud-based GPU cluster. They then stream back the rendered version in real time.
In essence, Margo is moving closer to what he calls a "hybrid live action experience" in which he’s able to replicate the live action workflow in a real-time CGI setting.
And that, Margo says, has made him a better filmmaker by providing him with a different perspective.
"You can discover things in a way that you couldn't before."

The post How a Render Farm Squeezed Into an Apartment Brought to Life a World Ruled by Robots appeared first on The Official NVIDIA Blog.
Posted: 17 Mar 2015 01:23 PM PDT
When you're preparing to launch a computing platform for self-driving cars, it helps to bring one of the leading voices in automotive innovation along for the ride.
That's just what NVIDIA CEO and co-founder Jen-Hsun Huang did in his opening keynote at the 2015 GPU Technology Conference. Just moments after announcing NVIDIA’s DRIVE PX self-driving car computer, he sat down for a quick fireside chat with Tesla CEO Elon Musk, who, as usual, put the pedal to the metal.
"We'll take autonomous cars for granted in quite a short time," Musk told a crowd of some 4,000 at the San Jose McEnery Convention Center. "I almost view it as a solved problem. We know what to do, and we'll be there in a few years."
Musk—who also is co-founder and CEO of SpaceX, has floated an idea for an ambitious system of pod-like transports, and recently compared the development of artificial intelligence to "summoning the demon"—made it clear that NVIDIA's advances in GPU technology will play a key role.
"What NVIDIA is doing with Tegra is really interesting and really important for self-driving in the future," said Musk.

Tesla CEO Elon Musk shared some bold views about the future of automobiles during a chat with NVIDIA CEO Jen-Hsun Huang.

Not that Musk thinks autonomous cars will come easily. It's a long road from developing the capabilities to implementing them. For example, he predicts that regulators will be loath to allow self-driving vehicles until they're presented with two or three years of compelling evidence about their safety compared to driver-controlled vehicles.
Musk also cautioned that we'll eventually have to embrace a seismic change in our relationship with cars as regulators struggle with how to transition from driver-controlled vehicles to those controlled by computers.
Bottom line: at some point, people may not be trusted behind the wheel anymore.
"In the distant future, [legislators] may outlaw driven cars because they're too dangerous," he said.

Tesla comes equipped with two Tegra processors to power its big screens.

He also noted that while getting autonomous cars to cooperate at 5-10 MPH is relatively simple, doing so at speeds between 10 and 50 MPH in urban and suburban settings—with pedestrians, cross-traffic and a host of other obstacles and distractions—will prove to be the really hard part of the work. Beyond that, he said, "Once you get above 50 MPH in a freeway environment, it gets easy again."
That's not all. Musk said that the transition to autonomous cars, even once they're ready for prime time, isn't going to happen overnight. He noted that there are 2 billion cars on roads today, and that the auto industry is cranking out another 100 million a year. At that rate, it would take 20 years to replace that fleet if autonomous cars were available tomorrow.
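Musk's 20-year figure follows directly from the fleet numbers he cited:

```python
# Back-of-the-envelope fleet turnover using the figures Musk cited.
fleet = 2_000_000_000       # cars on roads today
production = 100_000_000    # new cars built per year
years_to_replace = fleet / production
print(years_to_replace)     # 20.0
```

Even this is optimistic: it assumes every car built from day one is autonomous and the fleet doesn't grow in the meantime.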
"It's not going to all transition immediately," Musk said. “It'll take quite a while."
And then there's the matter of security. When Huang asked him for his outlook on that, Musk said the threat of hackers taking over cars becomes more significant only if the steering wheel and brake pedal disappear. Until then, drivers can override any potential problems with their hands and feet.

NVIDIA CEO Jen-Hsun Huang told Musk he’s one of the many Tesla Model S drivers who look forward to the over-the-air updates that keep Tesla’s cars fresh long after they’ve left the factory.

But all these concerns have done nothing to dampen Musk's view of self-driving cars as a near-future inevitability.
"I think it's going to become normal, like an elevator," he said. "There used to be elevator operators and then we came up with circuitry so the elevator knew to come to your floor. Cars will be like that."
And few companies are better positioned to lead the charge than Tesla. The company's groundbreaking high-performance electric cars are already known for innovations such as touch screens, digital dashboards and a more recent autopilot feature that helps the cars take control on the highway, park themselves, and even avoid potential accidents.

The post Tesla Motors CEO Elon Musk Says Future of Autonomous Cars is Nigh appeared first on The Official NVIDIA Blog.