CEO of Nvidia announces Workstation for Data Scientists at GTC 2019

Nvidia CEO Jensen Huang’s keynote presentation included the announcement of a Data Science Workstation powered by Nvidia GPUs and CUDA-X AI. It features dual Quadro RTX 8000 GPUs with 96 GB of memory and comes pre-installed with the CUDA-X accelerated data science stack – RAPIDS, TensorFlow, PyTorch, Caffe and the Anaconda Distribution – which Nvidia says makes data science workflows up to 10X faster.
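
To give a sense of what CUDA-X accelerated data science means in practice, here is a minimal sketch (not from the announcement) of a RAPIDS cuDF workflow; cuDF mirrors the pandas API while running on the GPU. The input file and its column names are hypothetical.

```python
# Minimal RAPIDS cuDF sketch: a pandas-style workflow that runs on the GPU.
# The CSV file and its columns ("customer_id", "amount") are hypothetical.
import cudf

# Read the CSV straight into GPU memory.
df = cudf.read_csv("transactions.csv")

# Filter and aggregate entirely on the GPU -- the same idiom as pandas.
big = df[df["amount"] > 100.0]
summary = big.groupby("customer_id")["amount"].sum()

print(summary.head())
```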

Here is the link to the full announcement:

Welcome to Omniverse – One Shared Unified World for Filmmakers

At the GTC 2019 keynote in San Jose this month, Nvidia CEO and founder Jensen Huang announced Omniverse, an open collaboration platform that simplifies studio workflows for real-time graphics.

This has been in the works for nearly 25 years; the company, which has long been trying to make it happen, worked closely with Pixar Animation Studios. With Omniverse, the production pipeline for a full-length animated film becomes much simpler and more efficient, as well as a big money saver for Hollywood animation studios.

Huang explained, “If you take a look at a major film, it costs something like $300M to $350M to produce, and the vast majority of that is post-production, which is otherwise known as rendering, and it might take something along the lines of a year to a year and a half.”

“If you could even save one month on what is otherwise a one-year-long project, the amount of money you could possibly save is in the millions, and so this is one of the reasons why this industry is in such a hurry to find ways to accelerate the rendering process and the production process.”

Making animated films has always been labor intensive throughout the complex production pipeline: concept, modeling, texturing, rigging, animation, lighting and, finally, the rendering itself.

“You have to render it to make it look totally perfect… and then once you create the character, you have to composite a whole bunch of other characters in the scene, and all the environments and all the special effects… are done in physics simulation. It is so, so complicated.” Huang continued, “…a few shots may be assigned to one studio, a few shots would be assigned to another studio… as a result, multiple studios in multiple sites are all working on a movie at the same time.”

Omniverse includes portals — two-way tunnels — that maintain live connections between industry-standard applications such as Autodesk Maya, Adobe Photoshop and Epic Games’ Unreal Engine.

This new open collaboration platform streamlines 2D and 3D product pipelines across industries. Omniverse is built around the latest industry standards for design collaboration.

It supports Pixar’s Universal Scene Description technology for exchanging information about modeling, shading, animation, lighting, visual effects and rendering across multiple applications. It also supports NVIDIA’s Material Definition Language, which allows artists to exchange information about surface materials across multiple tools.
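
As a rough illustration of what USD scene description looks like from code, here is a minimal sketch assuming Pixar’s open source pxr Python bindings are installed; the file name and prim paths are illustrative, and this is not Omniverse-specific code.

```python
# Minimal USD sketch using Pixar's open source pxr Python bindings.
from pxr import Usd, UsdGeom

# A stage is the composed scene that USD-aware tools exchange.
stage = Usd.Stage.CreateNew("hello.usda")

# Author a transform with a sphere beneath it -- geometry that Maya,
# Unreal or Omniverse could all read, reference and layer over.
UsdGeom.Xform.Define(stage, "/World")
sphere = UsdGeom.Sphere.Define(stage, "/World/Sphere")
sphere.GetRadiusAttr().Set(2.0)

# Write the scene description to disk.
stage.GetRootLayer().Save()
```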

In addition to Pixar, there are presently more than 200 animation film studios around the world, and with Omniverse they can now work together no matter which continent they are located on.

Astonishingly, graphic artists and designers will be able to view each other’s updates in real time, as though they were sitting next to each other in the same room, through NVIDIA’s Omniverse Viewer, which gives users a live look at work being done in a wide variety of tools. No matter where the filmmakers or studios are, even if they are working on remote laptops, Omniverse connects and unifies all the designers from anywhere, turning the work into one identifiable project instead of the hundreds of small projects that studios have been accustomed to over the years.

To top it off, the Omniverse Viewer delivers high-quality, photorealistic images in real time by taking advantage of rasterization as well as NVIDIA RTX RT Cores, CUDA cores and Tensor Core-enabled AI.

“With Omniverse, NVIDIA has created a product artists will be eager to put to work,” said Guido Quaroni, vice president of Software at Pixar. “When we open sourced USD, our goal was to make it easier to combine complex characters and environments into a single scene. Omniverse raises the bar, leveraging USD to enable scalable real-time collaborative workflows across some of the industry’s major software packages.”

With Omniverse, artists can see live updates made by other artists working in different applications. They can also see changes reflected in multiple tools at the same time.

As a result, artists now have the flexibility to use the best tool for the task at hand.

For example, an artist using Maya with a portal to Omniverse can collaborate with another artist using UE4, and both will see live updates of each other’s changes in their own application.

Epic Games, Adobe, Autodesk and Pixar’s other collaborators are all encouraged by the new platform, as it allows artists to work together regardless of the tool they use and without the need for time-consuming conversions.

“We love the idea of connecting tools from all vendors to enable collaborative workflows,” said Tim Sweeney, CEO of Epic Games. “We adopted USD and MDL to streamline workflows where assets originate from many different applications, so it’s great to see NVIDIA extend that ecosystem to enable live connections with simultaneous updates.”

“Omniverse is an exciting concept that will enable artists around the world to collaborate on digital content creation,” said Sebastien Deguy, vice president of 3D and Immersive at Adobe. “We look forward to seeing its development and evolution.”

“We’re thrilled to explore the potential of NVIDIA Omniverse to give our customers access to immersive, interactive and collaborative experiences across industries,” said Amy Bunszel, senior vice president of Design and Creation Products at Autodesk. “We share their vision of better world modeling and simulation. By combining USD and RTX, Omniverse promises to accelerate the future of design and make.”

Huang professed at the Keynote address, “I can’t wait to see the first major motion movie made by Pixar, rendered completely on RTX”.  With Omniverse, it should be even more productive.

“To infinity and beyond!”

Here is a video demonstrating Omniverse at the Keynote, with CEO Jensen Huang.

GDC 2018 – Reel Time Becomes Real Time

San Francisco, March 25, 2018. At this year’s GDC 2018 (Game Developers Conference), a couple of demos stood out that strongly reinforced my thoughts about the eventual merging of computer gaming and movies.

After just a few demos on the GDC expo floor, I was no longer able to discern the difference between computer game graphics and live action footage from movies. Realism for game developers has never looked so real.

On the first day of GDC, NVIDIA announced its latest RTX technology for real-time cinematic rendering. Nvidia’s RTX technology, alongside Microsoft’s new DirectX® Raytracing (DXR) API, has been an intensive work in progress for the last ten years. Its ray tracing renders lifelike lighting, reflections and shadows that make it nearly impossible to distinguish what is real and what is computer graphics. It brings real-time, cinematic-quality rendering to content creators and game developers.

Yes, you read right: “real time”.

This new technology is a milestone not just for game developers and filmmakers, but for any creator who needs to render an object as realistically as possible. The computer graphics of tomorrow will make the computer graphics of today look like a lifeless imitation.
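
For readers wondering what ray tracing actually computes, here is a toy sketch in plain Python/NumPy of the primitive every ray tracer is built on: intersecting a ray with a surface, then casting a shadow ray toward the light. This is only a conceptual illustration, not NVIDIA’s RTX or the DXR API.

```python
# Toy ray-tracing sketch: one camera ray, one sphere, one shadow ray.
# Conceptual only -- real ray tracers trace millions of rays per frame.
import numpy as np

def hit_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest sphere hit, or None."""
    oc = origin - center
    b = 2.0 * np.dot(oc, direction)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c          # direction is unit length, so a == 1
    if disc < 0.0:
        return None                 # ray misses the sphere
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

eye = np.array([0.0, 0.0, -5.0])    # camera position
ray = np.array([0.0, 0.0, 1.0])     # looking down +z
center = np.array([0.0, 0.0, 0.0])  # unit sphere at the origin

t = hit_sphere(eye, ray, center, 1.0)
if t is not None:
    hit = eye + t * ray
    # Shadow ray: can the hit point see the light, or is geometry in the way?
    light = np.array([5.0, 5.0, -5.0])
    to_light = (light - hit) / np.linalg.norm(light - hit)
    blocked = hit_sphere(hit + 1e-4 * to_light, to_light, center, 1.0)
    print(f"hit at {hit}, shadowed: {blocked is not None}")
```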

Nvidia showed Project Spotlight’s “Reflections” at its booth at GDC. It looks more like a teaser trailer for the next Star Wars feature film, but in reality it’s a real-time ray-tracing demo, which Epic’s team built along with ILMxLAB using NVIDIA’s DGX Station, equipped with four Tesla V100 GPUs, Epic’s Unreal Engine and NVIDIA’s RTX ray-tracing technology.

It definitely wowed the enthusiastic GDC crowd, as it was hard to believe that there were no actual actors in the stormtrooper costumes. It was all computer graphics, rendered in real time.

At GDC, game developers will have access to a ray-tracing denoiser module, part of the GameWorks SDK from Nvidia. Photo by Marcus Siu.

NVIDIA also announced that the GameWorks SDK will add a ray-tracing denoiser module, helping game developers take advantage of the new capabilities. The updated SDK, coming soon, includes support for ray-traced area light shadows, glossy reflections and ambient occlusion. This will save creators a huge amount of time.

Imagine how Screen Actors Guild members will feel once they realize that computer-generated characters may jeopardize their careers in the near future.

One actor who may not have problems finding work is motion capture performance artist Andy Serkis, known for playing “Gollum” in “The Lord of the Rings” trilogy and “Caesar” in the “Planet of the Apes” trilogy. Just a few sections away from the Nvidia booth, at the Unreal booth, I saw a computer-generated “digital” Andy Serkis acting out his lines as his alien creature character was rendered in real time.

Unreal indeed.

Just unreal – Andy Serkis’s character talks as Andy talks, in real time, at GDC 2018. Photo by Marcus Siu.

Maybe in the not-too-distant future, we can just get rid of the entire Screen Actors Guild, with the exception of performance capture and voice-over actors. Or perhaps one day we can simply clone the actors, so there wouldn’t be any need for them to come in and perform on a sound stage.

Imagine the Academy Awards: “…and the nominees for Best Clone Actor in a Supporting Role are…”

In addition to the visual progress coming from ray tracing, audio will be just as important for content creators. Consumers are continuously looking for a more immersive gaming experience, so many are reaching for THX-certified equipment for their PC gaming.

THX demonstrated its spatial audio platform, using the latest audio standard, MPEG-H, as well as UHD. They were using the game trailer for “Hellblade”, which, ironically, was one of the first showcases of live motion capture in Epic’s engine back in 2016. It was, in a way, ahead of its time. Coming from 2.1 THX-certified Logitech speakers with a subwoofer, it was quite sonically immersive. I felt that I was right in the middle of the soundscape.

There was also a demo of it using headphones, but I still preferred the speakers.

THX-certified Logitech speakers at GDC. Photo by Marcus Siu.

In addition, the platform utilizes personalized audio profiles using HRTFs (Head-Related Transfer Functions), which are optimized and customized for each listener based on the user’s unique hearing anatomy.
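
At its core, HRTF-based spatial audio renders a sound binaurally by convolving it with a measured impulse response per ear. Here is a minimal NumPy sketch of that idea, with random placeholder impulse responses standing in for a listener’s real, personalized HRTF data.

```python
# Minimal HRTF idea: binaural rendering is per-ear convolution with a
# head-related impulse response (HRIR). The random HRIRs below are
# placeholders for real measured, per-listener data.
import numpy as np

rate = 48_000
mono = np.sin(2 * np.pi * 440.0 * np.arange(rate) / rate)  # 1 s test tone

# Placeholder HRIRs for one source direction (left ear, right ear).
rng = np.random.default_rng(0)
hrir_left = rng.standard_normal(256) * 0.01
hrir_right = rng.standard_normal(256) * 0.01

# Convolve the dry signal with each ear's impulse response.
left = np.convolve(mono, hrir_left)
right = np.convolve(mono, hrir_right)

binaural = np.stack([left, right], axis=1)  # 2-channel output for headphones
```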

Long associated with Lucasfilm movie sound in theaters in the ’80s, THX seems to be staging a comeback into the public eye. In addition to its traditional certified home entertainment products, such as projectors, pre-amps, receivers and speakers, the company has been gaining momentum in the gaming world by introducing THX-certified laptops, headsets and satellite speakers.

The audience is listening…again.

Article and photos by Marcus Siu 

(originally published on mlsentertainment) 

Impressive VR demos at GTC 2016

GTC, May 2016 – Jen-Hsun Huang, the founder and CEO of Nvidia, one of the largest American manufacturers of graphics accelerator chips, announced at the GPU Technology Conference in California that VR is going to change the way we design and experience products. Take shopping for cars: it is like being in a virtual showroom where we can walk around our custom-designed car, open its doors and check out the interior.

Two amazing virtual reality demos took place at that conference – “Everest VR” and “Mars 2030”.

For the “Everest VR” demo, Nvidia partnered with Solfar, a Nordic VR games company, and RVX, a Nordic visual effects studio for the motion picture industry. RVX has worked on movies such as “Gravity”, which won the Oscar for Best Visual Effects. Using advanced stereo photogrammetry, a CGI (computer-generated imagery) model of Mount Everest was created pixel by pixel.

For “Mars 2030”, Nvidia worked with the scientists and engineers at NASA, along with Fusion VR, taking images from dozens of satellite flybys of Mars. They reconstructed 8 square kilometers of the planet’s surface. Even the rocks were hand-sculpted, with millions of them carefully placed based on the satellite images.

Steve Wozniak, a co-founder of Apple, was invited to experience the “Mars 2030” demo. As soon as he slipped on the headset, he was transported into a rover to drive around the planet.

Here are the key moments in VR history.

It started in 1939 at the World’s Fair in New York City, where the View-Master, a stereoscopic alternative to the panoramic postcard, was introduced. Some 30 years later, Ivan Sutherland came up with the first head-mounted display, called “The Sword of Damocles”. Another 25 years passed before the computer games company Sega introduced wrap-around VR glasses at CES in 1993. Two years later, Nintendo produced a gaming console named the Virtual Boy.

The real gold rush for virtual reality started in 2010.

That year, Google came out with a 360-degree version of Street View on Google Maps. In 2012, a small company called Oculus collected $2.4 million for the production of VR glasses. Two years later, Oculus was purchased by Mark Zuckerberg, the founder of Facebook, for $2 billion! Market research analysts from Deloitte, CCS Insight, Barclays and Digi-Capital have forecast that 24 million VR devices will be sold by 2018, and that revenue from virtual and augmented reality products and content will reach the $150 billion mark.

So far, virtual reality has been associated mostly with the gaming sector, because the hard-core gaming community is willing to spend large amounts of money on special-purpose hardware such as VR glasses and game consoles. But that is changing. VR is appearing in different sectors of business and entertainment.

Yes, it is known that VR may cause motion sickness for some viewers, and there are still a number of obstacles to overcome, but the technology is unstoppable now.