Welcome to Omniverse – One Shared Unified World for Filmmakers

At the GTC 2019 keynote in San Jose this month, NVIDIA founder and CEO Jensen Huang announced Omniverse, an open collaboration platform designed to simplify studio workflows for real-time graphics.

This has been in the works for years at the company, which throughout its nearly 25-year history has worked closely with Pixar Animation Studios. With Omniverse, the production pipeline for a full-length animated film becomes much simpler and more efficient, as well as a significant money saver for Hollywood animation studios.

Huang explained, “If you take a look at a major film, it costs something like $300M to $350M to produce, and the vast majority of that is post-production, which is otherwise known as rendering, and it might take something along the lines of a year to a year and a half.”

“If you could even save one month on what is otherwise a one-year-long project, the amount of money you could possibly save is in the millions, and so this is one of the reasons why this industry is in such a hurry to find ways to accelerate the rendering process and to accelerate the production process.”

Making animated films has always been labor intensive across the entire rendering pipeline: concept, modeling, texturing, rigging, animation, lighting and, finally, the rendering process itself.

“You have to render it to make it look totally perfect…and then once you create the character, you have to composite a whole bunch of other characters in the scene, and all the environments and all the special effects…are done in physics simulation; it is so, so complicated.” Huang continued, “…a few shots may be assigned to a studio, a few shots would be assigned to another studio… as a result, multiple studios in multiple sites are all working on a movie at the same time.”

Omniverse includes portals — two-way tunnels — that maintain live connections between industry-standard applications such as Autodesk Maya, Adobe Photoshop and Epic Games’ Unreal Engine.

This new open collaboration platform streamlines 2D and 3D product pipelines across industries. Omniverse is built around the latest industry standards for design collaboration.

It supports Pixar’s Universal Scene Description technology for exchanging information about modeling, shading, animation, lighting, visual effects and rendering across multiple applications. It also supports NVIDIA’s Material Definition Language, which allows artists to exchange information about surface materials across multiple tools.
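To give a concrete sense of what USD interchange looks like on disk, here is a minimal, hypothetical `.usda` layer written out with plain file I/O. The prim names are invented for illustration, and a real pipeline would author this through Pixar's pxr.Usd API, which is not assumed to be installed here.

```python
# A minimal USD ASCII (.usda) layer, written with plain file I/O.
# The "Hero"/"Body" prim names are invented for illustration; production
# pipelines would author layers through Pixar's pxr.Usd API instead.

MINIMAL_LAYER = """\
#usda 1.0
(
    doc = "Hypothetical layer: one transform holding one sphere"
)

def Xform "Hero"
{
    def Sphere "Body"
    {
        double radius = 2.0
    }
}
"""

with open("hero.usda", "w") as f:
    f.write(MINIMAL_LAYER)
```

Any USD-aware tool reading such a layer reconstructs the same scene graph, which is what lets applications exchange modeling and shading data without lossy conversions.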

In addition to Pixar, there are presently more than 200 animation studios around the world, and with Omniverse any of them can now work together, no matter which continent they are located on.

Astonishingly, graphic artists and designers will be able to view updates made in real time, as though they were sitting next to each other in the same room, through NVIDIA’s Omniverse Viewer, which gives users a live look at work being done in a wide variety of tools. No matter where the filmmakers or studios are, even if they are working on remote laptops, Omniverse connects and unifies all the designers from anywhere, turning the work into one identifiable project instead of the hundreds of small projects that studios have been accustomed to over the years.

To top it off, the Omniverse Viewer delivers the highest quality photorealistic images in real time by taking advantage of rasterization as well as support for NVIDIA RTX RT Cores, CUDA cores and Tensor Core-enabled AI.

“With Omniverse, NVIDIA has created a product artists will be eager to put to work,” said Guido Quaroni, vice president of Software at Pixar. “When we open sourced USD, our goal was to make it easier to combine complex characters and environments into a single scene. Omniverse raises the bar, leveraging USD to enable scalable real-time collaborative workflows across some of the industry’s major software packages.”

With Omniverse, artists can see live updates made by other artists working in different applications. They can also see changes reflected in multiple tools at the same time.

As a result, artists now have the flexibility to use the best tool for the task at hand.

For example, an artist using Maya with a portal to Omniverse can collaborate with another artist using UE4, and both will see live updates of each other’s changes in their own application.
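The portal mechanism described above can be pictured as a publish/subscribe scene shared by several editors. The following toy sketch illustrates that idea in Python; every class and method name here is invented, and Omniverse's real portals exchange USD data over the network rather than through in-process callbacks.

```python
# Toy publish/subscribe model of a shared scene with live updates.
# All names are invented for illustration; this is not Omniverse's API.
from typing import Callable, Dict, List

class SharedScene:
    """Single source of truth that notifies every connected editor."""

    def __init__(self) -> None:
        self._attrs: Dict[str, object] = {}
        self._subscribers: List[Callable[[str, object], None]] = []

    def connect(self, on_change: Callable[[str, object], None]) -> None:
        """Register a callback fired on every attribute change."""
        self._subscribers.append(on_change)

    def set(self, attr: str, value: object) -> None:
        """Apply an edit and broadcast it to all connected tools."""
        self._attrs[attr] = value
        for notify in self._subscribers:
            notify(attr, value)

# Two "applications" watching the same scene.
maya_view: Dict[str, object] = {}
unreal_view: Dict[str, object] = {}

scene = SharedScene()
scene.connect(maya_view.__setitem__)
scene.connect(unreal_view.__setitem__)

# An edit made from one tool shows up in both views immediately.
scene.set("robot.color", "red")
```

The design point is that no tool owns the scene: each one edits and observes the same shared state, which is why conversions between file formats become unnecessary.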

Whether it’s Epic Games, Adobe or Autodesk, or any other Pixar collaborator, they are all encouraged by the new platform, as it allows artists to collaborate regardless of the tool they use and without the need for time-consuming conversions.

“We love the idea of connecting tools from all vendors to enable collaborative workflows,” said Tim Sweeney, CEO of Epic Games. “We adopted USD and MDL to streamline workflows where assets originate from many different applications, so it’s great to see NVIDIA extend that ecosystem to enable live connections with simultaneous updates.”

“Omniverse is an exciting concept that will enable artists around the world to collaborate on digital content creation,” said Sebastien Deguy, vice president of 3D and Immersive at Adobe. “We look forward to seeing its development and evolution.”

“We’re thrilled to explore the potential of NVIDIA Omniverse to give our customers access to immersive, interactive and collaborative experiences across industries,” said Amy Bunszel, senior vice president of Design and Creation Products at Autodesk. “We share their vision of better world modeling and simulation. By combining USD and RTX, Omniverse promises to accelerate the future of design and make.”

Huang professed at the keynote address, “I can’t wait to see the first major motion picture made by Pixar, rendered completely on RTX.” With Omniverse, that production should be even more efficient.

“To infinity and beyond!”

Here is a video demonstrating Omniverse at the Keynote, with CEO Jensen Huang.

VR Games in the Esports League

A mere four years ago, at the Game Developers Conference (GDC 2014), VR gaming was going through yet another revival after several decades and had another chance to revolutionize the gaming industry. At the time, Oculus was just a small company funded through a Kickstarter campaign that seemed to have appeared out of nowhere before being acquired by Facebook.

Immediately after that, Sony PlayStation decided to jump on the VR bandwagon, joining Oculus Rift and HTC Vive. Many thought VR had the potential to completely disrupt and revolutionize the game industry and that everyone would abandon their game controllers in favor of VR headsets. In addition, many assumed Sony would take the PS4 platform into more of an online community of VR gamers, much as it had done with the success of its traditional online games. Sony and Xbox had dominated over the years by putting you in the same community as your friends no matter where you are in the world.

Instead, VR gaming has mainly been a one-player experience, shutting you into a virtual-reality world in isolation. However, social media and online communities are beginning to change all that, especially the devoted community around Survios’ “Sprint Vector,” which recently made its Esports debut in the VR League at Oculus Connect 5 this year.

Andrew Abedian, senior game designer at VR game company Survios, was recently at the 2018 XRDC conference in San Francisco and talked about how “Sprint Vector” evolved from an early speed-running prototype into a pioneering multiplayer VR title in Esports, thanks mainly to the help of its online community. “Humanity loves sports…there’s a great drive towards watching it because it’s so physical and athletic and stamina based…there’s a mental game with teamwork and strategy. When you see a player going down the field you get a sense of what they are going through and the heart they are putting into it.” Abedian continued, “On the other side of the coin you have Esports…which is really a mental game…highly dexterity based and drives with the mind. VR Esports are the middle ground…Sprint Vector is built around those concepts.” He explained that “Sprint Vector” bridges the gap between real sports, which are very physical, and Esports, which are very mental.

In “Sprint Vector”, players achieve speed and mobility by pumping their arms like a runner and turning their heads to steer.  Other controls allow them to jump and climb, drift and fly at tremendous speeds.  It is a very physical game and to be a contender at a high level, contestants really do need to be athletic and fit.

Replacing conventional teleport or joystick locomotion, Survios developed its proprietary “Fluid Locomotion” system for “Sprint Vector,” which nearly eliminates nausea. “90 to 95 percent of people have reduced nausea or no nausea,” explains Abedian. That makes it much easier for players to stay in the game longer.

For spectators of traditional sports, this makes Esports much more credible than watching couch-bound players in close-up shots showing off their incredible finger dexterity. These players are sweating it out with their arms, twisting and turning. Physical movement creates excitement within a real competition, and it’s much more exciting to watch players getting a real workout.

Prior to its official release this year, “Sprint Vector” gained much exposure at GDC 2017, where its booths gathered crowds and its events became spectacles in themselves, with onlookers cheering from the guard rails above. There were also tournaments, such as the Alienware VR Cup at CES 2018, along with leaderboards and prizes sponsored by partners Alienware, Nvidia and Intel. After its release in February, despite all that exposure, interest wore thin within a few months and it became nearly impossible to find players online to compete with. Many were losing interest in the game, and player numbers were dropping rapidly.

That all changed when Survios revived the small but passionate community by organizing Happy Hours on Saturday nights, along with speed-running tournaments, facilitating game rooms and ensuring matches for those who were interested. In addition, they offered prizes for online competition. Suddenly, the community started getting bigger and bigger.

Even ESL took note of the devoted community and eventually chose “Sprint Vector” for this year’s Oculus Connect 5 show and the VR League, marking its major Esports debut. $12,000 was awarded in prizes for the “Sprint Vector” competition. “Not bad for a game that originally had no intention of being an Esport,” Abedian noted. It was only at last year’s Oculus Connect 4 conference that Mark Zuckerberg announced his lofty ambition for VR: “We’re setting a goal: we want to get a billion people in virtual reality.”

According to CCS Insight, approximately 22 million VR headsets were sold this year, and the number is expected to grow more than fivefold to 121 million next year. If that pace holds, Zuckerberg may yet hit his goal. Maybe in the future we can have a VR League marathon involving thousands of participants.



Demonstrations at Automated Vehicles Symposium 2018

Here are video sneak peeks from the expo floor at the Automated Vehicles Symposium 2018 in San Francisco. The expo featured segments from Continental presenting its Intelligent Intersection Technology, Velodyne Lidar sharing its advances in sensor technology that provides real-time 3D data, and, from the educational side, the Texas A&M Transportation Institute and the Virginia Tech Transportation Institute focusing on the infusion of technology into transportation and the role of research agencies in the process.


Doug Jones receives Lifetime Achievement Award at CIFF

After pulling an all-nighter on the set of the popular TV series “Star Trek: Discovery” in his role as Commander Saru, actor Doug Jones was scheduled to fly the next morning from Toronto to San Francisco to be honored onstage with a Lifetime Achievement Award at the California Independent Film Festival.

Not bad for a guy who has been unrecognizable to most of the public, since he is usually buried under layers and layers of prosthetic makeup in the majority of his roles, yet who has become a Hollywood icon over his thirty-year career by playing some of the most unforgettable characters ever created on screen. The latest is “Amphibian Man,” the sea creature who falls in love with a mute woman, played by Sally Hawkins, in last year’s Best Picture winner, “The Shape of Water.”

As soon as he arrived at the theater, he apologized sincerely to the cheering crowd for being late. He certainly didn’t disappoint his legion of fans, including the many Trekkies. Some actors in his position might have taken a different approach, such as cancelling altogether, but Jones seemed to care more about his fans than about himself, and it definitely showed.


Interviewed by CAIFF founder and president Derek Zemrak, Jones recalled growing up as “a very tall, goofy, skinny kid” in a small town in Indiana. He was made fun of and picked on by the other kids over the years. To survive all that, he became the class clown, inspired by the likes of Dick Van Dyke, Jerry Lewis, Danny Kaye, Carol Burnett and Mary Tyler Moore, just to name a few. Those were “his” people.

“I’m inspired by that TV, so I should be on it one day,” Jones reminisced. “All my friends are on there, so I want to join them, right?”

While attending Ball State University in Indiana, in addition to being the team mascot, Charlie Cardinal (in the red bird suit), he was learning mime as part of the troupe “Mime Over Matter.” His first job after graduating was as a street mime and contortionist at Kings Island, an amusement park near Cincinnati.

Jones explained that his “excuse” for moving to Los Angeles was a nine-month bank management training position, from which he was fired after eight months.

“As they should have!…Banking?  You don’t want to trust me with your money.” Jones exclaims.

Doug Jones and Sally Hawkins in the film THE SHAPE OF WATER. Photo Courtesy of Fox Searchlight Pictures. © 2017 Twentieth Century Fox Film Corporation All Rights Reserved


Then he got involved in TV commercial acting classes in Los Angeles, and within six months he was a dancing mummy in a Southwest Airlines spot. Over the following three years, he did 27 TV spots for McDonald’s “Mac Tonight” campaign, which allowed him to buy his first house.

With his contortionist skills and his “flexible” reputation as “a tall, skinny goofy guy, who moves well, wears a lot of crud on his head and doesn’t complain about it,” his attitude and affable personality helped Jones make a successful transition from TV commercials to the film industry, more specifically the creature-effects makeup industry.

Jones reminisced about being referred by a well-known stunt friend, who was well aware Jones could put his legs behind his head, which eventually led to his first gig on a major motion picture.

“I got a call from the stunt department,” Jones explains. “‘We want you to come and meet us.’” After Jones demonstrated his skills, the stunt coordinator said, “Hang on a second, I’d like you to meet somebody,” and came back with Tim “freaking” Burton. “So, after I wet myself…” Jones demonstrated his skills again, and he got the part in “Batman Returns” without even having any head shots.

In addition to playing the Thin Clown in “Batman Returns,” other highlights include Billy Butcherson in the perennial Halloween favorite “Hocus Pocus,” starring Bette Midler, and the title role in “Fantastic Four: Rise of the Silver Surfer.” However, he has become Guillermo del Toro’s go-to man, no longer even needing to read the script of a del Toro film before accepting a role. He has appeared in seven of del Toro’s films, including as Abe Sapien in the Hellboy series, the Faun and the Pale Man in “Pan’s Labyrinth,” and Amphibian Man in the Oscar-winning Best Picture, “The Shape of Water.”

The original costume worn by Doug Jones as Orlok from the forthcoming movie, Nosferatu, scheduled to be released in 2019. Photo taken in the lobby of the Orinda Theatre. Photo by Marcus Siu.


After having played so many monsters and creatures, Jones had just one dream role left…

“If you asked me ten years ago, ‘What’s the one role you haven’t played yet that you really would love to?’, my answer was, ‘I haven’t played a proper vampire yet…I want fangs…I want to be a classic vampire…I’m too old and gross to play a young sparkly vampire.’” Jones continued, “Who’s old and gross? Nosferatu!”

As chance would have it, Jones got a call from director David Lee Fisher, who in 2005 had remade one of the first German horror silent films, “The Cabinet of Dr. Caligari,” shot nearly a hundred years ago, updating the silent film with sound and dialogue. Fisher called to see if Jones wanted to play the lead role in his upcoming project, “Nosferatu.” Little did Fisher know that playing Count Orlok in “Nosferatu” was Jones’ absolute dream role.

“What’s really special…is the film is all shot on green screen, and the original footage shot nearly 100 years ago will all be in the background,” added Zemrak, who also came on board as one of the film’s producers.

Jones also added, “not only did I get to play that character that I longed to play, but I got to play in the environment that Max Schreck got to play in.”

Something tells me with the enthusiasm and amazing talent of Doug Jones, this new and upcoming “Nosferatu” won’t suck.

Article and photos by Marcus Siu

(originally published on mlsentertainment.wordpress.com) 

CES 2018 – 5G and AI stimulate the world of new technologies

CES 2018 will be remembered for the lights going out at the Las Vegas Convention Center, a building filled with thousands of televisions and electronics. Those inside found their way to the exits using the lights of their cell phones, and the blackout forced the convention center to close for over two hours. Although this unfortunate incident made international headlines, this year’s CES was still an exciting and stimulating event for the global tech community.

CES is a platform for interacting with a new generation of technology! This year, over 170,000 attendees from 150 countries came to Las Vegas, 3,900 exhibitors presented their products and services, and 7,000 journalists and bloggers reported to a broad audience; the convention center wasn’t quite able to accommodate them all.

Steve Koenig, analyst and director at the Consumer Technology Association (CTA), announced the main trends in the global tech world at a press meeting held two days before the doors officially opened. His focus was on 5G and AI as the ingredient technologies for 2018 and beyond. 5G was the theme of my article in the last edition of Property Journal. It is important in an era flooded with automated data.

By 2020, the average internet user will create around 1.5 GB of traffic per day, while a smart hospital will produce 3,000 GB and an autonomous vehicle 4,000 GB per day. Today every big city has 4G, and everyone who watches content on a mobile device is familiar with the buffering wheel icon, which means the system needs more time to download content. For the consumer, it means time lost waiting. When 5G is implemented, that will no longer be the case.

How fast is 5G? Let’s use an example. How long would it take to download the two-hour-long “Guardians of the Galaxy” movie? In 3G (2001) it took 26 hours, in 4G (2009) only 6 minutes, in 5G (2020) it will be 3.6 seconds! There is no doubt that 5G is going to revolutionize everything including telecommunication, automotive, healthcare, social media, real-time GPS, mobile payments, video streaming (4K and 8K) and access to the Internet at the fingertips anywhere, anytime.
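The arithmetic behind those figures is simple transfer time: bits to move divided by link rate. The sketch below reproduces the quoted numbers under assumed values (a roughly 4.5 GB movie file and nominal link rates of 384 kbps for 3G, 100 Mbps for 4G and 10 Gbps for 5G); these are back-calculated assumptions, not figures from the presentation.

```python
# Idealized download time: bits to transfer divided by the link rate.
# The movie size and the per-generation rates below are assumptions
# chosen to reproduce the article's 26 h / 6 min / 3.6 s comparison.

MOVIE_BYTES = 4.5e9  # assumed size of a two-hour HD movie

NOMINAL_RATES_BPS = {
    "3G": 384e3,   # kilobit-class mobile link
    "4G": 100e6,   # ~100 Mbps
    "5G": 10e9,    # ~10 Gbps peak
}

def download_seconds(size_bytes: float, rate_bps: float) -> float:
    """Transfer time in seconds over an ideal, uncontended link."""
    return size_bytes * 8 / rate_bps

for gen, rate in NOMINAL_RATES_BPS.items():
    print(f"{gen}: {download_seconds(MOVIE_BYTES, rate):,.1f} s")
```

Real-world throughput falls well below peak rates, so the comparison is about orders of magnitude rather than exact times.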

The second ingredient that will stimulate global technology is AI. The abbreviation has traditionally meant “Artificial Intelligence,” but some industry experts believe that term is too closely linked to popular culture and carries negative connotations. That is why AI is now sometimes described as “Augmented Intelligence,” to help people understand that AI simply improves products and services and does not replace humans. Unfortunately, both terms share the same acronym: AI.

IBM, which invested heavily in its artificial intelligence system Watson, suggested using the term “Intelligence Augmentation” (IA). Quick note: the name refers not to Sherlock Holmes’s assistant, Dr. Watson, but to Thomas Watson, the charismatic CEO of IBM from 1914 to 1956.

Many companies invest in AI because they believe Augmented Intelligence is the key to learning what humans like and dislike, and to suggesting products that perfectly match their tastes. Just recently, Google joined the race for more customers by using AI. For the first time, Google had a standalone booth in the middle of the convention center at CES, and outside, many signs and banners on the streets of Las Vegas read “Hey, Google.”



AR is coming into the mainstream

AR (Augmented Reality) is a technology that layers computer-generated enhancements on top of existing reality, enriching it with meaningful information and making it interactive. AR is typically delivered through apps on mobile devices, in a way that lets the digital and real components enhance one another while still being easily told apart.

In 2010, Ori Inbar, founder and CEO of Augmented World Expo (AWE), predicted that within ten years (by 2020) everyone would be using AR to experience reality in a more meaningful way. There is a ramp and a learning curve, and the technology has to gain a foothold, but once it does the applications advance and the product becomes standard. AR is in that moment today; as Inbar told me in a 2015 interview, it is still incubating and honing in on the right experience for the market, but the monetization plan is close to being defined. Today, Inbar’s words are being confirmed by others: Digi-Capital just published an article stating that mobile AR will top a billion users and become a $60 billion industry by 2021.

AR is often confused with VR.

VR is a fully computer-generated image, while AR is a layer on top of real reality. But there are more differences. Inbar continued that there is a use-model difference: since VR is a closed screen, it is a download-based product. All of the content is created and scripted for consumption, such as films, games or documentary-style information. AR, on the other hand, is a see-through, overlay type of experience. The content is typically streamed to the unit in real time based on the situation and feedback from the user. This creates a dynamic content environment, and it is also much more familiar to the user.

There are two types of augmented reality. The first is vision-based AR: the real environment is scanned with a phone or tablet camera, and content is augmented onto what the device recognizes in that data. The second is location-based AR: traditional GPS gives us only minimal information about a trip from point A to point B, while an AR application can enrich it with much more, including distances and measurements.
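As a small illustration of the location-based case, an AR overlay that labels a point of interest needs the distance from the user's GPS fix to that point. The haversine formula below is the standard great-circle calculation for this; the function name and sample coordinates are my own.

```python
# Great-circle distance between two GPS fixes via the haversine formula,
# the kind of calculation a location-based AR overlay would run to label
# a point of interest with its distance from the user.
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Distance in kilometres between two (latitude, longitude) points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# One degree of latitude spans roughly 111 km.
print(round(haversine_km(0.0, 0.0, 1.0, 0.0), 1))
```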

AR extends our vision. AR glasses optimize production when the technician wearing them sees safety warnings or manual instructions overlaid on the work. They also optimize performance for runners and cyclists, giving them metrics such as speed, distance, ascent/descent, cadence (steps per minute) or heart rate. In healthcare, AR glasses help medical professionals place IVs with precision.

The AR and VR worlds are diverse and competitive, but standards are starting to emerge.