Nvidia Unveils Tegra X1 CPU and Autonomous Driving Tech

Posted at 12:24 am on January 6, 2015

To kick off the Consumer Electronics Show in Las Vegas, Nvidia CEO Jen-Hsun Huang took to the stage to unveil the company’s newest offerings for mobile computing and in-car systems. Nvidia’s newest processor, the Tegra X1, leverages the Maxwell GPU technology released last year to produce a new “mobile super chip.” The new processor was also needed to power Nvidia’s two other announcements of the show, the Drive CX and Drive PX systems, both of which look toward the future of the automotive industry.

The new Tegra X1 surpasses the Tegra K1 announced last year: the 64-bit mobile processor delivers 1 teraflop of performance using its new 16-bit floating point (FP16) pipeline. The chip, previously known by the codename Erista, holds 256 CUDA cores and eight CPU cores in a four-plus-four configuration. The processor is said to be the “world’s first mobile chip” to process 4K, 10-bit video at 60 frames per second in the H.265 and VP9 standards.
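
For a rough sense of where that 1-teraflop figure comes from, the arithmetic works out if the 256 CUDA cores handle paired FP16 operations at roughly a 1 GHz clock. The clock speed below is an assumption for illustration, not a figure Nvidia gave during the keynote.

```python
# Back-of-envelope FP16 throughput estimate for the Tegra X1 GPU.
# Assumptions (not quoted on stage): ~1 GHz GPU clock, one fused multiply-add
# (2 FLOPs) per core per cycle, and two FP16 operations packed per FP32 lane.
cuda_cores = 256
gpu_clock_hz = 1.0e9      # assumed clock speed
flops_per_fma = 2         # a fused multiply-add counts as two operations
fp16_pack_factor = 2      # two half-precision values per 32-bit lane

fp16_tflops = cuda_cores * gpu_clock_hz * flops_per_fma * fp16_pack_factor / 1e12
print(f"Estimated FP16 throughput: {fp16_tflops:.2f} TFLOPS")  # ~1.02 TFLOPS
```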


Huang added that the energy efficiency of Maxwell is what makes the chip’s power possible. While showing a live run of Unreal Engine 4’s Elemental demo, he said the chip draws less than 10W of power, compared to current gaming consoles that use over 100W. Within the same thermal envelope as the Tegra K1, the X1 is able to double the processing power.
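
Taken at face value, the stage numbers imply an efficiency figure that is easy to work out. This is only a back-of-envelope comparison using the round numbers Huang quoted, not a measured benchmark.

```python
# Rough efficiency estimate from the keynote figures: ~1 teraflop of FP16
# performance at under 10W, versus over 100W for a current gaming console.
fp16_tflops = 1.0
chip_watts = 10.0         # "less than 10W" during the Elemental demo
console_watts = 100.0     # "over 100W" cited for current consoles

gflops_per_watt = fp16_tflops * 1000 / chip_watts
print(f"~{gflops_per_watt:.0f} GFLOPS per watt, at roughly "
      f"{console_watts / chip_watts:.0f}x less power than a console")
```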

Developing the Tegra X1 was important because two new in-car platforms were also unveiled during the presentation. The first is an in-car computer system called Drive CX. Car manufacturers can use Drive CX to handle a number of different tasks simultaneously, including running cameras, a center-console infotainment system, and the instrument cluster, while customizing different aspects through the Drive Studio runtime. The small Maxwell-powered computer can drive two 4K displays at 60Hz or four high-definition displays, and it can run one or two operating systems for different sections depending on the tasks assigned to it.
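
For scale, here is the raw pixel throughput each display configuration implies, assuming the “high-definition” panels are 1920x1080 (the keynote did not specify a resolution):

```python
# Pixel throughput for the two Drive CX display configurations.
# Assumes "high-definition" means 1920x1080, which the keynote did not specify.
def pixels_per_second(width, height, hz, panels):
    return width * height * hz * panels

two_4k  = pixels_per_second(3840, 2160, 60, 2)   # two 4K panels at 60Hz
four_hd = pixels_per_second(1920, 1080, 60, 4)   # four 1080p panels at 60Hz

print(f"Two 4K @ 60Hz:  {two_4k / 1e6:.0f} megapixels per second")   # ~995
print(f"Four HD @ 60Hz: {four_hd / 1e6:.0f} megapixels per second")  # ~498
```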


With the Drive CX and Drive Studio runtime, Nvidia is looking to present new options for the future. The system is said to handle current tasks like navigation, text-to-speech, media control and climate control, but can do much more. A center console shown in a demo was able to generate a split screen, with Android projected mode running on one half while the media and climate systems slid down to the bottom.

The customization goes further, as the processing power of the Tegra X1 allows the Drive CX to perform “material rendering,” giving rendered items in the 3D space textures like carbon fiber or bamboo while keeping their natural characteristics under different lighting and viewing angles. Custom modes for different users are said to be possible, and manufacturers can pull in their own geometry for looks and style. All items, including 3D navigation, are rendered in real time.
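
Nvidia did not detail how Drive Studio implements material rendering, but the light- and view-dependent behavior described is what physically inspired shading models provide. The sketch below is purely illustrative; the material parameters and function are hypothetical stand-ins, not anything from Drive Studio, and simply show why a glossy carbon fiber surface and a matte bamboo surface respond differently as the viewing angle changes.

```python
import numpy as np

# Illustrative only: a minimal Blinn-Phong shading function showing how material
# parameters change a surface's response to light and viewing direction.
# The parameter values are invented for this example, not taken from Drive Studio.
MATERIALS = {
    "carbon_fiber": {"diffuse": 0.2, "specular": 0.9, "shininess": 64.0},
    "bamboo":       {"diffuse": 0.7, "specular": 0.2, "shininess": 8.0},
}

def shade(normal, light_dir, view_dir, material):
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    h = (l + v) / np.linalg.norm(l + v)          # half-vector for the specular term
    diffuse = material["diffuse"] * max(np.dot(n, l), 0.0)
    specular = material["specular"] * max(np.dot(n, h), 0.0) ** material["shininess"]
    return diffuse + specular

# The same surface point reads differently per material as the camera moves.
for name, mat in MATERIALS.items():
    head_on = shade(np.array([0.0, 0, 1]), np.array([0.0, 1, 1]), np.array([0.0, 0, 1]), mat)
    grazing = shade(np.array([0.0, 0, 1]), np.array([0.0, 1, 1]), np.array([1.0, 0, 0.2]), mat)
    print(f"{name:13s} head-on={head_on:.2f}  grazing={grazing:.2f}")
```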


Looking further into the future of cars, Nvidia announced Drive PX. Where Drive CX provides the power to drive in-car elements, Drive PX looks to help drive the car itself. The unit contains two Tegra X1 processors, giving it 2.3 teraflops of throughput. It can take input from up to 12 high-definition cameras at 60Hz, processing 1.3 billion pixels per second. Drive PX builds on the evolution of car technologies like sensors, cameras and radar, moving further toward camera-based autonomous driving.
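
The 1.3 billion pixels per second figure lands in the same ballpark as twelve 1080p streams at 60Hz, assuming “high-definition” means 1920x1080 here (Nvidia did not specify the camera resolution):

```python
# Sanity check on the Drive PX camera-processing figure.
# Assumes the 12 "high-definition" cameras are 1920x1080, which was not specified.
cameras, width, height, hz = 12, 1920, 1080, 60
pixels_per_second = cameras * width * height * hz
print(f"{pixels_per_second / 1e9:.2f} billion pixels per second")  # ~1.49, vs. 1.3 quoted
```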

Two technologies are utilized in the Drive PX, which Nvidia calls “deep neural network computer vision” and “surround vision.” Deep neural network computer vision allows the Drive PX to learn to recognize items through deep learning, the same machine learning approach that has accelerated image recognition in artificial intelligence research. The technology was developed to help systems make sense of what they are seeing faster, cutting training time from months or years down to days or hours.
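
Nvidia did not describe the network itself on stage, but the core idea behind deep-learning classification can be sketched in a few lines: stacked layers of learned weights turn raw pixel values into scores over classes, and training adjusts those weights from labeled examples rather than hand-written rules. The toy forward pass below uses random weights as stand-ins for a trained network; the class list is invented for illustration.

```python
import numpy as np

# Toy forward pass of a two-layer network scoring an image crop against
# road-scene classes. Weights are random placeholders; a real system
# learns them from large sets of labeled examples.
rng = np.random.default_rng(0)
classes = ["car", "truck", "pedestrian", "cyclist", "sign"]

image = rng.random((32, 32, 3))      # stand-in for a small camera crop
x = image.reshape(-1)                # flatten pixels into a feature vector

w1, b1 = rng.standard_normal((x.size, 64)) * 0.01, np.zeros(64)
w2, b2 = rng.standard_normal((64, len(classes))) * 0.01, np.zeros(len(classes))

hidden = np.maximum(x @ w1 + b1, 0.0)    # ReLU hidden layer
scores = hidden @ w2 + b2
probs = np.exp(scores - scores.max())
probs /= probs.sum()                     # softmax turns scores into class probabilities

print(dict(zip(classes, probs.round(3))))
```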


Where older systems could only recognize items straight on or in full view, the Drive PX allows the system to learn and recognize partially occluded items, as well as different classes of items like cars, at multiple angles and different speeds. In as little as 40 hours, the system learned different sub-classes of cars, including vans, heavy trucks, SUVs and sports cars. It could also detect a person partially hidden behind a car before reclassifying him as a cyclist once in full view.

The system learns from examples rather than symbols, reclassifying images in real time based on what it sees through attached cameras or recorded video. It builds on levels of information, working on basic classifications before moving to more specific categories. If an object is unknown, the data on that item is uploaded to a cloud-based system for additional learning and reclassification, and the new data is then pushed out via automatic updates to all other systems.
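
A loose sketch of that decision flow follows; the confidence threshold, class hierarchy, and upload helper are all hypothetical and stand in for whatever Nvidia's actual pipeline does.

```python
# Illustrative control flow: classify broadly, refine to a sub-class when possible,
# and flag low-confidence detections for cloud-side retraining. All names are
# hypothetical; this shows the general pattern, not Nvidia's implementation.
CONFIDENCE_THRESHOLD = 0.6
SUB_CLASSES = {"car": ["van", "heavy truck", "SUV", "sports car"]}

def queue_for_cloud_review(crop):
    # Placeholder: a real system would upload the image data for additional
    # learning, then push the updated model back out to every vehicle.
    pass

def handle_detection(label, confidence, sub_label=None, crop=None):
    if confidence < CONFIDENCE_THRESHOLD:
        queue_for_cloud_review(crop)          # unknown object: send data for retraining
        return "unknown"
    if sub_label in SUB_CLASSES.get(label, []):
        return f"{label} ({sub_label})"       # refined sub-class, e.g. "car (SUV)"
    return label                              # fall back to the broad class

print(handle_detection("car", 0.92, sub_label="SUV"))   # -> car (SUV)
print(handle_detection("car", 0.41))                    # -> unknown
```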


Surround vision allows the Drive PX system to build the environment as seen through cameras around the car. With four cameras, one on each side, the car can map obstacles, feature points and other information in the world around it. Nvidia showed a real-world application of the technology in a simulation, where a car equipped with Drive PX parked itself after mapping the layout of a parking garage, recognizing the cars already parked, and identifying an empty space.

By collecting feature points from the cameras in real time and tracking them frame to frame, the system can correlate those points to impassable objects. The environment is then constructed in 3D space, allowing the system to use pathfinding while accounting for the objects around the vehicle. One possible use for the technology is auto-valet parking, in which users could recall the vehicle with a smartphone once the area has been mapped and the car parked.
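
As a loose illustration of the frame-to-frame tracking step, the snippet below uses OpenCV's Lucas-Kanade optical flow, a common stand-in for this kind of feature tracking; it is not Nvidia's pipeline, and the video filename is a placeholder for any forward-facing camera recording.

```python
import cv2

# Minimal frame-to-frame feature tracking with Lucas-Kanade optical flow.
# Illustrative of the general technique only; "dashcam.mp4" is a placeholder.
cap = cv2.VideoCapture("dashcam.mp4")
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200, qualityLevel=0.01, minDistance=8)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Track each feature point into the new frame; status marks points still found.
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    tracked = new_points[status.ravel() == 1]
    print(f"tracked {len(tracked)} feature points")  # these anchor the 3D reconstruction
    prev_gray, points = gray, tracked.reshape(-1, 1, 2)

cap.release()
```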

