Secretary of Transportation Talks about Self Driving Technology at AVS 2018

At the Automated Vehicles Symposium 2018, held July 9-12 in San Francisco, Elaine Chao, the U.S. Secretary of Transportation, presented her remarks and observations on self-driving technology.

Her talk covered the challenges of gaining public trust for these technologies as they are deployed. She went on to discuss the related process of adopting driver assistance technology and the insights gained from the Department's communication with key stakeholders in this area.

The full video of the keynote speech, captured by Roadway Media, follows.


2016 Automobility LA

November 2016, Automobility LA – Where are we going with cars? In his keynote speech at the 2016 Automobility LA show in Los Angeles, Brian Cooley described the main themes for the automobile industry as follows: cars are doing more, the driver is the differentiator, many paths to electrification, the new performance, and big data.

There are three forces changing cars today: electrification – EVs will likely reach a tipping point around 2025, perhaps reaching 35% of new sales by 2040; connectivity – 75% of the 90M+ new cars sold annually will be connected by 2020; and autonomy – 10M+ fully or partially autonomous cars will be on the road by 2020.

The world is getting serious about EVs. An MIT study pointed out that 0.83% of the US car market is BEV/PHEV and that 87% of cars could be replaced by current BEV/PHEV models. In addition, electric vehicles are well suited to car sharing, which is considered the future of the industry. Young people's attitude toward cars has been changing, and they do not necessarily feel a desire to own a vehicle. Bloomberg's predictions are very optimistic: 40% of global new car sales will be BEV/PHEV by 2040.

Car connectivity is necessary because of the constant need for communication (web messaging, social media sharing), navigation (search to destination), entertainment (streaming music and video), and telematics (remote status and control). Cars come home: the two main spaces, home and car, are connected.

Autonomy is the future. The advantages of autonomous cars are as follows: accident reduction, personal time recapture, congestion reduction, better road utilization, and fuel efficiency.

Computer Vision in Cars

Embedded Vision Summit, May 2016 – Marco Jacobs from Videantis talked about the status, challenges, and trends in computer vision for cars. Videantis, which has been in business for over 10 years, is the number one supplier of vision processors. In 2008, the company moved into the automotive space.

What is the future of transportation?

We will definitely be traveling less than we do today – typically under 100 miles in a day. The only autonomous people mover today is the elevator. So, when can we expect autonomous cars on our roads? At CES 2016, the CEO of Bosch answered, "Next decade, maybe." For now we have low-speed and parking assistance; highway, exit-to-exit driving should arrive around 2020.

Autonomous vehicle development is described in levels that range from L0 to L5. Levels L0, L1, and L2 are in production today. At L0 the driver fully operates the vehicle; at L1 the driver holds the wheel or controls the pedals while the vehicle steers or controls speed; at L2 the vehicle drives itself, but not 100% safely, so the driver must monitor at all times. Level 3 still needs R&D, and L4 and L5 are where the paradigm changes.

How does the market look in numbers?

There are 1.2B vehicles on the road; 20 OEMs each produce over 1M vehicles per year; 100M cars are sold each year; 100 Tier 1 suppliers each generate over $1B in revenue ($800B combined); the business is just under $1T excluding infrastructure, fuel, and insurance; and around 25% of a car's cost is electronics.

Today, new cars carry 0.4 cameras on average. There is an opportunity to increase that number to 10 cameras per car to extend visibility. L0 vehicles carry rear, surround, and mirror cameras; L2 and L3 vehicles will carry rear, surround, mirror, and front cameras.

Rear camera functions typically include a wide-angle lens, lens dewarp, overlay graphics for guidelines, and H.264 compression for transmission over the in-car automotive Ethernet. Vision technology adds real-time camera calibration, dirty lens detection, parking assistance, cross-traffic alert, backover protection, and trailer steering assistance.
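
For illustration, here is a minimal sketch of the lens-dewarp step, assuming OpenCV's fisheye camera model; the intrinsics and distortion coefficients below are placeholder values, since real ones come from per-camera calibration.

# Minimal sketch of the lens-dewarp step mentioned above, using OpenCV's
# fisheye model. K and D are hypothetical placeholders; real values come
# from per-camera calibration.
import cv2
import numpy as np

def dewarp_rear_camera(frame, K, D):
    """Undistort a wide-angle rear-camera frame for display or analysis."""
    h, w = frame.shape[:2]
    # Build undistort/rectify maps once per calibration, then remap each frame.
    new_K = cv2.fisheye.estimateNewCameraMatrixForUndistortRectify(
        K, D, (w, h), np.eye(3), balance=0.0)
    map1, map2 = cv2.fisheye.initUndistortRectifyMap(
        K, D, np.eye(3), new_K, (w, h), cv2.CV_16SC2)
    return cv2.remap(frame, map1, map2, interpolation=cv2.INTER_LINEAR)

# Example with assumed intrinsics for a 1280x800 wide-angle sensor.
K = np.array([[400.0, 0.0, 640.0],
              [0.0, 400.0, 400.0],
              [0.0, 0.0, 1.0]])
D = np.array([[0.1], [-0.05], [0.0], [0.0]])   # k1..k4 fisheye coefficients (assumed)
frame = np.zeros((800, 1280, 3), dtype=np.uint8)  # stand-in for a camera frame
undistorted = dewarp_rear_camera(frame, K, D)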

Typical surround view functions include image stitching and re-projection. Vision technology in this application offers structure from motion and automated parking assistance – marker detection, free parking space detection, and obstacle detection – in addition to everything the rear camera vision functions include.
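
As a rough sketch of the re-projection idea (not a specific vendor implementation), each dewarped camera image can be warped onto a common top-down canvas with a per-camera homography and then blended; the homographies below are identity placeholders standing in for values obtained from extrinsic calibration.

# Sketch of surround-view re-projection: warp each camera frame onto a
# bird's-eye canvas with a per-camera homography, then blend the overlaps.
import cv2
import numpy as np

def birds_eye_view(frames, homographies, canvas_size=(800, 800)):
    """Re-project dewarped camera frames onto one top-down canvas."""
    canvas = np.zeros((canvas_size[1], canvas_size[0], 3), dtype=np.float32)
    weight = np.zeros((canvas_size[1], canvas_size[0], 1), dtype=np.float32)
    for frame, H in zip(frames, homographies):
        warped = cv2.warpPerspective(frame.astype(np.float32), H, canvas_size)
        mask = (warped.sum(axis=2, keepdims=True) > 0).astype(np.float32)
        canvas += warped
        weight += mask
    # Average the contributions where cameras overlap.
    return (canvas / np.maximum(weight, 1.0)).astype(np.uint8)

# Example with identity homographies as stand-ins for calibrated ones.
frames = [np.zeros((800, 1280, 3), dtype=np.uint8) for _ in range(4)]
homographies = [np.eye(3) for _ in range(4)]
top_down = birds_eye_view(frames, homographies)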

Replacing mirrors with cameras will reduce drag and expand the driver's view. Typical functions include image stitching and lens dewarping, blind spot detection, and rear collision warning. Vision technology offers object detection, optical flow, and structure from motion, as well as the basic features found in rear cameras.
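
A hedged sketch of the optical-flow piece: OpenCV's dense Farneback flow can flag strong sideways motion in the camera's view, the kind of signal a blind spot or rear collision warning would build on. The threshold is an illustrative value, not a tuned parameter from the talk.

# Dense optical flow between consecutive frames; strong horizontal flow is
# used here as a toy proxy for an approaching or overtaking vehicle.
import cv2
import numpy as np

def strong_horizontal_motion(prev_gray, curr_gray, flow_threshold=4.0):
    """Flag strong horizontal motion between two consecutive grayscale frames."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    horizontal = np.abs(flow[..., 0])
    return horizontal.mean() > flow_threshold

prev_frame = np.zeros((480, 640), dtype=np.uint8)  # stand-ins for camera frames
curr_frame = np.zeros((480, 640), dtype=np.uint8)
print(strong_horizontal_motion(prev_frame, curr_frame))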

Typical functions of a front camera, which controls speed and the steering wheel, are emergency braking, automatic cruise control, pedestrian and vehicle detection, lane detection and keeping, traffic sign recognition, headlight control, bicycle recognition (2018), and intersections (2020). Driver monitoring, already present at L0, includes driver drowsiness detection, driver distraction detection, airbag deployment, seatbelt adjustment, and driver authentication. Vision technology offers face detection and analysis, as well as driver posture detection.
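
As an illustration of the lane-detection function, here is a minimal classic Canny-plus-Hough sketch; the thresholds and region-of-interest polygon are assumed values for a 1280x720 front camera, and production systems use far more robust methods.

# Classic lane detection: edge detection, road-region masking, then a
# probabilistic Hough transform to extract candidate lane-line segments.
import cv2
import numpy as np

def detect_lane_segments(frame_bgr):
    """Return candidate lane-line segments from a front-camera frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)

    # Keep only the lower road region (trapezoid in front of the vehicle).
    h, w = edges.shape
    roi = np.zeros_like(edges)
    polygon = np.array([[(0, h), (w, h), (int(0.6 * w), int(0.6 * h)),
                         (int(0.4 * w), int(0.6 * h))]], dtype=np.int32)
    cv2.fillPoly(roi, polygon, 255)
    edges = cv2.bitwise_and(edges, roi)

    # Probabilistic Hough transform returns line segments (x1, y1, x2, y2).
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                               minLineLength=40, maxLineGap=100)
    return [] if segments is None else segments.reshape(-1, 4)

frame = np.zeros((720, 1280, 3), dtype=np.uint8)  # stand-in for a camera frame
print(len(detect_lane_segments(frame)))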

Here is the image processing pipeline:

[Slide: image processing pipeline]

The speaker discussed the challenge of working under all conditions: cold and hot (low power), dark and light (HDR, noise), dirty lenses, detection angles, operating over a speed range of 0-120 mph (which requires selecting different algorithms), and a loaded or dinged car (which requires recalibration). Another challenge is working under severe power constraints: the power source is around 100W, small form factors limit heat dissipation, and a complete smart camera must stay under 1W. There is also the question of which is better, centralized or distributed processing, as both have pros and cons. The pro of central processing is that a single processing platform eases software development; the cons are that an entry-level car then also needs a high-end head unit, the design is neither scalable nor modular, and adding cameras overloads the system. Distributed processing seems to have more pros – a low-end head unit suffices, options become plug-and-play, and every camera adds processing capability – but the cost is a more complex system. The reality today is that some cars have 250 ECUs.
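
A quick back-of-the-envelope calculation (with assumed resolution, frame rate, and bit depth, not figures from the talk) shows why feeding raw video from many cameras into one central unit scales poorly:

# Why "adding cameras causes system overload" in a fully centralized design.
# Resolution, frame rate, and bit depth below are assumed illustrative values.
CAMERAS = 10          # the per-car camera count mentioned earlier
WIDTH, HEIGHT = 1280, 800
BYTES_PER_PIXEL = 3   # 24-bit RGB
FPS = 30

raw_bytes_per_second = CAMERAS * WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
print(f"Uncompressed video into a central unit: {raw_bytes_per_second / 1e9:.2f} GB/s")
# ~0.92 GB/s of raw pixels: either the links and head unit must scale with every
# added camera, or each camera compresses (H.264) and/or processes locally and
# sends only results, which is the distributed option described above.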

Jacobs's final conclusions were: the business opportunity is huge (>$1T); self-driving car technology causes a paradigm shift (new players can grab market share); automotive is not like consumer electronics; there will be no self-driving cars in the next 10 years (the change will be gradual, with lots of driver assist functions built on vision technologies); and efficient computer vision systems are the key enabler for making our cars safer.