Alibaba Group Invests in WayRay, a Developer of AR-Enabled Car Navigation And Infotainment System

WayRay Announces Strategic Partnership with Banma Invested By Alibaba and SAIC to Develop New AR-enabled Car Navigation and Infotainment System

WayRay also announced that it has closed its Series B financing round, with participation from existing investors as well as Alibaba Group.

Over the past four years, WayRay has used US$10 million of its own funds, as well as angel and Series A venture capital, to create a patented technology for transparent holographic displays. This technology is the basis for Navion – WayRay’s first AR navigation system.

Banma Technologies is an independent startup backed by Alibaba Group and Chinese automaker SAIC Motor, dedicated to developing internet-connected cars.

WayRay will work closely with this consortium to create an advanced AR HMI that integrates augmented reality navigation, driving assistant notifications, a virtual dashboard, and more. The new system will be built into one of Banma’s 2018 car models, turning it into the world’s first vehicle in production with a holographic AR head-up display (HUD).

“At the moment, WayRay is the world’s only developer that integrates augmented reality systems into cars. It gives us an advantage over traditional HUDs and provides the opportunity to collaborate with the largest global car brands,” said Vitaly Ponomarev, founder and CEO of WayRay.

WayRay transforms the windshield into a new medium for information

The AR infotainment system in the Oasis allows the “driver” to benefit from useful route information while the “passenger” is entertained by social media updates and interesting location alerts.

Holographic AR displays for cars

WayRay brings in the Holographic Augmented Reality Display, which acts like a fully featured, non-wearable augmented reality infotainment system.

It shows relevant information at a comfortable distance for users’ eyes – from 3 meters to infinity.

Holographic AR infotainment systems for self-driving cars

WayRay offers a cutting-edge AR infotainment system for self-driving cars. It creates a virtual world around the car that moves with you and continuously changes following your preferences and route.

To see a video of the windshield and more, visit: https://wayray.com/industry/

@Info credits: WayRay and Alibaba Group.

 

ARM advancing IVI and ADAS systems

As excitedly as we talk about it, you’d think the promise of fully autonomous vehicles is right around the bend, just another few miles and we’ll pull into our destination. In truth, we’re not there yet, and a few sceptics even suggest we’ll never get there — too many bumps in the road ahead.

As engineers, we know that we’ll get to the holy grail of autonomous driving. The key questions are how long will it take, what’s the best route and how much will it cost?

Let’s pull out a map and see how we can get there from here in automotive electronics. We know it will require serious engineering feats to integrate all of the different components and deliver seamless safe performance. The good news is we’re perched on a stepping stone right now: advanced driver assistance systems (ADAS), which are seen today in features such as emergency braking assist, drive- and steer-by-wire and collision avoidance. If we want to see this type of advanced technology in every mass-market vehicle, these systems need to be robust, low cost and low power. But this is what the collaborative design chain ecosystem does well and relentlessly optimizes. These core engineering tenets are a compass that will help us navigate the map toward the holy grail of autonomous vehicles.

Open source development for vehicles

Take for example the recent announcement that Renault is teaming up with ARM to open the software and hardware architecture for its Twizy car. Open source development can enable new features to be developed faster and with more variety. This is an exciting new step for the ARM ecosystem, which offers the widest choice of silicon suppliers. Constantly pushing boundaries, our partners address a range of automotive technology use cases – from the smallest sensor to the highest compute required for both feature-rich in-vehicle infotainment (IVI), and ADAS. The ARM partnership combines the technology IP, tools and software needed to create the scalability, security and safety for the automotive experience of the future.

This is probably the most important time for automotive electronics design. It is the start of a world driven by data and technology where new business models offer choices and new experiences for users. Indeed, the future may be less about the drive, as cars will increasingly drive themselves, but more about comfort-level choices, entertainment and the specific functions owners want from a vehicle. For example, some features that are appropriate for a small city car will look out of place in a family car or an SUV. Enabling multiple suppliers increases the pace of progress and the number of choices.

We’ve just seen an example of how ecosystem innovation is quickening the pace for automotive designers on the road to the autonomous vehicle holy grail. Let’s take a closer look at how the ARM ecosystem is addressing two key areas of automotive, IVI and ADAS, which will be key to the adoption and success of autonomous vehicles.

IVI connects the car to the outside world

A new study from Juniper Research titled “M2M: Strategies & Opportunities for MNOs, Service Providers & OEMs 2016-2021” concludes that over the next five years, data from connected car infotainment and telematics will comprise up to 98% of all M2M (machine to machine) data traffic.

This poses a number of challenges for manufacturers and chip makers for IVI — thermal, area and power constraints, to name a few. These challenges arise from diverse requirements, which involve powering a large screen, enabling tactile feedback, displaying multiple apps coming from multiple sources and so on. Here, the lessons of decades of mobile design are being applied to automotive: ARM’s low power processors, combined with big.LITTLE™ technology, enable the manufacturer to ensure high, sustained performance throughout the IVI experience.
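As a rough illustration of this idea, the sketch below assigns hypothetical IVI tasks to high-performance “big” or energy-efficient “LITTLE” cores based on estimated load. The task names, load figures, and threshold are invented for illustration – this is not ARM’s actual scheduling policy, which in practice is handled by the operating system’s scheduler.

```python
# Illustrative sketch (not ARM's scheduler): a big.LITTLE-style placement
# policy that sends compute-heavy IVI tasks to "big" cores and keeps light
# background tasks on power-efficient "LITTLE" cores.
# Task names and load estimates are hypothetical.

def place_tasks(tasks, little_capacity=0.4):
    """Assign each (name, estimated_load) task to a cluster.

    Loads above little_capacity go to the "big" cluster; the rest stay
    on "LITTLE" to save power and reduce heat.
    """
    placement = {}
    for name, load in tasks:
        placement[name] = "big" if load > little_capacity else "LITTLE"
    return placement

ivi_tasks = [
    ("navigation_render", 0.7),  # 3D map drawing is compute-heavy
    ("video_playback", 0.6),     # decode plus display pipeline
    ("radio_decode", 0.1),       # lightweight background task
    ("touch_input", 0.05),       # event handling only
]

print(place_tasks(ivi_tasks))
```

The same heuristic scales in either direction: a richer model would also weigh thermal headroom and deadlines, but the core trade-off – performance where needed, efficiency everywhere else – stays the same.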

And since IVI connects the user to the outside world, communication needs to be secure, as well as the updates to apps and the vehicle software. Charlie Miller famously gained access to control a Jeep by exploiting a security hole in the car’s head unit, so security is paramount here. Safety is a concern here as well, with many systems requiring ASIL B or higher compliance, due to their impact on driver safety.

Technology choice is crucial as well, since IVI applications include GPS navigation, radio, telematics, video playback, and all of these will have different computing requirements. Selecting the right processor for each task will mean less heat, which in turn reduces the amount of cooling needed on-board, saving weight and cost, and increasing reliability.

Seamless cockpit experience

ADAS distributes intelligence across the vehicle

Analysts predict that the on-board computing power of a normal saloon will increase by 100x from 2016 to 2025, powering ADAS and IVI functions. This encompasses a huge computing spectrum, as ADAS combines information from the many sensors dotted all over the car, feeding into a large processing unit that makes sense of the data, and makes decisions in real time. These sensors include radar, lidar, ultrasonic and cameras, all designed to improve driver safety.
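A minimal sketch of the fusion step described above, assuming each sensor reports a range estimate for the same object together with a noise variance. The sensor readings and variances below are hypothetical, and real ADAS stacks use far more sophisticated filters (Kalman filters and their variants) – this only shows the basic principle of weighting more trustworthy sensors more heavily.

```python
# Illustrative sketch (not a production ADAS stack): fusing range estimates
# from several sensors by inverse-variance weighting, so the most precise
# sensor contributes the most to the combined estimate.
# The readings and variance figures below are hypothetical.

def fuse_range(measurements):
    """measurements: list of (range_m, variance) pairs for one object.

    Returns (fused_range, fused_variance). The fused variance is always
    smaller than any single sensor's variance.
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused = sum(w * r for (r, _), w in zip(measurements, weights)) / total
    return fused, 1.0 / total

readings = [
    (25.3, 0.04),  # radar: good range accuracy
    (25.1, 0.01),  # lidar: best range accuracy
    (26.0, 0.25),  # camera: range only inferred, so noisier
]

rng, var = fuse_range(readings)
print(round(rng, 2), round(var, 4))  # estimate lands closest to the lidar reading
```

Note how the fused estimate sits nearest the lidar value: its low variance gives it the largest weight, while the noisy camera reading barely moves the result.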

Safe operation is vital here, as this is where we as drivers are relinquishing responsibility and giving it to the car itself. That’s why all ADAS components must be designed in line with the highest safety standards, many up to ASIL D.

ADAS functions

The range of functions means that a broad ecosystem is required to deliver specialized computing in each area, allowing OEMs the best choice for each vehicle — whether a small city car, a premium saloon or an SUV. For the body components they must be small and highly integrated, but it is still critical they are secure from both external and internal interference. If you depend on your car’s sensors to tell you the lane beside you is clear for overtaking, you want to make sure it cannot be hacked.

Just like we are seeing with IVI, in ADAS specialization for each task is required due to the cost, power and thermal constraints for managing this number of processors. With ARM’s broad portfolio of low-power technology and comprehensive ecosystem built on a standard architecture, OEMs can have the choice about specialization and ease of integration required, no matter the vehicle or specification.

Safety and security are the responsibility of everyone

Safety is critical, due to the potential damage or injury caused when something goes wrong. Likewise, security, seemingly always in the news these days, is top of mind — not only for the same reason, but also because vehicles have a lifespan of 10-15 years. It’s vital to know vehicles will be secure now, and can be kept secure in the future as new attacks evolve.

No one supplier alone can ensure a vehicle’s safety and security. This requires an ecosystem of companies working together, from hardware to software, from operating system to manufacturer. Working with a standard architecture, manufacturers can reduce cost, increase scalability and ensure the quickest time to market.

ARM offers comprehensive security solutions throughout the vehicle, with automotive technology that builds on the trusted security, from sensor to server, successfully deployed in billions of mobile, networking and embedded applications. With an automotive dream-team of safety and security solutions – ISO 26262 support and TrustZone technology – ARM provides a trusted foundation for OEMs and users of the connected car. 

ARM technology for automotive

ARM technology provides a foundation for the future of automotive

The ARM ecosystem — constantly collaborating on technologies and solutions — is committed to working together to build safe and secure chips that scale to deliver the right performance for each function. That journey you see on the map may appear long, but from tiny radar sensors to powerful IVI and ADAS controllers, the future of driving technology is coming faster than you think.

@Info credits: ARM.

Continental brings gesture control to the steering wheel

In a move to reduce driver workload and provide a holistic human-machine interface, automotive supplier Continental is integrating gesture control into the steering wheel. This technique, hitherto restricted to infotainment systems, will now be transferred to the steering wheel.

The central element of steering-wheel-based gesture control is a time-of-flight sensor built into the instrument cluster. Integrating the sensor in this unusual position enables a solution that minimizes driver distraction and paves the way for further improvements toward a holistic HMI, Continental argues.

The time-of-flight sensor detects the motion of the hand and converts it into actions. The driver can navigate through menus by swiping up and down, and confirm a selection with a brief tapping motion. Touch-free operation is also possible for other functions: for example, by moving his fingers up and down in a uniform movement while keeping his hands on the steering wheel, the driver can accept or reject calls. A gesture is typically a movement linked to a specific function.

Thanks to the time-of-flight sensor integrated in the instrument cluster, this development achieves a high rate of gesture recognition. The sensor comprises a 3D camera system with an integrated 3D image sensor and converts the detected infrared signal into a 3D image. Consequently, the driver’s hand positions and gestures are detected with millimeter precision and converted to actions.
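A toy sketch of how such gestures might be distinguished once the sensor has delivered a short series of 3D hand positions. The sample format, thresholds, and gesture names are invented for illustration – this is not Continental’s algorithm, only a minimal example of mapping position data to actions.

```python
# Illustrative sketch (not Continental's algorithm): classifying a thumb
# gesture from a short sequence of (x, y, z) hand positions in metres,
# oldest sample first, such as a time-of-flight sensor might deliver.
# All thresholds and names here are hypothetical.

def classify_gesture(samples, move_thresh=0.03, tap_thresh=0.02):
    """Return "swipe_up", "swipe_down", "tap", or "none"."""
    dy = samples[-1][1] - samples[0][1]               # net vertical motion
    dz = min(p[2] for p in samples) - samples[0][2]   # dip toward the sensor
    if dy > move_thresh:
        return "swipe_up"
    if dy < -move_thresh:
        return "swipe_down"
    if dz < -tap_thresh:                              # brief forward tap
        return "tap"
    return "none"

swipe = [(0.0, 0.00, 0.30), (0.0, 0.02, 0.30), (0.0, 0.05, 0.30)]
print(classify_gesture(swipe))  # net upward motion is read as "swipe_up"
```

A production system would of course filter noise, track the hand over many frames, and reject everyday movements – exactly the unintentional-gesture problem Continental addresses by restricting recognition to a defined area on the wheel.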

While existing solutions frequently required drivers to take their hands off the wheel or their eyes off the road ahead, the new solution’s action radius is much more focused. “With gestures in a clearly defined area on the steering wheel, we can minimize distraction and increase safety. This narrowing down also prevents the driver from unintentionally starting gesture-based control by means of their usual everyday gestures, and thus making unwanted selections,” declares Ralf Lenninger, head of Strategy, System Development, and Innovation in Continental’s Interior division.

 

The system can currently detect four different gestures: setting the navigation, browsing through apps and starting music, answering calls, and controlling the on-board computer. Initial reactions of test users confirm the selection of these gestures. In particular, they welcomed the proximity to the steering wheel, operation with the thumb, as well as the intuitive learnability of the gestures. “The development of a holistic human-machine interface is crucial for further strengthening the driver’s confidence in their vehicle. Building up this confidence, combined with an intuitive dialog between driver and vehicle is yet another important step on the road to automated driving, one that we are supporting with gesture-based control on the steering wheel,” Ralf Lenninger summarizes.

The new operating concept integrates seamlessly into the holistic human-machine interface and can replace other elements such as buttons or even touch-sensitive surfaces on the steering wheel. Instead, it uses two transparent plastic panels – without any electronic components – behind the steering wheel, which a driver can operate with his thumbs, almost like a touchpad. As a result, the driver benefits from intuitive operation, while vehicle manufacturers benefit from optimized system costs. The clear design of the panels is compatible with almost any control geometry and new gestures can be added at any time. In addition, the variable complexity ensures that the system can be integrated in many different vehicle classes and not just in the luxury segments.

 

@Info credits: Continental.