
Amid the mobility-sector megatrends "Connected – Autonomous – Shared – Electric," the vehicle interior is gaining enormous importance in users' perception and could soon become the most important differentiating feature for the automotive industry.
Increasing complexity in the interior is causing growing uncertainty among customers. For many years, the automotive industry has tried to make this complexity intuitively usable by developing elaborate Human Machine Interfaces (HMIs) with an ever-increasing number of screens and sensors. The integration of touch screens, voice recognition and eye-tracking systems has significantly improved the technical side of the HMI, but the human side has been neglected. This paper presents a concept to simplify the HMI by means of sensor fusion and a cognitive thinking model.
In the automotive domain, the use of audio to create unique user experiences is becoming increasingly important. In-vehicle audio systems are no longer used just for entertainment but increasingly serve as a central interface for communication between the vehicle and its occupants. A key element here is interactivity, which most audio systems lack. This talk presents a new approach in which 3D audio technology based on audio objects acts as a central audio HMI platform in the vehicle, enabling the reproduction of interactive and immersive audio content and thus creating new application scenarios for the use of audio in vehicles. The object-based approach reduces the effort for audio production as well as for adaptation to different loudspeaker setups. As a uniform interface for all departments developing audio functions, e.g. for infotainment, driver assistance or active sound design, it significantly reduces the complexity of coordination processes and simplifies the workflow.
The talk will cover two major topics for the future in-cabin experience, starting with safety, moving on to the latest comfort features, and closing with an outlook on the future.
Current 2D IR driver monitoring systems (DMS) and upcoming next-generation 2D RGB/IR occupant monitoring systems (OMS, expected around 2026) are expected to rely fully on 2D information. This is a good first step for OEMs: integrating a camera inside the car to monitor head and eye state and alert the driver in case of drowsiness or inattention.
Sony believes that in the future, advanced occupant state, posture and context awareness will be needed to achieve a safer in-cabin environment for all occupants. Several sensors therefore need to be fused, not only to monitor the driver but also to understand the activity and behavioral context of the driver and all passengers. With this understanding, active safety systems step up in performance and passive safety systems, such as active restraint systems, can be optimally supported.
In the talk, Sony describes the necessity of sensor fusion, supports this position with research, and shows ways to enable safe in-cabins.
The second half presents Sony's latest ideas for implementing projectors, gesture interaction and further comfort features inside future cars.
With the digital experience concept of the Audi activesphere concept, Audi is reinventing the user interface, making it a holistic, personal and immersive experience – even beyond the car.
It becomes a multi-dimensional experience. The interface becomes smarter, distributed and contextual, enhancing the real world with next-generation technology and high-end AR glasses to create a mixed reality (XR) experience. The interface extends beyond the car, providing continuous access to your personal experience device and ecosystem. It is available when you need it and gone when you don't.