Intelligent vehicle technology is advancing at a vertiginous pace. However, the complexity of some highly uncertain and dynamic urban driving scenarios suggests that full automation may have usage limits in the near future. The AUTOPIA Program was created 20 years ago with the aim of mitigating the potential limitations of fully autonomous driving through shared human-machine control and cooperation with other road agents.
AUTOPIA has solid experience in providing intelligence to automated vehicle systems in specific situations where communication and interaction abilities can help resolve the understanding and decision dilemmas of isolated self-driving cars. The group has a growing interest in decision-making architectures in which driver intentions and skills can be incorporated at different assistance levels (from SAE L2 to L4). In this connection, the influence of world-modelling, localization and mapping uncertainty on decision-making and road interactions is a key research question that articulates AUTOPIA's scientific activity.
AUTOPIA aims to combine the development of specific solutions for its context (medium term) with the exploration of techniques that can help meet the major challenges of urban environments (long term). Given the strong segmentation of upcoming solutions, the long transition period during which manual and automated vehicles will have to coexist, and the need to significantly enhance the verifiability of AI-enabled systems, three design principles are at the core of our research:
- Adaptability to driving scenarios and personalization to passengers’ preferences
- Dependability in the identified operational domain
- Safety by design through explainable AI
AUTOPIA has a fleet of five automated vehicles: two electric vans and three gas-propelled cars. Despite small differences in the instrumentation of the vehicles, they all share a common software architecture for autonomous driving. The main sensor is a differential GPS/GLONASS receiver with centimetre accuracy, which is combined with an Inertial Measurement Unit (IMU) and CAN bus information (odometry) for precise vehicle localization. Moreover, some of the vehicles also carry cameras and different LIDAR sensors (Ibeo, Velodyne Puck) for obstacle/pedestrian detection and localization, as well as a Driver Monitoring System that allows a smooth and safe override when needed.

Based on the information received from the sensors and the commands previously given by the driver, the on-board unit (based on an Intel architecture) decides the actions to be taken on the main actuators of the car (i.e. throttle, brake and steering wheel), guiding the vehicle along a trajectory planned in real time. The hardware architecture allows complementary embedded platforms, connected to an internal network, to be introduced so that additional functions can be isolated. We have recently acquired an NVIDIA Drive PX2 platform to evaluate its acceleration potential; it will be used in different projects to embed an AI-based system able to determine the most adequate level of automation for every driving situation, given the driver status and the health of the embedded sensors.
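The GNSS/IMU/odometry fusion described above is commonly implemented with a Kalman-type filter. The following is a minimal, illustrative sketch only, not AUTOPIA's actual software: a linear Kalman filter whose prediction step integrates an IMU/odometry acceleration input and whose correction step uses the GNSS position fix. The state layout, noise values and function names are assumptions made for the example.

```python
import numpy as np

def fuse_step(x, P, u, z, dt, q=0.1, r=0.05):
    """One predict/correct cycle of a linear Kalman filter.

    x: state [px, py, vx, vy]; P: state covariance (4x4)
    u: IMU/odometry acceleration input [ax, ay]
    z: GNSS position fix [px, py]
    q, r: assumed process / measurement noise magnitudes
    """
    F = np.eye(4)
    F[0, 2] = F[1, 3] = dt                       # position integrates velocity
    B = np.array([[0.5 * dt**2, 0.0],
                  [0.0, 0.5 * dt**2],
                  [dt, 0.0],
                  [0.0, dt]])                    # constant-acceleration input
    H = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0]])         # GNSS observes position only

    # Predict with the dead-reckoning input (IMU/odometry)
    x = F @ x + B @ u
    P = F @ P @ F.T + q * np.eye(4)

    # Correct with the (centimetre-accurate) GNSS fix
    y = z - H @ x                                # innovation
    S = H @ P @ H.T + r * np.eye(2)
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Toy run: vehicle accelerating at 1 m/s^2 along x for 1 s,
# with GNSS fixes consistent with that motion.
x, P = np.zeros(4), np.eye(4)
dt = 0.1
for k in range(10):
    t_next = (k + 1) * dt
    x, P = fuse_step(x, P,
                     u=np.array([1.0, 0.0]),
                     z=np.array([0.5 * t_next**2, 0.0]),
                     dt=dt)
# After 1 s: x[0] ~ 0.5 m travelled, x[2] ~ 1.0 m/s
```

In a real vehicle the prediction step would run at the IMU/CAN rate and the correction step at the (slower) GNSS rate, with the covariance P carrying the localization uncertainty that feeds into downstream decision-making.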
CSIC also has at its disposal a test circuit designed as an inner-city area, with a combination of straight road segments, curves, 90° crossings, a roundabout and a parking area. Additionally, a traffic-light regulation system, chargers for electric vehicles, a smart traffic camera and RFID/Zigbee sensor networks make our facilities an excellent testing ground for validating and demonstrating new solutions to the most challenging topics in connected and automated driving.
Aerial image of the test circuit
Traffic lights, communication tower and control booth