Page 33 - Touchpad Robotics C11

Autonomous Vehicles (AVs) represent one of the most visible and transformative applications of New Age Robotics. These vehicles aim to navigate and operate without human intervention, from self-driving cars to delivery drones.

Perception and Sensor Fusion

• Concept: AVs integrate data from a multitude of advanced sensors – cameras, Lidar, Radar, Ultrasonic sensors, and GPS – to create a comprehensive and real-time understanding of their surroundings. This process is called sensor fusion.
• AI/ML Application: Deep Learning algorithms, particularly Convolutional Neural Networks (CNNs) in Computer Vision, are critical for processing raw sensor data. They enable the AV to accurately detect and classify objects (e.g., distinguish between a pedestrian, a cyclist, and a lamppost), understand traffic signs, and identify lane markings. Sensor fusion algorithms, often based on advanced filtering techniques and ML, combine data from different sensors to overcome individual sensor limitations and create a robust environmental model.
• Impact: Enables the vehicle to ‘see’ and ‘understand’ its environment with a level of detail and consistency often surpassing human capabilities, even in challenging weather conditions or low light.
• Example: A self-driving car uses Lidar to create a precise 3D map of the road, Radar to detect the speed and distance of other vehicles (even in fog), and cameras to identify traffic light colours and read signs. AI algorithms then fuse all this data to form a complete and reliable picture of the driving scene.
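To make the idea of sensor fusion concrete, here is a minimal sketch of one of its simplest building blocks: combining two noisy estimates of the same quantity by inverse-variance weighting, the core step inside Kalman-style filters. The sensor names and noise figures below are illustrative assumptions, not real sensor specifications.

```python
# Minimal sketch: fusing two noisy distance estimates (e.g., from Lidar and
# Radar) with inverse-variance weighting, a building block of Kalman-style
# sensor fusion. The numbers are illustrative assumptions.

def fuse(est_a, var_a, est_b, var_b):
    """Combine two estimates of the same quantity, weighting each by the
    inverse of its variance, so the more precise sensor counts for more."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)   # fused estimate is more certain than either input
    return fused, fused_var

# Lidar: precise (low variance); Radar: noisier, but still works in fog.
lidar_m, lidar_var = 24.8, 0.04
radar_m, radar_var = 25.6, 0.50

distance, variance = fuse(lidar_m, lidar_var, radar_m, radar_var)
print(round(distance, 2))  # lands close to the more precise Lidar reading
```

Note how the fused variance is smaller than either sensor's own variance: combining sensors does not just average them, it genuinely reduces uncertainty, which is why AVs fuse rather than pick a single "best" sensor.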
Decision Making and Path Planning

• Concept: Once the environment is perceived, the AV’s AI brain must make complex decisions in real-time about driving manoeuvres, speed, lane changes, and interaction with other road users. It also needs to plan the optimal path to its destination.
• AI/ML Application: Reinforcement Learning models are being trained in simulated environments to learn optimal driving policies by experiencing various scenarios and receiving rewards for safe and efficient driving. Predictive AI models forecast the behaviour of other vehicles and pedestrians. Advanced Path Planning algorithms (often graph-based or using optimisation techniques) calculate the safest, most comfortable, and fastest route, constantly updating it based on real-time traffic and obstacles.
• Impact: Enables highly sophisticated and adaptive driving behaviour, moving towards fully autonomous navigation, aiming for increased safety and reduced traffic congestion.
• Example: When an autonomous vehicle approaches an intersection, its AI uses predictive models to anticipate whether other vehicles will proceed or stop. Based on this, and considering factors like road conditions and traffic laws, its path planning algorithms decide the optimal speed and trajectory to navigate the intersection safely and efficiently.
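The graph-based path planning mentioned above can be sketched at toy scale: the drivable area is treated as a grid of cells, and a shortest-path search (here Dijkstra's algorithm, with uniform move costs) finds an obstacle-free route. The tiny grid is an illustrative assumption; real AVs plan over far richer road maps and also optimise for comfort and traffic rules.

```python
# Minimal sketch: graph-based path planning on a small occupancy grid using
# Dijkstra's algorithm. 0 = free road cell, 1 = obstacle. The grid layout is
# an illustrative assumption, not a real road model.
import heapq

def plan(grid, start, goal):
    """Return a shortest obstacle-free path from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    frontier = [(0, start, [start])]   # (cost so far, cell, path taken)
    seen = {start}
    while frontier:
        cost, (r, c), path = heapq.heappop(frontier)
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                heapq.heappush(frontier, (cost + 1, (nr, nc), path + [(nr, nc)]))
    return None

# A stopped vehicle (the 1s) blocks the direct route, so the planner detours.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = plan(grid, (0, 0), (2, 0))
print(path)
```

Replanning in a real AV amounts to re-running a search like this (typically A*, which adds a heuristic to Dijkstra) every time the perceived obstacle map changes.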

Human-Machine Interface (HMI) and Communication

• Concept: For autonomous vehicles, especially semi-autonomous ones, effective communication between the vehicle and its human occupants (and even pedestrians) is crucial for trust and safety.
• AI/ML Application: Natural Language Processing (NLP) allows passengers to interact with the vehicle using voice commands. AI can provide clear, timely information about the vehicle’s intentions and status. Advanced HMI designs, informed by AI, ensure intuitive controls and alerts.
• Impact: Enhanced user experience, increased passenger comfort and trust, and improved safety through clear communication between the vehicle and its users.
• Example: If a self-driving car needs to make an unusual manoeuvre or hand over control to the human driver, its NLP system might verbally explain the situation clearly (e.g., “Merging left in 100 metres,” or “Driver intervention required due to heavy rain”) while also displaying visual cues on a screen.
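The first step of a voice interface is mapping a spoken utterance to an intent the vehicle can act on. Production systems use trained NLP models for this; the keyword-matching sketch below only illustrates the idea, and the intent names and trigger phrases are invented for the example.

```python
# Minimal sketch: keyword-based intent recognition of the kind a vehicle's
# voice interface might use as a simple fallback. Real systems use trained
# NLP models; the intents and phrases here are illustrative assumptions.

INTENTS = {
    "navigate": ("take me", "drive to", "navigate"),
    "climate": ("temperature", "warmer", "cooler"),
    "status": ("battery", "range", "how far"),
}

def parse_command(utterance):
    """Map a passenger's utterance to a coarse intent label."""
    text = utterance.lower()
    for intent, phrases in INTENTS.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return "unknown"

print(parse_command("Navigate to the nearest charging station"))  # navigate
print(parse_command("How far can we go on this battery?"))        # status
```

An "unknown" result is itself useful HMI feedback: the vehicle can ask the passenger to rephrase rather than guess, which supports the trust and safety goals described above.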



                                                                                    Introduction to Robots: What Exactly are They?