Publisher's Synopsis
In autonomous vehicles (AVs), perception and planning are two fundamental components that work together to ensure safe and effective navigation. The perception system interprets the vehicle's surroundings by processing data from a variety of sensors, including cameras, radar, and LiDAR. This sensor data is used to detect and classify objects, such as other vehicles, pedestrians, cyclists, traffic signs, and road boundaries, and to assess environmental factors like weather and road conditions. Ongoing advances in machine learning and computer vision play a vital role in enhancing the accuracy and reliability of AV perception.

Planning, in turn, determines the vehicle's course of action to reach its destination safely and efficiently; it encompasses route planning, behavioral planning, and motion planning. Seamless integration between perception and planning is essential: perception provides real-time, accurate environmental data, while planning adapts dynamically as new perception inputs arrive. Achieving this integration requires a carefully designed pipeline that ensures rapid, reliable data flow, often built on software frameworks that prioritize both low latency and high dependability.

This book explores the latest technological advancements in both perception and planning, with a focus on their integration. It also delves into cutting-edge developments in autonomous racing, a domain that pushes the limits of AV technology. The book is intended for a broad audience, including academic researchers, graduate students, automotive engineers at OEMs and suppliers, software and ICT professionals, and managers and decision-makers in the field.
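The perception-to-planning data flow described above can be sketched, in heavily simplified form, as a loop in which perceived objects feed a behavioral-planning decision. This is a minimal conceptual illustration only: all class names, functions, and the 10 m safety threshold below are hypothetical, not taken from any system discussed in the book.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    """A labeled object with an estimated distance, as produced by perception."""
    label: str
    distance_m: float

def perceive(sensor_frame):
    # Placeholder perception step: in a real AV this would fuse camera,
    # radar, and LiDAR data; here we just wrap pre-labeled detections.
    return [DetectedObject(label, dist) for label, dist in sensor_frame]

def plan(objects, cruise_speed_mps=15.0):
    # Placeholder behavioral planning: brake if any detected object is
    # closer than an assumed 10 m safety threshold, otherwise cruise.
    if any(obj.distance_m < 10.0 for obj in objects):
        return {"action": "brake", "target_speed_mps": 0.0}
    return {"action": "cruise", "target_speed_mps": cruise_speed_mps}

# One cycle of the pipeline: raw frame -> perception -> planning decision.
frame = [("pedestrian", 8.0), ("vehicle", 40.0)]
decision = plan(perceive(frame))
print(decision["action"])  # brake: a pedestrian is within the threshold
```

Real systems run this cycle continuously at high frequency, which is why the synopsis stresses low-latency, dependable data flow between the two components.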