GENERATING INTERACTIVE PEDESTRIAN BEHAVIOR IN URBAN TRAFFIC ENVIRONMENTS
Achieving fully autonomous driving hinges on a vehicle's ability to interact safely with pedestrians, particularly in highly dynamic and unpredictable urban environments. Pedestrian behavior is influenced by multiple factors, including intent, environmental context, and interactions with vehicles and other road users. As a result, autonomous vehicles must not only make safe and reliable navigation decisions but also ensure these decisions are interpretable, robust, and adaptable to real-world variations. This dissertation develops a comprehensive framework that systematically addresses these challenges, progressing from explainable decision-making and intention prediction to trajectory modeling and pedestrian behavior generation.
At the core of autonomous driving lies the challenge of making informed, interpretable decisions. To address this, we develop an attention-based interrelation modeling approach that enables autonomous vehicles to reason about pedestrian interactions and predict driving actions while simultaneously generating human-readable explanations. By explicitly modeling traffic participant relationships in a structured framework, we enhance both the decision accuracy and transparency of autonomous vehicle behavior, facilitating safer interactions in urban environments.
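The interrelation idea can be illustrated with standard scaled dot-product attention over traffic-participant features; this is a minimal sketch under assumed shapes, not the dissertation's exact architecture. The attention weights double as per-participant relevance scores that can be surfaced in a textual explanation.

```python
import numpy as np

def attention(query, keys, values):
    """Scaled dot-product attention over traffic participants.

    query:  (d,)   feature vector for the ego vehicle's current decision
    keys:   (n, d) one feature row per surrounding participant
    values: (n, v) per-participant value vectors

    Returns the attended context and the weights, which indicate how
    much each participant influenced the decision.
    """
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)   # similarity, scaled for stability
    scores -= scores.max()               # shift before exp to avoid overflow
    weights = np.exp(scores) / np.exp(scores).sum()
    return weights @ values, weights
```

Because the weights form a distribution over participants, the most-attended participant can be named directly in an explanation such as "slowing down because of the pedestrian on the left."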
Building upon this foundation, we introduce an evidential transformer-based pedestrian intention prediction model that incorporates uncertainty estimation to reflect the inherent ambiguity of pedestrian behaviors. Unlike conventional deterministic models, our approach quantifies confidence levels in predictions, allowing the autonomous vehicle to adjust its decisions dynamically in response to uncertain pedestrian cues. Experimental results demonstrate that this uncertainty-aware framework improves prediction reliability and provides a valuable mechanism for integrating human-like reasoning into autonomous systems.
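The evidential formulation can be sketched as follows: class logits are mapped to non-negative evidence, which parameterizes a Dirichlet distribution whose total mass yields both class probabilities and an explicit uncertainty score. The function and shapes below are illustrative assumptions, not the dissertation's API.

```python
import numpy as np

def evidential_intention(logits):
    """Turn raw crossing/not-crossing logits into a Dirichlet belief
    with an explicit uncertainty mass (evidential deep learning sketch)."""
    evidence = np.log1p(np.exp(logits))   # softplus keeps evidence non-negative
    alpha = evidence + 1.0                # Dirichlet concentration parameters
    strength = alpha.sum()                # total evidence S
    k = len(alpha)
    probs = alpha / strength              # expected class probabilities
    uncertainty = k / strength            # vacuity: high when evidence is scarce
    return probs, uncertainty

# A pedestrian giving ambiguous cues yields weak evidence and high uncertainty:
probs, u = evidential_intention(np.array([0.2, 0.1]))
```

A downstream planner can threshold the uncertainty score, driving more cautiously when the model admits it has little evidence either way.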
To further refine motion anticipation, we develop a novel two-tower framework for pedestrian trajectory forecasting that disentangles vehicle-induced motion (caused by dash-cam movement) from true pedestrian motion. By addressing this challenge, our method significantly improves trajectory forecasting performance, particularly in first-person vision applications where ego-motion introduces noise into scene interpretation. Through evaluations on large-scale datasets, we show that this approach leads to an improvement of more than 15% in trajectory prediction accuracy, enhancing the vehicle's ability to anticipate pedestrian actions in real time.
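The disentangling idea can be sketched in a few lines: a pedestrian's observed image-plane displacement mixes their own motion with displacement induced by the moving dash-cam, and given an ego-motion estimate at the pedestrian's location, subtracting it recovers a camera-compensated residual that one branch of a two-tower model can handle while the other branch models the ego component. The arrays and numbers below are hypothetical.

```python
import numpy as np

def disentangle(observed_track, ego_flow):
    """Split observed motion into the pedestrian's own component.

    observed_track: (T, 2) per-frame pixel displacements seen on camera
    ego_flow:       (T, 2) displacement induced by camera motion at the
                    pedestrian's image location (e.g. from odometry)
    """
    return observed_track - ego_flow   # residual = true pedestrian motion

obs = np.array([[3.0, 0.5], [3.2, 0.4]])   # mixed displacements (pixels/frame)
ego = np.array([[2.5, 0.0], [2.6, 0.0]])   # camera-induced component
true_motion = disentangle(obs, ego)
```

Forecasting each component with its own branch, then recombining, prevents ego-motion noise from corrupting the model of the pedestrian's own dynamics.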
Finally, this dissertation extends beyond predictive modeling to the generation of realistic pedestrian behaviors. We present a diffusion-based framework that synthesizes both high-level trajectory information and fine-grained pedestrian poses, enabling the creation of highly realistic movement sequences for simulation and testing. By incorporating probabilistic modeling and data-driven motion synthesis, our approach advances autonomous driving validation through more realistic and diverse pedestrian behavior scenarios.
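The diffusion mechanism behind such generation can be illustrated with the standard forward-noising process: a clean trajectory is progressively corrupted toward noise, and a learned denoiser inverts the chain so that different noise draws yield diverse but plausible paths. The schedule, shapes, and step count below are illustrative assumptions, not the dissertation's actual model.

```python
import numpy as np

rng = np.random.default_rng(42)

steps = 100
betas = np.linspace(1e-4, 0.02, steps)   # linear noise schedule
alphas_bar = np.cumprod(1.0 - betas)     # cumulative signal retention

def q_sample(x0, t):
    """Draw x_t ~ q(x_t | x_0) in closed form: scaled signal plus noise."""
    noise = rng.normal(size=x0.shape)
    return np.sqrt(alphas_bar[t]) * x0 + np.sqrt(1.0 - alphas_bar[t]) * noise

# A straight 20-waypoint trajectory, noised partway and almost fully:
x0 = np.stack([np.linspace(0.0, 10.0, 20), np.zeros(20)], axis=1)
x_mid = q_sample(x0, 50)
x_end = q_sample(x0, steps - 1)
```

Generation runs this chain in reverse: starting from pure noise, a trained network removes noise step by step, and the same machinery extends naturally from 2-D waypoints to joint-level pose sequences.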
Taken together, these contributions establish a cohesive framework that advances pedestrian behavior understanding in autonomous driving systems. By integrating explainable decision-making, uncertainty-aware intention prediction, camera-motion-agnostic trajectory forecasting, and realistic pedestrian behavior generation, this research provides a robust foundation for improving the safety, interpretability, and adaptability of autonomous vehicles. The insights gained from this work hold significant implications for autonomous driving research, urban mobility planning, and the broader pursuit of human-centric AI-driven transportation systems.
Degree Type
- Doctor of Philosophy
Department
- Industrial Engineering
Campus location
- West Lafayette