SAFETYLIT WEEKLY UPDATE

We compile citations and summaries of about 400 new articles every week.

Journal Article

Citation

Yu R, Han L, Abdel-Aty M, Wang L, Zou Z. Accid. Anal. Prev. 2023; 194: e107360.

Copyright

(Copyright © 2023, Elsevier Publishing)

DOI

10.1016/j.aap.2023.107360

PMID

37897955

Abstract

Recent state-of-the-art crash risk evaluation studies have exploited deep learning (DL) techniques to improve performance in identifying high-risk traffic operation statuses. However, it is doubtful whether such DL-based models remain robust to real-world traffic dynamics (e.g., random traffic fluctuations), as DL models are sensitive to input changes, in which small perturbations can lead to wrong predictions. This study raises the critical issue of robustness for crash risk evaluation models and investigates countermeasures to enhance it. By mixing up crash and non-crash samples under the traffic flow fundamental diagram, traffic flow adversarial examples (TF-AEs) were generated to simulate real-world traffic fluctuations. With the developed TF-AEs, model accuracy decreased by 8% and sensitivity dropped by 18%, indicating weak robustness of the baseline model (a convolutional neural network (CNN)-based crash risk evaluation model). A coverage-oriented adversarial training method was then proposed to improve model robustness under highly imbalanced crash/non-crash conditions and various crash risk transition patterns. Experiments showed that the proposed method was effective in improving model robustness, preventing 76.5% of the accuracy drop and 98.9% of the sensitivity drop against TF-AEs. Finally, the stability of the evaluation model's outputs and the limitations of the current study are discussed.
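
Illustration (not the authors' implementation): the abstract describes perturbing crash samples with non-crash traffic states (a mixup-style operation under the fundamental diagram) and then adversarially training a CNN classifier on the resulting TF-AEs. The following minimal PyTorch sketch shows one plausible form of that workflow; all names (make_tf_aes, CrashRiskCNN, mix_ratio), the toy CNN architecture, and the labeling of TF-AEs as crash samples are assumptions, and the paper's coverage-oriented sample selection is not reproduced here.

# Minimal sketch, assuming a (batch, 1, detectors, time_steps) traffic-state
# input tensor; not the paper's coverage-oriented method.
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_tf_aes(crash_x, noncrash_x, mix_ratio=0.1):
    """Blend a small fraction of randomly chosen non-crash traffic states into
    crash samples to mimic random traffic fluctuations (mixup-style TF-AEs).
    mix_ratio is a hypothetical perturbation strength, not from the paper."""
    idx = torch.randint(0, noncrash_x.size(0), (crash_x.size(0),))
    return (1.0 - mix_ratio) * crash_x + mix_ratio * noncrash_x[idx]

class CrashRiskCNN(nn.Module):
    """Toy stand-in for a CNN-based crash risk evaluation model."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, 2)  # class 0 = non-crash, class 1 = crash

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def adversarial_training_step(model, optimizer, crash_x, noncrash_x):
    """One training step on clean samples plus TF-AEs, so the model keeps
    predicting 'crash' for crash states perturbed by traffic fluctuations."""
    tf_aes = make_tf_aes(crash_x, noncrash_x)
    x = torch.cat([crash_x, noncrash_x, tf_aes])
    y = torch.cat([
        torch.ones(crash_x.size(0), dtype=torch.long),      # crash
        torch.zeros(noncrash_x.size(0), dtype=torch.long),   # non-crash
        torch.ones(tf_aes.size(0), dtype=torch.long),        # TF-AEs keep crash label
    ])
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    optimizer.step()
    return loss.item()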


Language: en

Keywords

Adversarial training; Crash risk evaluation model; Model robustness; Traffic flow adversarial example; Traffic flow fundamental diagram
