SAFETYLIT WEEKLY UPDATE

We compile citations and summaries of about 400 new articles every week.

Journal Article

Citation

Basso R, Kulcsár B, Sanchez-Diaz I, Qu X. Transp. Res. E Logist. Transp. Rev. 2022; 157: e102496.

Copyright

(Copyright © 2022, Elsevier Publishing)

DOI

10.1016/j.tre.2021.102496

PMID

unavailable

Abstract

Dynamic routing of electric commercial vehicles is a challenging problem because, in addition to uncertain energy consumption, there are also random customer requests. This paper introduces the Dynamic Stochastic Electric Vehicle Routing Problem (DS-EVRP) and proposes a Safe Reinforcement Learning method for solving it. The objective is to minimize expected energy consumption in a safe way, which also means minimizing the risk of battery depletion en route by planning charging whenever necessary. The key idea is to learn offline about the stochastic customer requests and energy consumption using Monte Carlo simulations, so that routes can be planned predictively and safely online. The method is evaluated in simulations based on energy consumption data from a realistic traffic model of the city of Luxembourg and a high-fidelity vehicle model. The results indicate that it is possible to save energy while maintaining reliability by planning routes and charging in an anticipative way. The proposed method has the potential to improve transport operations with electric commercial vehicles while capitalizing on their environmental benefits.
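To illustrate the core idea described in the abstract (offline Monte Carlo simulation of stochastic energy consumption, used to weigh expected energy cost against the risk of battery depletion when deciding whether to detour to a charger), the sketch below is a minimal, hypothetical Python example. It is not the authors' implementation: the battery capacity, per-kilometre consumption rate, detour distance, depletion penalty, and the functions sample_energy_kwh, rollout_cost, and evaluate are all assumptions introduced here for illustration only.

```python
import random

# Hypothetical sketch (not the paper's method): use offline Monte Carlo
# rollouts to compare the expected cost of continuing the route directly
# versus detouring to a charger first, with a large penalty for the unsafe
# outcome of depleting the battery en route.

BATTERY_CAPACITY_KWH = 40.0   # assumed battery size
DEPLETION_PENALTY = 1000.0    # assumed penalty for running out of charge
N_ROLLOUTS = 5000             # Monte Carlo samples per candidate decision


def sample_energy_kwh(distance_km):
    """Draw a random energy consumption for one leg (assumed noise model)."""
    mean = 0.25 * distance_km  # assumed 0.25 kWh/km average consumption
    return max(0.0, random.gauss(mean, 0.05 * distance_km))


def rollout_cost(soc_kwh, legs_km, detour_to_charger):
    """Simulate one future trajectory and return its total energy cost."""
    cost = 0.0
    if detour_to_charger:
        charger_leg_km = 4.0               # assumed detour distance
        cost += sample_energy_kwh(charger_leg_km)
        soc_kwh = BATTERY_CAPACITY_KWH     # recharge to full (simplification)
    for d in legs_km:
        e = sample_energy_kwh(d)
        cost += e
        soc_kwh -= e
        if soc_kwh <= 0.0:                 # battery depleted: unsafe outcome
            return cost + DEPLETION_PENALTY
    return cost


def evaluate(soc_kwh, legs_km):
    """Print the Monte Carlo estimate of expected cost for both decisions."""
    for detour in (False, True):
        total = sum(rollout_cost(soc_kwh, legs_km, detour)
                    for _ in range(N_ROLLOUTS))
        label = "charge first" if detour else "go direct"
        print(f"{label:12s} expected cost ≈ {total / N_ROLLOUTS:7.2f}")


if __name__ == "__main__":
    random.seed(0)
    # Remaining route of three legs with 9 kWh left in the battery.
    evaluate(soc_kwh=9.0, legs_km=[12.0, 15.0, 10.0])
```

In this toy setting, the depletion penalty makes the "charge first" option cheaper in expectation when the remaining state of charge is low, mirroring the anticipative charging behaviour the abstract describes; the full DS-EVRP method additionally learns from simulated stochastic customer requests, which this sketch omits.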


Language: en

Keywords

Approximate dynamic programming; Electric vehicles; Energy consumption; Green logistics; Reinforcement learning; Vehicle routing
