Dynamic vehicle allocation policies for shared autonomous electric fleets
Date: 2022-03-31
Authors: Roy, Debjit; Dong, Yuxuan; Koster, René De; Yu, Yugang
Abstract
In the future, vehicle sharing platforms for passenger transport will be unmanned, autonomous, and electric. These platforms must decide which vehicle should pick up which type of customer, based on the vehicle's battery level and the customer's travel distance. We design dynamic vehicle allocation policies for matching appropriate vehicles to customers using a Markov decision process model. To obtain the model parameters, we first model the system as a semi-open queuing network (SOQN) with multiple synchronization stations. At these stations, customers with varied battery demands are matched with semi-shared vehicles that have sufficient remaining battery. If a vehicle's battery level drops below a threshold, it is routed probabilistically to a nearby charging station for charging. We solve the analytical SOQN model and obtain approximate system performance measures, which are validated using simulation. With inputs from the SOQN model, the Markov decision process minimizes the combined cost of customer waiting and lost demand, and yields a good heuristic vehicle allocation policy. Experiments show that the heuristic policy is near-optimal in small-scale networks and outperforms benchmark policies in large-scale, realistic scenarios. An interesting finding is that reserving idle vehicles for future short-distance customer arrivals can be beneficial even when long-distance customers are waiting.
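
To illustrate the kind of threshold-based matching and charging rule described in the abstract, here is a minimal Python sketch of a dispatcher that assigns a waiting customer to a vehicle with enough remaining battery and flags low-battery vehicles for charging. All names, the 20% charging threshold, the long-trip cutoff, and the simple reservation rule for short-distance demand are illustrative assumptions, not the authors' actual policy or parameters.

```python
# Illustrative sketch only: thresholds and the reservation rule are assumptions,
# not the allocation policy derived in the paper.
from dataclasses import dataclass
from typing import List, Optional

CHARGE_THRESHOLD = 0.20   # assumed: below this battery fraction, route vehicle to charging
RESERVE_FOR_SHORT = 1     # assumed: idle vehicles held back for future short trips


@dataclass
class Vehicle:
    vid: int
    battery: float          # remaining battery as a fraction of capacity


@dataclass
class Customer:
    cid: int
    energy_needed: float    # battery fraction the requested trip would consume


def dispatch(vehicles: List[Vehicle], customer: Customer) -> Optional[Vehicle]:
    """Pick a feasible vehicle for the customer, or None if the request should wait.

    A vehicle is feasible if serving the trip keeps it above the charging
    threshold. Among feasible vehicles, the one with the least surplus battery
    is chosen, and a small reserve may be kept idle for short-distance demand.
    """
    feasible = [v for v in vehicles
                if v.battery - customer.energy_needed >= CHARGE_THRESHOLD]
    if not feasible:
        return None  # customer waits (and may eventually be lost)

    # Reserve idle capacity for future short-distance customers: applied only
    # when the current request is a long trip (an assumed, simplified rule).
    is_long_trip = customer.energy_needed > 0.5
    if is_long_trip and len(feasible) <= RESERVE_FOR_SHORT:
        return None

    # Assign the tightest-fitting vehicle so high-battery vehicles remain
    # available for long-distance requests.
    return min(feasible, key=lambda v: v.battery - customer.energy_needed)


def needs_charging(v: Vehicle) -> bool:
    """Flag a vehicle for routing to a nearby charging station below the threshold."""
    return v.battery < CHARGE_THRESHOLD


if __name__ == "__main__":
    fleet = [Vehicle(1, 0.90), Vehicle(2, 0.35), Vehicle(3, 0.15)]
    print(dispatch(fleet, Customer(cid=7, energy_needed=0.1)))   # short trip -> Vehicle 2
    print([v.vid for v in fleet if needs_charging(v)])           # [3]
```

The least-surplus matching mirrors the intuition that high-battery vehicles are best preserved for long-distance customers, and the reservation rule echoes the abstract's finding that holding back idle vehicles for short-distance arrivals can be beneficial even while long-distance customers wait; the paper's actual policy is obtained from the Markov decision process, not from fixed rules like these.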