121–140 of 150 results

    Reinforcement learning for logistics and supply chain management: Methodologies, state of the art, and future opportunities

    Yan, Yimo / Chow, Andy H.F. / Ho, Chin Pang et al. | Elsevier | 2022
    Keywords: Markov decision process

    Dynamic bicycle relocation problem with broken bicycles

    Cai, Yutong / Ong, Ghim Ping / Meng, Qiang | Elsevier | 2022
    Keywords: Markov decision process

    Commuter preferences for a first-mile/last-mile microtransit service in the United States

    Rossetti, Tomás / Broaddus, Andrea / Ruhl, Melissa et al. | Elsevier | 2022
    Keywords: Markov chain Monte Carlo

    Imperfect rail-track inspection scheduling with zero-inflated miss rates

    Altay, Ayça / Baykal-Gürsoy, Melike | Elsevier | 2022
    Keywords: Markov chain Monte Carlo

    Integrated and coordinated relief logistics and road recovery planning problem

    Akbari, Vahid / Sayarshad, Hamid R. | Elsevier | 2022
    Keywords: Markov decision process (MDP)

    Activity-based TOD typology for Seoul transit station areas using smart-card data

    Shin, Yonggeun / Kim, Dong-Kyu / Kim, Eui-Jin | Elsevier | 2022
    Keywords: Hidden Markov Model (HMM)

    The flying sidekick traveling salesman problem with stochastic travel time: A reinforcement learning approach

    Liu, Zeyu / Li, Xueping / Khojandi, Anahita | Elsevier | 2022
    Keywords: Markov decision process

    Multi-agent deep reinforcement learning for adaptive coordinated metro service operations with flexible train composition

    Ying, Cheng-shuo / Chow, Andy H.F. / Nguyen, Hoa T.M. et al. | Elsevier | 2022
    Keywords: Markov decision process

    Vulnerability-based regionalization for disaster management considering storms and earthquakes

    Chen, Yenming J. / Chang, Kuo-Hao / Sheu, Jiuh-Biing et al. | Elsevier | 2022
    Keywords: Markov chain random field (MCRF) kriging

    Driving cycle electrification and comparison

    Ye, Yiming / Zhao, Xuan / Zhang, Jiangfeng | Elsevier | 2023
    Keywords: Markov chain

    Relocation incentives for ride-sourcing drivers with path-oriented revenue forecasting based on a Markov Chain model

    Beojone, Caio Vitor / Geroliminis, Nikolas | Elsevier | 2023
    Keywords: Markov Chain

    How income satisfaction impacts driver engagement dynamics in ride-hailing services

    Chen, Xian / Bai, Shuotian / Wei, Yongqin et al. | Elsevier | 2023
    Keywords: Hidden Markov models

    Adaptive scheduling of mixed bus services with flexible fleet size assignment under demand uncertainty

    Chow, Andy H.F. / Li, Guang-yu / Ying, Cheng-shuo | Elsevier | 2023
    Keywords: Markov decision process

    Experience-based territory planning and driver assignment with predicted demand and driver present condition

    Li, Yifu / Zhou, Chenhao / Yuan, Peixue et al. | Elsevier | 2023
    Keywords: Markov decision process

    Dynamic battery swapping and rebalancing strategies for e-bike sharing systems

    Zhou, Yaoming / Lin, Zeyu / Guan, Rui et al. | Elsevier | 2023
    Keywords: Markov chain

    Markov chain-based traffic analysis on platooning effect among mixed semi- and fully-autonomous vehicles in a freeway lane

    Guan, Hao / Wang, Hua / Meng, Qiang et al. | Elsevier | 2023
    Keywords: Markov chain

    Passenger engagement dynamics in ride-hailing services: A heterogeneous hidden Markov approach

    Chen, Xian / Bai, Shuotian / Wei, Yongqin et al. | Elsevier | 2023
    Keywords: Hidden Markov models

    Ride-hail vehicle routing (RIVER) as a congestion game

    Zhang, Kenan / Mittal, Archak / Djavadian, Shadi et al. | Elsevier | 2023
    Keywords: Markov decision process

    Intent-informed state estimation for tracking guided targets

    Lee, Seokwon / Shin, Hyo-Sang / Tsourdos, Antonios | Elsevier | 2023
    Keywords: Conditionally Markov process