
Huiling Meng, Ningyuan Chen, Xuefeng Gao



Design Experiments to Compare Multi-armed Bandit Algorithms

arXiv:2603.05919v1

Abstract: Online platforms routinely compare multi-armed bandit algorithms, such as UCB and Thompson Sampling, to select the best-performing policy. Unlike standard …
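To make the comparison concrete, here is a minimal, self-contained sketch of the kind of head-to-head experiment the abstract describes: simulating UCB1 and Thompson Sampling on the same Bernoulli bandit instance and comparing average cumulative reward. The arm means, horizon, and number of runs are illustrative assumptions, not values from the paper, and this is a generic simulation, not the authors' experimental design.

```python
import math
import random

def run_ucb1(probs, horizon, rng):
    """UCB1 on Bernoulli arms; returns cumulative reward over the horizon."""
    k = len(probs)
    counts = [0] * k
    sums = [0.0] * k
    total = 0.0
    for t in range(1, horizon + 1):
        if t <= k:
            arm = t - 1  # pull each arm once to initialize
        else:
            # pick the arm with the largest upper confidence bound
            arm = max(range(k), key=lambda a: sums[a] / counts[a]
                      + math.sqrt(2 * math.log(t) / counts[a]))
        reward = 1.0 if rng.random() < probs[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
        total += reward
    return total

def run_thompson(probs, horizon, rng):
    """Thompson Sampling with Beta(1,1) priors; returns cumulative reward."""
    k = len(probs)
    alpha = [1.0] * k  # posterior successes + 1
    beta = [1.0] * k   # posterior failures + 1
    total = 0.0
    for _ in range(horizon):
        # sample a mean estimate per arm and play the argmax
        arm = max(range(k), key=lambda a: rng.betavariate(alpha[a], beta[a]))
        reward = 1.0 if rng.random() < probs[arm] else 0.0
        alpha[arm] += reward
        beta[arm] += 1.0 - reward
        total += reward
    return total

if __name__ == "__main__":
    probs = [0.5, 0.6]            # hypothetical arm means
    horizon, n_runs = 2000, 50    # illustrative experiment size
    rng = random.Random(0)
    ucb = [run_ucb1(probs, horizon, rng) for _ in range(n_runs)]
    ts = [run_thompson(probs, horizon, rng) for _ in range(n_runs)]
    print(f"UCB1 mean cumulative reward: {sum(ucb) / n_runs:.1f}")
    print(f"TS   mean cumulative reward: {sum(ts) / n_runs:.1f}")
```

Averaging over many independent runs, as above, is what makes such a comparison meaningful: a single run of either policy is noisy, and the paper's point is precisely that comparing adaptive policies is subtler than a standard A/B test.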
