Jiantao JIAO (UC Berkeley) – "Sharp Minimax Rates for Imitation Learning"

February 15, 2021 @ 5:00 pm - 6:15 pm
The Statistical Seminar: Every Monday at 2:00 pm.
Time: 5:00 pm – 6:15 pm (exceptionally, instead of the usual 2:00 pm)
Date: February 15, 2021
Place: Online (videoconference)

Abstract: We establish sharp minimax bounds on Imitation Learning (IL) in episodic Markov Decision Processes (MDPs) with state space S. We focus on the known-transition setting, where the learner is given a dataset of N length-H trajectories from a deterministic expert policy and knows the MDP transition dynamics. We show that the minimax rate is Θ(|S|H^{3/2}/N), whereas the unknown-transition setting suffers from a larger sharp rate of Θ(|S|H^2/N) (Rajaraman et al., 2020). Our upper bound is established using the Mimic-MD algorithm of Rajaraman et al. (2020), which we prove to be computationally efficient, and the lower bound is established by proving a two-way reduction between IL and the problem of estimating the value of the unknown expert policy under any given reward function, as well as linear functional estimation with subsampled observations. We further show that under the additional assumption that the expert is optimal for the true reward function, there exists an efficient algorithm, which we term Mimic-Mixture, that provably achieves suboptimality O(1/N) for arbitrary 3-state MDPs with rewards only at the terminal layer. In contrast, no algorithm can achieve suboptimality O(√H/N) with high probability if the expert is not constrained to be optimal. Our work formally establishes the benefit of the expert-optimality assumption in the known-transition setting, whereas Rajaraman et al. (2020) showed that it does not help when transitions are unknown.
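
For quick reference, the rates named in the abstract can be placed side by side; this is only a restatement of the quantities above (amsmath notation), with |S| the number of states, H the horizon, and N the number of expert trajectories:

\[
\underbrace{\Theta\!\left(\frac{|S|\,H^{3/2}}{N}\right)}_{\text{known transitions}}
\quad\text{vs.}\quad
\underbrace{\Theta\!\left(\frac{|S|\,H^{2}}{N}\right)}_{\text{unknown transitions}}
\qquad\text{and}\qquad
\underbrace{O\!\left(\frac{1}{N}\right)}_{\text{optimal expert (Mimic-Mixture)}}
\quad\text{vs.}\quad
\underbrace{O\!\left(\frac{\sqrt{H}}{N}\right)}_{\text{not attainable w.h.p. otherwise}}
\]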

Organizers:
Cristina BUTUCEA (CREST), Alexandre TSYBAKOV (CREST), Karim LOUNICI (CMAP), Zoltan SZABO (CMAP)
Sponsors:
CREST-CMAP