Mathematical Finance Seminar
Date
Time
18:00
Location
Online via Zoom
Jianfeng Zhang (USC)

Set Values of Mean Field Games

When a mean field game satisfies certain monotonicity conditions, the mean field equilibrium is unique and the corresponding value function satisfies the so-called master equation. In general, however, there can be multiple equilibria, and the literature typically studies the asymptotic behavior of individual equilibria of the corresponding $N$-player game. We instead study the set of values over all (mean field) equilibria, which we call the set value of the game. We shall establish two crucial properties of the set value: (i) the dynamic programming principle; (ii) the convergence of the set values of the $N$-player games to that of the mean field game. We emphasize that the set value is very sensitive to the choice of admissible controls. For the dynamic programming principle, one needs to use closed-loop controls (rather than open-loop controls), and a subtle path-dependence issue arises. For the convergence, one has to restrict to the same type of equilibria for the $N$-player game and for the mean field game. The talk is based on joint work with Zach Feinstein and Birgit Rudloff, and on ongoing joint work with Melih Iseri.
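
As a rough illustration of the central object (schematic notation of our own, not taken from the talk): writing $J(t,\mu;\varphi)$ for the value of a representative player when the game starts at time $t$ from population distribution $\mu$ and the players follow a closed-loop mean field equilibrium $\varphi$, the set value may be sketched as
\[
\mathbb{V}(t,\mu) \;:=\; \big\{\, J(t,\mu;\varphi) \,:\, \varphi \text{ is a mean field equilibrium on } [t,T] \,\big\}.
\]
In this notation, the two results above say, roughly, that $\mathbb{V}(t,\mu)$ can be recovered by concatenating equilibria on $[t,s]$ with the set values $\mathbb{V}(s,\cdot)$ at an intermediate time $s$, and that the analogous $N$-player set values $\mathbb{V}^N$ converge to $\mathbb{V}$ in a suitable sense as $N \to \infty$.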