Zijian Guo
Nonstandard Inference

Summary (LLM read my papers; human bias-correction applied)

A central theme of my work is reliable uncertainty quantification in modern problems where standard Wald intervals built from the usual “estimate ± standard error” recipe can break down. This happens when the target is defined through an optimization (e.g., a worst-case/robust objective), when the procedure involves data-adaptive steps (e.g., model selection or causal discovery), or when key regularity conditions fail, so that the usual smooth asymptotic normal approximations are no longer accurate. In these settings, I develop inference methods, often based on perturbation/resampling or other stability-driven constructions, that remain valid even when the limiting distribution is nonsmooth or non-normal, or when nuisance components are learned flexibly and may converge slowly. Intuitively, the goal is to provide confidence statements that continue to mean what they say (correct coverage) in the realistic regimes where robustness, selection, and complex learning are essential but classical asymptotic theory is overly optimistic.
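To make the perturbation/resampling idea concrete, here is a minimal, illustrative Python sketch of a multiplier (“perturbation”) bootstrap interval for a simple mean. It only shows the generic reweighting construction, not any of the paper-specific procedures; the function name `perturbation_ci` and its parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturbation_ci(x, n_boot=2000, alpha=0.05):
    """Multiplier ("perturbation") bootstrap interval for the mean of x.

    Each replicate reweights the observations with i.i.d. positive
    multipliers (here exponential, mean 1) instead of resampling rows;
    the spread of the reweighted estimates serves as a proxy for the
    sampling variability of the estimator.
    """
    boots = np.empty(n_boot)
    for b in range(n_boot):
        w = rng.exponential(scale=1.0, size=len(x))  # mean-1 random weights
        boots[b] = np.average(x, weights=w)          # reweighted estimate
    lo, hi = np.quantile(boots, [alpha / 2, 1 - alpha / 2])
    return x.mean(), (lo, hi)

# Toy data: a heavy-tailed sample where a plain normal approximation is shaky
x = rng.standard_t(df=3, size=200)
est, (lo, hi) = perturbation_ci(x)
print(f"estimate = {est:.3f}, 95% perturbation interval = ({lo:.3f}, {hi:.3f})")
```

The appeal of such reweighting schemes is that the same construction can be repeated around more complicated, data-adaptive estimators, where closed-form standard errors are unavailable or unreliable.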
Underline indicates supervised students; # indicates equal contribution; * indicates alphabetical ordering; ✉ indicates corresponding authorship.