Speaker: 张琼 (Zhang Qiong)

Abstract:
We have entered an era where deep learning and foundation models are transforming data analysis, increasingly handling prediction tasks that were traditionally the domain of statistical modeling. This rapid shift raises a fundamental question: How should statistics evolve in a landscape dominated by large-scale AI? In this talk, I argue that rather than becoming obsolete, traditional statistical principles are essential for overcoming the natural limits of brute-force scaling. I present a research program driven by a dual perspective: applying statistical thinking to solve engineering bottlenecks in modern AI, and conversely, leveraging AI paradigms to inspire new statistical methodologies. I will illustrate this synergy through three chapters of my research:

- Statistical Efficiency for AI Systems: I first demonstrate how Mixture Reduction grounded in optimal transport removes computational redundancy, enabling the compression of 3D computer graphics models by 90% while preserving geometric fidelity. I then bring the same rigor to Federated Learning, resolving label switching across clients and using Empirical Likelihood to turn central servers into "intelligent routers" that exploit, rather than suppress, data heterogeneity (toy sketches of both statistical building blocks follow this list).
- AI Inspires New Statistics: Reversing the direction of influence, I explore how In-Context Learning (ICL) redefines statistical inference, showing that foundation models trained via ICL can outperform specialized statistical methods across a wide range of tasks (a minimal sketch of the setup appears after this list).
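
To make the first building block concrete, here is a minimal sketch of optimal-transport-flavored Gaussian mixture reduction: components are greedily merged in order of their closed-form 2-Wasserstein distance, using a moment-preserving merge. This is a generic illustration under my own simplifications, not the speaker's actual algorithm, and the function names are hypothetical.

```python
# Toy Gaussian mixture reduction: greedily merge the pair of components
# closest in 2-Wasserstein distance until `target` components remain.
# Illustrative only; the greedy strategy is a simplification, not the
# talk's method.
import numpy as np
from scipy.linalg import sqrtm

def w2_sq(m1, S1, m2, S2):
    """Closed-form squared 2-Wasserstein distance between two Gaussians."""
    r = sqrtm(S2)
    cross = np.real(sqrtm(r @ S1 @ r))
    return float(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross))

def merge(c1, c2):
    """Moment-preserving merge of two weighted components (w, mean, cov)."""
    (w1, m1, S1), (w2, m2, S2) = c1, c2
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    d1, d2 = m1 - m, m2 - m
    S = (w1 * (S1 + np.outer(d1, d1)) + w2 * (S2 + np.outer(d2, d2))) / w
    return w, m, S

def reduce_mixture(comps, target):
    """comps: list of (weight, mean, cov); returns a reduced list."""
    comps = list(comps)
    while len(comps) > target:
        pairs = [(w2_sq(a[1], a[2], b[1], b[2]), i, j)
                 for i, a in enumerate(comps)
                 for j, b in enumerate(comps) if i < j]
        _, i, j = min(pairs)                     # closest pair under W2
        merged = merge(comps[i], comps[j])
        comps = [c for k, c in enumerate(comps) if k not in (i, j)] + [merged]
    return comps

# Example: compress a 5-component 2-D mixture down to 2 components.
rng = np.random.default_rng(0)
mix = [(0.2, rng.normal(size=2), 0.1 * np.eye(2)) for _ in range(5)]
print(reduce_mixture(mix, target=2))
```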
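The federated-learning piece leans on empirical likelihood; the sketch below shows only that statistical core, namely the EL ratio statistic for a univariate mean with a Newton solver for the Lagrange multiplier. How the talk's method combines this with client routing is not shown, and the connection drawn here is an assumption on my part.

```python
# Minimal sketch of the empirical-likelihood (EL) core: the EL ratio
# statistic for H0: E[X] = mu, solving for the Lagrange multiplier by
# damped Newton iterations. Toy code only.
import numpy as np

def el_stat(x, mu, iters=50):
    """-2 log empirical-likelihood ratio for the mean of x."""
    z = x - mu
    lam = 0.0
    for _ in range(iters):
        denom = 1.0 + lam * z                     # implied EL weights are 1/(n*denom)
        step = np.sum(z / denom) / np.sum(z**2 / denom**2)
        while np.any(1.0 + (lam + step) * z <= 0):
            step /= 2.0                           # damp to keep weights feasible
        lam += step
    return 2.0 * np.sum(np.log1p(lam * z))

rng = np.random.default_rng(1)
x = rng.normal(loc=0.3, size=200)
# Under H0 the statistic is asymptotically chi-squared with 1 d.o.f.,
# so the first value should be small and the second large.
print(el_stat(x, 0.3), el_stat(x, 0.0))
```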
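Finally, the in-context-learning paradigm from the second bullet can be caricatured in a few lines: a small network is meta-trained on a stream of simulated regression tasks, then predicts on a new task directly from its context set, with no per-task fitting. The architecture and task family below are my own toy choices (a fixed-size context and an MLP rather than a transformer), not the models discussed in the talk.

```python
# Toy in-context learning: meta-train on simulated linear-regression
# tasks; at test time the network "infers" from the context alone.
import torch
import torch.nn as nn

N_CTX, DIM = 16, 4                       # context size and covariate dimension

def sample_task(batch):
    """Each task draws a fresh weight vector w, then y = x @ w + noise."""
    w = torch.randn(batch, DIM, 1)
    xc = torch.randn(batch, N_CTX, DIM)
    yc = (xc @ w).squeeze(-1) + 0.1 * torch.randn(batch, N_CTX)
    xq = torch.randn(batch, DIM)
    yq = (xq.unsqueeze(1) @ w).squeeze(-1).squeeze(-1) + 0.1 * torch.randn(batch)
    return xc, yc, xq, yq

class ICLNet(nn.Module):
    """Maps (context pairs, query x) -> prediction for the query y."""
    def __init__(self):
        super().__init__()
        in_dim = N_CTX * (DIM + 1) + DIM   # flattened context + query
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, 1),
        )

    def forward(self, xc, yc, xq):
        ctx = torch.cat([xc, yc.unsqueeze(-1)], dim=-1).flatten(1)
        return self.net(torch.cat([ctx, xq], dim=-1)).squeeze(-1)

model = ICLNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):                  # meta-training over simulated tasks
    xc, yc, xq, yq = sample_task(64)
    loss = ((model(xc, yc, xq) - yq) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# A new task at test time: prediction comes from the context alone.
xc, yc, xq, yq = sample_task(1)
print("prediction:", model(xc, yc, xq).item(), " truth:", yq.item())
```

The point of the sketch is the protocol, not the architecture: all task-specific "fitting" happens implicitly in one forward pass over the context, which is the sense in which ICL amortizes statistical inference.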
This talk aims to demonstrate that the future of data science lies in a deep integration where statistical rigor provides efficiency and trustworthiness to AI, while modern AI systems expand the boundaries of what is statistically possible.