

Korean AI Association


Domestic Conference

Keynote & Invited Talks
Plenary Talk 1 

 

 

Prof. Robert D. Nowak (University of Wisconsin-Madison)

 

Title: Deep Learning and Foundation Models: Theoretical Insights and Label-Efficient Learning
 
Abstract
This talk explores the intersection of deep learning theory with practical advancements in label-efficient learning. We begin by presenting a theory that characterizes the types of functions learned by ReLU networks, revealing that deep neural networks learn compositions of functions of bounded second-order variation in the Radon transform domain.  This sheds new light on the role of weight decay regularization and provides principled approaches to more efficient training methods and network compression.
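As a rough, self-contained illustration of where weight decay enters training (the toy ReLU model and sine-regression data below are illustrative assumptions, not the setup analyzed in the talk), a minimal PyTorch sketch might look like:

import torch
import torch.nn as nn

# Minimal sketch: train a small ReLU network with weight decay, the
# regularizer whose role the theory above characterizes. The toy
# architecture and data are assumptions made only for illustration.
model = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.SGD(model.parameters(), lr=1e-2, weight_decay=1e-4)
loss_fn = nn.MSELoss()

x = torch.linspace(-1, 1, 128).unsqueeze(1)  # toy inputs
y = torch.sin(3 * x)                         # toy regression targets

for step in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()  # weight decay shrinks every weight slightly at each step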
 
In the context of label efficiency, the shift toward large pre-trained foundation models brings new challenges. Traditional theories underpinning label-efficient methods, such as semi-supervised and active learning, were designed for training classical models rather than today's fine-tuning strategies. Addressing this gap, the LabelBench project, a collaborative, open-source initiative, provides a testbed for optimizing label-efficient fine-tuning of vision models. Early results show that combining strategies can significantly boost label efficiency, revealing new approaches for training large models with limited labeled data.
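For readers unfamiliar with label-efficient selection, one of the simplest query strategies benchmarks of this kind compare is uncertainty sampling. The sketch below is a generic illustration only and does not reflect LabelBench's actual interfaces:

import numpy as np

# Generic uncertainty-sampling query step: pick the unlabeled examples
# whose predicted class probabilities are least confident.
def select_queries(probs: np.ndarray, batch_size: int) -> np.ndarray:
    confidence = probs.max(axis=1)           # max class probability per example
    return np.argsort(confidence)[:batch_size]

# Example: assumed predictions from a fine-tuned model on 5 unlabeled
# points with 3 classes; the two least-confident points get labeled next.
probs = np.array([[0.90, 0.05, 0.05],
                  [0.40, 0.35, 0.25],
                  [0.60, 0.30, 0.10],
                  [0.34, 0.33, 0.33],
                  [0.80, 0.10, 0.10]])
print(select_queries(probs, batch_size=2))   # -> [3 1]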
 

Bio

Robert Nowak is the Grace Wahba Professor of Data Science and Keith and Jane Nosbusch Professor in Electrical and Computer Engineering at the University of Wisconsin-Madison. His research focuses on machine learning, optimization, and signal processing. He serves on the editorial boards of the SIAM Journal on the Mathematics of Data Science and the IEEE Journal on Selected Areas in Information Theory.
 
Plenary Talk 2

 

 
Prof. Xiao Wang (Purdue University)
 
Title: Harnessing AI for Bayesian Inference: From Neural Conformal Inference to Neural Adaptive Empirical Bayes
 
Abstract
The fusion of artificial intelligence and Bayesian inference has unlocked new possibilities for tackling complex statistical challenges. This talk delves into two AI-driven methodologies that advance Bayesian inference with neural network capabilities. The first, Neural Conformal Inference, offers a likelihood-free approach that maps observed data to model parameters using deep neural networks. By circumventing traditional discretization errors and integrating conformal prediction, this new framework ensures rigorous uncertainty quantification and reliable posterior coverage. The second approach, Neural Adaptive Empirical Bayes, constructs flexible priors using implicit generative models and combines this with variational inference to optimize hyperparameters directly from data. This adaptive strategy enhances predictive accuracy and provides comprehensive uncertainty measures, addressing the challenges of high-dimensional, complex data structures.
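To make the coverage idea concrete, here is a minimal sketch of standard split conformal prediction. It only illustrates the kind of finite-sample guarantee conformal methods provide; it is not the Neural Conformal Inference procedure itself, and the calibration residuals below are simulated assumptions:

import numpy as np

# Split conformal prediction interval from held-out calibration residuals.
def split_conformal_interval(resid_cal: np.ndarray, y_pred: float, alpha: float = 0.1):
    n = len(resid_cal)
    # Finite-sample-corrected quantile of the absolute calibration residuals.
    q = np.quantile(resid_cal, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n))
    return y_pred - q, y_pred + q

# Example with assumed (simulated) calibration residuals and a new prediction of 2.0.
resid_cal = np.abs(np.random.default_rng(0).normal(size=200))
print(split_conformal_interval(resid_cal, y_pred=2.0, alpha=0.1))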
 
Bio
 
Education 
2005 Ph.D. in Statistics, University of Michigan, Ann Arbor, MI
2000 M.S. in Mathematics, University of Science and Technology of China
1997 B.S. in Mathematics, University of Science and Technology of China
 
Research
AI
Machine Learning
Nonparametric Statistics
Functional Data Analysis
Reliability
 
Honors & Awards
Professional Achievement Award, Purdue University, 2022-2023
Elected Fellow of the American Statistical Association (ASA), 2021
Elected Fellow of the Institute of Mathematical Statistics (IMS), 2021
Regina and Norman F. Caroll Research Award, Purdue University, 2017-2018
 
Invited Talk

 

Title: Intelligent Work Collaboration Solution (Brity Works + Copilot)

 

Abstract

This talk examines how generative-AI-based collaboration solutions can transform enterprise work. We introduce real-world deployment cases of the collaboration solution (Brity Works) and the generative AI service (Brity Copilot) currently in use across Samsung Group and various other companies, and consider together what new work experiences will emerge as the Copilot service evolves into a Personal Agent.

 

Bio

2023.12 ~ present      Samsung SDS, Head of IW Business Team / Vice President
2021.01 ~ 2023.11      Samsung SDS, Group Leader, C&C Product Planning Group
2017.11 ~ 2020.12      Samsung SDS, Product Leader, Knox Portal Business Group
2014.04 ~ 2017.10      Samsung SDS, Development Leader, Conferencing Solution Group