

Korean AI Association


Domestic Conference

Tutorials
 
Prof. Ernest K. Ryu (Seoul National University)
 
Title: Optimization for ML
 

Abstract

Optimization plays a fundamental role in training neural networks in machine learning. In this tutorial, we will introduce the fundamentals of gradient- and stochastic gradient-based optimization algorithms. We will discuss the convergence analysis of SGD in the convex and non-convex settings; continuous-time analyses of SGD through ODEs and SDEs; and momentum- and non-momentum-based accelerations.
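As a toy illustration of the stochastic gradient methods the tutorial covers, here is a minimal sketch of SGD with heavy-ball momentum on a least-squares problem; the data, step size, momentum coefficient, and batch size are illustrative choices, not part of the tutorial material.

```python
# Minimal sketch: SGD with heavy-ball momentum on 0.5*||Ax - b||^2.
# All problem data and hyperparameters below are illustrative.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 20))                  # design matrix
x_true = rng.standard_normal(20)
b = A @ x_true + 0.1 * rng.standard_normal(1000)     # noisy targets

x = np.zeros(20)                                     # iterate
v = np.zeros(20)                                     # momentum buffer
lr, beta, batch = 0.01, 0.9, 32

for step in range(2000):
    idx = rng.integers(0, 1000, size=batch)          # sample a mini-batch
    grad = A[idx].T @ (A[idx] @ x - b[idx]) / batch  # stochastic gradient estimate
    v = beta * v + grad                              # heavy-ball (momentum) update
    x = x - lr * v

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

Setting beta to 0 recovers plain SGD, which makes the effect of the momentum term easy to observe in this small example.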

 

Bio

Gene Golub Best Thesis Award, 2016 
Assistant Professor, Department of Mathematical Sciences, Seoul National University 
Assistant Adjunct Professor, Department of Mathematics, University of California, Los Angeles 
Ph.D. in Computational and Mathematical Engineering, Stanford University
 
Prof. Jong Chul Ye (KAIST Kim Jaechul Graduate School of AI)
 
Title: Diffusion Models
 

Abstract

Recently, score-based models and denoising diffusion probabilistic models (DDPMs) have gained wide interest as a new class of generative models that achieves surprisingly high sample quality without adversarial training. Among many works, Song et al. (2021b) generalized discrete score-matching procedures to a continuous stochastic differential equation (SDE), which in fact also subsumes diffusion models into the same framework. Score-based models perturb the data distribution according to the forward SDE by injecting Gaussian noise, arriving at a tractable distribution (e.g., an isotropic Gaussian distribution). In order to sample from the data distribution, one can train a neural network to estimate the gradient of the log data distribution (i.e., the score function, ∇_x log p(x)) and use it to solve the reverse SDE numerically. However, diffusion models are inherently slow to sample from, requiring a few thousand iterations to generate images from pure Gaussian noise. In this tutorial, we review the basic mathematical principles of score-based diffusion models and their applications to various computer vision tasks, such as image synthesis, image translation, and inverse problems. Then, recent research activities to accelerate the sampling process are reviewed.
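To make the forward/reverse SDE picture concrete, here is a minimal sketch of Euler-Maruyama sampling for the variance-preserving SDE dx = -0.5 β(t) x dt + sqrt(β(t)) dW. The noise schedule is an illustrative choice, and the score function is a hypothetical stand-in: for toy data distributed as N(0, I) the perturbed marginals remain N(0, I), so ∇_x log p_t(x) = -x; in practice this would be a trained network s_θ(x, t).

```python
# Minimal sketch of score-based sampling with the variance-preserving SDE
#   dx = -0.5 * beta(t) * x dt + sqrt(beta(t)) dW.
# The score below is a placeholder: for data ~ N(0, I) the perturbed marginals
# stay N(0, I), so grad_x log p_t(x) = -x. A real model would use a network
# s_theta(x, t) trained by denoising score matching.
import numpy as np

rng = np.random.default_rng(0)
beta = lambda t: 0.1 + 19.9 * t          # linear noise schedule (illustrative)
score = lambda x, t: -x                  # stand-in for the learned score s_theta(x, t)

dim, n_steps = 2, 1000
dt = 1.0 / n_steps
x = rng.standard_normal(dim)             # start from the tractable prior N(0, I)

# Integrate the reverse-time SDE
#   dx = [-0.5*beta(t)*x - beta(t)*score(x, t)] dt + sqrt(beta(t)) dW_bar
# backwards from t = 1 to t = 0 with Euler-Maruyama.
for i in range(n_steps, 0, -1):
    t = i / n_steps
    drift = -0.5 * beta(t) * x - beta(t) * score(x, t)
    x = x - drift * dt + np.sqrt(beta(t) * dt) * rng.standard_normal(dim)

print("sample:", x)                      # approximately a draw from the toy data distribution
```

The thousand-step loop above is the kind of iteration cost the abstract refers to; the acceleration methods reviewed in the tutorial aim to reduce it by orders of magnitude.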
 

Bio

Jong Chul Ye is a Professor at the Kim Jaechul Graduate School of Artificial Intelligence (AI) of the Korea Advanced Institute of Science and Technology (KAIST), Korea. He received the B.Sc. and M.Sc. degrees from Seoul National University, Korea, and the Ph.D. from Purdue University, West Lafayette. Before joining KAIST, he worked at Philips Research and GE Global Research in New York. He has served as an associate editor of IEEE Transactions on Image Processing and an editorial board member for Magnetic Resonance in Medicine. He is currently an associate editor for IEEE Transactions on Medical Imaging and a Senior Editor of IEEE Signal Processing Magazine. He is an IEEE Fellow, was the Chair of the IEEE SPS Computational Imaging TC, and was an IEEE EMBS Distinguished Lecturer. He was a General Co-chair (with Mathews Jacob) of the IEEE Symposium on Biomedical Imaging (ISBI) 2020. His research interests are in machine learning for biomedical imaging and computer vision.

Prof. Sungsoo Ahn (POSTECH)
 
Title: Graph Neural Network
 

Abstract

Deep learning has produced significant breakthroughs in several fields such as computer vision, natural language processing, and speech recognition. Such research is based on neural networks designed for Euclidean-structured data such as images, text, or acoustic signals. In this tutorial, I will introduce geometric deep learning (GDL): a framework that generalizes neural networks to non-Euclidean structured data such as graphs and manifolds. I will also introduce how GDL helps solve drug discovery problems as a framework for handling molecular graphs and 3D protein structures. In particular, I will briefly introduce the use of GDL for molecular property prediction, molecular design, retrosynthesis, and protein structure prediction (such as AlphaFold).
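To give a concrete sense of what a graph neural network layer computes, here is a minimal NumPy sketch of one GCN-style message-passing step (neighbor aggregation with self-loops, a linear map, and a ReLU) followed by a mean-pooling readout. The graph, features, and randomly initialized weights are illustrative; libraries such as PyTorch Geometric or DGL provide full implementations.

```python
# Minimal sketch of one GCN-style message-passing layer on a toy graph.
# The graph, node features, and weights are illustrative (randomly initialized).
import numpy as np

rng = np.random.default_rng(0)

# Toy molecular-style graph: 4 nodes, undirected edges given as an adjacency matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.standard_normal((4, 8))              # node features (e.g., atom descriptors)
W = 0.1 * rng.standard_normal((8, 16))       # layer weights (would be learned)

A_hat = A + np.eye(4)                        # add self-loops
D_inv = np.diag(1.0 / A_hat.sum(axis=1))     # normalize by node degree
H = np.maximum(D_inv @ A_hat @ X @ W, 0.0)   # aggregate neighbors, transform, ReLU

graph_embedding = H.mean(axis=0)             # readout for a graph-level prediction
print(graph_embedding.shape)                 # (16,)
```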

 

Bio

Assistant Professor, Department of Computer Science and Engineering and Graduate School of Artificial Intelligence, POSTECH 
Research associate, MBZUAI 
Ph.D. in Electrical Engineering, KAIST
 
Prof. Jaegul Choo (KAIST Kim Jaechul Graduate School of AI)
 
Title: Bias and Domain Adaptation in Computer Vision
 

Abstract

In image classification models, the bias problem refers to a model performing classification by relying not on the intrinsic characteristics of a class (e.g., a bird's wings and beak for the bird class), but on incidental attributes that co-occur with the class at high frequency in the training data (e.g., the blue-sky background against which a bird is flying, or the tree on which a bird is perched). In this tutorial, we introduce various methodologies for addressing such image bias problems and discuss future research directions.
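As one illustration of this problem setting (a generic sketch only, not the specific methods presented in the tutorial), a common debiasing idea is to reweight the loss of bias-conflicting samples, i.e., samples whose combination of class and spurious attribute is rare in the training data, so the classifier cannot rely solely on the frequently co-occurring attribute. The labels and attribute annotations below are synthetic.

```python
# Generic sketch: inverse-frequency loss weights for (class, spurious attribute)
# groups, so bias-conflicting samples (rare combinations) count more in training.
# The data below is synthetic and only illustrates the reweighting idea.
import numpy as np

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=1000)               # class label (e.g., bird vs. non-bird)
a = np.where(rng.random(1000) < 0.9, y, 1 - y)  # spurious attribute (e.g., sky background),
                                                # aligned with the class 90% of the time

counts = np.zeros((2, 2))
np.add.at(counts, (y, a), 1)                     # group sizes for each (class, attribute) pair
weights = len(y) / (counts.size * counts[y, a])  # inverse-frequency per-sample loss weights

print(counts)                        # bias-aligned groups are roughly 9x larger
print(weights.min(), weights.max())  # conflicting samples receive ~9x larger loss weight
```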

 

Bio

Education
2001: B.S., Department of Electrical Engineering, Seoul National University
2009: M.S., Electrical and Computer Engineering, Georgia Tech
2013: Ph.D., Computational Science and Engineering, Georgia Tech
Career
2020.03 – present: Associate Professor, Graduate School of AI, KAIST
2019.09 – 2020.02: Associate Professor, Department of Artificial Intelligence, Korea University
2015.03 – 2019.08: Assistant Professor, Department of Computer Science, Korea University
2011.12 – 2015.02: Research Scientist, Georgia Tech 
Awards
2020: IEEE VIS '20 10-Year Test-of-Time Award
2016: ICDM’16 Best Student Paper Award 
2015: Naver Young Faculty Award