

Korean AI Association


Domestic Conference

Tutorials
 
Prof. Seungryul Baek (UNIST)
 
Title: Recent Advances Towards Efficient Transformers in Vision Domain
 
Abs:
Transformers have already replaced LSTMs in the NLP domain, and they are also replacing CNNs in the vision domain. However, their huge computational cost may hinder their practical use in real-time industrial applications. This tutorial will first briefly introduce the basics of Transformers and will then cover recent techniques that aim to make Transformers efficient in the context of vision applications.
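As background on why efficiency matters here, below is a minimal NumPy sketch contrasting standard softmax self-attention, whose cost grows quadratically with sequence length, with a kernelized linear-attention approximation (in the spirit of Katharopoulos et al., 2020). The feature map and shapes are illustrative assumptions, not the tutorial's specific methods.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard scaled dot-product attention: materializes the full n x n
    # score matrix, so cost grows quadratically with sequence length n.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                        # (n, n)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1e-6):
    # Kernelized attention: with a positive feature map phi, the output
    # phi(Q) (phi(K)^T V) / (phi(Q) phi(K)^T 1) costs O(n d^2), because
    # the (d, d_v) summary phi(K)^T V is independent of n.
    Qf, Kf = phi(Q), phi(K)
    kv = Kf.T @ V                                        # (d, d_v)
    z = Qf @ Kf.sum(axis=0)                              # (n,) normalizer
    return (Qf @ kv) / z[:, None]

n, d = 16, 8
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
print(softmax_attention(Q, K, V).shape, linear_attention(Q, K, V).shape)
```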
 
Bio:
Seungryul Baek is an associate professor at the Artificial Intelligence Graduate School (AIGS) and the Department of Computer Science and Engineering (CSE) of UNIST. He obtained his B.S. (2009) and M.S. (2011) degrees from the Dept. of Electrical Engineering at KAIST and his Ph.D. degree (2020) from the Dept. of Electrical and Electronic Engineering at Imperial College London. Before beginning his Ph.D., he worked at the DMC Research Center of Samsung Electronics for four years (Feb. 2011 - Feb. 2015).
 

 
 
Prof. Hyounghun Kim (UNIST)
 
Title: Large Language Models: Ability and Alignment
 
Abs:
Large Language Models (LLMs) demonstrate remarkable abilities across a variety of tasks, highlighting significant advancements in AI technology. This presentation will begin by exploring the foundational concepts of LLMs, including language modeling and the attention mechanisms essential for their development and training. Building on this foundation, we will discuss the emergent abilities of LLMs, particularly their innate reasoning capabilities. We will then highlight how these abilities are not only intrinsic but can also be significantly amplified through innovative techniques such as advanced prompt engineering and instruction-tuning.
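As a pointer to the language-modeling foundation mentioned above, here is a minimal sketch of the next-token prediction objective used to train LLMs; the toy vocabulary and random stand-in logits are illustrative assumptions, not a model from the talk.

```python
import numpy as np

def next_token_loss(logits, tokens):
    # Causal LM objective: logits at position t predict the token at t+1,
    # and the loss is the mean cross-entropy over all predicted positions.
    pred, targets = logits[:-1], tokens[1:]
    logp = pred - np.log(np.exp(pred).sum(axis=-1, keepdims=True))
    return -logp[np.arange(len(targets)), targets].mean()

vocab_size, seq_len = 50, 10
rng = np.random.default_rng(0)
tokens = rng.integers(0, vocab_size, size=seq_len)     # a toy token sequence
logits = rng.standard_normal((seq_len, vocab_size))    # stand-in model outputs
print(f"toy next-token cross-entropy: {next_token_loss(logits, tokens):.3f}")
```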
 
Bio:
Hyounghun Kim is an assistant professor in the Artificial Intelligence Graduate School and the Department of Computer Science and Engineering at the Ulsan National Institute of Science and Technology (UNIST). He earned his Ph.D. from the Department of Computer Science at UNC-Chapel Hill, advised by Prof. Mohit Bansal. He received an M.S. in Computer Science from USC and a B.S. in Electrical and Electronic Engineering from Yonsei University. His research interests are in natural language processing and multimodal learning, focusing on embodied AI, commonsense reasoning, VQA/VideoQA, visual dialogue, and AI assistant applications. He has served as an action editor (area chair), reviewer, and program committee member for ACL Rolling Review, top conferences, and workshops. He also worked for Adobe Research and Amazon Alexa AI as an intern, and for Samsung Electronics as a software engineer.
 

 
 
Prof. Noseong Park (KAIST)
 
Title: Physics-inspired Deep Learning
 
Abs:
Scientific knowledge, written in the form of differential equations, plays a vital role in various deep learning fields. In this talk, I will present a graph neural network (GNN) design based on reaction-diffusion equations, which addresses the notorious oversmoothing problem of GNNs. Since the self-attention of Transformers can also be viewed as a special case of graph processing, I will present how we can enhance Transformers in a similar way. I will also introduce a spatiotemporal forecasting model based on neural controlled differential equations (NCDEs). NCDEs were designed to process irregular time series in a continuous manner; for spatiotemporal processing, they need to be combined with a spatial processing module, i.e., a GNN. I will show how this can be done.
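To make the reaction-diffusion idea concrete, here is a minimal sketch of a feature-update step that Euler-discretizes dX/dt = -L X + f(X) on a toy graph: the diffusion term smooths features along edges, while a node-wise reaction term counteracts oversmoothing. The tanh reaction term and coefficients are assumptions for illustration, not the exact model from the talk.

```python
import numpy as np

def normalized_laplacian(A):
    # L = I - D^{-1/2} A D^{-1/2} for an adjacency matrix A.
    deg = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def reaction_diffusion_step(X, L, dt=0.1, alpha=1.0, beta=0.5):
    # Explicit Euler step of dX/dt = -alpha * L X + beta * f(X):
    # diffusion smooths features over edges; the node-wise reaction
    # term resists the collapse of all node features to one value.
    return X + dt * (alpha * (-L @ X) + beta * np.tanh(X))

# Toy example: a 4-node path graph with 3-dimensional node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = normalized_laplacian(A)
X = np.random.default_rng(0).standard_normal((4, 3))
for _ in range(5):
    X = reaction_diffusion_step(X, L)
print(X.round(3))
```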
 
Bio:
Noseong Park is currently a tenured associate professor in the School of Computing and the director of the Big Data Analytics and Learning Laboratory at the Korea Advanced Institute of Science and Technology (KAIST) in South Korea. Before joining KAIST, he was a tenured associate professor at Yonsei University from 2020 to 2024, an assistant professor at George Mason University from 2018 to 2019, and an assistant professor at the University of North Carolina at Charlotte from 2016 to 2018. He received his Ph.D. in computer science in 2016 from the University of Maryland, College Park, under the supervision of Prof. V. S. Subrahmanian. He received his Master's degree from KAIST and his Bachelor's degree from Soongsil University. He has extensive experience in the core fields of data mining and machine learning and their applications to scientific machine learning, deep generative learning, spatiotemporal processing, time series processing, recommender systems, etc. He has published more than 50 papers in top-tier venues, such as NeurIPS, ICLR, ICML, KDD, WWW, VLDB, ICDM, WSDM, SIGIR, IJCAI, AAAI, and so on. In particular, his VLDB paper, which proposed a tabular data synthesis method called TableGAN, is now the most cited among all VLDB papers published within the last five years. He received the best student paper award, as an advisor, from IEEE BigData 2022, the best demonstration runner-up award from IJCAI 2019, and the best paper runner-up award from ASONAM 2016. He also received two medals in the Samsung Humantech Paper Award, the most prestigious such award in Korea, in 2022 and 2023. While in the US, he secured research grants totaling $530K as a PI or co-PI from the NSF, the Office of Naval Research, and others; including the grants he has received in Korea since 2020, his total research funding amounts to $2M. He has also served (or is now serving) as a PC member in many top-tier venues, such as ICML, ICLR, NeurIPS, KDD, WWW, AAAI, ICDM, ICWSM, and so on.

 
 
Prof. Sungbin Lim (Korea University)
 
Title: Diffusion Models: Foundation and Algorithm
 
Abs:
Diffusion models have recently gained significant attention in probabilistic machine learning due to their theoretical properties and impressive applications in generative AI, including Stable Diffusion and DALL-E. This tutorial provides an introduction to the theory and algorithms underlying diffusion models.
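As a taste of the foundations the tutorial covers, below is a minimal sketch of the DDPM forward (noising) process and the noise-prediction training objective; the schedule values and the zero-predicting stand-in denoiser are illustrative assumptions, not the tutorial's material.

```python
import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)          # linear variance schedule
alphas_bar = np.cumprod(1.0 - betas)        # cumulative product \bar{alpha}_t

def q_sample(x0, t, eps):
    # Closed form of the forward process:
    # x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps
    a = alphas_bar[t]
    return np.sqrt(a) * x0 + np.sqrt(1.0 - a) * eps

rng = np.random.default_rng(0)
x0 = rng.standard_normal(8)                 # a toy "data" sample
t = int(rng.integers(0, T))                 # a random diffusion step
eps = rng.standard_normal(8)                # Gaussian noise
x_t = q_sample(x0, t, eps)

# Training minimizes E || eps - eps_theta(x_t, t) ||^2 over random t;
# eps_theta here is a stand-in denoiser that always predicts zero noise.
eps_theta = lambda x, t: np.zeros_like(x)
loss = np.mean((eps - eps_theta(x_t, t)) ** 2)
print(f"t={t}, toy denoising loss={loss:.3f}")
```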
 
Bio:
Sungbin Lim is currently an Assistant Professor in the Department of Statistics at Korea University. He earned a Ph.D. in Mathematics from Korea University in 2016 under the supervision of Prof. Kyeong-Hun Kim, worked in industry as a senior AI scientist until 2019, and returned to academia in 2020. During his doctoral studies, he focused primarily on stochastic partial differential equations and diffusion theory. Since then, he has developed a keen interest in probabilistic machine learning, including deep generative modeling, and is also collaborating with LG AI Research on AI for Science projects.
 
- 2023 - Current: Assistant Professor, Dept. of Statistics, Korea University.
- 2020 - 2023: Assistant Professor, Dept. of IE & AIGS, UNIST.
- 2018 - 2019: Research Scientist, Kakao Brain.
- 2016 - 2017: Data Scientist, Samsung Fire & Marine Insurance.
- 2016: Ph.D. in Mathematics, Korea University.
 

 
 
Prof. Hwanjun Song (KAIST)
 
Title: Innovation in Text Summarization: The Role and Prospects of Large Language Models
 
Abs:
This talk covers three key topics in text summarization. First, along with an introduction to text summarization methods, we look at document summarization using conventional generative language models and document summarization using large language models. Second, we analyze hallucination, completeness, and conciseness, which are reasoning limitations of large language models, and discuss the limitations of traditional summarization evaluation methods; we then examine automatic evaluation methods that leverage large language models. Third, we cover the construction of benchmarking data, the benchmarking and analysis of large language models, and the research challenges that follow.
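To illustrate the LLM-based automatic evaluation mentioned above, here is a minimal "LLM-as-judge" sketch that scores the three discussed failure modes; the prompt wording, the 1-5 scale, and the call_llm stand-in are hypothetical assumptions, not the speaker's protocol.

```python
def build_judge_prompt(document: str, summary: str, criterion: str) -> str:
    # A simple rubric-style prompt; wording and the 1-5 scale are assumptions.
    return (
        f"You are evaluating a summary for {criterion}.\n\n"
        f"Document:\n{document}\n\n"
        f"Summary:\n{summary}\n\n"
        f"Rate the summary's {criterion} from 1 (worst) to 5 (best). "
        "Answer with the number only."
    )

def judge_summary(document, summary, call_llm):
    # Score the three failure modes discussed above: hallucination
    # (scored as faithfulness), completeness, and conciseness.
    criteria = ("faithfulness", "completeness", "conciseness")
    return {c: call_llm(build_judge_prompt(document, summary, c)) for c in criteria}

# Usage with a dummy model that always answers "3":
print(judge_summary("Full article text...", "A short summary.", lambda p: "3"))
```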
 
Bio:

- (Current) Assistant Professor, Dept. of Industrial and Systems Engineering, KAIST
- (Current) Research Fellow, Asian Trustworthy Machine Learning (ATML) Group
- Amazon Web Services Inc. (USA, full-time)
- NAVER AI Lab (Korea, full-time)
- Google Research (USA, intern)
- Ph.D., Graduate School of Data Science, KAIST


 
 
Prof. Jonghyun Choi (Seoul National University)
 
Title: Practical setups and methods for continual learning
 
Abs:
Prevalent setups in continual learning often fall short of practical applicability, posing unrealistic assumptions and constraints. First, I will address these gaps by presenting various continual learning setups (mostly for class-incremental learning) that are more realistic and feasible, on which we should evaluate our algorithms. Then, I will discuss one of our approaches to continual learning (e.g., class-incremental learning) in a more realistic setup: an online continuous data stream and beyond.
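As one concrete instance of the online continual-learning setting described above, here is a minimal sketch of a single-pass stream loop with reservoir-sampling replay, a common baseline family; the buffer size and the abstract train_step callback are illustrative assumptions, not the speaker's method.

```python
import random

class ReservoirBuffer:
    # Keeps a uniform random sample of everything seen in the stream so far,
    # even as the class distribution shifts when new classes arrive.
    def __init__(self, capacity):
        self.capacity, self.data, self.seen = capacity, [], 0

    def add(self, example):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = example

    def sample(self, k):
        return random.sample(self.data, min(k, len(self.data)))

def online_cl_loop(stream, train_step, buffer_size=200, replay_k=8):
    # Single-pass online stream: each example is seen once, and each
    # update mixes the incoming example with a small replayed batch.
    buf = ReservoirBuffer(buffer_size)
    for x, y in stream:
        batch = [(x, y)] + buf.sample(replay_k)
        train_step(batch)                    # user-supplied model update
        buf.add((x, y))

# Usage with a toy (feature, class) stream and a no-op update:
stream = [(i % 10, i % 3) for i in range(100)]
online_cl_loop(stream, train_step=lambda batch: None)
print("processed stream of", len(stream), "examples")
```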
 
Bio:

I am an associate professor at Seoul National University and was an associate professor at Yonsei University (2022-2024), an assistant professor at GIST (2018-2022), and a researcher at the Allen Institute for Artificial Intelligence (AI2) (2016-2018) and Comcast Labs, DC (2015). During my Ph.D., I was fortunate to work as a research intern at Microsoft Research, Redmond (Summer 2014), Disney Research, Pittsburgh (Spring 2014), Adobe Research, San Jose (Summer 2013), and the U.S. Army Research Lab (Adelphi, MD, Summer 2011).

I received a Ph.D. degree from the University of Maryland, College Park, under the supervision of Prof. Larry S. Davis, and B.S. and M.S. degrees from Seoul National University, under the supervision of Prof. Kyoung-Mu Lee in the SNU Computer Vision Lab.