Efficient Computing Lab.
The Efficient Computing Laboratory (ECL) is part of the Department of AI at the UST ETRI Campus in Daejeon, South Korea.

7-416 ETRI, 218 Gajeong-ro, Yuseong-gu, Daejeon, South Korea
Welcome to the Efficient Computing Lab.
We focus on energy efficiency, system optimization, user experience, and sustainable tech solutions. Our research interests include the following:
• Model Compression: improving the efficiency of machine learning models through techniques such as pruning, quantization, and knowledge distillation, so they perform well in resource-limited settings (a minimal quantization sketch follows this list).
• AI Compiler: developing optimized AI compilers that reduce computational cost, energy use, and execution time, improving efficiency and sustainability.
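To make the quantization item above concrete, here is a minimal sketch of symmetric per-tensor int8 post-training quantization. The function names and the toy weight matrix are illustrative only, not taken from any lab codebase:

```python
# Minimal sketch of symmetric int8 post-training quantization,
# one of the model-compression techniques listed above.
# All names here are illustrative, not from any specific codebase.
import numpy as np

def quantize_int8(w: np.ndarray) -> tuple[np.ndarray, float]:
    """Map float32 weights to int8 with a single per-tensor scale."""
    scale = max(np.abs(w).max(), 1e-8) / 127.0  # largest magnitude maps to +/-127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float weights."""
    return q.astype(np.float32) * scale

w = np.random.randn(256, 256).astype(np.float32)  # toy weight matrix
q, s = quantize_int8(w)
err = np.abs(w - dequantize(q, s)).mean()
print(f"mean absolute quantization error: {err:.6f}")
```

The int8 tensor needs 4x less memory than float32 at the cost of a small, bounded reconstruction error, which is the basic trade-off that model compression research explores.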
For detailed research areas and insights into graduate life, please refer to the following slides. Students interested in pursuing graduate studies are encouraged to contact me directly after following the application instructions provided in the slides.
You can reach me at: leejaymin_at_etri_dot_re_dot_kr.
News
Apr 29, 2025 | Our paper was accepted at IJCAI (NRF BK21 IF 4)!
Apr 22, 2025 | Our paper was accepted at LCTES (NRF BK21 IF 2)!
Feb 21, 2025 | Our paper was accepted at the Sensors journal (IF: 3.4, JCR24 top 30.92%, Q2)!
Dec 17, 2024 | Our paper was selected as a Distinguished Paper at the 1st International Conference on Artificial Intelligence Computing and Systems (AICompS) 2024!
Dec 15, 2024 | "QuantuneV2" was accepted at the Future Generation Computer Systems journal (IF: 6.2, JCR23 top 9.4%, Q1)!
Nov 14, 2024 | Our paper received the Best Paper Award at IeMeK 2024, held on Jeju Island, 13–16 Nov 2024!
Nov 8, 2024 | Two papers were accepted at AICompS 2024. Congratulations!
Selected Publications
2025
- IJCAI 2025: "Exploring the Trade-Offs: Quantization Methods, Task Difficulty, and Model Size in Large Language Models From Edge to Giant." In International Joint Conferences on Artificial Intelligence (IJCAI) 2025, to appear. NRF BK21 IF: 4. Acceptance rate: 19.3% (1042 papers accepted out of 5404 submitted). Aug 2025.
- LCTES 2025: "Multi-Level Machine Learning-Guided Autotuning for Efficient Code Generation on a Deep Learning Accelerator." In the 26th ACM SIGPLAN/SIGBED International Conference on Languages, Compilers, and Tools for Embedded Systems (LCTES) 2025, to appear. NRF BK21 IF: 2. Acceptance rate: 38% (16 papers accepted out of 42 submitted). Jun 2025.