Donghyun Son


I’m an undergraduate student majoring in Computer Science and Engineering at Seoul National University. I’m working on efficient LLM inference algorithms as an undergraduate researcher at CMA Lab, under the supervision of Sungjoo Yoo. Previously, I worked as a machine learning engineer at Hyperconnect (acquired by Match Group), where I built an ML-based content moderation system for Match Group brands.

I’m broadly interested in efficient algorithms for model training and inference. My previous work focuses on data-efficient methods, including a multiple-subtasks approach, domain generalization, and few-shot personalization.

Recently, I have been interested in addressing the memory-bound nature of LLM inference, particularly through post-training quantization, efficient KV cache management, and sparse attention for long-context inference.

Aside from research, I enjoy algorithmic problem solving and have competed in programming contests such as ICPC, Google Hash Code, and SCPC. You can find me on Codeforces and BOJ.

Links: GitHub / CV / Google Scholar / X


selected publications

  1. WSDM23 (Oral)
    Reliable decision from multiple subtasks through threshold optimization: Content moderation in the wild
    Donghyun Son*, Byounggyu Lew*, Kwanghee Choi, and 5 more authors
    In Proceedings of the Sixteenth ACM International Conference on Web Search and Data Mining, 2023
  2. OOD-CV@ICCV23
    Gradient estimation for unseen domain risk minimization with pre-trained models
    Byounggyu Lew*, Donghyun Son*, and Buru Chang
    In Proceedings of the IEEE/CVF International Conference on Computer Vision, 2023