Yinmin Zhong

Ph.D. Student

Peking University

zhongyinmin [at] pku.edu.cn

About Me

I am a second-year Ph.D. student in the Computer Systems Research Group at Peking University, where I am advised by Xin Jin. Before that, I received my B.S. in Computer Science from Peking University.

I have a broad interest in building efficient systems for training and serving deep learning models, currently with a primary focus on large language models (LLMs).

I am also an enthusiastic self-learner with interests across many areas of computer science, and I have built a website to share my self-learning experiences and resources.

Interests

  • Machine Learning Systems
  • Distributed Systems
  • Large Language Models

Education

  • Peking University

    Ph.D. in Computer Science, Sep 2022 - Present

  • Peking University

    B.S. in Computer Science, Sep 2018 - June 2022

Experience

  • StepFun System Team

    Research Intern, June 2024 - Present

  • ByteDance AML Team

    Research Intern, Aug 2023 - May 2024

  • Alibaba DAMO Academy

    Research Intern, Sep 2021 - Sep 2022

  • AI Innovation Center, Peking University

    Software Engineer Intern, Sep 2020 - Mar 2021


Publications

  • DistServe: Disaggregating Prefill and Decoding for Goodput-optimized Large Language Model Serving.
  • MegaScale: Scaling Large Language Model Training to More Than 10,000 GPUs.
  • DistMind: Efficient Resource Disaggregation for Deep Learning Workloads.
  • AlpaServe: Statistical Multiplexing with Model Parallelism for Deep Learning Serving.
  • ElasticFlow: An Elastic Serverless Training Platform for Distributed Deep Learning.
  • Fast Distributed Inference Serving for Large Language Models.
  • FLUX: Fast Software-based Communication Overlap On GPUs Through Kernel Fusion.
  • LoongServe: Efficiently Serving Long-context Large Language Models with Elastic Sequence Parallelism.