Zhen Qin

Third-year PhD Student @ College of Computer Science and Technology, Zhejiang University

Biography

I am a 3rd-year Ph.D. student at the College of Computer Science and Technology, Zhejiang University, China, working on Federated Learning under the supervision of Prof. Shuiguang Deng. Currently, I mainly focus on 1) Federated Fine-tuning of Large Language Models, 2) Personalized Federated Learning, and 3) Trustworthy Federated Learning. So far, I have published 12 papers at venues including ICML, SIGKDD, AAAI, ICSOC, IEEE ICWS, IEEE TSC, and KBS.

Currently, I am working as a research intern at DAMO Academy, Alibaba Group, focusing on Federated Fine-tuning of Large Language Models. Prior to that, I worked as a research intern in the 2012 Lab, Huawei Technologies Co., Ltd. from Mar. 2022 to May 2023, focusing on preliminary research on 6G architecture.

Research Topics

Currently, I am focusing on several research topics in Federated Learning, including:

  • Federated Fine-tuning of Large Language Models
    Designing federated fine-tuning techniques for billion-parameter large language models (LLMs), leveraging the vast quantities of data continuously generated on end devices to enhance the responsiveness of LLMs to tasks described in natural language.
  • Personalized Federated Learning
    Designing personalized FL frameworks that adapt to the statistical heterogeneity of data across distributed clients, maintaining high model accuracy regardless of the degree of non-IIDness.
  • Trustworthy Federated Learning
    • Defending against backdoor attacks in FL through anomaly detection techniques.
    • Building fully decentralized FL systems on the basis of blockchain, providing distributed trust for FL among peer participants.
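
The aggregation step shared by these FL settings can be illustrated with dataset-size-weighted parameter averaging, i.e., the FedAvg baseline. This is a minimal sketch for illustration only; the function name and toy client data below are assumptions, not taken from any specific system described above:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Aggregate client model parameters by weighting each client's
    contribution in proportion to its local dataset size (FedAvg)."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Toy example: three clients, each holding a 1-D parameter vector.
clients = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [10, 20, 70]  # local dataset sizes (non-IID: client 3 dominates)
global_model = fedavg(clients, sizes)
```

Personalized FL methods depart from this baseline precisely because, under non-IID data, a single weighted average can fit no client well.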

Selected Publications

A full list of publications is available at Google Scholar.

Experience

  • Research Intern at DAMO Academy, Alibaba Group (2023.06 - Now)
    • Federated Fine-tuning of LLMs: Exploring the possibilities of tuning large models based on federated learning, mainly addressing communication overhead and memory cost issues.
    • System Development: Exploring memory-efficient fine-tuning techniques for LLMs that are suitable for cross-device FL, and integrating them into FederatedScope.
  • Research Intern at the 2012 Lab, Huawei Technologies Co., Ltd. (2022.03 - 2023.04)

Selected Honors & Awards

  • Outstanding Graduate Student of Zhejiang University in 2023
  • Outstanding Graduate Student of Zhejiang University in 2022
  • Excellent Graduate of Shanghai
  • National Scholarship for Graduate Students in 2020 (at Shanghai University)
  • National Scholarship for Graduate Students in 2019 (at Shanghai University)
  • Second Prize of China Post-graduate Mathematical Contest in Modeling (top 14.5%)