About Me

I am a 4th-year Ph.D. student at the College of Computer Science and Technology, Zhejiang University, China, under the supervision of Prof. Shuiguang Deng. My research mainly focuses on 1) Federated Fine-tuning of Large Language Models, 2) Personalized Federated Learning, and 3) Trustworthy Federated Learning. So far, I have published 12 papers at venues such as ICML, KDD, AAAI, WWW, and IEEE TSC.

Currently, I am a research intern at Tongyi Lab, Alibaba Group, focusing on Federated Learning and Large Language Models. Before that, I was a research intern at the 2012 Laboratories, Huawei Technologies Co., Ltd. from Mar. 2022 to May 2023, working on preliminary research on 6G architecture.

Interests
  • Federated Learning
  • Large Language Models
  • Data Mining
Education
  • PhD Computer Science

    Zhejiang University

  • MEng Computer Science

    Shanghai University

  • BSc Computer Science

    Shanghai University

📚 Research Topics

Currently, I am focusing on several research topics in Federated Learning (FL) and Large Language Models (LLMs), including:

  • Multi-modal large language models (MLLMs).
  • Federated fine-tuning of LLMs.
  • Trustworthiness, personalization, and communication efficiency of FL.

Please reach out to collaborate 😃

Recent Publications
(2024). Federated Data-Efficient Instruction Tuning for Large Language Models. arXiv.
(2024). Federated Full-Parameter Tuning of Billion-Sized Language Models with Communication Cost under 18 Kilobytes. In ICML.
(2024). The Synergy between Data and Multi-Modal Large Language Models: A Survey from Co-Development Perspective. arXiv.
(2024). BlockDFL: A Blockchain-based Fully Decentralized Peer-to-Peer Federated Learning Framework. In WWW.
(2024). LARA: A Light and Anti-overfitting Retraining Approach for Unsupervised Time Series Anomaly Detection. In WWW.