Introduction

The data heterogeneity phenomenon originates in the characteristics of users, who generate non-IID (not Independent and Identically Distributed) and unbalanced data. Since data heterogeneity is inherent to the FL scenario, a myriad of approaches have been proposed to tackle it. In contrast, personalized FL (pFL) can take advantage of the statistically heterogeneous data to learn a personalized model for each user.
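
To make "non-IID and unbalanced" concrete, below is a minimal sketch of the widely used Dirichlet label-skew partition. It is illustrative only: the function dirichlet_partition and its arguments are hypothetical, and PFLlib's own generate_DATA.py scripts implement their own partitioning logic.

    import numpy as np

    def dirichlet_partition(labels, num_clients, alpha=0.1, seed=0):
        """Split sample indices across clients with Dirichlet label skew.

        A smaller `alpha` concentrates each class on fewer clients,
        i.e., produces more heterogeneous (non-IID) shards.
        """
        rng = np.random.default_rng(seed)
        num_classes = int(labels.max()) + 1
        client_indices = [[] for _ in range(num_clients)]
        for c in range(num_classes):
            idx_c = np.where(labels == c)[0]
            rng.shuffle(idx_c)
            # Sample per-client proportions for this class.
            proportions = rng.dirichlet(alpha * np.ones(num_clients))
            splits = (np.cumsum(proportions)[:-1] * len(idx_c)).astype(int)
            for client_id, shard in enumerate(np.split(idx_c, splits)):
                client_indices[client_id].extend(shard.tolist())
        return client_indices

    # Example: 10,000 samples over 10 classes, partitioned among 20 clients.
    rng = np.random.default_rng(0)
    labels = rng.integers(0, 10, size=10_000)
    shards = dirichlet_partition(labels, num_clients=20, alpha=0.1)
    print(sorted(len(s) for s in shards))  # unbalanced, label-skewed shards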

Simple File Structure

[Figure: PFLlib file structure]

An Example for FedAvg. You can create a scenario using generate_DATA.py and run an algorithm using main.py, clientNAME.py, and serverNAME.py. To add a new algorithm, you only need to implement its new features in clientNAME.py and serverNAME.py.
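
To illustrate the clientNAME.py / serverNAME.py split, here is a toy FedAvg-style skeleton. The names (ClientNEW, ServerNEW, run_round) are hypothetical stand-ins rather than PFLlib's actual API, and the "model" is a plain parameter vector rather than a neural network.

    import numpy as np

    class ClientNEW:
        """Hypothetical counterpart of clientNAME.py (not PFLlib's real API)."""

        def __init__(self, data):
            self.data = data  # this client's local samples, shape (n, dim)

        def train(self, global_params):
            # Toy "local training": one step from the global model toward
            # this client's local optimum (here, its data mean).
            local_params = global_params + 0.5 * (self.data.mean(axis=0) - global_params)
            return local_params, len(self.data)

    class ServerNEW:
        """Hypothetical counterpart of serverNAME.py (not PFLlib's real API)."""

        def __init__(self, clients, dim):
            self.clients = clients
            self.global_params = np.zeros(dim)

        def run_round(self):
            updates = [c.train(self.global_params) for c in self.clients]
            # FedAvg aggregation: weight each client by its sample count.
            total = sum(n for _, n in updates)
            self.global_params = sum(p * (n / total) for p, n in updates)

    rng = np.random.default_rng(0)
    clients = [ClientNEW(rng.normal(loc=i, size=(50, 4))) for i in range(5)]
    server = ServerNEW(clients, dim=4)
    for _ in range(20):
        server.run_round()
    print(server.global_params)  # approaches the weighted mean of client optima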

Key Features

  • 37 traditional FL (tFL) or personalized FL (pFL) algorithms, 3 scenarios, and 24 datasets.
  • Some experimental results are available in the PFLlib paper and Benchmark Results.
  • The benchmark platform can simulate a scenario with 500 clients training the 4-layer CNN on Cifar100 on a single NVIDIA GeForce RTX 3090 GPU, using only 5.08 GB of GPU memory.
  • We provide privacy evaluation and support for systematic research.
  • You can now train on some clients and evaluate the performance on new clients by setting args.num_new_clients in ./system/main.py (see the sketch after this list). Please note that not all tFL/pFL algorithms support this feature.
  • PFLlib primarily focuses on data (statistical) heterogeneity. For algorithms and a benchmark platform that address both data and model heterogeneity, please refer to our extended project HtFLlib.
  • To meet diverse user demands, we update the project frequently; these updates may alter default settings and scenario-creation code, which can affect experimental results.
  • When errors arise, check the closed issues first; they may already contain a solution.
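
As referenced in the args.num_new_clients item above, evaluating on new clients simply means holding some clients out of training and testing the learned global model on them afterwards. The toy sketch below illustrates that idea under strong simplifying assumptions (each client's "local optimum" is its data mean); it is not PFLlib's evaluation code.

    import numpy as np

    def local_optimum(data):
        """Toy stand-in for a client's ideal personal model: its data mean."""
        return data.mean(axis=0)

    # Eight toy clients with shifted distributions; hold the last three out.
    rng = np.random.default_rng(0)
    all_data = [rng.normal(loc=i, size=(50, 4)) for i in range(8)]
    train_data, new_data = all_data[:-3], all_data[-3:]

    # "Train" a global model by averaging the participating clients only.
    global_params = np.mean([local_optimum(d) for d in train_data], axis=0)

    # Evaluate on the unseen clients: how far is the global model from
    # each held-out client's local optimum? Lower means better transfer.
    for i, d in enumerate(new_data):
        gap = np.linalg.norm(global_params - local_optimum(d))
        print(f"new client {i}: gap = {gap:.3f}")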

Acknowledgement

If you find our PFLlib useful, please cite the corresponding paper:

    @article{zhang2023pfllib,
        title={PFLlib: Personalized Federated Learning Algorithm Library},
        author={Zhang, Jianqing and Liu, Yang and Hua, Yang and Wang, Hao and Song, Tao and Xue, Zhengui and Ma, Ruhui and Cao, Jian},
        journal={arXiv preprint arXiv:2312.04992},
        year={2023}
    }