• Flower: a friendly federated learning framework

      

      

    Javier Fernandez-Marques (Flower Labs)


    FL moves the training to the data

    • Cross-device FL
      Massively distributed: takes the training to smart devices.

    • Cross-silo FL
      Institutions collaborate on training an ML model while preserving their customers’ and patients’ data.

    • Intra-organisation FL
      A single organisation, with one or multiple locations, generating large volumes of data.

    The Flower community is incredibly active

    • Fast adoption in open-source projects.
    • Helps shape the core framework, documentation, examples, and tutorials.
    • Rich community interactions.
    • Used to build the latest research.

    Flower Framework

    A unified approach to Federated Learning, Analytics and Evaluation.


    Python, Android, iOS, C++, cross-silo, cross-device, horizontal, vertical, differential privacy, secure aggregation, SMPC, simulation, deployment? Yes.


    • The Flower API is lightweight.
    • Minimal, but complete, FL training with ~20 lines of extra code (see the sketch below).
    • Works out of the box and is easy to adapt to your setting.
    • Extend Flower’s strategies via custom callables.
    • Develop your own strategies entirely.
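
    As a rough illustration of the "~20 lines" claim, here is a minimal client sketch in the style of the Flower quickstart, assuming a PyTorch model. The names net, train, test, trainloader, and testloader are placeholder helpers you would supply, and the exact entry point varies slightly across Flower versions.

        import flwr as fl
        import torch
        from collections import OrderedDict


        def set_parameters(net, parameters):
            # Load a list of NumPy arrays back into the model's state_dict.
            params_dict = zip(net.state_dict().keys(), parameters)
            state_dict = OrderedDict({k: torch.tensor(v) for k, v in params_dict})
            net.load_state_dict(state_dict, strict=True)


        class FlowerClient(fl.client.NumPyClient):
            def get_parameters(self, config):
                # Return the current model weights as NumPy arrays.
                return [val.cpu().numpy() for val in net.state_dict().values()]

            def fit(self, parameters, config):
                # Receive global weights, train locally, send updates back.
                set_parameters(net, parameters)
                train(net, trainloader, epochs=1)
                return self.get_parameters(config), len(trainloader.dataset), {}

            def evaluate(self, parameters, config):
                # Evaluate the received global weights on local data.
                set_parameters(net, parameters)
                loss, accuracy = test(net, testloader)
                return loss, len(testloader.dataset), {"accuracy": accuracy}


        fl.client.start_numpy_client(server_address="127.0.0.1:8080", client=FlowerClient())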



    Custom Strategies

    A Flower Strategy sits at the core of an FL pipeline: it samples clients, sends models to them, aggregates their updates, and more.



    Implement your own client sampling mechanism.
    For example: resampling clients in consecutive rounds. In what other settings might we want to customize how client sampling is performed? A sketch follows the list below.

    • Biased sampling.
    • Sample clients based on CO2 emissions.
    • Client-specific fit() instructions.
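
    A hedged sketch of one such customization, assuming Flower 1.x signatures (check your installed version): a FedAvg subclass whose configure_fit avoids re-sampling the clients chosen in the previous round.

        import flwr as fl


        class NoRepeatFedAvg(fl.server.strategy.FedAvg):
            """FedAvg variant that skips clients sampled in the previous round."""

            def __init__(self, **kwargs):
                super().__init__(**kwargs)
                self.previous_round = set()  # client IDs sampled last round

            def configure_fit(self, server_round, parameters, client_manager):
                # Let FedAvg build the default (client, FitIns) pairs first.
                instructions = super().configure_fit(server_round, parameters, client_manager)
                # Drop clients that already participated in the previous round.
                filtered = [(c, ins) for c, ins in instructions if c.cid not in self.previous_round]
                # Fall back to the default sample if filtering removed everyone.
                if not filtered:
                    filtered = instructions
                self.previous_round = {c.cid for c, _ in filtered}
                return filtered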



    Control the periodicity at which certain sub-stages take place within the strategy.
    For example: how often the server evaluates the global model. When is this particularly useful? A sketch follows the list below.

    • Cross-device FL with lightweight models for clients but a large validation set on the server.
    • Event-based global evaluation.
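
    A minimal sketch of the evaluation case, again assuming Flower 1.x: returning None from a strategy's evaluate() tells the server to skip centralized evaluation for that round.

        import flwr as fl


        class PeriodicEvalFedAvg(fl.server.strategy.FedAvg):
            """FedAvg variant that evaluates the global model only every N rounds."""

            def __init__(self, eval_every: int = 5, **kwargs):
                super().__init__(**kwargs)
                self.eval_every = eval_every

            def evaluate(self, server_round, parameters):
                # Returning None skips centralized evaluation this round.
                if server_round % self.eval_every != 0:
                    return None
                return super().evaluate(server_round, parameters)

    It is constructed like any FedAvg, e.g. PeriodicEvalFedAvg(eval_every=10, evaluate_fn=my_eval_fn), where evaluate_fn is the server-side evaluation callback (named eval_fn in older Flower releases).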



    Federated Fine-tuning of a ViT

    • State-of-the-art vision architectures.
    • Fast fine-tuning of an ImageNet-pretrained ViT on a single GPU.



    On-device Federated Fine-tuning of OpenAI’s Whisper

    • Fine-tune Whisper with Flower for keyword-spotting applications.
    • Runs even on a Raspberry Pi.



    LLM-FlowerTune: Federated LLM Fine-tuning with Flower

    • Fine-tune an LLM with Flower into a general-purpose chat assistant.
    • Highly optimized: LoRA + PEFT (see the sketch below).
    • Runs on a single GPU.
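
    Not the LLM-FlowerTune code itself, but a short sketch of the LoRA + PEFT idea it relies on: wrap a causal LM with a low-rank adapter via Hugging Face PEFT, so that only a small fraction of the weights is trained and exchanged in each federated round. The model id and target modules below are illustrative choices, not taken from the project.

        from peft import LoraConfig, get_peft_model
        from transformers import AutoModelForCausalLM

        # Illustrative model id; pick target_modules to match your architecture.
        model = AutoModelForCausalLM.from_pretrained("openlm-research/open_llama_3b")
        lora_config = LoraConfig(
            r=8,                                   # low-rank dimension
            lora_alpha=16,                         # scaling factor
            target_modules=["q_proj", "v_proj"],   # attention projections to adapt
            lora_dropout=0.05,
            task_type="CAUSAL_LM",
        )
        model = get_peft_model(model, lora_config)
        model.print_trainable_parameters()  # typically <1% of total weights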




    Flower Datasets

    • Keeps dataset downloading and partitioning simple.
    • 9 built-in partitioning methods.
    • Interfaces with 120K+ Hugging Face Datasets (usage sketch below).
    • Great for research and prototyping!
    • Open source.
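
    A short usage sketch, assuming the flwr-datasets package: download a Hugging Face dataset and split it into IID partitions, one per client.

        from flwr_datasets import FederatedDataset
        from flwr_datasets.partitioner import IidPartitioner

        # Partition CIFAR-10 into 100 client shards.
        fds = FederatedDataset(
            dataset="cifar10",
            partitioners={"train": IidPartitioner(num_partitions=100)},
        )
        partition = fds.load_partition(0, "train")  # shard for client 0
        # Keep the test split centralized for server-side evaluation
        # (older releases name this method load_full).
        centralized_test = fds.load_split("test")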