B2SFL: A Bi-Level Blockchained Architecture for Secure Federated Learning-Based Traffic Prediction

Hao Guo, Collin Meese, Wanxin Li*, Chien Chung Shen, Mark Nejad

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)


Federated Learning (FL) is a privacy-preserving machine learning (ML) technology that enables collaborative training of a global ML model by aggregating distributed local model updates. However, security and privacy guarantees can be compromised by malicious participants and the centralized FL server. This article proposes a bi-level blockchained architecture for secure federated learning-based traffic prediction. The bottom- and top-layer blockchains store the local model parameters and the global aggregated parameters, respectively, and the distributed homomorphic-encrypted federated averaging (DHFA) scheme addresses the secure computation problem. We propose a partial private key distribution protocol and a partially homomorphic encryption/decryption scheme to achieve distributed privacy-preserving federated averaging. We conduct extensive experiments to measure the running time of DHFA operations, quantify the read and write performance of the blockchain network, and elucidate the impacts of varying regional group sizes and model complexities on the resulting prediction accuracy for the online traffic flow prediction task. The results indicate that the proposed system can facilitate secure and decentralized federated learning for real-world traffic prediction tasks.
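The DHFA scheme described above rests on additively homomorphic encryption: ciphertexts of local model updates can be combined so that decryption yields their sum, letting an aggregator average updates it never sees in the clear. The sketch below is a toy Paillier cryptosystem illustrating this property only; it is not the paper's DHFA protocol or its partial private key distribution, and the key sizes and weight values are hypothetical (real deployments use keys of 2048 bits or more).

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic) -- illustration only.
# p, q are small well-known primes; production keys must be far larger.
p, q = 999983, 1000003
n = p * q
n2 = n * n
g = n + 1                                 # standard choice of generator
lam = math.lcm(p - 1, q - 1)
# mu = L(g^lam mod n^2)^-1 mod n, where L(x) = (x - 1) // n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m: int) -> int:
    """Encrypt a non-negative integer m < n."""
    r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Each client encrypts an integer-scaled local weight; the aggregator
# multiplies ciphertexts, which adds the plaintexts underneath, so it
# recovers only the sum -- never any individual update.
local_weights = [312, 405, 287]           # hypothetical scaled updates
agg = 1
for w in local_weights:
    agg = (agg * encrypt(w)) % n2

total = decrypt(agg)                      # 312 + 405 + 287 = 1004
avg = total / len(local_weights)
print(total, avg)
```

In a federated averaging round, floating-point weights would be fixed-point scaled before encryption, and decryption of the aggregate would be shared across parties (as in the paper's partial private key distribution) rather than held by one server as in this single-key toy.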

Original language: English
Pages (from-to): 4360-4374
Number of pages: 15
Journal: IEEE Transactions on Services Computing
Issue number: 6
Publication status: Published - 1 Nov 2023


  • Blockchain
  • federated learning
  • homomorphic encryption
  • secure averaging
  • traffic prediction


