Personalized neural architecture search for federated learning.

Minh Hoang and Carl Kingsford (2021). Personalized neural architecture search for federated learning. In NeurIPS 2021 Workshop on New Frontiers in Federated Learning.

Federated Learning (FL) is a recently proposed learning paradigm for decentralized devices to collaboratively train a predictive model without exchanging private data. Existing FL frameworks, however, assume a one-size-fits-all model architecture that is collectively trained by local devices and fixed before any of their data is observed.

Even with good engineering acumen, this assumption often breaks down when local tasks differ and call for different architectural choices to learn effectively. This motivates us to develop a novel personalized neural architecture search (NAS) algorithm for FL, which learns a base architecture that can be structurally personalized for quick adaptation to each local task. On several real-world datasets, our algorithm, FEDPNAS, achieves superior performance compared to other benchmarks in heterogeneous multi-task scenarios.
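To give a feel for the general idea of personalization in FL, the sketch below shows one common pattern: clients share and average a common "base" component while keeping a private, per-client "personal" component. This is a minimal illustration under assumed names and sizes, not the paper's FEDPNAS algorithm, which personalizes the architecture itself via NAS rather than just splitting parameters.

```python
# Hypothetical sketch (NOT the paper's FEDPNAS): federated rounds where a shared
# "base" is averaged across clients while each client keeps a personal "head".
import numpy as np

rng = np.random.default_rng(0)

NUM_CLIENTS = 4
BASE_DIM, HEAD_DIM = 8, 3  # assumed sizes, for illustration only

# Each client holds base weights (shared via averaging) and a personal head (never shared).
clients = [
    {"base": rng.normal(size=(BASE_DIM,)), "head": rng.normal(size=(HEAD_DIM,))}
    for _ in range(NUM_CLIENTS)
]

def local_update(client, lr=0.1):
    """Stand-in for a local training step: nudge both parts with a placeholder gradient."""
    client["base"] -= lr * rng.normal(size=client["base"].shape)
    client["head"] -= lr * rng.normal(size=client["head"].shape)

for round_idx in range(5):
    # 1) Each client adapts both its copy of the shared base and its personal head locally.
    for c in clients:
        local_update(c)
    # 2) The server averages only the shared base; personal heads stay on-device.
    avg_base = np.mean([c["base"] for c in clients], axis=0)
    for c in clients:
        c["base"] = avg_base.copy()

print("Shared base after 5 rounds:", np.round(clients[0]["base"], 3))
```

The design point this illustrates is that only the shared component crosses the network, so each client's personal component can specialize to its local task; FEDPNAS pushes this further by personalizing the architecture structure itself.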
