Can the use of privacy enhancing technologies enable federated learning for health data applications in a Swedish regulatory context?
Abstract

A recent report by the Swedish Authority for Privacy Protection (IMY) evaluates the potential of jointly training and exchanging machine learning models between two healthcare providers. In relation to the privacy problems identified therein, this article explores the trade-off between utility and privacy when using privacy-enhancing technologies (PETs) in combination with federated learning. Results are reported from numerical experiments with standard textbook machine learning models under both differential privacy (DP) and fully homomorphic encryption (FHE). The results indicate that FHE is a promising approach for privacy-preserving federated learning, with the CKKS scheme being more favorable in terms of computational performance due to its support of SIMD operations and compact representation of encrypted vectors. The results for DP are more inconclusive. The article briefly discusses the current regulatory context and aspects that lawmakers may consider to enable an AI leap in Swedish healthcare while maintaining data protection.
Copyright (c) 2023 Rickard Brännvall, Helena Linge, Johan Östman
This work is licensed under a Creative Commons Attribution 4.0 International License.