Abstract
Federated learning found its first applications when Google published an article describing the algorithms embedded in the keyboards of Android phones. Rather than sending data about users' conversations back to Google's servers, Android keyboards train the predictive models that power spelling correction and word suggestion on the phone itself, transmitting only the model updates inferred from the user's conversation data.
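To make the mechanism concrete, here is a minimal sketch of a single federated averaging round, assuming a simple least-squares model and simulated client data; the function names (local_update, fed_avg) and all parameters are illustrative, not taken from Google's implementation.

```python
import numpy as np

def local_update(global_weights, X, y, lr=0.1, epochs=5):
    """Train locally on one client's data and return only the weight delta."""
    w = global_weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of a least-squares loss
        w -= lr * grad
    return w - global_weights  # only the update leaves the device, never the data

def fed_avg(global_weights, deltas, sizes):
    """Server step: average client updates weighted by local dataset size."""
    total = sum(sizes)
    avg_delta = sum(n / total * d for d, n in zip(deltas, sizes))
    return global_weights + avg_delta

# Illustrative round with two simulated clients.
rng = np.random.default_rng(0)
w_global = np.zeros(3)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(2)]
deltas = [local_update(w_global, X, y) for X, y in clients]
w_global = fed_avg(w_global, deltas, [len(y) for _, y in clients])
```

The key property is that raw training data never leaves the client; only the aggregated weight deltas are shared.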
Beyond the confidentiality of the data on which the algorithms are trained, these techniques are useful when data cannot cross national borders, or when a consortium of companies wishes to jointly build a predictive model without any of them having to reveal sensitive information to their competitors.
Federated learning also offers tremendous potential for generating new medical discoveries. Today, no single method meets all the necessary constraints: such a system must operate with low communication bandwidth, guarantee the confidentiality of transmitted data, and handle both asynchrony between nodes and the non-representativeness of local data batches. Peer-to-peer federated learning technologies incorporating compressed communications are currently being developed to unleash the potential of federated learning on medical data.
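One common way to reduce the bandwidth of exchanged updates is sparsification. The sketch below shows top-k sparsification of an update vector, given only as an illustration of the general compression idea rather than the specific peer-to-peer method under development; all names and sizes are hypothetical.

```python
import numpy as np

def top_k_sparsify(update, k):
    """Keep only the k largest-magnitude entries of an update vector.

    Peers exchange (indices, values) pairs instead of the full dense
    vector, reducing the bandwidth needed per communication round.
    """
    idx = np.argsort(np.abs(update))[-k:]
    return idx, update[idx]

def densify(idx, values, dim):
    """Rebuild a dense approximation of the update on the receiving peer."""
    dense = np.zeros(dim)
    dense[idx] = values
    return dense

# Illustrative compression of a 1000-dimensional update to 1% of its entries.
rng = np.random.default_rng(1)
update = rng.normal(size=1000)
idx, values = top_k_sparsify(update, k=10)
approx = densify(idx, values, dim=1000)
```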