Open Access
Issue: Wuhan Univ. J. Nat. Sci., Volume 27, Number 6, December 2022
Page(s): 499-507
DOI: https://doi.org/10.1051/wujns/2022276499
Published online: 10 January 2023