Domain Adaptation via Bregman Divergence Minimization

Document Type: Article

Authors

Faculty of Information Technology and Computer Engineering, Urmia University of Technology, Urmia, Iran

Abstract

In recent years, Fisher linear discriminant analysis (FLDA)-based classification has become one of the most successful approaches and has shown effective performance in a variety of classification tasks. However, when the training data (source domain) follow a different distribution from the test data (target domain), FLDA-based models may no longer be optimal and their performance can degrade dramatically. To address this problem, in this paper we propose domain adaptation via Bregman divergence minimization (DAB), in which the discriminative features of the source and target domains are learned simultaneously through a domain-invariant representation. DAB is designed under the constraints of FLDA, with the objective of adapting the coupled marginal and conditional distribution mismatches via Bregman divergence minimization. As a result, the learned representation preserves the functionality of FLDA while providing better discrimination ability. Moreover, the proposed approach can be easily kernelized to handle nonlinear tasks. Extensive experiments on various benchmark datasets show that DAB effectively handles cross-domain divergence and outperforms several state-of-the-art domain adaptation approaches on cross-distribution domains.
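To make the idea concrete, the following Python sketch illustrates the general pattern the abstract describes: learning a projection that keeps the FLDA scatter criterion on the labelled source data while shrinking a divergence between the projected domains. This is a minimal, hypothetical sketch, not the paper's method: the function names (dab_like_projection, flda_scatter), the choice of a squared-Euclidean Bregman divergence between domain means as the marginal-mismatch term, the omission of conditional-distribution alignment and kernelization, and the generalized-eigenproblem solver are all assumptions made here for illustration.

```python
# Hedged sketch of an FLDA-style projection with a domain-mismatch penalty.
# The divergence term below is a placeholder (squared-Euclidean Bregman
# divergence between projected domain means); DAB's actual objective,
# conditional-distribution terms, and optimizer may differ.
import numpy as np
from scipy.linalg import eigh

def flda_scatter(X, y):
    """Between-class (Sb) and within-class (Sw) scatter matrices."""
    d = X.shape[1]
    mean_all = X.mean(axis=0)
    Sb, Sw = np.zeros((d, d)), np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean_all)[:, None]
        Sb += len(Xc) * diff @ diff.T
        Sw += (Xc - mc).T @ (Xc - mc)
    return Sb, Sw

def dab_like_projection(Xs, ys, Xt, dim=10, lam=1.0):
    """Projection W maximizing source class separation while penalizing the
    (squared-Euclidean) divergence between projected source/target means."""
    Sb, Sw = flda_scatter(Xs, ys)
    gap = (Xs.mean(axis=0) - Xt.mean(axis=0))[:, None]
    M = gap @ gap.T                          # marginal mismatch (placeholder)
    d = Xs.shape[1]
    A = Sb                                   # discrimination to maximize
    B = Sw + lam * M + 1e-6 * np.eye(d)      # mismatch + within-class scatter
    vals, vecs = eigh(A, B)                  # generalized eigenproblem A v = l B v
    W = vecs[:, np.argsort(vals)[::-1][:dim]]
    return W                                 # columns span the adapted subspace

# Usage on toy data: project both domains, then train any classifier on Xs @ W.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(100, 20)); ys = rng.integers(0, 3, size=100)
Xt = rng.normal(loc=0.5, size=(80, 20))
W = dab_like_projection(Xs, ys, Xt, dim=5)
print((Xs @ W).shape, (Xt @ W).shape)
```

The design choice sketched here, trading off a scatter-ratio criterion against a distribution-mismatch penalty inside one generalized eigenproblem, is what lets a single closed-form projection serve both goals at once; the paper's kernelized variant would replace the raw features with kernel evaluations.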

Keywords

