
Research on Cross-model of Transfer Learning based on Deep Learning

EasyChair Preprint 1092

34 pages · Date: June 5, 2019

Abstract

Transfer learning is a machine learning method that uses an already-labeled data set to train a model for a different but related problem. The rapid growth of data has made statistical heterogeneity and the scarcity of labels increasingly serious, and the lack of annotated data leads to severe over-fitting in traditional supervised learning. The research work of this paper is as follows:

1) Based on SJE, the latent embedding model LatEm is built, replacing SJE's single linear mapping with a piecewise-linear one. The main idea of the model is to relate low-dimensional image features to high-dimensional semantic class features when learning the classifier, so that the trained model is transferable. LatEm is a cross-modal approach: it uses information from images together with class-level information collected either by manual annotation or, in an unsupervised manner, from a large text corpus.
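As an illustration only (not the paper's code), the piecewise-linear compatibility that replaces SJE's single bilinear map, F(x, y) = max_i xᵀ W_i y over K latent matrices W_i, can be sketched in plain Python; all names, dimensions, and values below are made up:

```python
def dot(u, v):
    # inner product of two equal-length vectors
    return sum(a * b for a, b in zip(u, v))

def matvec(W, v):
    # product of a (rows x len(v)) matrix W with vector v
    return [dot(row, v) for row in W]

def compatibility(x, Ws, y):
    """Score image feature x against class embedding y using K latent
    matrices Ws; the max over matrices gives the piecewise-linear score."""
    scores = [dot(x, matvec(W, y)) for W in Ws]
    k = max(range(len(Ws)), key=lambda i: scores[i])
    return scores[k], k  # best score and which latent matrix attained it

# Toy example: 2-dim image feature, 1-dim class embedding, K = 2 matrices.
W1 = [[2.0], [0.0]]
W2 = [[-1.0], [3.0]]
score, k = compatibility([1.0, 0.0], [W1, W2], [1.0])
```

Each W_i can specialize to a different region of the visual space, which is what makes the overall compatibility piecewise linear rather than a single linear map.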

2) For the case where a single sample in LatEm involves multiple mapping matrices, we rank the loss terms corresponding to each matrix and use gradient descent to find the optimal solution.
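One hedged way to picture the gradient-descent step described above: with a ranking hinge loss max(0, margin + F(x, y_wrong) − F(x, y_true)) and F(x, y) = max_i xᵀ W_i y, only the latent matrix that attains the max on each side receives a gradient update (dF/dW_k = x yᵀ). The sketch below is illustrative, with invented names and dimensions, and is not the paper's implementation:

```python
def score(x, W, y):
    # x^T W y for one latent matrix W of shape (len(x), len(y))
    return sum(x[r] * W[r][c] * y[c]
               for r in range(len(x)) for c in range(len(y)))

def best(x, Ws, y):
    # index of the latent matrix attaining the maximum score
    return max(range(len(Ws)), key=lambda i: score(x, Ws[i], y))

def sgd_step(x, y_true, y_wrong, Ws, lr=0.01, margin=1.0):
    """One SGD step on max(0, margin + F(x, y_wrong) - F(x, y_true));
    only the max-attaining matrix on each side is updated."""
    k_t, k_w = best(x, Ws, y_true), best(x, Ws, y_wrong)
    if margin + score(x, Ws[k_w], y_wrong) - score(x, Ws[k_t], y_true) > 0:
        # margin violated: push toward y_true, away from y_wrong
        for r in range(len(x)):
            for c in range(len(y_true)):
                Ws[k_t][r][c] += lr * x[r] * y_true[c]
                Ws[k_w][r][c] -= lr * x[r] * y_wrong[c]
    return Ws

# Toy example: K = 1 matrix, 1-dim image feature, 2-dim class embeddings.
Ws = sgd_step([1.0], [1.0, 0.0], [0.0, 1.0], [[[0.5, 0.5]]])
```

Because the update touches only the matrices selected by the max, each W_i gradually specializes to the samples for which it fires.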

3) The LatEm model was implemented in MATLAB and compared with the results obtained by SJE on the three data sets AWA, CUB and Dogs. The classification accuracy on unseen classes of the two fine-grained data sets CUB and Dogs reached 52.3% and 24.5%, respectively.

Keyphrases: multimodality, fine-grained classification, transfer learning, zero-shot learning

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:1092,
  author    = {Bozhao Guo},
  title     = {Research on Cross-model of Transfer Learning based on Deep Learning},
  howpublished = {EasyChair Preprint 1092},
  year      = {EasyChair, 2019}}