14 papers present a complete picture of "transfer learning" | Selected Papers #04

PaperWeekly is an AI academic sharing community where many front-line AI researchers gather to recommend high-quality papers in a few well-chosen words. Join the community and build your own collection of papers.

Here is the 4th issue of our selected papers.

For humans, transfer learning is the ability to apply what has been learned in one situation to another. For computers, transfer learning means extracting knowledge and experience from one or more source tasks and applying them to a related target domain.

In this issue, we bring you a collection of transfer learning papers curated by PaperWeekly community user @jindongwang. Through 14 recent and classic papers, we walk you through the development and current state of transfer learning. If a paper catches your interest, copy its link into your browser to read the original.

1. Domain adaptation via transfer component analysis

Recommended by @jindongwang

#Transfer Learning

A widely recognized classic in the transfer learning field, from Professor Qiang Yang's group at the Hong Kong University of Science and Technology. Recommended reading for everyone studying or working on transfer learning.

Link to the paper: paperweekly.site/papers

2. Geodesic flow kernel for unsupervised domain adaptation

Recommended by @jindongwang

#Unsupervised Learning

A representative article in transfer learning: GFK (Geodesic Flow Kernel). GFK first addresses a problem left open by SGF: how many intermediate points to take along the path between the source and target subspaces. Its first contribution is a kernel method that integrates over all points on the path, so no sampling is needed. It then tackles a second question: when there are multiple sources, which one should be used to transfer to the target? GFK answers this with a Rank of Domain metric that identifies the source closest to the target.
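As a rough sketch of the kernel idea (in the notation of the original paper): instead of sampling a finite number of intermediate subspaces $\Phi(t)$ along the geodesic from the source subspace $\Phi(0)$ to the target subspace $\Phi(1)$, GFK integrates over all of them, giving

$$\langle z_i^{\infty}, z_j^{\infty} \rangle = \int_0^1 \big(\Phi(t)^{\top} x_i\big)^{\top} \big(\Phi(t)^{\top} x_j\big)\, dt = x_i^{\top} G\, x_j ,$$

where the matrix $G$ has a closed-form expression computed from the principal angles between the two subspaces, so no intermediate points ever need to be sampled.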

Link to the paper: paperweekly.site/papers

3. Transfer feature learning with joint distribution adaptation

Recommended by @jindongwang

#Domain Adaptation

Another classic in the transfer learning field and an enhanced version of TCA; recommended reading. The JDA method is clever: it adapts both the marginal and the conditional distributions, folds them into a single, carefully designed optimization objective, and iterates with a weak classifier to refine target pseudo-labels, achieving good results in the end. Well worth learning from.
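For reference, the optimization problem JDA solves can be written roughly (in the notation of the paper) as

$$\min_{A}\ \operatorname{tr}\!\Big(A^{\top} X \big(M_0 + \textstyle\sum_{c=1}^{C} M_c\big) X^{\top} A\Big) + \lambda \lVert A \rVert_F^2 \quad \text{s.t.}\quad A^{\top} X H X^{\top} A = I ,$$

where $M_0$ is the MMD matrix for the marginal distributions, the $M_c$ are MMD matrices for the class-conditional distributions built from target pseudo-labels, and $H$ is the centering matrix; the pseudo-labels are refreshed by the weak classifier at each iteration.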

Link to the paper: paperweekly.site/papers

4. Unsupervised Domain Adaptation by Backpropagation

Recommended by @jindongwang

#Transfer Learning

A classic paper in deep transfer learning.

Link to the paper: paperweekly.site/papers

Code link: github.com/shucunt/doma

5. How transferable are features in deep neural networks?

Recommended by @jindongwang

#CNN

This paper explores the transferability of features in deep networks and is well worth reading. Although it does not propose a new method, its experiments yield the following conclusions, which offer valuable guidance for later work on deep learning and deep transfer learning.

The first three layers of a neural network learn largely general features, so transferring them works well; adding fine-tuning to a deep transfer network improves performance, often beyond that of the original network; fine-tuning helps overcome differences between datasets; a network initialized with transferred weights outperforms one initialized randomly; and transferring layers speeds up the network's learning and optimization.
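To make the fine-tuning conclusion concrete, here is a minimal sketch, assuming PyTorch and a torchvision ResNet (the paper's experiments actually use an AlexNet-style network): freeze the early, general-purpose layers of a pretrained model and fine-tune the remaining layers on the target task.

```python
import torch
import torchvision

# A minimal fine-tuning sketch (an illustration, not the paper's exact setup):
# reuse the early, general-purpose layers of a pretrained network and only
# fine-tune the later, more task-specific layers on the target data.
model = torchvision.models.resnet18(pretrained=True)

# Freeze the early layers, which tend to learn general features.
for name, param in model.named_parameters():
    if name.startswith(("conv1", "bn1", "layer1")):
        param.requires_grad = False

# Replace the classifier head for the target task (here assumed to have 10 classes).
model.fc = torch.nn.Linear(model.fc.in_features, 10)

# Only the unfrozen parameters are updated during fine-tuning.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad),
    lr=1e-3,
    momentum=0.9,
)
```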

Link to the paper: paperweekly.site/papers

Code link: github.com/yosinski/con

6. Deep Domain Confusion: Maximizing for Domain Invariance

Recommended by @jindongwang

#Deep Learning

One of the earliest representative papers in deep transfer learning. Although it never seems to have been formally published (it has remained on arXiv), it is highly cited and is a fairly foundational piece of work. Worth reading.

Link to the paper: paperweekly.site/papers

7. Learning Transferable Features with Deep Adaptation Networks

Recommended by @jindongwang

#Transfer Learning

Deep Adaptation Network (DAN) is a deep transfer learning method proposed by Mingsheng Long of Tsinghua University, originally published at ICML 2015. DAN tackles the classic domain adaptation problem in transfer learning and machine learning, simply using a deep network as the carrier for the adaptation.
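DAN matches source and target feature distributions with a multi-kernel MMD penalty on the task-specific layers. Below is a much-simplified sketch of that idea, assuming PyTorch, a single Gaussian kernel, and a biased estimate rather than the paper's multi-kernel, linear-time estimator.

```python
import torch

def gaussian_mmd(source_feats, target_feats, sigma=1.0):
    """Simplified (single Gaussian kernel, biased) MMD estimate between two
    feature batches; DAN itself uses a multi-kernel variant applied to
    several task-specific layers."""
    x = torch.cat([source_feats, target_feats], dim=0)
    # Pairwise squared distances and the Gaussian kernel matrix.
    sq_dists = torch.cdist(x, x).pow(2)
    k = torch.exp(-sq_dists / (2 * sigma ** 2))
    n = source_feats.size(0)
    k_ss = k[:n, :n].mean()   # source-source
    k_tt = k[n:, n:].mean()   # target-target
    k_st = k[:n, n:].mean()   # source-target
    return k_ss + k_tt - 2 * k_st

# During training this term is added to the usual source classification loss:
# loss = cross_entropy(source_logits, source_labels) + lam * gaussian_mmd(fs, ft)
```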

Link to the paper: paperweekly.site/papers

8. Simultaneous Deep Transfer Across Domains and Tasks

Recommended by @jindongwang

#Transfer Learning

Traditional deep transfer learning methods only perform domain confusion. This paper adds task transfer as well, i.e., it explicitly takes the similarity between categories into account.
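As a rough illustration of the category-similarity idea, here is a sketch assuming PyTorch and taking liberties with the paper's exact recipe: average the softened source predictions per class to form "soft labels", then train on the few labeled target examples against those soft labels so that between-class structure carries over.

```python
import torch
import torch.nn.functional as F

def class_soft_labels(source_logits, source_labels, num_classes, T=2.0):
    """Average softened source prediction per class; row c encodes how
    similar class c looks to every other class (hypothetical helper)."""
    probs = F.softmax(source_logits / T, dim=1)
    table = torch.zeros(num_classes, num_classes)
    for c in range(num_classes):
        table[c] = probs[source_labels == c].mean(dim=0)
    return table

def soft_label_loss(target_logits, target_labels, soft_labels, T=2.0):
    """Cross-entropy between the target network's softened predictions and
    the per-class soft labels, transferring between-class similarity."""
    log_probs = F.log_softmax(target_logits / T, dim=1)
    return -(soft_labels[target_labels] * log_probs).sum(dim=1).mean()
```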

Link to the paper: paperweekly.site/papers

9. A Unified Framework for Metric Transfer Learning

Recommended by @jindongwang

#Transfer Learning

The authors are from Nanyang Technological University in Singapore, led by Sinno Jialin Pan, first author of "A Survey on Transfer Learning". The paper is relatively new and worth reading.

Link to the paper: paperweekly.site/papers

10. Adversarial Discriminative Domain Adaptation

Recommended by @corenel

#Domain Adaptation

ADDA summarizes the overall architecture and general framework of the domain adaptation (DA) field.

Link to the paper: paperweekly.site/papers

Code link:

github.com/erictzeng/ad

github.com/corenel/pyto

11. Correlation Alignment by Riemannian Metric for Domain Adaptation

Recommended by @jindongwang

#Domain Adaptation

A relatively recent piece of work, but not especially innovative: it simply replaces the distance metric in the existing CORAL work with a metric in Riemannian space.
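For context, the CORAL loss that this work starts from aligns second-order statistics. Below is a minimal sketch assuming the Deep CORAL formulation; the paper's contribution amounts to replacing the Frobenius distance in the return line with a geodesic distance between the two covariances on the Riemannian manifold of symmetric positive definite matrices.

```python
import torch

def coral_loss(source_feats, target_feats):
    """Classic (Deep) CORAL: penalize the squared Frobenius distance between
    the source and target feature covariance matrices. The paper discussed
    here keeps this idea but measures the distance between the covariances
    with a geodesic metric on the SPD manifold instead."""
    def covariance(x):
        x = x - x.mean(dim=0, keepdim=True)
        return x.t() @ x / (x.size(0) - 1)
    d = source_feats.size(1)
    cs = covariance(source_feats)
    ct = covariance(target_feats)
    return ((cs - ct) ** 2).sum() / (4 * d * d)
```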

Link to the paper: paperweekly.site/papers

12. Understanding How Feature Structure Transfers in Transfer Learning

Recommended by @jindongwang

#Representation Learning

A recent IJCAI 2017 paper on understanding how feature structure transfers in transfer learning. With two big names, Qiang Yang and Dacheng Tao, among the authors, the paper is certainly solid.

Link to the paper: paperweekly.site/papers

13. Associative Domain Adaptation

Recommended by @corenel

#Deep Learning

Compared with ADDA, it greatly improves the performance of DA and is worth reading.

Link to the paper: paperweekly.site/papers

Code link: github.com/haeusser/lea

14. Learning to Transfer

Recommended by @jindongwang

#Transfer Learning

A relatively new research direction in transfer learning: combining transfer learning with incremental learning. A pioneering piece of work; recommended reading.

Link to the paper: paperweekly.site/papers

This article is recommended by the AI academic community PaperWeekly. The community currently covers research areas such as natural language processing, computer vision, artificial intelligence, machine learning, data mining, and information retrieval. Click to join the community now!

About PaperWeekly

PaperWeekly is an academic platform that recommends, interprets, discusses, and reports on cutting-edge AI papers. If you research or work in the AI field, you are welcome to click "Communication Group" in the menu of our WeChat official account; an assistant will add you to the PaperWeekly discussion group.

WeChat public number: PaperWeekly

Sina Weibo: @PaperWeekly