Paper Reading [TPAMI-2022] Meta-Transfer Learning Through Hard Tasks

Paper search (studyai.com)

Search for this paper: http://www.studyai.com/search/whole-site/?q=Meta-Transfer+Learning+Through+Hard+Tasks

Keywords

Task analysis; Adaptation models; Training; Feature extraction; Training data; Data models; Measurement; Few-shot learning; transfer learning; meta learning; image classification

Machine learning; computer vision

Image classification; semi-supervised learning; few-shot learning; meta-learning; transfer learning

Abstract

Meta-learning has been proposed as a framework to address the challenging few-shot learning setting.

The key idea is to leverage a large number of similar few-shot tasks in order to learn how to adapt a base-learner to a new task for which only a few labeled samples are available.
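
To make this episodic setup concrete, here is a minimal PyTorch-style sketch (my own illustration; the function names, data layout, and hyper-parameters are assumptions, not taken from the paper) of sampling an N-way K-shot task and adapting a base-learner on its support set:

```python
import random
import torch
import torch.nn.functional as F

def sample_episode(data_by_class, n_way=5, k_shot=1, q_query=15):
    """Draw one few-shot task: n_way classes with k_shot labeled support
    images and q_query query images each. data_by_class maps a class id
    to a tensor holding that class's images."""
    classes = random.sample(list(data_by_class), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        imgs = data_by_class[cls]
        idx = torch.randperm(len(imgs))
        support += [(imgs[i], label) for i in idx[:k_shot]]
        query += [(imgs[i], label) for i in idx[k_shot:k_shot + q_query]]
    return support, query

def adapt_base_learner(base_learner, support, steps=5, lr=1e-2):
    """Inner loop: fine-tune the base-learner on the few support samples."""
    opt = torch.optim.SGD(base_learner.parameters(), lr=lr)
    xs = torch.stack([x for x, _ in support])
    ys = torch.tensor([y for _, y in support])
    for _ in range(steps):
        loss = F.cross_entropy(base_learner(xs), ys)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return base_learner
```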

As deep neural networks (DNNs) tend to overfit when trained on only a few samples, typical meta-learning models use shallow neural networks, which limits their effectiveness.

In order to achieve top performance, some recent works have tried to use DNNs pre-trained on large-scale datasets, but mostly in straightforward ways, e.g., (1) taking their weights as a warm start for meta-training, and (2) freezing their convolutional layers as the feature extractor of the base-learner.
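
For concreteness, these two straightforward uses look roughly like this in PyTorch (a hedged sketch; the torchvision ResNet-18 is a stand-in backbone, not necessarily the architecture used in the paper):

```python
import torch.nn as nn
from torchvision.models import resnet18, ResNet18_Weights  # stand-in backbone

# (1) Warm start: initialize with pre-trained weights, then meta-train all layers.
model = resnet18(weights=ResNet18_Weights.DEFAULT)

# (2) Frozen extractor: fix the convolutional layers and train only a new
#     classification head as the base-learner.
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 5)  # 5-way head; only this is trained
```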

In this paper, we propose a novel approach called meta-transfer learning (MTL), which learns to transfer the weights of a deep NN for few-shot learning tasks.

Specifically, meta refers to training on multiple tasks, and transfer is achieved by learning scaling and shifting functions of the DNN weights (and biases) for each task.
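
The scaling-and-shifting operation can be pictured as a thin wrapper around a frozen pre-trained convolution: the weights stay fixed, and only lightweight per-channel scale and shift parameters are learned. The following is a minimal PyTorch sketch of that idea, not the authors' released implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SSConv2d(nn.Module):
    """Frozen conv layer with learnable per-channel Scaling and Shifting (SS)."""
    def __init__(self, conv: nn.Conv2d):
        super().__init__()
        self.conv = conv
        for p in self.conv.parameters():
            p.requires_grad = False  # pre-trained weights stay frozen
        out_ch = conv.out_channels
        self.scale = nn.Parameter(torch.ones(out_ch, 1, 1, 1))  # init to identity
        self.shift = nn.Parameter(torch.zeros(out_ch))          # init to zero

    def forward(self, x):
        w = self.conv.weight * self.scale  # scale the frozen weights per channel
        b = self.conv.bias + self.shift if self.conv.bias is not None else self.shift
        return F.conv2d(x, w, b,
                        stride=self.conv.stride,
                        padding=self.conv.padding,
                        dilation=self.conv.dilation,
                        groups=self.conv.groups)
```

Because only the scale and shift vectors are trained, the number of task-specific parameters is tiny compared to the frozen backbone, which is what allows a deep network to be used without overfitting on a handful of samples.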

To further boost the learning efficiency of MTL, we introduce the hard task (HT) meta-batch scheme as an effective learning curriculum of few-shot classification tasks.
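
One plausible way to realize such a curriculum, sketched below under my own assumptions about the bookkeeping (the paper's exact sampling procedure may differ), is to record per-class accuracy within the current meta-batch and re-sample new tasks that concentrate on the failure classes:

```python
import random
from collections import defaultdict

def pick_hard_classes(task_records, k_hard):
    """task_records: (class_id, query_accuracy) pairs collected over the
    current meta-batch. Returns the k_hard classes with lowest mean accuracy."""
    per_class = defaultdict(list)
    for cls, acc in task_records:
        per_class[cls].append(acc)
    mean_acc = {c: sum(a) / len(a) for c, a in per_class.items()}
    return sorted(mean_acc, key=mean_acc.get)[:k_hard]

def resample_hard_task(all_classes, task_records, n_way=5, k_hard=2):
    """Compose a new 'hard' task from failure classes plus random fillers."""
    hard = pick_hard_classes(task_records, k_hard)
    pool = [c for c in all_classes if c not in hard]
    return hard + random.sample(pool, n_way - len(hard))
```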

We conduct experiments for five-class few-shot classification tasks on three challenging benchmarks, miniImageNet, tieredImageNet, and Fewshot-CIFAR100 (FC100), in both supervised and semi-supervised settings.

Extensive comparisons to related works validate that our MTL approach trained with the proposed HT meta-batch scheme achieves top performance.

An ablation study also shows that both components contribute to fast convergence and high accuracy…

Authors

Qianru Sun, Yaoyao Liu, Zhaozheng Chen, Tat-Seng Chua, Bernt Schiele

Reposted from blog.csdn.net/weixin_42155685/article/details/124230335