
Budget Active Learning for Deep Networks

EasyChair Preprint no. 2686

7 pages · Date: February 17, 2020

Abstract

In the digital world, unlabeled data is relatively easy to acquire but expensive to label, even with the help of domain experts. At the same time, state-of-the-art Deep Learning methods depend on large labeled datasets for training. Recent work on Deep Learning has focused on the use of Active Learning with uncertainty for model training. Although most uncertainty-based Active Learning selection strategies are very effective, they fail to take the informativeness of unlabeled instances into account and are prone to querying outliers. To address these challenges, we propose a Budget Active Learning (BAL) algorithm for Deep Networks that advances active learning methods in three ways. First, we exploit both the uncertainty and the diversity of instances using uncertainty and correlation evaluation metrics. Second, we use a budget annotator to label high-confidence instances, and simultaneously update the selection strategy. Third, we incorporate Active Learning into Deep Networks and perform classification with untrained and pretrained models on two classical datasets and a plant-seedling dataset while minimizing the prediction loss. Experimental results on the three datasets of varying sizes demonstrate the efficacy of the proposed BAL method over other state-of-the-art Deep Active Learning methods.
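To make the selection strategy in the abstract concrete, below is a minimal sketch of one BAL-style query round. The paper's actual metrics and thresholds are in the PDF; this sketch assumes predictive entropy as the uncertainty metric, Pearson correlation between feature vectors as the diversity metric, and a fixed confidence threshold for the budget annotator. All function names and parameters here are hypothetical illustrations, not the authors' implementation.

import numpy as np

def entropy_uncertainty(probs):
    # Predictive entropy of softmax outputs (higher = more uncertain).
    return -np.sum(probs * np.log(probs + 1e-12), axis=1)

def bal_select(probs, features, batch_size, confidence_threshold=0.95):
    # One BAL-style round: query uncertain-but-diverse instances for human
    # labeling, and auto-label high-confidence instances (budget annotator).
    uncertainty = entropy_uncertainty(probs)
    max_conf = probs.max(axis=1)

    # Budget annotator: accept the model's own prediction as the label
    # when its confidence exceeds the (assumed) threshold.
    auto_idx = np.where(max_conf >= confidence_threshold)[0]
    auto_labels = probs[auto_idx].argmax(axis=1)

    # Greedy uncertainty + diversity selection: walk candidates in order of
    # decreasing uncertainty, skipping any that correlate strongly with an
    # instance already selected in this round.
    auto_set = set(auto_idx.tolist())
    candidates = [i for i in np.argsort(-uncertainty) if i not in auto_set]
    selected = []
    for i in candidates:
        if len(selected) == batch_size:
            break
        if selected:
            sims = [abs(np.corrcoef(features[i], features[j])[0, 1])
                    for j in selected]
            if max(sims) > 0.9:  # too redundant with an already-chosen point
                continue
        selected.append(i)
    return selected, auto_idx, auto_labels

The selected indices would be sent to the oracle for annotation, while the auto-labeled instances are added to the training set without human cost, which is how a budget annotator reduces labeling expense per round.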

Keyphrases: active learning, deep learning, machine learning

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@Booklet{EasyChair:2686,
  author = {Patrick Gikunda and Nicolas Jouandeau},
  title = {Budget Active Learning for Deep Networks},
  howpublished = {EasyChair Preprint no. 2686},
  year = {EasyChair, 2020}}