17 Mar
14:00 Master's Defense (fully remote)
Evaluation of Self-Supervised Models for Out-of-Distribution and Low-Data Scenarios
Levy Gurgel Chaves
Advisors
Advisor: Sandra Eliza Fontes de Avila / Co-advisor: Eduardo Alves do Valle Júnior
Abstract
Self-supervised learning bridges the gap between supervised and unsupervised learning. The former leads to the most accurate models but requires human-annotated samples, while the latter exploits non-annotated samples but often yields disappointing accuracies. By synthesizing annotations for so-called pretext tasks, self-supervision can pre-train models on abundant pseudo-labels before fine-tuning them for the downstream (target) task. This work assesses five self-supervision schemes against a supervised baseline on medical and natural image classification tasks. We consider challenging cases in which few training samples (1% and 10% of the original training set) are available, and in which test sets contain distribution shifts unknown at training time (out-of-distribution data). We cover four medical classification tasks (skin lesion screening, breast cancer, brain tumor, and histopathology samples) and two general-purpose tasks (animal and vehicle classification). Our results suggest that supervised pre-training on ImageNet is advantageous in low-data and out-of-distribution scenarios when the classes of the target task are known at pre-training time, i.e., when the target classes are a subset of the pre-training classes. In medical applications, self-supervised pre-training improves mean performance over the baseline on almost all datasets. However, no self-supervised pre-training method showed a consistent advantage over the baseline across all datasets; the best one varied with the target task. We hypothesize that this behavior occurs because the original self-supervised pre-training schemes have difficulty capturing the details of the low inter-class and intra-class variation found in medical applications.
Examining Committee
Sandra Eliza Fontes de Avila IC/UNICAMP
Cristina Nader Vasconcelos Google
Hélio Pedrini IC/UNICAMP
Mariana Pinheiro Bento UCalgary
Esther Luna Colombini IC/UNICAMP