Optimized Monitoring of Underwater Marine Waste Using Knowledge Distillation
Bushra Jalil; Luca Valcarenghi; Luca Maggiani
2024-01-01
Abstract
In this study, we present a method that uses knowledge distillation to create faster versions of deep architectures with fewer parameters for underwater marine waste classification. The proposed architectures, derived from VGG19, ResNet50, and DenseNet121, can independently classify images for remote monitoring with lower latency and improved performance. All deep models were trained on the JAMSTEC library of deep-sea images and compared with the original dense architectures, which have higher parameter counts. The proposed models, with approximately 25% of the parameters of the original architectures, improved both accuracy and inference time, reaching 70% classification accuracy through knowledge distillation.
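To illustrate the general technique named in the abstract, the sketch below shows a minimal teacher-student knowledge distillation training step in PyTorch. The temperature, loss weighting, student backbone (ResNet18 standing in for the reduced-parameter variants), and two-class setup are illustrative assumptions; the paper's actual architectures, hyperparameters, and class set are not specified here.

```python
# Minimal knowledge-distillation sketch: a large pretrained teacher (ResNet50)
# supervises a smaller student via soft targets plus hard-label loss.
# All hyperparameters below are assumptions, not values from the paper.
import torch
import torch.nn.functional as F
from torchvision import models

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Weighted sum of soft-target KL divergence and hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients for the temperature-softened targets
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Hypothetical two-class setup (e.g. waste vs. non-waste).
num_classes = 2

# Teacher: full ResNet50 with an adapted classification head.
teacher = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
teacher.fc = torch.nn.Linear(teacher.fc.in_features, num_classes)
teacher.eval()

# Student: a lighter backbone standing in for the reduced-parameter models.
student = models.resnet18(weights=None)
student.fc = torch.nn.Linear(student.fc.in_features, num_classes)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)

def train_step(images, labels):
    # Teacher predictions are frozen; only the student is updated.
    with torch.no_grad():
        teacher_logits = teacher(images)
    student_logits = student(images)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```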