Skin lesion segmentation using a semi-supervised U-NetSC model with an adaptive loss function

Abstract:

Skin lesion segmentation is a crucial step in cancer detection. Deep learning has shown promising results for lesion segmentation. However, the performance of these models depends on access to a large number of training samples with pixel-level annotations. Employing a semi-supervised approach reduces the need for a large number of annotated samples. Accordingly, a semi-supervised strategy is proposed based on the high correlation between the segmentation and classification tasks. The U-Net Segmentation and Classification model (U-NetSC) is a unified architecture containing segmentation and classification modules. The classification module uses feature maps from the last layer of the segmentation model to increase the collaboration between the two tasks. U-NetSC can be trained with only class-level ground truth, or with both class-level and pixel-level ground truth, using an adaptive loss function. U-NetSC achieves ~2%, ~2%, ~3%, and ~1% improvement in Jaccard index, Dice coefficient, precision, and accuracy, respectively, compared with a supervised attention-gated U-Net model. Clinical relevance - The paper proposes an automatic skin lesion segmentation model trained in a semi-supervised manner. Training the segmentation model is based on a combination of class-level and pixel-level information without requiring a large number of labeled samples.
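
The abstract describes an adaptive loss that lets U-NetSC learn both from samples carrying only class-level labels and from samples with full pixel-level masks. Below is a minimal sketch of that idea in PyTorch; the function name, the has_mask flag, and the equal loss weights are illustrative assumptions, not the paper's actual formulation.

    # Hypothetical sketch of the adaptive-loss idea: every sample contributes a
    # classification loss, while the pixel-level segmentation loss is applied only
    # to samples that come with a ground-truth mask. Names and weights are
    # assumptions for illustration, not taken from the paper.
    import torch
    import torch.nn as nn

    def adaptive_loss(seg_logits, cls_logits, masks, labels, has_mask,
                      seg_weight=1.0, cls_weight=1.0):
        """Combine class-level and pixel-level losses for a mixed batch.

        seg_logits: (B, 1, H, W) raw segmentation outputs
        cls_logits: (B, num_classes) raw classification outputs
        masks:      (B, 1, H, W) float ground-truth masks (zeros where unavailable)
        labels:     (B,) class-level ground truth
        has_mask:   (B,) bool, True for samples with pixel-level annotation
        """
        # Class-level term: defined for every sample in the batch.
        cls_loss = nn.functional.cross_entropy(cls_logits, labels)

        if has_mask.any():
            # Pixel-level term: only on the subset of samples that carry masks.
            seg_loss = nn.functional.binary_cross_entropy_with_logits(
                seg_logits[has_mask], masks[has_mask]
            )
        else:
            seg_loss = seg_logits.new_zeros(())

        return seg_weight * seg_loss + cls_weight * cls_loss

With a scheme of this kind, batches that mix annotated and mask-free samples still yield a valid gradient, since the classification term is always defined even when the segmentation term is skipped.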
Date of Conference: 11-15 July 2022
Date Added to IEEE Xplore: 08 September 2022
PubMed ID: 36086077
Publisher: IEEE
Conference Location: Glasgow, Scotland, United Kingdom
