Publication:
Deep Learning-Based Segmentation of Head and Neck Organs-at-Risk with Clinical Partially Labeled Data

dc.affiliation.dpto: UC3M. Departamento de Bioingeniería
dc.affiliation.grupoinv: UC3M. Grupo de Investigación: BSEL - Laboratorio de Ciencia e Ingeniería Biomédica
dc.contributor.author: Cubero Gutierrez, Lucia
dc.contributor.author: Castelli, Joël
dc.contributor.author: Simon, Antoine
dc.contributor.author: De Crevoisier, Renaud
dc.contributor.author: Acosta, Oscar
dc.contributor.author: Pascau González-Garzón, Javier
dc.contributor.funder: Ministerio de Ciencia, Innovación y Universidades (España)
dc.date.accessioned: 2023-06-01T14:35:10Z
dc.date.available: 2023-06-01T14:35:10Z
dc.date.issued: 2022-11-15
dc.description.abstract: Radiotherapy is one of the main treatments for localized head and neck (HN) cancer. To design a personalized treatment with reduced radio-induced toxicity, accurate delineation of organs at risk (OAR) is a crucial step. Manual delineation is time-consuming, labor-intensive, and observer-dependent. Deep learning (DL) based segmentation has proven to overcome some of these limitations, but it requires large databases of homogeneously contoured image sets for robust training. However, these are not easily obtained from standard clinical protocols, as the OARs delineated may vary depending on the patient's tumor site and specific treatment plan, resulting in incomplete or partially labeled data. This paper presents a solution to train a robust DL-based automated segmentation tool exploiting a clinical partially labeled dataset. We propose a two-step workflow for OAR segmentation: first, we developed longitudinal OAR-specific 3D segmentation models for pseudo-contour generation, completing the missing contours for some patients; with all OARs available, we then trained a multi-class 3D convolutional neural network (nnU-Net) for final OAR segmentation. Results obtained on 44 independent datasets showed superior performance of the proposed methodology for the segmentation of fifteen OARs, with an average Dice similarity coefficient and surface Dice similarity coefficient of 80.59% and 88.74%, respectively. We demonstrated that the model can be straightforwardly integrated into the clinical workflow for standard and adaptive radiotherapy.
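As an illustration of the evaluation metric named in the abstract, the following is a minimal sketch (not the authors' code) of the volumetric Dice similarity coefficient between a predicted and a reference binary OAR mask, written in Python with NumPy; the toy 3D masks and array shapes are assumptions for demonstration only. The surface Dice similarity coefficient also reported above is the boundary-based counterpart, computed within a distance tolerance rather than over voxel volumes.

import numpy as np

def dice_coefficient(pred, ref):
    # DSC = 2 * |A intersect B| / (|A| + |B|) for binary 3D masks
    pred = pred.astype(bool)
    ref = ref.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    denom = pred.sum() + ref.sum()
    return 1.0 if denom == 0 else 2.0 * intersection / denom

# Toy example: two overlapping cubic masks (hypothetical data, not from the paper)
pred = np.zeros((16, 16, 16), dtype=bool)
ref = np.zeros((16, 16, 16), dtype=bool)
pred[4:10, 4:10, 4:10] = True
ref[5:11, 4:10, 4:10] = True
print(f"DSC = {dice_coefficient(pred, ref):.3f}")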
dc.description.sponsorship: Research supported by projects PI18/01625 and AC20/00102 (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III, Asociación Española Contra el Cáncer, and the European Regional Development Fund “Una manera de hacer Europa”), project PerPlanRT (ERA PerMed), and Rennes Métropole. GPUs were donated through the NVIDIA Applied Research Accelerator Program.
dc.identifier.bibliographicCitation: Cubero, L., Castelli, J., Simon, A., De Crevoisier, R., Acosta, O., & Pascau, J. (2022). Deep Learning-Based Segmentation of Head and Neck Organs-at-Risk with Clinical Partially Labeled Data. Entropy, 24(11), 1661.
dc.identifier.doi: 10.3390/e24111661
dc.identifier.issn: 1099-4300
dc.identifier.publicationfirstpage: 1
dc.identifier.publicationissue: 11 (article 1661)
dc.identifier.publicationlastpage: 15
dc.identifier.publicationtitle: Entropy
dc.identifier.publicationvolume: 24
dc.identifier.uri: https://hdl.handle.net/10016/37409
dc.identifier.uxxi: AR/0000033092
dc.language.iso: eng
dc.publisher: MDPI
dc.relation.projectID: Gobierno de España. PI18/01625
dc.relation.projectID: Gobierno de España. AC20/00102
dc.rights: © 2022 by the authors. Licensee MDPI, Basel, Switzerland.
dc.rights: Attribution 3.0 Spain (CC BY 3.0 ES)
dc.rights.accessRights: open access
dc.rights.uri: http://creativecommons.org/licenses/by/3.0/es/
dc.subject.eciencia: Biology and Biomedicine
dc.subject.eciencia: Mechanical Engineering
dc.subject.eciencia: Materials
dc.subject.other: Deep learning (DL)
dc.subject.other: Automated segmentation
dc.subject.other: Head and neck radiotherapy
dc.subject.other: Organs-at-risk
dc.subject.other: Partially labeled data
dc.subject.other: Longitudinal data
dc.title: Deep Learning-Based Segmentation of Head and Neck Organs-at-Risk with Clinical Partially Labeled Data
dc.type: research article
dc.type.hasVersion: VoR
dspace.entity.type: Publication
Files
Original bundle
Name: Deep_E_2022.pdf
Size: 1.85 MB
Format: Adobe Portable Document Format