Lesion Labeling Analysis in LR Assisted by SAM-DR and Segmentation with YOLO v8-obb

Authors

  • Edgar Gilberto Platas-Campero, CIC
  • Raquel Díaz-Hernández, Instituto Nacional de Astrofísica
  • Leopoldo Altamirano-Robles, Instituto Nacional de Astrofísica

DOI:

https://doi.org/10.13053/cys-29-1-5436

Keywords:

Leukemic retinopathy, diabetic retinopathy, YOLO, SAM, dual transfer learning

Abstract

Leukemic retinopathy (LR) presents significant challenges for automated diagnosis due to the scarcity of accurately labeled images. This work addresses these challenges with deep learning techniques, using You Only Look Once (YOLO) for lesion detection and the Segment Anything Model (SAM) for automatic labeling. The results show that the Segment Anything Model-Diabetic Retinopathy (SAM-DR) outperforms manual labeling, especially in detecting Hemorrhages (HE), with an mAP50 of 0.804. Furthermore, the comparison between transfer learning (TL) and dual transfer learning (DTL) reveals that DTL improves lesion detection across all classes. This automated approach not only enhances accuracy in lesion segmentation but also serves as a valuable asset in scenarios where specialist labeling is limited and data is scarce, enabling the acquired knowledge to be effectively leveraged and transferred to similar pathologies.
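
A minimal sketch of how such a pipeline could be assembled in Python, assuming the ultralytics and segment_anything packages; the file names (sam_vit_h.pth, fundus_example.jpg, dr_lesions.yaml, lr_lesions.yaml) are placeholders, the public SAM ViT-H checkpoint stands in for the paper's SAM-DR variant, and the DR-then-LR fine-tuning order is inferred from the abstract rather than taken from the paper:

    # Sketch of the two-stage pipeline: SAM-assisted labeling, then YOLOv8-obb
    # detection trained with dual transfer learning. Paths and dataset names are
    # placeholders, not taken from the paper.
    import cv2
    import numpy as np
    from segment_anything import sam_model_registry, SamPredictor
    from ultralytics import YOLO

    # --- Stage 1: SAM-assisted automatic labeling ---
    # The public SAM ViT-H checkpoint stands in here for the fine-tuned SAM-DR model.
    sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h.pth")
    predictor = SamPredictor(sam)

    image = cv2.cvtColor(cv2.imread("fundus_example.jpg"), cv2.COLOR_BGR2RGB)
    predictor.set_image(image)

    # A rough point prompt on a suspected hemorrhage is refined by SAM into a mask.
    point = np.array([[512, 340]])
    masks, scores, _ = predictor.predict(
        point_coords=point,
        point_labels=np.array([1]),   # 1 marks the prompt as foreground
        multimask_output=False,
    )
    mask = masks[0].astype(np.uint8)

    # Reduce the mask to a rotated box and normalize its corners, matching the
    # YOLOv8-obb label format (class index plus four normalized corner points).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    rect = cv2.minAreaRect(max(contours, key=cv2.contourArea))
    corners = cv2.boxPoints(rect)
    h_img, w_img = mask.shape
    corners_norm = corners / np.array([w_img, h_img], dtype=np.float32)

    # --- Stage 2: dual transfer learning with YOLOv8-obb ---
    # First adapt the pretrained OBB weights to diabetic-retinopathy lesions,
    # then transfer again to the smaller leukemic-retinopathy set.
    dr_model = YOLO("yolov8n-obb.pt")
    dr_model.train(data="dr_lesions.yaml", epochs=100)

    lr_model = YOLO("runs/obb/train/weights/best.pt")  # default save path; adjust per run
    lr_model.train(data="lr_lesions.yaml", epochs=100)

Reducing each SAM mask to a minimum-area rotated rectangle matches the oriented-box labels that YOLO v8-obb expects, which is why cv2.minAreaRect and cv2.boxPoints are used rather than an axis-aligned bounding box.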

Published

2025-03-25

Issue

Vol. 29 No. 1 (2025)

Section

Articles of the Thematic Section