autoPET challenge

Repository for code associated with the autoPET machine learning challenge:
autopet.grand-challenge.org

If you use the data associated with this challenge, please cite:

Gatidis S, Kuestner T. A whole-body FDG-PET/CT dataset with manually annotated tumor lesions (FDG-PET-CT-Lesions) [Dataset]. The Cancer Imaging Archive, 2022. DOI: 10.7937/gkr0-xv29

Data conversion

Scripts for converting the database between the DICOM, NIfTI, HDF5 and MHA formats.
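
As a rough illustration of one such conversion (a minimal sketch, not the repository's actual script), the snippet below reads a DICOM series and writes it out as a single NIfTI volume using SimpleITK; the directory and file names are placeholders.

```python
# Minimal sketch (not the repository's conversion script): read one DICOM series
# and write it as a NIfTI volume with SimpleITK. Paths are placeholders.
import SimpleITK as sitk

def dicom_series_to_nifti(dicom_dir: str, out_path: str) -> None:
    """Load all slices of a DICOM series and save them as a single NIfTI volume."""
    reader = sitk.ImageSeriesReader()
    slice_files = reader.GetGDCMSeriesFileNames(dicom_dir)  # sorted slice file names
    reader.SetFileNames(slice_files)
    volume = reader.Execute()          # 3D image; spacing/origin/direction taken from DICOM
    sitk.WriteImage(volume, out_path)  # output format inferred from the .nii.gz extension

if __name__ == "__main__":
    dicom_series_to_nifti("patient001/PET", "patient001_PET.nii.gz")
```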

nnUNet baseline

Baseline model for lesion segmentation: In this baseline, the nnUNet framework (https://github.com/MIC-DKFZ/nnUNet) was trained in its 3D full-resolution (3d_fullres) configuration on 16 GB of VRAM. PET (SUV) and resampled CT volumes served as model input. The number of epochs was set to 1,000, the initial learning rate to 1e-4, and training was performed with 5-fold cross-validation.
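
nnUNet expects one NIfTI file per input channel, named <case>_0000.nii.gz, <case>_0001.nii.gz, and so on. The sketch below shows how the dual-channel PET/CT input could be staged into that layout; the channel ordering, case naming and all paths are assumptions, not the challenge's actual preprocessing script.

```python
# Hedged sketch (assumed layout, not the challenge's actual script): nnUNet reads
# one NIfTI file per input channel, named <case>_0000.nii.gz, <case>_0001.nii.gz, ...
# Channel order (0: PET SUV, 1: resampled CT) and all paths below are assumptions.
import shutil
from pathlib import Path

def stage_case(suv_nifti: Path, ct_nifti: Path, case_id: str, images_tr: Path) -> None:
    """Copy one patient's SUV and resampled CT volumes into nnUNet's imagesTr folder."""
    images_tr.mkdir(parents=True, exist_ok=True)
    shutil.copy(suv_nifti, images_tr / f"{case_id}_0000.nii.gz")  # channel 0: PET (SUV)
    shutil.copy(ct_nifti, images_tr / f"{case_id}_0001.nii.gz")   # channel 1: resampled CT

if __name__ == "__main__":
    stage_case(
        Path("patient001/SUV.nii.gz"),   # placeholder paths
        Path("patient001/CTres.nii.gz"),
        "petct_0001",
        Path("nnUNet_raw_data/Task001_autoPET/imagesTr"),
    )
```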

MONAI uNet baseline

Baseline model for lesion segmentation: In this proof-of-concept model, a standard 3D UNet as provided by the MONAI framework (https://monai.io) was adapted to dual-channel input (PET (SUV) and resampled CT volumes). Input patches were of size (128, 128, 32), the batch size was 12, the learning rate 1e-4 with the Adam optimizer, and the maximum number of epochs 800.
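
The following minimal sketch shows how such a dual-channel MONAI UNet could be instantiated. The dual-channel input, patch size (128, 128, 32), batch size 12 and Adam with learning rate 1e-4 follow the description above; the feature-map widths, strides and the Dice loss are assumptions not specified in this README.

```python
# Hedged sketch of the described setup. Dual-channel input, patch size, batch size,
# Adam and lr 1e-4 follow the text above; feature widths, strides and the Dice loss
# are assumptions.
import torch
from monai.networks.nets import UNet
from monai.losses import DiceLoss

model = UNet(
    spatial_dims=3,
    in_channels=2,                    # channel 0: PET (SUV), channel 1: resampled CT
    out_channels=2,                   # background vs. lesion
    channels=(16, 32, 64, 128, 256),  # assumed feature-map widths
    strides=(2, 2, 2, 2),
    num_res_units=2,
)
loss_fn = DiceLoss(to_onehot_y=True, softmax=True)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Training would iterate over (128, 128, 32) patches with batch size 12 for up to
# 800 epochs; a batch size of 2 is used here only to keep the smoke test light.
patches = torch.randn(2, 2, 128, 128, 32)   # (batch, channels, x, y, z)
logits = model(patches)                     # -> (2, 2, 128, 128, 32)
```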

References

Challenge: DOI
Database:
