Calculating a 'Detection Rate' | YOLOv5 #13450
👋 Hello @kyrangraves, thank you for your interest in YOLOv5 🚀! Your project detecting coral morphologies sounds fascinating 🌊🐠. Regarding your query, modifying the evaluation metric to ignore classification while maintaining IoU thresholds is achievable but requires some custom edits. If you'd like assistance with this, we recommend sharing a detailed description of the modifications you're planning to make.

If this is a ❓ Question rather than a 🐛 Bug Report, please also share any additional information, such as your dataset structure, training logs, and validations conducted so far. This will ensure we can provide the most accurate guidance. Make sure to also check our training tips for best practices to enhance experimental results.

Requirements

Make sure you are using Python>=3.8.0 with all dependencies installed. YOLOv5 supports various run environments, including Jupyter notebooks with GPU, as well as cloud platforms like Google Cloud and AWS. These come preconfigured with dependencies like CUDA, cuDNN, Python, and PyTorch.

Continuous Integration

If you're encountering unexpected training or evaluation results, consider checking the status of YOLOv5's CI tests. Green status indicates that all training, validation, inference, export, and benchmark tests are passing on the latest commit.

Finally, this is an automated response 🤖, but an Ultralytics engineer will review your issue and assist as soon as possible. Good luck with your modifications! 🚀
@kyrangraves hi Kyran,

You're correct that YOLOv5's default evaluation in val.py counts a true positive only when both the IoU threshold and the class prediction are satisfied, so computing a class-agnostic detection rate requires modifying the matching logic to ignore class labels.

Alternatively, you can adjust the evaluation pipeline to compute this metric post hoc by analyzing the saved prediction outputs against your ground-truth labels.

For customization guidance, you may find it helpful to review YOLOv5's validation implementation. If possible, share your modifications with the community; it might help others tackling similar challenges!

All the best with detecting coral morphologies!
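As a starting point, the post-hoc approach described above could be sketched roughly as follows. This is a minimal illustration, not code from the YOLOv5 repository: the function names (`iou_matrix`, `detection_rate`) are our own, boxes are assumed to be in `[x1, y1, x2, y2]` pixel format, and the greedy one-to-one matching is a simplification of the matching used inside `val.py`.

```python
# Hypothetical sketch of a class-agnostic "detection rate" computed outside
# val.py. Boxes are [x1, y1, x2, y2]; all names here are illustrative.
import numpy as np

def iou_matrix(gt, pred):
    """Pairwise IoU between ground-truth boxes (N,4) and predictions (M,4)."""
    gt, pred = np.asarray(gt, float), np.asarray(pred, float)
    x1 = np.maximum(gt[:, None, 0], pred[None, :, 0])
    y1 = np.maximum(gt[:, None, 1], pred[None, :, 1])
    x2 = np.minimum(gt[:, None, 2], pred[None, :, 2])
    y2 = np.minimum(gt[:, None, 3], pred[None, :, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_g = (gt[:, 2] - gt[:, 0]) * (gt[:, 3] - gt[:, 1])
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    return inter / (area_g[:, None] + area_p[None, :] - inter)

def detection_rate(gt_boxes, pred_boxes, iou_thres=0.5):
    """Fraction of ground-truth boxes matched by any prediction at
    IoU >= iou_thres, ignoring class labels entirely.
    Each prediction may be matched to at most one ground truth."""
    if len(gt_boxes) == 0:
        return 1.0  # nothing to detect
    if len(pred_boxes) == 0:
        return 0.0
    ious = iou_matrix(gt_boxes, pred_boxes)
    matched, used = 0, set()
    # Greedy: visit GTs in order of their best IoU, assign the best
    # still-unused prediction above the threshold.
    for g in np.argsort(-ious.max(axis=1)):
        for p in np.argsort(-ious[g]):
            if ious[g, p] < iou_thres:
                break
            if p not in used:
                used.add(p)
                matched += 1
                break
    return matched / len(gt_boxes)
```

Running this per image over the validation set and averaging (or pooling matched/total counts) would give the metric Kyran describes, without touching the class-aware logic in `val.py`.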
Hi All,
I have trained several YOLOv5 models with varying parameters to detect different coral morphologies from ROV imagery. I want to modify the val.py script to create an evaluation metric that determines the percentage of ground-truth objects that have been correctly detected, irrespective of whether the assigned label (classification) is correct. If I understand correctly, the out-of-the-box val.py script counts a true positive only when both your defined IoU threshold and a correct classification are met. Essentially, I want to remove the classification element of this.
Before I try, has anybody written a script like this before, or does anyone know of a simpler way of calculating this metric?
All the best - Kyran
Originally posted by @kyrangraves in #13449