
Added support by using oriented_box_iou_batch #1593

Merged
10 commits merged into roboflow:develop on Oct 31, 2024

Conversation

@PrakharJain1509 (Contributor) commented Oct 13, 2024

Add Support for Oriented Bounding Boxes in MeanAveragePrecision

Changes

  • Modified MeanAveragePrecision class to support oriented bounding boxes
  • Added integration tests for MeanAveragePrecision

Details

  • MeanAveragePrecision now handles ORIENTED_BOUNDING_BOXES as a metric target using oriented_box_iou_batch.
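
For illustration, a minimal usage sketch of the behaviour described above. It assumes the supervision metrics API of this period (a MetricTarget enum and the update()/compute() flow) and that OBB corners travel in the Detections data field under the "xyxyxyxy" key; anything not named in this PR is an assumption, not verified code.

```python
# Hedged sketch, not code from this PR: the enum member and data-key names are
# assumptions based on the supervision metrics API around this release.
import numpy as np
import supervision as sv
from supervision.metrics import MeanAveragePrecision, MetricTarget

# A single square expressed as four (x, y) corners, shape (N, 4, 2).
corners = np.array([[[0, 0], [10, 0], [10, 10], [0, 10]]], dtype=np.float32)

targets = sv.Detections(
    xyxy=np.array([[0, 0, 10, 10]], dtype=np.float32),
    class_id=np.array([0]),
    data={"xyxyxyxy": corners},  # assumed key for oriented-box corner coordinates
)
predictions = sv.Detections(
    xyxy=np.array([[0, 0, 10, 10]], dtype=np.float32),
    confidence=np.array([0.9]),
    class_id=np.array([0]),
    data={"xyxyxyxy": corners},
)

# With ORIENTED_BOUNDING_BOXES as the metric target, IoU is computed from the
# corner coordinates (via oriented_box_iou_batch) rather than from xyxy boxes.
map_metric = MeanAveragePrecision(metric_target=MetricTarget.ORIENTED_BOUNDING_BOXES)
result = map_metric.update(predictions, targets).compute()
print(result.map50_95)  # identical boxes, so this is expected to be close to 1.0
```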

@CLAassistant commented Oct 13, 2024

CLA assistant check
All committers have signed the CLA.

@PrakharJain1509 (Contributor, Author)

@LinasKo I have created this PR by making the required changes in the code according to issue #1562. Please review it and let me know if any changes are required.

@LinasKo (Collaborator) commented Oct 13, 2024

Hi @PrakharJain1509,

Thank you very much! I'll have a look and review it tomorrow morning. Thank you for your contribution!

@PrakharJain1509 (Contributor, Author)

> Hi @PrakharJain1509,
>
> Thank you very much! I'll have a look and review it tomorrow morning. Thank you for your contribution!

Sure,
Thanks

@PrakharJain1509 (Contributor, Author)

> Hi @PrakharJain1509,
>
> Thank you very much! I'll have a look and review it tomorrow morning. Thank you for your contribution!

@LinasKo Sir, any updates regarding this?

@LinasKo (Collaborator) commented Oct 14, 2024

Looking at it right now 😉

@LinasKo (Collaborator) commented Oct 14, 2024

Hi @PrakharJain1509 👋

I can only see one commit at the moment, and that does not include everything you mentioned. Is there some code you haven't pushed yet?

> Added integration tests for MeanAveragePrecision
> New tests cover various scenarios including perfect overlap, no overlap, and rotated boxes

Normally we ask contributors to make a Colab, as that makes the review process much faster - we don't need to create an additional environment where we can run and tweak the values, and we get a bit of code describing the current behaviour that we can look back at in the future. If you're adding well-rounded tests, the Colab is optional.

@PrakharJain1509 (Contributor, Author)

> Hi @PrakharJain1509 👋
>
> I can only see one commit at the moment, and that does not include everything you mentioned. Is there some code you haven't pushed yet?
>
> > Added integration tests for MeanAveragePrecision
> > New tests cover various scenarios including perfect overlap, no overlap, and rotated boxes
>
> Normally we ask contributors to make a Colab, as that makes the review process much faster - we don't need to create an additional environment where we can run and tweak the values, and we get a bit of code describing the current behaviour that we can look back at in the future. If you're adding well-rounded tests, the Colab is optional.

Are the changes I made so far correct, or are they also invalid?
I also thought that the function was already implemented, so I just had to use it in the MeanAveragePrecision class.
Please guide me on how to proceed and what steps I should take to resolve this.

@LinasKo (Collaborator) commented Oct 14, 2024

Sure! The steps so far are correct. Did you write any code for the tests like you said?

> Added integration tests for MeanAveragePrecision

If you did, the code is not visible in the Files Changed section. In that case, you have to commit the code and push it to your develop branch. It will be visible on the PR when you do.

@PrakharJain1509 (Contributor, Author)

> Sure! The steps so far are correct. Did you write any code for the tests like you said?
>
> > Added integration tests for MeanAveragePrecision
>
> If you did, the code is not visible in the Files Changed section. In that case, you have to commit the code and push it to your develop branch. It will be visible on the PR when you do.

Okay, so only the test code part is remaining?
Please give me some time; I will write that part and update you by tomorrow morning.
Thanks

@LinasKo (Collaborator) commented Oct 14, 2024

Unit tests are welcome, but typically not required. However, since you wrote in the PR description that unit tests are included, I expect the code I'm reviewing to include them 😄

What we really like is a Colab that shows how to use the new feature. It should run the MeanAveragePrecision metric on OBB results. For example, you can use yolo11n-obb, yolo11s-obb, yolo11m-obb and compare the results. You should also test what happens when you attempt to compute mAP for OBB detections that don't overlap, and for detections that are empty (created with sv.Detections.empty()).
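
To make the requested checks concrete, here is a hedged sketch of the kind of Colab cell being described. The image path, the choice of model weights, and the assumption that sv.Detections.from_ultralytics carries OBB corner data through to the metric are illustrative assumptions, not code from this PR.

```python
# Hedged sketch of the requested Colab checks; model weights and the OBB handling
# in from_ultralytics are assumptions, not verified against this PR.
import supervision as sv
from ultralytics import YOLO
from supervision.metrics import MeanAveragePrecision, MetricTarget

IMAGE_PATH = "sample.jpg"  # hypothetical test image

# Compare a small model's predictions against a larger model used as a stand-in target.
predictions = sv.Detections.from_ultralytics(YOLO("yolo11n-obb.pt")(IMAGE_PATH)[0])
targets = sv.Detections.from_ultralytics(YOLO("yolo11m-obb.pt")(IMAGE_PATH)[0])

map_metric = MeanAveragePrecision(metric_target=MetricTarget.ORIENTED_BOUNDING_BOXES)
print(map_metric.update(predictions, targets).compute())

# Edge case: empty predictions against real targets should degrade gracefully (mAP of 0).
empty_metric = MeanAveragePrecision(metric_target=MetricTarget.ORIENTED_BOUNDING_BOXES)
print(empty_metric.update(sv.Detections.empty(), targets).compute())
```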

@PrakharJain1509 (Contributor, Author)

> Unit tests are welcome, but typically not required. However, since you wrote in the PR description that unit tests are included, I expect the code I'm reviewing to include them 😄
>
> What we really like is a Colab that shows how to use the new feature. It should run the MeanAveragePrecision metric on OBB results. For example, you can use yolo11n-obb, yolo11s-obb, yolo11m-obb and compare the results. You should also test what happens when you attempt to compute mAP for OBB detections that don't overlap, and for detections that are empty (created with sv.Detections.empty()).

I looked into the tests, but I am unable to test it. Can you please help me with the testing and tell me whether the changes I already made are fine or need modifications?

@PrakharJain1509 (Contributor, Author)

> Unit tests are welcome, but typically not required. However, since you wrote in the PR description that unit tests are included, I expect the code I'm reviewing to include them 😄
>
> What we really like is a Colab that shows how to use the new feature. It should run the MeanAveragePrecision metric on OBB results. For example, you can use yolo11n-obb, yolo11s-obb, yolo11m-obb and compare the results. You should also test what happens when you attempt to compute mAP for OBB detections that don't overlap, and for detections that are empty (created with sv.Detections.empty()).

Also, I have created a Google Colab in which I have loaded the whole MeanAveragePrecision code, but I am unable to test it. Could you help me with that part?

@LinasKo (Collaborator) commented Oct 15, 2024

Hi @PrakharJain1509,

I won't be able to help with testing, but can give some pointers regarding the Colab. If we can create the Colab that checks how the new functionality works, then unit testing won't be necessary. Please share a link to the Colab.

I am also opening this up to the community for a second contributor.

@PrakharJain1509 (Contributor, Author) commented Oct 15, 2024

> Hi @PrakharJain1509,
>
> I won't be able to help with testing, but can give some pointers regarding the Colab. If we can create the Colab that checks how the new functionality works, then unit testing won't be necessary. Please share a link to the Colab.
>
> I am also opening this up to the community for a second contributor.

Okay, I am done with the testing part and I am ready with the Colab link.
Please have a look at it and tell me if anything else is required.
Also, I made one more commit with some minor changes.

Here is the link:
https://colab.research.google.com/drive/1m8RK27dEd2D6HfHVnzRr0-npTjfVQwjp?usp=sharing

@PrakharJain1509 (Contributor, Author)

@LinasKo Did you look into it, sir?

@PrakharJain1509 (Contributor, Author)

> @LinasKo Did you look into it, sir?

@LinasKo Please look into the Colab file and let me know if any changes are required.

@LinasKo (Collaborator) commented Oct 17, 2024

Hi @PrakharJain1509,

Good work with the Colab. Stepping into a new repository and having to learn how everything is put together is tricky, and I can see you're putting good effort into it. Especially the metric tests at the end - that is definitely going in the right direction.

Here are a few points of improvement:

  • Repo installation can be made simpler: Starter Template.
  • We can add a few more use cases, e.g. showing metric results when predictions/targets are empty.
  • Your latest commit reverts some code we have on the develop branch. We should have merged develop into your branch instead.
  • This PR only adds OBB support to mAP. F1 score should be covered as well (see the hedged sketch at the end of this comment).

With that in mind, while this has more issues than many other PRs, I'm quite happy with the result. I'll accept your contribution, take over from here, and prepare it for merging. Thank you very much! 🙏
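
For reference, a minimal sketch of what the analogous F1 coverage might look like, assuming supervision's F1Score metric accepts the same metric_target parameter and update()/compute() flow as MeanAveragePrecision; the class, parameter, and data-key names are assumptions rather than code from this PR.

```python
# Hedged sketch: assumes supervision.metrics exposes F1Score with the same
# metric_target parameter and update()/compute() flow as MeanAveragePrecision.
import numpy as np
import supervision as sv
from supervision.metrics import F1Score, MetricTarget

corners = np.array([[[0, 0], [10, 0], [10, 10], [0, 10]]], dtype=np.float32)
detections = sv.Detections(
    xyxy=np.array([[0, 0, 10, 10]], dtype=np.float32),
    confidence=np.array([0.9]),
    class_id=np.array([0]),
    data={"xyxyxyxy": corners},  # assumed key for OBB corner coordinates
)

f1 = F1Score(metric_target=MetricTarget.ORIENTED_BOUNDING_BOXES)
result = f1.update(detections, detections).compute()  # identical boxes, so a perfect score is expected
print(result)
```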

@LinasKo added the hacktoberfest-accepted label on Oct 17, 2024
@PrakharJain1509 (Contributor, Author) commented Oct 17, 2024

> • We can add a few more use cases, e.g. showing metric results when predictions/targets are empty.

@LinasKo
The second-to-last block of code, just before the plots, is about empty detections.
Thank you for accepting this PR, and please let me know in the future if anything else is required.

@LinasKo (Collaborator) commented Oct 31, 2024

Merging - everything seems to be in order now.

Thank you for the contribution @PrakharJain1509. Expect to see it included in the next supervision release! 🤝

@LinasKo merged commit 3b4cd42 into roboflow:develop on Oct 31, 2024 (10 checks passed)