Commit

update
Abonia1 committed May 8, 2024
1 parent 355c932 commit b72ff3f
Showing 3 changed files with 17 additions and 10 deletions.
23 changes: 15 additions & 8 deletions README.md
@@ -1,7 +1,7 @@
-# YOLO Prediction to Labelme and Anylabeling Json
+# YOLO Predictions to Labelme and Anylabeling-Compatible JSON

<p align="center">
-<img alt="yolosegment2labelme" style="width: 128px; max-width: 100%; height: auto;" src="images/labelme_test/logo.png"/>
+<img alt="yolosegment2labelme" style="width: 128px; max-width: 100%; height: auto;" src="https://raw.githubusercontent.com/Abonia1/yolosegment2labelme/main/images/labelme_test/logo.png"/>
<h1 align="center">🌟 yolosegment2labelme 🌟</h1>
<p align="center">Convert your YOLO model prediction results to JSON to view and edit in Labelme and Anylabeling. <b>YOLO results to JSON with a single-line command</b>!</p>
<p align="center"><b>yolosegment2labelme = Easy Conversion + Prediction to JSON + Auto-labeling</b></p>
@@ -12,12 +12,16 @@
[![PyPI](https://img.shields.io/pypi/v/yolosegment2labelme)](https://pypi.org/project/yolosegment2labelme/)
[![license](https://img.shields.io/github/license/abonia1/yolosegment2labelme.svg)](https://github.com/Abonia1/yolosegment2labelme/blob/main/LICENSE)
[![open issues](https://isitmaintained.com/badge/open/abonia1/yolosegment2labelme.svg)](https://github.com/abonia1/yolosegment2labelme/issues)
[![Article](https://img.shields.io/badge/Read-Documentation-green)](https://abonia1.github.io/)
[![Website](https://img.shields.io/website?url=https%3A//abonia1.github.io&style=flat&logo=github&logoColor=white)](https://abonia1.github.io/)
[![Follow](https://img.shields.io/badge/+Follow-abonia-blue)](https://www.linkedin.com/in/aboniasojasingarayar/)
[![YouTube](https://img.shields.io/badge/-YouTube-red?style=flat-square&logo=youtube)](https://www.youtube.com/channel/UCGphGM_oeR4r9dqVs71Jc5w)
[![Medium](https://img.shields.io/badge/-Medium-black?style=flat-square&logo=medium)](https://medium.com/@abonia)
[![GitHub](https://img.shields.io/badge/-GitHub-black?style=flat-square&logo=github)](https://github.com/Abonia1)


> ⭐ Follow [AboniaSojasingarayar](https://www.linkedin.com/in/aboniasojasingarayar) for project updates.
-**yolosegment2labelme** is a Python package that allows you to convert YOLO segmentation prediction results to LabelMe JSON format. This tool facilitates the annotation process by generating JSON files that are compatible with LabelMe and other labeling annotation tools.
+**yolosegment2labelme** is a Python package that allows you to convert YOLO segmentation prediction results to LabelMe JSON format. This tool facilitates the annotation process by generating JSON files that are compatible with [Labelme](https://github.com/labelmeai/labelme) and [Anylabeling](https://github.com/vietanhdev/anylabeling) annotation tools.
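For reference, a LabelMe-style annotation record has roughly the following shape. The field names follow the LabelMe JSON format; the label, points, and image values here are purely illustrative:

```json
{
  "version": "5.2.1",
  "flags": {},
  "shapes": [
    {
      "label": "person",
      "points": [[10.0, 10.0], [100.0, 10.0], [55.0, 90.0]],
      "group_id": null,
      "shape_type": "polygon",
      "flags": {}
    }
  ],
  "imagePath": "sample1.png",
  "imageData": null,
  "imageHeight": 480,
  "imageWidth": 640
}
```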

## Features

@@ -53,18 +57,18 @@ yolosegment2labelme --model yolov8n-seg.pt --images /path/to/images --conf 0.3
This command processes the images in the specified directory (`/path/to/images`) using the YOLO model weights file `yolov8n-seg.pt` (you can substitute the path to your own custom YOLO model) and generates LabelMe JSON files with the confidence threshold of your choice (here, 0.3).
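At its core, a conversion like this wraps each predicted polygon in a LabelMe-style shape record and attaches the image metadata. The sketch below is illustrative rather than the package's actual implementation; the function name and the `detections` input structure are assumptions made for the example:

```python
import json


def polygons_to_labelme(image_path, width, height, detections):
    """Build a LabelMe-style JSON dict from predicted polygons.

    `detections` is a list of (class_name, polygon) pairs, where each
    polygon is a list of [x, y] points in pixel coordinates (e.g. the
    mask contours returned by a YOLO segmentation model).
    """
    shapes = [
        {
            "label": name,
            "points": [[float(x), float(y)] for x, y in polygon],
            "group_id": None,
            "shape_type": "polygon",
            "flags": {},
        }
        for name, polygon in detections
    ]
    return {
        "version": "5.2.1",
        "flags": {},
        "shapes": shapes,
        "imagePath": image_path,
        "imageData": None,
        "imageHeight": height,
        "imageWidth": width,
    }


# Example: one triangular mask labelled "person"
record = polygons_to_labelme(
    "sample1.png", 640, 480,
    [("person", [[10, 10], [100, 10], [55, 90]])],
)
print(json.dumps(record, indent=2))
```

Writing one such record per image as `<image>.json` next to the image is all Labelme and Anylabeling need to open the predictions for review and editing.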


-## Sample Images
-The table below displays sample images along with their corresponding annotations generated using yolosegment2labelme:
+## Sample Output in Anylabeling Annotation Tool
+
+Below are examples of image annotations created using yolosegment2labelme and viewed in the [Anylabeling](https://github.com/vietanhdev/anylabeling) annotation tool:

| Sample Image 1 | Sample Image 2 |
|-----------------------------------------------------|-----------------------------------------------------|
-| ![Sample Image 1](images/labelme_test/sample1.png) | ![Sample Image 2](images/labelme_test/sample2.png) |
+| ![Sample Image 1](https://raw.githubusercontent.com/Abonia1/yolosegment2labelme/main/images/labelme_test/sample1.png) | ![Sample Image 2](https://raw.githubusercontent.com/Abonia1/yolosegment2labelme/main/images/labelme_test/sample2.png) |
| Sample Annotation for Image 1 | Sample Annotation for Image 2 |

| Sample Image 3 | Sample Image 4 |
|-----------------------------------------------------|-----------------------------------------------------|
-| ![Sample Image 3](images/labelme_test/sample3.png) | ![Sample Image 4](images/labelme_test/sample4.png) |
+| ![Sample Image 3](https://raw.githubusercontent.com/Abonia1/yolosegment2labelme/main/images/labelme_test/sample3.png) | ![Sample Image 4](https://raw.githubusercontent.com/Abonia1/yolosegment2labelme/main/images/labelme_test/sample4.png) |
| Sample Annotation for Image 3 | Sample Annotation for Image 4 |


@@ -74,6 +78,9 @@ The documentation for **yolosegment2labelme** can be found on GitHub: [yolosegme

## Contributing

#### If you like this work, star this repo ⭐ and contribute 💁

---
Contributions are welcome! If you'd like to contribute to **yolosegment2labelme**, please check out the [Contribution Guidelines](CONTRIBUTING.md).

## License
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "yolosegment2labelme"
-version = "0.0.2"
+version = "0.0.3"
authors = [
{ name="Abonia Sojasingarayar", email="[email protected]" },
]
2 changes: 1 addition & 1 deletion setup.py
@@ -7,7 +7,7 @@
with codecs.open(os.path.join(here, "README.md"), encoding="utf-8") as fh:
long_description = "\n" + fh.read()

-VERSION = '0.0.2'
+VERSION = '0.0.3'
DESCRIPTION = 'Yolo segmentation prediction to labelme json'
LONG_DESCRIPTION = 'A package that allows generating JSON from YOLO prediction results, compatible with LabelMe/any labeling annotation tool'

