
Exploring Local Detail Perception for Scene Sketch Semantic Segmentation

Code release for "Exploring Local Detail Perception for Scene Sketch Semantic Segmentation" (IEEE TIP)

Requirements

  • Create a conda environment from the environment.yml file:
conda env create -f environment.yml
  • Activate the environment:
conda activate LDP
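
To confirm the environment was set up correctly, a quick sanity check is to import the deep-learning framework from the activated environment (this assumes environment.yml pins TensorFlow, which the ResNet-101 conversion step below also relies on):

python3 -c "import tensorflow as tf; print(tf.__version__)"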

Preparations

  • Get the code:
git clone https://github.com/drcege/Local-Detail-Perception && cd Local-Detail-Perception
  • Download the datasets from the releases page and place them under the datasets directory, following the instructions provided there.

  • Generate the ImageNet pre-trained ResNet-101 model in TensorFlow format, used to initialize training, and place it under the resnet_pretrained_model directory. It can be obtained by following the instructions in chenxi116/TF-resnet; for convenience, the converted model can also be downloaded from here. The expected placement is sketched below.
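
After these steps, the working tree should roughly look like the following (an illustrative layout only; the exact file names inside datasets and resnet_pretrained_model depend on the release archives and the converted checkpoint):

Local-Detail-Perception/
├── datasets/                  # scene sketch data from the releases page
├── resnet_pretrained_model/   # converted TensorFlow ResNet-101 checkpoint
├── segment_main.py
└── environment.yml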

Training

python3 segment_main.py --mode=train --run_name=LDP 
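
Training a segmentation model can take a while; a generic way to keep the run alive after the terminal closes and capture its output (not specific to this repository) is:

nohup python3 segment_main.py --mode=train --run_name=LDP > train_LDP.log 2>&1 &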

Evaluation

python3 segment_main.py --mode=test --run_name=LDP
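
The --run_name passed at test time presumably selects which trained checkpoints to evaluate, so it should match the name used during training. For example, evaluating a hypothetical run trained under a different name would look like:

python3 segment_main.py --mode=test --run_name=my_experiment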

Credits
