https://github.com/ArturoDeza/NeuroFovea_PyTorch
This repository contains the code to reproduce the metamers used in the paper (Deza, Jonnalagadda, Eckstein, ICLR 2019). Link to the paper and discussion on OpenReview: https://openreview.net/forum?id=BJzbG20cFQ
This code has been tested successfully on CUDA version 8.0 (Ubuntu 14.04 and 16.04) and CUDA version 10.0 (Ubuntu 18.04).
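If you are unsure which CUDA toolkit is installed on your machine, you can check from the terminal (assuming the toolkit's bin directory is on your PATH):

```bash
# Print the version of the installed CUDA toolkit:
$ nvcc --version
```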
The code to implement our model is mainly driven by:
- Adaptive Instance Normalization code: https://github.com/xunhuang1995/AdaIN-style
- pix2pix super-resolution module: https://github.com/phillipi/pix2pix
- The original Metamer code of Freeman & Simoncelli: https://github.com/freeman-lab/metamers
- A set of localized pooling regions, stored in the Receptive_Fields/ folder, for each rate of growth of the receptive fields. The rate of growth is specified by the scaling factor, which should match the human psychophysical testing procedure described in the paper.
Metamers are stimuli that are physically different yet perceptually indistinguishable from each other. See below for an example.
(Figure: an input image and its metamer, shown side by side.)
When maintaining center fixation on the orange dot, the two images that are flipped back and forth should be perceptually indistinguishable from each other even though they are physically different (strong differences in the periphery vs. the fovea).
As in our previous demo, the metameric effect will only work properly if one fixates on the orange dot at the center of the image. In the paper we provide more details on how we psychophysically tested this phenomenon using an eye-tracker to control for center fixation, viewing distance, display time, and the visual angle of the stimuli. We tested our model on grayscale images, and have extended it to color images in this code release.
The code was developed with CUDA 8.0 on Ubuntu 16.04 and has been tested with both CUDA 8.0 and CUDA 10.1 on Ubuntu 18.04 (there may be minor differences between CUDA 10.1 and 8.0). You will need to install the dependencies below, all from the same terminal session.
Install OpenBLAS:

```bash
git clone https://github.com/xianyi/OpenBLAS.git
cd OpenBLAS
make NO_AFFINITY=1 USE_OPENMP=1
sudo make install
```
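If the build and install succeed, OpenBLAS lands under /opt/OpenBLAS by default, which is the prefix the next step assumes. A quick sanity check:

```bash
# The headers and the compiled library should be listed here:
$ ls /opt/OpenBLAS/include /opt/OpenBLAS/lib
```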
Export a CMAKE_LIBRARY_PATH that includes OpenBLAS:

```bash
export CMAKE_LIBRARY_PATH=/opt/OpenBLAS/include:/opt/OpenBLAS/lib:$CMAKE_LIBRARY_PATH
```
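Note that this only sets the variable for the current shell. To make it persist across terminals, you can append it to your startup file (a sketch, assuming a bash shell):

```bash
# Re-exported on every new shell via ~/.bashrc:
$ echo 'export CMAKE_LIBRARY_PATH=/opt/OpenBLAS/include:/opt/OpenBLAS/lib:$CMAKE_LIBRARY_PATH' >> ~/.bashrc
```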
Install Torch (old school, with Lua):

```bash
git clone https://github.com/nagadomi/distro.git ~/torch --recursive
cd ~/torch
./install-deps
./clean.sh
./update.sh
. ~/torch/install/bin/torch-activate
```

The last line activates the Torch installation. You will need to re-run it whenever you open a new terminal, or you can append it to your shell startup file.
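A quick way to verify that the activation worked is to run a one-liner through the th interpreter:

```bash
# Should print a 2x2 random tensor if Torch is correctly installed:
$ th -e "print(torch.rand(2,2))"
```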
Install the unsup package:

```bash
luarocks install unsup
```
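You can confirm that the package is visible to Torch with a minimal load test:

```bash
# Exits silently if the unsup package loads correctly:
$ th -e "require 'unsup'"
```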
The full dataset is also available for future work on both grayscale and color metamers; it can be found in the Datasets/ folder.
To complete the installation, please run:

```bash
$ bash download_models_and_stimuli.sh
```
Generate a V1 metamer for the 512x512 image 10.png with a center fixation, specified by the rate of growth of the receptive field s = 0.25. Note: the approximate rendering time for a metamer should be around a second.

```bash
$ th NeuroFovea.lua -image Dataset/1_color.png -scale 0.25 -refinement 1 -color 1
```
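To render metamers for several images at once, a simple shell loop over the dataset folder works (a sketch; it assumes your input images live in Dataset/ and uses the same flags as above):

```bash
# Generate a V1 metamer (s = 0.25) for every PNG in the Dataset/ folder:
for img in Dataset/*.png; do
  th NeuroFovea.lua -image "$img" -scale 0.25 -refinement 1 -color 1
done
```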
To create a V2 metamer, change the scale from 0.25 to 0.5. Scale is computed as receptive field size divided by the retinal eccentricity of that receptive field, and the values are only meaningful given the size of the stimuli (26 x 26 degrees of visual angle, rendered at 512 x 512 pixels). To compute the reference image, set the reference flag to 1.
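As a worked example of the scale computation: with s = 0.25, a pooling region centered at 10 degrees of eccentricity spans roughly 0.25 × 10 = 2.5 degrees of visual angle. Below is a sketch of a V2 metamer command with the reference output enabled (we assume here the flag is spelled -reference; check the options in NeuroFovea.lua for the exact name):

```bash
# V2 metamer (s = 0.5); "-reference 1" is our assumed spelling of the reference flag:
$ th NeuroFovea.lua -image Dataset/1_color.png -scale 0.5 -refinement 1 -color 1 -reference 1
```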
Please read our paper to learn more about visual metamerism: https://openreview.net/forum?id=BJzbG20cFQ
We hope this code and our paper help researchers, scientists, and engineers improve the use and design of metamer models, which have potentially exciting applications in both computer vision and visual neuroscience.
This code is free to use for research purposes; if you use or modify it in any way, please consider citing:
```
@inproceedings{
deza2018towards,
title={Towards Metamerism via Foveated Style Transfer},
author={Arturo Deza and Aditya Jonnalagadda and Miguel P. Eckstein},
booktitle={International Conference on Learning Representations},
year={2019},
url={https://openreview.net/forum?id=BJzbG20cFQ},
}
```
Other inquiries: [email protected]