This is the code for "Video Quality Assessment with Serial Dependence Modeling" (IEEE TMM, 2021).
The code was created in 2019 during my first attempt at deep learning, so it may look very disordered, and I have no plan to reorganize it :)
The model consists of two parts:
- feature extraction with MATLAB, in the folder `./matlab`;
- recurrent modeling with PyTorch.
Feature extraction is based on our previous work (FAST, TMM 2019), which requires OpenCV; see that folder for more details. Features from the four VQA databases (i.e., LIVE, CSIQ, IVPL, and IVC-IC) can be downloaded from Google Drive or Baidu Cloud (extraction code: i8y8) (~1 GB in total).
Once the features are extracted, the main step is to set the paths in the config files (`xxx.yaml`) to specify where the precomputed feature data is stored and where the dataset information files (`xxx_list_for_VQA.txt`, already included in the folder) are located.
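For illustration, here is a minimal sketch of reading such a config in Python; the filename `LIVE.yaml` and the keys `feature_dir` and `list_file` are hypothetical placeholders, so check the actual `xxx.yaml` files in the repo for the real names:

```python
import yaml  # PyYAML

# Illustrative only: 'LIVE.yaml', 'feature_dir', and 'list_file' are
# hypothetical names; check the actual xxx.yaml files for the real keys.
with open('LIVE.yaml', 'r') as f:
    cfg = yaml.safe_load(f)

feature_dir = cfg['feature_dir']  # where the precomputed feature data is stored
list_file = cfg['list_file']      # e.g., the xxx_list_for_VQA.txt for LIVE
print(feature_dir, list_file)
```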
Example usage can be seen in `demo_loop.py` or `demo_IVC-IC.py`. More details can be figured out by running the code step by step in debug mode. If you are only interested in the implementation of the A-LSTM or the attention module, please check the file `./model/rnn_imp.py`; a rough sketch of the idea follows below.
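For readers who want the gist before opening that file, here is a minimal PyTorch sketch of temporal attention pooled over LSTM hidden states; it is an illustrative stand-in, not the actual A-LSTM/attention code from `./model/rnn_imp.py`, and all layer sizes are made up:

```python
import torch
import torch.nn as nn

class AttnPoolLSTM(nn.Module):
    """Illustrative stand-in: an LSTM whose per-frame outputs are pooled by a
    learned temporal attention and regressed to one quality score. The actual
    A-LSTM/attention implementations live in ./model/rnn_imp.py."""
    def __init__(self, feat_dim, hidden_dim=32):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)   # one attention logit per frame
        self.head = nn.Linear(hidden_dim, 1)   # pooled feature -> quality score

    def forward(self, x):                       # x: (batch, frames, feat_dim)
        h, _ = self.lstm(x)                     # h: (batch, frames, hidden_dim)
        w = torch.softmax(self.attn(h), dim=1)  # attention weights over frames
        pooled = (w * h).sum(dim=1)             # weighted temporal pooling
        return self.head(pooled).squeeze(-1)    # one score per video

scores = AttnPoolLSTM(feat_dim=8)(torch.randn(4, 120, 8))  # toy shape check
```

The toy call at the end maps 4 videos of 120 frames with 8-dim features to 4 scalar scores; it is only a shape check.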
*** In the current work, we only validate the performance on FR-VQA, but we hope this work can be transferred to broader scenarios (for example, UGC-VQA or others). It is easy to keep the second step and substitute the first step with task-specific modeling. ***
The method is implemented with MATLAB R2016a and OpenCV 2.4.13 on Windows for feature extraction, and Python 3.6.5, SciPy 1.1.0, NumPy 1.14.3, and PyTorch 1.1.0 on Ubuntu for sequential modeling. Different versions may cause slight changes in performance.
If you are interested in this work, or find the code helpful, please cite our paper:
```bibtex
@ARTICLE{sdm,
  author={Liu, Yongxu and Wu, Jinjian and Li, Aobo and Li, Leida and Dong, Weisheng and Shi, Guangming and Lin, Weisi},
  journal={IEEE Transactions on Multimedia},
  title={Video Quality Assessment with Serial Dependence Modeling},
  year={2022},
  volume={24},
  pages={3754-3768},
  doi={10.1109/TMM.2021.3107148}
}
```
If you have any questions or find any bugs, feel free to contact me via [email protected].