Repository for the voluntary Challenge Exercise in the Master's course "Operating Systems".
Presentation Video: https://youtu.be/YA1CG8ZNVCY
- Rework Docker import
- TensorFlow GPU support
Ubuntu 16.04 LTS (64-bit) was used as the host environment for the Docker container. To install Docker, execute the following on the Ubuntu host machine:
sudo apt-get update
sudo apt-get install linux-image-extra-$(uname -r) linux-image-extra-virtual
sudo apt-get install apt-transport-https ca-certificates curl software-properties-common
Add the Docker PGP key to the host machine:
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
Finally, install Docker:
sudo apt-get update
sudo apt-get install docker-ce
To verify the installation run:
sudo docker run hello-world
Source: https://docs.docker.com/engine/installation/linux/docker-ce/ubuntu/
The Docker container image was exported using:
sudo docker export [container name] | gzip > car_detector.tar.gz
It can be downloaded from https://1drv.ms/u/s!Aoi3Wc_cMCrSsfpj5Ldlzqb5U39-tg and imported/reused with Docker using:
sudo docker import car_detector.tar.gz
More information: https://docs.docker.com/engine/reference/commandline/import/
Execute the following command on the host system to start a Docker container with CPU support only and connect to it via bash:
docker run -it -p hostPort:containerPort gcr.io/tensorflow/tensorflow:latest-devel bash
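For example, assuming the Jupyter notebook server used later should be reachable from the host on its default port 8888, the port mapping could look like this:
docker run -it -p 8888:8888 gcr.io/tensorflow/tensorflow:latest-devel bash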
An existing container can be resumed and accessed with:
sudo docker start [container name]
sudo docker attach [container name]
The container name can be found using:
sudo docker ps -a
Inside the container, install the dependencies for the TensorFlow Object Detection API:
sudo apt-get install protobuf-compiler python-pil python-lxml
sudo pip install jupyter
sudo pip install pillow
sudo pip install lxml
sudo pip install matplotlib
Clone the TensorFlow models repository:
git clone https://github.com/tensorflow/models.git
The target folder does not matter, but the Dockerfile expects the models repository at /tensorflow/tensorflow/models.
Check the installed protoc version with:
protoc --version
If it is not the latest version, download the Python release from https://github.com/google/protobuf/releases and build it with:
./configure
make
make check
sudo make install
sudo ldconfig
Execute in the tensorflow/models/research directory:
protoc object_detection/protos/*.proto --python_out=.
Run this command in every new terminal or add it to your ~/.bashrc:
export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim
Test the installation by running:
python object_detection/builders/model_builder_test.py
Source: https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/installation.md
To edit and run the Jupyter notebook, navigate in the container to the /tensorflow/models/research/object_detection/ folder and start the notebook server with:
jupyter notebook --allow-root
Or just run the Python script in the same folder with:
python car_detector.py
Once started, the script collects image data from the web address http://tpark-cam.cs.aalto.fi/, which is used for demonstration purposes. The script can also be altered to use locally saved images, and a further improvement to the image acquisition would be to push image data to the TensorFlow program instead of downloading it.
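A minimal sketch of the acquisition step, assuming the camera address serves a JPEG snapshot and that the requests and Pillow packages are available (fetch_image is a hypothetical helper name):

import requests
import numpy as np
from PIL import Image
from io import BytesIO

def fetch_image(url="http://tpark-cam.cs.aalto.fi/"):
    # Download one frame and return it as a numpy array for TensorFlow
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return np.array(Image.open(BytesIO(response.content)).convert("RGB"))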
TensorFlow then uses the Object Detection API to find objects in the received image. Due to the poor quality of the demo image stream, the detection threshold was set to 0.3, which means that objects that are 30% or more likely to be a car will be counted.
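A minimal sketch of this filtering step, assuming the usual Object Detection API output arrays (boxes, scores, classes) and a COCO label map in which "car" has class id 3:

def filter_cars(boxes, scores, classes, threshold=0.3, car_class_id=3):
    # Keep only detections that are classified as cars with sufficient confidence
    return [(box, score)
            for box, score, cls in zip(boxes, scores, classes)
            if cls == car_class_id and score >= threshold]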
The metadata (the number of cars, an array with their positions in the image, and the probability for each detection to actually be a car) is then sent to the data handler.
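A minimal sketch of that hand-off, assuming the requests package; the endpoint path and field names here are assumptions, the actual interface is defined in the data handler repository linked below:

import requests

def send_metadata(cars, handler_url="https://g6-os.herokuapp.com/data"):  # /data path is an assumption
    metadata = {
        "count": len(cars),                                    # number of detected cars
        "positions": [list(box) for box, _ in cars],           # bounding box per car
        "probabilities": [float(score) for _, score in cars],  # confidence per car
    }
    requests.post(handler_url, json=metadata, timeout=10)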
The data handler saves the data in a database and provides an interface for an application or a user to retrieve the data via HTTP.
The data handler frontend can be found at: https://g6-os.herokuapp.com/
The repository with the implementation of the data handler can be found here: https://github.com/cell2749/ImageProcessingDataHandler
The data handler consists of one or more dyno containers (https://devcenter.heroku.com/articles/dynos) running in a Heroku (https://www.heroku.com/) web application. This allows the data handling to be scaled to whatever needs the application has. The metadata coming from the TensorFlow Docker containers is sent to the handler using HTTP POST requests. The metadata is saved in a MongoDB database and can be queried via HTTP through a Node.js interface.
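A minimal sketch of querying the stored metadata, again assuming a hypothetical /data endpoint that returns JSON:

import requests

response = requests.get("https://g6-os.herokuapp.com/data", timeout=10)  # /data path is an assumption
print(response.json())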