3 Reasons Why I Deploy TensorFlow as Docker Containers


This post discusses the three reasons why I decided to deploy my TensorFlow application as a Docker container for production systems.

1. Removes Incompatibility Issues

When deploying the TensorFlow application to production, I would run into incompatibility issues whenever any of the supporting Python libraries were updated.

I would also run into problems when deploying to slightly different production environments, such as a different Linux version.

By using a Docker image, I can freeze the OS and all library versions for a particular version of a TensorFlow application (a Dockerfile sketch follows the list of libraries below).

The libraries I had issues with were as follows:

sudo pip install numpy==1.16.1 # numpy==1.14.3
sudo pip install scipy==1.0.0
sudo pip install scikit-learn==0.19.1
pip install arff==0.9
sudo pip install pandas==0.22.0
pip install rnn
pip install tensorflow==1.8.0
sudo apt-get install python-mysqldb
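
To freeze these versions, the whole environment can be captured in a Dockerfile. The sketch below is only a minimal example of how this might look; the ubuntu:18.04 base image and the app.py entry point are assumptions, not my exact setup.

FROM ubuntu:18.04

# System packages: pip for Python 2.7 and the MySQL bindings
RUN apt-get update && \
    apt-get install -y python-pip python-mysqldb && \
    rm -rf /var/lib/apt/lists/*

# Pin the exact library versions the application was tested against
RUN pip install numpy==1.16.1 scipy==1.0.0 scikit-learn==0.19.1 \
    arff==0.9 pandas==0.22.0 rnn tensorflow==1.8.0

# Copy the application code into the image and set the entry point
COPY . /app
WORKDIR /app
CMD ["python", "app.py"]

Once the image is built, the same environment runs on any host with Docker installed, regardless of which Python libraries are present on the host itself.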

2. Version Control

I can build a new image for each version of the application as it is updated and bug fixes are applied, and then deploy it to production in a quick and clear way. This also allows me to roll back to a previous version if necessary.
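
As a rough sketch, assuming the image is tagged tensorapp with a version number (the tag names here are illustrative), releasing a new version and rolling back might look like this:

docker build -t tensorapp:1.1 .

docker stop tensorApp && docker rm tensorApp
docker run -d --name tensorApp tensorapp:1.1

# roll back to the previous image if the new version misbehaves
docker stop tensorApp && docker rm tensorApp
docker run -d --name tensorApp tensorapp:1.0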

3. Monitor Deployment

Using the built-in Docker commands, I can monitor my containers and see their uptime with a quick and easy command.

docker ps -a
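
The STATUS column of the output shows how long each container has been running (for example "Up 2 hours") or when it exited, which is usually all I need to confirm the application is still alive.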

I can also use Docker commands to stop, start and control the TensorFlow application.

docker start tensorApp

docker stop tensorApp

To reattach:

docker attach tensorApp

To detach:

Ctrl+p Ctrl+q
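
Note that this attach/detach workflow assumes the container was started with an interactive TTY. A hedged example of such a run command (the image tag is again illustrative):

docker run -dit --name tensorApp tensorapp:1.1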

You can read more about how I have used Docker in my other projects.

If you would like to read more about the background to this work, it was conducted as part of my research study into music and its effect on mood.
