Tutorial: Full-stack JavaScript for the Enterprise. Getting started with Ext JS, Node.js, Express, MongoDB and Docker (Part 8)

Posted in Docker Environment

This is the last part of this tutorial series; it covers GitHub and Docker Hub.

GitHub

Navigate to GitHub to add a new repository:
https://github.com/new

Create two Git repositories:

  • docker-ext-client
  • docker-node-server

Add a .gitignore file to each of the following folders:

  • dockerextnode/client/
  • dockerextnode/server/

It should contain the following ignore rules:
https://gist.github.com/savelee/970c0d72195ed5b9ca7c5ca533d0a4de
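
The gist is the authoritative version; as a sketch, typical ignore rules for a project like this look something like the following (note that the client's build/production folder must stay checked in, as mentioned further below):

# Dependencies
node_modules/

# Logs
*.log
npm-debug.log

# OS files
.DS_Store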

In both folders, run the following commands on the command line:

$ git init
$ git status
$ git add .
$ git commit -m "First commit"
$ git remote add origin https://github.com/<username>/<repository>.git
(for example: git remote add origin https://github.com/savelee/docker-ext-client.git)
$ git push -u origin master --force

[Screenshot: the new GitHub repository]

Docker Hub: Distribution of containers

Now that you’re reading this guide, you might want to try these examples yourself, or maybe you just want to see them working live. With Docker, you can easily run these container images. If you have the Docker Toolbox installed, this should be very easy; you just need access to my containers. Enter Docker Hub! Docker Hub is like GitHub, but for Docker images.

The Docker Hub is a public registry maintained by Docker, Inc. It contains images you can download and use to build containers. It also provides authentication, work group structure, workflow tools like webhooks and build triggers, and privacy tools like private repositories for storing images you don’t want to share publicly.

Let me first show you how you can add your images to Docker Hub; afterwards, I will show you how to check out these images.
First, we are going to add an Automated Build repository on Docker Hub. For that, we first need to push the code to GitHub. If you followed this guide, you have already done this.

[Screenshot: Docker Hub]

Adding images to Docker Hub

You will need working images, which you will have if you completed the previous chapters.

Next, we will link our GitHub account with Docker Hub to add an Automated Build repo. You will need a Docker Hub account: https://hub.docker.com/login/

We will automate the Docker builds by linking GitHub to Docker Hub, so that everything I push to GitHub is automatically built on Docker Hub as well. We can achieve this with webhooks.
Go to: https://hub.docker.com/account/authorized-services/

You can choose to link to GitHub or Bitbucket. See: https://docs.docker.com/docker-hub/github/
I’m using GitHub for this tutorial.

Choose between public & private or limited access. The “Public and Private” option is the easiest to use, as it grants Docker Hub full access to all of your repositories. GitHub also allows you to grant access to repositories belonging to your GitHub organizations. If you choose “Limited Access”, Docker Hub only gets permission to access your public data and public repositories.

I chose public & private; once I am done with that, it forwards me to a GitHub page (I’m logged in on GitHub), which asks me to grant permission so Docker Hub can access the GitHub repositories:

[Screenshot: GitHub authorization page]

Once you click Authorize application, you will see the Docker Hub application in the GitHub overview: https://github.com/settings/applications

Now go back to your Docker Hub dashboard, and click Create > Create Automated Build in the dropdown next to your account name, in the top right:

[Screenshot: the Create Automated Build dropdown]

Select Create Auto-Build GitHub, select your GitHub account, and then select the docker-ext-client repository. Enter a description of max. 100 characters and save. Repeat these steps for docker-node-server.

[Screenshot: selecting the repository for the Automated Build]

Once the Automated Build is configured it will automatically trigger a build and, in a few minutes, you should see your new Automated Build on the Docker Hub Registry (https://hub.docker.com/). It will stay in sync with your GitHub or Bitbucket repository until you deactivate the Automated Build.

Now go to Build Settings. You should see this screen:

[Screenshot: Build Settings]

You can click the Trigger button to trigger a new build.

Automated Builds can also be triggered via a URL on Docker Hub. This allows you to rebuild an Automated Build image on demand. Click the Active Triggers button.
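
The Build Settings page shows the exact trigger URL, including a private token. A sketch of such a call, with a placeholder where your token goes:

$ curl -H "Content-Type: application/json" --data '{"build": true}' -X POST https://registry.hub.docker.com/u/savelee/docker-ext-client/trigger/<TRIGGER_TOKEN>/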

Creating an automated build repo means that every time you make a push to your Github repo, a build will be triggered in Docker Hub to build your new image.

Make sure, when committing the docker-ext-client app to Git, that you check in the production build/production/Client folder, since this folder is used by the Docker image, not the folder with your local Sencha (class) files.

Running images from Docker Hub

Now that we know how we can add Docker images to Docker Hub, let's check out some images.

First download the image from the Docker Hub:

$ docker pull savelee/docker-ext-client

Then run the new Docker image:

--name   give your container a name
-p       map a host port to the container port exposed in the Dockerfile
-d       run the container in detached mode (in the background)

The last argument is the name of the image you want to run.

For example:

$ docker run --name extjsapp -p 80:80 -d savelee/docker-ext-client

And here are the commands for the Node.js server container:

$ docker pull savelee/docker-node-server
$ docker run --name nodeapp -p 9000:9000 -d savelee/docker-node-server
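
Note that the Node.js container reads the MONGO_PORT_27017_TCP_* environment variables from part VII, which Docker creates when you link a MongoDB container. A sketch that starts the official mongo image first and links it in (the container names are my own; remove the nodeapp container from above first, if it is still around):

$ docker run -d --name mongo mongo
$ docker run --name nodeapp --link mongo:mongo -p 9000:9000 -d savelee/docker-node-server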

Conclusion

This last part of the tutorial focused on publishing Docker images to Docker Hub. If you followed all eight parts of this series, you've learned the following:

  • Full-stack JavaScript for the enterprise, with Ext JS 6 on the front-end
  • Node.js on the back-end
  • A NoSQL database with MongoDB and Mongoose
  • About Docker, and how to create containers
  • How to link Docker containers with Docker Compose
  • How to publish Docker images with Github and Docker Hub

The best part of all this is that you can easily swap one technology for another. For example, I could link new Docker images to run Ext JS 6 on a Python/Django environment with MySQL, or an Angular 2 app on Node.js with CouchDB...

I hope you liked it, and that it comes in handy.

Cheers!

Tutorial: Full-stack JavaScript for the Enterprise. Getting started with Ext JS, Node.js, Express, MongoDB and Docker (Part 7)

Posted in Docker Environment

This is part VII of the tutorial; it covers Docker Compose.

Docker Compose: Linking containers

Docker Compose is a tool for defining and running multi-container Docker applications.

Docker is a great tool, but to really take full advantage of its potential it's best if each component of your application runs in its own container. For complex applications with a lot of components, orchestrating all the containers to start up and shut down together (not to mention talk to each other) can quickly become confusing.

The Docker community came up with a popular solution called Fig, which allowed you to use a single YAML file to orchestrate all your Docker containers and configurations. This became so popular that the Docker team eventually decided to make their own version based on the Fig source. They called it: Docker Compose.
In short, it makes dealing with the orchestration processes of Docker containers (such as starting up, shutting down, and setting up intra-container linking and volumes) really easy.

So, with Docker Compose you can spin up various Docker containers and link them to each other.
That’s great, because if you ever decide to get rid of the Node.js back-end and use something else instead, say Python with Django, you would just link to another image.

(For example, here's the same API back-end service, but built in Python with Django/Django REST Framework:
https://github.com/savelee/docker-django-server)

You will use a Compose file (docker-compose.yml) to configure your application’s services. Then, using a single command, you create and start all the services from your configuration.
For more information, see: https://docs.docker.com/compose/overview/


Remember how we wrote URLs to the Node.js back-end in our Sencha client app? We hardcoded them to the localhost URL. This won’t work anymore: when the container is running, it doesn’t know localhost, only its own IP address.

Let’s figure out what the Docker machine IP address is. While you are still in the Docker terminal, enter the following command:

$ docker-machine ip

We will now need to change the Sencha URLs. You could hardcode them to the Docker machine IP, or you could let JavaScript detect the hostname you are currently using. (Remember, our Node server is on the same host as our Sencha app; it just has a different port.)

The live URL in client/util/Constants.js needs to be changed to:

'LIVE_URL': window.location.protocol + "//" + window.location.host + ':9000',

You will need to build the Sencha app before moving on with Docker. We will copy the Sencha build directory over to our container, and this build needs to be finalized, concatenated, and minified for better performance when serving the page.
(Manually copying builds over to folders can be automated too, by the way. Take a look at one of my previous posts: https://www.leeboonstra.com/developer/how-to-modify-sencha-builds/)

Navigate to the dockerextnode/client folder:

$ sencha app build classic
$ sencha app build modern

We’re going to run our MongoDB database and our Node.js back-end in separate containers as well. We can use official images for this. Node.js has an official Docker image: https://hub.docker.com/_/node/
And MongoDB also has its own Docker image: https://hub.docker.com/_/mongo/

We will need to configure the Node.js image, because we need to copy over our own back-end JavaScript code. Therefore, create one extra Dockerfile in the server folder.
The contents will look like this:

server/Dockerfile:
https://github.com/savelee/docker-node-server/blob/master/Dockerfile
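
The linked repository contains the real file. As a sketch, a minimal Node.js Dockerfile typically looks like the following, assuming the server starts with npm start and listens on port 9000:

FROM node:argon

# Create the app directory inside the image
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app

# Install the dependencies first, so Docker can cache this layer
COPY package.json /usr/src/app/
RUN npm install

# Copy the back-end source code into the image
COPY . /usr/src/app

# The Express server listens on port 9000
EXPOSE 9000
CMD ["npm", "start"]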

Once we are done with that, we can create our Docker composition (docker-compose.yml) in the root of our dockerextnode folder:
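
The composition file itself isn't reproduced in this post. A sketch of what a docker-compose.yml for this setup could look like, using the links syntax that produces the MONGO_PORT_27017_TCP_* variables we use further below (the service names are my own):

client:
  build: ./client
  ports:
    - "80:80"

server:
  build: ./server
  ports:
    - "9000:9000"
  links:
    - mongo

mongo:
  image: mongo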

Build with:

$ docker-compose up --build

After building the composition, you can quickly boot up all the containers at once with:

$ docker-compose up

Note:
To build and run the server image on its own, use these commands:

$ docker build -t nodeserver .
$ docker run -d --name dockerextnodeserver -p 9000:9000 nodeserver

You can test it in your browser by entering the IP address plus /users:
http://192.168.99.100:9000/users
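
Or test it from the command line, using the IP address that docker-machine ip returned:

$ curl http://192.168.99.100:9000/users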

Now you can visit the application in your browser. You will need to figure out what the IP address is. Remember:

$ docker-machine ip

For me it returns this IP address: http://192.168.99.100/

You will need to create the first login credentials. Open Postman or use cURL:

$ curl -H "Content-Type: application/json" -X POST  -d '{ "username": "myusername", "password": "mypassword"  }' http://192.168.99.100:9000/register

For Postman:
- Choose the method: POST
- Use the URL: http://192.168.99.100:9000/register
- Select the Body tab
- Create two x-www-form-urlencoded fields, username and password, and specify the values that belong to these fields.

Now you can test your application!

Whoops, there’s a problem with this code. The Node.js server can’t connect to my MongoDB!
This is because it’s trying to connect to the Mongo database on localhost, but our Mongo database isn’t on the local machine. You could of course hardcode the container IP in your Node.js script, or you can use the environment variables which are automatically added by Docker when it links the containers:

In server/libs/users/index.js, change the mongoose.connect line to:

mongoose.connect('mongodb://'+settings.mongoAddress+':'+settings.mongoPort+'/'+settings.dbName);

Open server/config/local_settings.js and change it to the below code, so it contains the environment variables:

module.exports = {
  "secret": "mysecret",
  // Docker sets these variables when it links the mongo container;
  // fall back to localhost for local development.
  "mongoAddress": process.env.MONGO_PORT_27017_TCP_ADDR || 'localhost',
  "mongoPort": process.env.MONGO_PORT_27017_TCP_PORT || 27017,
  "dbName": 'dockerextnode'
}
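
If you want to verify that Docker injected these variables, you can inspect the environment of the running server container (the container name below is an assumption; check docker ps for yours):

$ docker exec dockerextnode_server_1 env | grep MONGO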

[Screenshot: docker-compose output]

That's awesome: you've now learned how to set up multiple Docker containers and link them together.
In our next tutorial, we will look into the distribution of containers.

READ THE NEXT PART

https://www.leeboonstra.com/developer/tutorial-full-stack-javascript-for-the-enterprise-getting-started-with-ext-js-node-js-express-mongodb-and-docker-8/