Automatic website deployment with Gitlab using any hosting

Automatic website deployment, how hard can it be? In this tutorial I’ll describe in detail how to publish your site with Gitlab CI/CD.

Eventually, all websites (special cases aside) need to be published online. I used to use tools like FileZilla, where I'd FTP into my server and transfer all the files while hoping I didn't forget one.

To me this was a pain, and more often than not I had issues once my website was published, because I had either overwritten something I shouldn't have or forgotten something along the way.

This is where automatic website deployment with CI/CD comes in. From the day I learned how to deploy sites automatically, I never touched an FTP program again.

Déjà vu?

You know that feeling when you upload a fully working local environment and the moment you test it online it doesn’t work? Chances are you’ve forgotten about a few important files. This is where we can integrate CI/CD automatic website deployment with Gitlab CI.

To fully understand the use of Git, I wrote an easy-to-understand article about it right here; it will really help you get started.

This is where Gitlab CI/CD comes in: each push to Gitlab triggers a deployment job, so all changed files are always in sync with your production website.

How does Gitlab CI/CD work?

When you push any code to your repository, Gitlab will check whether you have a .gitlab-ci.yml file in your root directory.

This file will define how Gitlab CI should interact with your project.

Gitlab CI is based on Docker, which means you can make use of the thousands of Docker images hosted on Docker Hub.

In the file you define which Docker image should be used, and build your commands from there.
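For instance, a minimal .gitlab-ci.yml could look like this (the job name say_hello is just an illustration, not from my setup):

```yaml
# Minimal sketch: one Docker image, one job, one command
image: alpine:latest

say_hello:
    script:
        - echo "Hello from Gitlab CI"
```

Each push would spin up an alpine container and run the echo command inside it.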

Deployment demo time!

First of all, here’s an example of how I build my very basic Gitlab CI file.

image: php:7.2-apache

stages:
    - tests
    - staging
    - production

before_script:
    - ...

stylecheck:
    stage: tests
    script:
        - ...

to_staging:
    stage: staging
    script:
        - git push ssh://<username>@<server-ip>:<port>/~/<path-to-online-git-repo> HEAD:refs/heads/master

to_production:
    stage: production
    script:
        - ...

The Docker image chosen here is php:7.2-apache, defined by the image keyword.

You can see the stages as steps: if the first step fails, it won't go on to the next one, which is good, because if a test fails you don't want to actually deploy anything.

The before_script at the root level is executed before every job, in case you need to run composer install or npm install for every job.
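As a sketch, a root-level before_script that installs PHP dependencies before each job might look like this (the flags are my choice, not from the original file):

```yaml
before_script:
    - composer install --no-interaction --no-progress
```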

The jobs here are stylecheck, to_staging and to_production, each with their own stage defined. Of course, you can have multiple jobs per stage.

The automatic deployment feature

The deployment of different apps can vary quite a bit, but one thing is always certain: you'll need to get your code onto your server somehow. This can be done using FTP, SFTP or SSH.

Prepare your server

Preparing your server is half the work. Most servers and even shared hosting services have Git pre-installed (if not, contact your hosting provider).

To transfer the files from Gitlab to the server easily, I use a bare repository that waits for a push and then transfers the new code to the right place.

cd ~
mkdir -p repos/<my-site>.git && cd repos/<my-site>.git
git init --bare
nano hooks/post-receive

You've now created a bare repository in the ~/repos/<my-site>.git directory. Now paste the following code into nano:

#!/bin/sh
echo "Pushing code to working directory.."
GIT_WORK_TREE=<absolute-path-to-directory> git checkout -f
cd <absolute-path-to-directory>
composer install # only needed when using Symfony
php bin/console doctrine:schema:update --force # only needed when using Symfony

This file acts as a hook when pushing to your server. It will take the code sent from Gitlab and copy it to the defined directory.

In case you need commands like composer install on your server, you can just add them as shown above.

To finish up, give the hook execute rights: chmod +x hooks/post-receive

Setup Gitlab

You could already use the post-receive hook by pushing directly from your local repository, but this defeats the purpose of using Gitlab CI, since you're not running any tests before deploying.
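For example, you could test the hook once with a direct push from your machine (<username>, <server-address> and <my-site> are the same placeholders as before; the remote name live is my own choice):

```shell
# One-off manual push straight to the bare repository on the server,
# bypassing Gitlab CI entirely (only useful for testing the hook itself)
git remote add live <username>@<server-address>:~/repos/<my-site>.git
git push live HEAD:refs/heads/master
```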

Depending on whether you're using FTP or SSH, you'll need either an FTP password or an SSH private key to continue.

Since we don't want to put this sensitive information in cleartext in our .gitlab-ci.yml file, I add it to the Gitlab CI secret variables.

The secret variables can be found under your project: Settings -> CI/CD -> Secret variables.

Add the variable SSH_PRIVATE_KEY and paste in your private SSH key (cat ~/.ssh/id_rsa). You first need to generate an SSH key with ssh-keygen.
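If you don't have a key yet, you can generate a dedicated deploy key. The file name deploy_key and the empty passphrase below are my assumptions (Gitlab CI can't type in a passphrase):

```shell
# Generate a passphrase-less RSA key pair just for deployments
ssh-keygen -t rsa -b 4096 -N "" -f deploy_key
# deploy_key     -> goes into the SSH_PRIVATE_KEY secret variable
# deploy_key.pub -> append to ~/.ssh/authorized_keys on the server
```

Don't forget the second half: the public key must end up in ~/.ssh/authorized_keys on your server, or the push will be rejected.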

Set up the .gitlab-ci.yml file

Now that your server and Gitlab are ready, it's time to set up the .gitlab-ci.yml for deployment!

Remember, we just need to push our code to the hook made on our server, that’s it!

to_staging:
    stage: staging
    before_script:
        - "which ssh-agent || ( apt-get update -y && apt-get install openssh-client -y )"
        - eval $(ssh-agent -s)
        - mkdir -p ~/.ssh && chmod 700 ~/.ssh && rm -f ~/.ssh/id_rsa && touch ~/.ssh/id_rsa
        - ssh-keyscan -t rsa <server-address> >> ~/.ssh/known_hosts
        - echo "$SSH_PRIVATE_KEY" >> ~/.ssh/id_rsa && chmod 400 ~/.ssh/id_rsa
    script:
        - git push <username>@<server-address>:~/repos/<my-site>.git HEAD:refs/heads/master

That’s it! Let’s explain a little.

  1. You push your code from your local repository to Gitlab
  2. Gitlab sees your .gitlab-ci.yml file and will check for syntax errors.
  3. It will loop through the stages and their jobs, so in this example the test jobs are run first.
  4. Once in the staging stage, it will validate the SSH connection between Gitlab CI and the server using the $SSH_PRIVATE_KEY defined in the secret variables.
  5. Now that we can connect to the server, execute a push command to it using the current branch.
  6. The server will receive an incoming push on its hook and will copy all of the code to the correct directory.
  7. It may or may not execute composer install, depending on what you've set up.

Pro tip: I suggest you use your own Docker image, with composer, npm and all other dependencies pre-installed. This decreases the time needed to execute the commands.
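As a sketch, switching to a prebuilt image only means changing the image line; the registry path below is a placeholder for wherever you host your image:

```yaml
# Use a custom image with composer/npm already baked in
image: registry.gitlab.com/<username>/ci-base:latest
```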

If you have any questions, as always, feel free to comment below!
