Posts tagged #ssh

How to forward your local ports to a remote server using SSH

SSH is a great tool used by Linux/Unix sysadmins all over the world. One neat little thing it allows you to do is connect to ports on a remote computer through SSH. This process is also called tunneling, or creating an SSH tunnel. All TCP traffic that flows over this connection is encrypted by SSH. So, you get security without opening up ports on your remote computers to the public.

Let us take a simple example: accessing a postgresql database which is present on a remote server (with one of the server providers like AWS or DigitalOcean). You don’t want to open port 5432 (the port on which postgresql runs) to the internet, as this would allow everyone to try bruteforce attacks on your database. So, you deny public traffic on 5432 and instead access the port from your local computer through an SSH tunnel.

The ~/.ssh/config syntax for a tunnel is simple:

Host myserver
  Hostname <hostname or IP of the server>
  Port 22
  User goodcode
  # forward our local port 4000 to localhost:5432 on the remote server, which is the postgresql server
  LocalForward 4000 localhost:5432
  # forward our local port 5000 to localhost:6379 on the remote server, which is the redis server
  LocalForward 5000 localhost:6379

Let us break it down:

  1. Host myserver: creates an ssh configuration named myserver, which you can connect to using ssh myserver
  2. Hostname: tells ssh which host (or IP address) to connect to when you run ssh myserver
  3. Port 22: you can drop this if your server uses the default port 22. However, if you are running ssh on a different port on the remote server, you can change this.
  4. User goodcode: tells ssh to use the username goodcode when connecting using ssh myserver
  5. LocalForward 4000 localhost:5432: this is what creates the actual tunnel. Here we are forwarding our local port 4000 to port 5432 on the remote server.
  6. LocalForward 5000 localhost:6379: just to show that you can create multiple tunnels over the same SSH connection, I have also forwarded local port 5000 to the redis instance on the remote server.

Once we have this set up and open an ssh connection using ssh myserver, we can connect to postgresql and redis using the following commands:

# connect to postgresql on the remote server
psql --host localhost --port 4000 database_name
# connect to the redis instance on the remote server
redis-cli -h localhost -p 5000

You can also pass all these options via the command line without creating an ssh config.

ssh -L 4000:localhost:5432 -L 5000:localhost:6379 -p 22 goodcode@
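You can also ask ssh to print the client configuration it would use, without actually connecting, which is handy for double-checking a tunnel setup. A small sketch (example.com stands in for your server's address; the forwardings are the ones from above):

```shell
# -G prints the resolved client configuration for the host without connecting,
# so you can confirm the port forwardings before opening the tunnel
ssh -G -L 4000:localhost:5432 -L 5000:localhost:6379 goodcode@example.com | grep -i localforward

# To hold the tunnels open in the background without a remote shell, you could run:
#   -N: do not run a remote command, only forward ports
#   -f: drop into the background after authenticating
# ssh -N -f myserver
```

A tool like autossh can wrap the background invocation to restart the tunnel if the connection drops.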

How to migrate your web application to a different server with minimum downtime

I had to move one of my large web applications to a different server yesterday. That too across providers (from DigitalOcean to an AWS EC2 instance). Here are the steps I took; hopefully they help others in the future:

  1. Install all the libraries needed for the app. Basically, follow the steps you would for a fresh install. For me this required installing ruby, postgresql, nginx, and letsencrypt.

  2. Get the app running with some fake data. This step may require you to copy over the ssl certs from your previous server.

  3. Create an entry in your /etc/hosts (on your local computer) that points your app’s domain at the new web server, e.g. a line like <new-server-IP> yourdomain.com

  4. Open your app and test it out. At this point I found that I had forgotten to move over the .env file, which held the secrets and keys needed for the web application. So, I moved it over and got the application working.

  5. Add your new server’s public key to the old server’s ~/.ssh/authorized_keys. This allows us to pull data directly from the old server to the new one.

  6. Import your database over ssh from the old server. My app uses a postgresql server, so I had to run the following (oldserver is an ssh alias for the old server):

    ssh oldserver "sudo -u postgres pg_dump -Fc --no-acl --no-owner simpleform_production | gzip" | gzip -d | sudo -u simpleform pg_restore --verbose --clean --no-acl --no-owner -d simpleform_production

  7. Test your app with the newly filled database. At this point I realized I had to move over the files that had been uploaded to the old app, so I scped them over.

  8. You should have set up the systemd scripts (or any other init scripts) back in step 1.

  9. Set up your old server’s nginx config so that it proxies all traffic to your new server, but don’t reload nginx’s configuration yet. You can do this by adding a proxy_pass directive. You’ll also have to create an /etc/hosts entry on your old server so that the app’s domain points to the new server.

    try_files $uri/index.html $uri.html $uri @proxy;
    location @proxy {
      proxy_set_header  X-Real-IP          $remote_addr;
      proxy_set_header  X-Forwarded-For    $proxy_add_x_forwarded_for;
      proxy_set_header  X-Forwarded-Proto  $scheme;
      proxy_set_header  Host               $http_host;
      proxy_redirect    off;
      # yourdomain.com resolves to the new server via the /etc/hosts entry above
      proxy_pass        https://yourdomain.com;
    }
  10. Now, write a script which can be executed from the new server (again, oldserver is an ssh alias for the old server):

    #!/bin/bash -e

    # stop the application on the new server;
    # this is so we can drop the database on the new server
    echo stopping simpleform
    sudo systemctl stop simpleform

    # drop the database on the new server
    echo dropping db
    sudo -u simpleform dropdb simpleform_production

    # create a fresh database on the new server
    echo creating db
    sudo -u simpleform createdb simpleform_production

    # stop the application on the old server
    echo stopping remote simpleform
    ssh oldserver "sudo stop simpleform"

    # import the database from the old server to the new server
    echo importing db
    (ssh oldserver "sudo -u postgres pg_dump -Fc --no-acl --no-owner simpleform_production | gzip" | gzip -d | sudo -u simpleform pg_restore --verbose --clean --no-acl --no-owner -d simpleform_production) || /bin/true

    # start the application on the new server
    echo start local simpleform
    sudo systemctl start simpleform

    # reload the nginx configuration on the old server so it starts proxying to the new server
    echo reloading remote nginx
    ssh oldserver "sudo nginx -s reload"

  11. Change your DNS entries so that they point to the new server’s IP
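While the DNS change propagates, you can check that the new server answers for your domain without editing /etc/hosts again. A sketch using curl’s --resolve flag (yourdomain.com and 203.0.113.10 are placeholders for your domain and the new server’s IP):

```shell
# Force curl to resolve yourdomain.com to the new server's IP for this one request,
# and show just the response headers to confirm the app answers there
curl --resolve yourdomain.com:443:203.0.113.10 -I https://yourdomain.com/
```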

That is it! Now, your application is up on the new server. The dance in step #10 is required so that no one writes data to the old server that never makes it over to the new one. You will still have some downtime, but it will most probably be less than 5 minutes.