
Docker on Windows: /usr/bin/env: bash\r No such file or directory

Getting the following error when you start your Docker container on Windows: /usr/bin/env: bash\r: No such file or directory?

I found several threads with different fixes, but this is what fixed it for me:

Solution: Setting the line-endings correctly

  • In my editor: \n
  • In git: git config --global core.autocrlf false

I chose the \n line ending as this is stated in PSR-12: 2.2 Files:
All PHP files MUST use the Unix LF (linefeed) line ending only.

I use PHPStorm and had to go to
Settings > Editor > Code Style > General tab > Unix and macOS (\n)
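
If you want to enforce LF for everyone working on the repository, instead of relying on each developer’s editor and git settings, a .gitattributes file in the project root can do this as well. A minimal sketch:

# file: .gitattributes
# normalize all files git detects as text to LF line endings
* text=auto eol=lf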

NOTE: you may have to do the following to fix files with the wrong line-endings:

  • remove the built Docker image on Windows: first list the images with docker images and then delete the image with docker rmi <imagename>
  • fix the line-endings of the file. You might do that by removing the newline and adding it again, or use the commands shown below. Don’t forget to save, commit and push.
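
A sketch of fixing the line-endings from the command line (entrypoint.sh is just an example filename here):

# convert CRLF to LF (requires dos2unix to be installed)
$ dos2unix entrypoint.sh

# alternative without dos2unix: strip the carriage returns with sed
$ sed -i 's/\r$//' entrypoint.sh

# rebuild the Docker image afterwards
$ docker build -t <imagename> .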

Set up NGINX as a proxy for your Docker containers

Lately I’m a fan of serving apps from Docker containers rather than as virtual hosts on a webserver.

In order to use regular domain names without ports, I set up nginx to receive the request on the domain name and forward it to the relevant Docker container on the port it is running on.

Example

Imagine I have a Docker webserver container hosting my app. It runs on my server and exposes port 8080. I use the URL app.pauledenburg.com.

I don’t want people to use http://app.pauledenburg.com:8080, but just the URL without the port: http://app.pauledenburg.com.

I use nginx for this:

server {
    listen 80;
    server_name app.pauledenburg.com;

    location / {
        proxy_pass http://localhost:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
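
For completeness: the app container in this example could have been started like this (my-app is a hypothetical image name; the important part is publishing port 8080 on the host). After adding the server block, test and reload nginx:

# run the app container and publish it on host port 8080
# (adjust the container port 80 to whatever your app listens on)
$ docker run -d --name app -p 8080:80 my-app

# test the nginx configuration and reload it
$ sudo nginx -t
$ sudo service nginx reload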

And now add SSL to it 🙂
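
A quick way to do that is with Let’s Encrypt; a minimal sketch, assuming certbot and its nginx plugin are installed on the server:

# request a certificate and let certbot update the server block for SSL
$ sudo certbot --nginx -d app.pauledenburg.com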

Complete ELK-stack example with Docker

I wanted a quick setup of an Elasticsearch, Logstash and Kibana (ELK) stack to work with. But searching the internet gave me too many long-winded examples that didn’t really work.

That’s why I created this page. Use it to quickly get up-and-running with an ELK-stack of your own.

Create the file docker-compose.yml

# file: docker-compose.yml
version: "3"

services:
  elk:
    image: sebp/elk
    ports:
      - "5601:5601"
      - "9200:9200"
      - "5044:5044"
    environment:
      - MAX_MAP_COUNT=262145
      - ELASTICSEARCH_START=1
      - LOGSTASH_START=1
      - KIBANA_START=1
      - TZ=Europe/Amsterdam
    volumes:
      - elk-data:/var/lib/elasticsearch

volumes:
  elk-data:

Now start up with docker-compose up -d. That’s it!

  • 5601: endpoint for Kibana
  • 9200: endpoint for Elasticsearch
  • 5044: endpoint for Logstash (Beats input)
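
To check that the stack is up, you can for example query Elasticsearch and Kibana (give the containers a minute to start):

# Elasticsearch should answer with a JSON blob containing cluster info
$ curl http://localhost:9200

# Kibana should answer on port 5601
$ curl -I http://localhost:5601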

Add some security

Don’t leave your Elasticsearch open to everyone.

Add some basic security by adding a .htpasswd file to your webserver.

$ sudo sh -c "echo -n 'myelasticuser:' >> /etc/nginx/.htpasswd"
$ sudo sh -c "openssl passwd -apr1 >> /etc/nginx/.htpasswd"
Password:
Verifying - Password:

Add it to your webserver, for example nginx:

server {
    listen 80 default_server;
    listen [::]:80 default_server ipv6only=on;

    root /var/www/html;
    index index.html index.htm;

    server_name localhost;

    location / {
        try_files $uri $uri/ =404;
        auth_basic "Restricted Content";
        auth_basic_user_file /etc/nginx/.htpasswd;
    }
}
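
Note that the server block above only protects the default document root. To actually put Elasticsearch behind the basic authentication, you could add a location block like this inside the same server block (a sketch, assuming Elasticsearch listens on localhost:9200):

location /elasticsearch/ {
    auth_basic "Restricted Content";
    auth_basic_user_file /etc/nginx/.htpasswd;
    proxy_pass http://localhost:9200/;
}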

Reload nginx.

$ sudo nginx -t
nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful

$ sudo service nginx reload
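
You can verify that the authentication is enforced with curl:

# without credentials: the request should be rejected with a 401
$ curl -I http://localhost/

# with credentials: you should get through to the content
$ curl -u myelasticuser -I http://localhost/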

Some notes

I chose the Docker image of sebp because he’s got great documentation. Go check it out!

Especially the part about the Frequently Encountered Issues.

There you’ll see that you:

  • need 4GB of memory for the Docker container
  • need to set the amount of virtual memory on Linux by raising the max map count: sudo sysctl -w vm.max_map_count=262144 (see below for how to make this persistent)
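
That sysctl setting does not survive a reboot. To make it persistent you can, for example, add it to /etc/sysctl.conf:

$ echo "vm.max_map_count=262144" | sudo tee -a /etc/sysctl.conf
$ sudo sysctl -p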

SonarQube with Postgres on docker-compose

Struggling to get a working environment with SonarQube and PostgreSQL?

Use the following docker-compose file and be up and running in minutes.

It is as ‘bare’ as possible:

  • use of official Docker images for both PostgreSQL and SonarQube
  • no other configuration required
  • use of volumes so you can backup your data


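# file: docker-compose.yml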
version: "3"

services:
  sonarqube:
    image: sonarqube:7.9.2-community
    restart: unless-stopped
    environment:
      - SONARQUBE_JDBC_USERNAME=sonar
      - SONARQUBE_JDBC_PASSWORD=v07IGCFCF83Z95NX
      - SONARQUBE_JDBC_URL=jdbc:postgresql://db:5432/sonarqube
    ports:
      - "9000:9000"
      - "9092:9092"
    volumes:
      - sonarqube_conf:/opt/sonarqube/conf
      - sonarqube_data:/opt/sonarqube/data
      - sonarqube_extensions:/opt/sonarqube/extensions
      - sonarqube_bundled-plugins:/opt/sonarqube/lib/bundled-plugins

  db:
    image: postgres:12.1
    restart: unless-stopped
    environment:
      - POSTGRES_USER=sonar
      - POSTGRES_PASSWORD=v07IGCFCF83Z95NX
      - POSTGRES_DB=sonarqube
    volumes:
      - sonarqube_db:/var/lib/postgresql
      # This needs explicit mapping due to https://github.com/docker-library/postgres/blob/4e48e3228a30763913ece952c611e5e9b95c8759/Dockerfile.template#L52
      - postgresql_data:/var/lib/postgresql/data

volumes:
  postgresql_data:
  sonarqube_bundled-plugins:
  sonarqube_conf:
  sonarqube_data:
  sonarqube_db:
  sonarqube_extensions:

Start this stack with docker-compose up -d. You can reach your SonarQube instance at http://localhost:9000. Use the default credentials admin/admin to log in.
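
SonarQube can take a little while to start. You can check the containers and follow the logs with docker-compose:

# check that both containers are running
$ docker-compose ps

# follow the SonarQube logs until it reports that it is up
$ docker-compose logs -f sonarqube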
