Category Archives for computers

Bash: use colored text in scripts

I often use these handy helper functions to write colored text in my scripts.
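
The exact functions vary per project; here is a minimal sketch of such helpers (the function names and color codes are just examples):

red()    { echo -e "\e[31m$*\e[0m"; }
green()  { echo -e "\e[32m$*\e[0m"; }
yellow() { echo -e "\e[33m$*\e[0m"; }

# usage
green "All tests passed"
red "Something went wrong"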

XPath: search elements by text

When I write Selenium/Kantu end-to-end tests I often need to use XPath to find certain elements.
And while it’s definitely not the fastest approach, searching for elements by their text is easy and quick to use. It’s also more descriptive to the untrained eye.

I often refer to this post on StackOverflow which has it written down quite neatly: https://stackoverflow.com/a/2994336

It basically comes down to this:

Literal match (= must match exactly and completely):

//*[text()='match']
//button[text()='Save']

This matches elements containing (just) the text ‘match’ and a button with (just) the text ‘Save’.

If you need to strip whitespace from the element text before matching, use this:

//*/text()[normalize-space(.)='match']/parent::*

Partial matching:

If it’s OK to be less strict, use this looser match:

//*[contains(text(),'match')]

This matches texts like ‘this match comes up’ and ‘matches this’.

If the text must contain ‘match’ as a whole word (case-insensitive, at the start or end of the text or surrounded by non-word characters), use this XPath 2.0 expression:

//*[matches(text(),'(^|\W)match($|\W)','i')]

Pro tip: use the browser inspector to test your XPath.

Open the inspector and search with Ctrl+F (or Cmd+F on a Mac).

Watch files and execute a command upon change


Find yourself executing the same command over and over again after applying changes to certain files? Pywatch will be your best friend!

Meet pywatch: a cool little app that watches directories and files. Whenever it finds a file that changed, it executes the command you provided.

TL;DR

As an example: I use this to build a Docker image whenever I save a change to my Dockerfile.

Or to execute tests whenever I make a change to one of the source files.
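
A sketch of what that invocation could look like (the behat Docker command and the exact pywatch syntax are assumptions, check the pywatch README):

# command to run whenever a watched file changes (example only)
commandToExecute='docker run --rm -v "$PWD":/app my-behat-image vendor/bin/behat'

# watch all *.php and *.feature files under ./tests
pywatch "$commandToExecute" $(find ./tests -name '*.php' -o -name '*.feature')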

This keeps an eye on all *.php and *.feature files under ./tests.

When one of these files changes, it executes $commandToExecute which resolves to executing behat in a Docker container.

Install

Download the pywatch app from github: https://github.com/cmheisel/pywatch.

Then unzip and install with python.
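
Something along these lines (the archive name depends on what GitHub gives you):

unzip pywatch-master.zip
cd pywatch-master
sudo python setup.py install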

Advanced usage

Nice one: run tests when files change and pop up a macOS notification whenever the tests fail.

This way you can keep the tests running in the background and you’ll be notified whenever a test fails.
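
A sketch of how that could look, using the built-in osascript for the notification (the behat command is again an example):

pywatch "vendor/bin/behat || osascript -e 'display notification \"Tests failed\" with title \"behat\"'" $(find ./tests -name '*.feature')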


Set up NGINX as a proxy for your Docker containers


Recently I’ve become a fan of serving applications from Docker containers instead of as virtual hosts on a webserver.

In order to use regular domain names without ports, I set up Nginx to receive the request on the domain name and forward it to the relevant Docker container on the specific port it is running on.

Example

Imagine I have a Docker webserver container hosting my app. It runs on my server and exposes port 8080. I want to use the URL app.pauledenburg.com.

I don’t want people to use http://app.pauledenburg.com:8080, but just the URL without the port: http://app.pauledenburg.com.

I use nginx for this:
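
A minimal server block could look like this (assuming the container publishes port 8080 on the same host; adjust names and ports to your setup):

server {
    listen 80;
    server_name app.pauledenburg.com;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}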

And now add SSL to it 🙂
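
With certbot and its nginx plugin (see the LetsEncrypt post further down this page) that could be as simple as:

sudo certbot --nginx -d app.pauledenburg.com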


Complete ELK-stack example with Docker


I wanted a quick setup of an Elasticsearch, Logstash and Kibana (ELK) stack to work with. But searching the internet gave me too many long-winded examples that didn’t really work.

That’s why I created this page. Use it to quickly get up-and-running with an ELK-stack of your own.

Create the file docker-compose.yml
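
A minimal version could look like this, using the sebp/elk image mentioned in the notes below (the volume path follows the image documentation, verify it for your version):

version: '3'
services:
  elk:
    image: sebp/elk
    ports:
      - "5601:5601"   # Kibana
      - "9200:9200"   # Elasticsearch
      - "5044:5044"   # Logstash Beats input
    volumes:
      - elk-data:/var/lib/elasticsearch

volumes:
  elk-data: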

Now start up with docker-compose up -d. That’s it!

  • 5601: endpoint for Kibana
  • 9200: endpoint for Elasticsearch

Add some security

Don’t leave your Elasticsearch open to everyone.

Add some basic security by adding a .htpasswd file to your webserver configuration.
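
For example, create the password file with htpasswd (from apache2-utils / httpd-tools; the path and username are examples):

sudo htpasswd -c /etc/nginx/.htpasswd elkuser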

Then reference it in your webserver config, for example nginx.
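
A sketch of an nginx server block that puts basic auth in front of Kibana (the server name is an example):

server {
    listen 80;
    server_name elk.example.com;

    location / {
        auth_basic "Restricted";
        auth_basic_user_file /etc/nginx/.htpasswd;
        proxy_pass http://127.0.0.1:5601;
    }
}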

Reload nginx.
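
For example (assuming a systemd-based system):

sudo nginx -t && sudo systemctl reload nginx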

Some notes

I chose the Docker image of sebp because he’s got great documentation. Go check it out!

Especially the part with the Frequently Encountered Issues.

There, you’ll see that you’ll:

  • need 4GB of memory for the Docker container
  • need to raise the virtual memory limit on Linux by setting the max map count: sudo sysctl -w vm.max_map_count=262144 (see below for making this persistent)
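
To make that last setting survive a reboot, a common approach (assuming you manage /etc/sysctl.conf directly) is:

echo 'vm.max_map_count=262144' | sudo tee -a /etc/sysctl.conf
sudo sysctl -p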


Free SSL certificates with LetsEncrypt


Getting your website on HTTPS can be done in a matter of minutes, so there is no excuse anymore to go without it. Not even on your test and dev websites.

Although this example is on CentOS, it applies just as well to any other Linux distro.

Excellent, tailor-made instructions per webserver and OS are found on the website of Certbot:
https://certbot.eff.org/

Here is a short recap of that for my own archive.

You’ll need the EPEL repository for this. After that, install the certbot software.
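
On CentOS 7 that boils down to something like this (the package names may differ per version):

sudo yum install epel-release
sudo yum install certbot python2-certbot-nginx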


Getting your website secured with SSL is now as simple as answering the questions asked by the following command.

Note: I’m using a method which involves a bit of downtime, because LetsEncrypt is in the middle of an update. Read all about it.
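
A sketch of that approach (the domain is an example; the standalone plugin needs port 80 free, hence stopping nginx first):

sudo systemctl stop nginx
sudo certbot certonly --standalone -d example.com
sudo systemctl start nginx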


Things that might throw you an error

python-urllib3 version

The first caveat for CentOS 7 is that you need urllib3 version 1.21 specifically. I had 1.22 installed via yum, which gave me the following error.

You can see the currently installed version with pip:
pip freeze | grep urllib
To resolve this, first remove the old version with yum and then add it with pip:
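
Something like this (the yum package name is the CentOS 7 one; verify the exact urllib3 version certbot wants):

sudo yum remove python-urllib3
sudo pip install urllib3==1.21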

pyOpenSSL version

Just like urllib3, pyOpenSSL was at an unsupported version.
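
The fix is the same recipe as for urllib3 (the package name and target version are assumptions, check what certbot asks for):

sudo yum remove pyOpenSSL
sudo pip install pyOpenSSL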

Error message stating that the CA can’t be satisfied

After running
certbot --nginx
you get the following error:

Due to legal reasons there currently is no

From the certbot GitHub website:

If you’re serving files for that domain out of a directory on Nginx, you can run the following command:
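
That command looks roughly like this (webroot path and domain are placeholders; check the certbot docs for the current form):

sudo certbot certonly --webroot -w /var/www/example -d example.com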

If you’re not serving files out of a directory (for instance if you are using proxy_pass), you can temporarily stop your server while you obtain the certificate and restart it after Certbot has obtained the certificate. This would look like:
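
Roughly like this (again with a placeholder domain; the hooks stop and start nginx around obtaining the certificate):

sudo certbot certonly --standalone --pre-hook "systemctl stop nginx" --post-hook "systemctl start nginx" -d example.com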


Install python pip


From the pip website:

pip is already installed if you’re using Python 2 >=2.7.9 or Python 3 >=3.4 binaries downloaded from python.org, but you’ll need to upgrade pip.

To install pip, run the following.
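
The documented way is to download get-pip.py and run it with the Python interpreter you want pip for:

curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
sudo python get-pip.py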


SonarQube with Postgres on docker-compose

Struggling to get a working environment with SonarQube and PostgreSQL?

Use the following docker-compose file and be up and running in minutes.

It is as ‘bare’ as possible:

  • use of official Docker images for both PostgreSQL and SonarQube
  • no other configuration required
  • use of volumes so you can back up your data
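
A sketch of such a docker-compose.yml (the SONARQUBE_JDBC_* variables match the official image at the time of writing; newer versions use SONAR_JDBC_* instead):

version: '3'
services:
  sonarqube:
    image: sonarqube
    ports:
      - "9000:9000"
    environment:
      - SONARQUBE_JDBC_URL=jdbc:postgresql://db:5432/sonar
      - SONARQUBE_JDBC_USERNAME=sonar
      - SONARQUBE_JDBC_PASSWORD=sonar
    volumes:
      - sonarqube_data:/opt/sonarqube/data
      - sonarqube_extensions:/opt/sonarqube/extensions
    depends_on:
      - db

  db:
    image: postgres
    environment:
      - POSTGRES_USER=sonar
      - POSTGRES_PASSWORD=sonar
      - POSTGRES_DB=sonar
    volumes:
      - postgresql_data:/var/lib/postgresql/data

volumes:
  sonarqube_data:
  sonarqube_extensions:
  postgresql_data: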

Start this stack with docker-compose up -d. You can reach your SonarQube instance at http://localhost:9000. Use the default credentials admin/admin to log in.
