
Store current datetime in variable in Selenium IDE and use it for a random email address

I use variables all the time. And to be able to re-use a test over and over again, I need random email addresses whenever I fill in forms.

For this I define a variable holding the current date and time, and a second variable for the email address built from it.

My random email address will look like: selenium-20220318_122803@pauledenburg.com

Just store the following as one string in the ‘Target’ part of your command.

const date = new Date(); const pad = (n) => n < 10 ? "0" + n : String(n); return String(date.getFullYear()) + pad(date.getMonth() + 1) + pad(date.getDate()) + '_' + pad(date.getHours()) + pad(date.getMinutes()) + pad(date.getSeconds())

It will look like this in Selenium IDE:

storing the current datetime in a variable in Selenium IDE

Now you can use this to create an email address that is unique every time you run your test:

And use it when you want to fill a form.
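If you ever need the same kind of unique address outside Selenium IDE (for example when seeding test data from a script), a shell sketch using date produces the identical format; the address and domain below simply mirror the example above:

```shell
#!/bin/sh
# Shell equivalent of the datetime trick above: a timestamped email address
# like selenium-20220318_122803@pauledenburg.com
datetime=$(date +%Y%m%d_%H%M%S)
email="selenium-${datetime}@pauledenburg.com"
echo "${email}"
```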

Add bearer authentication to your Swagger endpoint

In your .json definition file:

{
  "swagger": "2.0",
  ...
  "securityDefinitions": {
    "bearerAuth": {
      "type": "apiKey",
      "in": "header",
      "name": "Authorization"
    }
  },
  ...
  "paths": {
    "/path": {
      "get": {
        "security": [
          {"bearerAuth": []}
        ],
        ...
      }
    }
  }
}

The official documentation is here: https://swagger.io/docs/specification/authentication/bearer-authentication/
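One caveat with this Swagger 2.0 apiKey workaround: Swagger UI sends whatever you type in the Authorize dialog as the raw header value, so you have to include the Bearer prefix yourself. A small sketch of the resulting header (the token value is made up):

```shell
#!/bin/sh
# The value you paste into Swagger UI's Authorize dialog must include "Bearer "
TOKEN="abc123"                                  # hypothetical token
AUTH_HEADER="Authorization: Bearer ${TOKEN}"
echo "${AUTH_HEADER}"
# an equivalent request from the command line (URL is a placeholder):
#   curl -H "${AUTH_HEADER}" https://api.example.com/path
```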

Git remove local branches that don’t exist remote

The quick way:

git branch --merged master | grep -v '^[ *]*master$' | xargs git branch -d
git remote prune origin
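You can sanity-check the one-liner on a throwaway repo before pointing it at real work (every name below is made up):

```shell
#!/bin/sh
set -e
# Throwaway repo with one merged branch that the one-liner should delete
dir=$(mktemp -d) && cd "$dir"
git init -q
git checkout -q -b master
git -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m 'initial commit'
git branch feature/done     # points at the same commit, so it counts as merged
git branch --merged master | grep -v '^[ *]*master$' | xargs -r git branch -d
git branch                  # only master is left
```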

Use the following to have the branches displayed before you’re asked to delete them.

branches=$(git branch --merged master | grep -v '^[ *]*master$'); \
printf '\n\nBranches to be removed:\n---\n'; \
echo ${branches} | xargs -n1; \
printf '---\n\nRemove the branches above? [Ny] ' \
    &&  read shouldDelete \
    && [[ "${shouldDelete}" =~ [yY] ]] \
      && echo $branches | xargs git branch -d \
      || echo 'aborted' 

source: https://stackoverflow.com/a/16906759

Process ‘wanwakuang’ with high process load

I noticed today that my server was very slow. Looking at the running processes, I noted that process wanwakuang and 000000 were going crazy.

process wanwakuang caused the load to go very high

Searching for wanwakuang on Google did not yield many results, but this article on HackerNews was very helpful: https://translate.google.com/translate?sl=auto&tl=en&u=http://hackernews.cc/archives/34789

Apparently wanwakuang is a mining process.

However, I could not find the binary on my system. My server is only running Docker containers, so probably one of the containers was at fault.

To find the docker container with the exploit, I executed the command:

$ find /var/lib/docker -type f -name wanwakuang
/var/lib/docker/overlay2/1752e86653539d82b50cf24c3d3f69b203fe059ca1650447016ca69033d468bf/diff/root/.configrc/a/wanwakuang
/var/lib/docker/overlay2/1752e86653539d82b50cf24c3d3f69b203fe059ca1650447016ca69033d468bf/diff/tmp/.W10-unix/.rsync/a/wanwakuang
/var/lib/docker/overlay2/1752e86653539d82b50cf24c3d3f69b203fe059ca1650447016ca69033d468bf/merged/root/.configrc/a/wanwakuang
/var/lib/docker/overlay2/1752e86653539d82b50cf24c3d3f69b203fe059ca1650447016ca69033d468bf/merged/tmp/.W10-unix/.rsync/a/wanwakuang

To find out which Docker container was attached to this overlay, I issued this command I found on Stack Overflow:

$ docker inspect $(docker ps -qa) \
  | jq -r '.[] | "\(.Name)\t\(.GraphDriver.Data.MergedDir)"' \
  | grep '1752e86653539d82b50cf24c3d3f69b203fe059ca1650447016ca69033d468bf'

Knowing the name I could terminate the container. It was being used for SSH and could be removed.

Debugging cron on Docker

I had the issue of cronjobs not working (correctly) on my Docker instances.

This is what I did to fix it:

  1. Set correct permissions: chmod 0600 /etc/cron.d/cronjob
  2. Set correct owner: chown root /etc/cron.d/cronjob

When a cronjob failed, I could not find any logs explaining why.
In order to see the output of the failed cronjobs, I installed postfix (because cron mails the output of cronjobs) and I installed rsyslog:

  1. Install postfix:
    apt-get update; apt-get install -y postfix; mkfifo /var/spool/postfix/public/pickup; service postfix restart
  2. Install rsyslog:
    apt-get update; apt-get install -y rsyslog; rsyslogd &

Now, whenever a cronjob failed, I could find output in one of two locations:

  1. In the syslog: /var/log/syslog
  2. In the mail sent to root: /var/mail/root
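For reference, this is the shape of a working /etc/cron.d file; unlike a user crontab it has a sixth field naming the user to run as. The script path and schedule below are placeholders, and the file is written to /tmp only for illustration:

```shell
#!/bin/sh
# Minimal cron.d-style job: minute hour dom month dow USER command
cat > /tmp/cronjob <<'EOF'
* * * * * root /usr/local/bin/myscript.sh >> /var/log/myscript.log 2>&1
EOF
chmod 0600 /tmp/cronjob   # cron skips cron.d files with looser permissions
```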

CakePHP SplFileInfo::openFile(../tmp/cache/..) failed to open stream: Permission denied

You need to configure CakePHP to create the cache files with the right permissions.

You do this by setting 'mask' => 0666 in app.php for the Cache configuration:

// file src/config/app.php (AND app.default.php!)
...
    /**
     * Configure the cache adapters.
     */
    'Cache' => [
        'default' => [
            'className' => 'Cake\Cache\Engine\FileEngine',
            'path' => CACHE,
            'url' => env('CACHE_DEFAULT_URL', null),
            'mask' => 0666,
        ],

        ...

       '_cake_core_' => [
            'mask' => 0666,
            ...
        ],
       '_cake_model_' => [
            'mask' => 0666,
            ...
        ],
       '_cake_routes_' => [
            'mask' => 0666,
            ...
        ],

        ...
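The new mask only applies to cache files created from now on; files already written with the old permissions stay broken until you fix them by hand. A sketch of that one-off cleanup, demonstrated here on a scratch directory (in a real app you would point find at your tmp/cache, and the file name below is made up):

```shell
#!/bin/sh
set -e
# Scratch stand-in for CakePHP's tmp/cache directory
mkdir -p /tmp/app/tmp/cache
touch /tmp/app/tmp/cache/cake_core_translations
chmod 0600 /tmp/app/tmp/cache/cake_core_translations   # old, broken mask
# The actual fix: relax permissions on every existing cache file
find /tmp/app/tmp/cache -type f -exec chmod 0666 {} +
```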

Docker Ubuntu cron doesn’t work

Most probably:

  • cron service is not running
  • your cron-file in /etc/cron.d is not in file mode 0600
  • your cron-file in /etc/cron.d is not owned by root

To fix the above:

service cron start
chmod 0600 /etc/cron.d/<your file>
chown root /etc/cron.d/<your file>

If that does not work, install rsyslog on your container and debug:

apt-get update; apt-get install -y rsyslog
rsyslogd

Now keep an eye on your /var/log/syslog file. It will log everything it receives from cron.

I found the answer here: https://stackoverflow.com/a/33734915