Gitlab CI upload artifact fails: too large
Today I wanted to add a package job to my GitLab CI as instructed in this nice GitLab tutorial.
I created the tar file, but when it came to uploading it, the job failed with Request Entity Too Large.
(...) ERROR: Uploading artifacts to coordinator... too large archive id=243 responseStatus=413 Request Entity Too Large status=413 Request Entity Too Large token=JYszbA9F
FATAL: Too large
ERROR: Job failed: exit status 1
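For context, the failing job looked roughly like the one below; the job name, stage, paths and archive name are illustrative placeholders, not the exact job from the tutorial.

package:
  stage: package
  script:
    - tar -czf build.tar.gz src/
  artifacts:
    paths:
      - build.tar.gz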
It took me some digging, but this is how I fixed it (note: the Nginx proxy was the one giving me a hard time).
Step 1: Set the maximum artifacts size
In your GitLab instance, go to Settings > Continuous Integration and Deployment > Maximum artifacts size (MB) and set it to the desired value. The default is 100 MB.
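If you prefer the command line, the same instance-wide limit can also be changed through GitLab's application settings API. The hostname and token below are placeholders for your own instance and an admin access token.

curl --request PUT --header "PRIVATE-TOKEN: <your-admin-token>" \
  "https://git.example.com/api/v4/application/settings?max_artifacts_size=250"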
Step 2: Set the nginx upload size
In the gitlab.rb file (mine is at /etc/gitlab/gitlab.rb), set or uncomment the following line.
nginx['client_max_body_size'] = '250m'
And reconfigure GitLab for the change to take effect.
gitlab-ctl reconfigure
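To confirm the reconfigure picked up the new value, you can grep the Nginx configuration that Omnibus generates; on my install it lives under /var/opt/gitlab/nginx/conf/, but the path may differ for your setup.

sudo grep client_max_body_size /var/opt/gitlab/nginx/conf/gitlab-http.conf
# should print: client_max_body_size 250m;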
Step 3: (optional) update your proxy(!)
I run GitLab in Docker containers. On the server, I run Nginx as a reverse proxy that forwards requests for GitLab to these containers.
I had failed to update that proxy configuration to allow POSTing large amounts of data to the containers.
As I use nginx, this is the line I added. For Apache, just google and you’ll find your answer.
client_max_body_size 0;
This removes the limit on the size of request bodies that clients may send.
For reference, this is my whole nginx vhost file.
server {
    listen 80;
    server_name git.pauledenburg.com;

    client_max_body_size 0;

    location / {
        proxy_pass http://localhost:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
Don’t forget to reload nginx.
$ sudo nginx -t
nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful
$ sudo service nginx reload
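Before re-running the pipeline, a quick sanity check is to push a large request body through the proxy and confirm Nginx no longer answers with 413. The file size and URL below are just an example; any status other than 413 means the proxy accepted the body.

dd if=/dev/zero of=/tmp/big-upload.bin bs=1M count=150
curl -s -o /dev/null -w "%{http_code}\n" -X POST \
  --data-binary @/tmp/big-upload.bin http://git.pauledenburg.com/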