How to reduce the size of a PDF document

When you own a big collection of PDF files, the storage they use can grow quite large. I sometimes have PDF documents of more than 100 MB. Nowadays such storage requirements are not a big issue, but if you want to back up those files to other media like USB pen drives or a DVD, it helps to reduce the file size of your PDF collection.

Long ago I worked with a little script that allowed me to reduce the file size of a PDF document significantly. This script called an interactive tool named PDF Sam with some command line parameters. Unfortunately, many years ago PDF Sam made this option commercial, so I needed a new solution.

Before I get closer to my approach, let me cover some basics about what happens in the background. First of all, when a PDF blows up to a huge file, the reason is usually the included graphics. If you scan your handwritten notes to save them in a single archive, be aware that every scan is an image file. By default the PDF processor already optimizes those files. This is why the file size is hardly reduced when you try to compress a PDF with a tool like zip.

Scanned images can be optimized with a graphics tool like Gimp before including them in a PDF document. Possible actions are reducing the image quality and increasing the contrast. Especially for scanned handwritten notes these steps are important. If the contrast is very low and you plan to print those documents, it can happen that they are not readable. Another problem in this case is that you cannot run a text search over the document. A solution to that problem is an OCR tool that transforms the text in the images back into real text.

Let us briefly summarize these thoughts. To reduce the file size of a PDF, we need to reduce the quality of the included images. This can be done by reducing the number of dots per inch (dpi). Make sure that after compression the images are still readable. As long as you do not plan a high-quality print like a magazine or a book, nothing important will be lost.

If we want to shrink plenty of PDF files in a short time, we cannot do all those actions by hand; we need an automated solution. For that it is important that the tool we use supports the command line. Then we can create a simple batch job that performs the task without any hands-on work.

We have several options to optimize the images inside a PDF. Whether it is a good idea to apply all of them depends on the intended use of the documents:

  1. convert the images to the PNG format
  2. reduce the image dimensions to the real printable area
  3. reduce the DPI
  4. change the image color profile to gray-scale

As an Ubuntu Linux user I already have everything I need. Now comes the part where I explain my well-working solution.


GPL Ghostscript is used for PostScript/PDF preview and printing. Usually as a back-end to a program such as ghostview, it can display PostScript and PDF documents in an X11 environment.

If you don’t have Ghostscript installed on your system, you can fix this very fast:

sudo apt-get update
sudo apt-get -y install ghostscript

Before you execute any script or command, make sure the output does not overwrite the existing files. If something goes wrong, you lose all originals and cannot try other options. Before you start experimenting, back up your files or generate the compressed PDFs in a separate folder.

gs -sDEVICE=pdfwrite \
   -dCompatibilityLevel=1.4 \
   -dPDFSETTINGS=/default \
   -dNOPAUSE -dQUIET -dBATCH -dDetectDuplicateImages \
   -dCompressFonts=true \
   -r150 \
   -sOutputFile=output.pdf input.pdf

The important parameter is -r150, which reduces the output resolution to 150 dpi. In the manual you can check for more parameters to compress the result even further. You can place the given command in a script, surrounded by a FOR loop that fetches all PDF files in a directory and writes the reduced versions into another directory.
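The loop mentioned above can be sketched as a small shell script. The folder names originals and compressed are assumptions for illustration, and Ghostscript (gs) must be installed for the conversion itself to run:

```shell
#!/usr/bin/env bash
# Batch-compress every PDF from ./originals into ./compressed
# (both folder names are examples; adjust them to your setup).
mkdir -p originals compressed

for f in originals/*.pdf; do
  [ -e "$f" ] || continue                 # nothing to do if the folder is empty
  out="compressed/$(basename "$f")"
  gs -sDEVICE=pdfwrite \
     -dCompatibilityLevel=1.4 \
     -dPDFSETTINGS=/default \
     -dNOPAUSE -dQUIET -dBATCH -dDetectDuplicateImages \
     -dCompressFonts=true \
     -r150 \
     -sOutputFile="$out" "$f"
done
```

Because the reduced files land in a separate folder, the originals stay untouched, as recommended above.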

I used the command on an original file of 260 MB and 640 pages. After the operation the size was reduced to around 36 MB. The shrunken file is almost 7 times smaller than the original, a huge difference. As you can see in the screenshot, the quality of the pictures is almost identical.

As an alternative, in case you do not want to get closer to the command line, there is a free online PDF compression tool available in German and English.

PDF Workbench

Linux systems have many powerful tools to deal with PDF documents. For example, the LibreOffice suite has a button to generate a proper PDF file from every document. But sometimes you wish to create a PDF from the printing dialog of any other application on your system. The cups PDF print driver enables this functionality:

sudo apt-get install printer-driver-cups-pdf

As I already explained, OCR allows you to extract text from graphics to make a document searchable. When you work with this type of software, be aware that the results are good but you cannot avoid mistakes. Even when you perform an OCR on a scanned book page, you will find several errors. OCRFeeder is a free and very powerful OCR solution for Linux systems.

Another powerful helper is the tool PDF Arranger, which allows you to add pages to or remove pages from an existing PDF. You are also able to change the order of the pages.



Network spy protection with AdGuard Home on a Raspberry Pi and Docker

Maybe, like me, you have bought a Raspberry Pi 4 with 4 GB RAM and are thinking about what nice things you could do with it. From the beginning I had the idea to use it as a lightweight home server. Of course you could use a mini computer with more power, but with obviously higher energy consumption too, which is not a nice idea for a device running 24/7. As long as you don't plan to mine your own bitcoins or host a highly frequented shop system, a Pi should be sufficient.

I wanted to increase the security of my network. For this reason I found the application AdGuard, which blocks a lot of spy software used by internet services on every device connected to the network where AdGuard is running. Sounds great, and it is not so difficult to set up. Let me share my experience with you.

First, let's have a look at the overall system and its prerequisites. Behind the router from my internet service provider, I connected my own network router, an Archer C50, directly by wire. My Raspberry Pi 4 with 4 GB RAM runs Ubuntu Linux Server x64 (ARM architecture) as its operating system. The memory card is a 64 GB SanDisk Ultra. In case you need a lot of storage, you can connect an external SSD or HDD with a USB 3 to SATA adapter. Be aware to use storage that is made for permanent usage. Western Digital, for example, has a label called NAS, which is made for this purpose. If you use standard desktop drives, they could break quite soon. The Pi is connected to the router directly by a LAN cable.

The first step is to install the Docker service on Ubuntu. This is a simple command: apt-get install docker. If you want to get rid of sudo, you need to add your user to the docker group and restart the Docker service. If you want to get a bit more familiar with Docker, you can check my video Docker basics in less than 10 minutes.

sudo apt-get install docker
sudo gpasswd -a <user> docker
sudo systemctl restart docker

After this is done, you need to create a network through which the AdGuard container is reachable from your router under a static IP address on your Pi.

docker network create -d macvlan -o parent=eth0 \
--subnet= \
--ip-range= \
--gateway= \
lan

Before you just copy and paste the listing above, you need to change the IP addresses to the ones your network is using. Of the whole installation, this is the most difficult part. First, the network type we create is macvlan, bound to the network card eth0, which is the standard interface on the Pi 4. The name of the network we are going to create is lan. To get the correct values for subnet, ip-range and gateway, you need to log in to your router administration.

To understand the settings, we need a bit of theory. But don't worry, it is not much and not that complicated. Usually your router is reachable under a static IP address, and something similar is what we want to have for AdGuard on the Pi. The Pi itself is also reachable under its own address, but that IP we cannot use for AdGuard; the plan is to make the AdGuard web interface accessible under a separate static IP. OK, let's do it. First we have to switch in our router administration to the DHCP settings. In the screenshot you can see my configuration. After you apply your adaptions, don't forget to reboot the router so the changes take effect.

I configured a dynamic IP range, which means all the numbers below its start can be used to connect devices with a static IP. Here we also see the entry for our default gateway. With this information we are able to return to our network configuration. The subnet is like the gateway address, just with the last IP segment changed to a zero. The ip-range we limit to the address one number less than where the dynamic IP range starts. That's all we need to know to create our network in Docker on the Pi.
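As an illustration only, with made-up example addresses (your router will use different ones): if the router sits at 192.168.1.1, the dynamic DHCP range starts at 192.168.1.100, and we want 192.168.1.99 as the static address for AdGuard, the network creation would look like this:

```shell
# example values only; substitute the addresses of your own network
docker network create -d macvlan -o parent=eth0 \
  --subnet=192.168.1.0/24 \
  --ip-range=192.168.1.99/32 \
  --gateway=192.168.1.1 \
  lan
```

The /32 in ip-range restricts Docker to exactly the one address below the dynamic range, matching the rule described above.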

Now we need to create, in the home directory of our Pi, the places where AdGuard can store its configuration and data. This can be done with two simple commands in the SSH shell.

mkdir -p /home/ubuntu/adguard/work
mkdir -p /home/ubuntu/adguard/conf

Next we pull the official AdGuard image from Docker Hub and create the container, all with a single command.

docker run -d --name adguard --restart=always \
-p 3000:3000/tcp --net lan --ip \
-p 53/tcp -p 53/udp -p 67/udp -p 68/udp -p 80/tcp \
-p 784/udp -p 8853/udp \
-p 443/tcp -p 443/udp \
-p 853/tcp -p 853/udp \
-p 5443/tcp -p 5443/udp \
-v /home/ubuntu/adguard/work:/opt/adguardhome/work \
-v /home/ubuntu/adguard/conf:/opt/adguardhome/conf \
adguard/adguardhome

The container we create is called adguard, and we connect it to our previously created network lan with its static IP address. Then we have to open the many ports AdGuard needs to do its job. Finally we mount the two volumes for the configuration and data directories into the container. As restart policy we set always; this ensures that the service comes up again after the server or Docker is rebooted.

After the execution of the docker run command, you can reach the AdGuard setup page with your browser on port 3000 of the container's static IP. Here you run the initial setup to create a login user and so on. After this first setup, you reach the web interface directly under the container's IP.

This IP address you now need to paste into the DNS server field of your router's DHCP settings. Save the entries and restart your router to activate the changes. When the router is up again, open any web page from the internet in your browser to check that everything works fine. After this you can log in to the AdGuard web console and see whether data appears on the dashboard. If so, you are done and your home or office network is protected.
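To verify that AdGuard really answers DNS queries, you can also ask it directly from any machine in the network. The address below is a placeholder for the static IP you assigned to the container:

```shell
# replace 192.168.1.99 with the static IP of your AdGuard container
nslookup example.com 192.168.1.99

# or with dig, if it is installed
dig @192.168.1.99 example.com +short
```

If both commands resolve, the router-wide DHCP change will work the same way.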

If you think this article was helpful and you like it, you can support my work by sharing this post or leaving a like. If you have suggestions, feel free to drop a comment.

Installing NextCloud with Docker on a Linux Server

For businesses it is sometimes important to have a central place where employees and clients can interact. NextCloud is a simple and extendable PHP solution with a huge set of features that you can host yourself to keep full control of your data: classical groupware, ready for your own cloud.

If you want to install NextCloud on your own server, you first need a working PHP installation with an HTTP server like Apache. A database management system is also mandatory; you can choose between MySQL, MariaDB and PostgreSQL. The classical way to install and configure all those components takes a lot of time, and the maintenance is difficult. To overcome all this, we use a modern approach with the virtualization tool Docker.

The system setup is as follows: Ubuntu x64 server, PostgreSQL database, pgAdmin for database management, and NextCloud.


  • Docker basics
  • Installing Docker on an Ubuntu server
  • Preparing your database
  • Putting it all together and making it run
  • Insights into operating NextCloud

Docker Container Instructions

# create network
docker network create -d bridge --subnet= service

# postgres database server
docker run -d --name postgres --restart=always \
--net service --ip \
-e POSTGRES_PASSWORD=s3cr3t \
-v /home/ed/postgres/data:/var/lib/postgresql/data \
postgres

# copy files from container to host system
docker cp postgres:/var/lib/postgresql/data /home/ed/postgres

# pgAdmin administration tool
# (e-mail and password are placeholders; change them)
docker run -d --name pgadmin --restart=no \
-p 8004:80 --net service --ip \
-e PGADMIN_DEFAULT_EMAIL=admin@example.com \
-e PGADMIN_DEFAULT_PASSWORD=s3cr3t \
dpage/pgadmin4

# nextcloud container
docker run -d --name nextcloud --restart=always \
-p 8080:80 --net service --ip \
-v /home/ed/_TEMP_/nextcloud:/var/www/html \
-e POSTGRES_HOST=postgres \
-e POSTGRES_DB=nextcloud \
-e POSTGRES_USER=nextcloud \
-e POSTGRES_PASSWORD=nextcloud \
nextcloud

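NextCloud expects its database and user to exist before the first start. One possible way, sketched here with the same example credentials as in the listing above, is to create them through psql inside the running postgres container:

```shell
# create the role and database for NextCloud inside the postgres container
docker exec -it postgres psql -U postgres \
  -c "CREATE USER nextcloud WITH PASSWORD 'nextcloud';"
docker exec -it postgres psql -U postgres \
  -c "CREATE DATABASE nextcloud OWNER nextcloud;"
```

After that, the NextCloud web installer can connect with the credentials passed in the environment variables.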


If you have any questions, feel free to leave a comment. If you need help installing and operating your own NextCloud installation securely, don't hesitate to contact us via the contact form. In case you liked the video, leave a thumbs up and share it.

Wind of Change – a journey to Linux

1989 was a historic year, not just for Germany but for the whole world, when the Berlin Wall came down. A few months before, many people had wished for this event, but no one imagined it would come true, and nobody had in mind that everything would go so fast. Not even me, who grew up on the side where we wished to touch the "always greener grass" of our neighbors. The Scorpions captured the spirit of the time with the song "Wind of Change", the unofficial hymn of the German reunification.

It was similar for me with the strong dominance of Microsoft Windows as the operating system when I got into computers. I never thought there would come a time without Windows for me. I had loved Windows XP, after they fixed some heavy problems with Service Pack 1. Windows 7 was also an OS I really liked to use, and I never thought about a change. I always defended my Windows OS up to that point, because it was a really good system. But some years ago I changed my opinion, dramatically. Another wind of change. Let me give a brief history of decisions and experiences.

Before I continue I need to clarify that this post will not mention anything from the Apple universe. To this day I have never owned or used any Apple device. Why? Because there is no reason for me.

Everything began when I bought my Microsoft Surface 3 Pro with Windows 8.1. In the beginning I was happy with the compact system and its performance, and it was very portable, an important fact for me because I travel a lot. I also took it on my pilgrimage to Spain on the Camino de Santiago to write my blog posts with it. Unfortunately, in the last week, in Portugal, the screen broke, and I was only able to use the device with an external mouse. So I was forced to send it to Microsoft support. For a very shameless amount, close to the price of a new device, I got a replacement with Windows 10. In between I also installed Windows 10 on my ThinkPad 510p.

I felt like back in the year 2000 with my old Fujitsu Siemens desktop and Windows Millennium: almost every 3 months a re-installation of the whole system was necessary, because an update had broken it. Most of my time I spent sitting in front of the machine, waiting for my Windows 10 system to be done with its updates. During the updates the performance of Windows 10 goes down dramatically and the system is not usable. If the device has had no internet connection for more than 6 months, it is also close to impossible to turn it on and start working with it. But this is not the whole drama. Every major update breaks the customized configuration: apps from the MS Store that were already deleted appear again, additional language settings break, and deactivated unwanted features are activated again. Another pain point is when you don't have enough disk space for the Windows 10 update. All this costs a lot of pain and frustration, and to this day the situation has not really changed. By the way, Ubuntu Mate also runs on a Microsoft Surface 3 Pro.

The best you are able to do with a Surface 3 Pro and its Windows 10 installation.

After I decided to run away from Windows, I needed to choose my new operating system. Well, Linux! But which distribution? In my job I usually only got experience with servers, not with desktop systems. I remember my first SUSE Linux experiences in the early 2000s; I think it was version 7 or so. I bought it in a store, because it came with printed documentation and downloading more than a gigabyte with my 56k modem was not an option. Some years later I worked with Ubuntu and Fedora. In the beginning I did not want to use Ubuntu for my change, because of the Unity desktop. The first installation of Fedora needed a lot of hands-on work to establish services like Dropbox and Skype. I was searching for a system I could code on. For these requirements many people recommend Debian, but Debian is more for experienced users and not good advice for most people getting in touch with Linux for the first time. After some investigation I found Ubuntu Mate: a Gnome-based desktop on a Debian foundation with a huge software repository. This sounded perfect for my needs, and it is still the system of my choice today.

After I installed Ubuntu Mate on my machine, I really liked it from the first moment: fast and simple installation, excellent documentation, and all the applications I needed were there. Because of my travels I bought an Asus ZenBook UX some years ago, and I have run Ubuntu on it from the first unboxing. Whenever people see my system they are surprised, because everything looks like an iBook, but it is much better.

The change to Ubuntu Mate was much easier than I expected. With some small tricks, a re-installation of the whole system now takes me less than 2 hours. The main concept is to always keep a clean, backed-up bash history file; then I am able to rerun all the commands needed for installation and configuration. For some applications, like my favorite IDE Apache NetBeans, I back up the configuration settings. The prefix of the filename is always the date when I performed the settings export; for example, a NetBeans backup file is named 2017-03-31_NetBeans. Currently I do those backups manually, not scripted. Developing a full automatism would take me too much time at the moment, and the services I have to back up are not that many, so a manual action is sufficient. Typical services like e-mail, SFTP and browser favorites are part of my manual backup procedure.
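The date-prefixed export can be scripted in a few lines. The backup folder and the demo file below are examples for the sketch; the function simply copies a given file with today's date as prefix, matching the naming scheme described above:

```shell
#!/usr/bin/env bash
# Copy a settings file into ./config-backups, prefixed with today's date,
# e.g. 2017-03-31_NetBeans (folder name is an example).
BACKUP_DIR=./config-backups
mkdir -p "$BACKUP_DIR"

backup_file() {
  local src="$1"
  cp "$src" "$BACKUP_DIR/$(date +%F)_$(basename "$src")"
}

# stand-in for a real settings export such as the bash history file
printf 'sudo apt-get install ghostscript\n' > demo_history
backup_file demo_history
```

In practice you would call backup_file on ~/.bash_history and the exported application settings.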

Since Firefox changed its API in 2018, a very useful tool for exporting and importing passwords is no longer available. To avoid a cloud service, which you should not trust to store all your account passwords online, I decided to use the crypto tool KeePass. After I added all my web accounts to this tool, the storage of the passwords also became more secure. With the browser plugin, the accounts can be shared between all popular browsers like Firefox, Chrome and Opera. The KeePass file with my stored passwords is automatically included in my backup. The only important discipline of password storing is to keep the password database up to date.

One thing I used heavily on Windows was the Portable Apps ecosystem. My strategy was to have independent installations of many well-configured services for work that I just needed to include in my current system. Something like this exists in Linux too, just without the Portable Apps environment. My preference is to always download the ZIP version of a software that does not need an installation. This I store on a second partition. In the case of a virgin OS setup, the partition just needs to be linked back into the OS and it is done. The strategy of keeping a separate disk partition also offers high flexibility, which we can use for another step: virtualization.
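Linking the tools partition back after a fresh install is a one-liner per application. The mount point and the application folder below are made up for the sketch; in reality the partition would sit under a path like /media:

```shell
#!/usr/bin/env bash
# simulate the second partition (in reality a mount point of a real partition)
TOOLS=./tools-partition
mkdir -p "$TOOLS/netbeans"

# after a virgin OS setup, one symlink per app restores the unpacked ZIP version
ln -sfn "$TOOLS/netbeans" ./netbeans
```

Because only the symlink lives in the fresh system, the configured application survives every re-installation untouched.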

I don't want to remind myself how much time I have spent in my life configuring development services like web servers and databases. For this problem, Docker is now the solution of my choice. Each service lives in its own image, and the data is linked from a directory into the container, as is the configuration. Starting and stopping a service is a simple command, and the host system stays clean. Tryouts of service updates are very easy and completely conflict-free, and a rollback can be performed at any time by deleting the container.

The biggest change was from MS Office to LibreOffice. With the functionality of LibreOffice I was already fluent; the problem was all the presentations and Word documents I had. If you open those files in both office applications, the formatting goes crazy. I found out that this is often just a problem of the fonts. So I downloaded a free and nice-looking font from Google, installed it on the old Windows machine, and then converted all my office documents away from MS Office.

My resume after some years of Linux usage: today I can say, absolutely honestly, that I do not miss anything. Of course I have to admit that I do not play games. After some hours of working on a computer every day, I prefer to move back into reality to meet friends and family, to explore places during my travels, or simply to read a book. With Linux I get great performance out of my hardware, and so far I have not had any issues with drivers. All my hardware still works well under Ubuntu Mate, just as I expect. With Linux instead of Windows I save a lot of lifetime and frustration.