Becoming my Own Package Maintainer (the Portage way)

06/03/2020

I was frustrated with my update flow. Across all my computers, I was essentially installing the same things on different systems, manually, with different package managers. I use Gentoo and Portage on my "faster" machines, but usually leave less powerful computers on systems that rely on binary package providers. Loving the Portage philosophy and hating redundancy, I wanted to reduce the number of different tools I was using to keep all of my systems up to date. Did I really need apt, pacman, and so on, to install slightly different builds of ffmpeg on similar systems? I already configure most of my systems to operate similarly for simplicity's sake, so I wanted to automate it.

My solution, luckily, was already implemented in a severely underrated feature of Portage. Portage lets you build your own packages, bundle them into binary packages, and distribute them to other computers over ftp, ssh, or http, provided there are no compile-option or USE flag conflicts between the machines. Since a lot of my posts just involve pointing at a wiki page, I'll go ahead and provide it here.
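On the distributing machine, the setup amounts to a couple of make.conf settings. A minimal sketch (the PKGDIR path is the modern Portage default; adjust to your install):

```shell
# /etc/portage/make.conf on the build (distributor) machine.

# "buildpkg" tells emerge to save a binary package for every
# package it compiles, alongside the normal install.
FEATURES="buildpkg"

# Where the generated binary packages land. This is the default on
# newer Portage versions; older installs used /usr/portage/packages.
PKGDIR="/var/cache/binpkgs"
```

Anything under PKGDIR is then what you serve to the other machines.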

To test how this could work, I wiped a cheap laptop clean and started a Gentoo Linux installation. A key part of the way binary packages work in Portage is the quickpkg program, which turns installed atoms into binary packages. By default, quickpkg respects CONFIG_PROTECT, which excludes a package's protected configuration files from the generated package (see "equery f" for a package's files). What went over my head, however, is that respecting CONFIG_PROTECT doesn't mean replacing your configuration files with skeleton files on installation; it means omitting the configuration files from the package altogether. This can make installing critical system packages for the first time via binary dangerous. Even if you include the configuration files, you risk writing configuration files that are incorrect for the target system. This means that you should still compile most mission-critical programs if you're installing them for the first time. If you still want emerge to write out / overwrite protected configuration files in a case where you're absolutely sure, you can set CONFIG_PROTECT="-*" to (as per the make.conf manpage) "mercilessly auto-update your config files."
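For concreteness, package generation looks roughly like this (the atom is just an example):

```shell
# Turn an installed atom into a binary package under PKGDIR.
# By default, files protected by CONFIG_PROTECT are left out
# of the resulting package.
quickpkg app-editors/vim

# Explicitly bundle the protected config files as well. Use with
# care: the files come from *this* machine and may be wrong for
# the system that eventually installs the package.
quickpkg --include-config=y app-editors/vim
```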

As for server configuration, I settled on ssh, with a user dedicated to serving packages. Setting --getbinpkg in EMERGE_DEFAULT_OPTS on the client lets emerge seamlessly and automatically pull a binary of the same atom whenever one is available. I tend to include configuration files at build time for packages that write trivial configuration files, like GIMP or LibreOffice, since for those, something is better than nothing on first install.
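On the client side, the relevant pieces look something like this (the hostname, user, and path are made up for illustration; fetching over ssh assumes key-based authentication for the serving user):

```shell
# /etc/portage/make.conf on a client machine.

# Where emerge looks for prebuilt packages; Portage accepts
# http://, https://, ftp://, and ssh:// URIs here.
PORTAGE_BINHOST="ssh://binpkg@buildhost/var/cache/binpkgs"

# Prefer a matching binary package whenever one exists,
# falling back to compiling from source otherwise.
EMERGE_DEFAULT_OPTS="--getbinpkg"
```

With that in place, a plain emerge --update @world pulls binaries wherever the USE flags and compile options match, and compiles the rest as usual.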

I'm surprised this isn't noted more when I hear discussions about Portage. You get to make your own ecosystem of specialized packages that you distribute to yourself, to keep low-power systems updated but also optimized. The only thing you might be giving up is a bit of specialization, since the distributor needs to be using the same USE and compile options as the client. Save for a kernel upgrade, the updates to my old machine take less time than on my high-power machines, and I can trust all my computers to behave similarly.
