Patch management for multiple boards

I'm new to administering Linux in general. I'm trying to set up a server that downloads updates for some 905x boards and to configure the boards to pull their updates from that server. Is the best way to do this to mirror the files and directories from security.debian.org/debian-security and deb.libre.computer/repo, run a web server to serve those files, and change the configs in /etc/apt/sources.list and /etc/apt/sources.list.d/libre-computer-deb.list to point at the update management server? Are there any alternatives that would be a better process?
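For context, the kind of change I mean would look something like this on each board (the hostname "updates.local", the suite, and the file name are just placeholders for my setup, not real values):

```shell
# Point the board at the local mirror instead of security.debian.org
# ("updates.local" and "bookworm-security" are placeholders):
echo 'deb http://updates.local/debian-security bookworm-security main' | \
  sudo tee /etc/apt/sources.list.d/local-mirror.list
```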

I use apt-cacher-ng. There's an official Debian wiki reference page for it; the project page and documentation links there have more info.

Note that HTTPS repositories can make things complicated… but apt-cacher-ng does help with bandwidth use and update speed quite a bit in my experience.
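If you do run into HTTPS-only repositories, one common workaround is to let apt-cacher-ng tunnel them rather than cache them. This is just a sketch of the relevant acng.conf setting; the fully permissive pattern here is an example, not a recommendation:

```
# /etc/apt-cacher-ng/acng.conf on the proxy — allow CONNECT tunneling
# for HTTPS repos (tunneled traffic is passed through, not cached):
PassThroughPattern: .*
```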

It works pretty much as you described in your “wish list,” but it's even easier to implement: pick one system with a good amount of storage space as the proxy/cache and install apt-cacher-ng; then, on each client, add the /etc/apt/apt.conf.d/[configuration file] that tells apt to send its requests through the proxy.
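The whole setup is only a couple of commands; the hostname "apt-cache.local" and the config file name below are examples, not fixed values:

```shell
# On the cache host (example name apt-cache.local):
sudo apt install apt-cacher-ng        # listens on port 3142 by default

# On each client, a one-line apt config (file name is arbitrary):
echo 'Acquire::http::Proxy "http://apt-cache.local:3142";' | \
  sudo tee /etc/apt/apt.conf.d/02proxy
```

After that, a plain `sudo apt update` on the clients goes through the cache with no other changes to sources.list.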

When a client requests an update it will try to pull from the proxy; the proxy will pull down the update files from the various repositories and cache them. Subsequent requests from other clients for the same update file will be served from the proxy’s cache. Eventually one of the clients will request a file, or an updated version, that the cache doesn’t have, and apt-cacher-ng will pull down the requested file and make it available, keeping the cache fresh.

It will auto-expire old/unused files after a while. You can also pre-seed the cache if you already have files (I never have). And there’s a web interface where you can track stats and do other maintenance tasks. But it’s mostly seamless, very efficient and takes care of itself pretty well.
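If you want to peek at that web interface, it lives on the same port as the proxy; assuming the example hostname from before:

```shell
# Stats/maintenance page (replace apt-cache.local with your proxy's name):
curl -s http://apt-cache.local:3142/acng-report.html
```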


Per @bisco, apt-cacher-ng is our recommended path for proxying apt updates. There are many Debian/Ubuntu guides online covering its setup, and we also use it internally with great success.
