This repository contains assorted shell scripts for personal use.

## Installation

Clone the repository somewhere and place symbolic links to scripts of interest into a directory on your `$PATH`.

## Contents
### merge-config.py

Manages system configuration file updates on Arch Linux based systems: merges configuration file changes from updated packages with local configuration file changes.

`merge-config.py` offers four subcommands to manage a changed configuration file:
- `backup`: Creates a permanent backup of a config file with an ".original" suffix, for use when the next package update occurs.
- `merge`: Creates a temporary backup, then performs a 3-way merge of the active configuration file, the original copy, and the updated copy created by pacman with the ".pacnew" suffix. `merge` offers multiple resolution strategies for merge conflicts; see its help for details.
- `finish`: Removes the temporary backup created by the `merge` command and renames the pacnew file to ".original", to be used when the next package update occurs.
- `revert`: Undoes a merge attempt using the temporary backup.
The merge logic uses gaspra as the merge algorithm, which must be installed via pip.
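The decision logic behind a 3-way merge can be sketched as follows. This is a minimal line-based illustration, not gaspra's actual algorithm: it assumes all three versions have the same line count, while real merge tools first align lines by diffing.

```python
def merge3(base, ours, theirs):
    """Per-line 3-way merge decision: keep the side that changed."""
    merged, conflicts = [], 0
    for b, o, t in zip(base, ours, theirs):
        if o == b:               # only "theirs" changed (or neither side did)
            merged.append(t)
        elif t == b or t == o:   # only "ours" changed, or both made the same change
            merged.append(o)
        else:                    # both changed differently: emit conflict markers
            conflicts += 1
            merged += ["<<<<<<< ours", o, "=======", t, ">>>>>>> theirs"]
    return merged, conflicts
```

(The `marker` strategy mentioned in the workflow below presumably leaves such conflict markers in the file for manual resolution.)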
## Workflow

- Before editing a configuration file for the first time, create a backup of it: `merge-config.py backup /path/to/file`
- Edit the configuration file.
- When updating packages, pacman creates `.pacnew` files for locally edited configuration files. For each config with an associated `.pacnew` file:
  - Run `merge-config.py merge /path/to/base-config-file`. (This step also uses the `.original` file, so have that available!)
  - Verify the merged file is valid. If so, run `merge-config.py finish /path/to/base-config-file`.
  - If the merge is invalid, try a different merge strategy or merge manually. (Use the `marker` merge strategy for manual merges.) Once finished, end the process with `merge-config.py finish /path/to/base-config-file`.
### dl-nyaa.xsh

Xonsh script to download torrents from nyaa.si using aria2c. Takes any number of numerical torrent ids (or URLs containing them) and fetches them in order.
### network_webcam.sh

Bash script to connect a network camera streaming video via SRT and use it as a standard webcam, as if it were a USB camera.
Uses v4l2loopback as the driver backend, and ffmpeg to run a filter chain that de-noises the input to make it easily compressible.
The goal is a static image without any noise when nothing moves in front of the camera.
The use case is streaming a low-movement environment in high resolution over a bandwidth-starved connection:
for example, playing a TCG online with physical cards, with a camera filming the play area from above.
There is little movement, but card texts require high-resolution images to be legible.
### fix_srt_timestamps.py

Fixes invalid timestamps in SRT subtitle files. Ensures that all milliseconds in timestamps are properly zero-padded to three digits.
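The fix can be sketched with a regular expression. This is an illustrative sketch, not the script's actual code; it assumes short millisecond fields should be left-padded with zeros (e.g. `,5` becomes `,005`).

```python
import re

# Match an SRT timestamp whose millisecond field has fewer than three digits.
_SHORT_MS = re.compile(r"(\d{2}:\d{2}:\d{2}),(\d{1,2})(?!\d)")

def fix_line(line: str) -> str:
    """Zero-pad 1- or 2-digit millisecond fields to three digits."""
    return _SHORT_MS.sub(lambda m: f"{m.group(1)},{int(m.group(2)):03d}", line)

print(fix_line("00:00:01,5 --> 00:00:03,75"))
# → 00:00:01,005 --> 00:00:03,075
```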
### scan_flatbed.xsh

Xonsh script to scan documents via a flatbed scanner. It uses
- scanimage to obtain images,
- imagemagick to fix half-pixel color channel offsets, and
- optipng to minimize the file size.

It can output a series of PNG images or a single PDF document. Image post-processing runs asynchronously in the background and processes images in parallel, up to the number of CPU cores in the system. It starts as soon as the first image scan is finished, to minimize the runtime.

Default values for optional parameters are read from a configuration file at `$XDG_CONFIG_HOME/ShellTools/scan_flatbed.ini`.
(The file is created and populated with the default values if it does not exist.)
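The parallel post-processing pattern can be sketched like this. It is an illustrative Python sketch, not the actual xonsh code; `postprocess` stands in for the real imagemagick/optipng steps.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def postprocess(image_path: str) -> str:
    # Placeholder for the real work (imagemagick fix-up, then optipng).
    return image_path + ".png"

def scan_all(pages):
    # Pool bounded by the CPU core count; work starts as soon as the
    # first page is submitted, while later pages are still being scanned.
    with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
        futures = [pool.submit(postprocess, page) for page in pages]
        return [f.result() for f in futures]
```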
### scan_adf.xsh

Xonsh script to scan (duplex) documents via a (simplex) ADF scanner. It uses
- scanimage to obtain images,
- imagemagick to fix half-pixel color channel offsets and apply deskewing, and
- optipng to minimize the file size.

In duplex mode, it asks for confirmation when one side is done, so that mistakes can be corrected, for example when the scanner pulls multiple pages at once or pulls overlapping pages.

Default values for optional parameters are read from a configuration file at `$XDG_CONFIG_HOME/ShellTools/scan_adf.ini`.
(The file is created and populated with the default values if it does not exist.)
### backup_paperless

Backup tooling for a Paperless installation.

#### Dependencies

- bup to create a de-duplicated, versioned backup with daily snapshots.
- par2 for parity data generation on git pack files, used via the `bup fsck` command.
- sshfs to mount the remote file system locally for syncing local to remote.
- rsync to sync the local backup to the remote.
#### Overview

Consists of a backup script to be placed in `$PATH`, a backup sync script, and systemd units to control the behavior:
- Two mount targets, one for a local file system, and one for a remote filesystem mounted via sshfs.
- A systemd service that runs the backup snapshot creation script and requires the local mount.
- A systemd service that runs the sync script and requires both local and remote mounts. It pushes changes from the local backup to the remote.
- A systemd service that remounts the local filesystem read-only. Required by the sync service, ensuring that both targets are not writable simultaneously.
- A systemd timer that starts the backup service once per day.
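The timer/service relationship can be sketched as follows. All unit names, paths, and the mount unit are hypothetical, not the actual files:

```ini
# backup-paperless.timer (hypothetical name)
[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target

# backup-paperless.service (hypothetical name)
[Unit]
Requires=mnt-backup.mount
After=mnt-backup.mount

[Service]
Type=oneshot
ExecStart=/usr/local/bin/backup_paperless
```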
The paperless installation runs on a Raspberry Pi, with the data on an SSD. The local backup resides on a dedicated partition on the MicroSD card and is only mounted during the backup process, to protect the data from a potential stray `rm -rf /*`.
The remote is a plain USB flash drive attached to an all-in-one router device running in the LAN and shared via SFTP.
It is also only mounted during the backup sync process.
The remote file system is not required to perform a snapshot; if it is unavailable, the sync script skips syncing the local backup with the remote.
### paperless_leaflet_bot

A bot that manages shop leaflets in Paperless. Consists of two scripts run via systemd services and triggered via timers:
- A cleanup script deleting old leaflets. Deletion moves the files to the Paperless trash bin.
- A download script that downloads new leaflets and adds them to Paperless via its API.
- A weekly timer running the cleanup script.
- A set of weekly timers running on different days, triggering the download script.
The paperless instance is configured to put leaflets into a separate storage path, which is excluded from backups via the paperless backup script.
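The cleanup rule can be sketched as follows. This is a hypothetical sketch: the retention period and the data shapes are assumptions, not the bot's actual code.

```python
from datetime import date, timedelta

def stale_leaflets(leaflets, today, max_age=timedelta(weeks=4)):
    """leaflets: iterable of (doc_id, added_date) pairs; returns ids to delete."""
    return [doc_id for doc_id, added in leaflets if today - added > max_age]
```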
### rpi_load_logger

Logs some system stats at regular intervals into an SQLite database.
Meant to run as a systemd service on a Raspberry Pi.
Logs:
- CPU load %
- RAM and swap usage
- CPU temperature
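The core of such a logger can be sketched as follows. The schema and column names are assumptions, not the actual database layout.

```python
import sqlite3
import time

def init_db(path: str = ":memory:") -> sqlite3.Connection:
    con = sqlite3.connect(path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS stats ("
        "ts REAL, load1 REAL, ram_used INTEGER, swap_used INTEGER, cpu_temp REAL)"
    )
    return con

def log_sample(con, load1, ram_used, swap_used, cpu_temp):
    # One row per sampling interval; ts is the Unix timestamp.
    con.execute(
        "INSERT INTO stats VALUES (?, ?, ?, ?, ?)",
        (time.time(), load1, ram_used, swap_used, cpu_temp),
    )
    con.commit()
```

On Linux, the load average is available via `os.getloadavg()`, and the CPU temperature can be read from `/sys/class/thermal`.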
### run-in-gui

Automatically runs a GUI application on (USB) device plugin. Uses udev rules and systemd user services.
Currently provided is a service for launching Yubico Authenticator when plugging in any Yubikey FIDO2/TOTP hardware token (with USB vendor id 0x1050).
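The udev-to-systemd wiring can be sketched as a rule like the following. This is a hypothetical sketch: the rule file path and the user service name are assumptions, not the actual files.

```
# /etc/udev/rules.d/99-yubikey-gui.rules (hypothetical path)
ACTION=="add", SUBSYSTEM=="usb", ATTRS{idVendor}=="1050", TAG+="systemd", ENV{SYSTEMD_USER_WANTS}="yubico-authenticator.service"
```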
## License

This applies to all scripts found here:
Copyright (C) 2023-2024 Thomas Hess <thomas.hess@udo.edu>
This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
You should have received a copy of the GNU General Public License along with this program. If not, see http://www.gnu.org/licenses/.