R Numbers

A few days ago there was a post on /r/de showing a nice graph of the now famous R number for Germany:


Exciting for a couple of reasons: the dataset was available (i’d been looking for it for a while), and the reported number was over 1… which isn’t good.

The post linked to a .ipynb script that had been used to produce the image. The code was obviously Python and somehow related to Jupyter Notebook – time to learn something new!

Since the original script was published, the Excel spreadsheet download has gained a cover sheet, which obviously broke things. Having patched things up a little (and translated the labels to English), here is the updated plot:


One of the labels has gone missing… oops.

Update: the missing label was important! The above graph is for a new RKI dataset that tracks R on a 7 day average; the original series was too sensitive (see below). For “reasons” that series doesn’t include error estimates for the last data points, which broke the calculation done on the final point. Below is a new plot of the 4 day averaged series – now really an update of the original image:


As you can see, the R value for the original chart has been revised down, and the current value remains below 1. The brief bump up was attributed to infections among slaughterhouse slave labour, housed in cramped shared dormitories. Come on Germany – be better than that!

Update: weekends are obviously good for tracking down datasets and visualizations! There is a GitLab project running model simulations on the regional data, below is the plot for Hamburg:


Having no background in epidemiology (or statistical analysis…) all i can do is accept it as presented, and note that it correlates well with the recent decline in reported new infections. The RKI made an interesting observation on the sensitivity of R the other day, noting that as the number of active cases (infections) fell, any new hotspots (such as the slaughterhouse outbreaks) would have a larger impact on the reported number.

Concatenation with ffmpeg

This is something that regularly needs doing, and i never remember how to do it…

Create a file that contains each file to be concatenated:

% for i in *.mp4; do echo "file '$i'" >> files.txt; done
% cat files.txt
file '1.mp4'
file '2.mp4'
file '3.mp4'

and now let ffmpeg do its magic:

% ffmpeg -f concat -safe 0 -i files.txt -c copy out.mp4

This works well for sites that make youtube-dl download files in pieces… and having written this down, i’ll never need to look it up again.
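One gotcha with globbing or `ls`: names sort lexically, so `10.mp4` lands before `2.mp4` and the concatenation comes out in the wrong order. A sketch of building files.txt in natural numeric order (assuming filenames of the form N.mp4, as in the listing above):

```shell
# Sort the numeric stems before the ".mp4" extension numerically
# (-t. splits on the dot, -k1,1n sorts the first field as a number),
# then emit one "file '...'" line per clip for the concat demuxer.
ls *.mp4 | sort -t. -k1,1n | while read -r f; do
  echo "file '$f'"
done > files.txt
```

The resulting files.txt feeds straight into the same `ffmpeg -f concat` invocation.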

The State of “Social” (Redux)

This post, back in 2018, mused a little about the general state of social networking, and the failure of Mastodon to gain any traction. Despite the obvious and ongoing twin clusterfucks of Facebook and Twitter, little seems to have changed.

Since 2018 i’ve (not for the first time) deleted my Twitter account, and will not be going back. They don’t seem to have noticed…

Both Facebook and Twitter appear to have doubled down on the strategy of “outrage generates engagement, engagement generates profits”, despite the obvious, one might say catastrophic, impact it has had on civil society. Even the mainstream press is driven by the vagaries of tweet storms. Facebook looks (from the outside) to be a high-stakes version of the Stanford Prison Experiment being run as a profit centre. WhatsApp and Instagram? Please don’t look behind the curtain!

The group of humans with whom i’ve been lucky enough to become acquainted are, to varying degrees: intelligent, socially liberal, technically savvy, literate, politically aware, beautiful, artistic, and, er… lovely. That’s why i try to stay in touch with them despite them being spread out over multiple continents.

Unfortunately, keeping in touch is not easy. This is, mea maxima culpa, entirely a problem of my own making. Currently there are sets of friends or individuals spread across a large number of … systems. These range from Email (it should be called GMail at this point), XMPP and SMS to iMessage, Signal, and even Discord. That’s not including Mastodon, as nobody ever remembers it exists.

A large majority of those people are active on Twitter, Instagram or WhatsApp, others on Facebook. The path of least resistance would definitely be to surrender to the herd, and i completely understand that it’s me that makes all this difficult… and yet, i’m not going back.

Everything has a price. Some aspect of the performative / promotional / social engagement with traditional public social networks makes it worthwhile for my friends to continue that relationship. The balance of power has definitely shifted, but people who profited from the early-phase largesse are struggling to move on now that the situation has become borderline abusive. The societal damage is spread thin enough, especially for those of us who have the free time to waste worrying about it, that it makes sense to try to wait it out. Maybe the balance will shift again? It won’t, at least not back from whence it came!

The idea that a new form of social networking, one that reflects and respects the real life boundaries of our actual friendships, will evolve or emerge from the current mess of highly commercialized exploitation now seems ridiculous to me. Prior transitions (MySpace to Facebook, etc) feel like poor guides to the future as the context in which they happened is entirely different. Sure, the kids are gravitating to different venues, away from their parents, but the owners of the new venues are the same predatory ad-tech creeps, with a bag full of “How do you do, Fellow Kids” memes slung over their shoulder.

[As an aside, i’m really intrigued by the concept of posting being in some way “performative”; the micro-doses of attention that come from posting to something like Twitter have become meaningful enough to be habit forming. Future studies of this phenomenon are going to be a rough read…]

A system like Secure Scuttlebutt seems like it would be a far more “human” venue in which to interact. I still love the idea of staying in touch with a geographically dispersed circle of friends, with the possibility of serendipitous encounters of friends-of-friends, having friends help friends… but not all happening under the Unblinking Algorithmic Eye.

APT-E In the Wild?

Along with all the b&w negatives, and a few colour slides, there were a few strips of colour negative film. At first i couldn’t work out why they would have been taken. They are mostly out of focus, blurred by camera shake… and let’s not talk about the weird colour casts!

Having actually scanned them, it eventually dawned on me that they might have been taken by my Grandfather at the behest of my Father, who was probably on the train that loosely features in the shots.

The other clue is that there is a picture of my Grandfather’s Jaguar parked up by the side of the road. If there is one thing that i’ve learned from scanning all these old pictures it is that my Grandfather loved photographing his cars. We’ll come back to that!


More APT-E Photographs

More photographs from my Father’s archives. Earlier than the previous shots, maybe 1968 or so. No idea what i’m looking at… but presumably the shop in which the prototype was built, and maybe the parts he’d worked on?

Odd that there is nobody around.

To Grate Cheese

This is “Masspot”, a Mac Pro 1,1, upgraded to 12GB of RAM and replacement disks. It is currently running OS X 10.7.5.

My best guess is that it is over 10 years old – probably purchased in 2008 as my main work machine. It survived being shipped via UPS from Japan to Germany. Upon arrival in Hamburg i saw it thrown out of the back of a delivery truck, and it very nearly rolled into a drainage ditch… it didn’t seem to care. Still runs just fine.

Not sure if it can still grate cheese.

The majority of my image library (Lightroom catalogs) still lives on this machine, backed up to multiple locations, and is in the process of transitioning to a RAID6 array on a Synology NAS.

While i’ve no idea how much it cost at the time, it has undoubtedly paid for itself hundreds (or thousands?!) of times over. My plan is to replace it with… well, that’s the question.

The original Cinema Display is now tinged a distinct pink around the edges, and not usable for any image processing. That’s not a huge problem for minor tasks (VNC is good enough with everything cabled) but long-term it’s not viable. The only other monitor, an LG UltraFine 5K, is now dedicated to my work environment. Not that there is any feasible way to connect it to the current graphics card of the Mac Pro!

Not being a video person, all new Apple machines feel like overkill to me, especially if it’s just for image processing. But given the likely lifespan of any new machine that i’m going to buy, maybe it won’t feel that way by the end of its life?

In short, i’m thinking about buying a low-end / base model Mac Pro cheese grater. Or an iMac Pro. The latter scares me a little as it’s obviously a non-upgradable route, where the whole thing becomes useless if a component fails. And, i don’t really want any more screens in my office.

Ideally, there is a Thunderbolt 3 KVM setup to let me switch between two machines and a single screen. That has to exist… right?

Oh, and with a new graphics card and hacked EFI the original Mac Pro can be upgraded to a far more recent macOS release, which would make me less scared about having it on my network!

RSS-Bridge and Nitter

For the last couple of months my interface with Twitter has been through an instance of TwitRSS.me running in a Docker container on my laptop. It has been working remarkably well… given that it hasn’t been updated for months. It also hadn’t kept up with the way that Twitter embeds image data in tweets, which in turn meant clicking on a lot of entries and opening them in Twitter – exactly what i was trying to avoid.

There are a couple of new / actively maintained projects that provide alternative methods of interacting with Twitter. Two of them, RSS-Bridge and Nitter, also provide RSS feeds of accounts / searches / hashtags.

Nitter is very cool. It uses the Twitter API to fetch tweets, returning them in chronological order, in the form of a website. For example, here is a random BBC stream, from which it’s possible to get an RSS feed. This is cool as it also converts all Twitter links into links into the instance of Nitter. There are people hosting public Nitter instances, and it should be possible to host your own locally in Docker.

RSS-Bridge is more of the same, but instead of providing an alternative front-end it provides pluggable modules to scrape or generate RSS feeds directly. It has plugins for a bunch of sites that don’t provide RSS feeds, and while i’m not currently using any of them it’s cool that they are there.

Both Nitter and RSS-Bridge produce nice RSS feeds with options to include images, replies, user avatars, etc.

Currently i’m trying out RSS-Bridge and it seems to be working well. If Nitter continues to be worked on, and there are no issues (thinking TLS / certificate hassles), then perhaps that is the future answer. The downside of RSS-Bridge is that it’s written in PHP, which always gives me the heebie-jeebies. Nitter, on the other hand, is written in something that feels entirely new called Nim… which looks cool, but is likely niche.

Oh, and the Nitter project has spawned browser plugins that redirect Twitter into Nitter. I’m using ‘twitter-to-nitter-redirect‘. Nice!

Grrraspberry Pi

My Raspberry Pi apparently draws …


so that’s close to nothing. It should probably stay on all the time.

As it is, it gets turned on / off on average once a day. The end result is that periodically, usually when i’m hoping not to be messing around with computers, the SD card gets trashed. Then it’s a couple of hours remembering how to set it all up again.


Now with the SD Card stuffed in my Mac:

$ diskutil list


/dev/disk2 (internal, physical):
   #:                       TYPE NAME                    SIZE       IDENTIFIER
   0:     FDisk_partition_scheme                        *8.0 GB     disk2
   1:             Windows_FAT_32 boot                    268.4 MB   disk2s1
   2:                      Linux                         7.7 GB     disk2s2

Make backup:

$ sudo dd if=/dev/disk2 of=SDCard-backup.dmg

Takes a while, but should save some time the next time the filesystem dies… assuming i can work out how to restore it. My best guess is something like:

$ sudo diskutil unmountDisk /dev/disk2
$ sudo dd if=SDCard-backup.dmg of=/dev/disk2

Not forgetting to eject:

$ sudo diskutil eject /dev/disk2
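A raw image of an 8 GB card is mostly empty space, so it compresses well. A sketch of a gzip-compressed variant of the same backup / restore cycle (same device path and dd pipe pattern as above; bs=1m is the BSD/macOS dd spelling of the block size):

```shell
# Backup: stream the raw device through gzip instead of writing
# an uncompressed image to disk.
sudo dd if=/dev/disk2 bs=1m | gzip > SDCard-backup.dmg.gz

# Restore: unmount first, then decompress straight back onto the card.
sudo diskutil unmountDisk /dev/disk2
gunzip -c SDCard-backup.dmg.gz | sudo dd of=/dev/disk2 bs=1m
```

The gzip step trades dd time for a much smaller backup file, which matters when the image lives on a laptop SSD.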

Yes, this entry is obviously intended to remind me of what’s what when this inevitably happens again.

Sony WH-1000XM3

After my mandatory multi-month waiting period for conspicuous consumption, i finally bought a pair of Sony WH-1000XM3 noise cancelling headphones. The only others considered were the Sennheiser MOMENTUM Wireless (why are we shouting, Sennheiser?), which definitely look cooler, and were moderately more comfortable during the short time that i tried them. They were also twice the price… which felt like a lot given that they have half the battery life, and didn’t seem to do as good a job at noise cancellation.

Noise cancellation was my primary reason for wanting new headphones. There are definitely times when there is stuff going on around the house / in the neighbourhood that annoys / distracts me when trying to concentrate on work.

Having used them for a few weeks, they certainly do make the space around your head quiet… very quiet… almost too quiet! Unless you’re listening to music the silence can be a little overwhelming – to the point that you become aware of your heartbeat! Initially this is unnerving, but it now feels more manageable – if not entirely natural.

It’s a similar story wearing them out of the house. The noise cancellation is so strong that you feel isolated and cut off. Again, it gets better as you get used to the change. And listening to Richter while staring out of a train window is definitely a quality of life improvement!

The one kind of noise that isn’t completely cut out is higher-pitched human speech… it’d be interesting to try them in Japan and see if i was still left wanting to murder on days when there was a chance of someone forgetting their fucking umbrella… <cough> for example.

Firefox Vacation

Apple has recently updated Safari (v13) to further restrict what plugins can do… i’m not actually sure that statement is entirely true without qualification, but for the sake of simplicity it is essentially correct. The immediate result is that uBlock-Safari is no longer usable.

The writing has been on the wall for a long time, as the project was unmaintained for months – nothing had been pulled from upstream since April of 2018. Yes, it still worked, but many of the countermeasures deployed by the attention thieves were becoming annoying.

Despite having a pi-hole configured on the network, which is blocking some 15 – 20% of requests (yes, that’s right: roughly a fifth of internet traffic is tracking / ads / trash), it’s still useful to be able to block at the element level. This “necessity” has led me back to using Firefox as my default browser for the first time in many years.

For a long time Firefox on OS X / macOS has been horribly inefficient. Just regular browsing would chew through laptop battery at an amazing rate; playing video would immediately spin up the fans. The same usage pattern in Safari is much, much better. Lately changes have started to land in the Firefox Nightly release that address these issues. My understanding is that it now uses the Core Animation APIs and changes the way compositing happens. You can read all about it in Bugzilla ticket 1429522.

In the longer term, my plan is to return to Safari as the default browser. What is missing is a content blocker that provides element-level blocking… most likely that is going to be 1Blocker, but i’ve not got around to validating that.

The reason behind this desire to go back to Safari is not entirely straightforward. Despite being annoyed that uBlock Origin can no longer ship as it once did, the rationale behind the change to the Content Blocking API is essentially sound: plugins doing content-level blocking require full visibility of the data being displayed in order to remove the elements which are blocked. With the new API this is somewhat inverted: the content blocker tells the browser what to block, and there is no need to trust a third party with processing all the data. Yes, it’s a pain to go through the transition, but given the amount of trust it is necessary to have in a browser and, by extension, its plugins, it makes sense.

However, the above is not the entire story. Despite the improvements in Firefox Nightly, it’s still more power hungry than Safari, and it just doesn’t integrate as neatly with macOS. Sharing and hand-off don’t happen. Gesture support is lacking (mostly pinch to zoom, which might be supported, but doesn’t work for me in the nightlies). And finally there are niggles like the reader mode not being as polished, and scrolling not feeling as smooth… things that would eventually become annoying.

In summary: back using Firefox; it’s much improved; will probably stay for a while; still expect to return to Safari in the future.