Firefox Vacation

Apple has recently updated Safari (v13) to further restrict the ability of plugins… not actually sure that statement is entirely true without qualification, but for the sake of simplicity it is essentially correct. The immediate result is that uBlock-Safari is no longer usable.

The writing has been on the wall for a long time as the project was unmaintained for months – nothing had been pulled from upstream since April of 2018. Yes, it still worked but many of the countermeasures deployed by the attention thieves were becoming annoying.

Despite having a pi-hole configured on the network, which is blocking some 15–20% of requests (yes, that's right: roughly a fifth of the internet traffic is tracking / ads / trash), it's still useful to be able to block at the element level. This "necessity" has led me back to using Firefox as my default browser for the first time in many years.

For a long time Firefox on OS X / macOS has been horribly inefficient. Just regular browsing would chew through laptop battery at an amazing rate, and playing video would immediately spin up the fans. The same usage pattern in Safari is much, much better. Lately changes have started to land in the Firefox Nightly release that begin to address these issues. My understanding is that it is now using the Core Animation APIs and changing the way compositing happens. You can read all about it in Bugzilla ticket 1429522.

In the longer term, my plan is to return to Safari as the default browser. What is missing is a content blocker that provides element level blocking… most likely that is going to be 1Blocker, but i've not got around to validating that this is the case.

The reason behind this desire to go back to Safari is not entirely straightforward. Despite being annoyed that uBlock Origin can no longer ship as it once did, the rationale behind the change to the Content Blocking API is essentially sound: plugins doing content level blocking require full visibility of the data being displayed in order to remove the elements which are blocked. With the new API this is somewhat inverted: the content blocker tells the browser what to block, and there is no need to trust a third-party with processing all the data. Yes, it's a pain to go through the transition but given the amount of trust it is necessary to have in a browser and, by extension, its plugins, it makes sense.
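For a sense of how inverted the model is: a content blocker hands Safari a declarative JSON rule list up front, rather than inspecting pages as they load. A minimal sketch (the domains and selector here are made-up examples):

  [
    {
      "trigger": { "url-filter": ".*", "if-domain": ["example.com"] },
      "action": { "type": "css-display-none", "selector": ".ad-banner" }
    },
    {
      "trigger": { "url-filter": "tracker\\.example\\.net", "resource-type": ["script"] },
      "action": { "type": "block" }
    }
  ]

The browser compiles the rules once and applies them itself; the extension never sees the page content.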

However, the above is not the entire story. Despite the improvements in Firefox Nightly, it's still more power hungry than Safari, and it just doesn't integrate as neatly with macOS. Sharing and hand-off don't happen. Gesture support is lacking (mostly pinch to zoom, which might be supported, but doesn't work for me in the nightlies). And finally there are niggles like the reader not being as polished, and scrolling not feeling as smooth… things that would eventually become annoying.

In summary: back using Firefox; it’s much improved; will probably stay for a while; still expect to return to Safari in the future.

Twitter over RSS

What happened was that in a fit of optimism (quite out of character) i deleted my Twitter account and moved to Mastodon. That ended well – all alone on Mastodon. People i still care about post on Twitter, but i can’t bring myself to sign up again. Too stubborn for that…

Meanwhile, NetNewsWire is back, and i’m using RSS as a way to keep up with a few things. Wouldn’t it be great if there were RSS feeds for Twitter?! Yes, but they stopped supporting that years ago.

There is, however, a project that scrapes Twitter and returns RSS. Which is great except for a couple of things: 1) Twitter loves to change their markup, and 2) Twitter loves to rate-limit.

While there isn't much to be done about the first issue except update the code, the second issue has pushed me to running the service… wait for it… in a Docker container… yep, you guessed it… on my laptop. Oh my.

This isn’t very difficult:

  • install the Docker Desktop package (obviously without creating an account, etc)
  • clone the repo from github
  • follow the setup instructions (noting that --port can be replaced with -p)
  • re-add all your feeds in NetNewsWire using the instance of TwitRSS running locally (http://localhost:3000 unless you changed the port)
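The steps above boil down to something like this (the repo URL and image name are from memory, so treat them as assumptions to check against the project's own instructions):

$ git clone https://github.com/ciderpunx/twitrssme.git
$ cd twitrssme
$ docker build -t twitrss .
$ docker run -d -p 3000:3000 twitrss

After that the service should be answering on http://localhost:3000.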

Time will tell how stable this is, but as of now it keeps me in touch with peeps, while keeping Twitter at arm's length.

Group Chats

This caught my eye:

In some ways, group chat feels like a return to the halcyon era of AOL Instant Messenger, once the most widespread method of messing around with your friends on the internet.

Group Chats are Making the Internet Fun Again

It's probably not a good look to admit that you've been experimenting on your friends, but for the last couple of years i've been badgering people into using Signal, and creating group chats for circles of friends / acquaintances. Periodically i'll create a temporary group to help plan a specific event or to aid collaboration on a task.

In the time that i’ve been doing this nobody in my circle has ever invited me to join a group they have created. It’s possible that they all hate me, or at the very least resent me setting the venue. More likely (i hope) it is still simpler to do everything publicly on a platform like Twitter / Facebook / Instagram.

I found that there were things that it would be better to communicate to a specific subset of my “followers” (such a creepy term!) on Twitter, but most seem quite comfortable with the broadcast model. Now i’m a little more isolated online, but less overwhelmed by interruptions.

Oh, and sorry about the experiments. Friends don’t experiment on friends… or do they?


There have been a couple of times where it would have been useful to be able to fake the origin of my internet traffic. Services that geo-block, differentiated regional pricing, etc.

To some extent this can be done with Tor, requesting new circuits until you get an exit node in the locale you need. However, many exit nodes get blacklisted, or made difficult to use by the “saints” (hello Cloudflare!) running some of the largest virtual hosting environments.
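Rather than requesting circuits until you get lucky, Tor can be pinned to a country in torrc (this example forces German exits; StrictNodes makes it fail rather than fall back to other countries):

ExitNodes {de}
StrictNodes 1

Of course this does nothing about those German exits already being on the blacklists.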

Time for a VPN?

The thing about VPNs is that, unless you are very careful, they provide only an incremental improvement in privacy. Still, if the goal is to defeat geo-blocking it's not a bad answer… except that OpenVPN, the software supported by most VPN providers, doesn't have a stellar record for security (due to its size, it presents a large attack surface), has relatively high overhead, and kills batteries.

Wireguard is a proposed answer to these issues. It's new code, but it builds on modern security libraries, freeing it from some of the baggage that OpenVPN has been lugging around since 2001, and is small enough that it can realistically be reviewed / validated. Conceptually it's a lot simpler than what came before it… i'd emphasise the "conceptually" part – i'm making no claims to have truly understood the implications of the security choices that have been made.

And there are now VPN providers out there that are supporting Wireguard. Having signed up with one of them to experiment, it seems to be a big step forward. For my home connection (100MBit / 50MBit) it's possible to saturate the downlink when connected to the VPN using an exit server in Germany. Leaving it connected on my laptop, there isn't any noticeable change in battery life. I'd assume there is a change, but it's not enough for me to worry about it.

The clients i’m using (macOS / iOS) are from the respective AppStores, and have been straightforward to configure.

All that said, not sure that i'm going to carry on paying for a VPN. The most compelling reason to do so is using it from my phone over sketchy "free" WiFi… but if that is the only thing i care about it would be just as easy to install Wireguard at home and route all the traffic via that connection. The only thing stopping me doing that is laziness.
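The roll-your-own version is just a wg0.conf on the phone or laptop pointing home. A minimal sketch (keys, addresses and the hostname are all placeholders; AllowedIPs of 0.0.0.0/0 is what routes all traffic through the tunnel):

[Interface]
PrivateKey = <client-private-key>
Address = 10.0.0.2/32
DNS = 10.0.0.1

[Peer]
PublicKey = <home-server-public-key>
Endpoint = home.example.org:51820
AllowedIPs = 0.0.0.0/0
PersistentKeepalive = 25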

Wireguard on Synology. That’s the thing!

Building PaulStretch on Mojave in 2019

It’s a weakness. Software archeology. The topic of PaulStretch came up during the week, and of course i wanted to play with it again.

Back in 2013 there were downloadable binaries, but it seems they probably stopped working a few macOS releases ago. And besides, where is the fun in that!

The good news is that despite not being touched for years the code does actually compile. There is no bad news, it's working perfectly, and i'm listening to the Alien Sex Fiend track 'I Walk The Line' stretched out to 35 minutes.

After cloning the code from GitHub, it's just a matter of installing the right libraries. I'm using MacPorts, and the other dependency that was missing was mini-xml. That is also on GitHub, but the latest release isn't compatible; fortunately the v2.12 release is fine.

Work from the script here:

  • install FLTK (available in MacPorts)
  • run the two fluid commands
  • run the g++ command (changing $outfile to paulstretch)
  • install the missing libraries until you get a binary

The missing libraries were:

  • audiofile
  • libmad
  • portaudio
  • fftw-3-single (this is suggested as optional, but it wasn’t hard to include)
  • mini-xml

To install mini-xml it was the usual:

./configure --prefix=/opt/local ; make ; sudo make install

Everything else was just:

sudo port install
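In full, assuming the MacPorts port names match the list above (worth double-checking with `port search` if any fail):

$ sudo port install fltk audiofile libmad portaudio fftw-3-single

with mini-xml built from source as described.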

[Screenshot: PaulStretch running on Mojave]

Ain't it pretty!

$ fluid -c GUI.fl
$ fluid -c FreeEditUI.fl

$ g++ -ggdb GUI.cxx FreeEditUI.cxx *.cpp Input/*.cpp Output/*.cpp `fltk-config --cflags` `fltk-config --ldflags` -laudiofile -lfftw3f -lz -logg -lvorbis -lvorbisenc -lvorbisfile -lportaudio -lpthread -lmad -lmxml -o paulstretch

FWIW, i tried to statically link it but it started to complain about missing libraries. Perhaps i'll be motivated to learn how to package a .dmg file if there is interest. Surely it can't be that difficult… right?

Update: it doesn't look that difficult to create a '.app' bundle: there is an Info.plist to create, and all the linked libraries need to be copied into the directory structure:

$ otool -L paulstretch

and then use 'install_name_tool -change' to update the paths to reference the copies in the app's directory structure.
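As a sketch, the per-library fix-up looks something like this (paths are illustrative, and it needs repeating for every library that otool lists):

$ otool -L Contents/MacOS/paulstretch
$ cp /opt/local/lib/libmxml.dylib Contents/MacOS/
$ install_name_tool -change /opt/local/lib/libmxml.dylib \
    @executable_path/libmxml.dylib Contents/MacOS/paulstretch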

Check back tomorrow!

Update: after a few more shenanigans fixing library references in libraries that reference libraries (it's turtles all the way down, etc) there is now a "nicely" packaged '.app' that can be distributed in a DMG image. It has even been tested off my machine!

$ tree
└── Contents
    ├── Info.plist
    ├── MacOS
    │   ├── libFLAC.8.dylib
    │   ├── libaudiofile.1.dylib
    │   ├── libfftw3f.3.dylib
    │   ├── libfltk.1.3.dylib
    │   ├── libmad.0.dylib
    │   ├── libmxml.dylib
    │   ├── libogg.0.dylib
    │   ├── libportaudio.2.dylib
    │   ├── libvorbis.0.dylib
    │   ├── libvorbisenc.2.dylib
    │   ├── libvorbisfile.3.dylib
    │   ├── libz.1.dylib
    │   └── paulstretch
    └── Resources
        └── PaulStretch.icns

Key Expiration / Rotation

It's probably a good idea to check if your PGP key has expired. The easiest way to do this is in the tools that you use to manage your keys.
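With GnuPG on the command line it's a one-liner; the expiry date is printed alongside each key:

$ gpg --list-keys --keyid-format long your@mail.example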

Another is to go to one of the public key servers and search for your mail address. Doing this lately has been an exercise in frustration as the servers always seem to be offline, slow, or out of sync.

My most recent key was old and short, so i’ve generated a new one, and expired the old one. As linking to the key servers seems to be hit-and-miss, here it is:


This (0x1273F11F) replaces my previous key (0x7D69EE91), and you should now get warnings when using the old one… assuming that you can update keys from the key servers at some point in the future.
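For reference, expiring the old key and publishing the change can be done with a modern GnuPG (2.1.22 or later; the fingerprint is a placeholder, and "1d" sets expiry to one day from now):

$ gpg --quick-set-expire <old-key-fingerprint> 1d
$ gpg --send-keys 0x7D69EE91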

On macOS i've been using GPG Tools, and had considered giving them money to continue to use it. However, having watched a 35c3 talk titled "Attacking end-to-end email encryption", which covers all the ways that PGP is broken in mail clients ("except mutt!"), i'm more convinced than ever that secure mail with PGP is essentially a disaster waiting to happen.

Signal, despite its lack of UI / UX polish, remains a much better option if you can get the other party to agree to use it. If you have to send and receive PGP mail of any import, do as the experts suggest and compose it outside of a mail client. And, for the love of gub, don't do it anywhere near a browser!


MacBook Pro 2018 “Review”

My work has provided me with a 2018 MacBook Pro. This is my reaction to living with it for a couple of days. For perspective, i’ve been using Apple machines, pretty much exclusively, for 20 years. Much of that time they were glorified terminals to solaris or linux systems, but more recently (the last decade..) as my primary development environment.

The Good

  • the screen is amazing.
  • the speakers are a huge improvement over the already pretty good speakers in the 2014 model

The Bad / Ugly

  • no magsafe charging… who thought that was a good idea? For years it has been possible to pick up a laptop and safely knock the charge cable off the machine with your thumb, but now it's a cable to pull, and an easy way to drag a machine off a desk. At least usb-c is physically better than the thunderbolt 1/2 connector… which is actually negative as it's now more of a hassle to disconnect the thing!
  • in order to plug in almost anything you need a stupid dongle. Can't even charge an iPhone without buying a new cable! I get that everything is wireless now, but i don't want a wireless keyboard that will inevitably need to be charged, and have a battery that gets worse over time. It only has a battery because it's wireless and that's dumb. Who are these crazy people who are desperate for free movement of keyboards and trackpads? Idiots.
  • on the subject of keyboards – this one sucks. Just looking at it you can tell it’s bad – the keys have no travel, are comically large, and stupidly flat. To make things worse they make a stupid amount of noise when struck. Is that supposed to trick me into thinking they have some kind of sturdy mechanism? It’s not working. Feels like a toy.
  • still on the subject of keyboards, the fn keys are missing. Not there. The replacement for this is a strip of plastic that provides no physical feedback, and blinks distractingly as if it has some useful purpose. Part of the collateral damage here is that the Escape key is missing. What a fucking joke.

It seems unlikely that i’ll be using this machine as a true laptop for any extended period of time. In order to use it at all i’m connecting a keyboard (wired, obviously), a trackpad (which has batteries that will fail when it is least convenient), and will probably hook it up to a monitor at some point.

Due to the vagaries of corporate IT and Security weenies it's hard to say if the preceding four years have yielded any performance improvement. It certainly doesn't feel any faster, even if it does now handle builds without spinning up the fans. However, that could be the slew of corporate spyware installed. Hard to say.

In my opinion it’s a hateful thing. A disaster of form over function. A miscarriage of design to hit the wrong targets. I don’t care if it’s n millimeters thinner, don’t care if it’s a couple of grams lighter – the primary interface (the keyboard) is an abomination. Couldn’t give a fuck about a new physical interface in search of a reason to exist – the Touch Bar is utterly pointless and distracting.

Would not recommend this machine, as a pure laptop, to my worst enemy.