Youtube… ffs

For the longest time i’ve been using a Safari Extension called ClickToPlugin, which replaced Youtube’s video player with the native Safari video player. There were a couple of reasons for this, the biggest of which was the horrendous amount of CPU that the YouTube HTML5 player uses. It also disabled autoplay, another scourge of the ad-supported web. Oh, and it never played ads.

The recent re-design broke all this, and it doesn’t look like it’ll be repaired. Time to find another solution… <sigh>

There are other Youtube-focused extensions out there for Safari, but none of them seem to do exactly what i want. Firefox has a few plugins to allow downloading, or copying the video URL, which gives you a way to choose the player. There doesn’t, however, seem to be anything that does exactly what ClickToPlugin managed.

For a few weeks i’ve been using a Firefox plugin to copy the video URL, pasting that into Safari, and letting Safari play it with the native player. But it means opening Firefox, and switching between browsers, etc.

More recently i started playing with youtube-dl. If i’m going to be copying and pasting URLs, why not give them to a script, and have it spawn a Quicktime player? Well, the Quicktime player doesn’t have a command line… and who wants to wait until a video has downloaded before watching? It would be better to pipe the output of youtube-dl to a player… but that will have to be something other than Quicktime.

When in doubt try ffmpeg – the true Swiss Army knife of video! The ffmpeg distribution includes a tool, ffplay, which can play video piped into stdin. Looks like we have everything needed:

$ youtube-dl -q -f best --buffer-size 16K https://www.youtube.com/watch?v=DLzxrzFCyOs -o - | ffplay -fast -fs -loglevel quiet -

Now all i need is a dumb bash script in my path, which takes a URL, and plugs it into that command:

#!/bin/bash
# Wrap the youtube-dl | ffplay pipeline above so it can be called as: yt <url>
if [ $# -ne 1 ]; then
    echo "Usage: yt url"
    exit 1
fi

url="$1"

youtube-dl -q -f best --buffer-size 16K "$url" -o - | \
    ffplay -fs -loglevel quiet -

Yes, the amount of time and effort involved in avoiding the unavoidable smartness of the smartest people in Silicon Valley…


Signal Desktop (Again)

It could be that i’m still confused about the release channel for the standalone Signal-Desktop client, but it doesn’t appear to be released as a bundle.

My previous build from the ‘electron’ branch stopped working, telling me i needed to update. However, the electron branch has gone… which is actually good news, as it means that the changes have merged to master.

Starting from scratch, but with all of the NodeJS NPM / Yarn junk still around, all it took was cloning from GitHub:

$ git clone --recursive https://github.com/WhisperSystems/Signal-Desktop.git

$ cd Signal-Desktop/
$ yarn pack-prod

Edit: the module dance can be avoided with the following:

$ node --version
v8.6.0
$ yarn --version
1.1.0
$ git clone --recursive https://github.com/WhisperSystems/Signal-Desktop.git

$ cd Signal-Desktop/
$ yarn install
$ yarn pack-prod

And then the usual dance to add modules until things started working. That part of the process defies description, and short of trying to re-install all the npm / yarn stuff and starting from nothing, it’s unlikely that we’ll see the same things. In my case i had to ‘npm install’ the following:

  • electron-icon-maker
  • grunt

and ‘yarn add’:

  • grunt-sass

I’d have thought that there is actually a module dependency list in the bundle, and a yarn / npm command to process it… no idea what that might be!

signal-desktop

It would be nice if there was an official build soon. Would like to stop having to do this dance – especially as the builds have been working perfectly for what i need for months now!

Various Failures

Recently when working through the backlog of film in the fridge, i managed to develop a roll that had obviously been nowhere near a camera. Oops. So professional.

Perhaps to balance that out there was also a roll that had been through the XPan at least twice, and perhaps three times, in London (once in August 2016, again in August 2017), Hamburg, and Nagano. The results are, to say the least, chaotic.

Also in this batch of films was a roll of Ilford SFX 200 shots of my neighbour’s water-damaged ceiling. Obviously under-exposed and consequently rather “moody”. It’s all so much water under the bridge, etc.

Baumkamp-EG_05

Through Yellow Sands

The_Untouchable_City

Been spending a lot of time working through old pictures for the next I Wrote This For You book. The image above (from the entry ‘The Untouchable City’), taken from the top of Roppongi Hills, with 黄砂 (kōsa, yellow sand) on the glass, has really stayed with me!

Now i want to get it printed A2 and on a wall… maybe with the oddness in the bottom left cropped out.

… No!

Last weekend / week some valiant efforts were made to import a Blogger export file into the latest WordPress release running on my laptop. Just getting that shitshow on the road was not a walk in the park. I’ll leave the next paragraph as a warning to future travellers on macOS:

WordPress will not connect to MySQL / MariaDB on localhost, update your config to use 127.0.0.1.
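
Concretely, the relevant line lives in wp-config.php; the database name, user, and password below are placeholders for whatever the local install uses:

<?php
// wp-config.php: point DB_HOST at the TCP loopback address instead of
// 'localhost', which PHP's mysqli typically resolves to a unix socket
// that may not be where MariaDB actually put it.
define( 'DB_NAME',     'wordpress' );  // placeholder
define( 'DB_USER',     'wp' );         // placeholder
define( 'DB_PASSWORD', 'secret' );     // placeholder
define( 'DB_HOST',     '127.0.0.1' );  // not 'localhost'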

How PHP manages to be so consistently bad is a source of amazement.

My initial excitement about having found a plugin was tempered by the eventual realisation that the hooks are only there when saving a new entry. Presumably the code could be re-used in a pipeline that uses the Blogger REST API to pull entries, pushing them into WordPress, attaching images and rewriting img tags as it goes.

My hopes of getting this done without writing my own toolchain are slowly fading into the ether…

Just

You do it to yourself, you do
And that’s what really hurts
Is you do it to yourself, just you
You and no-one else
You do it to yourself
You do it to yourself

That’s the thing, if i was better at ignoring things…

Last weekend’s adventure was making a local backup of all my images in Flickr. This was really about getting to a point where it wouldn’t matter if that account got closed, or the new owners (Verizon?) decided to move everything, etc. It was a good start, but there is another (bigger) problem that is going to stop me from moving forward: the IWTFY blog.

There are hundreds… i don’t know, maybe thousands of posts there that link directly to the images hosted on Flickr. Oops. Besides that being against the terms of service, it rather neatly ties me to keeping the Flickr account for as long as there is an IWTFY blog… and that’s been ongoing for a decade at this point!

Obviously a plan is needed!

  1. Export IWTFY from Blogger / Google – this seems straightforward.
  2. Set up a local instance of WordPress where i can run plugins.
  3. Write a plugin that pulls all of the images into the local WordPress instance.
  4. Export IWTFY from the local instance.
  5. Import into a hosted WordPress.

This feels very much like a process that someone else must have already been through, and either made available / sells as a service… right? Proceeding onwards assuming that i’m not able to find that person / service, it looks like it isn’t difficult to use something like the WP_Http API call to grab images from within a blog post. And somewhere in here there has to be a way to loop through all the entries in a blog. No idea what kind of WordPress *thing* i’m supposed to be coding, but yeah… Bildung!
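
For what it’s worth, here is a rough sketch of what that loop might look like, run inside the local WordPress instance (e.g. via wp eval-file). The WordPress functions are real, but the Flickr hostname check and the overall flow are my own assumptions; a starting point, not a finished plugin:

<?php
// Sketch: loop over every imported post, sideload any Flickr-hosted images
// into the local media library, and rewrite each img src to the new local URL.
// media_sideload_image() uses WP_Http under the hood; these includes are
// needed when running outside wp-admin.
require_once ABSPATH . 'wp-admin/includes/media.php';
require_once ABSPATH . 'wp-admin/includes/file.php';
require_once ABSPATH . 'wp-admin/includes/image.php';

$posts = get_posts( array( 'numberposts' => -1, 'post_status' => 'any' ) );

foreach ( $posts as $post ) {
    $content = $post->post_content;
    if ( ! preg_match_all( '/<img[^>]+src="([^"]+)"/i', $content, $matches ) ) {
        continue;
    }
    foreach ( $matches[1] as $src ) {
        // Assumption: only rewrite images served from Flickr.
        if ( strpos( $src, 'staticflickr.com' ) === false ) {
            continue;
        }
        // Download the image, attach it to the post, and get back the local URL.
        $new_src = media_sideload_image( $src, $post->ID, null, 'src' );
        if ( ! is_wp_error( $new_src ) ) {
            $content = str_replace( $src, $new_src, $content );
        }
    }
    wp_update_post( array( 'ID' => $post->ID, 'post_content' => $content ) );
}

Running it against a throwaway copy of the import first would obviously be wise.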

Can’t actually believe i’m going to do this. Save me from myself? Please.

[Yes, the tags are something of a honeypot.]

Edit: someone already wrote the plugin i need! Import External Images – even if it doesn’t do exactly what is needed, it’s only a little PHP hacking… how bad could it be?!