End-of-central-directory signature not found

You download a zip file only to find that it barfs when you try to un-archive it:

$ unzip -t test.zip 
Archive:  test.zip
  End-of-central-directory signature not found.  Either this file is not
  a zipfile, or it constitutes one disk of a multi-part archive.  In the
  latter case the central directory and zipfile comment will be found on
  the last disk(s) of this archive.

What a nightmare.


Now you’re left with the prospect of attempting the download again, or seeing if you can salvage what you have. Bandwidth is cheap, but the machine at the other end is no longer responding. Great… time to learn how to extract a partially downloaded / corrupt zip file!

It’s actually a lot easier than you might think… which makes me wonder why i’ve never learnt it before. First try a little force:

$ zip -F test.zip --out partial.zip
Fix archive (-F) - assume mostly intact archive
	zip warning: bad archive - missing end signature
	zip warning: (If downloaded, was binary mode used?  If not, the
	zip warning:  archive may be scrambled and not recoverable)
	zip warning: Can't use -F to fix (try -FF)

zip error: Zip file structure invalid (test.zip)

Nope. Now a little more force:

$ zip -FF test.zip --out partial.zip
Fix archive (-FF) - salvage what can
zip warning: Missing end (EOCDR) signature - either this archive
is not readable or the end is damaged
Is this a single-disk archive? (y/n): y
Assuming single-disk archive
Scanning for entries...
copying: selected exported/3 monkeys.jpg (2629234 bytes)
...
copying: selected exported/worried and walking.jpg (21563355 bytes)
Central Directory found...
zip warning: reading central directory: Undefined error: 0
zip warning: bad archive - error reading central directory
zip warning: skipping this entry...

Good to go?

$ unzip -qt partial.zip 
No errors detected in compressed data of partial.zip.

Good to go!
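
If you want to go further and pull the surviving files out, it’s just the usual unzip options from here on (the recovered/ directory name is only an example):

$ unzip -l partial.zip             # list what was salvaged
$ unzip partial.zip -d recovered   # extract into a directory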


Imagination Lacking


Trying to imagine what my life would have to be like to live through that week of Tory arse-hattery and think, “you know what we need? we need more arse-hattery!”.

It becomes increasingly difficult to express sympathy.

Spicy Octopus Pasta

  • octopus legs (one per person?)
  • garlic 2 – 3 cloves
  • shallot
  • tin of whole tomatoes
  • dried chilli flakes 1 tsp
  • olive oil

Wash the octopus in salty water, rinse and cut into thick slices. Thinly slice the garlic and shallot. Sauté the garlic and shallots in olive oil until they start to have a little colour, add the octopus pieces, cook for a minute, then add the chilli flakes and tomatoes. Break up any large pieces of tomato. Season. Simmer for 15 – 20 minutes on a medium heat, partially covered.

Uncover and cook for another 10 minutes to let it reduce / thicken.

Really good with a pasta like Bucatini or Linguine.

Optional: sit back and remember how good this was when you ate it in Valetta.

Pi(e) Holing Facebook

It all started with a click. While reading the newspaper i clicked on a link to Facebook and was shocked when it opened.

The reason for my surprise was that in my /etc/hosts i had the following entries:

# Block Facebook
127.0.0.1   www.facebook.com
127.0.0.1   facebook.com

a rather blunt instrument, but one that until now had been effective at shitcanning any links. So why had it stopped working? After some confused poking around it became obvious that my new ISP provided way more IPv6 routing than the old ISP, and macOS was now favouring IPv6 traffic. As a consequence the hack in my /etc/hosts grew to include entries for IPv6:

fe80::1%lo0 www.facebook.com
fe80::1%lo0 facebook.com

And once more Facebook was back in the shitcan.

Note: adding hosts to /etc/hosts is obviously tedious – you can’t wildcard, and blocking the root domain doesn’t block sub-domains. Getting rid of all the Facebook servers (just the obvious ones) takes over ten entries, all of which now need to be repeated for IPv6.
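
For a sense of what that ends up looking like, here’s a rough sketch of the kind of block that accumulates (only a handful of the obvious hostnames, each one doubled up for IPv6):

# Block Facebook: every hostname needs an IPv4 and an IPv6 entry
127.0.0.1   facebook.com
127.0.0.1   www.facebook.com
127.0.0.1   m.facebook.com
127.0.0.1   graph.facebook.com
fe80::1%lo0 facebook.com
fe80::1%lo0 www.facebook.com
fe80::1%lo0 m.facebook.com
fe80::1%lo0 graph.facebook.com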

At this point any rational person would conclude that this is not a sane thing to be doing. Obviously it’s time to run my own DNS server and sinkhole, and shitcan entire domains with wildcards!

Fortunately there are still plenty of people on the internet who haven’t given up, for example, Pi-hole. By installing Pi-hole on a Raspberry Pi hanging off the back of my router, and updating clients to use it as their DNS server, i have a place where it is possible to wildcard-block entire domains.
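
Under the hood Pi-hole sits on top of dnsmasq, so a wildcard block ultimately comes down to an address= rule, which matches a domain and every sub-domain beneath it. Roughly this (a sketch only; the actual file Pi-hole manages, and how its blacklisting writes it, varies by version):

# dnsmasq: address= matches the domain and every sub-domain under it
address=/facebook.com/0.0.0.0
address=/fbcdn.net/0.0.0.0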

As well as providing DNS, Pi-hole also maintains a (partial) list of domains that serve ads. This means that devices on your home network that aren’t running ad blocking now have a good chance of not being served ads. This was already a partially solved problem, as the Raspberry Pi also runs Privoxy, which blocks a good percentage of ads.

As an aside, the war between ad blockers and ad pushers has been quietly escalating and i’ve been starting to notice that a few news sites are managing to execute Javascript that blocks uBlock Origin. Sites that employ such measures are still blocked from displaying ads by Pi-hole and / or Privoxy.

While installing Pi-hole it was necessary to make some decisions about what to use as the upstream DNS resolver. There are some obvious answers like 8.8.8.8 (Google), 9.9.9.9 (IBM and some shady law enforcement types), OpenDNS, OpenNIC, etc., none of which seem ideal.

You probably won’t be surprised to hear that all your DNS queries are sent, unencrypted, over port 53. Which initially sounds like a really bad thing – it would provide your ISP with an easy way to know every site that you looked up. However, in all likelihood they aren’t doing that… mostly because they have stronger, government mandated, requirements to meet, such as tracking every site that you actually visit and when you visited it, not just the ones that you happen to lookup, and then subsequently visit via a cached lookup. If all you had to do was run your own DNS to avoid tracking… yeah, not going to happen.
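
It’s easy enough to watch this happening from your own machine; a quick tcpdump sketch (the interface name en0 is a macOS assumption, adjust to taste):

$ sudo tcpdump -ni en0 udp port 53   # every lookup, in the clear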

Despite the above rationale, there exists a parallel DNS infrastructure called DNSCrypt, mostly volunteer run, that proxies encrypted access to DNS. Assuming that you can trust that they aren’t logging (something you’re already doing with the DNS providers listed above…) then you can effectively block any visibility of your DNS activity to your ISP… not that they’ll care. If your traffic isn’t leaving your machine via an encrypted tunnel (think VPN, Tor, etc) then you can assume that it is being inspected and logged at the packet level.

In terms of increasing privacy DNSCrypt doesn’t seem to offer very much. It does offer some other protections against DNS spoofing attacks, but i’m not sure how widespread those are in the wild. I’d also guess that the other major providers of DNS are taking countermeasures as they are needed… and are maybe more effective than the volunteer force behind DNSCrypt.

I’ll probably end up installing dnscrypt-proxy on the Raspberry Pi and using it as the resolver for Pi-hole. In the end it’s just going to be an encrypted proxy for OpenNIC, which, given the choice, is where i’d want my DNS to be resolved.
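
The rough shape of that setup, assuming dnscrypt-proxy 2.x and its TOML config (the port and resolver name here are illustrative choices, not anything mandated):

# dnscrypt-proxy.toml (excerpt): listen locally on an unprivileged port
listen_addresses = ['127.0.0.1:5300']
# pick an OpenNIC resolver from the public resolver list
server_names = ['example-opennic-resolver']

Then Pi-hole gets 127.0.0.1#5300 as its custom upstream DNS via the admin interface.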

I’d recommend looking into Pi-hole; it’s a really nice set of tools for getting a better understanding of, and control over, what the devices on your network are actually doing. Oh, and keep in mind that IPv6 is now a thing, running in parallel to the IPv4 internet for which you probably had some reasonable mental model… learning about RA, SLAAC (and its Privacy Extensions), DAD, etc. was an eye opener for me!

Youtube… ffs

For the longest time i’ve been using a Safari Extension called ClickToPlugin, which replaced Youtube’s video player with the native Safari video player. There were a couple of reasons for this, the biggest of which was the horrendous amount of CPU that the YouTube HTML5 player uses. It also disabled autoplay, another scourge of the ad-supported web. Oh, and it never played ads.

The recent re-design broke all this, and it doesn’t look like it’ll be repaired. Time to find another solution… <sigh>

There are other Youtube-focused extensions out there for Safari, but none of them seem to do exactly what i want. Firefox has a few plugins that allow downloading, or copying the video URL, which gives you a way to choose the player. There doesn’t, however, seem to be anything that does exactly what ClickToPlugin managed.

For a few weeks i’ve been using a Firefox plugin to copy the video URL, pasting that into Safari, and letting it play with the native player. But it means opening Firefox, switching between browsers, etc.

More recently i started playing with youtube-dl. If i’m going to be copying and pasting URLs, why not give them to a script and have it spawn a Quicktime player? Well, the Quicktime player doesn’t have a command line… and who wants to wait until a video has downloaded before watching? It would be better to pipe the output of youtube-dl to a player… but that will have to be something other than Quicktime.

When in doubt try ffmpeg – the true swiss army knife of video! The ffmpeg distribution includes a tool ffplay, which can play video piped into stdin. Looks like we have everything needed:

$ youtube-dl -q -f best --buffer-size 16K https://www.youtube.com/watch?v=DLzxrzFCyOs -o - | ffplay -fast -fs -loglevel quiet -

Now all i need is a dumb bash script in my path, which takes a URL, and plugs it into that command:

#!/bin/bash
# yt: stream a YouTube URL straight into ffplay via youtube-dl

if [ $# -ne 1 ]; then
    echo "Usage: yt url"
    exit 1
fi

url=$1

# -o - writes the download to stdout, which ffplay then reads from stdin
youtube-dl -q -f best --buffer-size 16K "$url" -o - | \
    ffplay -fs -loglevel quiet -
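
Drop that somewhere in your path as yt (~/bin in this example), make it executable, and it’s one paste away:

$ chmod +x ~/bin/yt
$ yt 'https://www.youtube.com/watch?v=DLzxrzFCyOs'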

Yes, the amount of time and effort involved in avoiding the unavoidable smartness of the smartest people in Silicon Valley…