For some rather convoluted reasons I’ve been doing some thinking about what it would mean for machines to gain consciousness… seriously, don’t ask why. This thinking led me to Roger Penrose’s work, in particular Shadows of the Mind and The Emperor’s New Mind.
It’s a somewhat guilty pleasure reading things that confirm your thinking, but it’s nice to have some rigour to back up what was previously instinctual. I’m currently plodding through his thoughts on the implications of Gödel’s incompleteness theorems, and beginning to wonder if I can’t just trust my instincts…
As an example of the kind of mindfuckery that we’re talking about here, in a section on intuitionism:
The most famous example of this occurs with Bertrand Russell’s paradoxical ‘set of all sets that are not members of themselves’. (If Russell’s set is a member of itself then it is not; if it is not then it is!)
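The paradox can even be poked at in code. Here’s a small sketch (my own illustration, not Penrose’s) that models Russell’s predicate as a Python function `R` asking “is x not a member of itself?” — applying `R` to itself should be true exactly when it is false, and the contradiction surfaces at runtime as unbounded recursion:

```python
# Russell's predicate: R(x) is True iff x is NOT "a member of" itself,
# where "membership" is modelled (loosely) as self-application x(x).
def R(x):
    return not x(x)

# R(R) = not R(R): there is no consistent truth value, so the
# evaluation never bottoms out and Python gives up.
caught = False
try:
    R(R)
except RecursionError:
    caught = True

print("paradox detected" if caught else "no paradox?!")
```

Of course this is only an analogy — Python functions aren’t sets — but it’s a neat way to watch a formal system choke on exactly the kind of self-reference Gödel exploited.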
I’d write more, but I need to go and clean up the few chunks of my brain that are still stuck to the bathroom tiles… always good to read this kind of stuff in the bath.