Saturday, June 23, 2012

Alan Turing: 100th birthday

Echo comments migrated: The migration was underway when I originally wrote these paragraphs and was completed within 2-3 hours, around 6:40 pm Pilsner Summer Time (Saturday).

As expected, the reply from DISQUS wasn't too helpful. The avatars and recognized identities are lost during the import (much like the "Liked by [Echo users]" lists, of course); maybe an offer to merge your old unregistered comments with your DISQUS account will ultimately reach you. Also, the reply structure "should be preserved" except that it isn't. They told me to try importing another XML format – DISQUS' most current one – which would mean rebuilding a 100-megabyte file with 20 differently named types of tags, differently formatted dates, differently formatted CDATA, different character sets etc., with an outcome that could fail to work, anyway.

My answer is no, sorry: I won't do it when even a company with 33 employees that earns millions for this very job can't do it right.

I had preemptively prepared a reaction to such an outcome of my conversation with the DISQUS support team. The decision prepared for this unhelpful reply was not to waste additional dozens of hours on some professionals' lousy job but to immediately start the import of my current XML file, which had been cured (especially when it comes to the right URLs and the missing author names) by tens of hours of work and a rather long Mathematica code. This plan became a reality within minutes and the import started to proceed a few hours later, after some time in the queue. So the old Echo comments will be preserved, but:
  1. The reply structure is lost. Only if you sketch a full plan to retroactively rebuild the nested structure, e.g. by some DISQUS API commands (I have no experience with any API), will I revisit the problem; a possible first step of such a plan is sketched below.
  2. All imported Echo comments have the same anonymous avatar and their authors can't "take them over" with their current DISQUS accounts: the true identities of the authors aren't recognized.
  3. Some special characters are screwed up due to wrong character sets – the encoding kept changing between ANSI and UTF-8 during the Haloscan history. So you will see lots of Lubo Motl's, to mention the damaged name of a random TRF Echo commenter. ;-) Sorry, Alexander Aè-Ač, that you were not picked as the toy model. (A guess at how this damage arises appears right below this list.)
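
For the curious, here is a guess at how such damage typically arises – a minimal sketch in Python, where the charset pairs are my assumptions (the classic UTF-8-versus-"ANSI" mix-ups), not a verified reconstruction of what Haloscan actually did:

```python
# Hypothetical reconstruction of the mojibake: text stored in one charset
# and later decoded in another. Both charset pairs below are my guesses.

# Guess 1: UTF-8 bytes misread as Windows-1252 ("ANSI")
print("Luboš Motl".encode("utf-8").decode("windows-1252"))
# -> LuboÅ¡ Motl

# Guess 2: Czech Windows-1250 bytes misread as Windows-1252
print("Alexander Ač".encode("windows-1250").decode("windows-1252"))
# -> Alexander Aè   (the very damage visible in the item above)
```
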
The attached 171 pictures from js-kit.com should be preserved and are linked by an extra paragraph and URL that I have added. See what a thread with 16 comments recently imported from Echo to DISQUS looks like. This thread has 24 DISQUS comments that combine 14 former Blogger.com "slow" comments with 10 Echo comments: they're just mixed with each other into a multiracial society and have the same anonymous feel. ;-)
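
Concerning item 1 above: a possible first step of such a plan, sketched in Python, would be to recover the reply tree from the exported XML before touching any API. The tag and attribute names below ("comment", "id", "parent-id") are placeholders of mine – the real Echo/Haloscan export uses its own schema – and the subsequent re-parenting of the imported DISQUS copies is exactly the part I have no recipe for:

```python
# Sketch: extract the parent -> children map of the old Echo comments from
# the export file. Tag/attribute names are hypothetical; adjust them to
# whatever the real export schema contains.
import xml.etree.ElementTree as ET
from collections import defaultdict

tree = ET.parse("echo-export.xml")      # the cured export file
children = defaultdict(list)            # parent comment id -> child ids

for comment in tree.getroot().iter("comment"):
    cid = comment.get("id")
    parent = comment.get("parent-id")   # None for top-level comments
    children[parent].append(cid)

# Walking this map would tell us which imported comment should be
# re-parented under which -- the step that would need DISQUS API calls.
for parent, kids in sorted(children.items(), key=lambda p: str(p[0])):
    print(parent, "->", kids)
```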

DISQUS has no image-based smileys and my attempts to insert scripts that replace the smileys with GIFs would probably be futile because the DISQUS comments get displayed in a separate iframe (which I consider a rather good policy, by the way). The pirate icon under the blog entries sends you to the Echo archives (with all the good old smileys, like-by's, and reply structure) but you can no longer add new Echo comments, which is why you may want to get familiar with DISQUS 2012, which is arguably a superior system anyway. The Echo archives will disappear on October 1st; the pirates will go extinct soon afterwards.

I recommend that regular users look at the capabilities of DISQUS. You may surely have your "D" account display a nickname rather than your real name, and you may choose your avatar etc.: click the gear icon (upper right corner of DISQUS), pick Edit your profile, and go through the menus. The conversations may be ordered from the newest, oldest, or best comments (click on Discussion in the upper left corner of the DISQUS iframe). Choosing e.g. "oldest" again is a fast way to reload the DISQUS iframe, by the way. Try to go through the buttons to see what DISQUS allows you to do. Non-troll commenters are quickly being placed on the white list, to avoid the "due to abuse" message (a buggy message that actually stands for "this comment awaits a moderator").

As Google's Doodle reminds us today, Alan Turing would be celebrating his 100th birthday on this very day if he hadn't swallowed some cyanide when he was 41.



Because he may be considered the forefather of computer science, it's kind of incredible to realize how fast the progress in computer science and the computer industry has been in the past 100 years. Given the current state of medicine, the forefather of these fields could easily be with us today.

He's not, but that's just due to an unlucky accident.





And we must appreciate that the first 24 years of this century of computer science were passive: only when Turing was 24, in 1936, did he describe his a-machine (automatic machine) or Turing machine, as we call it today. This machine was important as a "proof of concept" showing that deterministic gadgets based on a rather simple philosophy were capable of doing pretty much all the mechanical tasks that humans can do with information and numbers.

Today, we understand this principle rather well and real-world computers have lots of internal structure not described by the Turing machine design. However, the Turing machine is still being used as a mathematical model of a device that can do everything that men (and women) can do with information – and everything that equivalent computers can do.

What is a Turing machine?

It has an infinite tape with infinitely many squares. Each of them contains a symbol from a finite set and the symbol can be rewritten. One symbol/square in this infinite memory is being read by the "head" at any given moment. This symbol is \(a\). However, you can't live with the tape only. You also need some "processor", right?

A Turing machine has a processor that can be reduced to a finite memory, the state register whose immediate state is denoted \(q\), and... What about the part that actually calculates? This whole part is represented by a table and nothing else. That's enough. The table says\[
q(t)\,a(t) \mapsto q(t+1)\,a(t+1)\,d(t+1).
\] Depending on the variable data, namely the state \(q\) of the state register and the character \(a\) that the "head" is currently reading, the table determines what the new values of \(q,a\) should be – those things are rewritten in the state register as well as on the tape – and for these values of \(q,a\), the table also says whether the head should stay at the same place, i.e. \(d=0\), or move by one square to the right or to the left, \(d=\pm 1\). That's it. That's everything you need. That was Turing's representation of the Al Gore Rhythm which he was able to invent years before Al Gore, the inventor of the Internet and the savior of the world, was even born.
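
If you prefer code to words, here is a minimal sketch of such a machine in Python – a toy example of my own, not anything from Turing's paper. The whole "processor" really is just the table; this particular table adds one to a number written in unary:

```python
# table[(q, a)] -> (new q, new a, d): the entire "processor" of the machine.
# This toy table (my example) computes n -> n+1 for a unary number of 1s.
table = {
    ("scan", "1"): ("scan", "1", +1),   # run right across the block of 1s
    ("scan", " "): ("done", "1",  0),   # first blank square: write 1 and halt
}

tape = {i: "1" for i in range(3)}       # "111", i.e. the number 3 in unary
q, head = "scan", 0                     # state register and head position

while q != "done":
    a = tape.get(head, " ")             # the symbol under the head; blank = " "
    q, tape[head], d = table[(q, a)]    # one table lookup rewrites q and a...
    head += d                           # ...and moves the head, d = 0 or ±1

print("".join(tape[i] for i in sorted(tape)))   # prints 1111, unary for 4
```

The infinite tape is faked by a dictionary that only materializes the squares the head has actually visited.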

Well, for finite projects, you don't really need the infinite tape, either. A finite tape would be enough and its memory could be incorporated into the state register, your humble correspondent would add to simplify the Turing machine. ;-) But with this simplification, the machine would be really dull: everything it could do would already be written in the table. The separation of the memory into the tape and the state register in the Turing machine conveys the fact that a computer is supposed to process a huge amount of data with instructions, codes, and processors that are much more limited.

What your CPU or GPU is doing is just one gigantic table. Of course, it is a table with many patterns that contains addition and multiplication tables and many other more complicated tables as its subtables. These tables aren't being remembered in a gigantic amount of memory. Instead, they're being calculated. New computers are calculating – and doing many complicated things with graphics as well – by a synchronized dance of many transistors. Modern computers have many layers of memory (and cache) that interpolate between the tape and the state register, too. But in principle, a large enough table would be enough. It wouldn't be equally practical but it would be mathematically equivalent.
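
To make the equivalence tangible, here is a trivial Python sketch – the operand width is chosen for illustration only: the computed function and the exhaustively precomputed table answer every 8-bit addition identically, and the only difference is whether you burn cycles or memory:

```python
def add(a, b):
    return a + b                        # the "calculated" answer

# The same operation as one big precomputed table, for all 8-bit operands:
table = {(a, b): a + b for a in range(256) for b in range(256)}

# Mathematically equivalent, just a different trade-off of time vs memory:
assert all(add(a, b) == table[(a, b)]
           for a in range(256) for b in range(256))
print(add(200, 55), table[(200, 55)])   # 255 255
```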

If you haven't heard about the cyanide story, I must tell you. In 1952, he was arrested and accused of sexual contact with another gay man – the same kind of "crime" that had previously haunted Oscar Wilde and that would instantly promote you to the chair of an Ivy League university president nowadays. He was fired and had to undergo estrogen therapy. Because of the latter, the thin marathon runner got fat and acquired breasts. He was disgusted by these new organs, so in 1954 he got inspired by his favorite fairy-tale hero, Ms Snow White. In particular, he took "an especially keen pleasure in the scene where the Wicked Queen immerses her apple in the poisonous brew". He did the very same thing. Just like in the 1937 Snow White movie, he died – but something else was born at the same moment. Of course, I am talking about the logo of Apple:



Now, you just add a few more details by John von Neumann, Bill Gates, Steve Jobs, and a few others – apologies if I have forgotten your name but the state register of this particular blog entry was limited and I didn't want to write it on the tape – plus lots of straightforward work and you get to the current state of computer science and the computer industry. ;-)

Happy birthday and rest in peace, Alan Turing.

12 comments:

  1. The story that the logo of Apple stems from Turing may be little more than a popular myth. In Walter Isaacson's biography of Steve Jobs, no mention of Turing is found. It seems that the Apple logo is an apple due to the fact that Steve Jobs had worked at a friend's hippie commune in Oregon picking apples.

  2. Maybe, but the legend about Turing's end surely sounds more impressive than the seemingly equally speculative yet down-to-earth hippie story.

  3. In another biography of Jobs, they say that the Apple logo stems from the proverbial apple that fell on Newton's head, thus sparking his "creativity" and, ultimately, allowing him to find the law of universal gravitation. Since Apple always marketed itself as a maker of products for "creative" people, this myth is the most appealing of all. No need to eat apples, just byte the apple ;-)

    Wozniak was an engineering wizard; Jobs was a master marketer with only one semester of college. It seems unlikely to me that either of them had ever heard of Turing.

  4. My guess is that Alan Turing was one of the most important players in WWII.


    Somewhat off topic, I heard an interview with Carlos Bueno, the author of the computer science for kids book "Lauren Ipsum".


    "Lauren Ipsum is lost in Userland. She knows where she is, or where she's going, but maybe not at the same time."


    http://www.cbc.ca/player/Radio/Spark/ID/2243650443/


    The interview starts at 26:30.


    Bueno made a statement to the effect that "Computer Science is not really a science and not about computers". The "not a science" part I get – it's more like a branch of mathematics. The "not about computers" part I hadn't thought about, but upon consideration it is correct.


    I'm thinking about buying the book for my 10-year-old son. What type of books did your parents buy for you when you were young?

  5. I am definitely not an expert on Apple, although OS X is a good operating system – except OS X Lion, which sucks. From what I know, AJ is more right, but the first Apple logo was much different from the one used today and it was not bitten. What maybe happened is that they first used the image of the apple that, from what I know, Newton sort of jokingly said fell from a tree; and some years later they perhaps decided that since they were using the apple as a logo, they should change it to a bitten apple because of Turing. Gordon, if you have more details about the apple like you had about the heads not being chopped off (I knew they weren't) last time, you are welcome to comment.

  6. Hehe, the Apple logo got them into a dispute with the record company of the Beatles. This was only recently resolved, allowing one to get Beatles music on iTunes.

    I can't remember who claimed to have been ripped off by whom, but if it was Apple Records claiming that Apple Computer ripped them off, any of those stories could have been their defense.

  7. Babbage was certainly ahead of his time in his thinking but the real inventors of the computer as we know it were Alan Turing and John von Neumann, based largely on their wartime experience with the specialized machines Colossus and ENIAC, respectively.

  8. Prof. Jack Copeland, a "Turing expert," said yesterday that by today's standards the evidence would not support a verdict of suicide; Turing may have poisoned himself by accident.

    http://www.bbc.co.uk/news/science-environment-18561092

  9. They agreed to let Apple have the name since Apple Computer would never be involved with music.

  10. The persecution of Alan Turing is a permanent stain on the 20th-century history of the UK. Churchill acknowledged that Turing and his colleagues at Bletchley Park probably shortened the war by two years (yet the authorities still insisted on destroying all of their working machines and documents, rather than using them as the springboard for a post-war UK computer industry). His face on a banknote is the least that should be done to honor his memory – so don't expect any action any time soon.

  11. I always thought the explanation of the apple logo was the obvious one – that the apple represents the apple of knowledge from the Garden of Eden, with a healthy bite taken out by Eve. A kind of gnostic-influenced "yay science" piece of symbolism. Not that I have seen any source that confirms this; it was just my impression. I've also sometimes wondered whether the Isaac Newton apple-on-the-bonce story might be a self-conscious tongue-in-cheek reference to this kind of gnostic symbolism, as Newton himself was into alchemy/gnosticism and so forth. Again, I haven't actually researched this, but it seems plausible and wouldn't surprise me.

  12. The price is right:

    http://monkeybuddha.blogspot.ca/2012/04/apple-i-advertisement.html
