Tuesday, October 03, 2017

David Gross and the cloud

Today, just as I fully expected, the physics Nobel prize went to the LIGO fathers: Weiss 50%, Thorne 25%, Barish 25%, which exactly matches my recommendation, so OK! The PC warriors didn't manage to remove Barish despite his blackface joke, and they didn't add a political spokeswoman despite her more than equal sexual organs, either. I knew it would be LIGO at 11:42 when someone said "Einstein" in the hall. LOL.

It's a good reason to talk about Nobel prize winners. A Nobel prize winner in physics walked into a shop and told the clerk: "I would like a new telephone; my budget is not constrained at all."

"In that case," the clerk responds, "we have this new iPhone X for you. It has Face ID with a TrueDepth camera, Animoji, an HDR OLED edge-to-edge superretina display, faster 64-bit A11 Bionic processor with a neuron engine and integrated motion coprocessor M11, wireless charging, FDD-LTE, Bluetooth 5, NFC, compass, iBeacon microlocalization..."

The Nobel prize winner interrupts the clerk and screams: "Dear Sir, you must have misheard me. I want a TE-LE-PHONE!" ;-)

So David Gross starred in a similar sketch yesterday and reports about it must have brightened the morning not just for me:

What do you mean "in the cloud"? Where is it actually, Preskill? LOL.

Now, we can't be quite sure what Gross' background exactly was, why he asked, and what he meant by the question. Maybe Gross knows what cloud computing is – and he just wanted to have mental control over the events. An experimental physicist had better know where his experiment is located, at least roughly.

A simulator ultimately runs on some computers and they're still located somewhere (although we like to think that at least the black hole event horizon degrees of freedom are heavily delocalized in some important way). In practice, the words "in the cloud" make reality sound far more abstract, uncertain, and delocalized than it actually is: the very term may be heavily overrated. In most cases, "in the cloud" really means in a server building of Amazon or Google or Microsoft or other companies, or some combination of those. The word "cloud" sometimes just suggests that "you don't need to know where the servers are, and you shouldn't even ask." But we can still ask, right? It can be interesting or useful to know where the servers actually are!

Well, given David's age, it's more natural to assume that he really heard an explanation of "cloud computing" for the first time yesterday. Lots of people who are much younger than he is don't really understand anything about the world of modern computation, mobile devices, and stuff like that.

Progress is fast and completely new "theoretical constructs" in IT technology become rather crucial for IT professionals every couple of years. So you know, I am old enough that as a kid, I really began with BASIC in the early 1980s – and learned the machine code for the Z80, 8080, and especially 6510 processors, writing numerous programs in machine code and in assembler, including one for the 6510 in the Commodore C64. I only learned Pascal when I had already decided not to focus on computing, and my knowledge of more modern things remained superficial – so I have used things like JavaScript, Perl, Python, and especially Wolfram Mathematica for what I consider a counterpart of the coding (which I did much more as a kid, anyway). I've only written three programs in C – each a "translation from Pascal" – and I needed assistance with one of them. "C" or "C++" may still be the rough cut that divides the programmers from non-programmers today.

An interview with the Big and Young Sheldon. In the U.S., Young Sheldon opened with an impressive 17 million viewers. And I absolutely loved the pilot, too.

Some young experts may even be ignorant of what these numbers mean. Some equally old and a bit younger people appreciate how immensely outdated my background (and the background of some similarly old people) is. Some older readers surely have even more outdated backgrounds and they remember FORTRAN or ALGOL or COBOL – at least the first of these seems to be still slightly alive in the world of particle colliders, which I find rather incredible.

BASIC was good enough to actually do something with computers. It was enough to do anything in principle. (In fact, I think that some of the programming contests I was good at assumed programs in a "Czech translation of BASIC", as I would put it. The focus was on the ideas of the algorithms, of course, not on formalities.) But the programming was illogical and inelegant. So in the following years, there was a growing consensus that coders should switch to modular programming (which I still mostly did), object-oriented programming (which I already largely didn't), event-driven programming (in some sense, people who act as "amateur webmasters" have to encounter it sometimes), and a few more advances like that. In recent years, data and computations have been moving to the cloud – some generally "delocalized" service with servers out there that keep the data and/or do the computations, which you may connect to from any place and any device. My definition probably differs from what real experts would say.
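To make the paradigm shift above a bit more concrete – the scenario and names below are my own toy illustration, not anything from the actual contests – here is the same trivial task written first in the linear, procedural style of the BASIC era, and then in an event-driven style, where code merely registers handlers and a dispatcher fires them when events arrive:

```python
# Procedural style: one linear flow of control, BASIC-era thinking.
def sum_of_squares(numbers):
    total = 0
    for n in numbers:
        total += n * n
    return total

# Event-driven style: you register handlers for named events and a
# dispatcher invokes them as events come in (roughly how GUIs and
# web pages work).
class Dispatcher:
    def __init__(self):
        self.handlers = {}

    def on(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)

    def emit(self, event, payload):
        for handler in self.handlers.get(event, []):
            handler(payload)

dispatcher = Dispatcher()
result = []

# The handler only says what to do *when* a number arrives;
# it never controls the overall flow.
dispatcher.on("number", lambda n: result.append(n * n))

for n in [1, 2, 3]:
    dispatcher.emit("number", n)

print(sum_of_squares([1, 2, 3]))  # 14
print(sum(result))                # 14
```

Both compute the same thing; the difference is who owns the flow of control – your program, or the environment that delivers the events.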

On top of that, new layers of structure have been added that underlie the complex transmission of information – packets, layers that encapsulate various aspects of the information, cryptography, whole sophisticated collaborative schemes and backup systems etc.
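The "layers that encapsulate" idea can be sketched in a few lines – the layer names below are simplified stand-ins for the real protocol stack, chosen just for illustration. Each layer wraps the payload it gets from the layer above with its own header and trailer, and the receiving side peels them off in the reverse order:

```python
# A caricature of protocol encapsulation: each layer wraps the data
# from the layer above; the receiver unwraps in reverse order.
LAYERS = ["application", "transport", "network", "link"]

def encapsulate(payload: str) -> str:
    # "application" wraps first, so "link" ends up outermost,
    # just as the link layer frames everything on a real wire.
    for layer in LAYERS:
        payload = f"[{layer}]{payload}[/{layer}]"
    return payload

def decapsulate(frame: str) -> str:
    # Peel the outermost layer first, i.e. in reverse wrap order.
    for layer in reversed(LAYERS):
        prefix, suffix = f"[{layer}]", f"[/{layer}]"
        assert frame.startswith(prefix) and frame.endswith(suffix)
        frame = frame[len(prefix):len(frame) - len(suffix)]
    return frame

wire = encapsulate("hello")
print(wire)               # [link][network][transport][application]hello...
print(decapsulate(wire))  # hello
```

Cryptography and backup schemes slot into such a stack as just more wrapping and unwrapping steps, which is why the whole thing can stay comprehensible despite its depth.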

So you know, while I scored some medals in Czechoslovak-level programming olympiads in the late 1980s, I surely ceased to be a top (or real) programmer in the following years, as did many others who chose a different field (or subfield). But I do understand many of the developments, and that's enough to guess that all of them, or almost all of them, have some merit, some justification – that they represent progress.

But I can surely imagine how David Gross feels when some slightly younger – and probably (even more) "progressive" – scholar starts to talk about things that are "in the cloud". The "progressive" younger chap must have his head in the clouds, or he must be high, just like many others who repeat the empty idealist PC clichés without any content. You know, while David Gross is surely left-wing, I think that he would still agree that people are losing contact with the beef and too many people are switching from researching and doing hard things to superficial babbling.

And that's probably the general mood with which he must have listened to Preskill's sentence about the "cloud" and why he felt the moral duty to interrupt Preskill. Needless to say, I think that a top reason I laughed so intensely is the realization that David Gross is extremely brilliant and his brain is still operating extremely intensely – but nevertheless, due to his age and background, he was able to produce a question that sounds really funny and is so easy to mock.

I remember a "classmate" at the Rutgers graduate school who was over 70 (such "students" paid no tuition, I believe). He often interrupted instructors with would-be clever, creative questions. I remember one very well: the instructor divided the blackboard into three regions with three lines, clearly in order to explain three concepts. The "classmate" asked: "What is the big Y on the blackboard!?" Believe me, when an instructor gets 20 such questions per hour, the character of the lecture changes.

While a similar hilarious scene sometimes arises, I think it is extremely healthy for the likes of David Gross – and hopefully for others who have sufficiently comparable credentials (better than my former Y-classmate) – to interrupt the speakers with similar fundamental questions.

