Thursday, July 05, 2012

Stephen Wolfram on Higgs, particle physics

A Moment for Particle Physics: The End of a 40-Year Story?

Guest blog by Stephen Wolfram, a trained particle physicist, entrepreneur, and software designer

The announcement early yesterday morning of experimental evidence for what’s presumably the Higgs particle brings a certain closure to a story I’ve watched (and sometimes been a part of) for nearly 40 years. In some ways I felt like a teenager again. Hearing about a new particle being discovered. And asking the same questions I would have asked at age 15. “What’s its mass?” “What decay channel?” “What total width?” “How many sigma?” “How many events?”

When I was a teenager in the 1970s, particle physics was my great interest. It felt like I had a personal connection to all those kinds of particles that were listed in the little book of particle properties I used to carry around with me. The pions and kaons and lambda particles and f mesons and so on. At some level, though, the whole picture was a mess. A hundred kinds of particles, with all sorts of detailed properties and relations. But there were theories. The quark model. Regge theory. Gauge theories. S-matrix theory. It wasn’t clear what theory was correct. Some theories seemed shallow and utilitarian; others seemed deep and philosophical. Some were clean but boring. Some seemed contrived. Some were mathematically sophisticated and elegant; others were not.

By the mid-1970s, though, those in the know had pretty much settled on what became the Standard Model. In a sense it was the most vanilla of the choices. It seemed a little contrived, but not very. It involved some somewhat sophisticated mathematics, but not the most elegant or deep mathematics. But it did have at least one notable feature: of all the candidate theories, it was the one that most extensively allowed explicit calculations to be made. They weren’t easy calculations—and in fact it was doing those calculations that got me started having computers to do calculations, and set me on the path that eventually led to Mathematica. But at the time I think the very difficulty of the calculations seemed to me and everyone else to make the theory more satisfying to work with, and more likely to be meaningful.

At least in the early years there were still surprises, though. In November 1974 there was the announcement of the J/psi particle. And one asked the same questions as today, starting with “What’s the mass?” (That particle’s was 3.1 GeV; today’s is 126 GeV.) But unlike with the Higgs particle, to almost everyone the J/psi was completely unexpected. At first it wasn’t at all clear what it could be. Was it evidence of something truly fundamental and exciting? Or was it in a sense just a repeat of things that had been seen before?

My own very first published paper (feverishly worked on over Christmas 1974 soon after I turned 15) speculated that it and some related phenomena might be something exciting: a sign of substructure in the electron. But however nice and interesting a theory may be, nature doesn’t have to follow it. And in this case it didn’t. And instead the phenomena that had been seen turned out to have a more mundane explanation: they were signs of an additional (4th) kind of quark (the c or charm quark).

In the next few years, more surprises followed. Mounting evidence showed that there was a heavier analog of the electron and muon—the tau lepton. Then in July 1977 there was another “sudden discovery”, made at Fermilab: this time of a particle based on the b quark. I happened to be spending the summer of 1977 doing particle physics at Argonne National Lab, not far away from Fermilab. And it was funny: I remember there was a kind of blasé attitude toward the discovery. Like “another unexpected particle physics discovery; there’ll be lots more”.


But as it turned out that’s not what happened. It’s been 35 years, and when it comes to new particles and the like, there really hasn’t been a single surprise. (The discovery of neutrino masses is a partial counterexample, as are various discoveries in cosmology.) Experiments have certainly discovered things—the W and Z bosons, the validity of QCD, the top quark. But all of them were as expected from the Standard Model; there were no surprises.

Needless to say, verifying the predictions of the Standard Model hasn’t always been easy. A few times I happened to be at the front lines. In 1977, for example, I computed what the Standard Model predicted for the rate of producing charm particles in proton-proton collisions. But the key experiment at the time said the actual rate was much lower. I spent ages trying to figure out what might be wrong—either with my calculations or the underlying theory. But in the end—in a rather formative moment for my understanding of applying the scientific method—it turned out that what was wrong was actually the experiment, not the theory.

In 1979—when I was at the front lines of the “discovery of the gluon”—almost the opposite thing happened. The conviction in the Standard Model was by then so great that the experiments agreed too early, even before the calculations were correctly finished. Though once again, in the end all was well, and the method I invented for doing analysis of the experiments is in fact still routinely used today.

By 1981 I myself was beginning to drift away from particle physics, not least because I’d started to work on things that I thought were somehow more fundamental. But I still used to follow what was happening in particle physics. And every so often I’d get excited when I heard about some discovery rumored or announced that seemed somehow unexpected or inexplicable from the Standard Model. But in the end it was all rather disappointing. There’d be questions about each discovery—and in later years there’d often be suspicious correlations with deadlines for funding decisions. And every time, after a while, the discovery would melt away. Leaving only the plain Standard Model, with no surprises.

Through all of this, though, there was always one loose end dangling: the Higgs particle. It wasn’t clear just what it would take to see it, but if the Standard Model was correct, it had to exist.

To me, the Higgs particle and the associated Higgs mechanism had always seemed like an unfortunate hack. In setting up the Standard Model, one begins with a mathematically quite pristine theory in which every particle is perfectly massless. But in reality almost all particles (apart from the photon) have nonzero masses. And the point of the Higgs mechanism is to explain this—without destroying desirable features of the original mathematical theory.

Here’s how it basically works. Every type of particle in the Standard Model is associated with waves propagating in a field—just as photons are associated with waves propagating in the electromagnetic field. For almost all types of particles, the average value of the underlying field is zero. But for the Higgs field, one imagines something different: that there’s a nonlinear instability built into the mathematical equations that govern it, which leads to a nonzero average value for the field throughout the universe.

And it’s then assumed that all types of particles continually interact with this background field—in such a way that they acquire a mass. But what mass? Well, that’s determined by how strongly a particle interacts with the background field. And that in turn is determined by a parameter that one inserts into the model. So to get the observed masses of the particles, one just inserts one parameter for each particle, and then adjusts it to give the mass of that particle.
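Schematically (my own sketch, not from the post, with the conventional factors of √2 omitted): if a fermion field ψ couples to the Higgs field φ with an inserted strength y, and φ settles to a nonzero background value v, the interaction term turns into a mass term:

```latex
% Hypothetical illustration: y is the per-particle parameter inserted
% into the model; v is the nonzero background value of the Higgs field.
\mathcal{L}_{\text{int}} \;=\; -\,y\,\bar{\psi}\,\psi\,\phi
\quad\xrightarrow{\;\phi \,\to\, v + h\;}\quad
-\,\underbrace{y\,v}_{m}\,\bar{\psi}\,\psi \;-\; y\,\bar{\psi}\,\psi\,h
```

Reading off m = y v shows why each mass costs one inserted parameter: nothing in the model fixes y, so it is simply chosen to match the observed mass.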

That might seem contrived. But at some level it’s OK. It would have been nice if the theory had predicted the masses of the particles. But given that it does not, inserting their values as interaction strengths seems as reasonable as anything.

Still, there’s another problem. To get the observed particle masses, the background Higgs field that exists throughout the universe has to have an incredibly high density of energy and mass. Which one might expect would have a huge gravitational effect—in fact, enough of an effect to cause the universe to roll up into a tiny ball. Well, to avoid this, one has to assume that there’s a parameter (a “cosmological constant”) built right into the fundamental equations of gravity that cancels to incredibly high precision the effects of the energy and mass density associated with the background Higgs field.

And if this doesn’t seem implausible enough, back around 1980 I was involved in noticing something else: this delicate cancellation can’t survive at the high temperatures of the very early Big Bang universe. And the result is that there has to be a glitch in the expansion of the universe. My calculations said this glitch would not be terribly big—but stretching the theory somewhat led to the possibility of a huge glitch, and in fact an early version of the whole inflationary universe scenario.

Back around 1980, it seemed as if unless there was something wrong with the Standard Model it wouldn’t be long before the Higgs particle would show up. The guess was that its mass might be perhaps 10 GeV (about 10 proton masses)—which would allow it to be detected in the current or next generation of particle accelerators. But it didn’t show up. And every time a new particle accelerator was built, there’d be talk about how it would finally find the Higgs. But it never did.

Back in 1979 I’d actually worked on questions about what possible masses particles could have in the Standard Model. The instability in the Higgs field used to generate mass ran the risk of making the whole universe unstable. And I found that this would happen if there were quarks with masses above about 300 GeV. This made me really curious about the top quark—which pretty much had to exist, but kept on not being discovered. Until finally in 1995 it showed up—with a mass of 173 GeV, leaving to my mind a surprisingly thin margin away from total instability of the universe.

There were a few bounds on the mass of the Higgs particle too. At first they were very loose (“below 1000 GeV” etc.). But gradually they became tighter and tighter. And after huge amounts of experimental and theoretical work, by last year they pretty much said the mass had to be between 110 and 130 GeV. So in a sense one can’t be too surprised about yesterday’s announcement of evidence for a Higgs particle with a mass of 126 GeV. But explicitly seeing what appears to be the Higgs particle is an important moment. Which finally seems to tie up a 40-year loose end.

At some level I’m actually a little disappointed. I’ve made no secret—even to Peter Higgs—that I’ve never especially liked the Higgs mechanism. It’s always seemed like a hack. And I’ve always hoped that in the end there’d be something more elegant and deep responsible for something as fundamental as the masses of particles. But it appears that nature is just picking what seems like a pedestrian solution to the problem: the Higgs mechanism in the Standard Model.

Was it worth spending more than $10 billion to find this out? I definitely think so. Now, what’s actually come out is perhaps not the most exciting thing that could have come out. But there’s absolutely no way one could have been sure of this outcome in advance.

Perhaps I’m too used to the modern technology industry where billions of dollars get spent on corporate activities and transactions all the time. But to me spending only $10 billion to get this far in investigating the basic theory of physics seems like quite a bargain.

I think it could be justified almost just for the self-esteem of our species: that despite all our specific issues, we’re continuing a path we’ve been on for hundreds of years, systematically making progress in understanding how our universe works. And somehow there’s something ennobling about seeing what’s effectively a worldwide collaboration of people working together in this direction.

Indeed, staying up late to watch the announcement early yesterday morning reminded me more than a bit of being a kid in England nearly 43 years ago and staying up late to watch the Apollo 11 landing and moonwalk (which was timed to be at prime time in the US but not Europe). But I have to say that for a world achievement yesterday’s “it’s a 5 sigma effect” was distinctly less dramatic than “the Eagle has landed”. To be fair, a particle physics experiment has a rather different rhythm than a space mission. But I couldn’t help feeling a certain sadness for the lack of pizazz in yesterday’s announcement.

Of course, it’s been a long hard road for particle physics these past 30 or so years. Back in the 1950s when particle physics was launched in earnest, there was a certain sense of follow-on and “thank you” for the Manhattan project. And in the 1960s and 1970s the pace of discoveries kept the best and the brightest coming into particle physics. But by the 1980s as particle physics settled into its role as an established academic discipline, there began to be an ever stronger “brain drain”. And by the time the Superconducting Super Collider project was canceled in 1993, it was clear that particle physics had lost its special place in the world of basic research.

Personally, I found it sad to watch. Visiting particle physics labs after absences of 20 years, and seeing crumbling infrastructure in what I had remembered as such vibrant places. In a sense it is remarkable and admirable that through all this thousands of particle physicists persisted, and have now brought us (presumably) the Higgs particle. But watching yesterday’s announcement, I couldn’t help feeling that there was a certain sense of resigned exhaustion.

I suppose I had hoped for something qualitatively different from those particle physics talks I used to hear 40 years ago. Yes, the particle energies were larger, the detector was bigger, and the data rates were faster. But otherwise it seemed like nothing had changed (well, there also seemed to be a new predilection for statistical ideas like p values). There wasn’t even striking and memorable dynamic imagery of prized particle events, making use of all those modern visualization techniques that people like me have worked so hard to develop.
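As an aside on those statistical ideas: the “5 sigma” criterion quoted in the announcement corresponds to a one-sided Gaussian tail probability of roughly 3×10⁻⁷. A minimal sketch of the conversion (my own illustration, not from the post):

```python
import math

def one_sided_p_value(sigma: float) -> float:
    """One-sided tail probability of a standard normal beyond `sigma`.

    Uses the identity 1 - Phi(z) = erfc(z / sqrt(2)) / 2.
    """
    return 0.5 * math.erfc(sigma / math.sqrt(2.0))

# The conventional "discovery" threshold in particle physics:
p5 = one_sided_p_value(5.0)
print(f"5 sigma -> p ~ {p5:.2e}")  # about 2.87e-07
```

By the same formula, the weaker 3 sigma “evidence” threshold works out to about 1.3×10⁻³.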

If the Standard Model is correct, yesterday’s announcement is likely to be the last major discovery that could be made in a particle accelerator in our generation. Now, of course, there could be surprises, but it’s not clear how much one should bet on them.

So is it still worth building particle accelerators? Whatever happens, there is clearly great value in maintaining the thread of knowledge that exists today about how to do it. But reaching particle energies where without surprises one can reasonably expect to see new phenomena will be immensely challenging. I have thought for years that investing in radically new ideas for particle acceleration (e.g. higher energies for fewer particles) might be the best bet—though it clearly carries risk.

Could future discoveries in particle physics immediately give us new inventions or technology? Years ago things like “quark bombs” seemed conceivable. But probably no more. Yes, one can use particle beams for their radiation effects. But I certainly wouldn’t expect to see anything like muonic computers, antiproton engines or neutrino tomography systems anytime soon. Of course, all that may change if somehow it’s figured out (and it doesn’t seem obviously impossible) how to miniaturize a particle accelerator.

Over sufficiently long times, basic research has historically tended to be the very best investment one can make. And quite possibly particle physics will be no exception. But I rather expect that the great technological consequences of particle physics will rely more on the development of theory than on more results from experiment. If one figures out how to create energy from the vacuum or transmit information faster than light, it’ll surely be done by applying the theory in new and unexpected ways, rather than by using specific experimental results.

The Standard Model is certainly not the end of physics. There are clearly gaps. We don’t know why parameters like particle masses are the way they are. We don’t know how gravity fits in. And we don’t know about all sorts of things seen in cosmology.

But let’s say we can resolve all this. What then? Maybe then there’ll be another set of gaps and problems. And maybe in a sense there’ll always be a new layer of physics to discover.

I certainly used to assume that. But from my work on A New Kind of Science I developed a different intuition. That in fact there’s no reason all the richness we see in our universe couldn’t arise from some underlying rule—some underlying theory—that’s even quite simple.

There are all sorts of things to say about what that rule might be like, and how one might find it. But what’s important here is that if the rule is indeed simple, then on fundamental grounds one shouldn’t in principle need to know too much information to nail down what it is.

I’m pleased that in some particular types of very low-level models I’ve studied, I’ve already been able to derive Special and General Relativity, and get some hints of quantum mechanics. But there’s plenty more we know in physics that I haven’t yet been able to reproduce.

But what I suspect is that from the experimental results we have, we already know much more than enough to determine what the correct ultimate theory is—assuming that the theory is indeed simple. It won’t be the case that the theory will get the number of dimensions of space and the muon-electron mass ratio right, but will get the Higgs mass or some as-yet-undiscovered detail wrong.

Now of course it could be that something new will be discovered that makes it more obvious what the ultimate theory might look like. But my guess is that we don’t fundamentally need more experimental discoveries; we just need to spend more effort and be better at searching for the ultimate theory based on what we already know. And it’s certainly likely to be true that the human and computer resources necessary to take that search a long way will cost vastly less than actual experiments in particle accelerators.

And indeed, in the end we may find that the data necessary to nail down the ultimate theory already existed 50 years ago. But we won’t know for sure except in hindsight. And once we have a credible candidate for the final theory it may well suggest new particle accelerator experiments to do. And it will be most embarrassing if by then we have no working particle accelerator on which to carry them out.

Particle physics was my first great interest in science. And it is exciting to see now after 40 years a certain degree of closure being reached. And to feel that over the course of that time, at first in particle physics, and later with all the uses of Mathematica, I may have been able to make some small contribution to what has now been achieved.


  1. Thanks so much, Dr Wolfram, for agreeing to present your very interesting memories and thoughts about particle physics of the past and present on this blog. (For everyone, I was informed Dr Wolfram will have limited but perhaps nonzero time to invest in coming days so we shouldn't exaggerate the questions when it comes to the time needed to answer them, I guess.)

    I've always been very interested in the evolution of a powerful thinker who has clearly had all the skills to do high-energy physics as I understand it and who ended up with a qualitatively different viewpoint, while achieving amazing success in all kinds of adjacent human activities. Some of this mystery has been clarified by this essay in my eyes; parts of it have not. I really agree with the description of the various theories in HEP physics - various combinations of mundane or unusable, among other adjectives. And the Higgs boson is a hack, one may say; Shelly Glashow calls it the Weinberg toilet, after all.

    The difference between our attitudes could be that I just trained myself not to care about these impressions that different theories have on my mood. The Higgs boson is a hack but it also seems like one could demonstrably show that it was needed for the unitarity of the WW scattering or in equivalent ways (to preserve the electroweak symmetry in some form or at high energies which is needed both for the good behavior as well as beauty of the electroweak theory), so it's probably true, whether or not we also have the temptation to say it's a hack. That's been my attitude. Is that wrong?

    So many things, theories, and explanations in physics are inevitably more straightforward and down-to-Earth and unholy than we thought that they "should" be; others are, on the contrary, more abstract, seemingly more complicated or more contrived than we expected, or filled with intellectual or mathematical diamonds, or connected with realms of science we thought of as being independent, or having uncertainties we thought that shouldn't exist, and so on. Our expectations about the level of abstractness, purity, required labor, required imaginations don't always match the reality. But I think it's important for scientists to readjust all these expectations according to what we learn from the experiments and mathematical derivations applied to data - including very general data and their patterns - from the past experiments.

    So while I totally understand the emotions that make various theories in HEP physics look disappointing etc. and perhaps even the perception of much of the research as stuck (except for string theory, things haven't really changed conceptually in 40 years), I still haven't understood the intellectual step going from this situation to the perception that one could use the classical cellular automata or something like that instead of the Standard Model or general relativity or string theory. They just don't seem to pass the tests, like Lorentz invariance or the violations of Bell-like inequalities that QM can achieve. So how can they replace the Standard Model or GR? Aren't the tests and agreements more important than the moods?

    My paragraphs above are implicitly questions; I just didn't want to formulate them as clear questions so that they don't suggest an obligation to answer...

  2. Wolfram is an egotist.

  3. Although I don't need to use it now, Mathematica is the best analysis software. But they should make options to make it more lightweight, have an online version with more options, and try to increase the number of casual users by making it cheaper for people who use it less; they have to find more ways to get money, although they have already become more flexible compared to before. They also have to change their dull offices where they do the software development.

  4. Dear Dr. Wolfram,

    thanks a lot for this awesome guest blog :-)
    Your thoughts and memories made such nice, fascinating and exciting reading that I completely forgot to follow the Star Trek series I was watching, hanging in my armchair with my smartphone in my hand. I just could not stop reading ...

    The analogy between routine investments in business, for example, and the money needed to build an accelerator is very good. I never understood why (too many) people are screaming that fundamental physics is too expensive etc ...
    I'd somehow like to hit the rock (consisting of nice new particles) with my skull too, just to reassure myself ... ;-)

    The discussion about possible underlying fundamental theories confused me a bit. Do you then think we are still far from there ...? About cellular automata I know next to nothing, so I'm interested in the answers to Lumo's implicit questions too :-)


  5. A truly enthralling article from Stephen Wolfram. Thank you! A question I would have is: how long would it take to duplicate our universe into a computer program, compared to building another more powerful collider?

  6. Your humble correspondent he is not. :)

  7. This was such a pleasure to read, thank you Dr Wolfram.

    (Thank you also for suggesting (in NKS) that the speed of light might be due to the computation speed of the universe, I think you will be proved right with this idea, but I expect it will be a non-local and nondeterministic evolution rule, rather than a local deterministic (cellular automata) rule)

  8. Ondřej Čertík, Jul 5, 2012, 11:50:00 PM

    Very nice article! Thanks a lot for taking the time to write up your own perception of the development of physics. Very interesting.

    Besides what Luboš asked, it has never been clear to me how to use the chapters about Special and General Relativity in the NKS book to actually derive the Einstein equations, so that one can solve problems. The derivation there feels a little too vague to me from reading the chapters. It seems to me it's clearer/more precise to simply postulate the equations (or the action, if it's available). Or is it meant as simply a general framework? That we don't yet know the proper cellular automata?

  9. Cellular automata are deterministic, so is Wolfram claiming that there is a deterministic theory explaining QM? (like other crackpots)

  10. George's idea is good. They should have an option where you rent the software, perhaps pay by the hour, CPU or human.

    As far as the relationship between J/psi and electron, I still think that there is a sort of relationship and it does imply substructure to the electron and that substructure is related to that of the strong force. There are a total of six leptons. The quarks fit naturally between the leptons (for example, having intermediate charges). Perhaps not coincidentally, there are six J/psi and also six Upsilons (or at least there were the last time I looked). This is related to the Koide mass formulas.

  11. Many thanks for that guest blog and particularly for developing Mathematica.

  12. Absolutely great post.

    "...inserting their values as interaction strengths seems as reasonable as anything"

    Actually I think it's quite brilliant.

    On another note, this comment is interesting:
    "...assuming that the theory is indeed simple."

    I am curious as to Dr Wolfram's thoughts about computational complexity as it relates to the theory of everything. It has been pointed out by John Preskill that it is possible, if we accept some of the implications of M-theory, that the computational complexity of the universe may be greater than what can be computed even on a quantum computer (slide 48).

    So even if we rightly accept M-theory, we might not be able to really tackle certain fundamental questions except in a very abstract way.

    I know there is a great effort to try to derive fundamental laws using existing computational power, and I think those efforts are important and need to be further developed, but is it a problem if the universe just turns out to be too hard to solve?

  13. I remember discussing CA with other pals in the second or third year of university, say 1987; we had discovered the Berlekamp, Conway, and Guy book, with the complete chapter on Life. After some discussion, the main problem was how to go to the continuum. Of course, in the days of lattice theory and the Kogut renormalisation group, some expectation of solving this issue could be present, but nobody was optimistic about it. The PhysRev papers of Wolfram, which I found after some analogical search in the indexes, did not help in this matter; I was so disappointed that I was even grieved, briefly, by the fact that they got to pass the editorial filter :-)

    I still think that there is something in the relationship between the discretisation and the continuum that needs to be worked out. We have scattered clues here and there, such as the equivalence between discretisation procedures and operator ordering criteria in path quantisation, or the coincidence between Butcher trees and Connes-Kreimer trees in, again, renormalisation. And of course we have the links between quantum mechanics and discretisation of area in phase space, a question that in some disguises could be told as old as newtonian mechanics itself.

    It is intriguing that, in a different work than Kreimer's, Connes and Lott were able to formulate the Higgs field as a kind of discrete jump between two copies of the space, each with a different algebra. But the copies themselves are not discrete. And they never got to find a need for three generations; just having more than one was enough to recover the standard model. Still, one wonders if your CA world has a continuum limit in this two-sheeted space, with the Higgs still there as a discrete spacing or a relevant distance.

  14. Thanks for the article.

    I'm also saddened by the lack of leaps in science and technology. Perhaps the leaps of the 1960's and 1970's were so large that they haven't yet been consolidated. That we haven't yet come to grips with them as a society.

    OTOH, I observe a relative lack of interest in hard sciences in school, college and university populations. There are easier ways of gaining qualifications for a comfortable income. It is sufficient for them to reap the benefits of what was done in the past, even though how that stuff works is, for them, indistinguishable from magic. They don't actually care to understand.

    In either case, "filling in the gaps" between the pillars established by past leaps in science in order to satisfy perceived "development" _should_ not be at the expense of making more leaps.

  15. Stephen Wolfram is many things, but humble is not one of them. ; )
    Take a look at this article:


  16. Dear commenters, just an explanation: the moderation standards under guest blogs are somewhat stricter. You will surely be able to calibrate your devices and find out the extent. ;-)

  17. Thanks for this post and its historical expositions.

    As an experimentalist I cannot evaluate the new theory (theory in the sense of world view, from the Greek root literally "the view that one sees") you are proposing. I do agree with you, though, that a lot more thought and finance should have gone into exploring new methods of acceleration.

    Back in the '70s and '80s Tom Ypsilantis was exploring the strong electric fields in crystals to utilize them in muon acceleration, for example. I do not know whether, if there had been more than incidental financing (for example, access to young bright physicists with fellowships to work on the project), it could have been more successful. At the moment the end result is that crystals can be used to bend muon beams.

    It was not possible to influence the mainstream direction which was to just go for bigger and higher energy accelerators of the same type. I expect the ILC ( International Linear Collider) will be the last in this line, if they do get the financing. Maybe after that there will appear an effort to think about new methods of using conservation laws in order to accelerate elementary particles.

  18. Many thanks to TRF for hosting this guest blog! There are many things to admire about Dr. Wolfram, not least his sponsorship of Theodore Gray's Periodic Table and MathWorld originally created by Eric Weisstein. Is the P.T. Table still the table in your company's conference room? An employer who allows people like Gray and Weisstein room to thrive, without subsuming their personalities under the corporate identity, is my kind of guy.

    If the James Webb Space Telescope gets completed and launched, I expect it will deliver a bonanza of data allowing us to see farther back in time than ever in portions of the electromagnetic spectrum that have been uncharted territory. There could be some stunning discoveries waiting in the wings that may compel theoretical physicists to revise and expand existing theories, to an extent and in directions that we do not yet conceive of.

    This comment is running too long already and I just want to finish by saying that as a layman I thank Dr. Wolfram for his contributions to carrying science and mathematics into society's mainstream, through his brilliant (if controversial) writings and the software produced by his company.

  19. Dear Dr. Wolfram, thanks so much for your minutes in physics. As you may remember, some "good old ideas" from the '70s (such as Fortran 77 and the SM) still work much better than all their modern descendants, up to any scale ;-)

  20. Apologies for turning the Periodic Table Table into a mere Periodic Table in my above post!

  21. Shouldn't it be at least a Periodic
    Table Table
    Table Table
    Table Table
    Table Table
    Table Table
    Table Table
    Table Table for it to be truly periodic?

  22. An off-topic British funny Higgs cartoon:

    'Und zis is ze God particle magnified 10,000,000,000 times... OH MEIN GOTT!'

  23. Lumo LOL :-D

    How did you manage to get the picture displayed directly in the comment ...?

  24. Dear Dilaton, the thumbnails are probably generated automatically with picture URLs and they have fixed the bug that prevented it in the past. Sometimes it may take minutes and reload for the thumbnail to be generated, too.

    Link to picture via Malcolm Ross. ;-)

  25. The Higgs boson sold on eBay in phial :-D

  26. Dear Shannon, the next big collider project is the International Linear Collider. On their website, they are a bit coy about when they expect it to be completed. However, I am guessing that a timeframe of 10 to 25 years is realistic.

    Replicating the evolution of the universe on a computer may take as long as the age of the universe, if I am not mistaken. At least this is so if you view the universe as a machine that computes its own evolution, and assume that this task cannot be done faster in any other way.
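    [A toy aside on this "no shortcut" idea, not part of the original comment: Wolfram's elementary cellular automaton Rule 30 is the standard example of a system believed to be computationally irreducible — to learn the cells at step n, no known method beats simulating all n steps one at a time.]

    ```python
    # Toy illustration of computational irreducibility: elementary CA Rule 30.
    # No general closed-form shortcut is known; to get the state at step n,
    # you simulate every intermediate step.

    def rule30_step(cells):
        """Apply Rule 30 to a tuple of 0/1 cells (fixed zero boundary)."""
        padded = (0,) + cells + (0,)
        # Rule 30: new cell = left XOR (center OR right)
        return tuple(padded[i - 1] ^ (padded[i] | padded[i + 1])
                     for i in range(1, len(padded) - 1))

    def evolve(cells, steps):
        for _ in range(steps):
            cells = rule30_step(cells)
        return cells

    # Start from a single 1 in the middle of 11 cells.
    start = tuple(1 if i == 5 else 0 for i in range(11))
    print(evolve(start, 5))  # -> (1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1)
    ```

    [Despite the three-line update rule, the center column of Rule 30 passes standard randomness tests, which is why predicting step n without running steps 1..n-1 is believed to be impossible in general.]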

  27. Thanks Eugene. I'd say a supercomputer could "suck up" the time parameter and make things happen in superpositions. However, IMO it would "only" be a low-quality copy of our Universe (I'd say we'd need to allow a lot of computer mistakes that would be fixed along the way to better match what we know/observe)... at the ILC :-)


  29. Wolfram has nothing to say (except how great Wolfram is).

  30. Thank you so much, Dr. Wolfram, for this blog post.

  31. Wow. I thought this guest post was supposed to be about the Higgs boson, not about the genius of the young Wolfram.

  32. While I hesitate to expose my ignorance to Lubos the Merciless, these descriptions sound like a generalization of QM (not that I am anything but somebody who took one introductory course 35 years ago), but the wave functions and probability distributions sound familiar to me. I had always thought that the Higgs was a prediction of string theory, which was a separate discipline only accessible, even at a high level, to people with an IQ equal to mine + some number that is a function of t - 1905 ;)

    The idea that the averages are not zero justifies the "God Particle" hype, in my simple estimation, but I may be reading that incorrectly as well.

    For those complaining about the price of Mathematica, let me complain about how hard it is to develop solid software, and how rare new forms of useful software are. Mathematica is great, even though admittedly I only use it for hobby engineering.

  33. Dr. Wolfram has made enormous contributions to technology development. "Mathematica" is a tour de force; it undoubtedly will contribute to progress and prosperity for a very long time and we should all be grateful for Wolfram's remarkable work.

    However, I have just read the last chapter of his "A New Kind of Science" and it is clear that Wolfram does not grasp the essence of quantum mechanics, a topic discussed here many, many times. He, like so many others, is chasing a ghost.

  34. I see you offer 'A New Kind of Science' for the iPad. I have an Android, which will show Kindle files. Please consider porting to Kindle, or even .pdf (which I can also view on Android).

  35. I still have to read it, but it seems that 't Hooft is suggesting that the 1+1 CA (yep, those of the Mathematica 1.0 About dialog I was ranting about in a previous comment) can be used to describe a worldsheet of string theory. You can look at it in arXiv:1207.3612 today.

  36. That's some ego you've got there... I hope Nature contacts you soon for some guidance on how to do things properly.