## Monday, March 25, 2013

### Speed of light is variable: only in junk media

Francis the Emule (Spanish) diplomatically agrees with me...
If you open Google Science News at this very moment, the #1 story is saying things like
new research shows that the speed of light is variable in real space.
The only problem is that the "research" is pure crackpottery. Those stories build upon the following two papers in a journal called European Physical Journal D, which I had never heard of in the context of fundamental physics:
A sum rule for charged elementary particles by Gerd Leuchs, Luis L. Sánchez-Soto (free: arXiv)

The quantum vacuum as the origin of the speed of light by Marcel Urban, François Couchot, Xavier Sarazin, Arache Djannati-Atai (free: arXiv)
The abstracts are enough to see that the authors aren't just making one or two serious technical errors. Instead, they misunderstand the very logic of science - how arguments in favor of some claims may or may not be phrased.

The first, German-Spanish paper tries to claim that the sum of squared electric charges over all elementary particle species (regardless of their mass) is $\sum_i Q_i^2 \sim 100.$ This is quite a bold statement. You may try to look for the quantum field theoretical (or stringy?) calculation leading to this condition. What you will find is that there isn't any quantum field theory in the paper at all!

Instead, the paper misinterprets virtual particles etc. in the way you expect from a 10-year-old schoolkid. For them, the virtual particles are real and they're connected by springs of some sort. Some physically meaningless calculations lead them to the sum of the squared charges. If you try to find out where the number $100$ came from, you will discover that it was calculated as a function of three more real parameters whose values were chosen arbitrarily.

It would be a terribly stupid paper even for a 10-year-old boy. But the authors must believe that it's possible to learn things about physics in this way even if they don't know anything about the way modern physics describes particle species and their interactions with the electromagnetic field – about quantum field theory. So one sentence in the paper refers to quantum field theory, after all. It's the last sentence before the acknowledgements and it says:
We hope that this result will stimulate more rigorous quantum field theoretical calculations.
Wow: they leave the details to their assistants, whose task is to convert the ingenious findings that contradict everything a quantum field theory could say about these matters into a proof in quantum field theory.

Needless to say, it's totally impossible in $d=4$ to have a similar constraint for the sum of squared charges. At most, the sum of cubed charges is what enters the gauge anomalies in $d=4$. But summed squared charges over particle species can't occur in physically meaningful formulae. Moreover, the number of particle species is really infinite – although most of them may have masses near the string scale or higher – so the sum is either ill-defined or divergent. In other words, it's implausible for an important physical formula to deal with particle species regardless of their mass.
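To make the contrast concrete, here is a quick sanity check (my own illustration, not taken from either paper) of the kind of cubic constraint that really does appear in $d=4$: within a single Standard Model generation, the hypercharges of all left-handed Weyl fermions sum to zero and their cubes sum to zero, cancelling the anomalies exactly. Exact fractions make the cancellation manifest:

```python
# Anomaly cancellation in one Standard Model generation: the hypercharges
# of all left-handed Weyl fermions, and their cubes, must sum to zero.
# Convention: Q = T3 + Y. Each entry: (hypercharge Y, multiplicity).
from fractions import Fraction as F

generation = [
    (F(1, 6),  6),   # quark doublet: 3 colors x 2 weak components
    (F(-2, 3), 3),   # right-handed up quark (as a left-handed antiparticle)
    (F(1, 3),  3),   # right-handed down quark (as a left-handed antiparticle)
    (F(-1, 2), 2),   # lepton doublet
    (F(1, 1),  1),   # right-handed electron (as a left-handed antiparticle)
]

sum_Y  = sum(n * Y    for Y, n in generation)   # mixed gravitational anomaly
sum_Y3 = sum(n * Y**3 for Y, n in generation)   # U(1)_Y^3 gauge anomaly

print(sum_Y, sum_Y3)   # 0 0
```

Note that the cancellation works term by term only because the cubes carry signs; a sum of *squares* is positive definite and could never cancel in this way.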

Via Gene, off-topic: Lots of music for a buck.

The other paper, the French one, is similar nonsense about the variable speed of light. These papers are being clumped together because the authors of both are clearly pals and they coordinated their invasion of the journals and the media. Let me repost the abstract here:
We show that the vacuum permeability and permittivity may originate from the magnetization and the polarization of continuously appearing and disappearing fermion pairs. We then show that if we simply model the propagation of the photon in vacuum as a series of transient captures within these ephemeral pairs, we can derive a finite photon velocity. Requiring that this velocity is equal to the speed of light constrains our model of vacuum. Within this approach, the propagation of a photon is a statistical process at scales much larger than the Planck scale. Therefore we expect its time of flight to fluctuate. We propose an experimental test of this prediction.
Unbelievable. Look at the first sentence. They think that they "show" that the vacuum permeability and permittivity "may" originate from the magnetization and the polarization of continuously appearing and disappearing fermion pairs. (Needless to say, there's no quantum field theory in this paper, either.) How do they achieve this ambitious task?

It's easy. They forget and ignore everything we know about physics and everything they should have learned about physics, even in high school. In this state of perfect oblivion, one isn't constrained by any knowledge at all – because there isn't any knowledge – so anything goes and an arbitrarily stupid pile of crackpottery "may" be true and one thus "shows" that it's possible.

Except that a person who knows something about physics may show that pretty much every sentence in this paper is pure rubbish. Their particular nonsense that "may" be true as they "show" is that the vacuum is chaotic for photons so the light propagation is chaotic and the speed is variable. By saying these things, they prove that they don't have the slightest clue about the actual explanation of the existence of light that we have known since the late 19th century.

The actual explanation of light is that it's a type of electromagnetic waves. And electromagnetic waves are simple solutions to Maxwell's equations, the equations that describe all electromagnetic phenomena. These equations are particularly simple in the vacuum. Maxwell's equations in the vacuum are actually the more fundamental ones; the propagation of electromagnetic waves in other media requires some extra work.

But in the vacuum, the permittivity $\varepsilon_0$ and permeability $\mu_0$ simply enter Maxwell's equations as conversion factors that disappear – that are replaced by $1$ – if we use more natural units. The reason why I say these things is that $\mu_0,\varepsilon_0$ are not supposed to be "derived" from any complicated mechanism involving lots of charged particles etc. On the contrary, they're players in the most fundamental equations of electromagnetism and it's the behavior of lots of charged particles that is "derived" and that can be reduced to fundamental Maxwell's equations.
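To make the role of these conversion factors explicit, here is a tiny numerical check (my own sketch, using the pre-2019 SI values that were current when this was written, in which $\mu_0$ was exact by definition): the vacuum Maxwell equations tie $\varepsilon_0,\mu_0$ directly to the speed of light via $c=1/\sqrt{\varepsilon_0\mu_0}$.

```python
# In SI units the vacuum Maxwell equations fix c = 1/sqrt(eps0 * mu0).
# Pre-2019 SI values: mu0 was exact by definition, eps0 followed from it.
import math

mu0 = 4 * math.pi * 1e-7        # T*m/A, exact (pre-2019 SI)
eps0 = 8.854187817e-12          # F/m

c = 1.0 / math.sqrt(mu0 * eps0)
print(round(c))                  # 299792458
```

In natural (Heaviside-Lorentz or Gaussian) units both constants disappear from the equations, which is exactly the sense in which they are conversion factors rather than outputs of some hidden mechanism.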

Hendrik Lorentz was the first man who showed that Maxwell's equations in general materials may be derived from the vacuum Maxwell's equations combined with some behavior of the charged and magnetized particles that exist inside the materials. It was an important insight (it helped Einstein to think in the right way when he was marching towards relativity) and people could have been unfamiliar with this insight at some point – except that Lorentz found those things more than 100 years ago so they shouldn't be unknown to authors of a journal called European Physical Journal D in 2013.

The authors are trying to derive the light propagation in the vacuum from light propagation in some fictitious complex material – which is exactly the opposite of the strategy physics chooses (and it's obvious why it chooses the opposite one): complicated materials are more complicated than the vacuum. In other words, the authors suggest that if their contrived "additional" effects didn't exist, the permittivity and permeability would vanish in the vacuum. But they couldn't vanish. Even when all the chaos is removed, physics must be described by non-singular equations, which essentially means – among many other things – that the permittivity and permeability would still have to be finite nonzero constants in the vacuum. We know what these constants are: they are $\varepsilon_0,\mu_0$.

But what is even more important is that the authors don't understand what is primary in science: unrestricted speculations about the ways the world "may" work, or constraints from observations and experiments? They clearly think it's the former. They "may" write kilobytes about nonsensical models that have nothing whatsoever to do with the Universe around us and claim that this "shows" something.

But science doesn't work like that. We actually know that the speed of light has to be completely constant and free of any fundamental "noise". In fact, our definition of one meter is such that the speed of light in the vacuum is tautologically $299,792,458\,{\rm m/s}$. So it's obviously constant. The constancy of the vacuum speed of light follows directly from special relativity and special relativity is what we actually know to be true from the observations. So all the speculations must adjust to this knowledge – and all other empirical knowledge we have. The authors' approach is just the opposite: they want the empirical knowledge to be adjusted to their unconstrained fantasies. They simply don't understand the basic point of science that the self-consistency of a hypothesis isn't enough for such a hypothesis to be a good scientific theory. Empirical knowledge actually matters and kills most of the conceivable guesses.

I can't resist comparing their approach with the following question that a user named John Smith asked on Physics Stack Exchange two days ago:
Why perpetual motion wouldn't be possible if we are so technological advanced?
You see some kind of a fundamental misunderstanding about the inner workings of the Universe and humanity. John Smith – and similarly the authors of the papers discussed in this blog entry – doesn't get the point that regardless of the technological sophistication, every civilization, much like every object in Nature, is "obliged" to obey the laws of physics, and the non-existence of perpetual motion machines is among these laws (the first two laws of thermodynamics).

John Smith's – and the authors' – opinion about this basic issue (about the very existence of the laws of Nature) is the opposite one. He believes – and they believe – that there are no permanent laws, there are just limitations that we're constantly breaking as we're getting more technologically advanced and more self-confident. The non-existence of the perpetual motion machines (or similarly the constancy of the speed of light in the vacuum) must be just due to some limitations of technology we can surely transcend in 2013 if we want! ;-)

It doesn't make sense to spend too much time with these silly papers. So I will stop and finish this blog entry with the complaint that the adjective European in the name of the journal could be replaced by Idiots' if we wanted the name of the journal to be more accurate. And that's not a good result for this old continent of ours! At the same time, these idiotic crackpot papers are widely quoted in the U.S. and other media, so Europe is not the only continent on which similar junk flourishes.

And that's the memo.

#### snail feedback (55) :

If the speed of light in a vacuum varied as much as claimed by the French authors, Rudolf Mössbauer wouldn't have gotten that Nobel Prize in 1961. Highly monochromatic photon sources could not even exist.

These guys are living in la-la-land.

Your startling admission that you have never heard of the EPJ series of journals makes me wonder what else you don't know.

Like so many physicists impressed by Einstein, you do not make a clear distinction between a definition, which says nothing about reality just about language and which is true by its semantic construction, and a statement with a physical meaning which may be true or false depending on the nature of physics. This is the basic distinction between analytic and synthetic statements in philosophy. Einstein was a master of this double play, with a definition (speed of light = 1 lightsecond /second) interpreted as a statement about physics in its role as the basic hypothesis of special relativity. The confusion allows physicists to state that a lightsecond can be used as length standard, BECAUSE the speed of light IS constant, as evidenced experimentally. This is like presenting experimental evidence that there are 100 centimeters on a meter.

The Pound-Rebka experiment, done in the Jefferson Laboratory at Harvard using the Mössbauer effect, was the first experimental test of General Relativity. Am I correct in saying that the variation in the velocity of light hypothesized by the French guys was already disproved by this experiment?

I had a brief look at the second paper (you can access the first 2 pages). This seems like some primitive attempt to calculate the vacuum polarization but, as Lubos mentions, without QFT. This is ridiculous because the calculation of the vacuum polarization in QED at one loop can be found in most textbooks on quantum field theory. The strangest thing is that the authors are members of two well-known French universities – I did my 'DEA' at one of them and my PhD at the other. Oh man

The sum of squared electric charges over all elementary particle species (regardless of their mass) is 100 .. dollars? minutes? grams?

Why does anybody take such joke papers seriously ...?!

This is really silly, the authors seem to have neither a clue about the scientific method nor what they are talking about. They do not even bother to determine the correct method (QFT more or less advanced ;-)...) to investigate the issue they want to clarify; heck since the speed of light is constant there is not even an issue to clarify ...

Maybe this obscure European journal is among these many new online journals that spam my mailbox with uncalled-for invitations to publish a paper with them twice each day, LOL :-D

I knew it! Nothing is constant, not even pi.

http://arxiv.org/pdf/0903.5321v1.pdf

Probably 100 times the squared electron charge. Anyway, it's nonsense.

It's just not a journal anyone in my field would ever publish papers in. I must have seen the name; it's just that the name didn't get stuck in my memory as a journal in which legitimate papers of this sort may be published, and these examples strengthen the impression that this was the right message.

Well, French universities, Schmenz universities. ;-)

Dear Gene, I would say that the natural theories of this kind were really ruled out in the 19th century, before the discovery of relativity, when interferometry was born. When someone says that the waves of light are randomly long in a way that resembles a random walk – and they do – it really means that the number of wavelengths $N$ that fit between two points should constantly fluctuate, because it should really be $N \pm C\sqrt{N}$ where $C$ is of order one.

Needless to say, there isn't even a fluctuation of plus minus one – because that would be enough to totally destroy the coherence of light. But a fluctuation by sqrt(N) where N is the number of wave maxima squeezed inside the interferometer? No way. As you correctly say, this example is a rather characteristic example showing what kind of insights are completely inaccessible to folks in the lalaland. They believe everything to be "chaotic" in the sense that coherence can't ever occur, nothing can ever be accurate, and so on. But much of modern physics is based on perfect coherence - lasers etc. Theoretically, we have precise frequencies of transitions and they can be calculated and measured really, really precisely (monochromatic radiation!), which of course excludes the possibility that the frequencies or speeds oscillate with any natural coefficient.

The only way to avoid an immediate clash with observations - with virtually any kind of an experiment that views light as a wave - is to make coefficients such as the number C above much smaller than one or much smaller than their naturally expected magnitude. But it's very unlikely that such numbers are adjusted so close to the perfect value but not quite. In other words, there don't exist any theories that would explain why C may be finite but nonzero. And surely a theory in which C=0 doesn't seem to be a special place is probably a theory where it's not a special place and that predicts C=O(1), a totally excluded value. It's just a theory on a totally wrong track. It could only be interesting if someone showed that surprisingly, it gives C=0 or C=0.000000000000000001. But because such a demonstration doesn't exist - and it's reasonable to expect it won't be found in the future, either - the theory is just dead.
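To see quantitatively how lethal even a small nonzero C would be, here is a toy simulation (my own sketch, not taken from the papers): a Gaussian phase jitter of standard deviation sigma suppresses the fringe contrast roughly as exp(-sigma^2/2), and a fluctuation of C*sqrt(N) wavelengths means sigma = 2*pi*C*sqrt(N), which is astronomically large for any natural C.

```python
# Toy model: two-beam interference with a random phase error delta.
# If the number of wavelengths fluctuates by C*sqrt(N), the phase jitter is
# sigma = 2*pi*C*sqrt(N), and the fringe visibility <cos(delta)> collapses.
import math
import random

def visibility(sigma, samples=200_000, seed=1):
    """Monte Carlo average of cos(delta) for delta ~ Normal(0, sigma)."""
    rng = random.Random(seed)
    return sum(math.cos(rng.gauss(0.0, sigma)) for _ in range(samples)) / samples

N = 10**6                      # wave maxima inside the interferometer
for C in (0.0, 1e-7, 1.0):     # fluctuation coefficient in N +- C*sqrt(N)
    sigma = 2 * math.pi * C * math.sqrt(N)
    print(f"C={C:g}: visibility ~ {visibility(sigma):.3f}")
# C=0 keeps perfect fringes; the "natural" C=1 wipes them out completely,
# and even C as small as 1e-7 only barely survives for this N.
```

The point of the toy model is that the only way to reconcile such a theory with interferometry is to tune C absurdly close to zero, which is exactly the fine-tuning argument above.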

Dear Lubos: Your long and spirited response gives yet another illustration of my observation that physicists after Einstein do not make a clear distinction between definition (analytic statement true by semantic construction) and statement about physics (synthetic statement which may be true or false). You repeat the idea that a meter, as a certain fraction of a lightsecond, can be used as a length standard BECAUSE the speed of light (in vacuum) IS CONSTANT, thus making an analytic statement into also a synthetic statement, that is an a priori statement about the world we live in, which can be guaranteed to be true. This is like sitting in your chamber with closed eyes and making a statement about the world which you can guarantee is true. It reminds me of bank guarantees issued by EU politicians with closed eyes behind closed doors.

Lorentz was very clear about the non-physical nature of the transformation he introduced, which was however ignored by Einstein who gave the transformed coordinates a direct physical meaning as real dilation of space and time. Lorentz understood while Einstein misunderstood. If only Lorentz were alive today, he could have helped modern physics out of the confusion introduced by Einstein.

The basic issue relates to the coordinate system used to express Maxwell's equations in vacuum, which in the absence of matter can vary as being tied to observer velocity, while the equations would look the same for different observers moving with different velocities, as if they anyway use the same equations. I explore this aspect in Many-Minds Relativity http://www.csc.kth.se/~cgjoh/ambsrelativity.pdf

Don't you see anything reasonable in my argument?

LOL, and the last page of the paper above shows that another quantity is non-constant: it always jumps by one. It's the number of self-citations of the author to his previous papers.

Wow, this paper is really a big pile of crap.

Dear Claes, the constancy of the speed of light in the vacuum - much like any statement about science - could have been either right or wrong *a priori* (which is a required "risk" for any statement to be scientific at all), but when the evidence is taken into account, we actually know what the truth value is. It is true.

What you're missing is that scientists are allowed - and, in fact, obliged - to take the evidence into account when they're doing further research (and even when they're deciding about terminology, conventions, and units).

OK I see that you don't understand my argument. I think I understand what you are saying and I think I see that you are missing something.

Anyway, let me then ask you: Is it thinkable as a possibility that the speed of light in vacuum would not be the same to all observers? What physics would that correspond to as thinkable physics? If it is not thinkable that the speed of light would not be constant, is then the constancy of the speed of a light a synthetic a priori statement about the world which can be guaranteed to be true independent of experimental observation?

...Many-Minds Relativity...?!

This is scary, seems like some kind of an attack of the borg is looming, LOL :-D

Hi Lubos: I sent a reply that seemed to vanish in space somewhere. I repeat my question: Is there any thinkable physics where different observers would have different conceptions about the speed of light (in vacuum), that is, that the speed of light (in vacuum) would not appear to be the same for all observers? If there is no thinkable such situation, does this mean that the constancy of the speed of light is a synthetic a priori statement about the physics of the world, which can be guaranteed to be true without experimental observation, and which is not simply an agreement or definition?

If you spent a couple of minutes with Many-Minds Relativity you would find my answers to these questions, but now the question is yours.

1. Please fix the double negation in the section just before the video:" ...can't occur in physically meaningless formulae."

2. Is Leuchs seriously taking unit-dependent constants such as epsilon0 and mu0 and arguing for their physical value? Is he seriously discounting excitations and other things that lead to many more particle types (as Lubos already pointed out)? I find that highly disturbing, given his credentials:
http://www.mpg.de/310384/physik_des_lichts_wissM
(Full Professor and Max-Planck Director)

Possibly related to the many minds interpretation of quantum mechanics?

Ah, come on, they are not all bad. Alain Connes used to teach at one of the two universities listed in the paper and Roland Omnes is a professor at the other one.

Dear Claes, it is you who is not getting something essential.

Yes, the possibility that the speed of light depends on the observer is "thinkable" and it even used to be believed, but we also know that this possibility is not used in Nature around us. We've known it from insights – theoretical and experimental advances – that we call the special theory of relativity.

So the proposition that the speed depends on the observer is thinkable but we also know, basically with absolute certainty, that it's false. The statement that the speed of light is constant - i.e. the statement that the opinion that it's not constant is false - is surely not an a priori statement of any sort. It's a result of a revolution in physics.

Sure, I know they're not all bad. I would still say that when evaluated as a whole, even the best 10 French universities are at a qualitatively lower level than the best 10 or even 50 U.S. universities.

Hi, 1) thanks, fixed. 2) Try to read the paper yourself and verify my interpretation. You're the first one who talks about "excitations" but that's a good point, indeed. In similar considerations, it would be very subtle to find a reason why the formula should sum over "elementary" particles but not their bound states - one term from each possible energy level of it. What I meant were not things like levels of the Hydrogen atom or another composite of known elementary particles but higher excitations of a string - or more generally a theory with new particle species at higher energies that almost certainly give new terms to his formulae.

Wow, what a powerful man! ;-)

Yep, exactly, it made me think about this too, apart from the borg ... :-D

The best place for Physics in France is the ENS. I would guess it is as good as some of the top 10 US places.

Ecole Polytechnique is 44th worldwide.

In fees the top US are always at the top.

This is my favourite piece of that paper: "More speculatively, one might consider the possibility that the values of the integers could vary with time, a result suggested by several early Fortran simulations. This possibility would have obvious implications for finance and accounting."

:)

Yep ! Adjusted for fees, I would say the French universities are better ;-)

IHES isn't too shabby :)

Lubos, you are wasting words dealing with someone whose mind has been warped by "philosophy-of-science" courses, books, etc. – i.e., rhetorical mind viruses with vacuous content.

I understand what you are saying, and I interpret it as an acknowledgment that the constancy of the speed of light (in vacuum) is an agreement between different observers to use the same Maxwell's equations. An agreement is an agreement and nothing one can argue about once the agreement is made. To reconcile the observations after the agreement to use the same Maxwell's equations independent of inertial motion, Einstein postulates that the observations must be connected by the Lorentz transformation (which leaves Maxwell's equations invariant), which means that the observations are not independent. But Lorentz would object to this way of interpreting his transformed coordinates as real space-time coordinates. To me Lorentz makes more sense than Einstein in this regard, while you believe more in Einstein, for some reason.

Another possibility explored in Many-Minds Relativity is to allow the observations to be independent and check what aspects will automatically be agreed upon and what aspects two different observers may view differently. This connects to an idea by Ebenezer Cunningham who brought relativity theory to the UK.

Of course, of course! I personally used a Fabry-Perot etalon in 1955, before lasers existed, and the number of wavelengths was absolutely stable. Michelson and Morley could not even have performed their seminal experiment disproving the aether (without, at least, taking the fluctuations into account), and the fluctuations would certainly have been observed many times during the nineteenth century.
I feel a bit stupid for not having seen this before touching my keyboard!

But the magnitude of the fees often depends more on reputation and cachet rather than quality. Top schools ought not tolerate the teaching of crap but they often do.

They are #1 for marching in stupid uniforms.

Unlike Lubos, I have just spent many minutes reading your document and found it to be utter nonsense.
Your statement, for instance, that Erwin Schroedinger did not accept the Copenhagen interpretation of QM is correct but Schroedinger simply did not get the essence of QM and neither do you. And that is just one example of what you don’t get. You, my friend, are stuck in fantasyland.

Dear Claes, one doesn't need any "agreement", more precisely, there isn't any way for an observer to "refuse" such an agreement.

The Lorentz invariance of Maxwell's equations is a nontrivial yet indisputable mathematical fact and this Lorentz invariance - the symmetry underlying special relativity - implies that the speed of light is the same relatively to all observers.

Special relativity says that this symmetry must be the symmetry not only of Maxwell's equations but also any other equations that describe Nature.

Lorentz never objected to any of Einstein's claims about relativity and he wouldn't object to any correct claim I am making about relativity – or that we are making about relativity today. The only similar sentence that is true is that Lorentz wasn't the first person who realized that Nature obeys relativity – the first person who discovered relativity.

It may be a good idea to have one mind when thinking about physics instead of many minds. Having many minds is usually referred to as schizophrenia.

Hi Gene: Don't be so sure that Schrödinger did not have very good reasons for not accepting the Copenhagen interpretation. To reject his objections (and mine) as utter nonsense is short-sighted.

You must be right. I got some semi-excited e-mails about Claes a few days ago which probably made me spend more time than I would otherwise spend.

One mind can be the mind of the Pope or the Dictator. A pluralistic democratic society has many minds agreeing on certain things and disagreeing on others.

That's right, Gene. In this recent article it is said that the French business schools are at risk of bankruptcy when they try to imitate Harvard. Reason: hiring very expensive (American) teachers, trying to get their school's name onto the Financial Times list of the best international schools. Some teachers earn up to 500000\$ per year! None of our schools can afford this, so of course we'll never be on the list of the best...
http://etudiant.lefigaro.fr/les-news/actu/detail/article/les-ecoles-de-commerce-risquent-la-faillite-a-se-prendre-pour-harvard-1516/

I agree. I prefer the Legion Etrangere with their white apron and ax ;-)

Must be TheOnion.archive

Well, as Heisenberg correctly said,

http://www.thebigview.com/spacetime/uncertainty.html

Heisenberg: "What Schrödinger writes about the visualisability of his theory [...] is crap."

Schrodinger explicitly said that the electron was a dissolved object. That directly contradicts the observation that the electron is seen at one place when studied with a good enough light.

"The only way to avoid an immediate clash with observations - with virtually any kind of an experiment that views light as a wave - is to make coefficients such as the C above much smaller than one or much smaller than their naturally expected magnitude. But it's very unlikely that such numbers are adjusted so close to the perfect value but not quite. In other words, there don't exist any theories that would explain why the C may be finite but nonzero. And surely a theory in which C=0 doesn't seem to be a special place is probably a theory where it's not a special place and that predicts C=O(1), a totally excluded value. It's just a theory on a totally wrong track. It could only be interesting if someone showed that surprisingly, it gives C=0 or C=0.000000000000000001. But because such a demonstration doesn't exist - and it's reasonable to expect it won't be found in the future, either - the theory is just dead."

lollolol! But what about anthropic arguments that would call a large C unobservable or what if the vacua with large C were unstable or what if there's a principle that limits C that we don't yet know about?

(Sorry, I just had to when I saw the form of this argument. I realize the histories of the two claims are "slightly" different. Just joking, really ;) )

lol

the speed of light is generated in it constant value is due the PT brken or appear the conservation of cp tp strong interactions,as deformations of spacemthat implies because cp ´s conserved,implying that c is constabt and the spsce is curved by the timr splitted;it is the time with two dimensions,then the oposite spimors conjugate the spacetime through of opposed torsions-anti-symetric metric tensor fields)

Whether or not Schroedinger had good reasons for anything is irrelevant and I did not reject his objections as utter nonsense. Many brilliant people, including Einstein himself, failed to grasp the very essence of QM. There is no shame in that, Claes; you are in good company but you are wrong. Schroedinger, unlike Bohr and Heisenberg, just did not get it.
Now, how the hell do you get the umlaut over the “o”?

Dear Gene, I am personally using the Czech Lumo keyboard

http://motls.blogspot.com/2011/09/international-lumo-keyboard-2011-fg.html?m=1

that has all the characters with any accents, the Greek alphabet, and much more accessible with at most 2 strokes per character.

Is it conceivable for the photon to have a very small mass, that would cause it to travel slightly slower than the maximum velocity?

Dear David, theoretically speaking, the answer is mostly No, one can't make the photon massive in a viable theory.

But even if you adopt a purely experimental perspective, one may show that the deviation from the right speed of light would be essentially undetectable.

We know that the mass of the photon is zero or so tiny that the characteristic "photon Compton" wavelength is much greater than the Earth. It's because the electromagnetic force mediated by a massive photon would be a finite-range force - much like for the Z-bosons, it would begin to exponentially drop at long enough distance scales. But we know that the geomagnetic field doesn't suffer from this exponential damping and we could probably mention stronger constraints (longer distances) of this sort, too.

The Earth radius is about 10^7 meters which translates to roughly 0.03 seconds of light travel time, which translates (via E = hbar/t) to about 3*10^{-33} joules, i.e. roughly 2*10^{-14} eV. The typical green photon has 2 eV or so; the mass bound is about 14 orders of magnitude less.

What's the deviation from the maximum speed for these parameters? Well, the Lorentz factor has to be about 10^{14} (the increase of the energy from the rest energy) which means that 1-v^2/c^2 is 10^{-28} or so, so v/c differs from 1 also by about 10^{-28}.
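For the record, the same estimate can be reproduced numerically (my own rounding of the constants; this is an order-of-magnitude argument, not precision physics):

```python
# Order-of-magnitude bound on photon-mass effects: the photon Compton
# wavelength must exceed the Earth radius, since the geomagnetic field
# shows no Yukawa-like exponential damping at that scale.
import math

hbar = 1.054571e-34      # J*s
c = 299792458.0          # m/s
eV = 1.602176634e-19     # J
R_earth = 6.4e6          # m, Earth radius (rounded)

m_c2_eV = hbar * c / R_earth / eV   # photon mass bound in eV
gamma = 2.0 / m_c2_eV               # Lorentz factor of a 2 eV green photon
dev = 1.0 / (2.0 * gamma**2)        # 1 - v/c ~ 1/(2 gamma^2) for gamma >> 1

print(f"mass bound ~ {m_c2_eV:.1e} eV")   # ~ 3e-14 eV
print(f"gamma      ~ {gamma:.1e}")        # ~ 7e13
print(f"1 - v/c    ~ {dev:.1e}")          # ~ 1e-28
```

Either way, the deviation lands dozens of significant figures beyond anything measurable, which is the whole point.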

There is no context or method where we could measure the deviations from "c" at the 28th significant figure, with this precision, so for all practical purposes, the speed of light carried by a massive photon would be indistinguishable from "c", the maximum speed. You really need to break/deny relativity to change this conclusion, and that is a comparably constrained minefield.