
Link? According to the article, "And last year, NASA conducted its own tests in a vacuum to rule out movement of air as the origin of the force." So either the story is wrong or you are, but both can't be right, right?


The article is very sweeping. You're getting only the pro side, not any of the critical view.

People are still heavily debating the experiments. It has not been empirically demonstrated to a standard sufficient to call the thrust real. That's why only a very small community is working on this. "NASA" in this context means EagleWorks is letting a couple people spend a bit of spare time and resources on it as a speculative project.

Mike McCulloch's work is interesting, but it's quite far from any mainstream acceptance or testing, as he himself would admit. It's mostly independent of the EmDrive stuff, but the enthusiast community has latched onto it as a way to escape the unpleasant reality that established physics says the EmDrive is a perpetual motion machine. I'm glad EmDrive is giving his work a boost because I think it should be tested, but there should be much better ways of testing Modified inertia by a Hubble-scale Casimir effect than the EmDrive (he's got a couple blog posts about it if you're curious).


It's arguably possible that we are so unsure of the effect because scientists are "cautiously creeping" so as not to hurt their future reputations.

There was anomalous scaling in the observed effects in one Chinese research team's results, but it has not been repeated, because full reproduction and pushing past their power levels would cost some decent money.

Seems stupid that the (possibly) most groundbreaking advance in propulsion ever made can't get a million or two when we collectively gamble more than that each day at casinos, lotteries, and horse races.

We could quickly eliminate the question of "are these anomalous results from bad experiment design?" with a bigger experiment that would make such flaws more evident. Yet we waste time instead of money.


I think you've nailed it there. A professional scientist can't take this seriously because the reputational risk is so high.

Science needs to embrace failure more!


Nobody is allowed to fail or make mistakes any more, not just scientists.

Your past used to fade with time. You used to be able to move town, escape your failure, try again.

Now, any failure is forever emblazoned in shrieking tones online, and the only escape is a new name, and starting over.

It's not just science, it's everything.


>Seems stupid that the (possibly) most groundbreaking advance in propulsion ever made can't get a million or two when we collectively gamble more than that each day at casinos, lotteries, and horse races.

Perhaps they need to set up a crowdfunding campaign to get those people betting on their experiment, gambling on a science experiment rather than a game of chance.


Exactly, in order to have a long career in science, one needs to toe the line, and keep in the mainstream.


Somebody built this complicated device without already having any understanding of how it supposedly works, and you think it might work anyway? Talk about winning the lottery.

What's stupid is betting on stupid.


> What's stupid is betting on stupid.

What's stupid is making investments which are known to have an expected return less than 1:1. Playing the lottery isn't stupid because we don't understand how it works; playing the lottery is stupid because we do understand how it works, we know what the expected return is, and we know that it's a worse investment than just stuffing cash under a mattress. This is entirely uncontroversial, and lotteries are run by profit-making entities (either private firms or as a "tax on the ignorant" by governments) whose entire viability relies on this fact.

In contrast, things like the EmDrive are high-risk high-return investments. Their expected return is more difficult to estimate than a lottery's, since we'd need to estimate the probability of it working and the expected return if it did work. However, whilst an idea like the EmDrive may be controversial, the idea of spending a small proportion of investment on ideas like the EmDrive isn't controversial. There may be arguments over how much counts as "small", which ideas should be prioritised, etc., but this just goes back to the uncertainty of estimating the expected return. It's entirely uncontroversial to say that, if it works, the EmDrive would be an incredibly lucrative technology; it's also uncontroversial to say that it's unlikely to work. The tricky part is working out which term dominates the expectation: does the big thing (the potential return) multiplied by the small thing (the probability) result in a big thing or a small thing (the expected return)?
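As a toy illustration of that last question, where every number is a made-up assumption rather than any real estimate:

```python
# Toy expected-value comparison; all numbers are invented for illustration only.
def expected_return(p_success, payoff, stake):
    """Expected profit of a bet: probability times payoff, minus the stake."""
    return p_success * payoff - stake

# A lottery publishes its odds and payout, so its negative expectation is known:
lottery = expected_return(p_success=1 / 14_000_000, payoff=5_000_000, stake=1)

# For a long-shot technology bet, the sign of the expectation flips with the
# guessed probability of success -- exactly the hard-to-estimate term:
pessimistic = expected_return(p_success=1e-7, payoff=1e12, stake=2_000_000)
optimistic = expected_return(p_success=1e-5, payoff=1e12, stake=2_000_000)
```

Same formula, opposite conclusions, depending entirely on the guessed probability.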


This isn't high risk though. It's low risk. However, just because something's low risk doesn't mean it's worth the investment.

The problem here is that you could make this same argument about almost anything, a la Pascal's Wager.


> This isn't high risk though. It's low risk.

I meant "high risk" in the sense that the return has a high variance, with all of it being concentrated in a thin sliver of the probability; apologies if I've misused a technical term.

> The problem here is that you could make this same argument about almost anything, a la Pascal's Wager.

I'd call it a consequence of expected return being a widely-applicable calculation, rather than a "problem" per se. Even if we knew the expected returns, we'd still need a decision procedure to perform the allocation.

My point is that it's uncontroversial to avoid putting all funding into the most promising project (e.g. other Physics research doesn't wait until we're "finished" with the LHC), so there's certainly scope for allocating a small budget to more "fringe" research like the EmDrive. I'm not in charge of research budgets, but as a simplified argument we might imagine allocating funding on an exponential scale, based on expected return and risk: the most promising projects compete for a chunk of half the funds, the "second tier" projects for a quarter, and so on. Projects with lower impact are lower tier, projects with lower probability of success are lower tier. We stop once we've rounded-up an allocation to the smallest unit of funding, hence avoiding Pascal's Wager.

Also note that there are only a finite number of options to choose from, because there are only a finite number of submitted research proposals.
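A minimal sketch of that tiered scheme (the project names, tier assignments, and numbers are all invented, and real grant allocation is of course nothing this mechanical):

```python
def allocate(tiers, budget, smallest_unit=1.0):
    """Give each tier half the remaining budget, split evenly among its
    projects; stop once a tier's per-project share falls below the smallest
    unit of funding (this cutoff is what avoids Pascal's Wager)."""
    allocations = {}
    remaining = budget
    for tier in tiers:
        share = remaining / 2
        per_project = share / len(tier)
        if per_project < smallest_unit:
            break
        remaining -= share
        for name in tier:
            allocations[name] = per_project
    return allocations

# Invented example: fringe ideas land in a low tier but still get a nonzero grant.
tiers = [["LHC upgrades"],
         ["dark matter search", "gravitational wave follow-up"],
         ["EmDrive replication"]]
alloc = allocate(tiers, budget=100.0, smallest_unit=0.5)
```
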


>only a very small community is working on this

>established physics says

Restricting research capacity and dismissing empirical evidence because it doesn't jibe with "established science" is exactly the opposite of what science is all about.


Exactly: instead of spending time dismissing the EMDrive as impossible, time should be spent devising experiments that prove it ineffective.

Modern science seems to be rapidly encroaching on religious territory: observable evidence is less important than what historical figures have said.


There have always been the recalcitrant "that's how it's always been done around here" types who often push back even when confronted with evidence. Take, for example, Ignaz Semmelweis[1]: he had the evidence that it worked, but he couldn't explain how washing hands could decrease mortality in hospitals. Few people took him seriously; he suffered a nervous breakdown and died. Afterwards it took Pasteur and Lister to get people to actually accept the theory.

[1] https://en.wikipedia.org/wiki/Ignaz_Semmelweis


I do understand that it is a recurring theme. This belief-driven approach to science can itself be scientifically shown to be a bad idea (given evidence such as yours).

While every effort should be made to further explore our current assumptions (such as with the LHC), we shouldn't neglect low-hanging fruit that challenges our ideas (such as the EMDrive). If as much as 1% of the money being spent on the LHC went to "anomalous science", we'd likely have a conclusive answer on whether the EMDrive works. Science is a tool that disproves, and we've been hacking it into a proving tool for far too long. It's time to go back to basics and figure out more about what we don't know.

> McCulloch’s theory makes two testable predictions.

Think about the cost of either of those experiments: vanishingly small compared to the other science being performed today. If science wants to see the EMDrive go away, and is certain that it doesn't work, a comparatively small grant is all it takes.


We need an Anti-Nobel prize for disproving stuff.


A scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it. -- Max Planck


Science has always had devout, religious elements, don't think otherwise. Look at the pushback the Theory of Evolution received from respected scientists and how much hand-waving dismissal there was of Relativity.

It has been argued that the only way science progresses is when the obstacles to science die of old age.


This seems like a misconception to me. Scientists have always tried to apply their energies toward the most promising lines of research. The supply of research capacity is finite.


Can you provide links to alternate timelines that prove your heuristic effective? ;)


> established physics says the EmDrive is a perpetual motion machine.

In what sense is it a perpetual motion machine? As far as I understand it violates the conservation of momentum, not the conservation of energy. How would a theoretical perpetual motion machine based on this effect work?


It only looks like conservation of momentum is violated. From the article:

>The cone allows Unruh radiation of a certain size at the large end but only a smaller wavelength at the other end. So the inertia of photons inside the cavity must change as they bounce back and forth. And to conserve momentum, this must generate a thrust.

The new idea introduced is the quantization of inertia at small accelerations. As far as I understand it, from one end of the cone to the other there isn't a smooth change of inertia. This depends on the idea of Unruh radiation, and the reason the article gives for quantization of inertia is that as accelerations get very small, Unruh radiation wavelength becomes larger than the observable universe, forcing unruh radiation to take whole-value wavelengths (quantization). Again as far as I understand it, the inertia of photons on one end of the cone takes a different quantized value of inertia than the photons on the other end. So the thrust isn't a violation of conservation of momentum. The thrust is necessary to not violate conservation of momentum.
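A hedged summary of the quantities involved, as I read McCulloch's published MiHsC papers (here \(\Theta\) is the Hubble-scale horizon size; the exact numerical factors are my reading, not gospel):

```latex
% Unruh temperature for proper acceleration a, hence a peak wavelength
% that grows as the acceleration shrinks:
T_U = \frac{\hbar a}{2\pi c k_B}, \qquad \lambda_{\text{peak}} \sim \frac{c^2}{a}
% MiHsC's claimed modification of inertial mass, which cuts off as the
% Unruh wavelength approaches the horizon scale \Theta:
m_i = m \left( 1 - \frac{2c^2}{|a|\,\Theta} \right)
```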


No, the new idea is a massive photon. There are strong limits on that from astronomical observations, and a stronger one from Special Relativity when cast in a modern isometry-group form. In that form, the Poincare Group is the isometry group of the unremovable (flat space) background, and the Poincare Group has exactly one free parameter, "c", which corresponds to the speed of a massless particle. In this form, we assume that light is massless, and look for experimental evidence supporting that assumption (there is a fair amount; we have lab studies showing that m_photon cannot be more than about 10^-17 eV/c^2, and there are even more stringent limits from observational cosmology).

A nonzero photon mass makes a mess of particle physics at high energies. The Standard Model falls apart due to loss of gauge invariance, and QED becomes really obviously non-renormalizable near limits we have already tested. So this would be a big thing.

(Note that if you bite the bullet and take a nonzero photon mass (as McCulloch says in his paper at page 3: "Normally, of course, photons are not supposed to have inertial mass in this way, but here this is assumed") you can probably get a non-constant photon speed too, along the lines of neutrino oscillation. But you can always write down non-physical theories that conflict with frequently and precisely tested areas of physics...)

None of the above departs from the symmetries of Special Relativity, and one subgroup of those (invariance under spatial translation) is what implies the (local) conservation of (linear) momentum per Noether's (first) theorem.

So there's a really big looming question about why the Standard Model (which has the Poincare Group baked into it) works as well as it does everywhere but in an EmDrive or a single space probe, and resorting to special shapes of objects on scales much larger than that of atoms further conflicts with Poincare invariance.

Finally, Unruh radiation is a difference in particle count and particle energy measured by differently accelerated observers. The cosmological horizon, when it formed, produced an acceleration between observers then and observers in the future. Unruh radiation from that is pretty uncontroversial. However the problem is that the acceleration is pretty small, so the temperature will be much lower than that of the CMB. Also, you'd expect anisotropies based on Earth's (and its surrounding Local Group's) peculiar motions relative to the horizon. Why isn't there a dipole anisotropy similar to the one we see in the CMB, and if there is, by how much does it offset the inertial argument in McCulloch's paper?


Constant thrust means constant acceleration, so speed goes up indefinitely with a constant supply of energy. But kinetic energy is proportional to the speed squared. So you get more energy than you put in.
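A few lines of arithmetic make the over-unity point concrete (all numbers invented for illustration):

```python
# Constant thrust F from constant electrical power P: velocity grows linearly,
# kinetic energy quadratically, so KE eventually exceeds the energy supplied.
thrust = 1.0    # N   (invented)
power = 1000.0  # W   (invented)
mass = 100.0    # kg  (invented)

def energy_balance(t):
    """Kinetic energy gained vs electrical energy supplied after t seconds,
    starting from rest and ignoring all losses."""
    v = (thrust / mass) * t
    return 0.5 * mass * v**2, power * t

# Break-even: 0.5 * F^2 * t^2 / m == P * t  =>  t = 2 * P * m / F^2
t_even = 2 * power * mass / thrust**2
ke_late, supplied_late = energy_balance(2 * t_even)  # well past break-even
```

Past `t_even` the device has gained more kinetic energy than it ever consumed, which is the perpetual-motion objection in a nutshell.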


I don't think they actually claim acceleration will be constant as the device speeds up.


Conservation laws aren't fundamental, but are implications of local differentiable physical symmetries (this is the primary result of Noether's (first) theorem).

The local symmetries in question are represented by the Poincare Group, which is the isometry group of Minkowski spacetime, which in turn is the unremovable background of Special Relativity. (The Lorentz Group is a subgroup of the Poincare Group).

This is another way of saying that in a 3+1 flat spacetime, well-designed local probes of fundamental physics will not depend on time translation (i.e., experimenting today vs experimenting tomorrow), on spatial translation (e.g., the same experimental results here vs there), on spatial orientation (i.e., rotation about the 3 spacelike axes; so you get the same result when you turn the system under test 90 degrees to the left), or under Lorentz boosts, which are basically instantaneous changes in constant uniform motion along any of the axes. Additionally, small-scale (small compared to several times the size and mass-energy of the whole solar system) natural phenomena are essentially always "well-designed local probes of fundamental physics".

Conservation of linear momentum arises from invariance under spacelike translation; a violation of that conservation makes it very difficult to maintain spatial translation symmetry, and in particular flies in the face of the many direct tests of physical systems at different places on our planet and in our solar system, for example. So that's a big deal that needs explaining, and in particular the explanation should preserve the known and reproducible invariance under those translations as well as allowing the EmDrive's alleged violation.
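In textbook form, the Noether argument in that paragraph is one line: if the Lagrangian does not depend on position, the conjugate momentum is conserved.

```latex
\frac{\partial L}{\partial x} = 0
\quad\Longrightarrow\quad
\frac{d}{dt}\frac{\partial L}{\partial \dot{x}}
  = \frac{\partial L}{\partial x} = 0
\quad\Longrightarrow\quad
p \equiv \frac{\partial L}{\partial \dot{x}} = \text{const.}
```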

The perpetual motion machine argument is slightly different because it is rooted in a claim about how the EmDrive behaves when in non-constant motion; in the current version of the theory paper "V 9.4" at equations 14-16, there is an implied violation of the Einstein Equivalence Principle. The equation is a bit odd, and you can easily read it to say that there is a power<>thrust relationship that varies with acceleration, and raise the Einstein elevator objection: if you put the EmDrive into an upwards-accelerating box, or leave it on the ground, do you still take this power<>thrust relationship seriously? If so, you get "free power" (although that's a bit subtle). If not, then you have to explain a violation of the Equivalence Principle, which is something that has also been very well tested and has so far applied without fail.

You could think of it with a somewhat concrete example: to hover at some fixed height above the Earth's surface, the EmDrive would have to produce thrust similar to the force you would apply to hold it at that height with your hands; since you are near the surface, that should be about "g" expressed as N/kg (and indeed Eq 16 tells you how much electrical power you will require for the EmDrive). So far so good. But Eq 16 can be read to say that when you place the EmDrive on the ground, it will produce electrical power proportional to "g".

I think this is pretty clearly just a mistake rather than a serious claim. Unfortunately eq 16 features prominently in the FAQ and all the "marketing" material about the EmDrive. (Even more unfortunately it's hard to see how to fix the equation without making the drive obviously inoperable).


Thanks. For the supposed violations - aren't we still then only discussing the device working within known physics? There is no violation if the device were to slowly lose mass due to some unknown process (making it a form of Ion thruster), or if it accelerates some yet unknown weakly interacting massive particle in the opposite direction of the thrust, or if it somehow only conserves invariants for a "larger" system, potentially the entire universe?

Given an exotic enough explanation (negative energies/faster than light particles/spacetime warping/..) it feels like you can cheat your way around almost any invariant? Occam is left weeping in a corner of course.


Sure, if it's not actually reactionless, most objections fall away.

The problem with reaching for exotic particles or the like is that we don't see them in searches which involve tens of orders of magnitude more power, and we certainly don't see them in, for example, electric toasters or radar sets. So what would need explaining is what's peculiar about the EmDrive with respect to (extremely) unusual particle interactions, and there is [a] nothing obvious and [b] nothing at all in the "theory" paper.

Likewise, reaching for non-local physics answers raises similar questions (why is EmDrive doing non-local whatever but my kettle isn't; or alternatively, why is EmDrive coupling with whatever much much much more strongly than my conventional oven is).

I have another pair of things that you can add to the list: [a] EmDrive somehow violates causality in a way that other similar arrangements of mass-energy-momentum do not, and [b] EmDrive somehow escapes logic in a way that other similar systems under study do not. These are neither more far-fetched nor more unpalatable, compared to abandoning local physics (causality and logic preserved, but hidden non-local variables proliferate) or abandoning the Standard Model as an accurate low-energy theory of matter (causality, logic and locality preserved, but now what happens in the molecules and atoms of cars, computer chips, and light bulbs? we can no longer be quite so sure!).

There is A LOT of known physics and almost exactly zero examples of violations of the known invariants at low particle energies. You are re-testing relevant parts of all that in reading this comment.


Huh, seems I'm wrong - there was an actual high vacuum test: http://arc.aiaa.org/doi/pdf/10.2514/6.2015-4083

Paywalled on that link though, so I can't see the details, which tend to be everything: the last time a vacuum chamber was involved, it wasn't actually evacuated.


NASA's test and its results are publicly available on NASA's website. http://ntrs.nasa.gov/search.jsp?R=20140009930&hterms=RF+Test...


A German team tested it in a vacuum. It's mentioned with sources on the EmDrive's Wikipedia page.


Yes, and that report indicates a null result. But they also indicate that the driving frequency was way mistuned for the cavity. They were driving it with a magnetron from a microwave oven, which output a frequency far from optimal. This indicates a low-budget operation, and not one in a lab with lots of microwave gear.

NASA, at least, has a microwave source with the right frequency. But they report "researchers were now working on a new integrated analytical tool to help separate EmDrive thrust pulse waveform contributions from the thermal expansion interference". That indicates this thing is still way too close to the noise threshold.

Back when cold fusion was taken seriously, I went to a Stanford talk where a physicist described their attempts to replicate the experiment. At first, he said, they had the apparatus surrounded with radiation detectors and alarms, in case it produced a dangerous burst of neutrons. After a while, they realized that wasn't going to happen. They discovered that the effect being measured was about twice background radiation. Then they discovered that people moving around the apparatus affected neutron readings by more than the measured amount. (Humans are mostly water, and thus reflect neutrons.) Finally they moved the experiment to a "neutron cube" built from lead bricks, where the background radiation was very low. The measured neutron readings went way down. That's what it's like when a phenomenon is near the noise threshold.


The German team did it for a BBC documentary on "junk science" - they were looking for a null result, so while I take claims that the tech works with a pinch of salt, the same applies here.

Everybody acts to their own interests.


Ah yes, the typical "let's prove this doesn't work" approach: do it half-assed and then say "see, it doesn't."

It is possible to "disprove" a lot of known science with those kinds of experiments.


My understanding was that the report didn't show a null result:

The device produced positive thrusts in the positive direction and negative thrusts in the negative direction of about 20 micronewtons in a hard vacuum, consistent with the low Q factor.

Besides being tested horizontally in both directions on the torsion pendulum, the cavity was also set upwards as a "null" configuration. However, this vertical test intended to be the experimental control showed an anomalous thrust of hundreds of micronewtons that could be caused by a magnetic interaction with the power feeding lines going to and from liquid metal contacts in the setup.

(from Wikipedia)

Am I misreading this somehow?


> That's what it's like when a phenomenon is near the noise threshold.

That's not really noise level, that's just unknown systematic errors.


He just means there are all kinds of piddly effects that will throw it off. If you switch the thing on and get a huge effect size with high signal-to-noise, you don't need to worry about stuff like who is standing near the tank, other than to worry that they're being irradiated with fast neutrons.


If you're that low-budget why would you not just size the cavity to match a commercial microwave oven magnetron?



