Wednesday, August 23, 2017

Tiny Quantum Engines

The previous post mentioned some work on minimal mathematical models for combustion engines, and ended with a link to our first published paper on this subject. That's the paper in which we introduce the class of dynamical systems that we call Hamiltonian daemons. The Hamiltonian part of this name refers to William Rowan Hamilton's formulation of classical mechanics. Hamilton's restatement of all Newton's F = ma stuff dates from 1833 but is still the dominant dialect in physics today, mainly because it adapts well to quantum mechanics.
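
For readers who like symbols: in Hamilton's dialect, F = ma becomes a symmetric pair of first-order equations for a coordinate q and its conjugate momentum p,

```latex
\dot{q} = \frac{\partial H}{\partial p}, \qquad \dot{p} = -\frac{\partial H}{\partial q},
```

where the Hamiltonian H(q, p) is usually just the total energy. For H = p^2/2m + V(q), the pair reduces to Newton's second law.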

The daemon part of our name for our systems is partly meant to recall Maxwell's Demon. The devout Presbyterian Maxwell wasn't thinking of an evil spirit: he meant the kind of benevolent natural entity that was the original meaning of the Latin word daemon (borrowed from the Greek daimōn). Maxwell's daemon was an imaginary tiny being who could manipulate the individual atoms in a gas. Physicists still think about Maxwell's daemon today, as a way to express questions about the microscopic limits of thermodynamics.

Our daemon name is also partly in analogy to Unix daemons, which are little processes that operate autonomously to manage routine tasks without user input. Hamiltonian daemons are mathematical representations of small, closed physical systems which, without any external power or control, can exhibit steady transfer of energy from fast to slow degrees of freedom. In this way they are the ultimate microscopic analogs of combustion engines, and learning about them may teach us about the role of thermodynamics on microscopic scales.
A simple Hamiltonian daemon: like a tiny car driving uphill. 
(The figure in our published paper does not have the little 
pictures of cars. Physical Review is too serious for that.) 

To illustrate how daemons work, we took a simple model that resembles a tiny car trying to drive up an infinite hill. The car's engine has to be started the old-fashioned way, by push-starting, so the car begins already rolling uphill at some speed. Since it's going uphill, it slows down; some of the time (the dashed curve in the figure), the engine fails to ignite and the car just keeps slowing down until it rolls back down the hill.

Other times, however, the engine catches (solid curve). The car then keeps on driving uphill at a steady speed, expending high-frequency fuel. Since our simple hill is infinite, when the fuel runs out the car will eventually roll back down, but if we add the complication of making the hill level off at some point, the car can climb to the top and then drive on to some goal. 
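
For readers who want to tinker, here is a minimal numerical sketch in Python of the kind of system involved. To be clear, this is not the Hamiltonian from our paper: the toy coupling term and all the parameter values are hypothetical stand-ins, chosen only to show how little machinery is needed to write down a closed system with one slow coordinate (the car) and one fast degree of freedom (the fuel).

```python
import numpy as np

# A hypothetical toy Hamiltonian (NOT the daemon model from our paper):
#   H = p^2/(2m) + m*g*q + (P^2 + w^2 * Q^2)/2 + eps*Q*cos(k*q)
# (q, p): slow "car" position and momentum on a uniform slope
# (Q, P): fast "fuel" oscillator with frequency w
# eps*Q*cos(k*q): an illustrative coupling; near the resonance
#                 k * (dq/dt) ~ w, energy can pass between the two.
m, g, w, eps, k = 1.0, 0.1, 5.0, 0.2, 5.0

def rhs(s):
    """Hamilton's equations for the toy H above."""
    q, p, Q, P = s
    return np.array([
        p / m,                                 # dq/dt =  dH/dp
        -m * g + eps * k * Q * np.sin(k * q),  # dp/dt = -dH/dq
        P,                                     # dQ/dt =  dH/dP
        -w**2 * Q - eps * np.cos(k * q),       # dP/dt = -dH/dQ
    ])

def rk4_step(s, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = rhs(s)
    k2 = rhs(s + 0.5 * dt * k1)
    k3 = rhs(s + 0.5 * dt * k2)
    k4 = rhs(s + dt * k3)
    return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Push-start the car uphill (speed just above the resonant speed w/k = 1),
# with some energy already stored in the fuel oscillator.
s = np.array([0.0, 1.2, 1.0, 0.0])
dt, steps = 1e-3, 100_000
for n in range(steps + 1):
    if n % 10_000 == 0:
        print(f"t = {n*dt:6.1f}   height q = {s[0]:8.3f}   speed = {s[1]/m:7.3f}")
    s = rk4_step(s, dt)
```

Whether a coupling like this actually lets the engine "catch," rather than just jostling the car as it passes through resonance, is precisely the delicate question the paper analyzes; the sketch only shows the ingredients.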
Schrödinger's Cat 
as a power source?

More recently we thought, "Hey, really small engines should be quantum mechanical." What would happen if you tried to extract work from a fuel-powered engine that was entirely quantum? What would a quantum Hamiltonian daemon be like? Well, we found out. 

It turns out that the quantum daemon behaves rather like a random daemon. Sometimes the engine ignites, and sometimes it doesn't—even if you do everything exactly the same every time. The quantum daemon also burns its fuel quantum by quantum, and it keeps driving uphill by giving itself a series of kicks that suddenly bump up its speed. After every such kick, there is a chance that the quantum engine will spontaneously stall, and refuse to run further, even if there is fuel left. So over time the quantum probability distribution for the height of the car develops a series of branches, as in the figure here.
The quantum daemon burns fuel in quantized steps, 
and can randomly stall.

It's not just randomness going on at every kick, though. If you look closely you can see interference fringes in the probability distribution. Since the whole daemon system is closed, the evolution is actually all unitary. Each branch of the figure represents a different possible trajectory of the car, and all the branches remain in coherent quantum superposition. Since more fuel is left if the engine stalls sooner, and less if it runs longer, the superposition of all the branches is a total quantum state with high entanglement between fast and slow degrees of freedom. And since the number of branches grows over time as the quantum daemon operates, the von Neumann entropy of the slow degrees of freedom increases.
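
For the record, the entropy in question is the standard von Neumann entropy of the slow degrees of freedom, computed from the reduced state you get by tracing out the fast ones:

```latex
\rho_{\mathrm{slow}} = \mathrm{Tr}_{\mathrm{fast}}\, |\psi\rangle\langle\psi| ,
\qquad
S = -\mathrm{Tr}\left( \rho_{\mathrm{slow}} \ln \rho_{\mathrm{slow}} \right).
```

The total state stays pure the whole time; S grows only because the branches entangle the car ever more thoroughly with its fuel.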

Readers who are familiar with quantum mechanics may want to read more in Physical Review E 96, 012119 (2017), or in the free and legal arXiv e-print version. Our first paper on daemons (Phys. Rev. E 94, 042127 (2016)) is also on arXiv.

Friday, January 13, 2017

Humphrey Potter and the Ghost in the Machine

Humphrey Potter adds strings to Mr Newcomen's engine so that he can go play.
The first steam engines were slow-working beasts that needed constant human tending. To work its minutes-long cycle, the first Newcomen steam engine needed a human hand to open and close valves for cold water and steam. The machine provided great power, but someone had to control it.

At some point, however, some brilliant mechanical mind noticed that the machine was, after all, creating its own motion. Why not connect the moving piston to the valves, and let the engine run itself?

Some sources even claim to identify the person who first made this brainstorm work: a young boy named Humphrey Potter, who was paid to operate a Newcomen engine by hand. Young Potter hooked up a system of “strings and latches” that made the machine itself do his work for him. Then he ran off to play.

Apparently the sources for this story are not considered reliable. I can’t find any of them myself, and the modern texts that mention Humphrey Potter refer to his story as a legend. That’s a shame, because I’d like to think I could put a name to the person who first made a useful engine run all by itself. 

Apart from the practical advantages of getting a machine to run itself, the scientific point of lazy Potter's clever trick is that it took intelligence, in any form, out of the loop of doing work. The entire engine process, from fuel combustion to pumping water, was now a purely mechanical operation. Humphrey Potter banished the ghost from the machine. He made it clear that everything occurring in the marvelous process that turned lumps of coal into useful work was occurring strictly under the basic laws of physics. It all ran by itself.

In one way we take this insight for granted now, and may even extend it to processes more complex than lifting weight by burning coal, processes like life and consciousness. Yet in a very practical sense science still has not fully absorbed the point that engines can run as closed systems, without external power or control, and without any ingredients beyond basic physics. Engineers invoke higher-level concepts like pressure and temperature, and while these are clearly valid, they leave a lot of details hidden under the hood. Theoretical physicists analyze parts of the whole process, like how gas particles adapt their motion when a piston moves in a predetermined way, but they do not simultaneously consider how the piston motion is itself determined by the gas particles bouncing off it. We don’t really believe there is any ghost in the machine, but when we have to explain how the machine works, somehow we keep sneaking the ghost back in, in some form or other.

If there's something strange
in your phase space neighborhood ...
At least, until now. Quite recently we have discovered a very simple mathematical model for a minimal kind of combustion engine that runs as a closed system under basic physics. So we now have a bare-metal, first-principles model for an engine. Its operation is in some ways very reminiscent of a steam engine, but in other ways it is radically different. We are hoping that it may teach us about the microscopic roots of thermodynamics; someday, perhaps, it might even be the basis of a whole new class of power nanotechnology.

Readers who know Hamiltonian mechanics may enjoy our first paper on this subject. "Hamiltonian analogs of combustion engines: A systematic exception to adiabatic decoupling" is published in Physical Review E 94, 042127 (2016), and is also available in e-print form at https://arxiv.org/abs/1701.05006.

Wednesday, October 1, 2014

Newton versus Heisenberg

Death of a microbe
The Heisenberg Uncertainty Principle has become famous well outside quantum physics, but it is often cited in garbled form. I once found it mentioned in a textbook on social science research methods, where it was defined as the fact that electron microscopes can damage samples when they observe them by electron bombardment. Physicists roll their eyes at such misunderstandings, but I think that physicists themselves often misunderstand the Uncertainty Principle, by confounding Heisenberg’s specifically quantum mechanical relation with a principle of measurement that goes back to Newton. 
What the Uncertainty Principle really means, in practical terms, is this: if you construct an apparatus to control one experimental property more tightly, so that its variations between different repetitions of the experiment become smaller, then past a certain point there will always be some other property that becomes correspondingly less well controlled, so that its run-to-run variations become wilder. The ‘certain point’ at which this trade-off sets in represents a degree of precise control so high as to be quite unattainable anyway in the macroscopic world. So although the Uncertainty Principle is profound, it is really irrelevant to fields like social science research.
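
In symbols, the textbook case is the trade-off between the spreads of position and momentum over many repetitions of the same preparation:

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2} .
```

With ħ of order 10^-34 joule-seconds, macroscopic experiments operate many orders of magnitude above this bound.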

Observing something means letting it act on you.
What even many physicists think the Uncertainty Principle means, however, is something that really is widely relevant: the fact that no measurement can ever be purely passive, but always affects the thing being measured. This principle is both true and important, but it is not specifically quantum mechanical.

Observation is a physical process. A meter can only register the position of an object if there is some interaction that makes the object’s position act on the meter. Newton told us, long before Heisenberg, that this means any meter is also going to react upon the thing it measures. Such ‘observer effects’ are apt to be important when large meters measure tiny things; the Heisenberg Principle, however, is not this, but an additional complication in microscopic measurements.

In the early days of quantum mechanics, critics of the new theory tried to argue that the Uncertainty Principle was not self-consistent, by describing hypothetical experiments that would obey the Principle in each individual process, yet still lead to an indirect violation of the Principle as an end result. These arguments all had subtle flaws, and the most famous flaws involved reaction effects. Thus the only really solid connection between the Uncertainty Principle and measurement reaction is historical.

Newton observes Heisenberg stealing credit for his ideas.
Newtonian reaction in physical measurements is a distinct concept from Heisenberg Uncertainty, but it does at least seem that one must get the former right in order to understand the latter. So perhaps there really is some deep connection between them. Until that connection comes to light, however, anyone who wants to relate observer effects in general to a basic principle of physics should really be citing Newton, not Heisenberg.

Monday, October 28, 2013

Opposites: Open and closed

Are they really so opposite?
As Niels Bohr used to say, the opposite of an ordinary truth is a falsehood, but the opposite of a profound truth is another profound truth. I offer this opposite pair:

1) All systems are open.
2) All systems are closed.

A closed system is a set of physical things which can be regarded as isolated from the rest of the universe. An open system, in contrast, is affected by things outside itself, even if those things are not directly observed. So these statements are certainly opposite. How are they both true?

What defines 'the system'?
Experiments try to isolate variables, but we can never achieve perfect isolation. Vacuum chambers are made of steel walls, and over time a few stray gas atoms always percolate in and out of tiny cracks or pores in the steel surface. No laboratory building is perfectly insulated from vibrations. High-energy cosmic rays can pierce any barrier; and so on. It may be possible to achieve isolation that is excellent for all practical purposes, but strictly speaking, all physical systems are open.

If we really want to speak strictly, however, then the very concept of a ‘system’ is inherently an approximation. There is really only one system: the universe. The universe as a whole is closed by definition, so all systems are closed. Of course, it is no less impossible in practice to describe the whole universe than it is to seal off a portion of the universe in perfect isolation from the rest. It is often possible, however, to describe a very large closed system.

And indeed this is precisely what we normally do, to identify the distinctive physical features of an open system: we analyze a large closed system, and then discard all the information that does not refer directly to the small sub-system that represents our ‘open system’.
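
In quantum mechanics this discarding step even has a standard name, the partial trace: from the state ρ of the large closed system, one keeps only the reduced state of the subsystem S by tracing over the environment E,

```latex
\rho_S = \mathrm{Tr}_E \, \rho .
```

The classical analog is just integrating the full probability distribution over all of the environment's variables.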

So any system is open, if we want it to be: it is only a matter of how low we set our threshold for ignoring slight influences from external factors. Conversely, however, any system is closed, if we want it to be: it is only a matter of how large we are willing to make our system, to bring relevant external factors within its frame. The distinction between open and closed systems is an important one, but it is not a distinction between two different ways things can really be. It is a distinction between two different ways of thinking about things. Both ways can be good ways of thinking. Both truths are profound.


An engine would still run inside a large box.
It seems to me that too many physicists today have lost sight of the second truth, however. The most profound mystery that physics still faces is the origin of irreversibility. We don't understand why we can't remember tomorrow. And whatever is going on in quantum measurement, it seems to be an empirical fact that all quantum measurement devices rely crucially on thermodynamically irreversible processes to achieve their extreme amplification. No one can find a clear explanation of irreversibility within closed-system Hamiltonian mechanics, but few people want to accept that our mighty science is still stumped by such a basic question after a century of breakneck progress, so most people like to think that the open-system generalization must be the simple solution.

Open systems can't be the basic explanation of irreversibility, because all systems are also closed. Whether or not a system is open is not a physical fact, but an arbitrary choice of perspective in deciding what to include within the system. So the openness of physical systems cannot make a fundamental difference to anything; anything that can be explained as an open system must also be explicable as a larger closed system. A steam engine would still run, at least for a good long time, inside a big impermeable box.

Tuesday, February 19, 2013

A Cup of Heat, Monsieur?

Pierre-Simon de Laplace and Antoine-Laurent de Lavoisier
thought that heat was an invisible liquid.
Eighteenth-century physicists thought of heat as something much like electricity: an invisible fluid that could flow through other materials or soak into them. They named this hypothetical fluid ‘caloric,’ and it was thought to be a distinct form of material, like air or water, only different. If an object had absorbed a lot of caloric, like a sponge soaking up water, then it was hot. If the caloric drained out, the object cooled.

This was a sensible theory. We all know that objects can become electrically charged and that this changes their properties. A charged balloon can stick to the ceiling, or make your hair stand on end; a charged metal sphere can give you a zap. Electric current really is an invisible fluid, composed (most usually) of material particles called electrons. Most of the time they are bound up in atoms, but when they come loose they can flow into things, or out of them or through them. Things can become charged by soaking up extra electrons.

Soaking up electric charge. 
(Image by Wikimedia user Dtjrh2
used under Creative Commons license.)
In a similar way, it would seem, objects can also soak up heat. Hot objects may expand or cause burns, just as charged objects stick to ceilings or shock people. Heat flows from high to low temperatures, just as electricity flows from high voltage to low. Appealing as it is at this basic level, the analogy turns out to work well even in finer detail. Antoine Lavoisier and Pierre Laplace developed an extensive body of caloric theory that was able to explain heating and cooling and many other thermal phenomena with quantitative accuracy. 

So in the eighteenth century it only made sense to think of heat as something similar to electrical charge. In the following century, heat engines and electric motors would be developed in parallel, and engineers would still think of them in similar terms. Today, once again, people are deciding whether to have a car powered by an electric motor instead of a combustion engine, and the differences seem to be ones of practical detail.

Today we no longer believe that heat is a material fluid, however. Why not? It's not really as clear-cut an issue as textbooks often make it seem, because today our concept of matter is not as simple as it once was. We know that not even electrons are really these indestructible little specks of hard stuff: they can be created and destroyed in high-energy collisions. And in a lot of ways we still treat heat as if it were a material fluid.

The bottom line, though, is that even though electrons can be born and die, electric charge can't. If electrons appear or disappear they do so together with positrons, so that the net change in charge remains zero. The only way for an object to become charged is for charge to flow into it from elsewhere—or for opposite charge to flow out of the object. There is no way to simply create charge from stuff that is not charge. Heat, in contrast, can flow into or out of things—but that's not the only way to get heat. One can also create heat, without importing it or already having it. It's called friction.
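
In modern language the contrast can be put schematically like this: charge density obeys a continuity equation with no source term, while the balance equation for thermal energy carries a production term that friction can make strictly positive.

```latex
\frac{\partial \rho}{\partial t} + \nabla \cdot \mathbf{J} = 0
\quad \text{(charge: never created)},
\qquad
\frac{\partial u}{\partial t} + \nabla \cdot \mathbf{J}_Q = \sigma \geq 0
\quad \text{(heat: production allowed)}.
```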

Rub your hands together. Feel it? That's heat. 

You didn't just create any new material substance from nothing. Big Bangs and particle colliders aren't as easy as that. So heat is not a material fluid. What is it, then?

Whatever heat is, you've just made some. It's right there in your hands.

Thursday, February 14, 2013

Fire Glows

It's not just bright.
Humans discovered fire a long time ago, but for most of that time we only used it for warmth and light and cooking, rather as Bilbo Baggins used his magic ring for years just to avoid unwanted callers. Only in the 18th century did James Watt show up to play Gandalf, and reveal that our curious little trinket was the One Ring to rule them all. Fire has enormous power.

Even after centuries of technological progress since Watt, we still find it very hard to beat combustion as a source of power. Burning a tank of fuel releases enough energy to lift cargo all the way to the Moon, even with the horrible inefficiency of a rocket engine. Combustion provides energy, as one says, to burn. Why is fire such a tremendously greater power source than, say, clockwork springs or a windmill? I’ve never seen a clear answer to this question in any physics text, but I think I have found a succinct one of my own. 

Fire glows.
Light oscillates really fast.
The fact that fire glows demonstrates that fire is releasing energy from motions (of electrons in chemical bonds) with frequencies in the range of visible light. Those are very high frequencies, around 10^14 cycles per second. As Planck taught us, energy is proportional to frequency. So if human energy needs are for motion at up to a few thousand RPM, mere hundreds of cycles per second, combustion lets us tap energy resources on a scale greater by a factor of a million million. Combustion delivers so much energy because molecular frequencies are so high.
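
The arithmetic behind that factor, using Planck's relation:

```latex
E = h\nu, \qquad
\frac{\nu_{\text{light}}}{\nu_{\text{engine}}} \sim \frac{10^{14}\ \text{Hz}}{10^{2}\ \text{Hz}} = 10^{12},
```

which is the factor of a million million.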

This is what an engine somehow does.
It isn’t easy to gear all that power down by a factor of 10^12 so we can use it, though. Electrons whir around in molecules far too fast for our eyes to follow. We can’t just throw a harness over them. Even if we could, they are very light in weight. They bounce off things, rather than dragging them along. To tap them for power, we need some clever way of gently bleeding off their enormous but very rapidly whirring energy, a tiny bit at a time. There's more to it than just installing an awful lot of tiny gears.

Getting fire to do work means transferring power across a huge frequency range. That's what thermodynamics is all about. The reason that thermodynamics doesn’t seem very much like the rest of physics is that energy transfer across a huge frequency range is an extreme case, in which certain otherwise obscure aspects of physics become very important. That makes them important in general, though, because high frequencies can deliver so much power. It's well worth learning how thermodynamics really works.

Raising Water by Fire

James Watt dramatically improved the steam engine, but he didn’t invent it. In his time, steam engines were already a practical and economical success. The machines of Thomas Newcomen and Thomas Savery had already begun the new era in human technology. 

Savery had a head for marketing as well as for steam. In 1702 he produced a pamphlet advertising his device as “An Engine to Raise Water By Fire”. His description may have been poetic, but it was literally exact. His engine pumped water by burning coal. Its killer application was draining coal mines. 

Humans may have discovered fire in distant prehistoric times, but the really useful thing about fire was only discovered in the 18th century. Never mind cooking or smelting metal or scaring wolves: fire can raise an awful lot of water. And if you can raise water, you can do pretty much anything, because raising water means you can exert force.

Savery’s and Newcomen’s engines were crude and simple, and by that I don’t mean that they were primitively made, rattling too much or leaking steam. They were just stupid designs, compared to Watt’s machines. They didn’t even use steam pressure to actually do their work, but just let the steam balance atmospheric pressure. Then they condensed the steam, by shooting in cold water, and let the suddenly unbalanced atmospheric pressure do the work. Savery’s engine didn’t even turn any moving parts, but just sucked water through pipes. It wasn’t so much more than a proof of concept, like the aeolipile.

Hindsight is 20/20, of course, and it’s not really fair to call Savery and Newcomen stupid. Watt’s proper steam-pressure engines also needed stronger boilers. The point is that even the crudest engines were such a quantum leap in power technology, compared to wind, water, or animal power, that they rapidly changed the world. In effect they turned lumps of coal into unprecedentedly huge amounts of practical work. Up until 1775, the Russian navy had been using two enormous windmills to drain its dry docks at Kronstadt; each time they drained the docks in order to work on a ship, the draining job took a year. When they installed a single Newcomen engine, it did the job in two weeks.

With coal-fired steam engines, the human capacity to exert physical force suddenly soared. Even today, the biggest problem with changing to power sources other than combustion is that fire can provide so much more power than, say, sunlight or wind. We humans keep thinking wistfully about switching away from combustion, to some form of clean energy, but we really want to maintain our current energetic lifestyle. We're like a big city lawyer who wants to quit the firm and become a social worker, but also wants to keep up the mortgage payments.

Why is fire so very good for raising water? I have some thoughts on this, based on the fact that fire glows.