The Wikipedia page on the Standard Model currently includes the diagram below:
- What do the arcs represent?
- Where is an arc missing?
- Which of the arcs is incorrect?
- Extra credit, Wikipedia history department:
How did one correction lead to both errors?
Why understanding seems stuck:
I count five kinds of nanotechnology, of which only three are called by that name. Of the three, one is a revolutionary prospect, one is a fantasy, and the third is mostly materials science. As for the other two kinds, one is the heart of today’s greatest technological revolution, while the other is the basis for progress toward the revolutionary prospect — but neither of these is called “nanotechnology”.
This may seem confusing, and it is. Indeed, people who think they know something about “nanotechnology” often have a lot to unlearn, and would be better off knowing basic physics and chemistry and starting from there. This situation makes it extraordinarily difficult to have a productive conversation about what really matters.
Here’s a compact summary in a nice, legible png image:
Please copy or link the above wherever it might surprise someone.
In the Economist: “Rise of the robots: Prepare for a robot invasion. It will change the way people think about technology”.
The robotics revolution is, of course, riding the exponential wave of today’s leading nanotechnology: digital nanoelectronics. Today’s robots give only a taste of what nanomechanical technologies will enable through radical improvements in the cost and performance of physical products.
We really need to think about the real future — how to manage a world with pervasive robotics, pervasive surveillance, and radical material abundance (there’s a book about that).
In a recent post, the always intelligent and provocative Cosma Shalizi notes John D. Norton’s argument against (nearly) thermodynamically reversible computation, but Norton’s argument is mistaken.
In his paper “The End of the Thermodynamics of Computation: A No-Go Result,” Norton correctly states that “In a [nearly] thermodynamically reversible process, all component systems are in [nearly] perfect equilibrium with one another at all stages,” and then discusses systems in which “Fluctuations will carry the system spontaneously from one stage to another [and as] a result, the system is probabilistically distributed over the different stages.”
But the stages of a computation need not be in equilibrium with one another, and hence need not be subject to back-and-forth fluctuations. Instead, a time-varying potential can carry a system deterministically through a series of stages while keeping it at nearly perfect thermodynamic equilibrium at each stage. In other words, the state of the system need not be probabilistically distributed over the different stages.
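For a sense of the energy scale at stake (an illustration, not part of Norton’s paper): Landauer’s principle sets k·T·ln 2 as the minimum dissipation per *irreversible* bit erasure, and this is the cost that nearly reversible computation can, in principle, approach or avoid. A minimal sketch of the arithmetic:

```python
import math

def landauer_bound(temperature_kelvin: float) -> float:
    """Minimum energy (joules) dissipated per irreversible bit erasure,
    per Landauer's principle: E = k_B * T * ln 2."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)
    return k_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K) the bound is roughly 3e-21 J per bit;
# logically reversible operations are not subject to this floor.
print(f"{landauer_bound(300):.3e} J per erased bit")
```

This is many orders of magnitude below the switching energies of today’s nanoelectronics, which is why the thermodynamics of computation matters chiefly as a statement about ultimate limits.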
This is an example of a scientist describing an unworkable solution to a problem and then asserting that no solution will work, when workable solutions are already known. Richard Smalley did a similar but more damaging disservice to atomically precise fabrication by inventing and rejecting an unworkable concept involving exotic atom-plucking “fingers,” while ignoring a decades-old literature that described the now-mundane concept of guiding the motion of reactive molecules.
TL;DR: The standard view of the thermodynamics of computation is correct.
Nanoelectronics is the nanotechnology behind the drone technologies that threaten to upend the power relationships underpinning modern civilization:
Drones will cause an upheaval of society like we haven’t seen in 700 years
Noah Smith is often worth reading: Noahpinion.com
How to roll back the impact of agriculture here.