Friday, February 25, 2022

Fusion, the Fuel of the Future

 Did it ever occur to you that you could run an ordinary car engine -- cylinders, pistons, crankshaft and all -- with no fuel? 

No problem! You just install really heavy-duty spark plugs, and as the piston reaches the top of the compression stroke in each cylinder, fire off a huge spark -- think of it as a small bolt of lightning. The air doesn't care where the energy comes from, just that it gets hot and expands to push the piston back down. As long as you dump as much energy in through the spark as you would have gotten from oxidizing fuel, the motor hums right along.

Of course, this would be a horribly inefficient way to use electricity to turn a shaft; existing electric motors are much, much better. The point of the mental exercise is to get you to compare the amount of energy used by the spark with the amount produced by burning the fuel -- and to realize that they might actually be comparable.

In fact, that's what is going on in pretty much all current fusion experiments. So far, nobody has managed to get as much energy out of the fuel as they put in with the spark. The latest big buzz from the JET torus managed about 70%. However, with the current amount of interest and investment, it seems likely that someone will hit -- or even exceed -- 100% sometime this decade.

When you hear about that, what will it mean? 

It will be a nice bit of science, but not so much for energy. Let's go back to the car engine. 70% or so of the energy produced by the burning fuel is simply thrown away as heat. Then, of the mechanical energy you do get, a lot goes to overcoming friction in the engine, turning a generator to produce more sparks, even pumping coolant through the engine so it doesn't melt.

We don't really know in its fullness how the energy from a fusion reactor is going to be turned into (say) electric power, but we can say with fair assurance that it's not going to be cheap, easy, or efficient. So even when the magic 100% figure gets achieved (and sustained for longer than 5 seconds!), it's virtually certain that you have at least another decade to wait before there is any fusion power flowing on the grid.
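
To make that concrete, here is a rough back-of-the-envelope sketch. The efficiency numbers are placeholder assumptions of mine, not measurements from any particular machine, but they show why 100% in the plasma is nowhere near breakeven at the wall plug:

    # Rough sketch: even at "scientific breakeven" (fusion energy out equals
    # heating energy delivered to the plasma), the plant is still a big net
    # consumer of electricity. All efficiency figures below are assumptions.
    Q_plasma = 1.0              # fusion energy out / heating energy into the plasma
    heating_efficiency = 0.4    # assumed: grid electricity -> energy reaching the plasma
    thermal_to_electric = 0.35  # assumed: steam-cycle conversion of fusion heat to electricity

    electricity_in = 1.0                                  # one unit of grid electricity
    plasma_heating = electricity_in * heating_efficiency  # energy actually heating the plasma
    fusion_heat = plasma_heating * Q_plasma               # heat produced by fusion
    electricity_out = fusion_heat * thermal_to_electric   # converted back to electricity

    print(f"electricity out per unit in: {electricity_out:.2f}")   # about 0.14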


Saturday, March 13, 2021

Foo Fighters

Interesting post from Robin Hanson about how the universe may be full of incompetent aliens who may have randomly been sending UFOs to us for centuries but never gotten anything done. I agree with most of Robin's points about how our power structures are enormously wasteful, sort of like the ancient Chinese empires, and that we shouldn't really expect our civilization, or the aliens, to do any better on average.

He uses the "Foo Fighters" seen by WWII aviators as an example of something that may have been alien craft:

... pilots flying over Germany by night reported seeing fast-moving round glowing objects following their aircraft. The objects were variously described as fiery, and glowing red, white, or orange. Some pilots described them as resembling Christmas tree lights and reported that they seemed to toy with the aircraft, making wild turns before simply vanishing. Pilots and aircrew reported that the objects flew formation with their aircraft and behaved as if under intelligent control, but never displayed hostile behavior. However, they could not be outmaneuvered or shot down. 

The main problem with taking these as alien spacecraft, though, is that I have seen one of them myself.

And I know what it is.

It was a couple of decades ago. We were on a cruise ship, sailing up the Canadian/Alaskan Inner Passage, about a night out of Skagway. We had splurged for the trip, so we had a top-deck cabin with a balcony (and a butler). Just about bedtime, on the night in question, I stepped out onto the balcony for a breath of fresh air before turning in.

It was slightly overcast and very still. I could see shore lights in the distance, and the ship made steady progress through a sea that was almost like glass. There was no moon.

I looked forward. There was a glowing disc hovering maybe 200 feet over the bow. It dashed out in front, almost like a dog running in front of a car. It came back. It danced around. It made circles around the ship. It stopped overhead, never quite standing still, but I got a good look at it. Featureless, round, glowing softly.

It was completely silent.

I watched it for maybe 10 or 15 minutes. Then it disappeared. I went back in, mentioned to my wife that I had just seen the most amazing thing, and went to bed.

What could it have been? The only thing I have seen even vaguely like it was the spot from a searchlight dancing on the bottom of a cloud layer. But the ship didn't have the right kind of searchlight; it couldn't have done the fine dancing; the spot would have changed shape as it tilted low to go far out; and the beam would have raked the deck as the spot dashed from stem to stern, so I would have seen it. Nothing like that happened.

But it was a clue. Something like the searchlight had happened, but not from below. The overcast layer was thin, of a kind I have seen many times flying. We were in northern waters, and as it turned out, some people in Skagway had seen aurora that night.

We were on an enormous iron object, moving, and that does interesting things to the Earth's magnetic field. In particular, it can focus a tendril out of an otherwise diffuse electron flow, as seen here:


So the ship had gone through what would otherwise have been a nearly invisible aurora borealis, and concentrated it into visibility as it pierced the thin cloud layer. If you've played with a plasma ball like this, you will be very familiar with the character of its motion and why it seemed to be interested in the ship.

And it sounds a lot like the descriptions of the "Foo Fighters."



Sunday, March 7, 2021

A Complex Treasure

In the wake of this recent work, there has been a resumption of the sporadic debate over whether imaginary numbers are real (pun intended) or not. Not so much by mathematicians, who tend to believe that they discover, rather than invent, the structures of thought they employ, but by physicists, who tend to think of the math they use as a different order of being than the actual physical world it describes.

From a pragmatic point of view it doesn't matter. If you have various mental tools, use the one that works best. The best meta-rule is Ockham's Razor.

In that spirit, I thought I would revisit a cute little puzzle that is often used to show how complex numbers are a bit simpler, niftier, or more appropriate than plain pairs of coordinates for solving a problem. The problem has nothing to do with the deeper properties of the quantum field; it's about a treasure map.

It goes like this:

You have come into possession of a chart from a pirate long dead. It shows the location of a small island in the Spanish Main, and on the back are directions to find the treasure. "Pace from the cairn to the ash tree, turn right and pace the same, driving a stake. Again from the cairn to the bay tree, turning left and pacing the same, and a stake. Midway between your stakes is the treasure to be found."

You get to the island and find the ash and bay trees. Unfortunately in the meantime the island has been visited by a gang of kleptotaphophiles, who stole all the stones of the cairn, leaving it unmarked. How can you find the treasure?

Or, since this is really a math puzzle, how can you use complex numbers to prove your solution is correct?

The trick with a math problem is often to gain an intuition as to what the solution is, and then use the tools you have to show it is right. 

In my experience you are much more likely to be taught how to use the tools, and less likely to be taught how to gain an intuition. With that in mind, let's look at the map.

On our island it just so happens that the bay and ash trees are on an exact east-west line, and are exactly two furlongs apart. (A mathematician would phrase that, "Without loss of generality we may assume ..."). 

Now here's the essence of gaining an intuition. Take boundary cases, cases where something goes to 0 or 1, anything to simplify the problem without changing it overall. In the case of the treasure map, for example, start with cases where you can easily see the answer without doing any numerical geometry.

Let's call the point midway between the trees "Zero" and see what would happen if it were the cairn.

Pacing and staking, we get a simple diagram that shows the treasure would be exactly one furlong due south of Zero. Okay, what's another way to simplify?

Just pick the cairn as being one of the trees. Then the pacings for that tree are of zero length, and you drive the stake right there. When you pace the other tree you get:

Whaddya know, one furlong due south of Zero. And obviously it works the same starting from the ash tree.

What else can we eliminate? How about the string between the stakes? If they are in the same place, the centerpoint, and thus the treasure, will be right there. Put the cairn at one furlong due north of Zero:

Yep, it's at the same spot, one furlong south. And for a final flourish, what if we put the cairn right on the treasure?

Well, by now you will have gained the intuition that wherever you put the cairn, the treasure will be in the same place. And you even know where the place is. 

You now have your conjecture, and can prove it fairly straightforwardly using complex numbers.
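
Here, for instance, is a quick symbolic check of the conjecture (a sketch using sympy; putting the ash tree at +1 and the bay tree at -1 on the real axis, and taking "turn right" to mean a -90 degree rotation, i.e. multiplication by -i, are my choices of convention):

    # Symbolic check: the treasure's location does not depend on the cairn.
    from sympy import symbols, I, expand, simplify

    x, y = symbols('x y', real=True)
    cairn = x + I*y                  # arbitrary, unknown cairn position
    ash, bay = 1, -1                 # the two trees, one furlong either side of Zero

    stake1 = ash - I*(ash - cairn)   # pace cairn->ash, turn right, pace the same
    stake2 = bay + I*(bay - cairn)   # pace cairn->bay, turn left, pace the same

    treasure = simplify(expand((stake1 + stake2) / 2))
    print(treasure)                  # -I : one furlong due south of Zero, whatever x and y are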

But while you are doing that, I will have jumped to a conclusion, dug up the treasure, and escaped.


Saturday, March 6, 2021

Bayesian Death Match

How likely are you to die of, say, covid versus, say, a heart attack next year? If you look at official figures you are met with a bewildering variety of metrics, and it can be confusing to interpret what they mean in the first place. Here is a website that uses CDC figures to tell you what your odds are of dying of various causes, looking kind of like this:

Cause of Death Odds of Dying
Heart disease 1 in 6
Cancer 1 in 7
Suicide 1 in 88
Fall 1 in 106

Of course, this doesn't mean you have a 16% chance of dying of heart disease next year; it means that when all is said and done, given that you died, the chance it was from heart disease was 16%. This is beginning to sound pretty Bayesian, so let's see if we can turn it into something more intuitive.

Manipulating probabilities by Bayes' Rule is powerful but often less than straightforward. Luckily there is a way to do it that is quick and easy. The trick is to think in terms of the logarithm of the odds ratio. You are used to thinking of probabilities as a number between 0 and 1; just think of them this way instead:

  • -30 -- a billion to one against; your chance of winning a rigged lottery
  • -20 -- a million to one against; your chance of being struck by lightning in a year
  • -10 -- a thousand to one against;  your chance of flipping ten heads in a row
  • 0 -- even odds
  • 10 -- a thousand to one for; the chance you won't flip ten heads in a row 

and so forth. What we have done is taken the base-2 log of the odds ratio. Why would we do this?

The reason for using this log-odds form is that we can apply Bayes' Rule simply by adding them. Here's an example: the logodds for a random average American to die (of any cause) next year is about -6 (roughly 1%). The logodds of having died in a car accident, given that you died, is -6.7. So the logodds of dying in a car crash next year is -12.7.
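
Here is the same arithmetic as a minimal sketch in Python (the helper names are mine; the numbers are the rounded ones above):

    import math

    def prob_to_logodds(p):
        """Probability (between 0 and 1) -> base-2 log of the odds ratio."""
        return math.log2(p / (1 - p))

    def logodds_to_prob(lo):
        """Base-2 log-odds -> probability."""
        return 1 / (1 + 2 ** -lo)

    prior = -6              # log-odds a random American dies (of anything) next year
    car_given_death = -6.7  # log-odds it was a car crash, given that you died

    posterior = prior + car_given_death     # Bayes' Rule by addition
    print(posterior)                        # -12.7
    print(logodds_to_prob(posterior))       # about 0.00015, i.e. 1.5 in 10,000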

But we can do better than that. You aren't a random American: you can improve your estimate of the prior by knowing, for example, your age. CDC says:

This translates to 

Age Logodds
20 -10.5
30 -9.6
40 -9
50 -8
60 -6.8
70 -5.8
80 -4.5
90 -2.9

So rather than start with just -6, you'd start with the number associated with your age. Or you could have a table by sex, or whatever other division you thought made a difference.

Then add the number for the thing you're worried about. Dying from a fall is -6.7. So if you're 20, your total risk from falls is -17.2; you'll outlive Methuselah. But if you're 90, it's -9.6, well within the range of things to worry about.

The numbers for heart disease, cancer, and covid all stand pretty close to -2.6. Do the math.
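
If you actually want to do the math, here is a quick sketch using the rounded age table above and the ballpark -2.6 figure; the results are only as good as those rounded inputs:

    # "Do the math": age prior plus cause log-odds, using the rounded numbers above.
    age_logodds = {20: -10.5, 30: -9.6, 40: -9, 50: -8,
                   60: -6.8, 70: -5.8, 80: -4.5, 90: -2.9}
    cause_logodds = -2.6   # heart disease, cancer, or covid (given that you died)

    for age in (20, 50, 90):
        total = age_logodds[age] + cause_logodds
        prob = 1 / (1 + 2 ** -total)
        print(f"age {age}: log-odds {total:.1f}, about {prob:.2%} next year")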

Saturday, February 20, 2021

Energy Mess in Texas

So we were having dinner with some friends last night and the topic of blackouts in Texas came up. It quickly became clear that people had not just an ideological slant to what they believed, but that it was so bad you wouldn't believe they were talking about the same situation. On one side, it was because the windmills froze. On the other, the windmills didn't matter because they were only 14% of the total; it was because Texans were gun-toting cowboys who had to do things their own way and refused to listen to the real experts.

So let's go to the EIA (the federal Energy Information Administration) and see what actually happened with energy generation in Texas:

The tan line across the top is natural gas-fired generation. Brown is coal, red is nuclear, and the green line is wind.

First and most obvious point: on 15 Feb every significant form of generation took a hit. Gas dropped about 10 GWh, coal and nuclear each dropped a notch, and wind went from varying between 5 and 10 GWh to varying between 0 and 5.

That's just how ice storms work: they freeze windmills; they freeze up the big piles of coal that are the reserves at those plants; they freeze up valves in gas pipelines and even safety sensors at nuclear plants. Furthermore, a huge unexpected cold snap diverts natural gas for heating, and there simply wasn't enough of it.

Unexpected? Indeed. Less than a month before, the forecast for Texas had been unseasonably hot:

This was from the Weather Channel, but they were using NOAA Climate Prediction Center numbers.

What actually happened? When push came to shove, Texas generating capacity was reduced by the storm and maxed out well below demand. They had no excess capacity to fall back on. The deep question here is: why not?

This essay by a retired electric utility planning engineer sheds some light on the subject. Most of the country has utility markets where people pay both for energy actually delivered and for capacity. You have to pay for capacity because it costs money to build the extra plants, pipelines, storage facilities, and so forth. No money, no extra capacity to use in emergencies such as ice storms.

It's like paying for seatbelts and airbags in your car. You hope not to have to use them, but...

But in Texas, there is no capacity market. It's pay for actual delivered energy only. The main reason is politics; the wind-generation sector in Texas -- most famously T. Boone Pickens, but plenty of others -- has huge clout there. Texas generates more electricity from wind than any other state, even California, and indeed more than almost any country.

But, and this is the crucial fact, wind has no excess capacity. Look at the graph again: from 14 to 15 Feb. wind power dropped from 10 GWh to 0. Not only did half the windmills freeze, but the wind stopped blowing. A capacity market in Texas would have made wind significantly more expensive, and was thus politically untenable.

So Texas went with the delivered-energy-only market, specifically to help wind. Which worked. And when times are good, it is cheaper, like a car with no airbags. But when the ice came, Texas had neither belt nor suspenders--and got caught with its pants down.


Thursday, February 4, 2021

Powerpaste, the fuel of the future

 Following in the theme of my ammonia posts, here's a new idea in hydrogen carriers as fuel for hydrogen fuel cells. It's from the Fraunhofer Institute, who call it powerpaste. It's basically magnesium hydride formulated in paste form. Add water and you get more hydrogen than you started with, the rest coming from the water. 
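
(The chemistry here is presumably the ordinary hydrolysis of magnesium hydride, MgH2 + 2 H2O → Mg(OH)2 + 2 H2, which is why you get twice the hydrogen the paste itself carries, with the other half coming from the water.)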



Unlike ammonia, the residue doesn't vanish as air and water; it's basically milk of magnesia, and would need to be recycled in a power plant. But it has (according to them) ten times the energy density of a battery, so it's a viable automotive power source. So to refuel, you swap out cartridges.

Given that they are pushing it for mopeds, I'm guessing that the full-cycle efficiency isn't high, but one imagines that could be improved over time.


Wednesday, February 3, 2021

Feynman could have saved your life

We have had Covid vaccines since March 2020, and yet we sat around all year, letting the pandemic develop into a major catastrophe, waiting for nothing but the permission of evil bureaucrats to use them.

Well, not quite. There is a huge difference between discovering the recipe for something in the lab and producing enough of it to protect billions of people. As we are seeing, nearly a year later the limited production of the vaccine is still a major bottleneck to public-scale inoculation. 

Some idiot on Twitter seemed to think that the problem was evil drug companies hoarding their recipes, and that this could all be solved by having everybody join in and produce vaccines. As often happens, the resulting Twitstorm provoked someone who knew something about it to speak up. In this case, it was Derek Lowe, pharma industry expert and blogger for Science magazine. Here is his post, Myths of Vaccine Manufacturing.

To sum up the key point of the blog, Lowe lists six major steps in producing a vaccine of the new mRNA type:

  1. Produce the appropriate stretch of DNA.
  2. Produce mRNA from your DNA.
  3. Produce the lipids that you need for the formulation.
  4. Take your mRNA and your lipids and combine these into lipid nanoparticles (LNPs).
  5. Combine the LNPs with the other components and fill vials.
  6. Get the vials into trays, packages, boxes, trucks, etc.

To make a long story short (do read the original and the mostly informative comments if you are interested), the major bottleneck is step 4. As Lowe puts it, "Everyone is almost certainly having to use some sort of specially-built microfluidics device to get this to happen .... Microfluidics (a hot area of research for some years now) involves liquid flow through very small channels, allowing for precise mixing and timing on a very small scale.... My own guess as to what such a Vaccine Machine involves is a large number of very small reaction chambers, running in parallel, that have equally small and very precisely controlled flows of the mRNA and the various lipid components heading into them. You will have to control the flow rates, the concentrations, the temperature, and who knows what else, and you can be sure that the channel sizes and the size and shape of the mixing chambers are critical as well. These will be special-purpose bespoke machines, and if you ask other drug companies if they have one sitting around, the answer will be “Of course not”. This is not anything close to a traditional drug manufacturing process."

So I went and looked at what these microfluidics machines for producing lipid nanoparticles (LNPs) looked like. What's the scale, how close to nanotech do you have to be, etc, etc.

(Image from ACS Omega 2018, 3, 5, 5044–5051.)
So here's a sketch of the kind of thing we are talking about, and the scale of the nanoparticles themselves, which tend to range from 20 to 100 nanometers in size.

Here's a closer look at the actual gadget:

Although it produces ~50 nm LNPs, the gadget itself is micro-scale, not nanoscale, technology. Such devices are generally made using photolithography on the same kind of machines used for computer chips. This is a tricky and complex process, with a lot of planning, work, and development between even a well-specified design and a usable product.

So what if we had a technology that could produce things like this right off the shop floor? If the scale were millimeters instead of microns, you could make one of these in an hour in your garage with a slab of MDF and a router. I could print one tenth of that scale on my 3D printer, and there are printers out there that could beat that by another factor of 10.

If we had full-fledged nanotech now, none of this would matter; after all, an LNP is just an arrangement of atoms. But reading about this bottleneck forcibly reminded me of this passage in Where is my Flying Car, Chapter 14:

Is it Worth Starting Now?

Surely, you will say, it would have been wonderful if back in 1960 people had taken Feynman seriously and really tried the Feynman path: we’d have the full-fledged paraphernalia of real, live molecular machinery now, with everything ranging from countertop replicators to cell-repair machines.

After all, it’s been 55 years. The 10 factor-of-4 scale reductions to make up the factor-of-a-million scale reduction from a meter-scale system with centimeter parts to a micron-scale system with 10-nanometer parts, could have been done at a leisurely 5 years per step—plenty of time to improve tolerances, do experiments, invent new techniques.

But now it’s too late. We have major investment and experimentation and development in nanotech of the bottom-up form. We have Drexler’s PNAS paper telling us that the way to molecular manufacturing is by protein design. We have a wide variety of new techniques with scanning probes to read and modify surfaces at the atomic level. We have DNA origami producing arbitrary patterns and even some 3-D shapes. We even have protein engineering producing the beginnings of usable objects, such as frames and boxes.

Surely by the time a Feynman Path, started now, could get to molecular scale, the existing efforts, including the pathways described in the Roadmap for Productive Nanosystems, would have succeeded, leaving us with a relatively useless millimeter-sized system with 10-micron sized parts?

No—as the old serials would put it, a thousand times no.

To begin with, a millimeter-sized system with 10-micron sized parts is far from useless. Imagine current-day MEMS but with the catalog and capabilities of a full machine shop, bearings that worked, sliders, powerful motors, robot arms and hands. The medical applications alone would be staggering. ...

That's where we should have been, at the very least. But of course we would still have to wait for permission.