Tagline: A Brief History of Hype and Failure
My rating: 82/100
See Book Notes for other books I have read. If you like my notes, go buy it!
- At the time, Zeppelins had twice the range of the most common airplane, far more spacious cabins, and a better safety record.
- Charles F. Kettering (1876-1958), longtime (1920-1947) head of research at General Motors, insisted on calling the lead additive in gasoline “ethyl gas”, an intentionally deceptive name.
- People are selfish. General Motors did not pursue other known alternatives for preventing engine knocking because they knew they could not control them. They pushed leaded gasoline because they knew they could make money.
- One-third of the US corn crop goes toward the production of ethanol for gasoline additives.
- Leaded gasoline was eventually stopped by the US Clean Air Act of 1970, which was not written to reduce lead but smog. People were more concerned about how the air looked than they were about the poison in their lungs. #cognitivebias #availabilityheuristic
- Helium airships are very unlikely to be practicable now. In the US, recent domestic helium consumption has been about 40 million cubic meters a year; if all of this annual use went into airships, it would be good enough for about two hundred large (Zeppelin-like) structures.
- No single factor [in the failure of nuclear energy to become dominant] was more important than the rapid retreat of electricity demand after the 1970s.
- The Hyperloop concept of travel through vacuum tubes was proposed as a serious solution for transportation as early as 1810.
- In 2022 there were no fully self-driving cars; fewer than 2% of the world’s 1.4 billion motor vehicles on the road were electric, and they were not “green”, as the electricity required for their operation came mostly from fossil fuels: in 2022 about 60% of all electricity worldwide came from burning coal and natural gas.
- Electric-powered aircraft are simply not practical for commercial purposes: the energy density of common kerosene jet fuel is about 24 times that of the best batteries today.
- In the year 2000, fossil fuels supplied 87 percent of the world’s primary energy, while in 2020 that share was 83 percent – an annual reduction of about 0.2 percentage points. But now we are told that we should end our dependence on carbon by 2050, and going from 83 percent to zero in thirty years would require cutting the global fossil share by about 2.75 percentage points every year, a rate nearly fourteen times faster.
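My note: the arithmetic behind this claim, worked out (a rough sketch using only the figures in the bullet above):

```python
# Decarbonization pace: achieved (2000-2020) vs. required (2020-2050).
start_share = 87.0   # % of primary energy from fossil fuels in 2000
end_share = 83.0     # % in 2020
years_elapsed = 20

# Average annual reduction achieved, in percentage points per year
achieved = (start_share - end_share) / years_elapsed   # 0.2 points/year

# Required pace to reach zero by 2050, starting from 83% in 2020
required = end_share / 30                              # ~2.77 points/year

print(round(required / achieved))  # ~14, i.e. nearly fourteen times faster
```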
Chapter 1 Inventions and Innovations: A Long History and Modern Infatuation
Those interested in failed designs should consult From the Titanic to the Challenger by Susan Herring, Spectacular Flops by Michael Schiffer, or To Engineer Is Human: The Role of Failure in Successful Design by Henry Petroski.
Chapter 2 Inventions That Turned From Welcome to Undesirable
Freon-12 was the first of many chlorofluorocarbons (CFCs), synthetic compounds that rapidly became the world’s dominant refrigerants, the propellants in billions of aerosol cans, and widely used industrial degreasing agents and solvents.
Leaded Gasoline pg 23
Charles F. Kettering (1876-1958), inventor of the first practical electric starter, longtime (1920-1947) head of research at General Motors, and the man who insisted on calling the leaded additive “ethyl gas.”
Octane (C8H18) is one of the alkanes (hydrocarbons with the general formula CnH2n+2) that form anywhere between 10 and 40 percent of light crude oils.
The higher the octane rating of gasoline, the more resistant the fuel is to knocking, and engines can operate more efficiently with higher compression ratios.
Three options were available to minimize knocking.
- keep compression ratios low (4.3:1)
- develop smaller engines running on better fuel
- use additives to prevent uncontrolled ignitions
During 1900-1920 … numerous tests proved that engines using pure ethanol would never knock.
On December 9, 1921, a solution of 1 percent tetraethyl lead (TEL), (C2H5)4Pb, produced no knock in the test engine, and the additive was soon found to be effective even in concentrations as low as 0.04 percent by volume.
Why did GM decide not only to pursue just the TEL route but also to claim (despite its own correct understanding) that there were no available alternatives: “So far as we know at the present time, tetraethyl lead is the only material available which can bring about these results”? Several factors help to explain the choice. The ethanol route would have required the mass-scale development of a new industry dedicated to an automotive fuel additive that could not be controlled by GM. Moreover, as already noted, the preferable option, producing ethanol from cellulosic waste (crop residues, wood) rather than food crops, was too expensive to be practical. In fact, the large-scale production of cellulosic ethanol by new enzymatic conversions, promised to be of epoch-making importance in the twenty-first century, has failed to live up to those expectations, and by 2020 high-volume US production of ethanol … claimed almost exactly one-third of the country’s corn harvest.
Midgley’s TEL patent filed in 1922 gave the company full control of an effective low-volume additive that could be dispensed at very low cost.
Kettering insisted on the inaccurate naming of the additive (“ethyl gas”), which deliberately avoided acknowledging the presence of lead. This heavy metal, known for its toxicity since Greek antiquity, was sometimes claimed to have played a major role in the demise of the Roman Empire, and by the early twentieth century it was well known as a cause of health problems associated with various occupational exposures. But GM and its TEL suppliers did not just disregard lead’s health effects; they made resolute and repeated claims aimed at minimizing or even entirely dismissing any concerns about the health effects of a compound to be emitted into the environment from car exhaust on such a large scale.
GM and DuPont claimed, without doing any studies, that the average street would likely be so free from lead that it would be impossible to detect its absorption. But in late October 1924, thirty-five workers in the TEL processing plant in New Jersey experienced acute neurological symptoms, and five of them died.
During the three decades between 1945 and 1975 the US consumed nearly two trillion gallons of gasoline, which means (using the 2.4 g/gal average) that it added about 4.7 million tons of lead to the environment via vehicle exhausts, with annual additions surpassing 200,000 tons during the early 1970s.
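My note: a quick sanity check on the tonnage (assuming the 2.4 g/gal average cited above):

```python
# Lead added to the environment, 1945-1975, from leaded-gasoline exhaust.
gallons = 2e12        # nearly two trillion gallons of gasoline consumed
lead_per_gal = 2.4    # grams of lead per gallon (the cited average)

lead_tons = gallons * lead_per_gal / 1e6   # grams -> metric tons
print(round(lead_tons / 1e6, 2))  # 4.8 (million tons), close to "about 4.7 million"
```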
A technical fix for photochemical smog became possible in 1962 when Eugene Jules Houdry patented a way to remove the pollutants from vehicle exhaust just before their emission into the atmosphere by deploying catalytic converters. Platinum was used as the rare metal catalyst; it would be poisoned by lead’s presence in exhaust gases, and this made the introduction of effective catalytic converters (mandatory in all cars starting with the 1975 model year) dependent on the availability of unleaded gas.
In 1970 unleaded gasoline only had about 3 percent of the US market. … By 1991 it had 95 percent. In 1985 an EPA study estimated the value of the benefits of the final lead phasedown (effects on children, reduction in other pollutants, improvements in maintenance) to be at least twice that of the associated costs (higher refinery expenditures) and twelve times that when the costs of adult male hypertension were added.
Leaded gasoline was eventually stopped by the US Clean Air Act of 1970, which was not written to reduce lead but smog. People were more concerned about how the air looked than they were about the poison in their lungs.
DDT pg 36
Paul Hermann Müller
In 1939 he synthesized dichlorodiphenyltrichloroethane, and tests immediately showed that it had an insecticidal contact effect unmatched by any known compound.
The US military began to use the compound to fight malaria, typhus, and lice, first in Europe and then on the Pacific islands.
The results were convincing. During two summer months of 1943 in Sicily, the US Army had 21,482 hospital admissions for malaria compared to 17,375 battle casualties (wounded and dead). … Field testing of DDT began in Italy in August 1943; by 1945 new cases of malaria had declined by more than 80 percent.
CFCs pg 49
dichlorodifluoromethane (CCl2F2), known as F12 and sold under the proprietary name Freon.
The annual global output of the two dominant compounds, F-11 and F-12, later known as CFC-11 and CFC-12 or R-11 and R-12, rose from less than 550 tons in 1934 to the peak of 812,522 tons in 1974.
CFCs were staying in the atmosphere … accumulating aloft. But did the presence of these compounds, as James Lovelock’s group concluded, pose “no conceivable hazard” because they “did not disturb the environment”?
Molina’s Nobel lecture:
The CFCs will not be destroyed by the common cleansing mechanisms that remove most pollutants from the atmosphere, such as rain, or oxidation by hydroxyl radicals. Instead, the CFCs will be decomposed by short wavelength solar ultraviolet radiation, but only after drifting to the upper stratosphere – above much of the ozone layer – which is where they will first encounter such radiation. Upon absorption of solar radiation the CFC molecules will rapidly release their chlorine atoms, which will then participate in the following catalytic reactions:
Cl + O3 → ClO + O2
ClO + O → Cl + O2
Chlorine destroys ozone but then is released to start a new cycle of destruction, and a single atom of the gas can destroy on the order of 100,000 ozone molecules before it is eventually removed from the stratosphere by downward diffusion and reactions with methane.
The global production of CFCs declined from its 1974 peak; in March 1978 the US, Canada, Norway, and Sweden banned the use of nonessential aerosols; and in 1980 the European Community made a commitment to a CFC capacity cap and a 30 percent reduction in aerosol use.
Decisions made by DuPont, the largest US CFC maker, were critical. Subsequent analyses both praised and criticized the sequence of the company’s (sometimes inconsistent) CFC-related decisions, but its embrace of an early production ban and its role in supplying, fairly rapidly, commercial alternatives are indisputable.
CFCs from old refrigerators that were not properly disposed of continued to add to the atmospheric burden long after the ban on production went into effect. … Reconstruction of past levels and monitoring of CFC-11 (since 1977) show the averages for the Northern Hemisphere rising from 0.7 ppt in 1950 to 177 ppt in 1980, peaking at 270 ppt in 1994 and then declining to about 225 ppt in 2020.
When gases are compared on the basis of their global warming potential (GWP) over a period of one hundred years, with CO2, by far the most abundant gas emitted by human actions, set at one, the scores are 28 for methane (from natural gas production and transport, rice fields, and enteric fermentation of ruminants), 265 for nitrous oxide (from fertilizers), 4,160 for the now outlawed CFC-11, and 10,200 for CFC-12.
My note: the GWP metric compares gases per unit mass, not per molecule or by volume. So one kilogram of methane will trap about 28 times more heat over a century than one kilogram of carbon dioxide.
By 2020 there were some 1.8 billion air conditioning units in operation, with more than half of them in just two countries, China and the US. But this is only a fraction of the potential total because among the nearly three billion people living in the world’s warmest climates, fewer than 10 percent have air conditioning, compared to 90 percent in the US or Japan.
Actual global warming impact comparison of various molecules:
CO2: 414 ppm x 1 GWP = 414
Methane: 1.877 ppm x 28 GWP = 52.556
Nitrous Oxide: 0.333 ppm x 265 GWP = 88.245
CFC-11: 0.000230 ppm x 4,160 GWP = 0.9568
CFC-12: 0.000525 ppm x 10,200 GWP = 5.355
(data from ChatGPT, sources?)
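My note continued: the same comparison in code. The concentrations and GWP values are the ones listed above; strictly speaking GWP is defined per unit mass, so multiplying it by a ppm (molar) mixing ratio is only a rough heuristic, not a proper radiative-forcing calculation.

```python
# GWP-weighted atmospheric concentrations (rough heuristic, values as listed above).
gases = {
    # name: (concentration in ppm, 100-year GWP)
    "CO2":    (414.0,    1),
    "CH4":    (1.877,    28),
    "N2O":    (0.333,    265),
    "CFC-11": (0.000230, 4160),
    "CFC-12": (0.000525, 10200),
}

for name, (ppm, gwp) in gases.items():
    print(f"{name:6s} {ppm * gwp:9.3f}")
# CO2 still dominates (~414), then N2O (~88.2) and CH4 (~52.6);
# the banned CFCs contribute little (~0.96 and ~5.36).
```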
Chapter 3 Inventions That Were To Dominate – And Do Not
Lighter Than Air (LTA) flying machines
The [WW1] peace treaty restricted Germany’s airship construction.
The only way to do trans- or intercontinental travel was in tedious stages: three stops and more than fifteen hours were needed to make it from New York to Los Angeles, and when British Imperial Airways began to operate the London-Singapore link in 1934, its planes needed eight days and twenty-two layovers, including stops in Athens, Cairo, Baghdad, Basra, Sharjah, Jodhpur, Calcutta, and Rangoon. And while the Douglas DC-3 – introduced in 1935 and destined to become the most common and most durable piston-powered airplane in history – was about twice as fast as the Zeppelin (240 km/h), it had a maximum range of about 2,500 kilometers, just a quarter of the Zeppelin’s reach. And the cramped interiors of the first small all-metal-fuselage airplanes of the early 1930s bore no comparison to the overall roominess, designed public lounges, and dining room of a large airship. PP By the time it was grounded, in June 1937, the Graf Zeppelin had flown 1.7 million kilometers, carried more than 13,000 passengers, completed 144 intercontinental trips, and spent 717 days – nearly two years – aloft, all, despite some in-flight mishaps, without an injury to its crew and passengers.
The Hindenburg was the world’s largest airship at 245 meters long and just over 41 meters in diameter, with a volume of 200,000 cubic meters, powered by four Daimler-Benz diesel engines (890 kW each) and cruising at 122 km/h.
Even if the Hindenburg had continued to fly with a perfect safety record, it was an anachronism by the time it was launched.
The Hindenburg’s fastest crossing times between Frankfurt and New Jersey were nearly fifty-three hours westward and forty-three hours eastward; today’s scheduled flying times by Boeings or Airbuses are, respectively, eight hours thirty-five minutes and seven hours twenty minutes, under far more controllable circumstances.
All of these claims and plans have one thing in common: they pay little attention either to what any rapid expansion of LTA fleets would do to the supply of helium or what the actual revenue-earning time aloft might be. In the US, recent domestic helium consumption has been about 40 million cubic meters a year, with major uses in magnetic resonance imaging (30%), lifting gas (17%), and analytical and laboratory applications (14%). If all of this annual use went into airships, it would be good enough for about two hundred large (Zeppelin-like) structures.
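My note: the two-hundred-airship figure follows directly from the Hindenburg-class volume of about 200,000 cubic meters mentioned above:

```python
# How many large rigid airships could annual US helium consumption fill?
annual_helium_m3 = 40e6        # ~40 million cubic meters per year
airship_volume_m3 = 200_000    # one Hindenburg-class airship

print(annual_helium_m3 / airship_volume_m3)  # 200.0
```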
Atomic Energy Commission (AEC)
During the late 1940s, David E. Lilienthal, the AEC’s first Chairman, … feared that the Russians would beat America “at developing the peaceful side of the atom.”
Politics, not economics, dictated the country’s development of nuclear energy generation.
America’s first commercial nuclear project … Shippingport, America’s first fission power plant … began to generate electricity on December 18, 1957.
This purely political decision tied the country’s future nuclear development to a reactor that none of the Manhattan Project physicists and no utility experts considered to be the best option, and this, together with the predicted high generation costs, left the utilities as uninterested in nuclear power during the late 1950s as they were a decade before.
New utility orders rose to twenty reactors in 1966 and thirty in 1967 before dipping below ten reactors in 1969, for a total of eighty-three new reactors between 1965 and 1969.
In 1973 the Organization of Petroleum Exporting Countries (OPEC), … quintupled its posted crude oil prices.
In 1973 US utilities ordered forty-two new nuclear reactors. Moreover, there was a growing consensus that this rapidly unfolding first nuclear era would soon be followed by a second era of far more effective [fast breeder reactors.]
[They,] unlike conventional fission reactors (whether water- or gas-cooled), which operate by splitting the scarce fissile isotope U235 (slightly enriched from its natural presence of 0.7 percent to no more than 3-5%), would use highly enriched U235 (15-30%) as the source of fast neutrons to convert the abundant but non-fissionable isotope U238, placed in a blanket surrounding the reactor core, into fissile plutonium (Pu239). Liquid sodium would transfer the generated heat, and a breeder reactor would eventually produce at least 20% more fissionable fuel than it consumed. Szilard had envisioned breeder reactors as early as 1943, and in 1945 Alvin Weinberg and Manhattan Project physicist Harry Soodak conceptualized the design.
LMFBR – Liquid Metal Fast Breeder Reactors
The reality proved quite different … Its major contributory causes were the unanticipated sudden end to the decadal doubling of electricity demand, excessive regulatory measures imposed on new plant construction, the ensuing mass cancellations of pressurized water reactor orders, the failure to transform breeders from physicists’ dreams into even a semi-viable engineering reality, and rekindled public distrust of fission used for electricity generation as a result of catastrophic accidents. I will explain briefly each of these factors, but I do not attempt to assign proportional blame (I am not sure that is even possible).
It is clear that no single factor was more important than the rapid retreat of electricity demand.
Electricity usage growth in the:
In 1974 the AEC was abolished and replaced by the Nuclear Regulatory Commision, which embarked on a seemingly endless series of regulatory interventions that slowed the new projects while raising their costs. Utility managers, accustomed to count on a virtually guaranteed doubling of electricity demand in a decade and the completion of new large stations in five to six years, now found themselves with steadily declining demand and protracted construction periods, and faced a future in which there might be no demand for electricity generated by plants completed at significantly higher cost after ten to fifteen years under construction.
The public’s unease was only reinforced by the reactor meltdown at the Chornobyl nuclear plant in Ukraine in April 1986.
During the 1980s there were no new reactor orders in the US, just cancellations, eventually amounting to 120 units.
In 1998 Westinghouse Power Generation was sold to Siemens, and the next year British Nuclear Fuels bought Westinghouse Electric Company; Toshiba took over in 2006, but in 2017 the reactor business went bankrupt once more, and the company is now planning a three-way split to be completed by 2024.
In 2020 the world had 443 operating reactors … generating about 2,500 terawatt-hours of electricity.
And the breeders? Their development was based on several mistaken beliefs: that uranium 235, a fissionable isotope, was so scarce that its resources could not support large-scale nuclear generation.
In 1971, when the funding for the demonstration breeder began, its cost was estimated to be no more than $400 million. Projected costs had nearly doubled within a year, and by 1981, when the Clinch River breeder became the largest public works project in the US, the total was forecast to surpass $3 billion. … By the time the project was canceled in 1983 it had cost $8 billion.
Globally, the tag would be approaching $100 billion [invested in breeder projects], a sum that illustrated the power of nuclear lobbies and (against all evidence) of the stubborn beliefs of experts advising government.
Commercial fission should have been developed more deliberately, more cautiously, and with much more attention given both to its public acceptance and to the eventual long-term storage of its radioactive wastes.
Maximum speed was restricted to M2.2 in order to use conventional aluminum alloys (flights above M2.2 require titanium and special steels because of thermal limitations). –> see pg 101
on October 23, 2003, the last Concorde flight departed JFK for Heathrow.
President Kennedy announced the development of an American supersonic plane on June 5, 1963.
The Federal Aviation Administration was claiming that the aim of the US program was “a safe, practical, efficient and economical vehicle”
Critics countered that the supersonic transport (SST) would remain uneconomical and that the sonic boom would create “undue public disturbance.” Eventually all of these problems manifested themselves, and their combined weight led to the cancellation of government support and hence to the end of the American SST.
Airbus received more orders for new jetliners than Boeing in all but two years [2010-2020].
Four fundamental constraints [of SST] are apparent: a plane design dictated by the need to overcome enormous supersonic drag, engines powerful enough to sustain M2, accomplishing this economically, and doing so with acceptable environmental impacts.
Drag coefficient peaks at just above M1 and is lower at both subsonic and supersonic speeds.
The lift to drag ratio (L/D) – and hence the range of an aircraft – decreases with speed: for the Boeing 787, cruising at M0.85, it is 18; at M1 it is about 15; at M2 just 10. And while the Boeing 787 has a maximum range of nearly 14,000 km, the Concorde could fly less than 6,700 km, not enough for a transpacific flight without refueling (the flying distance from San Francisco to Tokyo is 8,246 km).
at M0.85 is 18
at M1 is 15
at M2 is 10
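My note: to first order (fixed fuel fraction and engine efficiency, i.e. the Breguet range approximation, which is my simplification, not the book's), range scales with L/D, so the drop from 18 to 10 alone would nearly halve the range:

```python
# Relative range implied by the L/D figures above (crude first-order scaling;
# assumes range ~ L/D, ignoring fuel fraction and engine-efficiency differences).
l_over_d = {0.85: 18, 1.0: 15, 2.0: 10}
baseline = l_over_d[0.85]

for mach, ld in l_over_d.items():
    print(f"M{mach}: L/D = {ld}, relative range ~ {ld / baseline:.2f}")
```

The Concorde's real-world range (under 6,700 km versus nearly 14,000 km for the 787) fell even further than this ratio suggests, since supersonic engines also burn far more fuel per kilometer.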
The material requirements to build airplanes are more exacting as the speed increases, but up to M2 they can be largely met with the best aluminum alloys. At M2.2 leading edges have temperatures as high as 135 deg C, higher than the temperature limits of the fiber reinforced polymers (90 deg C) that now make up most of the fuselage and wings in the latest jetliners.
The Concorde burned more than three times as much kerosene per passenger as the first wide-body Boeing 747.
by 2022 Boeing built nearly 1,600 747s. In contrast, only twenty Concordes were ever built, only fourteen entered commercial service, and only Air France and British Airways “purchased” them, with the acquisitions and every flight heavily subsidized by French and British taxpayers.
The best appraisal of the quest for supersonic speeds was published by Richard K. Smith, an American aviation historian, who called it the “frenzied international aeronautical saga of communicable obsessions”: “From the start to finish, in Britain, France, and the US, the supersonic airliner was a flying machine that the world did not need; it was a political airplane.“
Chapter 4 Inventions That We Keep Waiting For
desiderata – Something considered necessary or highly desirable.
The historical record shows … the basic concept for the fifth mode of transportation [hyperloop] has been around for more than two hundred years. … yet not a single (near) vacuum- or low-pressure-tube, super-fast transportation project (be it for people or goods, or both) has been completed and put into operation, not even a trial short-distance link encompassing all of the basic components.
George Medhurst, an English clock maker and inventor, was the pioneer and determined proponent of rapid travel in tubes. In 1810 he published a brief pamphlet titled A New Method of Conveying Letters and Goods with Great Certainty and Rapidity by Air.
The first patents for specific components were awarded to Albert C. Albertson in 1902 and to Alfred Zehden in 1905, and at least three inventors contributed to advancing the concept of maglev transportation.
Boris Petrovich Weinberg built a model consisting of a 10 kg iron carriage, a 20-meter-long (32 cm diameter) evacuated ring tunnel of copper, and a series of sequentially activated solenoids on top of the pipe that suspended the carriage, which eventually circulated at 6 km/h. This proof of concept was followed by proposals for a full-scale project operating at speeds of 800-1000 km/h. (My note: the hubris!)
Train ridership in the US peaked in 1920. (My note: this is close to the peak of the horse population in 1918).
In 1972 Robert Salter, at the Rand Corporation came up with a very high-speed transit system concept whose “tubecraft” would ride on, and be driven by, electromagnetic waves generated by pulsed or oscillating currents in electrical conductors forming the “roadbed” structure of an evacuated “tubeway”. PP Incredibly, Salter maintained that the speeds required for his proposed continent-spanning link (NY to LA) would “certainly be on the order of thousands of miles per hour,” and such supersonic speeds … could be accommodated only in super-straight underground tunnels whose construction would claim all but a small share of the system’s overall cost. By 1978 Salter was suggesting that the “Planetran” could be “extended to a worldwide network using under-ocean tunnels to connect continents” and that it would be “safe, convenient, low-cost, efficient and non-polluting.” What a perfect example of that common phenomenon of an inventor attached to his cherished project far beyond the boundaries of any critical appraisal!
No hyperloop line, on pylons or in tunnels, was in operation by early 2022, and the forecasts of earliest completion dates have shifted to the late 2020s.
In July 2017 Musk, out of the blue, famously tweeted that he had “just received verbal government approval for the Boring Company to build an underground NY-Phil-Balt-DC Hyperloop. NY-DC in 20 mins.”
Although modern tunneling has become remarkably mechanized, costs remain high. The Gotthard Base Tunnel in Switzerland, at 57 km the world’s longest, cost about $10.5 B (nearly $200 M/km) and took nearly seventeen years to finish.
Nitrogen Fixing Cereals
The challenge and the solution were described in memorable terms in September 1898 by William Crookes, a chemist and physicist, in his presidential address on wheat delivered at the British Association’s annual meeting in Bristol. The most quoted sentence from his presentation was that “all civilised nations stand in deadly peril of not having enough to eat,” and he estimated that the rising demand would bring a global wheat supply shortfall as soon as 1930. But he also identified the most effective solution and its most important component: increased crop fertilization and higher applications of nitrogen.
Common annual applications average more than 100 kg of the nutrient per year per hectare … This is, of course, a substantial economic loss (nitrogen fertilizers commonly account for a fifth of variable expenses in intensive crop farming) and one that also causes major environmental problems. PP None of these environmental problems is now more widespread and difficult to control than the creation of large dead zones in coastal waters.
Controlled Nuclear Fusion
The fusion of hydrogen into helium in the proton-proton cycle takes place only once the temperature reaches 13 million degrees of absolute temperature
The easiest (a relative term in this context) way to achieve controlled fusion is to combine the two heavy isotopes of hydrogen, deuterium and tritium, to form an isotope of helium.
The two elements needed for controlled fusion are abundant: deuterium, which can be separated from ocean water (there are 33 g of deuterium in every cubic meter of seawater), and lithium. This light metal is now in high demand for batteries, but its resources (close to 90 million tons in 2020, and highly likely to grow further) are sufficient for about one thousand years of extraction at the recent level. But future fusion plants will also need to generate their own tritium because this isotope is exceedingly rare in nature (it is produced in the atmosphere through the collisions of cosmic rays with nitrogen molecules). This tritium generation would be done by capturing neutrons in a lithium blanket surrounding the confined plasma.
Tokamak, an acronym created from the beginning syllables (and the first letter) of the Russian term for toroidal chamber with magnetic coils, toroidal’naya kamera s magnitnymi katushkami.
Since 1970 about sixty large-scale conceptual controlled fusion designs were developed, and more than one hundred experimental facilities were built in the US, Russia, Japan, and the EU.
Q, the ratio of the thermal power produced by deuterium-tritium fusion to the power injected into a fusion device in order to superheat the plasma and initiate a fusion reaction at usefully high levels. Obviously, Q=1 is the breakeven point. … So far, the highest Q, 0.67, has been achieved by the European tokamak JET in the UK.
No matter when it becomes fully functional, ITER will not capture any outgoing heat to be used for electricity generation and will not attain a state of continuous fusion: it will generate pulsed net energy (Q > 1) only when the ratio is calculated by dividing the heat energy output by the energy used to heat the plasma (50 MW), not by the total electricity consumption of the facility.
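My note: a small illustration of why "Q > 1" can mislead. The 500 MW / 50 MW figures are ITER's widely cited design targets; the 300 MW total facility draw and the 40 percent heat-to-electricity conversion are my illustrative assumptions, not numbers from the book.

```python
# Plasma Q vs. a whole-facility energy balance (illustrative numbers).
fusion_heat_mw = 500        # ITER design target for fusion thermal power
plasma_heating_mw = 50      # power injected to heat the plasma
facility_draw_mw = 300      # assumed total site electricity consumption

q_plasma = fusion_heat_mw / plasma_heating_mw    # the headline Q = 10
electricity_out_mw = fusion_heat_mw * 40 // 100  # if converted at ~40% (ITER won't
                                                 # capture its heat at all)

print(q_plasma)            # 10.0
print(electricity_out_mw)  # 200 -> less than the assumed 300 MW facility draw
```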
The latest turbo generators, operating under supercritical steam pressure, turn more than 40% of the incoming high temperature heat into electricity.
During the past seven decades the world has spent at least $60B (in 2020 dollars) on developing controlled fusion, but it remains perhaps the most stubbornly receding fata morgana (mirage) on record: always to be reached after yet another thirty years.
Initial cost estimates for ITER were as low as 5 B Euro, but by 2016 ITER’s director had admitted that the project was a decade late and at least 4 B Euro over budget, with later reports showing the overall sum reaching 15 B Euro, and in 2018 the US Department of Energy nearly tripled its cost estimates for ITER, to $65B.
Even if the learning process is taken into account, the most optimistic estimate for demonstration reactors (at least three of them, to be built after 2040) would not be less than $20 B each.
If the demonstration plants worked as promised, it would not be a bad bargain. … And, to put it all into the most realistic perspective, it is just a tenth of the monies spent on the two-decades-long war in Afghanistan that ended in the chaotic US withdrawal and the Taliban’s complete victory.
This brief account of controlled fusion efforts would not be complete without going back to 1989. Early in that year came (in a press conference and in a brief paper in the Journal of Electroanalytical Chemistry) a radical departure from the decades of news concerning advances in the quest for controlled thermonuclear power: two electrochemists at the University of Utah, Stanley Pons and Martin Fleischmann, claimed they had succeeded in fusing deuterium nuclei at room temperature in a test tube. Electrolysis of a lithium salt solution led so many deuterium atoms to absorb into a palladium electrode that some of their nuclei appeared to fuse, producing net energy (above that supplied for electrolysis), as well as neutron and gamma ray emissions, clear signs of a process previously attainable only under starlike conditions, and proof of what the press soon called cold fusion.
… after thirty-plus years of these claims, convincing proof is still missing.
Chapter 5 Techno-Optimism, Exaggerations, and Realistic Expectations
What is true about the past is, despite recent claims to the contrary, likely to be repeated in the future.
This cautionary attitude should be self-evident to any diligent student of modern technical advances – and so should be the basic attendant lesson.
Skepticism is appropriate whenever the problem is so extraordinarily challenging that even the combination of perseverance and plentiful financing is no guarantee of success after decades of trying.
BCI – Brain Computer Interface
Forecasts of completely autonomous road vehicles were made repeatedly during the 2010s: completely self-driving cars were to be everywhere by 2020, allowing the operator to read or sleep during a commute in a personal vehicle. All internal combustion engines currently on the road were to be replaced by electric vehicles by 2025. … A reality check: in 2022 there were no fully self-driving cars; fewer than 2% of the world’s 1.4 billion motor vehicles on the road were electric, and they were not “green”, as the electricity required for their operation came mostly from fossil fuels: in 2022 about 60% of all electricity worldwide came from burning coal and natural gas.
BLA – biological license application
NME – New Molecular Entities synthesized by chemists
The [microchip] process began with transistors 80 micrometers wide; 7-nanometer chips are now common (their features are only 0.0000875 the width of the first design), and in 2021 IBM announced the world’s first 2-nanometer chip, to be produced as early as 2024. Because a silicon atom is about 0.2 nanometers across, a 2-nanometer connection would be just ten atoms wide, and the physical limit of this fifty-year-old reduction process is obviously in sight.
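The scaling claims above can be checked with a couple of lines of arithmetic (all values are the ones quoted in the note):

```python
# Sanity-check of the chip-scaling figures quoted above.
first_width_nm = 80_000    # first transistors: 80 micrometers = 80,000 nm
current_nm = 7             # widely deployed 7 nm node
announced_nm = 2           # IBM's 2021-announced 2 nm chip
silicon_atom_nm = 0.2      # approximate diameter of a silicon atom

shrink_ratio = current_nm / first_width_nm   # 0.0000875, as the text states
atoms_wide = announced_nm / silicon_atom_nm  # about 10 atoms

print(f"7 nm is {shrink_ratio:.7f} of the original 80 um width")
print(f"a 2 nm feature spans about {atoms_wide:.0f} silicon atoms")
```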
Similarly low exponential growth rates characterize the economies of many countries that have the greatest need to advance. Since 1960 the average per capita gross domestic product of sub-Saharan Africa has grown annually by no more than 0.7 percent when expressed in constant monies. In Brazil it has been less than 2% for half that time, while in exceptionally fast-growing China it was above 5% between 1991 and 2019. Growth rates of technical advances, productive capacities, and efficiencies have been similarly restrained: most of the world’s electricity is generated by large steam turbines whose efficiency has improved by about 1.5% per year over the past hundred years.
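To see what these rates compound to, here is a quick sketch (the 60-year span for sub-Saharan Africa and the 28-year span for China are my reading of the periods in the text):

```python
# Compound a constant annual growth rate over a number of years.
def compound(rate, years):
    return (1 + rate) ** years

africa = compound(0.007, 60)  # 0.7%/yr, 1960-2020 assumed -> about 1.5x total
china = compound(0.05, 28)    # 5%/yr, 1991-2019 -> nearly 4x total

print(f"sub-Saharan Africa: x{africa:.2f}")
print(f"China: x{china:.2f}")
```

Six decades at 0.7% per year only multiplies per capita GDP by about 1.5, while under three decades at 5% nearly quadruples it, which is the contrast the passage is drawing.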
In 1900 the best battery (lead-acid) had an energy density of 25 watt-hours per kilogram; in 2022 the best lithium-ion batteries deployed on a large commercial scale had an energy density twelve times higher. … even batteries with ten times the 2022 (commercial) energy density (that is, approaching 3,000 Wh/kg) would store only about a quarter of the energy contained in a kilogram of kerosene, making it clear that jetliners energized by batteries are not on any practical horizon.
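The battery comparison spelled out (the kerosene figure of roughly 12,000 Wh/kg is my assumed round value, from kerosene's ~43 MJ/kg):

```python
# Energy densities in watt-hours per kilogram, from the text.
lead_acid_1900 = 25                  # best battery in 1900
li_ion_2022 = lead_acid_1900 * 12    # "twelve times higher" -> 300 Wh/kg
hypothetical = li_ion_2022 * 10      # ten times the 2022 density -> 3,000 Wh/kg
kerosene = 12_000                    # assumed approximate value for jet fuel

print(hypothetical / kerosene)  # 0.25: about a quarter of kerosene's density
```

Even that hypothetical tenfold improvement leaves batteries a factor of four short, consistent with the ~24x gap cited in the notes above for today's best batteries.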
If the cost of renewable electricity generation has been plummeting, why do the three EU countries with the highest share of energy from the new renewables, wind and solar – Denmark, Ireland, and Germany – have the continent’s highest electricity prices? In 2021 the EU mean was €0.24/kWh, but the Irish price was 25% higher, the Danish price 45% higher, and the German price 37% higher.
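In absolute terms, those premiums work out as follows (EU mean and percentages from the text):

```python
# 2021 EU mean retail electricity price and the quoted national premiums.
eu_mean = 0.24  # EUR per kWh
premiums = {"Ireland": 0.25, "Germany": 0.37, "Denmark": 0.45}

for country, premium in premiums.items():
    # Ireland ~0.30, Germany ~0.33, Denmark ~0.35 EUR/kWh
    print(f"{country}: ~EUR {eu_mean * (1 + premium):.2f}/kWh")
```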
My thoughts on this: I think it should be no surprise that switching away from fossil fuels will cost us, and these countries should in fact be admired, assuming they have taken this burden upon themselves voluntarily (rather than having it imposed on the population politically, which is probably what actually happened). A broader metric might be used that includes the detrimental cost of fossil fuels to the environment, what is referred to as the Social Cost of Carbon.
We could measure health, longevity, and quality-of-life gains by using the common denominators of life-years saved (LYs) or quality-adjusted life-years (QALYs) gained.
American drug overdose deaths totaled about 48,000 in 2015, but in the twelve months ending in April 2021 they had doubled, to about 98,000, compared to about 320,000 deaths from all cancers and 142,000 deaths from lung cancer.
My note: what does this mean? First, drug deaths are roughly 1/3 those of all cancers combined, and one of the two is preventable. Second, why are drug-related deaths increasing so quickly?
Bill Gates noted in October 2015, “Half the technology needed to get zero emissions either doesn’t exist yet or is too expensive for much of the world to afford.”
A few examples illustrate the wishful nature of such [decarbonization] targets. In the year 2000, fossil fuels supplied 87 percent of the world’s primary energy, while in 2020 that share was 83 percent, an annual reduction of 0.2 percentage points – but now we are told that we should end our dependence on carbon by 2050. Going from 83 percent to zero in thirty years would require cutting about 2.75 percentage points of the global fossil share every year, a rate nearly fourteen times faster than we managed during the first two decades of the twenty-first century. Where are the technical capabilities and financing that would allow us to realize, instantly, such a large annual cut and sustain it for three decades?
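The "fourteen times faster" claim follows directly from the shares given:

```python
# Fossil-fuel share of world primary energy, in percentage points (from the text).
share_2000, share_2020 = 87, 83

historical_rate = (share_2000 - share_2020) / 20  # 0.2 points/year, 2000-2020
required_rate = share_2020 / 30                   # ~2.77 points/year to reach zero by 2050

print(f"{required_rate / historical_rate:.1f}")   # ~13.8x faster
```

83/30 is closer to 2.77 points per year; the text's 2.75 is a slight rounding, and the ratio lands at about 13.8, i.e. "nearly fourteen times faster".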
There is yet another way to look at which inventions are needed most … more than three billion people survive essentially at the subsistence level … meeting their essential water, food, energy, and material needs comes first.