[Video Review] Sabine Hossenfelder on What’s Going Wrong with Particle Physics

Sabine Hossenfelder, a disillusioned (former?) theoretical particle physicist and science popularizer, recently published a video, “What’s going wrong in particle physics?”, on her YouTube channel criticizing fifty years of common practice in particle physics. I’ve previously reviewed her book Lost in Math: How Beauty Leads Physics Astray, published in 2018, and commented on her editorial “The Uncertain Future of Particle Physics” in The New York Times (January 23, 2019), which questioned the wisdom of funding CERN’s recent proposal to build a new particle accelerator, the Future Circular Collider (FCC), estimated to cost over $10 billion. My comments on the YouTube video follow immediately; the Lost in Math book review and the commentary on the editorial are reproduced further below.

Dr. Hossenfelder’s point in the video is fairly simple. She argues that since the formulation of the so-called “standard model” (formerly known as the Glashow-Weinberg-Salam or Weinberg-Salam model, after theoretical physicists Sheldon Glashow, Steven Weinberg, and Abdus Salam) in the 1960’s and 1970’s, particle physicists have repeatedly confirmed the standard model, discovering the predicted W and Z bosons at CERN in the 1980s, the top quark at Fermilab in 1995, and finally the Higgs particle at CERN in 2012.

However, all attempts to find new physics and new particles beyond the standard model since the 1970’s have failed. Particle physicists continue to construct more complex theories that include the standard model, such as the Grand Unified Theories (GUTs) of the 1970s, which predicted the decay of the proton — never detected. These theories have predicted a long succession of hypothetical particles such as axions, supersymmetric partners, WIMPs (weakly interacting massive particles), other candidates for hypothetical dark matter in cosmology, and many, many more.

These complex beyond the standard model theories keep moving the energy level at which new particles are predicted (usually expressed in billions or trillions of electron volts) higher and higher, justifying the research, development, and construction of ever larger and more expensive particle accelerators such as the Tevatron at Fermilab in the United States, the Large Hadron Collider (LHC) at CERN in Switzerland, and the proposed Future Circular Collider (FCC) at CERN.

This lack of success was already becoming apparent in the 1980’s when I was studying particle physics at Caltech and later at the University of Illinois at Urbana-Champaign. At Caltech I worked briefly on the IMB proton decay experiment, which (surprise, surprise) failed to find the proton decay predicted by the GUTs. At Illinois I worked on the Stanford Linear Accelerator Center (SLAC)’s disastrous Stanford Linear Collider (SLC), which ran many years over schedule and many millions of dollars over budget and (surprise, surprise) discovered nothing beyond the standard model, much as Dr. Hossenfelder complains in her recent YouTube video.

Cynical experimental particle physicists would make snide comments about how theory papers kept moving the predicted energy scale for supersymmetry, technicolor, and other popular beyond the standard model theories to just above the energy scale of the latest experiments.

Not surprisingly, those who clearly perceived this pattern tended to leave the field, most often moving into some form of software development or occasionally into other scientific fields. A few found jobs on Wall Street developing models and software for options and other derivative securities.

The first physics bubble expanded after the launch of Sputnik in 1957 and burst in about 1967. The Reagan administration’s military build-up in the 1980’s fueled a second bubble, often unbeknownst to the physics graduate students of the time. That second bubble burst in about 1993, following the end of the Cold War, with huge numbers of freshly minted Ph.D.’s unable to find physics jobs, most of them turning into software developers.

Dr. Hossenfelder’s recent video, like Lost in Math, focuses on scientific theory and rarely touches on the economic forces that complement and probably drive — consciously or not — both theory and practice independent of actual scientific results.

Scientific research has a high failure rate, sometimes claimed to be eighty to ninety percent when scientists are excusing obvious failures and/or huge cost and schedule overruns — which are common. Even the few successes are often purely theoretical: a better understanding of some physical phenomenon that does not translate into practical results such as new power sources or nuclear weapons. But huge experimental mega-projects such as the Large Hadron Collider (LHC) or the Future Circular Collider (FCC), justified by the endless unsuccessful theorizing Dr. Hossenfelder criticizes, are money here and now: jobs for otherwise potentially unemployed physicists, huge construction projects, contracts for the research and development of accelerator magnets, and so on.

Big Science creates huge interest groups that perpetuate themselves independent of actual public utility. President Eisenhower identified the problem in his famous Farewell Address in 1961 — best known for popularizing the phrase “military-industrial complex”:

Akin to, and largely responsible for the sweeping changes in our industrial-military posture, has been the technological revolution during recent decades.

In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.

Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.

The prospect of domination of the nation’s scholars by Federal employment, project allocations, and the power of money is ever present and is gravely to be regarded.

Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite.

https://www.archives.gov/milestone-documents/president-dwight-d-eisenhowers-farewell-address

(C) 2023 by John F. McGowan, Ph.D.

About Me

John F. McGowan, Ph.D. solves problems using mathematics and mathematical software, including developing gesture recognition for touch devices, video compression and speech recognition technologies. He has extensive experience developing software in C, C++, MATLAB, Python, Visual Basic and many other programming languages. He has been a Visiting Scholar at HP Labs developing computer vision algorithms and software for mobile devices. He has worked as a contractor at NASA Ames Research Center involved in the research and development of image and video processing algorithms and technology. He has published articles on the origin and evolution of life, the exploration of Mars (anticipating the discovery of methane on Mars), and cheap access to space. He has a Ph.D. in physics from the University of Illinois at Urbana-Champaign and a B.S. in physics from the California Institute of Technology (Caltech).

Was the Manhattan Project a Fluke?


This video argues that the Manhattan Project, which developed the first atomic bombs and nuclear reactors during World War II, was a fluke, not representative of what can be accomplished with Big Science programs. There have been many failed “New Manhattan Projects” since World War II.

Minor Correction: Trinity, the first atomic bomb test, took place on July 16, 1945 — not in May of 1945 as stated in the audio.

(C) 2019 by John F. McGowan, Ph.D.


Lost in Math: The New York Times Op-Ed


In July of last year, I wrote a review, “The Perils of Particle Physics,” of Sabine Hossenfelder’s book Lost in Math: How Beauty Leads Physics Astray (Basic Books, June 2018). Lost in Math is a critical account of the disappointing progress in fundamental physics, primarily particle physics and cosmology, since the formulation of the “standard model” in the 1970’s.


Dr. Hossenfelder has followed up her book with an editorial “The Uncertain Future of Particle Physics” in The New York Times (January 23, 2019) questioning the wisdom of funding CERN’s recent proposal to build a new particle accelerator, the Future Circular Collider (FCC), estimated to cost over $10 billion. The editorial has in turn produced the predictable howls of outrage from particle physicists and their allies:

Letters to the New York Times from theoretical physicist and science popularizer Jeremy Bernstein and Harvard Physics Professor Lisa Randall

The Worth of Physics Research

Physicists take issue with an Op-Ed article arguing against expensive upgrades to the super collider at CERN.

An article in Slate:

Particle Physics Is Doing Just Fine

In science, lack of discovery can be just as instructive as discovery.

By Chanda Prescod-Weinstein and Tim M.P. Tait

And apparently informal criticism of Dr. Hossenfelder during a recent colloquium and presumably on the physics “grapevine”:

“Maybe I’m crazy”, Blog Post, February 4, 2019

“Particle physicists surprised to find I am not their cheer-leader”, Blog Post, February 2, 2019

Probably there will be additional fireworks.

My original review of Lost in Math covers many points relevant to the editorial. A few additional comments related to particle accelerators:

Particle physics is heavily influenced by the ancient idea of atoms (found, for example, in Plato’s Timaeus, written about 360 B.C.) — that matter is composed of tiny fundamental building blocks, also known as particles. The idea of atoms proved fruitful in understanding chemistry and other phenomena in the 19th century and early 20th century.

In due course, experiments with radioactive materials and early precursors of today’s particle accelerators were seemingly able to break the atoms of chemistry into smaller building blocks: electrons and the atomic nucleus, itself composed of protons and neutrons presumably held together by exchanges of mesons such as the pion. The main flaw in the building block model of chemical atoms was the evident “quantum” behavior of electrons and photons (light), the mysterious wave-particle duality quite unlike the behavior of macroscopic particles like billiard balls.

Given this success, it was natural to try to break the protons, neutrons and electrons into even smaller building blocks. This required and justified much larger, more powerful, and increasingly more expensive particle accelerators.

The problem, or potential problem, is that this approach never actually broke the sub-atomic particles into smaller building blocks. The electron seems to be a point “particle” that clearly exhibits puzzling quantum behavior unlike any macroscopic particle, from tiny grains of sand to giant planets.

The proton and neutron never shattered into constituents even though they are clearly not point particles. They seem more like small blobs or vibrating strings of fluid or elastic material. Pumping more energy into them in particle accelerators simply produced more exotic particles, a puzzling sub-atomic zoo. This led to theories like nuclear democracy and Regge poles that interpreted the strongly interacting particles (strong here referring to the strong nuclear force that binds the nucleus together and powers both the Sun and nuclear weapons) as vibrating strings of some sort. The plethora of mesons and baryons was explained as excited states of these strings — that is, as excitations of low energy “particles” such as the neutron, the proton, and the pion.

However, some of the experiments observed electrons scattering off protons (the nucleus of the most common type of hydrogen atom is a single proton) at sharp angles, as if the electron had hit a small “hard” charged particle, not unlike an electron, inside the proton. These scattering centers, dubbed “partons,” were eventually interpreted as the quarks of the reigning “standard model” of particle physics.

Unlike the proton, neutron, and electron in chemical atoms, the quarks have never been successfully isolated or extracted from sub-nuclear particles such as the proton or neutron. This eventually led to theories in which the force between the quarks grows stronger with increasing distance, mediated by some sort of string-like tube of field lines (for lack of better terminology) that never breaks however far it is stretched.
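This picture can be made a bit more concrete with a textbook example not mentioned in the original posts: the phenomenological “Cornell” potential used in quark models, which combines a Coulomb-like attraction at short distances with a linearly rising confining term,

V(r) \approx -\frac{4}{3}\,\frac{\alpha_s \hbar c}{r} + \kappa\, r

where \alpha_s is the strong coupling and \kappa (the “string tension”) is roughly 1 GeV per femtometer. Because the linear term grows without bound, separating the quarks costs ever more energy, which is the quantitative counterpart of a string-like tube of field lines that never breaks.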

Particles All the Way Down

There is an old joke regarding the theory of a flat Earth. The Earth is supported on the back of a turtle. The turtle in turn is supported on the back of a bigger turtle. That turtle stands on the back of a third turtle and so on. It is “Turtles all the way down.” This phrase is shorthand for a problem of infinite regress.

For particle physicists, it is “particles all the way down”. Each new layer of particles is presumably composed of still smaller particles. Chemical atoms are composed of protons and neutrons in the nucleus and orbiting (sort of) electrons. Protons and neutrons are composed of quarks, although we can never isolate them. Arguably the quarks are constructed from something smaller still, although the favored theories like supersymmetry have gone off in hard-to-understand multidimensional directions.

“Particles all the way down” provides an intuitive justification for building ever larger, more powerful, and more expensive particle accelerators and colliders to repeat the success of the atomic theory of matter and radioactive elements at finer and finer scales.

However, there are other ways to look at the data. Namely, the strongly interacting particles — the neutron, the proton, and the mesons like the pion — are some sort of vibrating quantum mechanical “strings” of a vaguely elastic material. Pumping more energy into them through particle collisions produces excitations — various sorts of vibrations, rotations, and kinks or turbulent eddies in the strings.

The kinks or turbulent eddies act as small localized scattering centers that can never be extracted independently from the strings — just like quarks.

In this interpretation, strongly interacting particles such as the proton, and possibly even weakly interacting, seemingly point-like particles such as the electron (weak here referring to the weak nuclear force responsible for many radioactive decays, such as the carbon-14 decay used in radiocarbon dating), are composed of a primal material.

In this latter case, ever more powerful accelerators will only create ever more complex excitations — vibrations, rotations, kinks, turbulence, etc. — in the primal material. These excitations are not building blocks of matter that give fundamental insight.

One needs rather to find the possible mathematics describing this primal material, perhaps a modified wave equation with non-linear terms for a viscous fluid or quasi-fluid. Einstein, de Broglie, and Schrödinger were looking at something like this to explain and derive quantum mechanics and to put the pilot wave theory of quantum mechanics on a deeper basis.
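Purely as an illustration of the kind of candidate equation meant here (my example; nothing this specific is proposed in the post), one can write a nonlinear Schrödinger-type wave equation of the sort studied in fluid-like and pilot wave models:

i\hbar\,\frac{\partial \psi}{\partial t} = -\frac{\hbar^2}{2m}\,\nabla^2 \psi + V(\mathbf{x})\,\psi + g\,|\psi|^2\,\psi

The ordinary linear Schrödinger equation is the special case g = 0. The cubic term is just one member of the infinite family of possible modifications discussed in the next paragraph, and any such term would have to be tested against existing data.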

A critical problem is that an infinity of possible modified wave equations exist. At present it remains a manual process to formulate such equations and test them against existing data — a lengthy trial and error process to find a specific modified wave equation that is correct.

This is a problem shared with mainstream approaches such as supersymmetry, hidden dimensions, and so forth. Even with thousands of theoretical physicists today, it is time consuming and perhaps intractable to search the infinite space of possible mathematics and find a good match to reality. This is the problem that we are addressing at Mathematical Software with our Math Recognition technology.

(C) 2019 by John F. McGowan, Ph.D.


The Perils of Particle Physics


Sabine Hossenfelder’s Lost in Math: How Beauty Leads Physics Astray (Basic Books, June 2018) is a critical account of the disappointing progress in fundamental physics, primarily particle physics and cosmology, since the formulation of the “standard model” in the 1970’s.  It focuses on the failure to find new physics at CERN’s $13.25 billion Large Hadron Collider (LHC) and many questionable predictions that super-symmetric particles, hidden dimensions, or other exotica beloved of theoretical particle physicists would be found at LHC when it finally turned on.  In many ways, this lack of progress in fundamental physics parallels and perhaps underlies the poor progress in power and propulsion technologies since the 1970s.

Lost in Math joins a small but growing collection of popular and semi-popular books and personal accounts critical of particle physics, including David Lindley’s 1994 The End of Physics: The Myth of a Unified Theory, Lee Smolin’s The Trouble with Physics: The Rise of String Theory, the Fall of Science and What Comes Next, and Peter Woit’s Not Even Wrong: The Failure of String Theory and the Search for Unity in Physical Law. It has many points in common with these earlier books. Indeed, Peter Woit is quoted on the back cover and Lee Smolin is listed in the acknowledgements as a volunteer who read drafts of the manuscript. Anyone considering prolonged involvement, e.g. graduate school, or a career in particle physics should read Lost in Math as well as these earlier books.

The main premise of Lost in Math is that theoretical particle physicists like the author have been led astray by an unscientific obsession with mathematical “beauty” in selecting, and also in refusing to abandon, theories, notably super-symmetry (usually abbreviated as SUSY in popular physics writing), despite an embarrassing lack of evidence. The author groups several different issues together under the rubric of “beauty”: the use of the terms beauty and elegance by theoretical physicists, at least two kinds of “naturalness,” the “fine tuning” of the constants in a theory to make it consistent with life, the desire for simplicity, dissatisfaction with the complexity of the standard model (twenty-five “fundamental” particles and a complex Lagrangian that fills two pages of fine print in a physics textbook), doubts about renormalization — an ad hoc procedure for removing otherwise troubling infinities — in Quantum Field Theory (QFT), and questions about “measurement” in quantum mechanics. Although I agree with many points in the book, I feel the blanket attack on “beauty” is too broad, conflates several different issues, and misses the mark.

In Defense of “Beauty”

As the saying goes, beauty is in the eye of the beholder. The case for simplicity, or more accurately falsifiability, in mathematical models rests on a sounder, more objective basis than beauty, however. In many cases a complex model with many terms and adjustable parameters can fit many different data sets. Some models are highly plastic: they can fit almost any data set, not unlike the way saran wrap can fit almost any surface. These models are wholly unfalsifiable.

A mathematical model which can match any data set cannot be disproven.  It is not falsifiable.  A theory that predicts everything, predicts nothing.

Some models are somewhat plastic, able to fit many but not all data sets, not unlike a rubber sheet.  They are hard to falsify — somewhat unfalsifiable.  Some models are quite rigid, like a solid piece of stone fitting into another surface.  These models are fully falsifiable.

A simple well-known example of this problem is a polynomial with many terms. A polynomial with enough terms can match any finite data set. In general, the fitted model will fail to extrapolate, that is, to predict data points outside the domain of the data set used in the model fitting (the training set, in the terminology of neural networks). The fitted polynomial model will frequently interpolate correctly, predicting data points within the domain of the data set used in the model fitting — points near and in-between the training set data points. Thus, a polynomial model with enough terms is not falsifiable in the sense of the philosopher of science Karl Popper, because it can fit many possible data sets, not just the data set we actually have (the real data).
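A minimal numerical sketch of this point, using NumPy (my own toy example; nothing like it appears in the original post): fit a degree-nine polynomial to ten noisy samples of a sine wave, then compare its worst-case error inside and outside the training domain.

import numpy as np

# Toy demonstration: a polynomial with one coefficient per data point
# matches the training data but extrapolates badly.
rng = np.random.default_rng(0)

x_train = np.linspace(0.0, 1.0, 10)
y_train = np.sin(2.0 * np.pi * x_train) + 0.05 * rng.standard_normal(10)

coeffs = np.polyfit(x_train, y_train, deg=9)  # degree 9: one coefficient per point

x_inside = np.linspace(0.05, 0.95, 50)   # interpolation: inside the training domain
x_outside = np.linspace(1.1, 2.0, 50)    # extrapolation: outside the training domain

err_inside = np.max(np.abs(np.polyval(coeffs, x_inside) - np.sin(2.0 * np.pi * x_inside)))
err_outside = np.max(np.abs(np.polyval(coeffs, x_outside) - np.sin(2.0 * np.pi * x_outside)))

print("max error inside the training domain: ", err_inside)
print("max error outside the training domain:", err_outside)  # typically orders of magnitude larger

The error inside the training domain stays small compared to the error outside it, which is larger by orders of magnitude; that gap is the sense in which such a flexible model has little predictive power beyond the data it was fit to.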

This problem with complex mathematical models was probably first encountered with models of planetary motion in antiquity: the infamous epicycles of Ptolemy and his predecessors in ancient Greece and probably Babylonia/Sumeria (modern Iraq). Pythagoras visited both Babylonia and Egypt, and the early Greek accounts of his life suggest he brought early Babylonian and Egyptian mathematics and astronomy back to Greece.

Early astronomers, probably first in Babylonia, attempted to model the motion of Mars and other planets through the Zodiac as uniform circular motion around a stationary Earth. This was grossly incorrect in the case of Mars, which appears to back up for about two months roughly every two years. Thus the early astronomers introduced an epicycle for Mars: they speculated that Mars moved in uniform circular motion around a point that in turn moved in uniform circular motion around the Earth. With a single epicycle they could reproduce this backing up every two years, with some errors. To achieve greater accuracy, they added more and more epicycles, producing an ever more complex model that had some predictive power. Indeed, the state-of-the-art Ptolemaic model in the sixteenth century was better than Copernicus’ new heliocentric model, which also relied on uniform circular motion and epicycles.
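For concreteness, a minimal deferent-plus-epicycle model of a planet’s apparent position as seen from a stationary Earth can be written (in modern notation; the ancient astronomers worked geometrically, not with formulas) as:

x(t) = R\cos(\Omega t) + r\cos(\omega t), \qquad y(t) = R\sin(\Omega t) + r\sin(\omega t)

Here R and \Omega describe the large circle (the deferent) and r and \omega the epicycle riding on it. Each additional epicycle adds another adjustable radius and frequency, essentially building up a Fourier-like series for the planet’s motion.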

The Ptolemaic model of planetary motion is difficult to falsify because one can keep adding more epicycles to account for discrepancies between the theory and observation.  It also has some predictive power.  It is an example of a “rubber sheet” model, not a “saran wrap” model.

In the real world, falsifiability is not a simple binary criterion. A mathematical model is not either falsifiable and therefore good or unfalsifiable and therefore bad. Rather, falsifiability falls on a continuum. In general, extremely complex theories are hard to falsify and are not predictive outside of the domain of the data used to infer (fit) them. Simpler theories tend to be easier to falsify and, if correct, are sometimes very predictive, as with Kepler’s Laws of Planetary Motion and subsequently Newton’s Law of Gravitation, from which Kepler’s Laws can be derived.
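As a reminder of how rigid such a theory is, here is the standard one-line derivation of Kepler’s third law from Newton’s Law of Gravitation for the special case of a circular orbit, equating the gravitational and centripetal forces:

\frac{G M m}{r^2} = m\left(\frac{2\pi}{T}\right)^2 r \quad\Longrightarrow\quad T^2 = \frac{4\pi^2}{G M}\, r^3

Once G and the Sun’s mass M are fixed, there is nothing left to adjust: every planet’s period is pinned to its orbital radius, so a single clear exception would falsify the law.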

Unfortunately, this experience with mathematical modeling is widely known but has not been quantified in a rigorous way by mathematicians and scientists. Falsifiability remains a slogan primarily used against creationists, parapsychologists, and other such groups rather than a rigorous criterion for evaluating theories like the standard model, supersymmetry, or superstrings.

A worry about the standard model, with its twenty-five fundamental particles, complex two-page Lagrangian (mathematical formula), and seemingly ad hoc elements such as the Higgs particle and the Kobayashi-Maskawa matrix, is that it may be matching real data entirely or in part due to its complexity and inherent plasticity, much like the historical epicycles or a polynomial with many terms. This concern is not just about subjective “beauty.”

Sheldon Glashow’s original formulation of what became the modern standard model was much simpler: it did not include the Higgs particle, did not include the charm, top, or bottom quarks, and lacked a number of other elements (S. L. Glashow, “Partial-symmetries of weak interactions,” Nuclear Physics 22 (4): 579–588, 1961). Much as epicycles were added to the early theories of planetary motion, these elements were added during the 1960’s and 1970’s to achieve agreement with experimental results and theoretical prejudices. In evaluating the seeming success and falsifiability of the standard model, we need to consider not only the terms that were added over the decades but also the terms that might have been added to salvage the theory.

Theories with symmetry have fewer adjustable parameters and are less plastic and flexible, less able to match the data regardless of what data is presented. This forms an objective but poorly quantified basis for intuitive notions of the “mathematical beauty” of symmetry in physics and other fields.
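A toy example of this point (mine, not from the book): take a general quartic potential for a single field \varphi and impose the reflection symmetry \varphi \to -\varphi.

V(\varphi) = a + b\,\varphi + c\,\varphi^2 + d\,\varphi^3 + e\,\varphi^4 \quad\longrightarrow\quad V(\varphi) = a + c\,\varphi^2 + e\,\varphi^4

The symmetry forbids the odd terms, cutting the number of adjustable parameters from five to three, so a data set that demands a nonzero linear or cubic term falsifies the symmetric theory outright.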

The problem is that although we can describe this known issue of poor falsifiability or plasticity in mathematical models and modeling (at the most extreme, an ability to fit any data set) qualitatively, with words such as “beauty” or “symmetry” or “simplicity,” we cannot yet express it in rigorous quantitative terms.

Big Science and Big Bucks

Much of the book concerns the way the Large Hadron Collider and its huge budget warped the thinking and research results of theoretical physicists, rewarding some, like Nima Arkani-Hamed, who could produce catchy arguments that new physics would be found at the LHC, and encouraging many more to produce questionable arguments that super-symmetry, hidden dimensions, or other glamorous exotica would be discovered. The author recounts how her Ph.D. thesis supervisor redirected her research to a topic, “Black Holes in Large Extra Dimensions” (2003), that would support the LHC.

Particle accelerators and other particle physics experiments have a long history of huge cost and schedule overruns — which are generally omitted or glossed over in popular and semi-popular accounts.  The not-so-funny joke that I learned in graduate school was “multiply the schedule by pi (3.14)” to get the real schedule.  A variant was “multiply the schedule by pi for running around in a circle.”  Time is money and the huge delays usually mean huge cost overruns.  Often these have involved problems with the magnets in the accelerators.

The LHC was no exception to this historical pattern. It went substantially over budget and schedule before its first turn-on in 2008, when a faulty electrical connection caused an explosive helium release that damaged dozens of the multi-billion dollar accelerator’s superconducting magnets, forcing expensive and time consuming repairs (see CERN’s whitewash of the disaster here). The LHC faced significant criticism over the cost overruns in Europe even before the 2008 magnet explosion. The reported discovery of the Higgs boson in 2012 has substantially blunted the criticism; one could argue the LHC had to make a discovery. 🙂

The cost and schedule overruns have contributed to the cancellation of several accelerator projects including ISABELLE at the Brookhaven National Laboratory on Long Island and the Superconducting Super Collider (SSC) in Texas.  The particle physics projects must compete with much bigger, more politically connected, and more popular programs.

The frequent cost and schedule overruns mean that pursuing a Ph.D. in experimental particle physics often takes much longer than advertised and is often quite disappointing, as happened to large numbers of LHC graduate students. For theorists, the pressure to provide a justification for the multi-billion dollar projects is undoubtedly substantial.

While genuine advances in fundamental physics may ultimately produce new energy technologies or other advances that will benefit humanity greatly, the billions spent on particle accelerators and other big physics experiments are certain, here and now. The aging faculty at universities and the senior scientists at the few research labs like CERN who largely control the direction of particle physics cannot easily retrain for new fields, unlike disappointed graduate students or post-docs in their twenties and early thirties. Hot new fields like computing and high-tech employers such as Google are noted for their preference for twenty-somethings and their hostility to employees even in their thirties. The existing energy industry seems remarkably unconcerned about alleged “peak oil” or climate change and empirically invests little, if anything, in finding replacement technologies.

Is there a way forward?

Dr. Hossenfelder, who writes on her blog that she is probably leaving particle physics soon, offers some suggestions to improve the field, primarily focusing on learning about and avoiding cognitive biases. This reminds me a bit of the unconscious bias training that Google and other Silicon Valley companies have embraced in a purported attempt to fix their seeming avoidance of employees from certain groups — with dismal results so far. Responding rationally, if perhaps unethically, to clear economic rewards is not a cognitive bias, and it almost certainly won’t be fixed by cognitive bias training. If I learn that I am unconsciously doing something because it is in my economic interest to do so, will I stop?

Future progress in fundamental physics probably depends on finding new informative data that does not cost billions of dollars (for example, a renaissance of table top experiments), reanalysis of existing data, and improved methods of data analysis such as putting falsifiability on a rigorous quantitative basis.

(C) 2018 by John F. McGowan, Ph.D.
