## Why Should You Learn Mathematics?

Why should you learn mathematics?  By mathematics, I am not referring to basic arithmetic: addition, subtraction, multiplication, division, and raising a number to a power — for example, for an interest calculation in personal finance.  There is little debate that in the modern world the vast majority of people need to know basic arithmetic to buy and sell goods and services and perform many other common tasks.  By mathematics I mean more advanced mathematics such as algebra, geometry, trigonometry, calculus, linear algebra, and college level statistics.

I am not referring to highly specialized advanced areas of mathematics such as number theory or differential geometry generally taught after the sophomore year in college or in graduate school.

I am following the language of Andrew Hacker in his book The Math Myth: And Other STEM Delusions, in which he argues that the algebra requirement should be eliminated in high schools, community colleges, and universities except for degrees that genuinely require mathematics.  Hacker draws a distinction between arithmetic, which is clearly needed by all, and mathematics such as algebra, which few use professionally.

A number of educators such as Eloy Ortiz Oakley, the chancellor of California’s community colleges, have embraced a similar view, even arguing that abolishing the algebra requirement is a civil rights issue since some minority groups fail the algebra requirement at higher rates than white students.  Yes, he did say it is a civil rights issue:

The second thing I’d say is yes, this is a civil rights issue, but this is also something that plagues all Americans — particularly low-income Americans. If you think about all the underemployed or unemployed Americans in this country who cannot connect to a job in this economy — which is unforgiving of those students who don’t have a credential — the biggest barrier for them is this algebra requirement. It’s what has kept them from achieving a credential.

Eloy Ortiz Oakley on NPR (Say Goodbye to X + Y: Should Community Colleges Abolish Algebra?  July 19, 2017)

At present, few jobs, including the much ballyhooed software development jobs, require more than basic arithmetic as defined above.  For example, the famous code.org “What Most Schools Don’t Teach” video on coding features numerous software industry luminaries assuring the audience how easy software development is and how little math is involved.  Notably, Bill Gates at one minute and forty-eight seconds says: “addition, subtraction…that’s about it.”

Bill Gates’s assessment of the math required in software development today is largely true, unless you are one of the few percent of software developers working on highly mathematical software: video codecs, speech recognition engines, gesture recognition algorithms, computer graphics for games and video special effects, GPS, Deep Learning, FDA drug approvals, and other exotic areas.

Thus, the question arises why people who do not use mathematics professionally ought to learn mathematics.  I am not addressing the question of whether there should be a requirement to pass algebra to graduate high school, or for a college degree such as a veterinary degree where there is no professional need for mathematics.  The question is whether people who do not need mathematics professionally should still learn mathematics — whether it is required or not.

People should learn mathematics because they need mathematics to make informed decisions about their health care, their finances, public policy issues that affect them such as global warming, and engineering issues such as the safety of buildings, aircraft, and automobiles — even though they don’t use mathematics professionally.

The need to understand mathematics to make informed decisions is increasing rapidly with the proliferation of “big data” and “data science” in recent years: the use and misuse of statistics and mathematical modeling on the large, rapidly expanding quantities of data now being collected with extremely powerful computers, high speed wired and wireless networks, cheap data storage capacity, and inexpensive miniature sensors.

Health and Medicine

An advanced knowledge of statistics is required to evaluate the safety and effectiveness of drugs, vaccines, medical treatments and devices, including widely used prescription drugs. A study by the Mayo Clinic in 2013 found that nearly 7 in 10 Americans (70%) take at least one prescription drug. Another study published in the Journal of the American Medical Association (JAMA) in 2015 estimated that about 59% of Americans are taking a prescription drug. Taking a prescription drug can be a life and death decision, as the horrific case of the deadly pain reliever Vioxx discussed below illustrates.

The United States and the European Union have required randomized clinical trials and detailed sophisticated statistical analyses to evaluate the safety and effectiveness of drugs, medical devices, and treatments for many decades.  Generally, these analyses are performed by medical and pharmaceutical companies who have an obvious conflict of interest.  At present, doctors and patients often find themselves outmatched in evaluating the claims for the safety and effectiveness of drugs, both new and old.

In the United States, at least thirty-five FDA approved drugs have been withdrawn due to serious safety problems, generally killing or contributing to the deaths of patients taking the drugs.

The FDA has instituted the FDA Adverse Event Reporting System (FAERS) for doctors and other medical professionals to report deaths and serious health problems, such as hospitalization, suspected of being caused by adverse reactions to drugs. In 2014, 123,927 deaths and 807,270 serious health problems were reported to FAERS. Of course, suspicion is not proof, and a report does not necessarily mean the reported drug was the cause of the adverse event.

Vioxx (generic name rofecoxib) was a pain-killer marketed by the giant pharmaceutical company Merck (NYSE:MRK) between May of 1999, when it was approved by the United States Food and Drug Administration (FDA), and September of 2004, when it was withdrawn from the market. Vioxx was marketed as a “super-aspirin,” allegedly safer and implicitly more effective than aspirin and much more expensive, primarily to elderly patients with arthritis or other chronic pain. Vioxx was a “blockbuster” drug with sales peaking at about $2.5 billion in 2003 [1] and about 20 million users [2]. Vioxx probably killed between 20,000 and 100,000 patients between 1999 and 2004 [3].

Faulty blood clotting is thought to be the main cause of most heart attacks and strokes. Unlike aspirin, which lowers the probability of blood coagulation (clotting) and therefore of heart attacks and strokes, Vioxx increased the probability of blood clotting and increased the probability of strokes and heart attacks by about two to five times.

Remarkably, Merck proposed and the FDA approved Phase III clinical trials of Vioxx with too few patients to show that Vioxx was actually safer than the putative rate of 3.8 deaths per 10,000 patients (16,500 deaths per year according to a controversial study used to promote Vioxx) from aspirin and other non-steroidal anti-inflammatory drugs (NSAIDs) such as ibuprofen (the active ingredient in Advil and Motrin), naproxen (the active ingredient in Aleve), and others.
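The inadequacy of the trial sizes can be checked with a standard sample-size computation for comparing two proportions. The sketch below uses only the Python standard library; the 5% significance level and 80% power are my illustrative assumptions, not figures from the FDA guideline. It estimates how many patients per arm would be needed to reliably detect a doubling of a baseline death rate of 3.8 per 10,000:

```python
from math import sqrt
from statistics import NormalDist

def patients_per_arm(p0, p1, alpha=0.05, power=0.80):
    """Approximate number of patients needed in each arm of a two-arm
    trial to detect a change in event rate from p0 to p1, using the
    standard normal approximation for comparing two proportions."""
    z_alpha = NormalDist().inv_cdf(1.0 - alpha / 2.0)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)               # desired power
    p_bar = (p0 + p1) / 2.0
    numerator = (z_alpha * sqrt(2.0 * p_bar * (1.0 - p_bar))
                 + z_beta * sqrt(p0 * (1.0 - p0) + p1 * (1.0 - p1))) ** 2
    return numerator / (p1 - p0) ** 2

baseline = 3.8 / 10_000   # putative death rate from aspirin and other NSAIDs
doubled = 2.0 * baseline  # a doubling of the baseline death rate

print(round(patients_per_arm(baseline, doubled)), "patients per arm")
```

Under these assumptions the answer comes out on the order of 60,000 patients per arm, dwarfing the roughly 1,500 patients contemplated by the guideline for treatments of six months or less.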
The FDA guideline, Guideline for Industry: The Extent of Population Exposure to Assess Clinical Safety: For Drugs Intended for Long-Term Treatment of Non-Life-Threatening Conditions (March 1995), only required enough patients in the clinical trials to reliably detect a risk of death of about 0.5 percent (50 deaths per 10,000) in patients treated for six months or less (roughly equivalent to a one percent death rate for one year, assuming a constant risk level) and about 3 percent (300 deaths per 10,000) for one year (recommending about 1,500 patients for six months or less and about 100 patients for at least one year, without supporting statistical power computations and assumptions in the guideline document). The implicit death rate detection threshold in the FDA guideline was well above the risk from aspirin and other NSAIDs and at the upper end of the rate of cardiovascular “events” caused by Vioxx. The FDA did not tighten these requirements for Vioxx even though the only good reason for the drug was improved safety compared to aspirin and other NSAIDs.

In general, the randomized clinical trials required by the FDA for drug approval have too few patients (insufficient statistical power, in statistics terminology) to detect these rare but deadly events [4]. To this day, most doctors and patients lack the statistical skills and knowledge to evaluate the safety level that can be inferred from the FDA-required clinical trials. There are many other advanced statistical issues in evaluating the safety and effectiveness of drugs, vaccines, medical treatments, and devices.

Finance and Real Estate

Mathematical models have spread far and wide in finance and real estate, often behind the scenes, invisible to casual investors. A particularly visible example is Zillow’s Zestimate of the value of homes, consulted by home buyers and sellers every day. Zillow is arguably the leading online real estate company.
In March 2014, Zillow had over one billion page views, beating competitors Trulia.com and Realtor.com by a wide margin; Zillow has since acquired Trulia. According to a 2013 Gallup poll, sixty-two percent (62%) of Americans say they own their home. According to a May 2014 study by the Consumer Financial Protection Bureau, about eighty percent (80%) of Americans 65 and older own their home. Homes are a large fraction of personal wealth and retirement savings for a large percentage of Americans.

Zillow’s algorithm for valuing homes is proprietary; Zillow does not disclose the details or the source code. Zillow hedges by calling the Zestimate an “estimate” or a “starting point.” It is not an appraisal. However, Zillow is large and widely used, claiming estimates for about 110 million homes in the United States. That is almost the total number of homes in the United States. There is the question of whether it is so large and influential that it can effectively set the market price.

Zillow makes money by selling advertising to realty agents. Potential home buyers don’t pay for the estimates. Home sellers and potential home sellers don’t pay directly for the estimates either. This raises the question of whether the advertising business model might create an incentive for a systematic bias in the estimates. One could argue that a lower valuation would speed sales and increase commissions for agents.

Zillow was recently sued in Illinois over the Zestimate by a homeowner (real estate lawyer Barbara Andersen 🙂) claiming the estimate undervalued her home and therefore made it difficult to sell the home. The suit argues that the estimate is in fact an appraisal, despite claims to the contrary by Zillow, and therefore subject to Illinois state regulations regarding appraisals. Andersen has reportedly dropped this suit and expanded to a class-action lawsuit by home builders in Chicago, again alleging that the Zestimate is an appraisal and undervalues homes.
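A first step in independently checking an automated valuation is to compare its estimates against subsequent sale prices of the same homes, looking at both the median absolute error and the median signed error (which reveals systematic over- or under-valuation). A minimal sketch; the numbers below are hypothetical, not actual Zestimates:

```python
from statistics import median

# Hypothetical (estimate, actual sale price) pairs -- illustrative only.
sales = [
    (450_000, 430_000),
    (310_000, 335_000),
    (1_200_000, 1_150_000),
    (275_000, 300_000),
    (640_000, 610_000),
]

# Signed percentage error: negative means the estimate was below the sale price.
signed = [(est - price) / price for est, price in sales]
absolute = [abs(e) for e in signed]

print(f"median absolute error: {median(absolute):.1%}")
print(f"median signed error:   {median(signed):+.1%}")
```

A serious evaluation would need thousands of matched sales, controls for region and time period, and a check on whether the signed error is consistently negative (systematic undervaluation) or positive.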
The accuracy of Zillow’s estimate has been questioned by others over the years, including home owners, real estate agents, brokers, and the Los Angeles Times. Complaints often seem to involve alleged undervaluation of homes. On the other hand, Zillow CEO Spencer Rascoff’s Seattle home reportedly sold for $1.05 million on Feb. 29, 2016, 40 percent less than the Zestimate of $1.75 million shown on its property page a day later (March 1, 2016). 🙂

As in the example of Vioxx and other FDA drug approvals, it is actually a substantial statistical analysis project to independently evaluate the accuracy of Zillow’s estimates. What do you do if Zillow substantially undervalues your home when you need to sell it?

Murky mathematical models of the value of mortgage backed securities played a central role in the financial crash of 2008. In this case, the models were hidden behind the scenes, invisible to casual home buyers or other investors. Even if you are aware of these models, how do you properly evaluate their effect on your investment decisions?

Public Policy

Misleading and incorrect statistics have a long history in public policy and government. Darrell Huff’s classic How to Lie With Statistics (1954) is mostly concerned with misleading and false polls, statistics, and claims from American politics in the 1930’s and 1940’s. It remains in print, popular and relevant today. Increasingly, however, political controversies involve often opaque computerized mathematical models rather than the relatively simple counting statistics debunked in Huff’s classic book. Huff’s classic and the false or misleading counting statistics in it generally required only basic arithmetic to understand. Modern political controversies such as Value Added Models for teacher evaluation and the global climate models used in the global warming controversy go far beyond basic arithmetic and simple counting statistics.
The Misuse of Statistics and Mathematics

Precisely because many people are intimidated by mathematics and had difficulty with high school or college mathematics classes, including failing the courses, statistics and mathematics are often used to exploit and defraud people. Often the victims are the poor, marginalized, and poorly educated. Mathematician Cathy O’Neil gives many examples of this in her recent book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (2016).

The misuse of statistics and mathematics is not limited to poor victims. Bernie Madoff successfully conned large numbers of wealthy, highly educated investors in both the United States and Europe, using the arcane mathematics of options as a smokescreen. These sophisticated investors were often unable to perform the sort of mathematical analysis that would have exposed the fraud. Rich and poor alike need to know mathematics to protect themselves from this frequent and growing misuse of statistics and mathematics.

Algebra and College Level Statistics

The misleading and false counting statistics lampooned by Darrell Huff in How to Lie With Statistics do not require algebra or calculus to understand. In contrast, the college level statistics often encountered in more complex issues today does require a mastery of algebra and sometimes calculus. For example, one of the most common probability distributions encountered in real data and mathematical models is the Gaussian, better known as the Normal Distribution or Bell Curve. This is the common expression for the Gaussian in algebraic notation:

$P(x) = \frac{1}{\sigma \sqrt{2 \pi}} e^{-\left( x - \mu \right)^2 / 2 \sigma^2}$

$x$ is the position of the data point. $\mu$ is the mean of the distribution.
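For readers who find code easier to parse than algebraic notation, the expression above translates directly into a few lines of Python (a sketch using only the standard library; the function name is mine):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """The Normal Distribution (Gaussian) density at point x,
    with mean mu and standard deviation sigma."""
    coefficient = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    exponent = -((x - mu) ** 2) / (2.0 * sigma ** 2)
    return coefficient * math.exp(exponent)

# The density peaks at the mean and falls off symmetrically on either side.
print(normal_pdf(0.0))                     # peak of the standard Normal, about 0.3989
print(normal_pdf(1.0), normal_pdf(-1.0))   # equal values: the curve is symmetric
```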
If I have a data set obeying the Normal Distribution, most of the data points will be near the mean $\mu$ and fewer further away. $\sigma$ is the standard deviation (loosely, the width) of the distribution. $\pi$ is the ratio of the circumference of a circle to its diameter. $e$ is Euler’s number (about 2.718281828459045).

[Figure: histogram of simulated data following the Normal Distribution/Bell Curve/Gaussian with a mean $\mu$ of zero (0.0) and a standard deviation $\sigma$ of one (1.0)]

To truly understand the Normal Distribution you need to know Euler’s number $e$ and algebraic notation and symbolic manipulation. It is very hard to express the Normal Distribution in English words or basic arithmetic. The Normal Distribution is just one example of the use of algebra in college level statistics. In fact, an understanding of calculus is needed to have a solid understanding and mastery of college level statistics.

Conclusion

People should learn mathematics, meaning subjects beyond basic arithmetic such as algebra, geometry, trigonometry, calculus, linear algebra, and college level statistics, to make informed decisions about their health care, personal finances and retirement savings, important public policy issues such as teacher evaluation and public education, and other key issues such as evaluating the safety of buildings, airplanes, and automobiles.

There is no doubt that many people experience considerable difficulty learning mathematics, whether due to poor teaching, inadequate learning materials or methods, or other causes. There is and has been heated debate over the reasons. These difficulties are not an argument for not learning mathematics. Rather, they are an argument for finding better methods to learn and teach mathematics for everyone.
End Notes

[1] “How did Vioxx debacle happen?” by Rita Rubin, USA Today, October 12, 2004: “The move was a stunning denouement for a blockbuster drug that had been marketed in more than 80 countries with worldwide sales totaling $2.5 billion in 2003.”

[2] Several estimates of the number of patients killed and seriously harmed by Vioxx were made. Dr. David Graham’s November 2004 testimony to the US Senate Finance Committee gives several estimates, including his own.

Top software engineers seem to be bringing in a base salary of around $150,000 in Silicon Valley: http://spectrum.ieee.org/view-from-the-valley/at-work/tech-careers/a-snapshot-of-software-engineering-salaries-at-silicon-valley-startups

There is always the question of stock options, RSUs (restricted stock units), and cash bonuses, which can sometimes boost the base salary significantly. Keep in mind the Silicon Valley/San Francisco Bay Area is very expensive, with some of the highest home prices and apartment rental rates in the United States. The salaries are still attractive but not nearly as large as they sound if you are from an inexpensive region like Texas. The bottom line is to be very cautious about paying large sums of money for coding bootcamps or other non-traditional education.

(C) 2017 John F. McGowan, Ph.D.

About the Author

John F. McGowan, Ph.D. solves problems using mathematics and mathematical software, including developing gesture recognition for touch devices, video compression and speech recognition technologies. He has extensive experience developing software in C, C++, MATLAB, Python, Visual Basic and many other programming languages. He has been a Visiting Scholar at HP Labs developing computer vision algorithms and software for mobile devices. He has worked as a contractor at NASA Ames Research Center involved in the research and development of image and video processing algorithms and technology. He has published articles on the origin and evolution of life, the exploration of Mars (anticipating the discovery of methane on Mars), and cheap access to space. He has a Ph.D. in physics from the University of Illinois at Urbana-Champaign and a B.S. in physics from the California Institute of Technology (Caltech).
## The Accuracy of Fingerprint Identification in Criminal Cases

In the wake of the Brandon Mayfield case (2004), which raised serious questions about the accuracy of fingerprint identification by the FBI, the National Academy of Sciences was asked to perform a scientific assessment of the accuracy and reliability of latent fingerprint identification in criminal cases. Initial results were published in the Proceedings of the National Academy of Sciences (PNAS):

“Accuracy and reliability of forensic latent fingerprint decisions”
Bradford T. Ulery (a), R. Austin Hicklin (a), JoAnn Buscaglia (b), and Maria Antonia Roberts (c)
PNAS, 7733–7738, doi: 10.1073/pnas.1018707108
Edited by Stephen E. Fienberg, Carnegie Mellon University, Pittsburgh, PA, and approved March 31, 2011 (received for review December 16, 2010)

ABSTRACT

The interpretation of forensic fingerprint evidence relies on the expertise of latent print examiners. The National Research Council of the National Academies and the legal and forensic sciences communities have called for research to measure the accuracy and reliability of latent print examiners’ decisions, a challenging and complex problem in need of systematic analysis. Our research is focused on the development of empirical approaches to studying this problem. Here, we report on the first large-scale study of the accuracy and reliability of latent print examiners’ decisions, in which 169 latent print examiners each compared approximately 100 pairs of latent and exemplar fingerprints from a pool of 744 pairs. The fingerprints were selected to include a range of attributes and quality encountered in forensic casework, and to be comparable to searches of an automated fingerprint identification system containing more than 58 million subjects. This study evaluated examiners on key decision points in the fingerprint examination process; procedures used operationally include additional safeguards designed to minimize errors.
Five examiners made false positive errors for an overall false positive rate of 0.1%. Eighty-five percent of examiners made at least one false negative error for an overall false negative rate of 7.5%. Independent examination of the same comparisons by different participants (analogous to blind verification) was found to detect all false positive errors and the majority of false negative errors in this study. Examiners frequently differed on whether fingerprints were suitable for reaching a conclusion.

http://www.pnas.org/content/108/19/7733.full

Authors

Bradford T. Ulery (a) and R. Austin Hicklin (a): Noblis, 3150 Fairview Park Drive, Falls Church, VA 22042
JoAnn Buscaglia (b): Counterterrorism and Forensic Science Research Unit, Federal Bureau of Investigation Laboratory Division, 2501 Investigation Parkway, Quantico, VA 22135
Maria Antonia Roberts (c): Latent Print Support Unit, Federal Bureau of Investigation Laboratory Division, 2501 Investigation Parkway, Quantico, VA 22135

Whether a 0.1 percent false positive rate is “small” is a subjective value judgement. Would you drive across a bridge that had a 1 in 1,000 (0.1 percent) chance of collapsing and killing you as you drove across it? No, probably not.

In addition, the 0.1 percent false positive rate is based on a small sample of less than 1,000 test cases: 744 pairs of latent and exemplar fingerprints. The Federal fingerprint databases, such as the ones used in the Brandon Mayfield case, have millions of people in them and may eventually have all US citizens (over 300 million people) in them. How does this “small” rate extrapolate when a fingerprint is compared to every fingerprint in the US or the world? One might wonder why such an assessment was not done a long time ago.
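The extrapolation question can be made concrete with a back-of-the-envelope calculation. If each comparison independently carried the study's 0.1 percent false positive rate (independence is a strong simplifying assumption here; real examiner errors are unlikely to be independent), the chance of at least one false match grows rapidly with the number of prints searched:

```python
def prob_false_match(p, n):
    """Probability of at least one false positive in n independent
    comparisons, each with per-comparison false positive rate p."""
    return 1.0 - (1.0 - p) ** n

p = 0.001  # 0.1 percent false positive rate per comparison, from the study
for n in (100, 1_000, 1_000_000):
    print(f"{n:>9} comparisons: {prob_false_match(p, n):.4f}")
```

Under this crude model a search against even a thousand prints is more likely than not to produce at least one false match, and a search against a million prints makes a false match a near certainty, which is why searches of large databases demand far stricter matching thresholds than one-to-one comparisons.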
This is a report on the Brandon Mayfield case: https://oig.justice.gov/special/s0601/exec.pdf

The National Research Council also published a detailed report, Strengthening Forensic Science in the United States: A Path Forward (2009), addressing the scientific issues raised by the Mayfield case and other questions about the scientific validity of forensic science methods. “Fingerprint identification: advances since the 2009 National Research Council report” by Christophe Champod (Philos Trans R Soc Lond B Biol Sci. 2015 Aug 5; 370(1674): 20140259. doi: 10.1098/rstb.2014.0259) has a summary of work on the issue since the 2009 National Research Council report.

The bottom line is that fingerprints are much more accurate than random chance but hardly infallible, as used to be widely believed.

(C) 2017 John F. McGowan, Ph.D.

Credits

The fingerprint image is from the United States National Institute of Standards and Technology (NIST) by way of Wikimedia Commons and is in the public domain.
## Microsoft Layoffs and STEM Shortage Claims (2009-2017)

Microsoft and its former CEO and founder Bill Gates are prominent in claiming that there is a severe shortage of STEM (Science, Technology, Engineering, and Mathematics) workers in the United States. Bill Gates testified on this claim to the House Committee on Science and Technology in 2008:

I know we all want the U.S. to continue to be the world’s center for innovation. But our position is at risk. There are many reasons for this but two stand out. First, U.S. companies face a severe shortfall of scientists and engineers with expertise to develop the next generation of breakthroughs. Second, we don’t invest enough as a nation in the basic research needed to drive long-term innovation.

Bill Gates

Remarkably, Microsoft appears to have laid off about 35,000 of these allegedly rare, difficult to find STEM workers since 2008, with even more planned layoffs announced a few weeks ago.

Microsoft Layoffs

In January 2009, Microsoft announced planned layoffs of 5,000 employees, about five (5) percent of its workforce, over the next eighteen months. In July 2014, Microsoft announced layoffs of 18,000 employees. Most of these employees, reportedly about 12,500, were part of the Nokia mobile phone division, many in Finland. In 2015, Finland’s students were ranked sixth (6th) worldwide in math and science, compared to twenty-eighth (28th) for the United States. In 2001, Finland was tops in the PISA international tests. The engineers and other STEM workers laid off by Microsoft would have been educated in Finland’s schools in the early 2000s, when Finland was at or near the top.

UPDATE (added September 11, 2017): Remarkably, the same month that Microsoft announced these layoffs of 18,000 difficult to find STEM workers, the New York Times published an op-ed “Break the Immigration Impasse” by Sheldon G. Adelson, Warren E.
Buffett, and Bill Gates (New York Times, July 11, 2014, page A25) calling for “immigration reform,” meaning more “immigrants” on dicey guest-worker visas (the controversial H-1B visa is actually a non-immigrant visa) for the technology industry, and again implying a shortage:

We believe it borders on insanity to train intelligent and motivated people in our universities — often subsidizing their education — and then to deport them when they graduate. Many of these people, of course, want to return to their home country — and that’s fine. But for those who wish to stay and work in computer science or technology, fields badly in need of their services, let’s roll out the welcome mat.

A “talented graduate” reform was included in a bill that the Senate approved last year by a 68-to-32 vote. It would remove the worldwide cap on the number of visas that could be awarded to legal immigrants who had earned a graduate degree in science, technology, engineering or mathematics from an accredited institution of higher education in the United States, provided they had an offer of employment. The bill also included a sensible plan that would have allowed illegal residents to obtain citizenship, though only after they had earned the right to do so. (emphasis added)

One is reminded of the definition of chutzpah as “that quality enshrined in a man who, having killed his mother and father, throws himself on the mercy of the court because he is an orphan.”

END UPDATE

In July 2015, Microsoft announced layoffs of 7,800 employees, also mostly related to Nokia. In May 2016, Microsoft announced layoffs of about 2,000 employees, including about 1,300 from Nokia. In July 2016, Microsoft announced layoffs of about 2,850 employees. In July 2017 (a few weeks ago), Microsoft confirmed reports of planned layoffs without confirming reports that about 3,000 employees would lose their jobs, primarily in sales.
Thus, Microsoft appears to have laid off about 35,000 employees, with more cuts likely in the coming year, since Bill Gates’s testimony to the House Committee on Science and Technology. Microsoft reported to the SEC that it had about 114,000 full time employees in 2016.

Stack and Rank

Up until 2013, Microsoft overtly practiced a stack and rank employment system where employees were graded on a curve compared to co-workers and “low performers” apparently laid off or fired. This stack and rank system was the subject of a highly critical article in Vanity Fair by Kurt Eichenwald in July 2012, which probably contributed to the decision to shelve the system. It is unclear how many allegedly difficult to find and replace STEM workers were laid off, fired, or constructively discharged due to stack ranking. Microsoft has been sued over allegedly using stack ranking to discriminate against female employees.

Microsoft, like other industry leaders such as Google, Facebook, Apple, and Amazon, is noted for being extremely picky about who it even interviews for jobs and for a grueling, highly demanding interview process. Nonetheless, Microsoft appears to have had a policy of laying off a certain percentage of these highly qualified STEM workers every year despite repeatedly claiming to have great difficulty in finding these same STEM workers!

Conclusion

Microsoft is not alone in announcing sizable layoffs at the same time that it claims a STEM worker shortage. Many other large STEM worker employers do the same thing. In an exchange on Bloomberg TV in August 2014, interviewer Alix Steel confronted Joe Green, then chief of the industry-funded “immigration reform” PAC FWD.us, on the inconsistency between numerous layoff announcements and the shortage claims. His answer was especially unconvincing, and he soon resigned as chief of FWD.us, probably at the behest of his friend and colleague Facebook CEO Mark Zuckerberg. It is difficult to know what to make of this.
On a short term quarterly basis replacing a highly experienced and more expensive STEM worker with a less experienced, cheaper, more error prone STEM worker is likely to make the quarterly and sometimes annual earnings numbers look better. However, there is a reason more experienced STEM workers are on average more expensive than less experienced STEM workers. Some problems simply require more experience to solve; two less experienced STEM workers is not always equivalent to one more experienced STEM worker. I personally don’t doubt that these bizarre hiring and employment practices have seriously negative consequences in the longer term. Many of Kurt Eichenwald’s unnamed sources in his Vanity Fair article on Microsoft’s stack and rank employment system blamed the system for Microsoft’s faltering fortunes. Would Microsoft not have been better off reassigning its highly skilled workers in Finland to new projects? Nonetheless, despite the STEM shortage claims and despite what seems like common sense, many major STEM worker employers like Microsoft continue to lay off, fire, or constructively discharge large numbers of the qualified STEM workers they claim they want. (C) 2017 John F. McGowan, Ph.D. Credits The picture of Bill Gates at the World Economic Forum 2012 in Davos, Switzerland is from the World Economic Forum by way of Wikimedia Commons. It is licensed under the Creative Commons Attribution 2.0 Generic license. About the author John F. McGowan, Ph.D. solves problems using mathematics and mathematical software, including developing gesture recognition for touch devices, video compression and speech recognition technologies. He has extensive experience developing software in C, C++, MATLAB, Python, Visual Basic and many other programming languages. He has been a Visiting Scholar at HP Labs developing computer vision algorithms and software for mobile devices. 
He has worked as a contractor at NASA Ames Research Center involved in the research and development of image and video processing algorithms and technology. He has published articles on the origin and evolution of life, the exploration of Mars (anticipating the discovery of methane on Mars), and cheap access to space. He has a Ph.D. in physics from the University of Illinois at Urbana-Champaign and a B.S. in physics from the California Institute of Technology (Caltech).

## The Problems with STEM Shortage Claims

STEM (Science, Technology, Engineering and Mathematics) shortage claims are claims that there is a current or projected shortage of STEM workers in the United States and sometimes worldwide. These claims are promoted by large employers of STEM workers in private industry, academia, and the government. In the last few years the claims have tended to focus on a particular subset of STEM workers: programmers, software engineers, and other "technology" workers, where "technology" is implicitly equated with "computer technology."

A high-profile example of these claims can be found in venture capitalist Marc Andreessen's widely cited Wall Street Journal article "Why Software Is Eating the World" (August 20, 2011):

Secondly, many people in the U.S. and around the world lack the education and skills required to participate in the great new companies coming out of the software revolution. This is a tragedy since every company I work with is absolutely starved for talent. Qualified software engineers, managers, marketers and salespeople in Silicon Valley can rack up dozens of high-paying, high-upside job offers any time they want, while national unemployment and underemployment is sky high. This problem is even worse than it looks because many workers in existing industries will be stranded on the wrong side of software-based disruption and may never be able to work in their fields again.
There's no way through this problem other than education, and we have a long way to go. (Emphasis added)

Andreessen is far from an isolated instance of these claims. For example, in his testimony to the House Committee on Science and Technology in 2008, former Microsoft CEO Bill Gates claimed:

I know we all want the U.S. to continue to be the world's center for innovation. But our position is at risk. There are many reasons for this but two stand out. First, U.S. companies face a severe shortfall of scientists and engineers with expertise to develop the next generation of breakthroughs. Second, we don't invest enough as a nation in the basic research needed to drive long-term innovation. (Emphasis added)

Ironically, Microsoft, a highly profitable company, announced several thousand layoffs of its highly qualified and presumably difficult-to-replace employees a few months later. Both Bill Gates and Microsoft have been prominent in claiming shortages of qualified technology workers since 2009, even as Microsoft has announced a series of major layoffs of presumably highly qualified technology workers. Microsoft announced another round of about 3,000 layoffs a few weeks ago (July 2017).

Rising Above the Gathering Storm

In 2005, the Committee on Prospering in the Global Economy of the 21st Century produced a widely cited report, "Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future," under the auspices of the National Research Council (NRC) promoting similar claims. The committee members were:

NORMAN R. AUGUSTINE (Chair), Retired Chairman and CEO, Lockheed Martin Corporation, Bethesda, MD
CRAIG R. BARRETT, Chairman of the Board, Intel Corporation, Chandler, AZ
GAIL CASSELL, Vice President, Scientific Affairs, and Distinguished Lilly Research Scholar for Infectious Diseases, Eli Lilly and Company, Indianapolis, IN
STEVEN CHU, Director, E. O. Lawrence Berkeley National Laboratory, Berkeley, CA
ROBERT M. GATES, President, Texas A&M University, College Station, TX
NANCY S. GRASMICK, Maryland State Superintendent of Schools, Baltimore, MD
CHARLES O. HOLLIDAY, JR., Chairman of the Board and CEO, DuPont Company, Wilmington, DE
SHIRLEY ANN JACKSON, President, Rensselaer Polytechnic Institute, Troy, NY
ANITA K. JONES, Lawrence R. Quarles Professor of Engineering and Applied Science, University of Virginia, Charlottesville, VA
JOSHUA LEDERBERG, Sackler Foundation Scholar, Rockefeller University, New York, NY
RICHARD LEVIN, President, Yale University, New Haven, CT
C. D. (DAN) MOTE, JR., President, University of Maryland, College Park, MD
CHERRY MURRAY, Deputy Director for Science and Technology, Lawrence Livermore National Laboratory, Livermore, CA
PETER O'DONNELL, JR., President, O'Donnell Foundation, Dallas, TX
LEE R. RAYMOND, Chairman and CEO, Exxon Mobil Corporation, Irving, TX
ROBERT C. RICHARDSON, F. R. Newman Professor of Physics and Vice Provost for Research, Cornell University, Ithaca, NY
P. ROY VAGELOS, Retired Chairman and CEO, Merck, Whitehouse Station, NJ
CHARLES M. VEST, President Emeritus, Massachusetts Institute of Technology, Cambridge, MA
GEORGE M. WHITESIDES, Woodford L. & Ann A. Flowers University Professor, Harvard University, Cambridge, MA
RICHARD N. ZARE, Marguerite Blake Wilbur Professor in Natural Science, Stanford University, Stanford, CA

Nearly all of the committee members were current or former top executives, frequently the CEO, of major employers of STEM workers, public and private. The committee followed up with another report in 2010, "Rising Above the Gathering Storm, Revisited: Rapidly Approaching Category 5." Category 5 is a reference to the Saffir-Simpson hurricane wind scale, in which the highest classification, Category 5, is reserved for extreme storms with winds exceeding 156 miles per hour.
Rising Above the Gathering Storm, like most reports of this type (there are many), called for more STEM teachers, more STEM students, and more visas for STEM worker immigrants and guest workers, strongly implying a major shortage of STEM workers in the United States. Ironically, the report starts with a claim that appears grossly inconsistent with this, a quote from Nobel Laureate Julius Axelrod (Rising Above the Gathering Storm, Preface, Page ix):

Ninety-nine percent of the discoveries are made by one percent of the scientists. (Julius Axelrod, Nobel Laureate)

It is manifestly unclear why more scientists and more funding for science are needed if ninety-nine percent accomplish almost nothing. Why indeed not eliminate the nearly useless 99 percent and the 99 percent of funding that they consume? Federal R&D funding is over $100 billion per year. Why not free up over $99 billion to fund other more productive activities? 🙂

STEM shortage claims have a long history

STEM shortage claims predate the acronym STEM by many decades. They date back at least to the early days of the Cold War, when much of the focus was on physics and physicists. Then, as now, the STEM shortage claims often involved an alleged existential threat to the nation. Professor David Kaiser of MIT, a physicist turned historian of science, has written a number of articles and given a number of presentations on the Cold War physics and STEM claims, notably "Toil, Trouble, and the Cold War Bubble: Physics and the Academy since World War II" at the Perimeter Institute in 2008. In recent years, the STEM shortage claims have tended to focus on computer science and software engineering rather than physics, although claims of this type are common for almost all forms of STEM work.

STEM shortage claims have many highly qualified critics

The claims have been questioned and challenged by a large number of academics, journalists, and others for many years, including Michael S. Teitelbaum (Senior Research Associate at the Labor and Worklife Program at Harvard Law School), Norman Matloff (Professor of Computer Science at UC Davis), Peter Cappelli (George W. Taylor Professor of Management, Wharton Business School, University of Pennsylvania), Paula Stephan (Professor of Economics at Georgia State University), Ron Hira (Associate Professor, Howard University), Patrick Thibodeau (a Senior Editor at Computerworld), Robert N. Charette of the IEEE and author of "The STEM Crisis Is a Myth," and many others. I have written many critical articles on the claims, including "STEM Shortages, Purple Squirrels, and Leprechauns," "STEM Shortage Claims and Facebook's $19 Billion Acquisition of WhatsApp," and "The Corinthian Colleges Scandal, STEM Shortage Claims, and Minorities." The last includes a lengthy discussion of Microsoft's numerous layoffs in the comments section.

STEM shortage claims are closely connected to, although logically separate from, calls for increased immigration and guest worker visas such as the controversial H-1B visa. The claims are also closely connected to, though again logically separate from, claims that education in the United States is poor both in absolute terms and compared to other nations such as Finland, and to calls for "school reform" often promoted by extremely wealthy individuals such as former Microsoft CEO Bill Gates, Facebook CEO Mark Zuckerberg, and others.

STEM shortage claims are confusing

STEM shortage claims are surprisingly difficult to pin down.  The crux of the issue is what exactly constitutes a qualified STEM worker (software engineer, scientist,…)?

Many claims seem to imply a shortage of STEM workers with critical basic skills taught at the K-12 level: basic arithmetic, algebra, AP Calculus, and the basic programming skills taught in AP Computer Science and other introductory CS courses (or, for that matter, picked up by programming a game on your laptop in Python or Java, a popular activity among STEM students who never take AP Computer Science).
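To make the skill level concrete, here is a sketch of the kind of introductory exercise an AP Computer Science or first-semester CS student would be expected to handle. The function names and grading cutoffs are illustrative only, not drawn from any official curriculum:

```python
def letter_grade(score):
    """Map a numeric score (0-100) to a letter grade.

    A typical introductory exercise: one function, simple
    conditionals, no data structures beyond numbers.
    """
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score >= 90:
        return "A"
    elif score >= 80:
        return "B"
    elif score >= 70:
        return "C"
    elif score >= 60:
        return "D"
    else:
        return "F"


def class_average(scores):
    """Average a list of scores: basic list handling and arithmetic."""
    return sum(scores) / len(scores)
```

Anyone who can write and debug functions like these has the K-12 level programming skills that the broadest shortage claims invoke.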

Obviously, the tens of thousands of highly qualified engineers and other STEM workers laid off by Microsoft since 2008 have these K-12 skills in spades. Indeed, many of the highly qualified engineers laid off by Microsoft were from Finland, which has consistently scored at or near the top in the international comparisons of K-12 skills frequently cited in STEM shortage claims. So, apparently, this is not the STEM shortage referred to by Microsoft and Bill Gates.

Similarly, many older software engineers and other STEM workers (over thirty-five, sometimes even over thirty) report surprising difficulties finding jobs, with a fair number leaving the STEM fields every year. Again, there is little question that these candidates have the K-12 level STEM skills and much more.

When pressed about these obvious inconsistencies, spokesmen for STEM employers will generally begin to claim that they mean a shortage of very specific skills, such as years of paid experience developing first-person shooter apps for the iPhone (iOS) in Objective-C (C++ on Android won't cut it!), and often that they mean a shortage of the very best STEM workers, along the lines of the elite one percent in the Axelrod quote from the Rising Above the Gathering Storm report above. Often, years of specialized experience in narrowly defined skills and being the very best are implicitly conflated in these revised STEM shortage claims.

What do the STEM employers really want?

Yet do the employers actually want either the candidates with years of specialized experience or the very best, or both, as they claim? There are some high-profile rejections of candidates who would seem to meet these criteria, such as Facebook's infamous turndown of Brian Acton, who went on to found WhatsApp, which Facebook then acquired for $19 billion.

In recent years, many employers are noted for quizzing candidates about introductory data structures and algorithms taught in college CS courses rather than advanced specific skills learned on the job.  This has spawned a large number of interview practice books, courses and programs such as Gayle Laakmann McDowell’s Cracking the Coding Interview.
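For illustration, a classic question of this genre is to reverse a singly linked list. The sketch below is a generic example of the kind of exercise found in interview practice books, not a question attributed to any particular employer:

```python
class Node:
    """Minimal singly linked list node."""

    def __init__(self, value, next=None):
        self.value = value
        self.next = next


def reverse_list(head):
    """Reverse a singly linked list iteratively: O(n) time, O(1) extra space."""
    prev = None
    while head is not None:
        # Rewire the current node to point backward, then advance.
        head.next, prev, head = prev, head, head.next
    return prev


def to_pylist(head):
    """Helper: collect node values into a Python list for inspection."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out
```

Questions like this test fluency with pointers and loops from a first data structures course, not the years of specialized, on-the-job experience cited in job descriptions.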

It is difficult to see how these introductory questions would reliably identify the specialized skills learned on the job, such as iPhone app programming, that are often listed in job descriptions and cited in defenses of STEM hiring practices. Can these tests really identify the very best candidates either? More likely, they identify candidates who have spent many hours drilling on the questions in books like Cracking the Coding Interview.

STEM shortage claims are highly questionable. Certainly, there is no shortage of K-12 level STEM skills in the United States and probably worldwide. Indeed, the actual hiring practices of STEM employers suggest they are often not interested in the specialized skills they claim to seek when confronted about refusing to hire, laying off, or firing seemingly highly qualified engineers and other STEM workers.

Is the real problem a STEM worker shortage or excessively picky, irrational, discriminatory and ultimately costly hiring and employment practices?

(C) 2017 John F. McGowan, Ph.D.

Credits

The picture of Ken Thompson and Dennis Ritchie is from Wikimedia Commons and is in the public domain.


## A Personal Note: Break-in

Someone broke into my storage locker in Mountain View (best known as Google's home town) over the past weekend. I was notified but was not able to take a look until this morning. Fortunately, so far, nothing appears to have been taken or damaged. Of course, I don't keep anything valuable, important, or irreplaceable in my storage locker.

About a dozen lockers (several neighboring lockers on the same floor and several on another floor with the same locker numbers for that floor) were reportedly broken into at the same time.

Many boxes were thrown about and cut or torn open, but nothing appears (so far) to have been taken or damaged. Supposedly, the most common motive for locker break-ins is to obtain personal financial records related to bank and other financial accounts as a step toward gaining access to the money in those accounts. I don't keep any personal records like that in my locker, for that and similar reasons.

While it does not look like I was personally targeted, it still leaves an uneasy feeling!  One cannot be sure.  🙁

The picture below shows the boxes strewn about when I opened the locker this morning. They were neatly stacked and packed together in one corner before. Many of the smaller boxes and some of the larger ones were torn or cut open (mostly cut).