The 2,243rd Meeting of the Society

October 24, 2008 at 8:00 PM

Powell Auditorium at the Cosmos Club

The History of Probability

Mike Shlesinger

Research Division Director
Expeditionary Maneuver Warfare and Combating Terrorism Department
Office of Naval Research

About the Lecture

This lecture traces the history of probability theory from the throwing of bones, sticks, and dice to modern times. Early 18th-century books, Jacob Bernoulli’s “The Art of Conjecture” and Abraham de Moivre’s “The Doctrine of Chances,” were rich with new mathematics, insight, and gambling odds. Progress was often made by confronting paradoxes. The first of these confused probabilities with expectations and was explained in the Pascal-Fermat letters of 1654. The St. Petersburg Paradox involved a distribution with an infinite first moment, and Lévy discovered a whole class of probabilities with infinite moments that have found a surprising utility in physics. The Bertrand paradox involved measure theory for continuous probabilities, Poisson discovered that adding random variables need not always produce the Gaussian, and Daniel Bernoulli and d’Alembert argued over the probabilities for the safety of smallpox vaccinations. Using these and other anecdotes, this lecture discusses vignettes that have brought us to our modern understanding of probability theory.

About the Speaker

Beginning in late 2008, Michael Shlesinger will hold the Kinnear Chair in Science at the US Naval Academy. At the Office of Naval Research, he is the Research Division Director in the Expeditionary Maneuver Warfare and Combating Terrorism Department. He is also the Program Manager for the Counter-IED basic research program and ONR’s Chief Scientist for Nonlinear Science. He joined ONR in 1983 and became a member of the Senior Executive Service in 1987. He is a Fellow of the American Physical Society and has published about 200 scientific papers on topics in stochastic processes, glassy materials, proteins, neurons, and nonlinear dynamics. He is a Divisional Associate Editor of Physical Review Letters. He received ONR’s 2006 Saalfeld Award for Outstanding Lifetime Achievement in Science, the federal government’s Presidential Rank Award for Meritorious Senior Professionals in 2004, the University of Maryland’s Distinguished Postdoc Alum award in 2004, and the Navy Superior Civilian Service Award in 1991, and was the Regents’ Lecturer at UCSD in 1994 and the Michelson Lecturer at the USNA in 1992. He received his Ph.D. in Physics from the University of Rochester in 1975 and holds a B.S. in Mathematics and Physics from SUNY Stony Brook (1970).

Minutes

President Kenneth Haapala called the 2,243rd meeting to order at 8:15 pm on October 24, 2008 in the Powell Auditorium of the Cosmos Club. The minutes of the 2,242nd meeting were read and approved with an improvement contributed by a member of the audience. Mr. Haapala introduced the speaker of the evening, Mr. Michael Shlesinger of the Office of Naval Research, who spoke on “The History of Probability.”

Probability, Mr. Shlesinger said, took a long time to find a home in mathematics. This may have been because if you flip a coin, heads and tails are equally probable. That does not seem like mathematics. How do you make it mathematical? By flipping the coin many times. With increasing repetitions, patterns of results become predictable.

Probability started, it appears, with people playing with bones, the heel bones of sheep and dogs. People tossed them and made decisions based on how they fell. These bones can land on any of four surfaces, and the four outcomes are not equally probable.

The Greeks called it knucklebones. They threw the bones from their knuckles, four at a time, to prevent cheating.

The Chinese came up with a system of equally likely outcomes. It involved picking short or long sticks: one stick was selected from each of three pairs, yielding eight possible outcomes. This was done twice, yielding 64 possible outcomes. This did not lead to probability quantification, though, because it takes too long to generate distributions over 64 possible outcomes.

One of the first applications of mathematics to chance came from gambling. The game involved throwing three dice at a time. Galileo was asked why the house seemed to win betting that, on such throws, a 10 will happen before a 9. The two would seem equally likely, there being six combinations of numbers from one to six that add to ten and six that add to nine. Galileo showed that, of the 216 equally likely ordered results of throwing three dice, 27 add to ten while only 25 add to nine. So even then, Galileo, at least, could calculate probabilities by counting finite possibilities.
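A brute-force count makes Galileo’s tally concrete. The short Python sketch below is an illustration added to these minutes, not part of the lecture; it enumerates the 216 ordered outcomes of three dice and counts the sums.

    from itertools import product

    # Enumerate all 6 x 6 x 6 = 216 ordered outcomes of three dice
    # and count how many sum to 9 and how many sum to 10.
    counts = {9: 0, 10: 0}
    for dice in product(range(1, 7), repeat=3):
        total = sum(dice)
        if total in counts:
            counts[total] += 1

    print(counts)                               # {9: 25, 10: 27}
    print(counts[9] / 216, counts[10] / 216)    # about 0.116 versus 0.125

The difference comes from counting orders: the partition {3, 3, 3} of nine can occur in only one order, while every partition of ten can occur in at least three.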

Mr. Shlesinger mentioned risk-benefit analysis. This takes into account one’s personal gains and losses. If you are in Las Vegas with just enough money to get back to Washington, you don’t bet even if the odds are in your favor, because you can’t afford to lose.

Seven percent of the English were dying of smallpox in the 1700s. The wife of the British ambassador to the Ottoman Empire observed the Turks deliberately inoculating themselves with a mild form of smallpox (variolation). This prevented the full disease, but about 0.5% of those inoculated died of it. People were reluctant to take the inoculation and deliberately accept the 0.5% risk.

The Dutch physician Ingenhousz inoculated the British Royal family. He visited America and urged George Washington to inoculate his troops, who had not had smallpox. Washington had it done. Benedict Arnold’s Northern Army was not inoculated and was devastated by smallpox.

Mr. Shlesinger told us of the Talmudic Paradox. It involves three drawers. One has two blue coins, one has two green, and one has one of each. You pick a drawer and a coin, and the coin is blue. How probable is it that the other coin in that drawer is blue? One answer is that you know you are looking in a drawer with a blue coin; you could have either of the two drawers that contain blue; if you have the one with one blue coin, the other coin is green, and if you have the one with two blue coins, the other coin is blue; so the probability is 1/2. Another answer is that there are six coins and three of them are blue; for two of the blue coins the other coin in the drawer is blue, and for one of them the other coin is green, so the probability of the other coin being blue is 2/3. Both seem plausible. Two-thirds is the right answer. It becomes more obvious if you consider an example where each drawer holds many coins, say nine out of ten of one color: if you then draw a blue, another blue is far more likely.
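A quick Monte Carlo check supports the two-thirds answer. The Python sketch below is an illustration added here, not something shown at the meeting.

    import random

    # The three drawers: two blue coins, two green coins, one of each.
    drawers = [("blue", "blue"), ("green", "green"), ("blue", "green")]

    saw_blue = other_blue = 0
    for _ in range(1_000_000):
        drawer = random.choice(drawers)           # pick a drawer at random
        first, second = random.sample(drawer, 2)  # pick a coin; the other stays behind
        if first == "blue":
            saw_blue += 1
            if second == "blue":
                other_blue += 1

    print(other_blue / saw_blue)   # comes out near 0.667, i.e. 2/3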

The beginnings of probability theory came with a gambling question posed by the Chevalier de Méré to Pascal. If you throw a die once, it seems obvious the probability of a six is 1/6, and it is. If you throw it four times, the probability of a six would seem to be 4/6. Likewise, if you threw two dice 24 times, you might think the probability of a pair of sixes would be 24/36. But probabilities clearly do not combine by addition: if they did, throwing one die seven times would give a probability of 7/6. Jacob Bernoulli wrote a book on probability; his nephew, Nicholas Bernoulli, finished it after Jacob’s death. Everything you ever studied about probability is in this book. It introduced combinations and permutations and explained how to calculate them. It gave us the Bernoulli method for finding the probability of m successes in n trials.
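The correct way to combine the chances is to multiply the probabilities of failure rather than add the probabilities of success. The snippet below is an added illustration, not part of the talk; the function name binomial is just a label of convenience for Bernoulli’s formula.

    from math import comb

    # De Mere's two bets: at least one six in 4 throws of one die,
    # and at least one double-six in 24 throws of two dice.
    p_one_six = 1 - (5/6) ** 4          # about 0.518, slightly favorable
    p_double_six = 1 - (35/36) ** 24    # about 0.491, slightly unfavorable
    print(p_one_six, p_double_six)

    # Bernoulli's formula for the probability of m successes in n
    # independent trials, each with success probability p.
    def binomial(m, n, p):
        return comb(n, m) * p**m * (1 - p)**(n - m)

    print(binomial(2, 24, 1/36))        # chance of exactly two double-sixes in 24 throws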

Abraham De Moivre is famous for predicting his own death. At 80, he noticed he was sleeping longer. He timed his increasing sleep periods and predicted his death at 87, when he expected to be sleeping 24 hours a day. He did die at the predicted time, apparently in his sleep.

De Moivre became the pre-eminent mathematician in England. Newton recognized his genius. Newton sent people he did not like to de Moivre. When de Moivre tired of roaming England to meet unappreciative, wealthy students, he worked out of Slaughter’s Coffee House. Such meetings were actually the beginning of the Royal Society. He originated the first limit theorem in probabilities. De Moivre’s theorem had to do with calculating combinations. He discovered the Gaussian distribution 50 years before Gauss.
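In modern terms, de Moivre’s limit theorem is the normal approximation to the binomial distribution. The short comparison below, for 100 fair-coin flips, is an illustration added to these minutes rather than anything presented at the meeting.

    from math import comb, exp, pi, sqrt

    n, p = 100, 0.5                     # 100 fair-coin flips
    mu, sigma = n * p, sqrt(n * p * (1 - p))

    for k in (40, 50, 60):
        exact = comb(n, k) * p**k * (1 - p)**(n - k)                 # binomial probability
        approx = exp(-(k - mu)**2 / (2 * sigma**2)) / (sigma * sqrt(2 * pi))
        print(k, round(exact, 5), round(approx, 5))                  # the two agree closely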

An Irishman, Robert Adrain, also discovered the Gaussian distribution as a way to describe errors in surveying. Gauss published it a year later. Of course you all know that the Gaussian distribution is the familiar family of normal distributions defined by means and standard deviations, the most useful distribution family in statistics.

The members and guests who attended the 2,243rd meeting are indeed fortunate. They know, as I am sure few of their friends and neighbors do, that the first use of the Poisson distribution was to fit the frequency of German cavalrymen being kicked to death by horses.

A lecture on statistics would, of course, be incomplete without mentioning beer. A scholar known by the pseudonym “Student” made critical contributions to the development of the t test, still called Student’s t test today. The t distribution was developed to fit the number of yeast cells in drops of beer. His name was William Gosset. During his lifetime, his employer, the Guinness company, refused to let its beer-making secrets be revealed, so Gosset did not reveal his innovation even to Guinness, which learned of it only after he died. The moral: ask permission only after you’re dead.

Mr. Shlesinger described Bertrand’s Paradox: what is the probability that a randomly drawn chord of a circle will be longer than the side of an inscribed equilateral triangle? It looks like it should be 1/3, but it is not; some conditions “snuck in.” The answer depends on what is taken to be equally distributed, the angles or the chord lengths. A similar issue arises for someone looking at the sky from a life raft: the angles count, not the strips of apparent sky.
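The trouble is that “randomly drawn chord” can mean different things. The Python sketch below, an added illustration and not from the lecture, simulates two common choices, uniform endpoint angles on the circle versus a uniformly chosen midpoint inside it, and gets different answers.

    import random
    from math import cos, sin, pi, sqrt, hypot

    R = 1.0
    side = sqrt(3) * R          # side of the inscribed equilateral triangle

    def chord_by_endpoints():
        # Two endpoint angles chosen uniformly on the circle.
        a, b = random.uniform(0, 2 * pi), random.uniform(0, 2 * pi)
        return hypot(cos(a) - cos(b), sin(a) - sin(b))

    def chord_by_midpoint():
        # The chord's midpoint chosen uniformly inside the circle.
        while True:
            x, y = random.uniform(-R, R), random.uniform(-R, R)
            d2 = x * x + y * y
            if d2 <= R * R:
                return 2 * sqrt(R * R - d2)

    trials = 200_000
    for method in (chord_by_endpoints, chord_by_midpoint):
        longer = sum(method() > side for _ in range(trials))
        print(method.__name__, longer / trials)   # near 1/3 and near 1/4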

Mr. Shlesinger answered a question about the inevitability of the current state with a comment about geography being a strong determiner of everything.

Another asked about the wisdom of slot machines, then under political consideration in Maryland, and the proposition about not betting unless you can afford to lose. In response, he discussed the insidiousness of slot machines and how they deliberately let people win in increasing amounts at decreasing frequencies. He does not favor slot machines. He mentioned that he went to Vegas and lost all his money, though, so he may be biased.

He mentioned the need for behavioral research on how behavior depends on results, and observed that, in Vegas, they adjust the results by giving free drinks.

Someone asked about the Hood and the Bismarck. The Bismarck sank the Hood with one shot. Churchill, wisely, it seems, sent many planes after the Bismarck. Isn’t it foolish to have too big a ship? Answer: Yes, the odds go with the many.

To another question, he said he doesn’t like calculus because it seems like a matching game.

Finally, he was asked why he went to Vegas. He said the purpose was not to gamble; it was a meeting of the American Physical Society. Contrary to the recent commercial, Vegas does not want him to stay in Vegas, or even to return. The word in Vegas is that the average physicist goes to Vegas with one shirt and one $20 bill, and doesn’t change either one.

Mr. Haapala presented a plaque to Mr. Shlesinger commemorating the occasion. He introduced 13 new members. He made a ringing parking announcement. He introduced Bob Hershey, membership chairperson, who encouraged nonmembers to join. He announced the beginning of the process of electing new officers. He announced his own upcoming address on economic and financial literacy, November 6, near the Club. Interested parties may see Ken for details. He announced the following meeting.

Finally, at 9:23 pm, he adjourned the 2,243rd meeting to the social hour.

Attendance: 78
The weather: mostly clear
The temperature: 10°C
Respectfully submitted,

Ronald O. Hietala,
Recording secretary