Celebratio Mathematica

Joseph L. Doob

Joseph Leo Doob: February 27, 1910 – June 7, 2004

by Daniel W. Stroock

Joseph Leo Doob was born in Cincinnati, Ohio, and raised in New York City. For much of his career, he was the leading American-born probabilist, a status reflected by his election as president of the Institute of Mathematical Statistics in 1950 and as president of the American Mathematical Society in 1963. In addition, he was a member of the American Academy of Arts and Sciences, the National Academy of Sciences, and the French Academy of Sciences, and a fellow of the Royal Statistical Society. He was a recipient of the Steele Prize from the American Mathematical Society and the National Medal of Science from President Jimmy Carter.

Despite his many honors, Doob was a sincerely modest man who shunned adulation and took as much pride in his 25-year appointment as Commissar of the Urbana-Champaign Saturday Hike as he did in his presidencies of learned societies. His modesty did not spring from lack of confidence. He entered Harvard University as a freshman at age 16 and left with a Ph.D. six years later. Although the Harvard mathematics faculty had several luminaries, Doob was never cowed by them. He sat in on a course that G. D. Birkhoff gave on his mathematical theory of aesthetics and dropped out when he decided that Birkhoff’s formulation was not sufficiently rigorous. A large part of what Doob learned in his two years as a graduate student he absorbed while typing the manuscript of Marshall Stone’s famous treatise on linear operators.

Initially, Doob thought he would write his thesis under Stone but ended up writing it under J. L. Walsh when Stone said that he did not have an appropriate new problem to suggest. Doob’s earliest work was thus focused on refined questions, of the Fatou type, about the boundary behavior of analytic functions.

A freshly minted mathematics Ph.D. faced a tough job market in 1932, but, with the backing of Birkhoff and Stone, Doob won a two-year National Research Council Fellowship that enabled him to move to New York, where his wife was in medical school.

He spent two years at Columbia University, supposedly working with J. F. Ritt but in fact working more or less on his own. At the end of his NRC fellowship, the job prospects for young mathematicians had not improved and showed no signs of doing so any time soon. Following the advice of B. O. Koopman, who told him that his chances of getting a job would be better if he applied for one in statistics, Doob accepted a Carnegie Corporation grant that Harold Hotelling was able to secure for him in Columbia’s statistics department. Thus it was cold financial necessity that deflected the trajectory of Doob’s mathematical career. The following year, he applied for and obtained one of the three or four jobs available then in mathematics, a position to teach statistics at the University of Illinois.

Until A. N. Kolmogorov provided it with a mathematical foundation in 1933, probability theory did not exist as a mathematical subject. Dating back to the 17th century, many mathematicians, including B. Pascal, P. de Fermat, C. Huygens, D. Bernoulli, and P.-S. Laplace, had made interesting calculations, but exactly what they were calculating was not, from a mathematical standpoint, well defined. Similarly, although C. F. Gauss himself had studied what he called the “theory of errors” and F. Galton had invoked Gauss’s ideas to lend scientific credence to his ideas about eugenics, the relationship between statistics and mathematics was even less clear. One can only imagine the reaction of someone like Doob, who found Birkhoff’s theory of aesthetics wanting in mathematical rigor, to these fields.

Faced with the challenge of not only learning statistics but also transforming it into a field that met his high standards, Doob set to work. Kolmogorov’s model of probability theory is based on Lebesgue’s theory of integration, and Doob was well versed in the intricacies of Lebesgue’s theory. One of Doob’s first breakthroughs was a theorem about stochastic processes. As long as one is dealing with a countable number of random variables, most questions that one can ask about them are answerable, at least in theory. However, the same is not true of an uncountable number of random variables. Kolmogorov had devised a ubiquitous procedure for constructing uncountable families of random variables, but his construction had a serious drawback. Namely, the only questions about the random variables constructed by Kolmogorov that were measurable were questions that could be formulated in terms of a countable subset of the random variables. Thus, for example, if one used Kolmogorov’s procedure to construct Brownian motion, then one ended up with a family of random variables that were discontinuous with inner measure 1 but continuous with outer measure 1. What Doob showed was that there is a canonical way to modify Kolmogorov’s construction so that questions like those of continuity became measurable; and, because this modification reduced questions about uncountably many random variables to questions about countably many ones, he called it the “separable” version. Although Kolmogorov’s construction is no longer the method of choice, and therefore Doob’s result is seldom used today, at the time even Kolmogorov acknowledged it as a substantive contribution.
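In the notation that later became standard (my paraphrase; the memoir states the idea only in words), separability can be formulated roughly as follows:

```latex
% A real-valued process (X_t)_{t \ge 0} is separable if there exist a
% countable dense set S \subset [0,\infty) and a null event N such that,
% for every \omega \notin N, every t, and every open interval I \ni t,
X_t(\omega) \in \overline{\{\, X_s(\omega) : s \in S \cap I \,\}} .
% Events such as the continuity of t \mapsto X_t(\omega) then depend only
% on the countably many variables (X_s)_{s \in S}, and so are measurable.
```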

Doob’s renown did not rest on his separability theorem. Instead, he was best known for his systematic development and application of what he called “martingales”1 — a class of stochastic processes of either real- or complex-valued random variables parameterized by a linearly ordered set, usually either the non-negative integers or real numbers, with the property that increments are orthogonal to any measurable function, not just linear ones, of the earlier random variables. Thus, partial sums of mutually independent mean-zero random variables are martingales, but partial sums of trigonometric series are not.
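In modern notation (a standard formulation, not spelled out in the memoir), the discrete-time martingale property reads:

```latex
% (X_n) is a martingale with respect to a filtration (\mathcal{F}_n) if
% each X_n is integrable and \mathcal{F}_n-measurable, and
\mathbb{E}[\, X_{n+1} \mid \mathcal{F}_n \,] = X_n \quad \text{for all } n .
% Equivalently, increments are orthogonal to every bounded measurable
% function of the past, not merely to linear ones:
\mathbb{E}\bigl[ (X_{n+1} - X_n)\, f(X_0, \dots, X_n) \bigr] = 0 .
```

For partial sums of independent mean-zero random variables this holds because each increment is independent of everything that came before.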

Martingales arise in a remarkable number of contexts, both outside and inside of probability theory. For example, J. Marcinkiewicz’s generalization of Lebesgue’s differentiation theorem can be seen as an early instance of what came to be known as Doob’s martingale convergence theorem, even though Marcinkiewicz’s argument was devoid of probability reasoning. In a probabilistic setting, martingales had also appeared in the work of S. Bernstein, Kolmogorov, and P. Lévy. But it was Doob who first saw just how ubiquitous a concept they are, and it was the ideas he introduced that propelled martingales into the prominence they have enjoyed ever since. Of key importance were his convergence and stopping time theorems.

Strictly speaking, the convergence theorem was not an entirely new result, as both B. Jessen and Lévy had proved theorems from which it followed rather easily. However, Doob’s proof was far more revealing and introduced ideas that were as valuable as the result itself. In particular, he introduced the notion of a “stopping time”, a random time with the property that one can tell whether it has occurred by any fixed time \( t \) if one observes the martingale up to time \( t \). Thus, the first time that a martingale exceeds a level is a stopping time, but the last time that it does is not a stopping time. Doob’s stopping time theorem shows that the martingale property is not lost if one stops the martingale at a stopping time. His proof of his convergence theorem makes ingenious use of that fact.
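As a toy illustration of the stopping time theorem (my example, not the memoir’s): for a symmetric ±1 random walk, stopping at the first hitting time of a level preserves the mean. The sketch below checks this in Python by exact enumeration of every path, with no sampling.

```python
from itertools import product

def stopped_value(steps, barrier=2):
    """Value of the walk stopped at the first time |S_k| reaches `barrier`
    (or at the final step, if the barrier is never reached)."""
    s = 0
    for x in steps:
        s += x
        if abs(s) == barrier:
            return s  # the stopping time has occurred
    return s

# Enumerate every +-1 path of length 10 exactly.
n = 10
paths = list(product((-1, 1), repeat=n))
mean = sum(stopped_value(p) for p in paths) / len(paths)
print(mean)  # 0.0: E[S_{T ∧ n}] = E[S_0], as the stopping time theorem predicts
```

The first hitting time of the barrier qualifies as a stopping time because watching the walk up to time \( k \) tells you whether it has occurred; the last visit to the barrier does not, matching the distinction drawn above.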

After proving these theorems, Doob turned his attention to applications. What he discovered was that martingales provided a bridge between partial differential equations and stochastic processes. The archetypal example of such a bridge is the observation that a harmonic function evaluated along a Brownian path is a continuous martingale. Once one knows this fact, S. Kakutani’s seminal result — the one that relates the capacitory potential of a set to the first time that a Brownian motion hits it — becomes an easy application of Doob’s stopping time theorem. In addition, the theorem lends itself to vast generalizations that lead eventually to the conclusion that, in a sense, there is an isomorphism between potential theory and the theory of Markov processes. One can easily imagine Doob’s joy when he used martingales to prove Fatou’s theorem about the way in which analytic functions in the disk approach their boundary values. Having been forced by the job market to abandon classical analysis for statistics, he must have found this revenge sweet.
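The bridge described above can be stated compactly (again in later notation; the symbols are my gloss, not the memoir’s):

```latex
% If u is harmonic on a domain D and (B_t) is Brownian motion started at
% x \in D, then, up to the exit time of D, the process
M_t = u(B_t)
% is a continuous martingale. Applying Doob's stopping time theorem at
% the first hitting time \tau_K of a compact set K then yields Kakutani's
% identity between the capacitory potential p_K and a hitting probability:
p_K(x) = \mathbb{P}_x(\tau_K < \infty) .
```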

Two books that appeared in the early 1950s were responsible for making probability a standard part of the American mathematics curriculum. The first was William Feller’s Probability Theory and Its Applications,2 a superb treatment that studiously avoids Lebesgue integration. Feller’s book set the standard for undergraduate probability texts, but it was Doob’s Stochastic Processes3 that made probability theory respectable in the mathematical research community. The style of Doob’s book was very different from that of Feller’s. Examples and applications were everywhere dense in Feller’s book, whereas Doob’s was a brilliant but daunting compilation of technical facts, unembellished by examples. Doob’s goal was to show that probability theory could stand with any other branch of mathematical analysis, and he succeeded.

The fact that most young probabilists have never read Doob’s book can be seen as a testament to its success. In the years since its publication, it has spawned a myriad of books and articles that give friendlier accounts of the same material. Nonetheless, if it were not for Doob’s treatment of K. Itô’s stochastic integration theory, R. Merton’s interpretation of the Black–Scholes model would not have been possible. Of course, whether that is a reason for celebrating or condemning Doob’s book is a question whose answer is outside the scope of mathematics.

Typical of Doob’s dry humor and laconic style was a statement contained in his Who’s Who entry to the effect that he had made probability more rigorous but less fun. Although he was right on both counts, he had no reason to regret what he had done. Mathematicians owe him a huge and enduring debt of gratitude.

[Editor’s note: The original memoir was published with a select bibliography of Joseph Doob’s works. A complete bibliography can be found on the Works page of this volume.]