
Celebratio Mathematica


Interview with Ruth J. Williams

by Allyn Jackson

Ruth Williams.
Photo: Erik Jepsen, UC San Diego Publications.

Ruth J. Williams is one of the world’s leading probabilists. Her work combines a penchant for applied problems with a love of rigorous theoretical development. Much of her research has trained her formidable technical prowess on mathematical questions arising in the dynamics of stochastic models of complex networks. Her work helped to build and expand the theoretical foundations for reflecting diffusion processes, particularly reflecting Brownian motions. She has developed fluid and diffusion approximations for the analysis and control of general stochastic networks and has brought systematic approaches to proving heavy traffic limit theorems. Her most current work, in collaboration with systems biologists, centers on stochastic models for epigenetic cell memory.

Born in Australia, Williams completed a Bachelor of Science (Honors) degree in mathematics in 1976 at the University of Melbourne. She also earned a Master of Science degree there in 1978 before moving to the United States to attend Stanford University, where she earned her PhD in 1983 under the direction of Kai Lai Chung.

After a postdoctoral position at the Courant Institute of Mathematical Sciences at New York University, she joined the faculty of the University of California at San Diego (UCSD), where she is now Distinguished Professor of the Graduate Division, Distinguished Professor of Mathematics Emerita, and former holder of the Charles Lee Powell Chair in Mathematics I.

Williams received a National Science Foundation Presidential Young Investigator Award in 1987 and an Alfred P. Sloan Fellowship the following year. The Institute for Operations Research and the Management Sciences (INFORMS) awarded her both the 2016 John von Neumann Theory Prize (jointly with her collaborator Martin Reiman) and the 2017 Award for the Advancement of Women in Operations Research and the Management Sciences. Williams was elected to the American Academy of Arts and Sciences in 2009, to the US National Academy of Sciences in 2012, and as a Corresponding Member of the Australian Academy of Science in 2018.

What follows is an edited version of an interview with Ruth Williams that took place over two sessions in June 2024.

“Your computer for 20 cents”

To start at the beginning — can you tell me about your childhood growing up in Australia?

I was born in Melbourne, which is the capital city of Victoria, one of the southern states in Australia. When I was in grade three, we moved to an inland city called Bendigo, about 100 miles north of Melbourne. That’s where I spent most of my childhood. We moved because of my father’s job. He originally worked for the federal government, the Australian Public Service, in Melbourne. When we moved to Bendigo, he became head of business studies at the local technical college, the Bendigo Institute of Technology.

Ruth Williams as a young girl in Australia, wearing her school uniform.
Photo courtesy of Ruth Williams.

What was his background?

He grew up in the Depression years and originally worked for the post office in accounting, I think. He was also studying commerce part-time at the University of Melbourne. Then World War II came along, and he was an officer in the navy. After the war, he worked as an audit inspector for the navy and the federal government public service. He also went to night school to complete his Bachelor of Commerce degree at the University of Melbourne.

He studied accounting, so he must have been good at math.

I believe so. He was very logical and mathematically inclined. He was also visionary about computers in education. When he moved to the technical college in Bendigo, it was the early 1960s, and there were very few computers in education in Australia. He was very effective at getting mainframe computers for the college. That was a big deal. He also served on an Australia-wide committee, called a Commonwealth Committee, advising about computers in education. He made a couple of overseas trips for that committee, in 1968 and 1972, to look at how computers were used in education around the world. I remember those trips. He was away for — it must have been three or four months at a time. Travel in those days took longer and was very expensive. Once one was overseas, one made the most of it.

Where did he travel to?

He went largely to the US, Canada, and the UK. He wrote extensive reports; I still have one of them. He helped bring computers into education in Australia, especially in Bendigo, and also made strong connections to industry as well as to the local high schools. He started a computer club for high school students with the slogan, “Your Computer for 20 Cents.” It was 20 cents a year to join the computer club, and you could submit as many computer jobs as you wanted to the mainframe computer — but the maximum number of runs you could get per day was one! That was a really great thing. I got to see early on how computers could be a very useful tool.

Ruth Williams (center) with two other members of the Bendigo Computer Club, Margaret Lourens (left) and Jenny Sutherland (right), around 1970.
Photo courtesy of Ruth Williams.

There is a picture of three girls, including you, who were members of the Bendigo Computer Club. That was the club your father put together?

Yes. There were many more members of the club; only three are in the photo. Initially there were programming classes on Saturday morning where anybody from the Bendigo community could go. Quite a few teachers from local high schools went and learned a little bit of programming, and high school students too. Eventually I taught those classes as a member of the computer club. It was really a great thing.

What language did you program in?

Initially I wrote in FORTRAN, but I also wrote a few programs in ALGOL and COBOL. I wrote about 100 programs while I was in high school. They were batch jobs, so every time you made a mistake you had to wait a day to send in a corrected program. Access was via keypunch cards, which made it a long process. There were no terminals at that time, other than the one that was attached to the computer itself.

What were the programs you wrote? What problems were you working on?

I remember trying to write a program to classify plants, one to make a decision tree, one for airline reservations. There were other, simpler things like computing using some mathematical formulas. This was before there were hand-held calculators and the like.

So you were doing this in high school.

In my spare time!

And prior to that, did you always like mathematics? Did you have interests in other areas?

I liked mathematics and science equally well. I liked solving problems — understanding why things are true, working things out, as opposed to just memorizing them.

In mathematics you can understand things to the very bottom.

Yes. I really like to dig down and understand at a fundamental level. That’s one attraction of mathematics for me.

Did you have good math teachers?

Yes, I had a teacher in fifth and sixth grade — it was the same teacher for both — Mr. Davies, who really encouraged me in mathematics. I liked to always get things exactly right, and he encouraged that. Then in high school I had a number of math and science teachers who encouraged me, in chemistry, physics, and math. Some went out of their way to help; I had a chemistry teacher, Mr. Lake, who had me read ahead to the next year’s material.

Did you encounter the attitude that girls can’t do math and science?

I don’t think so. I had a strong support structure. I had parents and teachers who were very supportive of me.

Is there anything different about Australia that would make it more acceptable for girls to be interested in these subjects?

I’m not sure. I was good at mathematics, and people who were good at something were encouraged. I was always interested in having some kind of scientific career, which includes mathematics. I always felt like that was possible.

You didn’t feel there were obstacles, or that this was something strange for a girl to do. That’s great.

Well, there weren’t that many women I could name who had scientific careers! But I just felt that it was a natural thing that I was good at and enjoyed.

And your mother, did she have a profession also?

Yes, she was trained as a nurse. When she had children she stopped working, but as we got older, she went back to nursing. She became a nurse at a home for the aged blind in Bendigo, and eventually became director of nursing there. My mother was great. She was a very caring person and also was always very encouraging. I had great parents; I was very lucky.

And siblings?

I have a brother and a sister, both younger.

What did they end up doing?

My brother is an electrical engineer in Sydney. My sister lives in Bendigo, and she was a nurse — an operating theater nurse actually, which is an intense profession.

You have to have strong nerves for that! Did your brother get a PhD?

No, he earned a Bachelor of Science (Honors) degree, specializing in electrical engineering, from the University of Melbourne. My sister trained as a nurse at the Royal Melbourne Hospital School of Nursing. Nursing was not a university degree at that time. Later she earned her Bachelor of Health Science in nursing from La Trobe University and took various postgraduate specialized training courses, for example, related to surgery.

Drilling down to understand things

It sounds like your parents had a huge positive influence on all three of you. After high school, you went to the University of Melbourne for your undergraduate studies. Was that the closest university to Bendigo?

The way it worked was that when you applied to go to university in Australia, you usually applied to the universities in your own state. It was difficult to go to a university in another state, other than maybe to the Australian National University, which is in Canberra, the capital of Australia. It would accept students from any state.

Melbourne was the best university in the state of Victoria. There were competitive statewide exams at the end of high school, exams in different subjects. That determined what universities you got admitted to. You could give a preference, but the exams gave you a ranking. Melbourne was the hardest one to get into.

Did you know what you wanted to study?

I knew I wanted to study science, but I wasn’t sure whether I wanted to do chemistry, physics, or mathematics. In the first year I studied all three — actually I studied two kinds of mathematics, pure and applied. At the end of the first year I needed to decide, and I decided to do mathematics. Again, it goes back to being able to drill down and understand why things were true. I felt it was a bit harder to do that with physics and chemistry. In chemistry, the model of the atom seemed to change every year, which was a little disturbing! Physics was closer to mathematics. I was good at all three, but I thought mathematics enabled me to understand things at a fundamental level and that you could do a lot of things with it.

I also informally minored in what would now be called computer science and was then called information science. So I took computing courses as well. That was easy to do.

You were already a hacker, right? So to speak!

I don’t think I was a hacker! In the first year, there was a little component of computer programming in an applied math course, and I found that pretty easy because I could already program, thanks to my experience with the Bendigo Computer Club. So then I was helping other students. But it got more interesting in later years. There were things that I hadn’t studied before, such as algorithms and data structures, and also different languages. I learned SNOBOL and Pascal at some point.

It’s often said today that girls go into computer courses lacking the kind of programming experience that boys are likelier to have. But at that time you were unusual even to have any programming experience at all.

That’s true. That was one of the lucky things about being in Bendigo, where there was the computer club that gave you access to a computer. At the time it was very unusual, especially in a regional city like Bendigo. Maybe it was a bit more common in Melbourne. There were a few students I ran into at university who had a bit of programming experience, but it was very unusual at that time to have that kind of access.

Those two girls in the photo from the Bendigo Computer Club — those were friends of yours?

Yes. We were all members of the computer club. I knew them pretty well.

Did they go on to do something in math or science?

I don’t know. They didn’t go to the University of Melbourne, and I lost track of them. There were some other people at the high school I went to who did go to the University of Melbourne. Since I was from the “country”, I couldn’t live at home to go to university, so I lived in a residential college, St. Hilda’s College. There were quite a few girls from my high school who also lived in St. Hilda’s College, so that was nice. And all of them were studying science. One of them, Sue Winzar, had been a member of the Bendigo Computer Club. Sue had a very successful career in IT, eventually heading up IT for Esso in Australia.

Did you have other interests as a young person, for example, music? Did you play a musical instrument?

I learned the recorder in primary school! I’m not really good at musical instruments. I like outdoor things, that’s my release.

Does that come from living in Australia?

I think so. I think that’s a part of the Australian spirit, to like to be outdoors. I played some tennis and did some running for a while. Also, my family would often spend weekends and holidays working at a wheat and sheep farm that we owned. This was another aspect of my love of the outdoors.

A wheat and sheep farm! Can you say more about that?

It was originally the farm that my mother had grown up on. Her sister and brother-in-law had owned it for a while, but then they moved into town, and we bought it from them when I was around five years old. It’s one of the reasons we moved from Melbourne to Bendigo, because it was closer to the farm. We would go there on the weekends. Farms always need something to be done! Wheat and sheep are good because they tend to look after themselves fairly well, but there were always things to be done, like mending fences and making sure water didn’t flow in the wrong places. Shearing time was a busy time, and we would help with rounding up the sheep and gathering wool from the shearing. I remember jumping on the wool bales to press them down. Feeding the shearers was a nonstop task because they consume a lot of energy.

Programming computers during the week and then going on the weekend into the country — that sounds like a great balance.

Yes, I think it was a great experience for me as a child. I have a lot of good memories from that.

Did you ever participate in math competitions when you were growing up in Australia?

No. I didn’t even know they existed — and I am not sure they did exist in Australia when I was a kid. But I did have one really nice experience when I was in the last year of high school. There was a conference in Melbourne celebrating Edison’s birthday, and they selected one student from each high school in the state of Victoria to go. And I was chosen to go from my high school. That was a really interesting experience. It was a few days in Melbourne, and people came to speak about the future of science. I got to meet other students from around the state of Victoria. Edison, of course, had been from the United States, so the conference had an international feel. I was very lucky to do that.

The speakers were from Australia?

Some of them were from overseas, but some were from Australia. I remember there were people who talked in particular about the future of computing.

A PhD, the next natural step

Going back to your time at the University of Melbourne, were there professors there who were special to you, who encouraged you?

Yes, especially as I got into my third year. In Australia, the regular bachelor’s degree was a three-year degree. Then you could do an honors year, if you got high enough marks, and would write an honors thesis as well as take some courses. So I did that. In my second and third years, I had taken a couple of courses in analysis from a professor, Jerry Koliha, and really liked them, so I chose to do my honors thesis with him on pseudo-inverses of operators. It’s part of functional analysis. I really liked his mathematical style. He was very precise but also very enthusiastic about mathematics. That was a good experience.

Then I stayed on to do a master’s degree, which at the University of Melbourne was purely by research. I did research in differential games, with David Wilson, which I also enjoyed.

What does that mean, differential games?

It’s game theory, but where the state of the system that you are looking at is dynamic. It’s governed by a differential equation. You have two or more players who have controls in those differential equations that can influence the behavior. A way to think about it is that with optimization or control, you have one player who’s trying to control the behavior of a differential equation. In differential games, you have more than one player, and they are competing against one another.

Differential games are used for modeling dynamic situations that arise in a whole host of applications. At the time there was a lot of interest in strategic applications, but today differential games are used in other areas, in engineering and even in data science.

Did you publish a paper about that research?

I published two papers.

That’s impressive. How did you come to the idea of going for a PhD in mathematics? Did you always have it in mind, or did your professors encourage you?

I think I always had it in mind as the next natural step. Certainly I had in my mind since I was an undergraduate that I would like to become a university professor. At the time, it was usual for Australian students who wanted to pursue a PhD to go overseas. Usually at that time people would go to the UK. I did apply to the UK, but I also applied to a couple of places in the US. Since I’d been doing research on differential games, I wanted to at least have the option of continuing that.

So I looked for places that had a strong mathematics department but where there was somebody doing differential games. I applied to both Stanford and Berkeley and got admitted to both. Stanford had someone in aeronautical engineering who did differential games. I got admitted to the math program, but I would have been able to work with that person if I’d wanted to. That’s how I chose Stanford. I could have gone to the UK, but if I wanted to continue doing differential games, it was a bit better to come to the US.

It’s definitely a different setup going to the US versus going to the UK. In the UK, you focus very quickly on what you want to pursue for your PhD, whereas in the US there is a lot more coursework first, and then you choose what you want to do. The Australian system is a lot like the British system. I feel like I got the best of both systems, because I got my early education in Australia and then I came to the US and got the benefits of graduate education here. It’s very easy here to pick up new subjects, because you can often just go and take a course.

Graduate school at Stanford

Ruth Williams, pictured here with her mother and father, receiving her PhD from Stanford University in 1983.
Photo courtesy of Ruth Williams.

You knew you were going abroad and would leave your family. Was that difficult?

Especially at that time, when air travel was not so common, it was a big step. It was very expensive to travel back home, so I only went back a couple of times to Australia during my PhD. But people wrote letters in those days, so I would write a letter every week, and my parents would write a letter every week. We used aerogrammes, as they were less expensive than a letter in a separate envelope. Phone calls were also very expensive and only used rarely. As an international student, I did have a host family, the Richerts, who would invite me, together with some other Stanford graduate students, to events like Thanksgiving. That was a good way to experience American family life. The Richerts were very kind to me, and we have stayed in touch over the years.

Was there much of a culture shock to go to California, to a university that must have had a very different style and feel from Melbourne?

In some ways yes, although California is quite a bit like Australia, in terms of the vegetation and climate. I think the first thing that struck me was that the lectures in Australia were more formal than the lectures in the United States. At the time, there was not very much reliance on textbooks in Australia. A lecturer would write on the board very carefully, and you would take very careful notes. So I was used to learning just from whatever somebody presented in a lecture, rather than reading a textbook. I think that’s a good skill to have, to be able to take notes and listen well.

Of course, as I said, the system is a bit different too, in that for a PhD in Australia, once you start, you’ve already selected who you want to do your PhD with. The PhD was shorter, three years, because people were usually supported on fellowships. In the US in mathematics, people are most often supported on TAships [teaching assistantships]. I was fortunate for the first couple of years to have partial support from a scholarship from the University of Melbourne. That meant I didn’t have to TA quite as much in the first few years, though mainly I was supported by a TAship from Stanford. For the TAship, I largely had to teach my own courses, which had about thirty students. I was responsible for teaching the whole course, including grading and examinations. That was a very good experience.

At that time, in PhD programs in the US, there was coursework in the beginning that you didn’t have in Australia or Britain. I took probability courses from Kai Lai Chung and other people. One nice thing about Stanford was that there were people doing probability in different parts of the university. There were people in statistics, in operations research, even in the business school, who were doing applied probability, who were inventing new probability models and analyses.

My first year I took a game theory course as well, because I wasn’t sure if I still wanted to do differential games at that point. That was very interesting, because game theory was in the economics department. They had a summer program in game theory, and they selected me to write the notes from the course they were teaching. I got to meet some famous people in game theory, like Robert Aumann. That was pretty amazing. I wouldn’t have met those people in Australia. That’s one of the bonuses of having come to the US.

Eventually I decided to do probability rather than game theory. And being a probability student at Stanford was great, because there were lots of visitors coming through to give a talk or to stay for longer periods of time, and you got to meet luminaries in probability. It was an excellent place to be for that.

You and some other women PhD students took a reading course with Samuel Karlin. Can you tell me about that?

We were the three women out of about a dozen PhD students in my year. We had a lot of our initial classes together, but it also turned out that we were all interested in probability. I think we’d taken the basic graduate probability course together. Sam Karlin was writing a book on stochastic processes — actually a two-volume set — with Howard Taylor [of Cornell University]. Stochastic processes weren’t emphasized so much in the basic probability course, so we wanted to learn more about that. We asked Karlin if he would do a reading course with us.

The three of us would meet with him once a week and go over the reading we’d been doing and ask questions. There were some long chapters, including one of more than 100 pages on diffusion processes, and we worked our way through that. That was a great experience, both learning together — we would talk with one another about what we were reading — but also interacting with Sam Karlin. He was always very enthusiastic about whatever he was working on and would tell us about his research. I got to meet some of the people visiting Stanford who were working with him, like Simon Tavaré. There were always probability visitors coming through, either visiting Chung or Karlin, who were the two main probabilists in the mathematics department.

What was Karlin like as a person?

He was always very nice to us. He was very hard working and committed to what he was doing. You’d go into his office, and immediately he would start talking about math. He was so enthusiastic about it, it was infectious.

He was also very prolific. He started collaborating with biologists at some point in his career, so there were always a lot of people coming through, consulting with him about problems they needed to have solved. I remember him saying at one point that when he made the transition to working with biologists, he learned you needed to have experiments. I always remembered that, that it’s important to be connected to the experimental side of biology if you want to work with biologists.

We three women who were in the reading course all went into probability. I worked with Kai Lai Chung, Amy Rocha worked with Joe Keller, who had come fairly recently to Stanford, and Marge Foddy worked with Sam Karlin.

Three women out of an incoming class of about a dozen PhD students — that’s a large percentage of women.

It was a big percentage. In other years, there were no incoming women students. It was maybe a bit unusual. But it was good to have other women in the class, especially since we were all interested in probability.

Was it important to you that other women were there? Did you feel any of what has been called the “chilly climate” that women in PhD programs have sometimes felt — that they felt out of place or were not taken seriously?

I don’t think I really felt that, but it’s always good to not be too isolated. It was good that there were other women in the program. At that time, in science and engineering in general, there weren’t so many women. The Dean of Graduate Affairs, Jean Fetter, was cognizant of this, and organized the making of a little booklet about women in science and engineering at Stanford. In my final year, her office helped to initiate a lecture series for women in science and engineering, where women faculty or women in industry would come and give a short presentation about their research and there would be a reception afterward. I was the coordinator for that. The series was a good way to connect with people who had gone on to successful careers in science and engineering. It was also a way to meet women graduate students from other departments.

It sounds like you adjusted very well to Stanford. You found a good place for yourself.

I think it worked out well, especially when I found probability. For every graduate student there is this time when you are looking for an advisor: Who can I work with?

Ruth Williams got her PhD thesis problem from Mike Harrison (center), with whom she wrote several papers. Another collaborator of hers is Jim Dai of Cornell University (right). The three are pictured here in a photo from 2009.
Photo courtesy of Ruth Williams.

A period of uncertainty.

Yes. Kai Lai Chung was my advisor in math, but I got my thesis problem through taking courses in other departments — actually a course in the business school taught by Mike Harrison.1 So I really had two advisors, Mike Harrison and Kai Lai Chung. Stanford was a good place to do that because they had a scheme where you could have advisors from different departments. At some point [S.R.S.] Varadhan visited Stanford, and I met with him. He gave me helpful input on the problem I was working on and introduced me to submartingale problems as a way to characterize diffusion processes.

What are your memories of Kai Lai Chung?

He was enthusiastic about mathematics but also had a meticulous and elegant style. I liked that. It suited me well. Chung and I wrote a book together on stochastic integration, which grew out of a topics course he taught while I was a PhD student.2

In another advanced topics course I took with Chung, he mentioned an open problem about the finiteness of the Feynman–Kac gauge related to solutions of the reduced Schrödinger equation. A little bit later I was in a PDE course, and I realized that some technique from the PDE course could be used to look at the problem — it was actually a conjecture. Part of the problem was that it wasn’t clear whether the conjecture was true or not. I could use the PDE result to show that, at least under some smoothness conditions, the answer was yes, the conjecture was correct. That led to the first paper I wrote as a PhD student, before I was actually working on my thesis problem proper, which was on a different topic.

When you make connections like that, you know you are really a mathematician.

Yes. There’s a real thrill in discovering something new and proving that it’s correct. It’s a bit hard to describe. You work on a problem really hard, and you try this way and that way, and it doesn’t work out. But then eventually you get some idea, and it just clicks. It’s really a wonderful feeling.

Of course the first thing you do is you check it many times to make sure you’re not fooling yourself!

Feynman said that the easiest person to fool is yourself.

Yes, I think you have to be your own worst critic. That’s a useful talent to acquire, to be able to find your own mistakes.

Probability: “It was what I was looking for.”

So you met Mike Har­ris­on through a course in the busi­ness school?

Yes. He offered a course on stochast­ic cal­cu­lus and its ap­plic­a­tions. It was a great course, and even­tu­ally he made a book out of it, which was very well re­ceived. It was pretty stand­ard at Stan­ford that gradu­ate stu­dents from oth­er de­part­ments would take prob­ab­il­ity courses in the math de­part­ment, and also prob­ab­il­ity stu­dents would take prob­ab­il­ity-re­lated courses in oth­er de­part­ments.

Dur­ing the course Mike would some­times put some prob­lems on the black­board that were al­most re­search prob­lems, or little pieces of something re­lated to re­search prob­lems. So I got a taste of the kind of things he was in­ter­ested in. It was a blend of ap­plic­a­tions and rig­or­ous math­em­at­ics. After that course, I talked to him about wheth­er there might be some re­search ques­tions there that I could work on. He sug­ges­ted something, and I went away and thought about it, came back, made a little pro­gress on it — that’s how it evolved to me work­ing on that prob­lem with him as a coad­visor, and Kai Lai Chung as my ad­visor in math. So it all worked out in the end. I just fol­lowed the math to where I thought the in­ter­est­ing prob­lems were.

That’s part of be­ing a math­em­atician too, know­ing what you find in­ter­est­ing and trust­ing that. Also hav­ing a taste for prob­lems.

I think some of that you get with ex­per­i­ence too. Maybe I was lucky that I already had some re­search ex­per­i­ence, so I was on the lookout for prob­lems. I didn’t really know prob­ab­il­ity at the gradu­ate level be­fore I went to Stan­ford, so that was something new for me. The first year I was tak­ing be­gin­ning courses, and the second year I took a prob­ab­il­ity course and then star­ted tak­ing more ad­vanced courses. So it took a little while to find prob­ab­il­ity, but it was what I was look­ing for. I really like it as a field be­cause it touches oth­er areas of math­em­at­ics, both pure and ap­plied, but also it in­ter­faces with a lot of ap­plic­a­tions. It’s the the­or­et­ic­al basis for stat­ist­ics, it in­ter­sects with op­er­a­tions re­search and with ap­plic­a­tions in lots of dif­fer­ent fields in sci­ence and en­gin­eer­ing. Re­cently I’ve been work­ing with bio­lo­gists. Prob­ab­il­ity is a per­fect field for me.

There are different parts to probability. I tend to work more on the continuous side of probability. There is also the discrete side, which tends to be more combinatorial and algebraic. I am more on the side that is closer to analysis. Probability is built on top of analysis and has a lot of connections with it, but if you just know analysis, it doesn't mean you can do probability. There's extra structure there, different problems. And there is randomness everywhere in the world, so there are lots of different things to work on.

Do you remember the so-called Monty Hall Problem that was all over the news years ago? It came from a game show where contestants had to choose which door to open; one door would have a nice prize, and the other doors would each have a gag prize, like a goat. A newspaper columnist wrote about the correct strategy to use in some particular case, and many mathematicians wrote letters complaining that she'd gotten it wrong. In fact the columnist was right. Do you remember this?

I seem to re­call that the prob­lem de­pends on ex­actly what you know at what point, but also on how you phrase the ques­tion.

Yes. The math­em­aticians who wrote let­ters didn’t un­der­stand the prob­ab­il­ity as­pect of the prob­lem. They didn’t have the in­tu­ition of prob­ab­il­ists, which is a dif­fer­ent kind of in­tu­ition, is that right?

I agree that prob­ab­il­ity has dif­fer­ent or ad­di­tion­al in­tu­ition. I think of it as an ex­tra lay­er that you are put­ting on everything else that you know. It doesn’t mean that you can’t use all of the know­ledge you have from math­em­at­ics, but there is an ex­tra lay­er that ran­dom­ness brings.
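The switching advantage in the standard version of the puzzle — where the host always opens a door hiding a goat — is easy to check numerically. The following short simulation is an editorial illustration, not part of the conversation; the function name and parameters are our own:

```python
import random

def play(switch: bool) -> bool:
    """Play one round of Monty Hall; return True if the contestant wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the contestant's pick nor the car.
    host = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != host)
    return pick == car

random.seed(0)
n = 100_000
stay = sum(play(switch=False) for _ in range(n)) / n
sw = sum(play(switch=True) for _ in range(n)) / n
print(stay, sw)  # staying wins about 1/3 of the time, switching about 2/3
```

The estimates come out near 1/3 for staying and 2/3 for switching, matching the columnist's answer — and illustrating Williams's point that everything depends on exactly what the host is assumed to know and do.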

Can you talk about that lay­er? What kind of in­tu­ition do you need to un­der­stand ran­dom things?

Of­ten as hu­man be­ings we are con­di­tioned to think de­term­in­ist­ic­ally; we want to think de­term­in­ist­ic­ally. Some­times people’s in­tu­ition leads to cer­tain as­sump­tions, for ex­ample, that things are equally likely when they are not, or that they are in­de­pend­ent when they are not. I’ve seen this with dy­nam­ic­al sys­tems, for ex­ample. People are very used to ask­ing, What is the long-term be­ha­vi­or of this dy­nam­ic­al sys­tem? If that sys­tem has noise in it, you have to phrase such ques­tions dif­fer­ently. A dif­fu­sion pro­cess, for ex­ample — which is a solu­tion of a stochast­ic dif­fer­en­tial equa­tion — could ba­sic­ally go every­where. People are used in dy­nam­ic­al sys­tems to ask­ing: Does it con­verge to zero? You have to phrase the prob­ab­il­ity ques­tions dif­fer­ently, like: How long does it take to get to a neigh­bor­hood of the ori­gin? And even­tu­ally when it gets there, it might still vis­it every­where else. If you have in­tu­ition that is re­lated to de­term­in­ist­ic mo­tion, you don’t nat­ur­ally think of those kinds of things.

So prob­ab­il­ity is great — it’s like an ex­tra as­pect that can be there, and it provides ad­di­tion­al tools for try­ing to solve things.

You men­tioned the course with Mike Har­ris­on on stochast­ic cal­cu­lus and ap­plic­a­tions. Can you give an ex­ample of an ap­plic­a­tion there?

He was very in­ter­ested in queueing and in­vent­ory con­trol. It comes up in ap­plic­a­tions in man­u­fac­tur­ing, but also the same kinds of mod­els come up in tele­com­mu­nic­a­tions and even in trans­port­a­tion. These were queueing net­work mod­els. That was my first in­tro­duc­tion to the con­nec­tion between queueing net­works and dif­fu­sion pro­cesses called re­flect­ing Browni­an mo­tions. That was a nice con­nec­tion to learn about.

Stochastic calculus has as many applications as regular calculus has! And probably more. Whenever you have dynamical systems subject to noise, stochastic calculus is a tool for trying to analyze those systems. It comes up in physics, engineering, economics, and finance. The Black–Scholes model in finance is built on a stochastic differential equation, but you can have much more complicated finance models than that. Stochastic calculus was invented by Kiyoshi Itô and is sometimes called Itô calculus. After Itô, there was a lot of development of general theory, especially by the French school of probability.

After you got your PhD you went to the Cour­ant In­sti­tute and worked with Varadhan.

I had already had some in­ter­ac­tion with him about my thes­is, as I men­tioned. Some years be­fore that, he had had a sab­bat­ic­al at Stan­ford and had talked to Mike Har­ris­on about the re­flect­ing Browni­an mo­tion prob­lems. Then when I was a PhD stu­dent at Stan­ford, Varadhan vis­ited Stan­ford for a day or so, and I met him through Mike Har­ris­on. Varadhan and I dis­cussed the re­flect­ing Browni­an mo­tion prob­lem I was work­ing on, and even­tu­ally that led to a joint pa­per.3 When I was at Cour­ant, I talked to him about vari­ous re­flect­ing Browni­an mo­tion prob­lems.

Cour­ant was a great place to be. On the top floor of the build­ing they had a tearoom where people would go for af­ter­noon tea and lunch. The fac­ulty and stu­dents would gath­er there and talk math­em­at­ics. There was a very friendly at­mo­sphere for postdocs and stu­dents. It was pretty easy to talk to the fac­ulty. You’d sit down at lunch with Cath­leen Mor­awetz and oth­ers.

How was it for you to meet Cath­leen Mor­awetz?

I knew of her, but at Cour­ant was the first time I met her. Of course she was to me a gi­ant in math­em­at­ics, though we are in dif­fer­ent fields. I think the first time I sat in the tearoom for lunch, she was talk­ing about the re­search mono­graph she’d been work­ing on over the sum­mer. It was great to be able to meet her in this in­form­al at­mo­sphere.

After I left Cour­ant, I in­ter­ac­ted a bit with Cath­leen when she was in­volved with the AMS [Amer­ic­an Math­em­at­ic­al So­ci­ety], in­clud­ing when she was AMS pres­id­ent. She was a very im­press­ive per­son and al­ways very nice to me.

There had been no wo­men on the math fac­ulty at Stan­ford, right?

There was Mary Sun­seri, but she only taught cal­cu­lus. I don’t think she did re­search. The whole time I was at Stan­ford there were no wo­men re­search­ers on the fac­ulty.

Was Cath­leen Mor­awetz the first wo­man math re­search­er you met?

No. I went to a few con­fer­ences when I was a gradu­ate stu­dent in my last year and met some wo­men re­search­ers there. I met Cath­er­ine Doléans-Dade at one of the con­fer­ences in the Mid­w­est. She was at the Uni­versity of Illinois, a very good prob­ab­il­ist. I also met Cindy Green­wood, who was at the Uni­versity of Brit­ish Columbia, and I met a few oth­er wo­men prob­ab­il­ists when I was on the job mar­ket and vis­ited some places.

No academic jobs in Australia

Since our last con­ver­sa­tion, I googled “his­tory of com­put­ing in edu­ca­tion in Aus­tralia” and pretty quickly found your fath­er. Westy Wil­li­ams was his name?

Yes.

He was a vis­ion­ary, as you said, and he had a big ef­fect.

There’s been a little bit of his­tory writ­ten in Bendigo about what he did, in­clud­ing a book pub­lished re­cently about the his­tory of the col­lege he was at. There was a nice blurb in that book about what he did there, and they in­ter­viewed me for that. The col­lege — it’s now called La Trobe Uni­versity — named the com­puter cen­ter after my fath­er.

One art­icle I found on the web noted how hard it was at the time to move com­puters around, be­cause they were huge and heavy. They took up a whole room. You said the com­puters your fath­er got were bought from the UK?

This newspaper clipping from the Bendigo Advertiser from the 1960s shows an article about Ruth Williams’s father, Westy Williams, and his work introducing computers into education in Bendigo. The larger photo shows the booties one had to wear when coming into the computer room, to keep out dust and dirt.

He was in­flu­en­tial in ac­quir­ing sev­er­al com­puters over sev­er­al years. The first few com­puters I think were from Eng­land, from ICL [In­ter­na­tion­al Com­puters Lim­ited]. Later some were from CDC [Con­trol Data Cor­por­a­tion] in the US. Cer­tainly there was noth­ing be­ing man­u­fac­tured in Aus­tralia; they had to come from over­seas. The ma­chine I used was huge and needed spe­cial air con­di­tion­ing, which at that time was kind of a rar­ity. You had to try to keep the room dust-free, so ini­tially any­body who went in the com­puter room had to wear little booties. But it was trans­form­at­ive to have ac­cess to com­put­ing power like that.

When we talked last time we left off when you were at Cour­ant. Then you went to UC­SD and have stayed in San Diego your whole ca­reer. So that was a good fit for you. When did you go to San Diego? Was that 1986?

No, 1984 was when I physically arrived. The final year of my PhD, I was looking for jobs. I got the tenure-track offer from San Diego and the postdoc offer from Courant. San Diego let me accept its offer and take leave for the first year, to do the postdoc at Courant. After that I went to San Diego. I've been very happy here; it's been great. I was attracted to UCSD's strong probability group. Ronald Getoor and Michael Sharpe were here, real experts in stochastic processes, especially Markov processes. So that was attractive for me. Of course San Diego has a great math department too.

It was a tough time when I fin­ished my PhD. I thought ini­tially I would go back to Aus­tralia, but there were no aca­dem­ic jobs there. There were few in the US, so I was for­tu­nate to get both a postdoc and a ten­ure-track po­s­i­tion.

Yes, the 1980s was not an easy time for the math­em­at­ics job mar­ket in the US — but no aca­dem­ic jobs in Aus­tralia!

If I’d gone back to Aus­tralia, I could have be­come an ac­tu­ary, which is not really what I wanted to do! Not that there’s any­thing wrong with be­ing an ac­tu­ary, but I did have in my mind for a long time that I would like to be a uni­versity pro­fess­or. I like do­ing re­search and teach­ing the next gen­er­a­tion. For­tu­nately it worked out, but it wasn’t al­ways clear that it was go­ing to work out.

Developing theory for reflecting Brownian motion

Ruth Williams (center) at the 8th World Congress in Probability and Statistics in Turkey in 2012. That year she served as President of the Institute of Mathematical Statistics. She is pictured here with members of the IMS Executive Committee (left to right): Judith Rousseau (Ecole Nationale de la Statistique et de L’Administration Economique), Bin Yu (UC Berkeley), Hans-Rudolf Künsch (ETH Zurich), Aurore Delaigle (University of Melbourne), and Jean-Didier Opsomer (Colorado State University).
Photo: IMS Publications.

I’d like to talk now about your math­em­at­ic­al work. A lot of it has centered on ques­tions com­ing from prob­lems around queueing net­works. Can you say what these queueing net­works are and give a con­crete ex­ample?

I would say that my in­terests broadly are in stochast­ic pro­cesses and their ap­plic­a­tions. Ini­tially, I worked on re­flect­ing Browni­an mo­tion be­cause it was an in­ter­est­ing stochast­ic pro­cess, and there wasn’t a lot of the­ory de­veloped for it. It arose as an ap­prox­im­a­tion to a queueing net­work, but there wasn’t a lot of the­ory de­veloped in gen­er­al for those kinds of re­flect­ing Browni­an mo­tions.

Typ­ic­ally the state space was a poly­hed­ron and re­flec­tion at the bound­ary was ob­lique, where the dir­ec­tion of re­flec­tion had a dis­con­tinu­ity at the in­ter­sec­tion of bound­ary faces. This made the prob­lem non­stand­ard. In the 1980s and early 1990s, I worked a lot on de­vel­op­ing a found­a­tion­al the­ory for such re­flect­ing Browni­an mo­tions, es­pe­cially on ex­ist­ence and unique­ness and char­ac­ter­iz­ing their be­ha­vi­or. This in­cluded work with my first PhD stu­dent, Lisa Taylor. So I didn’t work on the queueing net­work side ini­tially; I worked more on the stochast­ic pro­cess side. Then over time, it be­came ap­par­ent that in mak­ing that con­nec­tion between the re­flect­ing Browni­an mo­tions and the queueing net­works, there were some in­ter­est­ing out­stand­ing prob­lems.

I'll describe what a queueing network is in a moment! But in the early 1990s, people realized that these queueing networks, when they were heterogeneous — i.e., when they processed different types of jobs — could be unstable, while the analogous homogeneous networks were stable. That was a surprise. So trying to figure out when they were stable, and when they could be approximated by reflecting Brownian motions, became an important problem. For homogeneous networks, the approximation had already been developed by Mike Harrison and Martin Reiman. But for what were called multiclass queueing networks — meaning the heterogeneous case — it hadn't been fully developed. There were two different aspects to that. You needed a theory for the reflecting Brownian motions, but you also needed to prove limit theorems showing that you could approximate the multiclass queueing networks by one of these reflecting Brownian motions.

To come to what queueing net­works are: You have en­tit­ies that come in­to a net­work — they could be cus­tom­ers, or jobs, or pack­ets of data, or even mo­lecules in a bio­lo­gic­al ap­plic­a­tion. These en­tit­ies need pro­cessing at vari­ous nodes or sta­tions in the net­work. They might need to queue up to wait for pro­cessing, and that’s where the queueing as­pect comes in. They might need to vis­it more than one sta­tion in the net­work be­fore they leave, and that rout­ing in the net­work can be ran­dom. So there are dif­fer­ent sources of ran­dom­ness: The times between ar­rivals can be ran­dom, the times to pro­cess things can be ran­dom, and the rout­ing in the net­work can be ran­dom.

Queueing networks can be used to model different applications in science and engineering — things like manufacturing processes and telecommunications, originally telephone networks and later the Internet and wireless networks. Customer service systems and big call centers have been a source of interest in more recent times. But you can use queueing networks to model even things like biological networks, which is something I have been very interested in recently.

Stochast­ic net­works — or stochast­ic pro­cessing net­works — are a more gen­er­al ver­sion of queueing net­works. A lot of the the­ory of queueing net­works star­ted to be de­veloped in the 1950s — well, it goes back even earli­er than that, but there were a lot of de­vel­op­ments in the 1950s and 1960s, though largely for ho­mo­gen­eous net­works. A lot of the work was on ex­act ana­lys­is of the net­works. Then in the 1960s, ap­prox­im­a­tions star­ted to be de­veloped, both by the Rus­si­an school, es­pe­cially Al­ex­an­dr Borovkov, and also by [Don­ald] Ig­le­hart and [Ward] Whitt in the US.4

You men­tioned ex­act ana­lys­is, and I won­der why that doesn’t al­ways work. En­tit­ies come in­to the net­work, they get pro­cessed, they move on — it’s a step-by-step sys­tem, so if you think na­ively, it seems you could cre­ate a map of everything that could hap­pen. But it’s not that simple?

There are two things that make it more com­plic­ated. One is that there is a lot of ran­dom­ness in vari­ous as­pects of the mod­el, like times between ar­rivals and pro­cessing times. These are usu­ally ran­dom vari­ables; they are not usu­ally de­term­in­ist­ic. The oth­er thing is that the rout­ing in the net­work could also be prob­ab­il­ist­ic, rather than de­term­in­ist­ic, and there can be com­plex feed­back pat­terns in the rout­ing.

You might think, well, we know the av­er­age rate at which things are ar­riv­ing, the av­er­age rate at which things are pro­cessed, and so forth. So we should be able to fig­ure out un­der what con­di­tions we can make it so that everything just flows smoothly through the sys­tem. If everything were de­term­in­ist­ic, you could do that. But be­cause of the ran­dom­ness in things like the pro­cessing times, you will of­ten get queueing. For ex­ample, there will of­ten be in­stances when the amount of time it takes to pro­cess something is great­er than the in­ter­val un­til the next ar­rival, so the next ar­rival is go­ing to have to wait for ser­vice.
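The phenomenon Williams describes — balanced average rates, yet queueing caused purely by randomness — can be seen in a few lines with the classical Lindley recursion for successive waiting times in a single-server queue, roughly W_next = max(0, W + S − A), where S is a service time and A the following interarrival time. The sketch below is an editorial illustration with parameter values of our choosing, not a model from the interview:

```python
import random

def avg_wait(service, interarrival, n=200_000, seed=42):
    """Average waiting time from the Lindley recursion
    W_next = max(0, W + S - A) over n simulated customers."""
    rng = random.Random(seed)
    w = total = 0.0
    for _ in range(n):
        w = max(0.0, w + service(rng) - interarrival(rng))
        total += w
    return total / n

# Both systems: arrivals at average rate 1, service capacity 1.25 (80% utilization).
deterministic = avg_wait(lambda r: 0.8, lambda r: 1.0)
randomized = avg_wait(lambda r: r.expovariate(1.25), lambda r: r.expovariate(1.0))
print(deterministic, randomized)
```

With deterministic times the average wait is exactly zero; with exponential times at the same rates, the average wait is substantial (for this M/M/1 example, theory gives a mean wait around 3.2 time units), even though the server is idle 20% of the time on average.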

This is an un­der-ap­pre­ci­ated as­pect of pro­cessing net­works, that ran­dom­ness can cause queueing. People like to think de­term­in­ist­ic­ally, but the ran­dom­ness in the sys­tem is im­port­ant, es­pe­cially if you try to bal­ance the sys­tem, which is a nat­ur­al thing to try to do. Man­agers of sys­tems try to make sure that they don’t un­der-util­ize re­sources, so they typ­ic­ally try to keep these sys­tems in what’s called “heavy traffic”, that is, where there is full util­iz­a­tion of the re­sources. But then you can get sub­stan­tial queueing and bot­tle­necks in the net­work. Ran­dom­ness in any­thing — in­terar­rival times, ser­vice times, and so on — af­fects queueing. So people try to design sys­tems to use the re­sources op­tim­ally and also to un­der­stand what is caus­ing con­ges­tion and how it might be al­le­vi­ated.

That’s where math­em­at­ic­al mod­els can help a lot, be­cause these sys­tems can be quite com­plex, and they can have feed­back. An en­tity might go around sev­er­al times to the same sta­tion, per­haps be­cause something has to re­peat pro­cessing or be­cause it’s a ma­chine that puts down many lay­ers of a ma­ter­i­al, such as in semi­con­duct­or wafer fab­ric­a­tion. Once there is feed­back in the sys­tem, it can be dif­fi­cult to tell un­der what con­di­tions the sys­tem will be stable, in the sense that the av­er­age queue lengths don’t grow without bound. In queueing there are two as­pects. One is to un­der­stand how a sys­tem be­haves and to study its per­form­ance, to meas­ure things like queue lengths or idle time in the sys­tem. An­oth­er is to fig­ure out what good con­trols are for the sys­tem. One of the things that I’ve worked on is In­ter­net con­ges­tion con­trol — for ex­ample, what kinds of policies make the sys­tem be­have well. It can be very com­plex and dif­fi­cult to un­der­stand what’s caus­ing con­ges­tion and in­stabil­ity.

Ruth Williams signing the induction book at the 150th anniversary of the National Academy of Sciences in 2013.
Photo: National Academy of Sciences.

I men­tioned the ex­amples that came out in the early 1990s, where queueing net­works that people thought should be stable un­der stand­ard con­di­tions turned out to not be stable, in the sense that av­er­age queue lengths could grow without bound. Even quite simple sys­tems, once they have this feed­back phe­nomen­on and cer­tain kinds of dis­cip­lines for serving or pro­cessing en­tit­ies in the sys­tem, can be­come un­stable. We un­der­stand that bet­ter now, be­cause people have worked on it since the early 1990s.

You might ask, Why wasn’t that un­der­stood earli­er? These sys­tems have been used in man­u­fac­tur­ing and tele­com­mu­nic­a­tions for a long time. Part of the reas­on is that, for ex­ample, in man­u­fac­tur­ing, there were man­agers run­ning around try­ing to re­solve con­ges­tion when they found it. That’s a kind of con­trol fea­ture that can mask an un­der­ly­ing source of in­stabil­ity in the sys­tem, when there might have been more sys­tem­at­ic ways to deal with it. That’s an ex­ample where math­em­at­ic­al mod­els really helped people to un­der­stand these mul­ti­class, or het­ero­gen­eous, queueing net­works.

You star­ted work­ing on re­flect­ing Browni­an mo­tions in the con­text of queueing net­works. Browni­an mo­tion comes from stat­ist­ic­al mech­an­ics, which makes me think of atoms col­lid­ing with each oth­er. It seems coun­ter­in­tu­it­ive that something like that would ap­ply to queueing net­works.

One of the nice things about work­ing on a the­ory for a math­em­at­ic­al ob­ject is that you might be mo­tiv­ated to study it for one ap­plic­a­tion, but then it turns out to have oth­er ap­plic­a­tions. Re­flect­ing Browni­an mo­tions arise in phys­ics and also in fin­ance. But let me ex­plain why it’s nat­ur­al for them to come up in queueing sys­tems.

It’s true that Browni­an mo­tion is a mod­el for mo­lecules col­lid­ing with one an­oth­er. But also, in stat­ist­ics, you can get Browni­an mo­tion as a scal­ing lim­it of a ran­dom walk. A ran­dom walk is a stochast­ic pro­cess that takes dis­crete steps, where those steps are giv­en by a se­quence of in­de­pend­ent, identic­ally dis­trib­uted ran­dom vari­ables. A simple ex­ample of a ran­dom walk is, you are walk­ing on a line, and every time you take a step you flip a coin. If it comes up heads, you take one step for­ward, if it comes up tails, you take one step back. If you speed up time, like you’re watch­ing a movie, and shrink the size of the steps in the right way, in the lim­it of the ran­dom walk you get Browni­an mo­tion.

Sums of independent, identically distributed random variables like that come up everywhere in statistics. A wonderful aspect of Brownian motion is given by Donsker's theorem, which says that for this random walk approximation, all the random steps need to have is a finite mean and variance, and then the statistics of the Brownian motion that you get in the limit depend only on those two statistics. It's insensitive to the rest of the distribution of those steps. That's called an invariance principle, in the sense that you can approximate quite general random walks, with general distributions for the step sizes, by this universal object, the Brownian motion.
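The invariance Williams describes can be checked empirically: rescale a random walk by 1/sqrt(n) and the position at time 1 looks standard normal regardless of the step distribution, so long as the steps have mean 0 and variance 1. This small Monte Carlo sketch (an editorial illustration, not from the interview) compares coin-flip steps with uniform steps:

```python
import random
import statistics

def scaled_endpoint(n, step, rng):
    """Position at time 1 of an n-step random walk, rescaled by 1/sqrt(n)."""
    return sum(step(rng) for _ in range(n)) / n ** 0.5

rng = random.Random(1)
n, trials = 200, 10_000

# Two very different step distributions, both with mean 0 and variance 1:
steps = {
    "coin": lambda r: r.choice([-1.0, 1.0]),              # fair coin flips
    "uniform": lambda r: r.uniform(-3 ** 0.5, 3 ** 0.5),  # flat distribution
}

results = {}
for name, step in steps.items():
    xs = [scaled_endpoint(n, step, rng) for _ in range(trials)]
    results[name] = (statistics.mean(xs), statistics.stdev(xs))
    print(name, results[name])  # both should come out near (0, 1)
```

Both empirical means come out near 0 and both standard deviations near 1, as the limiting Brownian motion at time 1 is a standard normal variable in each case.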

There are oth­er ex­amples of that kind of thing in stat­ist­ic­al phys­ics, things like KPZ [the Kardar–Par­isi–Zhang stochast­ic par­tial dif­fer­en­tial equa­tion], where you get an in­vari­ant ob­ject ap­prox­im­at­ing a lot of in­ter­act­ing particle sys­tems. That’s very use­ful. I’m build­ing to the con­nec­tion with queueing net­works!

Let me just talk about a single queue. In a single queue, sup­pose you have a Pois­son ar­rival pro­cess and a se­quence of in­de­pend­ent, identic­ally dis­trib­uted ex­po­nen­tial pro­cessing times, where the av­er­age ar­rival rate is equal to the av­er­age pro­cessing rate. Then the queue is in heavy traffic. When the queueing sys­tem has jobs in the queue and there is a new ar­rival, that’s like tak­ing a step up; the queue length goes up by one. If a job fin­ishes be­ing pro­cessed, the queue length goes down by one. When the queue is nonempty, it is fol­low­ing a (con­tinu­ous time) ran­dom walk, go­ing up by one and down by one. You can ap­prox­im­ate that by a Browni­an mo­tion. This ap­prox­im­a­tion can be gen­er­al­ized to where the in­terar­rival and pro­cessing times are not just ex­po­nen­tially dis­trib­uted, via an in­vari­ance prin­ciple. Now, when you get to the bound­ary — that is, when the queue length reaches zero — you need to keep the queue length non­neg­at­ive. That’s where the re­flec­tion comes in, in re­flect­ing Browni­an mo­tion. It’s not really a mir­ror re­flec­tion; an­oth­er term people have used is reg­u­lated Browni­an mo­tion. It’s a kind of con­trol, or reg­u­la­tion, at the bound­ary, which keeps the pro­cess non­neg­at­ive.
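In one dimension, the "regulation" at zero that Williams describes has an explicit form, often called the Skorokhod reflection map: if X is the unconstrained path with X(0) = 0, the regulated path is Z(t) = X(t) − min(0, min over s ≤ t of X(s)), which pushes up just enough at the boundary to keep Z nonnegative. A minimal sketch on a discretized path (an editorial illustration, with Gaussian increments standing in for Brownian motion on a fine grid):

```python
import random

def reflect(increments):
    """One-dimensional Skorokhod reflection of the walk X built from the
    given increments: Z(t) = X(t) - min(0, min_{s<=t} X(s)).  The subtracted
    running minimum is the minimal 'pushing' needed to keep Z >= 0."""
    x = running_min = 0.0
    z = []
    for dx in increments:
        x += dx
        running_min = min(running_min, x)
        z.append(x - min(0.0, running_min))
    return z

rng = random.Random(7)
path = reflect([rng.gauss(0.0, 0.1) for _ in range(10_000)])
print(min(path))  # never goes below zero
```

As in the queue, the path behaves like an unconstrained random walk while it is away from zero, and the correction term acts only at the boundary — which is why "regulated Brownian motion" is a better mental picture than a mirror reflection.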

In the one-di­men­sion­al case, it’s very easy to de­scribe how to keep the queue length non­neg­at­ive. But once you go to mul­tiple di­men­sions, the con­straint at the bound­ary means that you might have one queue that be­comes empty, but it’s feed­ing in­to an­oth­er queue, and then there is lost po­ten­tial flow to the next queue. In terms of the ef­fect at the bound­ary, you get an ob­lique dir­ec­tion of re­flec­tion on the bound­ary rather than nor­mal re­flec­tion, be­cause of the struc­ture of the queueing net­work. So these re­flect­ing Browni­an mo­tions have what’s called ob­lique re­flec­tion at the bound­ary of, say, an or­thant in d-di­men­sion­al space, where d is the num­ber of sta­tions or nodes in the net­work. They are dif­fer­ent from many of the re­flec­ted dif­fu­sions that were stud­ied earli­er, which were of­ten in smooth do­mains with nor­mal re­flec­tion.

When I star­ted to get in­volved in this area, there was a little bit of the­ory for these ob­liquely re­flec­ted Browni­an mo­tions as­so­ci­ated with ap­prox­im­a­tions to ho­mo­gen­eous queueing net­works. However, there wasn’t a the­ory to cov­er oth­er ap­prox­im­a­tions, in par­tic­u­lar, those com­ing from most het­ero­gen­eous queueing net­works. So one of the things I was in­volved in early on was de­vel­op­ing a the­ory for these re­flect­ing Browni­an mo­tions that would cov­er more of a full spec­trum of the stochast­ic pro­cesses you would ex­pect to get. I al­ways had in mind the queueing ap­plic­a­tions, but I didn’t really start to work on prov­ing queueing lim­it the­or­ems un­til a bit later on, after I had de­veloped the­ory for the re­flect­ing Browni­an mo­tions. It was in the 1990s that I star­ted work­ing on prov­ing lim­it the­or­ems. Be­fore that, I was work­ing a lot on the fun­da­ment­al the­ory for the re­flect­ing Browni­an mo­tions.

Regarding limit theorems, in the late 1990s, Maury Bramson and I coordinated in developing a modular approach5 to proving heavy traffic limit theorems for multiclass queueing networks. This involved using the asymptotic behavior of hydrodynamic limits, called fluid models, to prove a dimension reduction on the diffusion time scale, called state space collapse, which then fed into proving a diffusion approximation for the queueing network.

Broadening research through collaboration

You’ve had many col­lab­or­at­ors over the years. Can you tell me about what col­lab­or­a­tion means to you?

I have been for­tu­nate to have many won­der­ful col­lab­or­at­ors. These range from PhD stu­dents and postdocs to col­leagues at UC­SD and around the world. These col­lab­or­at­ors have in­cluded math­em­aticians, as well as re­search­ers from fields of ap­plic­a­tion such as bio­logy, cog­nit­ive sci­ence, con­trol the­ory, and op­er­a­tions man­age­ment. While some col­lab­or­a­tions have been with stu­dents, postdocs, vis­it­ors and col­leagues at UC­SD, a sub­stan­tial num­ber have come from ex­ten­ded vis­its I have made to pro­grams at re­search in­sti­tutes, and also on sab­bat­ic­al vis­its to vari­ous uni­versit­ies around the world. I found such vis­its es­pe­cially use­ful for learn­ing of new re­search prob­lems and mak­ing con­nec­tions with new col­lab­or­at­ors to work with. Sup­port I re­ceived from UC­SD, as well as from vari­ous fel­low­ships and in­sti­tu­tions, was in­valu­able in mak­ing such vis­its pos­sible. This was es­pe­cially im­port­ant early in my ca­reer and has con­tin­ued to be an im­port­ant way for me to broaden my re­search.

One of my col­lab­or­at­ors is my hus­band of 30 years, Bill Helton, who is also a math­em­atician at UC­SD. Bill has a great sense of hu­mor and has been very sup­port­ive of me in my ca­reer. Al­though he is in a dif­fer­ent field, we wrote a couple of pa­pers to­geth­er. One of them, with Ghe­orghe Cra­ciun, was on ho­mo­topy meth­ods for count­ing re­ac­tion net­work equi­lib­ria and grew out of an IMA [In­sti­tute for Math­em­at­ics and Its Ap­plic­a­tions] work­shop.6 The oth­er pa­per I have with Bill was writ­ten with two oth­er col­lab­or­at­ors, Frank Kelly and Ilze Zied­ins, both ex­perts on stochast­ic net­works. That pa­per was on ana­lys­is of a traffic net­work with in­form­a­tion feed­back and on­ramp con­trols.7

You men­tioned the bio­lo­gic­al ap­plic­a­tions you have been work­ing on re­cently. Can you tell me about that?

Since the 1990s, I have worked on queueing in more general stochastic networks and on proving limit theorems that justify various continuous stochastic processes as approximations, including certain measure-valued (infinite-dimensional) processes. I also got interested in control problems for these stochastic networks. In the mid-2000s, inspired by another workshop at the IMA, I became interested in models that came out of biological networks. I thought that was a very interesting direction for applied probability. I started investigating what are called stochastic chemical reaction networks. And one of the things I realized pretty quickly was that to work on applications related to biology, it was good to be connected to people who did experiments, as Sam Karlin had told me.

I found there was a group here at UC­SD work­ing in syn­thet­ic bio­logy, led by Jeff Hasty and Lev Tsim­ring. They were us­ing some stochast­ic and de­term­in­ist­ic mod­els to ap­prox­im­ate small ge­net­ic cir­cuits. That seemed very suit­able as a po­ten­tial ground for stochast­ic mod­el­ing and ana­lys­is. At that time NSF [Na­tion­al Sci­ence Found­a­tion] had a nice pro­gram called In­ter­dis­cip­lin­ary Grants in the Math­em­at­ic­al Sci­ences, which aimed to en­cour­age math­em­aticians to spend a year vis­it­ing a group in an­oth­er dis­cip­line in which they had nev­er worked be­fore. This was ideal for me, be­cause I hadn’t col­lab­or­ated with bio­lo­gists be­fore. I ap­plied for one of those grants and got it, and I went to vis­it the Hasty–Tsim­ring lab for a year.

We ended up work­ing on some stochast­ic mod­els of en­zymat­ic pro­cessing, which you can think of as queueing-like mod­els. As usu­ally hap­pens, ini­tially you can use some ex­ist­ing the­ory, but then you have to ex­tend it in dif­fer­ent ways to ad­apt to the dif­fer­ent situ­ations. That was a great col­lab­or­a­tion that went on for sev­er­al years with the lab and got me my start in col­lab­or­at­ing with bio­lo­gists, es­pe­cially with people who were do­ing cel­lu­lar and mo­lecu­lar bio­logy.

The vis­it I made to the Hasty–Tsim­ring lab was partly a sab­bat­ic­al that I had from UC­SD. In 2019 I had an­oth­er sab­bat­ic­al, this time in Bo­ston, vis­it­ing the Cen­ter of Math­em­at­ic­al Sci­ences and Ap­plic­a­tions at Har­vard, and also MIT. While there I con­nec­ted with Dom­itilla Del Vec­chio, who is also a re­search­er in syn­thet­ic bio­logy. We have an on­go­ing col­lab­or­a­tion about mod­el­ing epi­gen­et­ic cell memory. I find it very in­triguing to work on stochast­ic net­work mod­els like this, where there is a need to de­vel­op some new the­ory. Also, of­ten I find if you work on an ap­plic­a­tion, ques­tions come up that haven’t already been answered by the ex­ist­ing the­ory. It’s a good way to get nat­ur­al ques­tions and make sure you are ask­ing good, chal­len­ging ques­tions. I find this col­lab­or­a­tion very ex­cit­ing.

Ruth Williams being inducted into the Australian Academy of Science in 2019.
Photo: Australian Academy of Science.

That’s amaz­ing you jumped in­to this new area. Was it dif­fi­cult to get up to speed to be able to talk to these people who were do­ing such a dif­fer­ent kind of work?

I had col­lab­or­ated be­fore with en­gin­eers, in mech­an­ic­al, in­dus­tri­al, and elec­tric­al en­gin­eer­ing. But I found that col­lab­or­at­ing with bio­lo­gists, I had to stretch fur­ther. Bio­phys­i­cists were good in­ter­me­di­ar­ies. They of­ten use math­em­at­ic­al lan­guage but also know the ex­per­i­ment­al side. Lev Tsim­ring is a bio­phys­i­cist, and Jeff Hasty trained as a bio­phys­i­cist, al­though he is very in­volved in ex­per­i­ments now as well. There was also a tal­en­ted bio­phys­ics postdoc, Will Math­er, whom I talked with a lot when I went to the Hasty–Tsim­ring lab. That helped a great deal. Gen­er­ally in in­ter­dis­cip­lin­ary col­lab­or­a­tions, it’s a real team ef­fort.

But I found ini­tially that read­ing bio­logy pa­pers was like learn­ing a dif­fer­ent lan­guage. In a math­em­at­ics pa­per, of­ten there is a suc­cinct for­mula, and if you un­der­stand the for­mula, you un­der­stand a lot about what is go­ing on. The bio­lo­gic­al pa­pers used a lot more words and few­er for­mu­las. When I gave talks, I found that even though there might be a beau­ti­ful for­mula that you could show, it was im­port­ant to do things like show a graph of an in­stance of the for­mula — even though one for­mula is worth a thou­sand graphs! It’s just a dif­fer­ent way of present­ing things. I def­in­itely felt that there was a steep learn­ing curve in start­ing to col­lab­or­ate with bio­lo­gists, but I had very good people I could talk to. Of­ten it’s the postdocs and the gradu­ate stu­dents who you can talk to on a daily basis and learn many things from.

In en­gin­eer­ing of­ten there is already a mod­el, but in bio­logy of­ten there’s not. You can make a mod­el that’s so com­plic­ated that you wouldn’t be able to do any­thing with it. So you have to ask a lot of ques­tions about what’s im­port­ant to put in­to the mod­el. Also, there are of­ten para­met­ers that aren’t known very pre­cisely. For all of that it was very help­ful to be em­bed­ded in the lab.

After you de­veloped these bio­lo­gic­al mod­els, how were they then used?

In the Hasty–Tsim­ring lab, and also in the work that I’m do­ing now with Dom­itilla Del Vec­chio at MIT, the mod­els help to guide the ex­per­i­ments. As is typ­ic­al in ap­plied math, there is a feed­back loop: You do an ex­per­i­ment, and you find that maybe the res­ults are a bit dif­fer­ent from what the mod­el is pre­dict­ing, so you go back and re­fine the mod­el. The mod­el is also very help­ful in ex­plor­ing dif­fer­ent para­met­er re­gimes that you might not be able to fully ex­plore with ex­per­i­ments, which can be very costly. Some of the mod­els I’ve worked on might be even­tu­ally used to help com­bat dis­ease, al­though what I am do­ing at the mo­ment is more ba­sic sci­ence.

An “Aha!” moment earns an ice cream cake

What do you do when you do math­em­at­ics? Do you go out for a walk and look at the trees? Do you lie on your back, close your eyes, and enter some oth­er world? How does it work?

I mostly sit with a pad and pa­per, but I al­ways find it’s help­ful to take a walk and think. I have to ad­mit that math is in my head a lot of the time! It’s kind of a nat­ur­al thing. When you are work­ing on a prob­lem, you first have to learn enough about it so that you can carry it around in your head and think about it. But it doesn’t take a hol­i­day — it’s al­ways in your head!

The prob­ab­il­ist Chris Burdzy had a Warschawski Vis­it­ing As­sist­ant Pro­fess­or po­s­i­tion at UC­SD some years ago. We had a prob­ab­il­ity sem­in­ar where Chris was giv­ing a talk one day. He men­tioned an open prob­lem and said that, if some­body solved it, that per­son would get an ice cream cake. Well, I was in­ter­ested in the prob­lem, not ne­ces­sar­ily the ice cream cake! But later that week I was tak­ing a walk across a park in San Diego, and I had an idea about how to solve this open prob­lem. That was an “Aha!” mo­ment. It turned out it was a good idea, and Chris and I wrote a pa­per to­geth­er.8 At the de­par­ture party for Chris when he went on to his next po­s­i­tion, the ice cream cake ap­peared.

He made good on his prom­ise.

He did, yes. Math is in my head a lot of the time. That’s just the way it is. It’s a pas­sion.

You said you have to know enough about the prob­lem to be able to carry it around in your mind. Can you talk about what that thing is that you carry in your mind? Is it a pic­ture? Equa­tions? Shapes? Or some pro­cess that un­folds? What does that look like?

I’m a visu­al thinker. A lot of the prob­lems I work on in­volve stochast­ic pro­cesses, and I am of­ten in­ter­ested in what people would call the sample path be­ha­vi­or. For ex­ample, con­sider re­flect­ing Browni­an mo­tion in three di­men­sions, which lives in the pos­it­ive or­thant, or in two di­men­sions, which lives in the pos­it­ive quad­rant. It runs around like Browni­an mo­tion, but it’s got this re­flec­tion at the bound­ary. So I have those pic­tures in my mind, of how the sample paths be­have. Of­ten you are in­ter­ested in ques­tions like, does a sample path hit the corner of the quad­rant? How long does it take to get there? Then some­times equa­tions or con­structs come to mind about how you might prove that.

I don’t really see line-by-line proofs when I visualize things; it’s more trying to get an idea about how to solve the problem. I’m a geometric thinker. Even with stochastic processes, I think about the geometry of what’s going on, especially how these sample paths behave, which has connections with analysis.

A sample path for re­flect­ing Browni­an mo­tion is ba­sic­ally a con­tinu­ous path, so you are fol­low­ing this ran­dom con­tinu­ous func­tion as it moves around in a state space. The bound­ary be­ha­vi­or is a bit sin­gu­lar, so things are not usu­ally ab­so­lutely con­tinu­ous, and that’s a little bit tricky, but it’s a kind of stochast­ic dif­fer­en­tial equa­tion with state con­straints. To con­nect with ana­lys­is, a use­ful tool is stochast­ic cal­cu­lus, or Itô cal­cu­lus, which I men­tioned be­fore. When I think about how to ap­ply those tools, usu­ally then I would want to get my pen­cil and pa­per out and start writ­ing things down, be­cause it’s easy to fool your­self at that stage!
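[As an illustration only, not part of the interview: the sample-path picture described above can be approximated numerically. The following is a minimal Python sketch of a reflecting Brownian motion in the positive quadrant, using an Euler scheme in which each Gaussian increment is projected back onto the quadrant to mimic normal reflection at the boundary. The step size, time horizon, and starting point are arbitrary choices for the sketch, and projection is only a crude stand-in for the rigorous Skorokhod-type construction.]

```python
import math
import random

def reflected_bm_2d(n_steps=10_000, dt=1e-3, x0=(1.0, 1.0), seed=0):
    """Approximate a sample path of Brownian motion in the positive
    quadrant with normal reflection, via an Euler scheme that projects
    each step back onto the quadrant."""
    rng = random.Random(seed)
    x, y = x0
    path = [(x, y)]
    s = math.sqrt(dt)  # standard deviation of each Gaussian increment
    for _ in range(n_steps):
        x += rng.gauss(0.0, s)
        y += rng.gauss(0.0, s)
        # Clamping at zero approximates reflection at the two faces.
        x, y = max(x, 0.0), max(y, 0.0)
        path.append((x, y))
    return path

# One can then ask path questions of the kind mentioned above,
# e.g. how close this (approximate) path comes to the corner.
path = reflected_bm_2d()
closest = min(math.hypot(x, y) for x, y in path)
print(f"closest approach to the corner: {closest:.3f}")
```

[Such a simulation only suggests behavior; questions like whether the true process hits the corner require the analytic and stochastic-calculus tools discussed above.]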

In November 2024 Ruth Williams (third from right) attended an American Institute of Mathematics SQuaRE with her student Yi Fu (far right) and other collaborators (left to right): Greg Rempala, Lea Popovic, Hye-Won Kang, and Wasiur KhudaBukhsh.
Photo courtesy of Wasiur KhudaBukhsh.

Do you do any pro­gram­ming now, say to carry out little ex­per­i­ments that might oc­cur to you?

I often have some graduate students, or occasionally a postdoc, do some programming, especially in Mathematica. For example, with the biology models at the moment, I have a very good graduate student, Yi Fu, helping me with that. A wonderful thing that’s changed since I first wrote computer programs is that there are a lot more high-level languages now, things like Mathematica and Matlab. So sometimes I’ll write a little Mathematica code, maybe to illustrate a theorem or test a conjecture. If students write the code, sometimes I’ll tinker with it and produce examples and test different scenarios.

Also, I coteach a com­pu­ta­tion­al fin­ance course in the Rady School of Man­age­ment, where we teach some al­gorithms, though for most of the cod­ing, we have a TA who helps.

Have you also done research in finance, or is this just a course that you enjoy teaching?

I’ve al­ways had a little bit of an in­terest in fin­ance, be­cause it’s an ap­plic­a­tion of stochast­ic cal­cu­lus. So I de­veloped a course at UC­SD on math­em­at­ic­al fin­ance and wrote an in­tro­duct­ory book, which is pub­lished by the AMS.9 At UC­SD we have both an un­der­gradu­ate- and a gradu­ate-level math­em­at­ics course based on that book. In ad­di­tion, I am coteach­ing the com­pu­ta­tion­al fin­ance course with an­oth­er math­em­at­ics pro­fess­or. Fin­ance is a good ap­plic­a­tion of prob­ab­il­ity, and also stu­dents of­ten want to get some back­ground in it, to en­hance their job op­por­tun­it­ies. We have stu­dents from oth­er de­part­ments who take our math­em­at­ic­al fin­ance courses too, es­pe­cially stu­dents from eco­nom­ics, also some­times from phys­ics and en­gin­eer­ing. It’s a pop­u­lar top­ic.

It sounds like you have a lot of di­versity in your work.

Yes, I feel that, and I also feel that what I have worked on has changed with time. I’m al­ways look­ing to find new op­por­tun­it­ies. I like work­ing with oth­er people, and I like work­ing on prob­lems and de­vel­op­ing new math­em­at­ics. I have an open mind to that. I am bet­ter at cer­tain kinds of math­em­at­ics than oth­ers, so I tend more to the ana­lys­is side.

Encouraging the next generation

Ruth Williams has had about a dozen PhD students. Here she is pictured with two of them in 2009, Michael Kinnally (left) and Nam Lee (right).
Photo courtesy of Ruth Williams.

You said that in Aus­tralia you had to learn how to fol­low lec­tures and listen to people. Do you see stu­dents today de­vel­op­ing that skill? Every­one is watch­ing video lec­tures and read­ing ma­ter­i­al on the In­ter­net.

I think there is a lot of concern about that. I’m by no means an expert, but the ability to focus and listen to somebody when they are speaking or giving a lecture, and to take notes as well, are important skills that not everybody has these days. I think there is a tendency with videos for people to just skip through them and not really absorb a lot. That actually can make it more difficult to learn the mathematics.

There is something about writ­ing on a black­board, and the pace that comes with it, that seems to be very help­ful for people to be able to ab­sorb things. Math­em­at­ics it­self is a kind of short­hand lan­guage, and in a lec­ture you are try­ing to ab­sorb that lan­guage as well as new con­cepts. If you just fly through it in a video, you might not ab­sorb it very well. And we’ve seen that if stu­dents have videos to watch, they don’t ne­ces­sar­ily do as well as if they come to in-per­son lec­tures. I’ve seen that in my own classes, and also oth­er people have re­por­ted it. It’s a chal­len­ging is­sue. Learn­ing new math­em­at­ics takes time and pa­tience and prac­tice.

When we talked about your early life in Aus­tralia, you said you didn’t en­counter the at­ti­tude that “fe­males can’t do math­em­at­ics.” In fact, you were very much en­cour­aged by your par­ents and teach­ers. How was it later in your ca­reer? Did you feel that be­ing a wo­man in math­em­at­ics was an is­sue, or caused any prob­lems, or set up any obstacles?

One thing I no­ticed is that the fur­ther you go up, the few­er wo­men there are. I’ve tried to en­cour­age people who were more ju­ni­or than my­self. I find that I’m most help­ful at the in­di­vidu­al level. I think the most valu­able thing is if you can provide help and ad­vice that is ad­ap­ted to the in­di­vidu­al, wheth­er a man or a wo­man — things like re­view­ing a per­son’s grant pro­pos­al or com­ment­ing on a draft of their re­search state­ment. That all takes time, but those are the kinds of valu­able things that really help.

A dif­fer­ent way of help­ing is through pro­fes­sion­al ser­vice, and I have done a lot of that with vari­ous or­gan­iz­a­tions.10 I find it is one way for me to give back to the or­gan­iz­a­tions that have be­nefited me in my ca­reer and that foster math­em­at­ics and sci­ence re­search. Their role in help­ing ju­ni­or re­search­ers is es­pe­cially im­port­ant.

Go­ing back to the is­sue of be­ing a wo­man in math­em­at­ics — it’s some­times said that wo­men have less con­fid­ence than men, that wo­men lack con­fid­ence. As you spoke about your life, that didn’t seem to come up. You just were in­ter­ested in math and sci­ence, and then the ques­tion of con­fid­ence, the ques­tion of “Can I do this?”, got sub­sumed by the de­sire to learn and un­der­stand. To as­sume that you have to be ex­tremely con­fid­ent that everything will work out is maybe not ne­ces­sary or real­ist­ic.

When you are work­ing on a math­em­at­ics prob­lem, you don’t know wheth­er it’s go­ing to work out! I feel that I just have to try and fol­low what I think is in­ter­est­ing. But every­body wor­ries, “Will I be able to solve this prob­lem? Will I get my PhD? Will I get ten­ure?” I think that’s nat­ur­al. In work­ing on math­em­at­ics, there’ll be times when it’s tough, when a prob­lem doesn’t seem like it’s yield­ing, and you just have to try and push through, or step back and take an al­tern­at­ive ap­proach. Maybe with ex­per­i­ence one gains that per­spect­ive, which is a bit harder to have in the be­gin­ning. But I don’t think be­ing over­con­fid­ent is good!

You said about a pro­fess­or you had in Aus­tralia, “I liked his math­em­at­ic­al style.” Even at that early stage you were look­ing out for what was in­ter­est­ing to you, what you were drawn to, what ap­pealed to you, in­stead of wor­ry­ing, “Can I do it?”

That was def­in­itely true in my un­der­gradu­ate stud­ies. When I did my PhD, it took me a little while to find prob­ab­il­ity as the right field. I just tried dif­fer­ent things and kept look­ing. It was per­sist­ence, or stoicism — Aus­trali­ans tend to be sto­ic! Work­ing hard helps. And def­in­itely there were people who helped me along the way, and I am very grate­ful to them. I’ve been very lucky.