Professor, Department of Statistics & Actuarial Science
Dr. Bingham’s research focuses on the statistical aspects of uncertainty quantification. He develops statistical methodology for new types of data or new applications in fields ranging from cosmology to glaciology. This applied work is complemented by a fundamental research stream that explores the broader, more theoretical side of what computational models can and cannot do.
How has your research program evolved since becoming an independent researcher?
I started off working on the mathematical design of experiments. Although motivated by applications, the work was theoretical and technical in nature. Later, I became more involved with applications-based research. I am a methodologist rather than an applied statistician; I’m on the lookout for projects where the method needed for the data or the problem arising in the application isn’t in a book because it hasn’t been solved yet.
Your research impacts a wide variety of fields. What are some of your ongoing projects?
A lot of applied mathematicians, physicists, computer scientists, and others use computational models to describe and investigate physical systems. One such collaboration I have is with cosmologists who are programming supercomputers to run simulations that describe the Universe. The models are imperfect, but they represent the best physics known. This work is complemented by observations of real physical systems (e.g., measurements obtained using satellites). So, you have observations of reality, which are noisy and limited by the observational system, and then there is this imperfect computer model built using the best knowledge available. The question is how to combine the imperfect computer model with the noisy observations of reality to make predictions of what will happen in the future.
I work on a range of other projects that use computational models and observations, with my role being to use the computational models to make inferences. For example, in a glaciology project we are designing experiments and using computational models to make inferences about glacial evolution. Another team project, funded by the Center for Exascale Radiation Transport (CERT), is with nuclear engineers at Texas A&M University who are developing exascale computational models for very large supercomputers to understand thermal radiation transport.
Having had two terms as a Tier 2 Canada Research Chair (CRC), what did it allow you to do that you couldn't have done otherwise?
With the help of my Department and the University, I built a lab, which is not common in statistics, where typically all we need is computing resources. Through that excellent resource, I recruited students and created a rich environment for many visiting researchers. As a CRC I received protected time for research; this allowed me to reach out to scientists outside my university, travel to their labs, and have them visit my lab to build the research enterprise.
What personal research experience was the most exciting for you?
As a PhD student with Randy Sitter at SFU, I wrote an algorithm to generate all kinds of experimental designs so that I could investigate which was the best. My supervisor eventually asked me where I got the program, and when I told him that I wrote it, he looked at me and said, “Well, that is a paper!” It was exciting to realize that some of my ideas were not obvious to everybody else and that I could actually do research. I didn't know what a significant thought was, so this first taste of research was a real thrill.
As a student I first assumed that research was magic. The truth is, research is incremental; you have to understand what others have done to make progress yourself. Step zero is to read everything, to become an expert, and then you can apply your creative ideas and discover something new. Many people can do this if they work hard; there is a method in place, and it is not magic.