Fall 2015 UBC/SFU Joint Statistical Seminar
Saturday October 24th, 2015
Room 7000 in the Harbour Centre
Presented by PIMS and the Simon Fraser Graduate Student Society


Overview
The SFU-UBC Joint Graduate Student Workshop in Statistics is going into its 11th year. This is the first of two seminars taking place this school year: the Fall seminar is organized by graduate students from SFU and the Spring seminar by graduate students from UBC. The event offers graduate students in Statistics and Actuarial Science an opportunity to attend accessible talks that introduce active areas of research in the field. It also gives two students from each university the chance to present their work and to develop their presentation skills in front of their peers.

A new format this year consists of talks given by four students (two from UBC and two from SFU) and one professor (from SFU in the Fall and from UBC in the Spring). This change expands the networking aspect of the event, allowing students more time to meet their peers from the other institution: time is dedicated to an icebreaker and to a social trivia game with prizes at the end of the day. The seminar also keeps its traditional social components, the morning coffee and the lunch, where students get further opportunities to network and to foster a mutually beneficial relationship between the departments.

Information on previous seminars can be found on the UBC statistics department website (here).

Sponsorship

This seminar could not take place without the generous help of our sponsors: The Pacific Institute for the Mathematical Sciences (PIMS) and the Graduate Student Society at Simon Fraser University (GSS).

Agenda For Saturday October 24th
8:30 - 9:00
Coffee and Pastries at Blenz Coffee (508 West Hastings Street)
Across the street from the Harbour Centre

9:00 - 10:00
Networking Event: Icebreaker

10:00 - 10:25
Student Talk: Will Ruth
The Effect of Heteroscedasticity on Regression Trees (abstract below)

10:25 - 10:50
Student Talk: David Lee
Parsimonious Models for Multivariate Extremes (abstract below)

11:00 - 11:45
Faculty Talk: Liangliang Wang
Essentials for Success in Graduate Training (abstract below)

11:45 - 2:00
Lunch at Rogue Wetbar


2:00 - 2:25
Student Talk: David Kepplinger
Robust Regularized Covariance Estimation with Application to Linear Discriminant Analysis (abstract below)

2:25 - 2:50
Student Talk: Michael Grosskopf
Model Emulation and Calibration in Radiation Transport Experiments (abstract below)

2:50 - 3:15
Networking Event: Trivia and Prizes



Directions and Accessibility
The seminar takes place in Room 7000 at SFU's downtown Vancouver campus in the Harbour Centre (map). From SFU Burnaby, the 135 bus runs directly to the seminar location. From UBC, the 44 and 14 buses provide direct access. The Harbour Centre is also near Waterfront Station, which connects to all SkyTrain lines: the Canada Line, Expo Line and Millennium Line.

Link to last year's Fall Joint Seminar Agenda here


Abstracts

Model Emulation and Calibration in Radiation Transport Experiments

Calibration and assessment of predictive uncertainty in complex computer models is an important emerging field in scientific computing. As expensive computer models become more important across areas of science and engineering, their reliability and an understanding of the precision and accuracy of their predictions are critical. The work we present is motivated by research modeling radiation transport in high-energy-density physics experiments at the Center for Exascale Radiation Transport (CERT), where state-of-the-art high-performance computing is used at extreme scales to simulate the radiation dynamics. We present work addressing the unique challenges in combining the output from these models and experiments using the Kennedy-O'Hagan Bayesian framework for model calibration. Additionally, we will address the experimental design for field measurements, including a discussion of the value of replicates in this experimental environment.
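
For readers unfamiliar with it, the Kennedy-O'Hagan framework (Kennedy and O'Hagan, 2001) in its standard form links a field observation y at input x to the simulator output and a discrepancy term; the exact variant used in the CERT work may differ:

    y(x_i) = \eta(x_i, \theta) + \delta(x_i) + \varepsilon_i, \qquad \varepsilon_i \sim N(0, \sigma_\varepsilon^2),

where \eta(\cdot, \theta) is the (emulated) computer model with calibration parameters \theta, \delta(\cdot) is the model discrepancy, and both \eta and \delta are typically given Gaussian-process priors.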

Essentials for Success in Graduate Training

It is a long journey to obtain a master's or PhD degree, one that requires a great deal of determination, dedication, self-discipline, and effort. I will share my experiences as a graduate student and as a junior faculty member, and then provide my thoughts on some essential factors for successfully completing a graduate degree. I will also briefly talk about finding a job in academia.


Parsimonious Models for Multivariate Extremes

The modelling of extreme observations is important in many areas such as finance, hydrology and meteorology. In the univariate setting this can be achieved through the Generalized Extreme Value (GEV) distribution. However, there is no straightforward extension to multivariate observations, whose simultaneous investigation can be crucial because of the dependence among variables. In this presentation, I will introduce two classes of multivariate extreme value copula models that are useful for data exhibiting factor and vine structures. These models are parsimonious in the sense that the number of parameters grows linearly with the dimension of the data. A real data example will be given to illustrate how these models can be interpreted.
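
As background for the univariate case mentioned above, the GEV distribution has the standard cumulative distribution function

    G(z) = \exp\left\{ -\left[ 1 + \xi \left( \tfrac{z - \mu}{\sigma} \right) \right]^{-1/\xi} \right\}, \qquad 1 + \xi (z - \mu)/\sigma > 0,

with location \mu, scale \sigma > 0 and shape \xi (the \xi \to 0 limit gives the Gumbel form \exp\{-e^{-(z-\mu)/\sigma}\}). The talk concerns how dependence across several such margins can be modelled parsimoniously.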


The Effect of Heteroscedasticity on Regression Trees

Regression trees are a popular prediction tool and are the basis of numerous modern statistical learning ensembles. Part of their popularity is their ability to produce a regression prediction without ever specifying a structure for the mean model. However, the method implicitly assumes homogeneous variance across the entire explanatory-variable space. In this study, we assess the performance of the most popular regression-tree algorithm in a single-variable setting under a very simple step-function model for heteroscedasticity. We use simulation to show that the locations of splits, and hence the ability to accurately predict means, are both adversely influenced by the change in variance.
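
To give a flavour of the kind of experiment described above, the following is a minimal Python sketch (not the speaker's actual simulation; the mean function, variance levels, and change point at x = 0.5 are illustrative assumptions): it generates data with a step change in noise variance, fits a regression tree, and prints the split locations, which tend to be pulled toward the high-variance region.

    # Illustrative sketch: regression tree under step-function heteroscedasticity.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(2015)
    n = 1000
    x = rng.uniform(0, 1, size=(n, 1))
    sigma = np.where(x[:, 0] < 0.5, 0.5, 3.0)   # noise SD jumps at x = 0.5
    y = 2.0 * x[:, 0] + rng.normal(0.0, sigma)   # same simple mean everywhere

    tree = DecisionTreeRegressor(max_depth=3, random_state=0)
    tree.fit(x, y)

    # Internal nodes only (leaves carry the sentinel threshold -2).
    thresholds = tree.tree_.threshold[tree.tree_.threshold != -2]
    print(np.sort(thresholds))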

Robust Regularized Covariance Estimation with Application to Linear Discriminant Analysis

Its simple form makes linear discriminant analysis (LDA) a prevalent tool for classification, yet its dependence on an estimate of the precision matrix is a major drawback. In many applications more features than observations are available, and some of these observations may be contaminated, impeding use of this simple tool. Regularization techniques, or sparse methods, are well known to give good estimates of the precision matrix when the sample covariance matrix is rank-deficient or ill-conditioned; however, contamination also breaks these methods. By borrowing ideas from the FAST-MCD algorithm for robust multivariate location and scale estimation, a robust regularized estimate of the precision matrix can be obtained and used for LDA. In consideration of the classification context, a measure similar to the deviance used in other classification methods is defined and used to select the optimal value of the required regularization parameter. An extensive simulation study shows the superior performance of the new classification algorithm for high-dimensional, low-sample-size data in the presence of contaminated observations, as well as its high efficiency for uncontaminated data.
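
As a rough illustration of the general pipeline (robust location and scatter, then regularization, then the LDA rule), here is a hedged Python sketch. It is not the speaker's algorithm: it uses scikit-learn's MinCovDet as a stand-in for FAST-MCD (and therefore still requires more observations than features per class, unlike the method in the talk), applies a simple ridge-type shrinkage toward a scaled identity, and the function names robust_lda_fit and robust_lda_predict are hypothetical.

    # Sketch: robust, regularized precision estimate plugged into the LDA rule
    #   d_k(x) = x' S^{-1} m_k - 0.5 m_k' S^{-1} m_k + log(pi_k).
    import numpy as np
    from sklearn.covariance import MinCovDet

    def robust_lda_fit(X, y, shrinkage=0.1):
        classes = np.unique(y)
        p = X.shape[1]
        means, priors, pooled = [], [], np.zeros((p, p))
        for k in classes:
            Xk = X[y == k]
            mcd = MinCovDet(random_state=0).fit(Xk)   # robust location/scatter
            means.append(mcd.location_)
            priors.append(len(Xk) / len(X))
            pooled += (len(Xk) / len(X)) * mcd.covariance_
        # Ridge-style shrinkage toward a scaled identity keeps S well-conditioned.
        target = np.trace(pooled) / p * np.eye(p)
        S = (1 - shrinkage) * pooled + shrinkage * target
        return classes, np.array(means), np.array(priors), np.linalg.inv(S)

    def robust_lda_predict(X, classes, means, priors, precision):
        scores = (X @ precision @ means.T
                  - 0.5 * np.sum(means @ precision * means, axis=1)
                  + np.log(priors))
        return classes[np.argmax(scores, axis=1)]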