Project with Liangliang Wang

Efficient Statistical and Computational Methods for Genetics and Dynamical Models

This project aims to develop Bayesian methods for Transformer-based large language models (LLMs) and to explore applications of these models. Two approaches will be considered: 1) modifying the Transformer architecture to incorporate parameter uncertainty; 2) using pretrained LLMs as priors or likelihoods in a Bayesian model for downstream tasks. We will also explore efficient training schemes for the proposed methods. The USRA student for this project should be strong in software development in popular programming languages (e.g., R, Python, Java) and familiar with Bayesian statistics and LLMs. The student will help me implement the proposed methods and apply them to real data.
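To illustrate the flavor of approach 1), here is a minimal NumPy sketch (not the project's actual method) of a linear layer with a Gaussian variational posterior over its weights; the class name and all parameter settings are hypothetical. Each forward pass samples weights via the reparameterization trick, so repeated passes expose parameter uncertainty in the layer's output:

```python
import numpy as np

rng = np.random.default_rng(0)

class BayesianLinear:
    """Hypothetical linear layer with a Gaussian variational
    posterior over its weight matrix."""

    def __init__(self, n_in, n_out):
        # Variational posterior parameters: per-weight mean and log-stddev.
        self.mu = rng.normal(0.0, 0.1, size=(n_in, n_out))
        self.log_sigma = np.full((n_in, n_out), -2.0)

    def __call__(self, x):
        # Reparameterization trick: w = mu + sigma * eps, eps ~ N(0, I).
        eps = rng.normal(size=self.mu.shape)
        w = self.mu + np.exp(self.log_sigma) * eps
        return x @ w

layer = BayesianLinear(4, 2)
x = np.ones((1, 4))

# Monte Carlo forward passes: the spread across samples reflects
# the uncertainty in the layer's parameters.
samples = np.stack([layer(x) for _ in range(200)])
pred_mean = samples.mean(axis=0)
pred_std = samples.std(axis=0)
```

In a full model these posterior parameters would be trained by maximizing an evidence lower bound (ELBO); the same sampling idea is what a Bayesian Transformer layer would apply to its attention and feed-forward weights.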