Sitemap

A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.

Pages

Posts

Future Blog Post

less than 1 minute read

Published:

This post will show up by default. To disable scheduling of future posts, edit _config.yml and set future: false.

Blog Post number 4

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 3

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 2

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 1

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Portfolio

Publications

Nonlinear Stability at the Zigzag Boundary

Published in arXiv preprint, 2020

An investigation into the dynamics of roll solutions at the zigzag boundary of the planar Swift-Hohenberg equation. Done as part of an REU at Ohio University during the summer of 2019.
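For reference, one standard form of the planar Swift-Hohenberg equation (the textbook form with a cubic nonlinearity; the preprint may use a slightly different variant) is

$$
\partial_t u = -(1 + \Delta)^2 u + r u - u^3, \qquad u = u(x, y, t),
$$

where $r$ is the bifurcation parameter. Roll solutions are the spatially periodic, striped patterns whose stability region is bounded in part by the zigzag boundary studied here.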

Download here

Sensitivity Analysis of the Information Gain in Infinite-Dimensional Bayesian Linear Inverse Problems

Published in arXiv preprint, 2023

We study the sensitivity of infinite-dimensional Bayesian linear inverse problems governed by partial differential equations (PDEs) with respect to modeling uncertainties. In particular, we consider derivative-based sensitivity analysis of the information gain, as measured by the Kullback-Leibler divergence from the posterior to the prior distribution. To facilitate this, we develop a fast and accurate method for computing derivatives of the information gain with respect to auxiliary model parameters. Our approach combines low-rank approximations, adjoint-based eigenvalue sensitivity analysis, and post-optimal sensitivity analysis. The proposed approach also paves the way for global sensitivity analysis by computing derivative-based global sensitivity measures. We illustrate different aspects of the proposed approach using an inverse problem governed by a scalar linear elliptic PDE, and an inverse problem governed by the three-dimensional equations of linear elasticity, which is motivated by the inversion of the fault-slip field after an earthquake.
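For orientation, in the finite-dimensional linear Gaussian case the information gain referred to above, the Kullback-Leibler divergence from the posterior $\mathcal{N}(m_{\mathrm{post}}, C_{\mathrm{post}})$ to the prior $\mathcal{N}(m_{\mathrm{pr}}, C_{\mathrm{pr}})$, has the standard closed form (illustrative notation, not taken from the paper):

$$
D_{\mathrm{KL}} = \frac{1}{2}\left[\operatorname{tr}\!\left(C_{\mathrm{pr}}^{-1} C_{\mathrm{post}}\right) + \left(m_{\mathrm{post}} - m_{\mathrm{pr}}\right)^{\top} C_{\mathrm{pr}}^{-1} \left(m_{\mathrm{post}} - m_{\mathrm{pr}}\right) - n + \ln\frac{\det C_{\mathrm{pr}}}{\det C_{\mathrm{post}}}\right],
$$

where $n$ is the parameter dimension. In the infinite-dimensional setting of the paper the analogous expression requires more care, and its derivatives with respect to auxiliary model parameters are what the proposed method computes.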

Download here

PyOED: An Extensible Suite for Data Assimilation and Model-Constrained Optimal Design of Experiments

Published in arXiv preprint, 2023

This paper describes PyOED, a highly extensible scientific package that enables developing and testing model-constrained optimal experimental design (OED) for inverse problems. Specifically, PyOED aims to be a comprehensive Python toolkit for model-constrained OED. The package targets scientists and researchers interested in understanding the details of OED formulations and approaches. It is also meant to enable researchers to experiment with standard and innovative OED technologies with a wide range of test problems (e.g., simulation models). OED, inverse problems (e.g., Bayesian inversion), and data assimilation (DA) are closely related research fields, and their formulations overlap significantly. Thus, PyOED is continuously being expanded with a plethora of Bayesian inversion, DA, and OED methods, as well as new scientific simulation models, observation error models, and observation operators. These pieces are added such that they can be permuted to enable testing OED methods in various settings of varying complexity. The PyOED core is written entirely in Python and utilizes the language's inherent object-oriented capabilities; however, the current version of PyOED is meant to be extensible rather than scalable. Specifically, PyOED is developed to enable rapid development and benchmarking of OED methods with minimal coding effort and to maximize code reutilization. This paper provides a brief description of the PyOED layout and philosophy and provides a set of exemplary test cases and tutorials to demonstrate the potential of the package.

Download here

Talks

Infinite-Dimensional Bayesian Inversion for Fault Slip from Surface Measurements

Published:

Abstract: Given the inability to directly observe the conditions of a fault line, inversion of the parameters describing them has been a subject of practical interest for the past couple of decades. To resolve this under a linear elasticity forward model, we consider Bayesian inference in the infinite-dimensional setting, given surface displacement measurements, resulting in a posterior distribution characterizing the initial fault displacement. We employ adjoint-based gradient computation to solve the underlying PDE-constrained optimization problem, and we leverage both dimensionality reduction in the parameter space and the low-rank structure of the resulting posterior covariance, owing to sparse measurement locations, to carry out these computations in a scalable manner.
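As a sketch of the linear Gaussian structure being exploited (illustrative notation, not taken from the talk): with a linear parameter-to-observable map $F$, noise covariance $\Gamma_{\mathrm{noise}}$, Gaussian prior $\mathcal{N}(m_{\mathrm{pr}}, C_{\mathrm{pr}})$, and data $d$, the posterior is Gaussian with

$$
C_{\mathrm{post}} = \left(F^{*}\Gamma_{\mathrm{noise}}^{-1}F + C_{\mathrm{pr}}^{-1}\right)^{-1}, \qquad m_{\mathrm{post}} = C_{\mathrm{post}}\left(F^{*}\Gamma_{\mathrm{noise}}^{-1}d + C_{\mathrm{pr}}^{-1}m_{\mathrm{pr}}\right).
$$

Because the data-misfit term $F^{*}\Gamma_{\mathrm{noise}}^{-1}F$ has rank at most the number of measurements, the posterior covariance is a low-rank update of the prior, which is what makes the computation scalable despite the high dimension of the discretized parameter.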

Computing Eigenvalue Sensitivities for Sensitivity Analysis of the Information Gain in Bayesian Linear Inverse Problems

Published:

Abstract: We consider sensitivity analysis of Bayesian inverse problems with respect to modeling uncertainties. To this end, we consider sensitivity analysis of the information gain, as measured by the Kullback-Leibler divergence from the posterior to the prior. This choice provides a principled approach that leverages key structures within the Bayesian inverse problem. Also, the information gain reduces to the Bayesian D-optimal design criterion in the case of linear Gaussian inverse problems. However, the derivatives of the information gain are not simple to compute, nor are finite differences always possible, let alone scalable. To solve half the puzzle, in this talk we present a method for computing eigenvalue sensitivities of implicitly defined linear operators appearing in PDE-constrained optimization problems. Specifically, we consider eigenvalue sensitivities of the so-called data-misfit Hessian and its preconditioned counterpart. We start with simple examples and work our way up to the expressions in the information gain. Our approach relies on adjoint-based methods for gradient and Hessian computation. The resulting expressions for the sensitivities are exact and can be computed in a scalable manner.
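To fix notation for the objects mentioned above (again illustrative, not the speaker's exact notation): with linear parameter-to-observable map $F$, noise covariance $\Gamma_{\mathrm{noise}}$, and prior covariance $C_{\mathrm{pr}}$, the data-misfit Hessian and its prior-preconditioned counterpart are

$$
H_{\mathrm{misfit}} = F^{*}\Gamma_{\mathrm{noise}}^{-1}F, \qquad \widetilde{H} = C_{\mathrm{pr}}^{1/2}\, H_{\mathrm{misfit}}\, C_{\mathrm{pr}}^{1/2}.
$$

In the linear Gaussian case the Bayesian D-optimal criterion (the expected information gain) can be written in terms of the eigenvalues $\lambda_i$ of $\widetilde{H}$ as $\tfrac{1}{2}\sum_i \ln(1 + \lambda_i)$, so differentiating the criterion with respect to auxiliary parameters reduces to differentiating these eigenvalues, which is where the eigenvalue sensitivities enter.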

Sensitivity Analysis of the Information Gain in Infinite-Dimensional Bayesian Linear Inverse Problems

Published:

Abstract: We consider sensitivity analysis of Bayesian linear inverse problems with respect to modeling uncertainties. To this end, we consider sensitivity analysis of the information gain, as measured by the Kullback-Leibler divergence from the posterior to the prior. This choice provides a principled approach that leverages key structures within the Bayesian inverse problem. Also, the information gain admits a closed-form expression in the case of linear Gaussian inverse problems. The derivatives of the information gain, however, are extremely challenging to compute. To address this challenge, we present accurate and efficient methods that combine eigenvalue sensitivities and hyper-differential sensitivity analysis, taking advantage of adjoint-based gradient and Hessian computation. This results in a computational approach whose cost, in number of PDE solves, does not grow upon mesh refinement. These results are presented on an application-driven model problem: a simplified earthquake model used to infer fault slip from surface measurements.

Teaching

Tutor and TA for Fundamental Algorithms

Graduate course, New York University, Department of Computer Science, 2017

In the fall semesters of 2017 and 2018, I was a tutor and TA for the Fundamental Algorithms course, taught by Alan Siegel, at New York University.

Tutor and TA for Basic Algorithms

Undergraduate course, New York University, Department of Computer Science, 2018

In the spring semesters of 2018 and 2019, I was a tutor and TA for the Basic Algorithms course, taught by Alan Siegel, at New York University.

Tutor for Numerical Computing

Undergraduate course, New York University, Department of Computer Science, 2020

In the spring semester of 2020, I worked as an undergraduate tutor for the Numerical Computing course, taught by Margaret Wright.

Recitation Leader and Grader for Calculus

Undergraduate course, North Carolina State University, Department of Mathematics, 2020

In the fall semester of 2020, I taught a recitation section and graded for the Elements of Calculus course.

Recitation Leader and Grader for Calculus II

Undergraduate course, North Carolina State University, Department of Mathematics, 2021

In the spring semester of 2021, I taught a recitation section and graded for Calculus II, taught by Patrick Sprenger.

Recitation Leader and Grader for Calculus II

Undergraduate course, North Carolina State University, Department of Mathematics, 2021

During the first six-week summer session of 2021, I taught all recitation sections and graded for Calculus II, taught by Ashley Tharp.