# Sitemap

A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.

## Markdown

This is a page not in the main menu.

## Future Blog Post

Published:

This post will show up by default. To disable scheduling of future posts, edit `_config.yml` and set `future: false`.
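Jekyll reads this flag from the site configuration; a minimal sketch of the relevant fragment, assuming a standard Jekyll setup, is:

```yaml
# In _config.yml: with future: true, posts dated in the future are built
# and published immediately; set it to false to hold them back until
# their publication date arrives.
future: false
```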

## Blog Post number 4

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

## Blog Post number 3

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

## Blog Post number 2

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

## Blog Post number 1

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

## Portfolio item number 1

Short description of portfolio item number 1

## Portfolio item number 2

Short description of portfolio item number 2

## Nonlinear Stability at the Zigzag Boundary

Published in arXiv preprint, 2020

An investigation into the dynamics of roll solutions at the zigzag boundary of the planar Swift-Hohenberg equation, done as part of an REU at Ohio University during the summer of 2019.

## Infinite-Dimensional Bayesian Inversion for Fault Slip from Surface Measurements

Published:

Abstract: Given the inability to directly observe the conditions along a fault line, inverting for the parameters that describe it has been a subject of practical interest for the past couple of decades. Under a linear elasticity forward model, we consider Bayesian inference in the infinite-dimensional setting given surface displacement measurements, resulting in a posterior distribution characterizing the initial fault displacement. We employ adjoint-based gradient computation to solve the underlying partial-differential-equation-constrained optimization problem, and we leverage both dimensionality reduction in the parameter space and the low-rank structure of the resulting posterior covariance, owing to the sparse measurement locations, to carry out the computation in a scalable manner.
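For orientation, in the linear Gaussian setting the posterior this abstract refers to is again Gaussian and has a standard closed form; the notation below is assumed for illustration, not taken from the paper:

```latex
% Linear forward model d = B m + \eta with noise \eta \sim \mathcal{N}(0, \Gamma_{\mathrm{noise}})
% and Gaussian prior m \sim \mathcal{N}(m_{\mathrm{pr}}, \Gamma_{\mathrm{pr}}).
% The posterior is Gaussian with
\Gamma_{\mathrm{post}} = \bigl( B^{*}\,\Gamma_{\mathrm{noise}}^{-1} B + \Gamma_{\mathrm{pr}}^{-1} \bigr)^{-1},
\qquad
m_{\mathrm{post}} = \Gamma_{\mathrm{post}} \bigl( B^{*}\,\Gamma_{\mathrm{noise}}^{-1} d + \Gamma_{\mathrm{pr}}^{-1} m_{\mathrm{pr}} \bigr).
```

With only sparse surface measurements, the data-misfit term $B^{*}\Gamma_{\mathrm{noise}}^{-1}B$ has low effective rank, which is what makes a low-rank approximation of the posterior covariance both accurate and scalable.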

## Computing Eigenvalue Sensitivities for Sensitivity Analysis of the Information Gain in Bayesian Linear Inverse Problems

Published:

Abstract: We consider sensitivity analysis of Bayesian inverse problems with respect to modeling uncertainties. To this end, we study the sensitivity of the information gain, as measured by the Kullback-Leibler divergence from the posterior to the prior. This choice provides a principled approach that leverages key structures within the Bayesian inverse problem; moreover, the information gain reduces to the Bayesian D-optimal design criterion in the case of linear Gaussian inverse problems. However, the derivatives of the information gain are not simple to compute, and finite differences are not always feasible, let alone scalable. To solve half the puzzle, in this talk we present a method for computing eigenvalue sensitivities of implicitly defined linear operators appearing in PDE-constrained optimization problems. Specifically, we consider eigenvalue sensitivities of the so-called data-misfit Hessian and its preconditioned counterpart. We start with simple examples and work our way up to the expressions in the information gain. Our approach relies on adjoint-based methods for gradient and Hessian computation. The resulting expressions for the sensitivities are exact and can be computed in a scalable manner.
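For reference, in the finite-dimensional linear Gaussian case the information gain mentioned above has a standard closed form; the notation here (posterior $\mathcal{N}(m_{\mathrm{post}}, \Gamma_{\mathrm{post}})$, prior $\mathcal{N}(m_{\mathrm{pr}}, \Gamma_{\mathrm{pr}})$ on $\mathbb{R}^{n}$) is assumed for illustration, not taken from the talk:

```latex
% KL divergence from the Gaussian posterior to the Gaussian prior:
D_{\mathrm{KL}}\bigl(\mathcal{N}(m_{\mathrm{post}}, \Gamma_{\mathrm{post}})
  \,\|\, \mathcal{N}(m_{\mathrm{pr}}, \Gamma_{\mathrm{pr}})\bigr)
= \tfrac{1}{2}\Bigl[
    \operatorname{tr}\bigl(\Gamma_{\mathrm{pr}}^{-1}\Gamma_{\mathrm{post}}\bigr) - n
    + \lVert m_{\mathrm{post}} - m_{\mathrm{pr}} \rVert^{2}_{\Gamma_{\mathrm{pr}}^{-1}}
    + \log\det\bigl(\Gamma_{\mathrm{pr}}\Gamma_{\mathrm{post}}^{-1}\bigr)
  \Bigr].
```

Averaging this quantity over the data gives the expected information gain $\tfrac{1}{2}\log\det(I + \widetilde{H})$, where $\widetilde{H}$ denotes the prior-preconditioned data-misfit Hessian, which is precisely the Bayesian D-optimality criterion in this setting; the eigenvalues of $\widetilde{H}$ are what the sensitivity computations described above act on.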

## Tutor and TA for Fundamental Algorithms

Graduate course, New York University, Department of Computer Science, 2017

In the fall semesters of both 2017 and 2018, I was a tutor and TA for the Fundamental Algorithms course, taught by Alan Siegel, at New York University.

## Tutor and TA for Basic Algorithms

Undergraduate course, New York University, Department of Computer Science, 2018

In the spring semesters of both 2018 and 2019, I was a tutor and TA for the Basic Algorithms course, taught by Alan Siegel, at New York University.

## Tutor for Numerical Computing

Undergraduate course, New York University, Department of Computer Science, 2020

In the spring semester of 2020, I worked as an undergraduate tutor for the Numerical Computing course, taught by Margaret Wright.

## Recitation Instructor and Grader for Elements of Calculus

Undergraduate course, North Carolina State University, Department of Mathematics, 2020

In the fall semester of 2020, I taught a recitation section and graded for the Elements of Calculus course.