I am currently exploring how "big data" can be used to efficiently reduce the uncertainty of local models. I am also working on uncertainty quantification of 3D surfaces (e.g., faults, folds, stratigraphy, ore bodies) using level set methods.
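As a minimal sketch of the level set idea, a surface can be represented implicitly as the zero level set of a scalar field on a grid; the sphere and grid resolution below are hypothetical illustrations, not taken from my research:

```python
import numpy as np

# Hypothetical example: a 3D surface represented implicitly as the
# zero level set of a signed distance function phi, here a sphere
# of radius 0.5 centered at the origin.
n = 41
xs = np.linspace(-1.0, 1.0, n)
X, Y, Z = np.meshgrid(xs, xs, xs, indexing="ij")
phi = np.sqrt(X**2 + Y**2 + Z**2) - 0.5  # signed distance to the sphere

# Grid cells where phi changes sign along an axis straddle the surface,
# so the zero level set can be located without an explicit mesh.
crossings = np.sign(phi[:-1, :, :]) != np.sign(phi[1:, :, :])
print(crossings.any())
```

Perturbing phi (e.g., with a random field) then perturbs the implied surface, which is what makes the implicit representation convenient for uncertainty quantification.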

My research questions include:
- Can we make better decisions with "big data" in the context of uncertainty quantification?
- How can we efficiently extract useful information from "big data" to reduce local uncertainty?
- How can we quickly build uncertainty models that incorporate multiple sources of information?

Personal Interests

Playing the guitar; hiking; playing pool

Current Research and Scholarly Interests

Geostatistics; Computer graphics/vision; Machine Learning
