I am currently exploring how "big data" can be used to efficiently reduce the uncertainty of local models. I am also working on uncertainty quantification of 3D surfaces (e.g., faults, folds, stratigraphy, ore bodies) with level set methods.
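The sketch below is only a rough illustration of the level-set idea, not the published method: a geological interface is represented implicitly as the zero level set of a scalar field, the field is perturbed stochastically over many realizations, and the per-cell entropy of the resulting indicator gives an uncertainty map. The grid size, the gently dipping reference interface, and the Gaussian cosine-mode perturbations are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Regular 2D grid standing in for a cross-section of the model domain (assumed sizes).
nx, nz = 120, 80
x = np.linspace(0.0, 1.0, nx)
z = np.linspace(0.0, 1.0, nz)
X, Z = np.meshgrid(x, z, indexing="ij")

# Reference level-set function: a gently dipping interface at z = 0.5 + 0.1*x.
# phi < 0 below the interface, phi > 0 above it; the surface itself is {phi = 0}.
phi0 = Z - (0.5 + 0.1 * X)

def smooth_noise(shape, sigma=0.05, n_modes=6):
    """Smooth random field from a few low-frequency cosine modes (an assumed
    stand-in for the stochastic motion of the surface)."""
    field = np.zeros(shape)
    for k in range(1, n_modes + 1):
        amp = rng.normal(0.0, sigma / k)
        phase = rng.uniform(0.0, 2.0 * np.pi)
        field += amp * np.cos(2.0 * np.pi * k * X + phase)
    return field

# Monte Carlo over perturbed level-set functions.
n_real = 200
below = np.zeros((nx, nz))
for _ in range(n_real):
    phi = phi0 + smooth_noise((nx, nz))
    below += (phi < 0.0)          # indicator: cell lies below the interface

p = below / n_real                # probability of being below the interface

# Shannon entropy per cell: 0 where all realizations agree,
# 1 bit where the interface position is most uncertain.
eps = 1e-12
entropy = -(p * np.log2(p + eps) + (1.0 - p) * np.log2(1.0 - p + eps))

print("max entropy (bits):", entropy.max().round(3))
print("fraction of cells with entropy > 0.5 bit:", (entropy > 0.5).mean().round(3))
```

In this toy setup the high-entropy cells cluster in a band around the mean interface, which is the kind of uncertainty map one would visualize for a 3D surface.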
My research questions include:
- Can we make better decisions with "big data" in the context of uncertainty quantification?
- How can we efficiently extract useful information from "big data" to reduce local uncertainty?
- How can we rapidly build uncertainty models that incorporate multiple sources of information?
Playing the guitar; hiking; pool.
- Entropy-Based Weighting in One-Dimensional Multiple Errors Analysis of Geological Contacts to Model Geological Structure. Mathematical Geosciences, 2019; 51(1): 29–51.
- Assessing and visualizing uncertainty of 3D geological surfaces using level sets with stochastic motion. Computers & Geosciences, 2019; 122: 54–67.
- GOSIM: A multi-scale iterative multiple-point statistics algorithm with global optimization. Computers & Geosciences, 2016; 89: 57–70.
- Assessing quality of urban underground spaces by coupling 3D geological models: The case study of Foshan city, South China. Computers & Geosciences, 2016; 89: 1–11.