I am currently exploring how "big data" can be used to efficiently reduce the uncertainty of local models. I am also working on uncertainty quantification of 3D geological surfaces (e.g., faults, folds, stratigraphy, ore bodies) with level set methods.
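To illustrate the level set idea in this context, the sketch below (not the author's code; all names and the perturbation model are illustrative assumptions) represents a 3D surface implicitly as the zero level set of a signed distance function, and shows how perturbing that function moves the surface, which is one simple way to express geometric uncertainty:

```python
# Minimal sketch: a 3D surface as the zero level set of a signed
# distance function phi, with a random perturbation standing in for
# geometric uncertainty. Purely illustrative, not the actual method.
import numpy as np

# Regular grid over the cube [-1, 1]^3
x, y, z = np.meshgrid(*(np.linspace(-1.0, 1.0, 21),) * 3, indexing="ij")

# Signed distance to a sphere of radius 0.5:
# phi < 0 inside, phi > 0 outside, and the surface is {phi == 0}.
phi = np.sqrt(x**2 + y**2 + z**2) - 0.5

# Perturbing phi with a random field shifts the zero level set,
# i.e., it deforms the implied surface.
rng = np.random.default_rng(0)
phi_perturbed = phi + 0.05 * rng.standard_normal(phi.shape)

# Fraction of grid cells inside the surface, before and after
inside = float(np.mean(phi < 0))
inside_perturbed = float(np.mean(phi_perturbed < 0))
print(inside, inside_perturbed)
```

Repeating the perturbation many times would yield an ensemble of surfaces whose spread quantifies the uncertainty in the geometry.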
My research questions include:
- Can we make better decisions with "big data" in the context of uncertainty quantification?
- How can we efficiently extract useful information from "big data" to reduce local uncertainty?
- How can we quickly build uncertainty models that incorporate multiple sources of information?
Advisor: Jef Caers (doctoral program)
Playing the guitar; hiking; playing pool
- GOSIM: A multi-scale iterative multiple-point statistics algorithm with global optimization. Computers & Geosciences 2016; 89: 57-70.
- Assessing quality of urban underground spaces by coupling 3D geological models: The case study of Foshan city, South China. Computers & Geosciences 2016; 89: 1-11.