Professional Education


  • Doctor of Philosophy, Sichuan University, Computational Pathology (2023)
  • Master of Engineering, Sichuan University (2020)
  • Bachelor of Engineering, Sichuan University (2017)

All Publications


  • CoNIC Challenge: Pushing the frontiers of nuclear detection, segmentation, classification and counting. Medical Image Analysis. Graham, S., Vu, Q. D., Jahanifar, M., Weigert, M., Schmidt, U., Zhang, W., Zhang, J., Yang, S., Xiang, J., Wang, X., Rumberger, J. L., Baumann, E., Hirsch, P., Liu, L., Hong, C., Aviles-Rivero, A. I., Jain, A., Ahn, H., Hong, Y., Azzuni, H., Xu, M., Yaqub, M., Blache, M., Piegu, B., Vernay, B., Scherr, T., Bohland, M., Loffler, K., Li, J., Ying, W., Wang, C., Snead, D., Raza, S. E., Minhas, F., Rajpoot, N. M., CoNIC Challenge Consortium. 2024; 92: 103047

    Abstract

    Nuclear detection, segmentation and morphometric profiling are essential in helping us further understand the relationship between histology and patient outcome. To drive innovation in this area, we set up a community-wide challenge using the largest available dataset of its kind to assess nuclear segmentation and cellular composition. Our challenge, named CoNIC, stimulated the development of reproducible algorithms for cellular recognition with real-time result inspection on public leaderboards. We conducted an extensive post-challenge analysis based on the top-performing models using 1,658 whole-slide images of colon tissue. With around 700 million detected nuclei per model, associated features were used for dysplasia grading and survival analysis, where we demonstrated that the challenge's improvement over the previous state-of-the-art led to significant boosts in downstream performance. Our findings also suggest that eosinophils and neutrophils play an important role in the tumour microenvironment. We release challenge models and WSI-level results to foster the development of further methods for biomarker discovery.

    DOI: 10.1016/j.media.2023.103047

    PubMedID: 38157647

  • PhaseFIT: live-organoid phase-fluorescent image transformation via generative AI. Light: Science & Applications. Zhao, J., Wang, X., Zhu, J., Chukwudi, C., Finebaum, A., Zhang, J., Yang, S., He, S., Saeidi, N. 2023; 12 (1): 297

    Abstract

    Organoid models have provided a powerful platform for mechanistic investigations into fundamental biological processes involved in the development and function of organs. Despite the potential for image-based phenotypic quantification of organoids, their complex 3D structure and the time-consuming, labor-intensive nature of immunofluorescent staining present significant challenges. In this work, we developed a virtual painting system, PhaseFIT (phase-fluorescent image transformation), that utilizes customized and morphologically rich 2.5D intestinal organoids and generates virtual fluorescent images for phenotypic quantification from accessible, low-cost organoid phase images. This system is driven by a novel segmentation-informed deep generative model that specializes in segmenting overlap and proximity between objects. The model enables an annotation-free digital transformation from phase-contrast to multi-channel fluorescent images. The virtual painting results of nuclei, secretory cell markers, and stem cells demonstrate that PhaseFIT outperforms existing deep learning-based stain transformation models by generating fine-grained visual content. We further validated the efficiency and accuracy of PhaseFIT in quantifying the impacts of three compounds on crypt formation, cell population, and cell stemness. PhaseFIT is the first deep learning-enabled virtual painting system focused on live organoids, enabling large-scale, informative, and efficient organoid phenotypic quantification. PhaseFIT would enable the use of organoids in high-throughput drug screening applications.

    DOI: 10.1038/s41377-023-01296-y

    PubMedID: 38097545

  • CLC-Net: Contextual and local collaborative network for lesion segmentation in diabetic retinopathy images. Neurocomputing. Wang, X., Fang, Y., Yang, S., Zhu, D., Wang, M., Zhang, J., Zhang, J., Cheng, J., Tong, K., Han, X. 2023; 527: 100-109