Bio


Postdoctoral scholar at Stanford University | AI in Medical Imaging

Honors & Awards


  • BRICS Young Scientist Award 2021, Department of Science and Technology, Government of India (2021)
  • Winner of FAME BIOTECH 2021 Hackathon, Bionest-IASST (2021)
  • Finalist, BIRAC SITARE GYTI Award 2021, Department of Biotechnology, Government of India (2021)
  • Finalist, BIRAC BIG NER 2 Grant Call 2022, BIRAC, Department of Biotechnology, Government of India (2022)

Boards, Advisory Committees, Professional Organizations


  • Member, BRICS Young Scientist Forum (2021 - Present)
  • Advisor, Rognidaan Technologies Private Limited, a biomedical artificial intelligence start-up in North-East India (2023 - Present)

Professional Education


  • Bachelor of Technology in Information Technology, Gauhati University (2013)
  • Master of Technology in Information Technology, Gauhati University (2015)
  • Doctor of Philosophy, Institute of Advanced Study in Science and Technology (DST, Government of India) and Gauhati University (2023)

Stanford Advisors


Current Research and Scholarly Interests


Explainable AI and Federated Learning

All Publications


  • Exploring explainable artificial intelligence techniques for evaluating cervical intraepithelial neoplasia (CIN) diagnosis using colposcopy images Expert Systems with Applications Hussain, E., Mahanta, L. B., Borbora, A., Bora, H., Choudhury, S., et al. 2024
  • IHC-Net: A fully convolutional neural network for automated nuclear segmentation and ensemble classification for Allred scoring in breast pathology Applied Soft Computing Mahanta, L. B., Hussain, E., Das, N., Kakoti, L., Chowdhury, M. 2021; 103
  • A comprehensive study on the multi-class cervical cancer diagnostic prediction on pap smear images using a fusion-based decision from ensemble deep convolutional neural network Tissue & Cell Hussain, E., Mahanta, L. B., Das, C., Talukdar, R. 2020; 65: 101347

    Abstract

    Commercially available and current research-based decision support systems can ease the diagnosis of cervical dysplasia, carcinoma in situ and confirmed carcinoma cases, particularly where the pathologist-to-patient ratio is low. The treatment modalities for such diagnoses rely on precise identification of dysplasia stages as defined by The Bethesda System. Classification under The Bethesda System is a multi-class problem, which is highly relevant and vital. Manual image interpretation introduces inter-observer variability and makes microscope observation tedious and time-consuming. Taking this into account, a computer-assisted screening system built on deep learning can significantly assist pathologists in screening with correct predictions at a faster rate. The current study explores six deep convolutional neural network architectures, namely Alexnet, Vggnet (vgg-16 and vgg-19), Resnet (resnet-50 and resnet-101) and Googlenet, for multi-class (four-class) diagnosis of cervical pre-cancerous as well as cancerous lesions and provides their relative assessment. The study highlights the addition of an ensemble classifier built from the three best-performing deep learning models to yield high-accuracy multi-class classification. All six deep models, including the ensemble classifier, were trained and validated on a hospital-based pap smear dataset collected through both conventional and liquid-based cytology methods, along with the benchmark Herlev dataset.

    View details for DOI 10.1016/j.tice.2020.101347

    View details for Web of Science ID 000555534400002

    View details for PubMedID 32746984
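
    As a concrete illustration of the fusion-based ensemble decision described in the abstract above, the following PyTorch sketch fine-tunes three pre-trained backbones to four classes and averages their softmax outputs. The choice of backbones, the four-class head and the simple probability averaging are assumptions made for illustration, not the paper's exact configuration.

        import torch
        import torch.nn.functional as F
        from torchvision import models

        NUM_CLASSES = 4  # four Bethesda-style classes (assumption for illustration)

        def make_backbone(name):
            # Replace each ImageNet classifier head with a 4-class head.
            if name == "resnet50":
                m = models.resnet50(weights="IMAGENET1K_V1")
                m.fc = torch.nn.Linear(m.fc.in_features, NUM_CLASSES)
            elif name == "vgg16":
                m = models.vgg16(weights="IMAGENET1K_V1")
                m.classifier[6] = torch.nn.Linear(m.classifier[6].in_features, NUM_CLASSES)
            else:  # "googlenet"
                m = models.googlenet(weights="IMAGENET1K_V1")
                m.fc = torch.nn.Linear(m.fc.in_features, NUM_CLASSES)
            return m

        # Three hypothetically fine-tuned models form the ensemble.
        ensemble = [make_backbone(n) for n in ("resnet50", "vgg16", "googlenet")]

        def ensemble_predict(images):
            # Fuse per-model softmax probabilities by simple averaging.
            probs = []
            with torch.no_grad():
                for m in ensemble:
                    m.eval()
                    probs.append(F.softmax(m(images), dim=1))
            fused = torch.stack(probs).mean(dim=0)  # (batch, NUM_CLASSES)
            return fused.argmax(dim=1)              # predicted class per image

    In a real setting each backbone would first be fine-tuned on the pap smear training set; the averaging rule could equally be replaced by majority voting or learned weights.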

  • Automated classification of cells into multiple classes in epithelial tissue of oral squamous cell carcinoma using transfer learning and convolutional neural network Neural Networks Das, N., Hussain, E., Mahanta, L. B. 2020; 128: 47-60

    Abstract

    The analysis of the tissue of a tumor in the oral cavity is essential for the pathologist to ascertain its grading. Recent studies on biopsy images have demonstrated computer-aided diagnosis of oral sub-mucous fibrosis (OSF) using machine learning algorithms, but no research has yet addressed multi-class grading of oral squamous cell carcinoma (OSCC). Pertinently, with the advent of deep learning in digital imaging and computational aid in diagnosis, multi-class classification of OSCC biopsy images can help in timely and effective prognosis and multi-modal treatment protocols for oral cancer patients, thus reducing the operational workload of pathologists while enhancing management of the disease. With this motivation, this study attempts to classify OSCC into its four classes as per Broder's system of histological grading. The study is conducted on oral biopsy images applying two methods: (i) transfer learning using pre-trained deep convolutional neural networks (CNNs), wherein four candidate pre-trained models, namely Alexnet, VGG-16, VGG-19 and Resnet-50, were evaluated to find the most suitable model for the classification problem, and (ii) a proposed CNN model. Although the highest classification accuracy among the pre-trained models, 92.15%, is achieved by the Resnet-50 model, the experimental findings highlight that the proposed CNN model outperformed the transfer learning approaches, achieving an accuracy of 97.5%. It can be concluded that the proposed CNN-based multi-class grading method could be used for the diagnosis of patients with OSCC.

    View details for DOI 10.1016/j.neunet.2020.05.003

    View details for Web of Science ID 000567812200005

    View details for PubMedID 32416467
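
    The transfer-learning route described above can be sketched in a few lines of PyTorch: a pre-trained Resnet-50 backbone is frozen, its final layer is swapped for a four-class head matching the Broder grades, and only the new head is trained. The freezing strategy, optimizer and learning rate here are illustrative assumptions rather than the paper's exact setup.

        import torch
        from torch import nn, optim
        from torchvision import models

        # Pre-trained Resnet-50 backbone (assumption: ImageNet weights).
        model = models.resnet50(weights="IMAGENET1K_V1")

        # Freeze the convolutional backbone so only the new head is trained.
        for p in model.parameters():
            p.requires_grad = False

        # Replace the 1000-way ImageNet head with a 4-class head for the Broder grades.
        model.fc = nn.Linear(model.fc.in_features, 4)

        criterion = nn.CrossEntropyLoss()
        optimizer = optim.Adam(model.fc.parameters(), lr=1e-4)

        def train_one_epoch(loader):
            # loader is assumed to yield (biopsy image batch, grade label batch) pairs.
            model.train()
            for images, labels in loader:
                optimizer.zero_grad()
                loss = criterion(model(images), labels)
                loss.backward()
                optimizer.step()

    Unfreezing the deeper layers once the head converges is a common refinement of this recipe.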

  • A shape context fully convolutional neural network for segmentation and classification of cervical nuclei in Pap smear images Artificial Intelligence in Medicine Hussain, E., Mahanta, L. B., Das, C., Choudhury, M., Chowdhury, M. 2020; 107: 101897

    Abstract

    Pap smear is often employed as a screening test for diagnosing cervical pre-cancerous and cancerous lesions. Accurate identification of dysplastic changes amongst the cervical cells in a Pap smear image is thus essential for rapid diagnosis and prognosis. Manual pathological observation, as used in clinical practice, requires exhaustive analysis of thousands of cell nuclei in a whole slide image to visualize dysplastic nuclear changes, which makes the process tedious and time-consuming. Automated nuclei segmentation and classification methods exist but struggle with issues such as nuclear intra-class variability and the separation of clustered nuclei. To address these challenges, we put forward an instance segmentation and classification framework for Pap smear images built on a Unet architecture by adding residual blocks, densely connected blocks and a fully convolutional layer as a bottleneck between the encoder and decoder blocks. The convolutional layers of the standard Unet are replaced by densely connected blocks to ensure feature reusability, while the residual blocks help the network converge more rapidly. The framework provides simultaneous nuclei instance segmentation and predicts whether each nucleus belongs to the normal or abnormal class. It works by assigning pixel-wise labels to individual nuclei in a whole slide image, which enables identifying multiple nuclei belonging to the same or different classes as individual distinct instances. A joint loss function in the framework overcomes some cell-level issues in clustered nuclei separation. To increase the robustness of the overall framework, the proposed model is preceded by a stacked auto-encoder based shape representation learning model. The proposed model outperforms two state-of-the-art deep learning models, Unet and Mask_RCNN, with an average Zijdenbos similarity index of 97% for segmentation along with a binary classification accuracy of 98.8%. Experiments on hospital-based datasets using liquid-based cytology and conventional pap smear methods, along with the benchmark Herlev dataset, proved the superiority of the proposed method over the Unet and Mask_RCNN models in terms of the evaluation metrics under consideration.

    View details for DOI 10.1016/j.artmed.2020.101897

    View details for Web of Science ID 000566856900006

    View details for PubMedID 32828445
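
    To make the architectural ingredients above concrete, here is a compact PyTorch sketch of a densely connected block, a residual block, and a joint loss combining a pixel-wise segmentation term with a nucleus-class term. Channel sizes, growth rate and the loss weighting are illustrative assumptions, not the paper's exact configuration.

        import torch
        from torch import nn
        import torch.nn.functional as F

        class DenseBlock(nn.Module):
            # Each layer sees the concatenation of all earlier feature maps (feature reuse).
            def __init__(self, in_ch, growth=16, n_layers=3):
                super().__init__()
                self.layers = nn.ModuleList()
                ch = in_ch
                for _ in range(n_layers):
                    self.layers.append(nn.Sequential(
                        nn.Conv2d(ch, growth, 3, padding=1),
                        nn.BatchNorm2d(growth), nn.ReLU(inplace=True)))
                    ch += growth
                self.out_channels = ch

            def forward(self, x):
                for layer in self.layers:
                    x = torch.cat([x, layer(x)], dim=1)
                return x

        class ResidualBlock(nn.Module):
            # Identity shortcut intended to help the encoder-decoder converge faster.
            def __init__(self, ch):
                super().__init__()
                self.body = nn.Sequential(
                    nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch), nn.ReLU(inplace=True),
                    nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch))

            def forward(self, x):
                return F.relu(x + self.body(x))

        def joint_loss(seg_logits, seg_target, cls_logits, cls_target, alpha=0.5):
            # Joint objective: per-pixel mask loss plus a normal/abnormal class loss.
            seg = F.binary_cross_entropy_with_logits(seg_logits, seg_target)
            cls = F.cross_entropy(cls_logits, cls_target)
            return seg + alpha * cls

    These blocks would sit inside a Unet-style encoder-decoder, with the dense blocks replacing plain convolution stacks and the residual blocks along the bottleneck path.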

  • Liquid-based cytology Pap smear dataset for automated multi-class diagnosis of pre-cancerous and cervical cancer lesions Data in Brief Hussain, E., Mahanta, L. B., Borah, H., Das, C. 2020; 30: 105589

    Abstract

    While a publicly available benchmark dataset provides a base for the development of new algorithms and the comparison of results, hospital-based data collected from a real-world clinical setup is also very important in AI-based medical research for automated disease diagnosis, prediction or classification as per standard protocols. Primary data must be constantly updated so that the developed algorithms achieve as much accuracy as possible in the regional context. This dataset would support research work related to image segmentation and final classification for a complete decision support system (https://doi.org/10.1016/j.tice.2020.101347) [1]. Liquid-based cytology (LBC) is one of the cervical screening tests. The repository consists of a total of 963 LBC images sub-divided into four sets representing the four classes: NILM, LSIL, HSIL, and SCC. It comprises pre-cancerous and cancerous lesions related to cervical cancer as per the standards of The Bethesda System (TBS). The images were captured at 40x magnification using a Leica ICC50 HD microscope, with samples collected with due consent from 460 patients visiting the O&G department of a public hospital with various gynaecological problems. The images were then reviewed and categorized by experts in the pathology department.

    View details for DOI 10.1016/j.dib.2020.105589

    View details for Web of Science ID 000541974800018

    View details for PubMedID 32368601

    View details for PubMedCentralID PMC7186519
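
    As an illustration of how such a four-class repository could be consumed, the sketch below assumes the 963 LBC images are arranged in one folder per class (NILM, LSIL, HSIL, SCC) and loads them with torchvision's ImageFolder. The directory layout, image size and batch size are assumptions about a convenient arrangement, not the dataset's published structure.

        from torchvision import datasets, transforms
        from torch.utils.data import DataLoader

        # Hypothetical layout: lbc_dataset/NILM, lbc_dataset/LSIL, lbc_dataset/HSIL, lbc_dataset/SCC
        preprocess = transforms.Compose([
            transforms.Resize((224, 224)),  # size chosen to suit typical CNN backbones (assumption)
            transforms.ToTensor(),
        ])

        dataset = datasets.ImageFolder("lbc_dataset", transform=preprocess)
        loader = DataLoader(dataset, batch_size=16, shuffle=True)

        print(dataset.classes)  # class names inferred from folder names, in alphabetical order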

  • A Study on Epidemiological Factors and its Association with Pathological Findings for Precancerous Symptoms of Cervical Cancer Indian Journal of Public Health Research & Development Das, C. R., Mahanta, L. B., Borah, H., Hussain, E., Devi, A., Choudhary, M., Adhikari, A. C., et al. 2019; 10 (12)