Md Naim Hassan Saykat
Artificial Intelligence Researcher · Computer Vision · Medical Imaging · Explainable AI
Université Paris-Saclay
Orsay, Île-de-France, France
I am an Artificial Intelligence researcher specializing in Computer Vision, Medical Imaging, Deep Learning, and Transformer-based architectures.
I am currently pursuing my Master’s in Artificial Intelligence at Université Paris-Saclay, where I develop deep learning systems for real-world diagnostic and healthcare applications. My research focuses on building clinically reliable, generalizable, explainable, and efficient AI systems, with an emphasis on cross-dataset validation and real-world deployment.
My research interests include:
- Medical Image Analysis (dermatology, radiology, X-ray interpretation)
- Vision Transformers (ViT) and hybrid CNN-Transformer architectures
- Explainable AI (XAI), Grad-CAM, attention mechanisms, feature attribution
- Domain adaptation & cross-dataset generalization (e.g., HAM10000 to ISIC 2019)
- Edge-efficient and deployable AI for resource-constrained clinical settings
I have developed multiple end-to-end research systems, including:
- Generalizable deep ensemble models for skin lesion classification with cross-dataset validation
- Explainable Vision Transformer pipelines for medical image interpretation
- CycleGAN-based unpaired image-to-image translation for medical domain adaptation
- Patent retrieval and re-ranking systems using dense retrieval and cross-encoders
- Transformer-driven explainability frameworks for clinical decision support
My work emphasizes robustness, clinical interpretability, and real-world deployability, with a focus on producing publishable, high-impact research. I am currently preparing manuscripts targeting Q1 journals in medical imaging and AI.
I am actively seeking:
- Research collaborations
- Journal and conference publications
- PhD positions in Computer Vision, Medical Imaging, or Multimodal AI
If you are interested in collaboration or research discussion, feel free to reach out.