About Me


Prof. Chaim Baskin

I completed my Ph.D. in 2021 at the Technion’s Computer Science Department under the supervision of Prof. Alex Bronstein and Prof. Avi Mendelson. My doctoral research focused on designing deep neural networks that are both efficient and robust—a fundamental challenge in deploying AI systems at scale and in safety-critical environments. As part of this work, I developed training techniques and architectural principles that significantly improve the quantization, sparsity, and adversarial robustness of neural networks, enabling them to perform reliably under constrained resources and real-world perturbations. Several of these contributions are now widely cited and have influenced subsequent research in model compression and efficient inference.

During my Ph.D., I also collaborated closely with Intel Labs and Habana Labs, where I explored hardware-aware optimization techniques for neural networks, particularly for deployment on FPGAs and custom AI accelerators. This industrial experience deepened my interest in bridging deep learning with systems and hardware constraints, and it shaped much of my subsequent work.

From 2021 to 2022, I conducted my postdoctoral research at the Center for Intelligent Systems at the Technion, continuing to explore graph representation learning, test-time adaptation, and self-supervised learning. I remained at the Technion until 2024 as a Research Associate and Principal Investigator, where I led several projects on multimodal learning and graph neural networks, and mentored graduate students who have since gone on to doctoral studies at top institutions or joined research roles in industry.

In 2024, I joined the School of Electrical and Computer Engineering at Ben-Gurion University of the Negev as an Assistant Professor (Senior Lecturer), where I also serve as the Academic Head of Computing. I am the Head of the INSIGHT Lab, which focuses on the intersection of efficient deep learning, geometric machine learning, and multimodal intelligence. I am also a member of the university’s Data Science Research Center, and a Senior Member of the IEEE.

Over the years, my work has been published regularly in premier venues such as NeurIPS, CVPR, ICLR, ICML, and ICCV, and I have served as an Area Chair or senior program committee member for many of these venues. I have been fortunate to receive funding from competitive national and industrial sources, including the Israel Innovation Authority, the Ministry of Science and Technology, and Intel Labs.

🔬 My research interests include:

  • Multimodal Foundation Models – Understanding how to train and adapt large-scale models that integrate vision, language, and other modalities under limited supervision and compute.
  • Graph Neural Networks (GNNs) – Improving the expressivity, scalability, and theoretical understanding of GNNs for dynamic, temporal, and heterogeneous graphs.
  • Resource-Efficient Learning – Studying quantization, sparsity, and test-time optimization techniques that allow modern deep learning models to operate under tight resource, energy, and latency constraints.