An interview with Prof. Jennifer DHONT,
Head of the Data Science and AI Research Unit at the Jules Bordet Institute
Could you remind us of your career path?
I have an academic background in medical physics and biomedical engineering from the Vrije Universiteit Brussel (VUB). I began my PhD at VUB and UZ Brussel in the department of radiotherapy, where my research focused on medical image analysis and on improving the treatment of moving tumours, such as lung tumours affected by patient breathing.
Around 2014, when Artificial Intelligence (AI) started making major breakthroughs in imaging, I explored how these methods could support my research. This led to closer collaborations with the ETRO computer science department at VUB, which already had strong expertise in the field. I also spent several months at the Gemelli Hospital in Rome, working with an MRI-guided radiotherapy system – technology that is now also available at the Jules Bordet Institute.
After completing my PhD in 2020, I pursued a postdoctoral fellowship at Maastro Clinic in the Netherlands, focusing on the then-urgent topic of COVID-19. I investigated how medical imaging and AI could support rapid, automated diagnosis to improve patient triage.
In 2021, I joined the Jules Bordet Institute as head of the Data Science & AI Research Unit, returning to the oncology field, which has always been my main interest. Since the start of this academic year, I have also held a Chair in Artificial Intelligence at the Université libre de Bruxelles (ULB).
Could you tell us about the Data Science and AI Research Unit?
The Data Science and AI Research Unit is a dedicated team focused on analysing medical data to generate new insights and on developing and piloting novel algorithms for clinical applications. The unit consists of postdoctoral researchers, PhD candidates, and students from various universities and backgrounds who join us for master’s theses or internships, all sharing an interest in the combination of healthcare and AI. Most of our projects are conducted in a research or feasibility context, but some are carried out in collaboration with industry partners to pave the way for future clinical translation.
While traditional bioinformatics groups often rely on classical statistical approaches, our work is centred on machine learning and deep learning, including neural networks, to detect complex patterns in data. The algorithms we develop serve two main purposes: automating clinical tasks – for example, supporting diagnosis – and providing clinical decision support, such as predicting a patient’s response to specific treatments to enable more personalised therapy. A major advantage of AI is its ability to process raw, high-dimensional data, such as medical images or genomic profiles, with minimal human preprocessing, and to integrate multiple data sources at once, including radiology, pathology, and genomics. This capacity to learn from multimodal information allows AI to uncover complex patterns that traditional analytical methods cannot detect, making it a particularly powerful driver of precision medicine.
Could you give us some examples of research projects currently underway within the Data Science and AI Research Unit?
One of our flagship projects is ARTEMIS, where we aim to develop an AI model capable of identifying early-stage breast cancer patients who could safely avoid chemotherapy or immunotherapy based solely on routinely acquired data. The goal is to reduce unnecessary toxicity while maintaining excellent outcomes. Although commercial genomic signatures exist, they are costly and therefore not widely used, which often leads to overtreatment. For this large-scale, multidisciplinary project, we have collected, digitised, and structured data from more than 5,000 breast cancer patients treated at the Jules Bordet Institute. The data are securely stored within the hospital and serve as the foundation for building the AI model.
We are conducting similar initiatives for pancreatic cancer, a rare but highly aggressive disease. Thanks to the expertise of the Digestive Oncology Laboratory and its active clinical trial programme, Bordet has been able to assemble a relatively large and unique dataset, enabling the development of predictive models in a field where data scarcity is usually a major limitation.
Another line of research focuses on AI-based pipelines for the automatic quantification of biomarkers in histopathology images. These tools will be particularly valuable as Bordet is establishing a new central pathology laboratory to process slides from its clinical trials. In this context, automated, standardised biomarker assessment can improve consistency and reduce workload, accelerating the ongoing clinical and translational research at the Jules Bordet Institute.
What are your main research collaborations?
I strongly believe that AI in healthcare can only succeed through genuine multidisciplinary collaboration. Our Data Science & AI Research Unit brings together engineers, computer scientists, and physicists, but we rely heavily on close partnerships with physicians and clinical scientists to ensure that we are asking the right research questions and correctly interpreting our findings. Within the Jules Bordet Institute, we maintain collaborations with all clinical departments. Being part of Belgium’s only comprehensive cancer centre is a significant advantage: it allows us to work in an environment where diverse expertise and large, centralised datasets come together.
Because AI research requires substantial amounts of high-quality data, we also collaborate extensively with partners beyond the hospital, both within Belgium and internationally. We work with centres across Europe, including strong partnerships in Denmark, where unique nationwide registries enable large-scale studies, as well as with collaborators in the United States. The U.S. has been an early leader in AI innovation, but Europe is now investing heavily and rapidly advancing in this field, with a strong emphasis on ethical data use and patient privacy, which remains a core value in our work as well.
How do you see the use of data science and AI evolving at the Jules Bordet Institute in the coming years?
I expect its role to grow significantly. Our team is expanding rapidly, our computational infrastructure is being doubled, and there is an increasing institutional focus on collecting clinical data in a more structured and high-quality manner. This will allow us to fully leverage these data to generate new hypotheses, support research, and ultimately improve patient care.
We closely monitor major advances in AI across sectors, identifying which of these innovations could bring real value to medicine. Techniques such as agentic AI systems, large foundation models, and more sophisticated reasoning approaches are already finding their way into healthcare applications. I anticipate that we will move well beyond traditional “black box” architectures toward AI systems that can interact with clinicians, understand context, justify their recommendations, and adapt to new information. These next-generation tools will not only generate patient-specific insights but also explain how they arrived at them, making the technology far more transparent, interpretable, and ultimately trustworthy, a crucial requirement in a clinical environment.
As these technologies evolve, their successful adoption in healthcare will depend not only on quantitative performance but also on robust clinical implementation. Being embedded within a hospital, rather than a purely computer-science environment, shapes our perspective: as AI specialists, we can evaluate what is feasible, but it is ultimately the clinicians who must define what is desirable and useful in their daily practice. This dialogue is essential to ensure that the tools we develop genuinely support patient care. It also means thinking ahead about training and educating clinicians so they can confidently work with these new technologies. And, in line with the principles of the EU AI Act, our unit places strong emphasis on human oversight, transparency, and fairness.