PhD candidate in Monitoring and Control of Generative AI


The SnT is seeking a Doctoral Researcher to support research and development within the SEDAN group (https://www.uni.lu/snt-en/research-groups/sedan). We seek a candidate with expertise and/or interest in artificial intelligence and cybersecurity.

The candidate will have the opportunity to work on a collaborative project with a leading cybersecurity company, and thus to validate results with, and receive feedback from, cybersecurity practitioners in the field.

As generative AI (GenAI) platforms and large language models (LLMs) are increasingly integrated into organizational workflows, they introduce new attack surfaces and vulnerabilities. Organizations often lack the tools to monitor and enforce policies governing access and usage, leaving them exposed to compliance violations and security risks. Challenges such as managing access rights, maintaining dynamic policy updates, and ensuring proper usage auditing are compounded by the evolving nature of AI technologies and associated threats.

The PhD student's research project will thus focus on defining methods to track, monitor, and manage the use of GenAI. While this can build on recently proposed telemetry frameworks extended from cloud solutions (such as OpenLLMetry), the research question is how to identify anomalies in the collected information, which can come from multiple AI services invoked either manually by users or by AI agents themselves. The candidate will explore behavioural analysis techniques. Another research question concerns the definition and application of a unified policy across both legacy IT systems and AI-based systems.
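As a purely illustrative sketch (not a prescribed method of the project, nor the OpenLLMetry API), the snippet below shows one way behavioural anomaly detection could be applied to GenAI usage telemetry once it has been collected. The record fields, service names, and the choice of scikit-learn's IsolationForest are assumptions made for illustration only.

    # Illustrative sketch: unsupervised anomaly detection over hypothetical
    # GenAI usage telemetry records (fields are assumed, not from any
    # specific telemetry framework).
    import numpy as np
    from sklearn.ensemble import IsolationForest

    # Hypothetical per-request records, e.g. aggregated from traces emitted
    # when users or AI agents invoke different AI services.
    records = [
        {"user": "alice",   "service": "chat-llm", "prompt_tokens": 120,  "completion_tokens": 300,  "hour": 10},
        {"user": "alice",   "service": "chat-llm", "prompt_tokens": 90,   "completion_tokens": 250,  "hour": 11},
        {"user": "agent-7", "service": "code-llm", "prompt_tokens": 4000, "completion_tokens": 3500, "hour": 3},
    ]

    # Simple numeric features per request; a real study would use richer
    # behavioural features (session patterns, accessed resources, policies).
    X = np.array([[r["prompt_tokens"], r["completion_tokens"], r["hour"]] for r in records])

    # Isolation Forest as one possible baseline; predict() returns -1 for
    # records flagged as anomalous and 1 otherwise.
    model = IsolationForest(contamination=0.3, random_state=0).fit(X)
    for rec, flag in zip(records, model.predict(X)):
        print(rec["user"], rec["service"], "ANOMALY" if flag == -1 else "ok")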

In addition, the candidate will also be involved in project reporting and dissemination and will participate in meetings with our partner. The project is academically oriented but involves applied research. It is a unique opportunity to develop new concepts in close collaboration with industry.

During the PhD studies, the candidate will have the opportunity to participate in and propose other projects within the group and thus develop his/her/their own research agenda. We work on various topics related to applied ML and cybersecurity, including applications and security of LLMs.

For further information, please contact us at jerome.francois@uni.lu and lama.sleem@uni.lu.


