- degree in Computer Science, Artificial Intelligence, Data Science, or related field; Solid background in machine learning, deep learning and foundation models such as Large Language Models; Strong programming skills (Python/C++); Proven interest in generative models
- Candidate Human-Centered Interpretable Machine Learning (1.0 FTE). Project description: In recent years, practitioners and researchers have realized that predictions made by machine learning models should be transparent and intelligible. Although explainable AI methods can shed some light on the inner
- emotion safety is crucial; Design interventions to reduce bias and improve fairness and safety in human-AI interaction. The research will combine computational modeling (e.g., NLP, machine learning, deep learning) with human-centered research (e.g., user studies, experimental design, qualitative analysis). We are looking not only
- Project: Why apply? Generative AI and large language models (LLMs) are about to turn computer-aided engineering into true human–AI co-design. In the new MSCA Doctoral Network GenAIDE we team up with Honda
- interest in neuro-behavioral sciences and a passion for behavioral signals. Demonstrable experience in advanced data analysis and data collection. Familiarity with machine learning and proficiency in Python
- networks, or network science, and relevant background knowledge in methods in machine learning and AI. The successful candidate will focus on innovating the field of network analysis with AI methods. Examples