Category: Mathematics, information, scientific, software. Contract: Internship. Job title: Large language models for automatic bug finding in source code analysis (M/F). Subject: JOIN US, TO DO WHAT
-
challenges • Motivation to pursue a PhD in the same field. The application should include a CV, cover letter, and transcript of records. We offer: an internship in the heart of the Grenoble metropolitan area
-
:00 (UTC). Type of Contract: Temporary. Job Status: Full-time. Offer Starting Date: 1 Jan 2026. Is the job funded through the EU Research Framework Programme? Not funded by an EU programme. Is the Job related to
-
will work in a team of researchers, post-docs, and PhD students who are actively investigating various challenges and aspects of federated learning. The candidate should have knowledge in machine
-
required. This 6-month internship is part of the DIADEM-PEPR national project, and is designed for highly motivated candidates. The intern will have the opportunity to collaborate closely with PhD students
-
the energy systems modelling team within I-Tese. The position is based in Saclay, with occasional travel to Grenoble. LET'S TALK ABOUT YOUR SKILLS AND GOALS: Education: PhD (Bac+8) in Energy
-
Stuttgart to work and exchange with the DLR team involved in the project. - PhD/Post-Doc in the field of economic modelling/simulation of energy systems. - Interest in the energy transition and in
-
(collection, selection and validation of relevant data for the analysis and production of input scenarios to our models); * A contribution to the definition of our scientific computing environment, in
-
Study the state of the art (SOA) and approaches on LLMs/VLMs + reasoning, with emphasis on Ontologies and other formal Knowledge Representation approaches. • Research the ways the interaction can be exploited and provide added value (e.g. conformity). • Propose, produce, and integrate into experiments that...
-
on efficiency in surface, power consumption, and computing performance. Vision Transformers (ViTs) have recently demonstrated superior performance over Convolutional Neural Networks (CNNs) in a wide