561 phd-computational-"IMPRS-ML"-"IMPRS-ML"-"IMPRS-ML" positions at University of Oxford in United Kingdom
- …with an international reputation for excellence. The Department has a substantial research programme, with major funding from the Medical Research Council (MRC), the Wellcome Trust and the National Institute…
- …b) information-theoretic active learning, and c) capturing uncertainty in deep learning models (including large language models). The successful postholder will hold or be close to the completion of a PhD/DPhil in…
- …inference attacks, to mitigate privacy leaks in MMFM. You will hold a PhD/DPhil (or be near completion) in a relevant discipline such as computer science, data science, statistics or mathematics; expertise in…
- …hepatitis and liver disease. This post is funded by the National Institute for Health and Care Research (NIHR) as part of a significant research programme that leverages large-scale healthcare datasets…
- Oxford-Weidenfeld and Hoffmann Scholarships and Leadership Programme. For more information about the scholarship, visit the Weidenfeld and Hoffmann Scholarships and Leadership Programme page.
- …midwives, nurses, patient and public contributors, as well as NHS sites that are providing data for the programme. As a Medical Statistician you will contribute to a wide range of tasks across the project…
- …blotting, microscopy, and drug/compound co-treatment in cellular assays; experience in antibiotic drug discovery. You must also hold a PhD/DPhil in medicinal chemistry and chemical biology, together…
- …the group and communicating our research; and generating new research and grant opportunities. It is essential that the post-holder has a doctoral degree (PhD or DPhil) in physics, engineering, or another…
- Oxford-DeepMind Graduate Scholarship (Computer Science). The Oxford-DeepMind Graduate Scholarships (Computer Science) are available for applicants to any full-time DPhil course within, or affiliated…
- …with the possibility of renewal. This project addresses the high computational and energy costs of Large Language Models (LLMs) by developing more efficient training and inference methods, particularly…