- Institute of Particle Astrophysics and Cosmology (BIPAC), on research aimed at extracting cosmological information from large-scale structure (LSS) and Cosmic Microwave Background (CMB) probes on very large
- to automate scientific discovery in both the natural and social sciences. The postholder will contribute to one or more of the following strands: • Foundational work on large-scale/foundation models and
- About the Role The post is funded for 3 years and is based in the Big Data Institute, Old Road Campus. You will join an interdisciplinary team of researchers spanning imaging science, machine learning
- foundational theory of how large ML systems can be regularised to have dramatically fewer trainable parameters without sacrificing accuracy by analysing the use of low-dimensional building blocks Implicit
- of therapeutic genomics, leveraging large-scale functional genomic datasets and cutting-edge computational resources, including university HPC clusters and AWS. The post-holder will advise colleagues on data
- between the two linked studies as well as taking the lead in the large-scale qualitative secondary analysis of interview data from multiple sources. In this role you will be expected to contribute
- York Genome Center) to co-design experiments and generate novel datasets, including exome/genome sequencing of hundreds of thousands of individuals, large-scale single-cell data from primary human
- be close to the completion of a PhD/DPhil in epidemiology, biostatistics or big data, along with demonstrable experience of working with population registers and large datasets. With proven
- with the possibility of renewal. This project addresses the high computational and energy costs of Large Language Models (LLMs) by developing more efficient training and inference methods, particularly
- proven expertise in seismic data processing and analysis, knowledge of volcanic/geothermal processes, strong quantitative skills, and proficiency in Python for scientific computing. You should be