- … scale and resolution in which in vivo pathways of immune cells can be unraveled. Furthermore, it provides a goldmine for training causal machine learning models to move towards precision medicine …
- … II is looking for a part-time (30 hours per week) PhD position: Machine Learning / Medical Imaging (m/f/x), with immediate effect. This position is offered for a duration of 3 years. Join the AICARD …
- … analysis. Background in biomedicine and digital pathology. What we offer: embedding within a computational team with extensive experience in computational biology and machine learning; embedding within …
- … increasingly complex networks. By deploying and advancing techniques such as machine learning, graph-based network analysis, and synthetic data generation, the project tackles key challenges in anomaly detection …
- … principles that regulate host-pathogen interactions and feedback, using a combination of quantitative imaging, microfluidics, statistical analysis and machine learning tools. A specific focus will be put …
- … apply a fast and efficient forest trait mapping and monitoring method based on the Invertible Forest Reflectance Model. A machine learning / deep learning framework will be explored and developed …
- … (Wissenschaftszeitvertragsgesetz, WissZeitVG). A shorter contract term is possible by arrangement. The position aims at obtaining further academic qualification. Professional assignment: Chair of Machine Learning for Spatial …
- … Dortmund, we invite applications for a PhD Candidate (m/f/d): Analysis of Microscopic BIOMedical Images (AMBIOM). You will be responsible for developing new machine learning algorithms for microscopy image …
- … interdisciplinary, and together we contribute to science and society. Your role: multi-omics data integration and workflow improvement; development and application of machine learning-based algorithms …
- … programme at the Faculty of Science. The ideal candidate has a background in or experience with one or more of the following topics: SIMD performance engineering; machine learning; communication-efficient …