28 postdoc-distributed-algorithms scholarships at Technical University of Denmark in Denmark
- Job Description: The aim of this PhD project is to investigate the immediate and long-term deformation behaviour of plain and fibre-reinforced concrete using distributed fibre-optic sensing (FOS) …
- Responsibilities and qualifications: Electricity markets are undergoing a rapid transformation: market participants are deploying AI algorithms to make their bidding decisions. AI algorithms are instructed …
- … cutting-edge data science? And would you like to be part of a newly formed research collaboration between DTU and Novo Nordisk? Then you could be our new Postdoc. Read on to learn more! About the PhD …
- … available sensor and meter infrastructure, affordable computational resources, and advanced modeling algorithms. MPCs (model predictive controllers) excel in handling constrained optimization and new operational conditions, whereas RLs (reinforcement learning methods) …
- … achieve automated, data-driven optimization (in terms of time and quality) of polishing process parameters by applying machine learning algorithms, leading to a robust, repeatable and fast polishing …
- … characterization of glycoside hydrolases, and a postdoc working on computational modelling of the same enzymes. The PhD focuses on ligand-observed NMR analyses and other relevant methods to provide insight …
- … will take advanced courses to build and deepen your skills, implement and evaluate algorithms, and develop your ability to write and present scientific work. We are a supportive team that will welcome …
- … for Science & Technology (KAIST), and an external stay at KAIST will be included as part of the PhD program. Qualifications: proficiency with Python; experience implementing various machine learning algorithms …
- … experimentation with Asst. Prof. Eli N. Weinstein. Your goal will be to develop fundamental algorithmic techniques to overcome critical bottlenecks on data scale and quality, enabling scientists to gather vastly …