and pedagogical skills. Throughout your studies, you will be guided by supervisors. Doctoral studies end with a thesis and a doctoral degree. More about being a doctoral student at LTH on lth.se. https
-
The University of North Carolina at Chapel Hill | Chapel Hill, North Carolina | United States | 16 days ago
or other semantic technologies. Experience in the collection, storage, transformation, standardization, harmonization, and analysis of legacy data stored in a variety of formats (e.g., OWL, RDF, JSON-LD
-
departments, IRISA is a center of excellence whose scientific priorities include bioinformatics, system security, new software architectures, virtual reality, big data analysis, and artificial intelligence
-
Austrian Academy of Sciences, the Johann Radon Institute for Computational and Applied Mathematics (RICAM) | Austria | about 1 month ago
, and the working language is English. The successful candidate will work on research at the interface of learning theory, approximation theory, and harmonic analysis, with a particular emphasis on the
-
the enhancement of the efficiency or figure of merit of the stacks. Moreover, (ii) by using and combining various experimental techniques and methods available at the laboratory (harmonic Hall, spin pumping, FMR
-
to lead data harmonization efforts and establish secure, collaborative frameworks that directly impact the future of epilepsy research. Work Schedule: Hybrid, Monday - Friday, ~7am - ~ 5pm. Duties
-
and harmonize user-friendly, standardized data analysis pipelines for a variety of computational tasks, including, but not limited to, mutation calling, bulk and single-cell RNA-seq analysis, ChIP-seq
-
, programming, analysis and modeling in order to support projects. The Data Scientist/Statistician I will implement the research analyses, executing large-scale data harmonization, statistical analysis, and
-
transformations, harmonize identifiers, manage slowly changing attributes, and create repeatable ingestion workflows with testing, logging, and validation. A documented source-to-graph mapping for priority datasets