- be research, data analysis, drafting and publishing research articles, and other scholarly outcomes that arise from a four-year project that uses temporal embeddings of large, heterogeneous scientific
- propofol vs. inhaled VolatilE anesthesia (THRIVE) study. THRIVE is a prospective, multi-center, observational study combining data from patient surveys, EHR, and wearable devices. This role may also provide
- EHR data processing, data queries, and dashboards; familiarity with processing data collected using wearable devices (biomedical); experience with natural language processing and Large Language Model (LLM
- Names of 3 references, their affiliations, and contact information. Applicants selected for an interview will be contacted via email. The interview process will consist of an initial 30-45 minute screening
- wellness, related clinical information, and research. Assistant Level: High school diploma or GED is necessary. Both: An understanding of medical terminology, experience in a large, complex health care
- benchmarking advanced AI/DL architectures (e.g., physics-informed AI/DL models, state-space models, spatiotemporal transformers, diffusion models, foundation models, and/or Large Language Processing Models
- with recruitment activities for a large, multi-year clinical trial testing the efficacy of behavioral interventions on alcohol misuse and related outcomes among a diverse population of teens. Key job
- related to specific proposals, or general models, for architectural propositions addressing an area of interest to the fellow and of concern to the field at large. Project: Projects centered
- such as a centrifuge, various freezers, pipettes, and standard laboratory safety requirements and guidelines. Data Coordinator Responsibilities: Demonstrates the ability to document data in accordance with
- queries, and dashboards; familiarity with (biomedical) image processing; familiarity with processing data collected using wearable devices (biomedical); experience with natural language processing and Large