- experience during their gap year or prior to applying to medical school, as well. No evening or weekend work required. Follow this link to learn more: https://www.hopkinsmedicine.org/wilmer/education/clinical
- tools (Linux, Git, experimental methods). Desired assets: experience in marine robotics, drones, or embedded systems; knowledge of AI applied to perception or decision-making; experience in real-time
- Postdoctoral Researcher in visual cognitive computational neuroscience. Supervisor: Dr. Kamila Maria Jozwik, Jozwik lab, University of Cambridge. Application deadline: 2 April 2026. Start date: October
- for the Centre for Tactile Internet with Human-in-the-Loop (CeTI), a position as Research Associate (m/f/x) for Research Room F1: Human Sensory Perception and Action (subject to personal qualification, employees
- experience, skills, knowledge, abilities, education, licensure and certifications, and other business considerations. Salary offers at the top of the range are not common. Visit UC Benefit package to discover
- Physical: ability to pull, push, crouch, crawl, and stand. Lift up to 25 lbs. Vision, including color, depth perception, and clarity. Ability to engage in repetitive motions. Manual dexterity. Environmental
- awareness and perception of hazards are particularly important for low-frequency or complex events that may depart from prior individual or community experience. Improved knowledge of how a community is likely
- qualification (usually a PhD). Tasks: The successful candidate will contribute to one or more of the following focus areas: Human-Centered Design and Interaction: advance understanding of human sensory perception
- and moved by laser-based optical tweezers in a closed action–perception loop. Your work will help transform current observation-only live-cell imaging microscopy into actively controllable, automated