- …how sensorimotor experience sculpts visual circuits and codes. By understanding how moving bodies and perception are integrated, we can develop more effective clinical interventions and inspire the next…
- …Lbs. or more; Push/Pull Weight: 100 Lbs. or more; Physical Environment: Inside Medical Facility; Noise Level: Moderate; Visual Requirements: Near Visual Acuity, Far Visual Acuity, Color Discrimination…
- …environment perception in autonomous driving by integrating acoustics. Possible research directions include the use of audio-visual foundation models, audio-driven sensor fusion for object detection, cross…
- …task or series of tasks. Maintain visual attention and concentration, depth perception and visual field for patient treatment tasks and safety. Identify and respond to emergency signals and indicators…
- …excellent academic track record in topics relevant to robot perception. A specific focus on visual inertial odometry (VIO), mapping (SLAM) and 3D reconstruction is desirable. Informal enquiries may be…
- …Facility Environment; Noise Level: Moderate; Visual Requirements: Near Visual Acuity, Far Visual Acuity, Color Discrimination, Depth Perception; Hazards: Biological, Chemical, Sharp Objects/Tools, Dust, Fumes…
- …Requirements: Color discrimination, Depth perception, Far visual acuity, Near visual acuity; Hazards: Biological, Dust, Fumes/Gases/Odors, Radiation, Sharp objects/tools. #UAMSNurses Salary Information…
- …employees: https://facilities.med.wustl.edu/Careers. WashU Medicine is actively seeking an experienced Building Automation System/DDC Controls Technician. This position will provide programming, repairs, re…
- …for drone swarms. The role will focus on multi-agent visual perception techniques. Group website: https://personal.ntu.edu.sg/wptay/ Key Responsibilities: Develop signal processing and machine learning…
- Title: Disentangling and modelling behaviourally-relevant visual and semantic dimensions of visual cognition in the human brain (Kamila Maria Jozwik lab, the University of Cambridge). Application…