- … this job include close vision, distance vision, color vision, peripheral vision, depth perception, and the ability to adjust focus. Work Environment: The work environment characteristics described here …
- … application will concern multi-sensor-based systems for autonomous vehicles embedding AI perception or decision modules. The experiments will rely on embedded platforms available at LCIS, including …
- … for entering, updating, and maintaining files in Perceptive Content; keep abreast of new developments in university policies and procedures, as well as federal and state regulations; demonstrate an understanding …
- … impact stakeholders' perceptions of and interactions in renovation workflows under organisational, local, and national regulatory constraints. Essential activities within your project will be to: map …
- … of tactile behaviors on robotic platforms and the design of user studies to evaluate perception, interpretation, and user experience. The expected outcomes include computational models and design guidelines …
- … withdrawal or leave of absence from the University. This person must have the ability to work well with a diverse group of individuals and to be perceptive and understanding of student concerns. Job requires …
- … for drone swarms. The role will focus on multi-agent visual perception techniques. Group website: https://personal.ntu.edu.sg/wptay/ Key Responsibilities: Develop signal processing and machine learning …
- … (including but not limited to FAMS, OSFP, Salesforce CRM, ImageNow/Perceptive Content, HESAA, COD, NSLDS, SRDB); serves as a liaison with key individuals in the functional offices to execute advanced actions and …
- … secondments to EXPLORA partner labs. The host lab is embedded in the wider research group Perception, Action, Cognition (https://www.helsinki.fi/en/researchgroups/perception-action-cognition), which hosts …
- … contributing technical solutions, definitions, and guidelines to guide future touchless system development. What will you be doing? Explore and understand the literature on haptics perception and multisensory …