-
Proficiency in data analysis software (e.g., R, MATLAB, SPSS, Primer, Python). Experience with analytical techniques such as ANOVA, PCA, regression, and multidimensional scaling (MDS). Experience in ecosystem
-
AI applications as well as Python-based coding. Hold a degree in Computer Science/Computer Engineering; a Master's or PhD will be advantageous. Knowledge of machine
-
, vibration, and structural dynamics. Familiarity with signal processing and vibration/acoustic analysis techniques. Experience with data analysis using tools such as MATLAB, Python, or equivalent engineering
-
Proficiency in algorithm development using Python will be advantageous. Where to apply: https://www.timeshighereducation.com/unijobs/listing/407953/research-fellow-ai-…
-
tools such as MATLAB, Python, or equivalent engineering software. Experience with experimental and instrumentation measurement setups for noise and vibration (e.g., sound intensity probes, vibration
-
techniques. Training or demonstrated experience in remote sensing, spatial data collection, and thematic mapping. Proficiency in data analysis software (e.g., R, MATLAB, SPSS, Primer, Python). Experience with
-
Cloud-based applications (e.g., AWS, Google Cloud, Microsoft Azure), including containerization (Docker), orchestration (Kubernetes), serverless computing, and REST API development. Proficient in Python, with
-
Computer Science, Artificial Intelligence, Software Engineering, or a related field. Strong programming proficiency in Python and/or C++. Demonstrable experience with machine learning frameworks (e.g., PyTorch
-
will be advantageous. Knowledge of machine learning or reinforcement learning techniques will be advantageous, as will proficiency in algorithm development using Python.
-
and algorithms. Python and relevant libraries (e.g., PyQt, OpenCV, NumPy, scikit-learn), particularly for developing Windows desktop application software incorporating deep learning models. 2. Hold at