33 natural-language-processing-phd Fellowship positions at Hong Kong Polytechnic University
- more than five years of post-qualification experience at the time of application; and (b) strong background in accounting, finance or computer science/natural language processing/machine learning
- , NVivo and other Microsoft Office software, or web design/development including Chinese word processing; (e) have an excellent command of both written and spoken English; (f) have a strong
- at the time of application; (b) have extensive experience in the field of natural language processing; (c) have good working knowledge of Python, PyTorch, Linux, Git, etc.; (d) have a good command of
- representation for multi-model fusion". Qualifications Applicants for the Postdoctoral Fellow post should have a PhD degree in Computer Science, Electrical and Computer Engineering or a related discipline or an
- Department of Language Science and Technology Part-time Postdoctoral Fellow (Ref. 260226003) [Appointment period: twelve months] Duties The appointee will assist the project leader in the research
- for the Senior Research Fellow post should have a PhD degree in Computer Science, Artificial Intelligence, Data Science, Digital Economy or a related field plus at least six years of postdoctoral research
- design prototypes and provide technical support; (b) conduct design studies with data collection and analysis; (c) document the design and research process; (d) conduct research writing; and (e
- for the Research Assistant post should have an honours degree or an equivalent qualification. For both posts, preference will be given to those who have a high level of proficiency in English and are interested in
- support on foundation models and generative AI systems; (b) explore advanced techniques in foundation large language models and multimodal large language models, with a focus on pre-training, post