- report writing experience. Experience working in Agile teams with Scrum/Kanban/etc. PREFERRED QUALIFICATIONS: 3 years' experience with coding in Python or Groovy to transfer, manipulate identity data, and
- such as Python or R. Proficiency with relational database systems (SQL) and object-based data stores. Ability to define and solve logical problems for technical applications. Knowledge of and ability
- Familiarity and experience utilizing data warehouse, data analytics, and reporting tools a plus (e.g., Cognos, SQL, R, Python, Tableau, Power BI). KEY RESPONSIBILITIES & ACCOUNTABILITIES: Strategic Supplier
- REST/SOAP APIs; SQL and database management; scripting languages (Python, PowerShell, or similar); version control systems (Git); integration monitoring and troubleshooting tools. Key Accountabilities
- platform and coordinating the integration and testing of commercial cellular equipment. The candidate must have fluency in multiple programming languages such as bash scripting, C++, and Python, and comfort
- science and familiarity with commonly used tools and techniques in analytics (R, Python, Tableau, Spark, SQL, etc.). Knowledge about how to manage, analyze, communicate, visualize, and lead with artificial
- in Geographic Information Systems (GIS) and science communication, as well as those who can supervise capstone projects by leveraging prior experience and networks with government, industry or non
- personal style, willing to receive assistance and coaching from the faculty lead when needed. Prior experience in a diverse higher education institution preferred. Knowledge of Python and/or R programming
- Engineering: Excellent software development skills with proficiency in Python, TensorFlow/PyTorch, and experience with containerized deployments and MLOps practices. Data Pipeline Engineering: Extensive experience with end-to-end data pipelines