- recordings. Perform coding-based data analysis (Python and/or MATLAB); work with and organize big datasets (e.g., chronic in-vivo recordings, videos, and tracking). Optimize and troubleshoot pipelines to run
- What you’ll do: Design, develop, assemble, integrate, test, debug, modify, and optimize simple to complex systems, instruments, and one-of-a-kind prototypes for neurobiological research. Applications
- What You’ll Do: Lead the development, optimization, and benchmarking of Expansion Microscopy protocols for diverse model organisms, with general supervision. Perform hands-on bench research, including tissue
- sample prep, sectioning, and imaging. Optimizing and writing up lab protocols. Shipping and cataloguing reagents. Lab fridge and freezer maintenance. General lab upkeep alongside other members
- About the role: We're seeking a skilled Data Engineer to drive scientific innovation through robust data infrastructure. In this role, you’ll design, develop, and optimize scalable data pipelines and
- Develop and implement procurement strategies that align with institutional goals in research and operations. Collaborate with Finance on strategic planning, budgeting, and resource optimization. Stay
- Develop and optimize SOPs and workflows for iPSC services. Train others in iPSC-related protocols, processes, and instrument usage. Work both independently and collaboratively with minimal supervision
- Professional development opportunities through internal and external conferences and workshops. What You’ll Do: Design, plan, and perform method optimization experiments independently. Perform data analysis and
- the care of essential experimental animals within the food and water restriction program. Our mission encompasses optimizing scientific productivity for researchers while ensuring the highest standards