At a Glance
- Tasks: Support AI safety researchers and manage impactful projects in a collaborative environment.
- Company: MATS is dedicated to advancing AI safety through innovative research and mentorship.
- Benefits: Competitive pay, flexible work options, and the chance to work with leading experts.
- Why this job: Make a real difference in AI safety while developing your skills and network.
- Qualifications: 2 years of experience in research management, project management, or related fields required.
- Other info: Diverse backgrounds encouraged; apply even if you don't fit all profiles!
The predicted salary is between £62,400 and £93,600 per year.
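This figure appears to be derived from the hourly range of £30 – £45 listed in the Overview below, assuming a full-time 40-hour week and 52 paid weeks per year (an assumption not stated in the posting):
- £30/h × 40 h/week × 52 weeks = £62,400 per year
- £45/h × 40 h/week × 52 weeks = £93,600 per year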
ML Alignment & Theory Scholars (MATS) are looking for a talented individual to join our London team as a Research Manager.
Overview:
- Full-time position
- Language: English
- Location: London, UK
- Salary: £30 – £45/h, depending on experience
- Closing date: open until filled
Job Description
As a Research Manager, you will play a crucial role in supporting and guiding AI safety researchers, facilitating projects, and contributing to the overall success of our programme. This role offers a unique opportunity to develop your skills, make a significant impact in the field of AI safety, and work with top researchers from around the world.
Your day-to-day work will involve talking to both scholars and mentors to understand the needs and direction of their projects. This may mean becoming integrated into the research team, providing feedback on papers, and ensuring that there is a plan to get each project from where it is now to where it needs to be.
We are especially excited about candidates who can augment their work as a Research Manager by utilising their pre-existing expertise in one of the following domains:
- Theory – providing informed feedback to scholars on research direction, and helping MATS to assess research priorities.
- Engineering – helping scholars to become stronger research engineers, and building out the internal tooling of MATS.
- Projects – providing scholars with structure and accountability for their research, and helping MATS to build better systems and infrastructure.
- Communication – helping scholars to present their research in more compelling ways to influential audiences, and improving how MATS communicates its mission.
Responsibilities:
- Work with world-class academics & industry mentors to:
  - People-manage their AI safety mentees
  - Manage and support AI safety research projects
  - Facilitate communication and collaboration between scholars, mentors, and other collaborators
  - Organise and lead research meetings
- Work with individual junior AI safety researchers to:
  - Provide guidance and feedback on research directions and writeups
  - Connect them with relevant domain experts to support their research
- Contribute to the strategic planning and development of MATS:
  - Spearheading internal projects
  - Building and maintaining the systems and infrastructure that MATS requires to run efficiently
  - Providing input into strategy discussions
Role Requirements
We welcome applications from individuals with diverse backgrounds, and strongly encourage you to apply if you fit into at least one of these profiles:
- AI safety researchers looking to develop a more holistic skillset
- Professionals with research or research management experience
- Product/project managers from tech or a STEM industry
- People managers with technical or governance backgrounds
- Technical writers or science communicators
- Community builders
If you do not fit into one of these profiles but think you could be a good fit, we are still excited for you to apply!
Essential Qualifications and Skills
We are looking for candidates who have the following:
- 2 years' experience across a combination of the following:
  - Technical research
  - Governance or policy work
  - Project management
  - Research management (not necessarily technical)
  - People management
  - Community building
  - Mentoring
Desirable Qualifications and Skills
We expect especially strong applicants to have deep experience in at least one of the following areas:
- Familiarity with AI safety concepts and research landscape
- Experience in ML engineering, software engineering, or related technical fields
- Experience in AI policy
- Background in coaching or professional development
- Entrepreneurial mindset and experience in building systems or infrastructure
- PhD or extensive academic research experience
How to apply?
To apply, please fill out the form here.
MATS is committed to fostering a diverse and inclusive work environment. We encourage applications from individuals of all backgrounds and experiences.
Join us in shaping the future of AI safety research!
Contact Detail:
Freelancingforgood Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Research Manager role at ML Alignment & Theory Scholars (MATS)
✨Tip Number 1
Network with professionals in the AI safety field. Attend relevant conferences, webinars, or meetups to connect with researchers and industry experts. This can help you gain insights into the role and potentially get referrals.
✨Tip Number 2
Familiarise yourself with the latest trends and challenges in AI safety research. Read recent papers and articles to understand the current landscape, which will enable you to engage in meaningful conversations during interviews.
✨Tip Number 3
Showcase your project management skills by discussing specific examples of how you've successfully led teams or projects in the past. Highlight your ability to facilitate communication and collaboration among diverse groups.
✨Tip Number 4
Prepare to discuss how your unique background aligns with MATS' mission. Whether it's through technical expertise, community building, or mentoring, be ready to articulate how you can contribute to their goals.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in research management, AI safety, or any of the specified domains. Use keywords from the job description to demonstrate your fit for the role.
Craft a Compelling Cover Letter: Write a cover letter that not only outlines your qualifications but also expresses your passion for AI safety and how your background aligns with MATS's mission. Be specific about how you can contribute to their projects.
Showcase Communication Skills: Since excellent communication is essential for this role, consider including examples in your application that demonstrate your ability to convey complex ideas clearly, whether through previous work or academic experiences.
Highlight Relevant Projects: If you've managed or contributed to any relevant projects, make sure to detail these in your application. Discuss your role, the challenges faced, and the outcomes achieved to showcase your project management skills.
How to prepare for a job interview at Freelancingforgood
✨Understand the Role
Make sure you have a clear understanding of the Research Manager position and its responsibilities. Familiarise yourself with AI safety concepts and the specific needs of the MATS programme, as this will help you demonstrate your fit for the role.
✨Showcase Your Expertise
Highlight your relevant experience in research management, project management, or any technical background you possess. Be prepared to discuss how your skills can contribute to the success of AI safety projects and support scholars effectively.
✨Prepare Thoughtful Questions
Come equipped with insightful questions about the team, ongoing projects, and the future direction of MATS. This shows your genuine interest in the role and helps you assess if the organisation aligns with your career goals.
✨Demonstrate Communication Skills
Since excellent communication is key for this role, practice articulating your thoughts clearly and concisely. Be ready to explain complex ideas in simple terms, as this will be crucial when working with scholars and mentors.