Research Associate/Fellow in Safety and Security of AI

York Full-Time £36,000 - £60,000 / year (est.) No home office possible
University of York

At a Glance

  • Tasks: Join a dynamic team researching AI safety and security, shaping the future of technology.
  • Company: Be part of the Centre for Assuring Autonomy, a leading research hub in AI and autonomy.
  • Benefits: Collaborate with industry leaders and gain valuable experience across various sectors.
  • Other info: Informal enquiries welcome; interviews scheduled for August 2025.
  • Why this job: Contribute to impactful research that influences AI adoption and safety standards.
  • Qualifications: PhD in AI/ML or related field, with a focus on safety or security required.

The predicted salary is between £36,000 and £60,000 per year.

Department

A great opportunity to join the Centre for Assuring Autonomy (CfAA), part of the Department of Computer Science, working at the intersection of AI, safety and security. This would be of interest to someone with a PhD or equivalent experience in any one of the three fields – and interests in the other two. The CfAA works on the assurance and regulation of autonomy and AI, with a research team of around 70 academics, postgraduate researchers and PhD students covering many topics, e.g. explainability, human factors and system safety.

Role

This post will suit someone interested in developing expertise in an important topic which will shape the successful adoption of AI and autonomy, and who is keen to build links with industry across a range of sectors, including automotive, healthcare and maritime. For informal enquiries, please contact Prof John McDermid, CfAA Director, (john.mcdermid@york.ac.uk).

Skills, Experience & Qualifications needed

  • Successful applicants will have a first degree in computer science or a cognate discipline, and have completed a PhD on AI/ML, ideally with a focus on safety or security of ML models.
  • Experience of carrying out both independent and collaborative research.
  • Willingness and ability to work as part of a multidisciplinary team.
  • Good communication skills including the ability to write up research for publication.
  • In addition, for the Research Fellow role the successful applicant will have postdoctoral experience and the ability to lead small-scale research projects.

Interview date: w/c 4th or 11th August 2025

For informal enquiries: please contact Prof John A McDermid at john.mcdermid@york.ac.uk


Research Associate/Fellow in Safety and Security of AI employer: University of York

Joining the Centre for Assuring Autonomy (CfAA) offers a unique opportunity to be at the forefront of AI safety and security research within a collaborative and innovative environment. With a strong focus on employee development, the CfAA provides access to cutting-edge projects and industry partnerships, fostering growth and expertise in a rapidly evolving field. Located in a vibrant academic setting, employees benefit from a supportive work culture that values interdisciplinary collaboration and encourages meaningful contributions to the future of AI.

Contact Details:

University of York Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Research Associate/Fellow in Safety and Security of AI role

✨Tip Number 1

Network with professionals in the AI safety and security field. Attend relevant conferences, workshops, or webinars where you can meet researchers and industry experts. This will not only enhance your knowledge but also increase your visibility to potential employers like us.

✨Tip Number 2

Engage with the Centre for Assuring Autonomy's work by following their publications and research outputs. Understanding their current projects and challenges will help you tailor your discussions during interviews and show your genuine interest in their mission.

✨Tip Number 3

Consider reaching out to Prof John McDermid for an informal chat about the role and the team's work. This could provide you with valuable insights and demonstrate your proactive approach, which is highly regarded in our selection process.

✨Tip Number 4

Showcase your collaborative research experience in your conversations. Highlight specific projects where you've worked in multidisciplinary teams, as this aligns well with the role's requirements and demonstrates your ability to contribute effectively to our team.

We think you need these skills to ace the Research Associate/Fellow in Safety and Security of AI role

PhD in AI/ML or related field
Research Methodology
Safety and Security of ML Models
Independent Research Skills
Collaborative Research Skills
Multidisciplinary Teamwork
Technical Writing
Communication Skills
Project Leadership
Industry Engagement
Explainability in AI
Human Factors in AI
System Safety Analysis

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights your relevant experience in AI, safety, and security. Emphasise any research projects or publications that align with the role, particularly those that demonstrate your ability to work in multidisciplinary teams.

Craft a Strong Cover Letter: In your cover letter, express your enthusiasm for the position and the Centre for Assuring Autonomy. Discuss how your background in computer science and your PhD research relate to the role, and mention your interest in collaborating with industry sectors like automotive and healthcare.

Highlight Communication Skills: Since good communication skills are essential for this role, provide examples of how you've effectively communicated complex research findings in the past. This could include publications, presentations, or collaborative projects.

Prepare for Potential Interviews: While this step is not part of the written application, it's wise to prepare for interviews by thinking about how you would discuss your research and its implications for AI safety and security. Familiarise yourself with current trends in the field to demonstrate your knowledge and passion.

How to prepare for a job interview at University of York

✨Showcase Your Research Experience

Be prepared to discuss your previous research projects in detail, especially those related to AI, safety, and security. Highlight any independent or collaborative work you've done, as this will demonstrate your ability to contribute to the multidisciplinary team at the Centre for Assuring Autonomy.

✨Understand the Role's Impact

Familiarise yourself with how the role of Research Associate/Fellow contributes to the assurance and regulation of AI. Be ready to articulate your understanding of the importance of safety and security in AI applications across various sectors like automotive and healthcare.

✨Prepare Questions for the Interviewers

Think of insightful questions to ask during the interview. This could include inquiries about ongoing projects at the CfAA, potential collaborations with industry, or the future direction of research in AI safety and security. It shows your genuine interest in the role and the organisation.

✨Demonstrate Communication Skills

Since good communication skills are essential for this role, practice explaining complex concepts in a clear and concise manner. You may be asked to present your research findings, so ensure you can convey your ideas effectively, both verbally and in writing.
