At a Glance
- Tasks: Join our research team to advance security in generative AI technologies and tackle emerging threats.
- Company: Darktrace, a global leader in AI cybersecurity, protecting 10,000 organisations worldwide.
- Benefits: Generous holiday, private medical insurance, life insurance, and a cycle to work scheme.
- Why this job: Make a real impact on the future of AI security while working with cutting-edge technology.
- Qualifications: Understanding of generative AI systems, strong analytical skills, and a passion for collaborative problem-solving.
- Other info: Hybrid role with excellent career growth opportunities in a dynamic environment.
The predicted salary is between 36000 - 60000 £ per year.
Darktrace is a global leader in AI for cybersecurity that keeps organizations ahead of the changing threat landscape every day. Founded in 2013, Darktrace provides the essential cybersecurity platform protecting nearly 10,000 organizations from unknown threats using its proprietary AI. The Darktrace Active AI Security Platform delivers a proactive approach to cyber resilience to secure the business across the entire digital estate – from network to cloud to email. Breakthrough innovations from our R&D teams have resulted in over 200 patent applications filed. Darktrace’s platform and services are supported by over 2,400 employees around the world.
Job Description: As part of our cutting-edge research team, you will play a pivotal role in advancing the security and trustworthiness of generative AI technologies. This position offers the opportunity to explore emerging threats, design innovative defenses, and shape best practices for safe and responsible AI deployment. You’ll work at the intersection of machine learning, cybersecurity, and applied research, helping to ensure that next-generation AI systems are robust, secure, and aligned with ethical standards. This is a hybrid role, with a compulsory attendance of 2 days a week in the Cambridge office.
What will I be doing: As a Security Researcher with a focus on generative AI systems, you will contribute to a range of projects ranging from rapid prototyping of new ideas to open-ended research initiatives. As a domain expert, you will provide initial insights and ongoing feedback supporting product development, communicating with the product, development, and machine learning teams as needed. Other responsibilities will include but not limited to:
- Investigating trends in generative AI compliance and visibility
- Researching attacker tradecraft targeting generative AI chatbots and agentic systems
- Creating, validating, and testing detections in a research environment
- Co-ordinating with relevant development, product, and machine learning teams
- Provide detailed and actionable feedback on product performance
What experience do I need: To succeed in this role, you should bring a solid understanding of generative AI systems and their security challenges, along with strong analytical and communication skills. You’ll be working closely with a detection engineering team, so an interest in collaborative problem-solving and a proactive approach to learning are essential.
- Familiarity with the evolving landscape of generative AI, including popular foundation models and emerging agentic architectures
- Knowledge of common attacker methodologies targeting AI systems (e.g., prompt injection, data poisoning, inference, and extraction attacks)
- Interest in contributing to a detection engineering team focused on safeguarding AI technologies
- Strong logical reasoning and problem-solving skills, especially in unfamiliar or complex scenarios
- Ability to communicate technical concepts clearly to both technical and non-technical stakeholders
Benefits:
- 23 days’ holiday + all public holidays, rising to 25 days after 2 years of service
- Additional day off for your birthday
- Private medical insurance which covers you, your cohabiting partner and children
- Life insurance of 4 times your base salary
- Salary sacrifice pension scheme
- Enhanced family leave
- Confidential Employee Assistance Program
- Cycle to work scheme
Security Researcher in Cambridge employer: Darktrace Ltd
Contact Detail:
Darktrace Ltd Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Security Researcher in Cambridge
✨Tip Number 1
Network like a pro! Reach out to folks in the cybersecurity and AI space on LinkedIn or at industry events. A friendly chat can open doors that a CV just can't.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects related to generative AI and security. This gives you a chance to demonstrate your expertise beyond what's on paper.
✨Tip Number 3
Prepare for interviews by brushing up on common questions in the field. Think about how you can relate your experiences to the role at Darktrace, especially around emerging threats and innovative defenses.
✨Tip Number 4
Don't forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you're genuinely interested in joining our team.
We think you need these skills to ace Security Researcher in Cambridge
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Security Researcher role. Highlight your experience with generative AI systems and any relevant projects you've worked on. We want to see how your skills align with what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about cybersecurity and generative AI. Share specific examples of your work that demonstrate your analytical and problem-solving skills. Let us know why you’d be a great fit for our team!
Showcase Your Communication Skills: Since you'll be working with both technical and non-technical teams, it's important to showcase your ability to communicate complex ideas clearly. In your application, include examples of how you've successfully communicated technical concepts in the past.
Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s super easy, and you’ll be able to keep track of your application status. Plus, we love seeing applications come directly from our site!
How to prepare for a job interview at Darktrace Ltd
✨Know Your Stuff
Make sure you brush up on generative AI systems and their security challenges. Familiarise yourself with common attacker methodologies like prompt injection and data poisoning. This knowledge will not only help you answer questions confidently but also show your genuine interest in the role.
✨Showcase Your Problem-Solving Skills
Prepare to discuss specific examples where you've tackled complex problems, especially in unfamiliar scenarios. Think about how you approached these challenges and what the outcomes were. This will demonstrate your analytical skills and proactive approach to learning.
✨Communicate Clearly
Practice explaining technical concepts in simple terms. You’ll need to communicate with both technical and non-technical stakeholders, so being able to break down complex ideas is crucial. Consider doing mock interviews with friends or colleagues to refine this skill.
✨Engage with the Team
Since collaboration is key in this role, be prepared to discuss how you work within a team. Think of examples where you’ve coordinated with others, especially in research or product development contexts. Showing that you can contribute to a detection engineering team will set you apart.