At a Glance
- Tasks: Conduct groundbreaking research on moral psychology and AI at a leading university.
- Company: Join the University of Cambridge's innovative Leverhulme Centre for the Future of Intelligence.
- Benefits: Funded conference travel and workshops, plus a collaborative research environment.
- Why this job: Make a real impact on understanding AI consciousness and ethics in a dynamic setting.
- Qualifications: PhD in Psychology or related field with expertise in moral psychology and AI.
- Other info: No teaching duties, but opportunities may arise; applications close on 22 February 2026.
The predicted salary is between £36,000 and £60,000 per year.
This is an exciting opportunity to work within a research project on the moral psychology of digital minds, led by Dr Lucius Caviola at the Leverhulme Centre for the Future of Intelligence (CFI), part of the Institute for Technology and Humanity (ITH) at the University of Cambridge. This full-time Postdoctoral Research Associate position is fixed term for 2 years, starting in May 2026.
About the research
The research programme investigates how people perceive and make moral judgments about social AI systems and so-called digital minds. Key research questions include:
- How do laypeople perceive AI consciousness?
- What factors influence whether people attribute moral status to AI systems?
- How do these perceptions affect moral decision-making and ethical attitudes toward AI?
- Are there individual or cross-cultural differences?
The postdoc will conduct empirical research using experimental methods to advance our understanding of the psychology underlying human responses to digital minds. More information about the project can be found in the Further Information document.
Responsibilities
- Conduct empirical research using experimental methods to investigate moral psychology related to AI and digital minds.
- Develop and implement studies, analyse data, and contribute to scholarly publications.
- Collaborate with the project team at the Leverhulme Centre for the Future of Intelligence (CFI) and the Institute for Technology and Humanity (ITH) at the University of Cambridge.
- Engage with ongoing research efforts and participate in conference travel and workshops funded by the project.
- Disseminate findings through academic channels and participate in project meetings.
There are no teaching duties associated with this post, though teaching opportunities may arise.
Qualifications
- Completed (or near completion) PhD in Psychology, Cognitive Science, or a related field, with expertise in moral psychology, social cognition, or related areas.
- A strong understanding of both the philosophy and psychology of AI consciousness, ethics, and digital minds.
- Proficiency in quantitative research methods, including advanced statistics and experimental psychology.
- Experience with research tools such as Qualtrics and R.
- We particularly welcome applicants with published work in academic journals on topics related to perceptions of AI consciousness and moral status.
Position and environment
The postdoc will be based at the Leverhulme Centre for the Future of Intelligence (CFI), part of the Institute for Technology and Humanity at the University of Cambridge. The project has funds dedicated to conference travel for the post holder and for workshops.
About CFI
CFI is a highly interdisciplinary research centre addressing the challenges and opportunities posed by artificial intelligence (AI). Based at the University of Cambridge, the Centre has close links with industry, policymakers, and many academic disciplines.
How to apply
For information on how to apply, please see the Further Information document. Informal enquiries about the role are welcomed and should be directed to Dr Lucius Caviola at lmoc2@cam.ac.uk. For questions on the application process, please contact the School's HR Team at sahhr@admin.cam.ac.uk.
Timeline
The closing date for applications is midnight (GMT) on Sunday 22 February 2026. Interviews are planned to take place in March 2026, subject to change.
Application process
Click the 'Apply' button below to register an account with our recruitment system (if you have not already) and apply online.
Reference
Please quote reference GO48724 on your application and in any correspondence about this vacancy.
Equality and eligibility
The University actively supports equality, diversity and inclusion and encourages applications from all sections of society. The University has a responsibility to ensure that all employees are eligible to live and work in the UK.
StudySmarter Expert Advice 🤫
We think this is how you could land the Postdoctoral Research Associate in Digital Minds (Fixed Term) role in Cambridge
✨Tip Number 1
Network like a pro! Reach out to folks in your field, especially those connected to the Leverhulme Centre for the Future of Intelligence. A friendly chat can open doors and give you insights that might just set you apart.
✨Tip Number 2
Show off your research skills! Prepare to discuss your past projects and how they relate to moral psychology and AI. We want to see your passion and expertise shine through during interviews.
✨Tip Number 3
Stay updated on the latest in AI and moral psychology. Being well-versed in current trends and debates will help you engage meaningfully with the interview panel and demonstrate your commitment to the field.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets the attention it deserves. Plus, it shows you’re serious about joining our team at the University of Cambridge.
Some tips for your application 🫡
Tailor Your Application: Make sure to customise your CV and cover letter to highlight your experience in moral psychology and AI. We want to see how your background aligns with the research questions we're tackling!
Show Off Your Research Skills: Since this role involves empirical research, be sure to showcase your proficiency in quantitative methods and any relevant tools like Qualtrics and R. We love seeing candidates who can hit the ground running!
Highlight Your Publications: If you've got published work related to AI consciousness or moral status, flaunt it! This will really help us understand your expertise and how you can contribute to our project.
Apply Through Our Website: Don't forget to apply through our recruitment system! It's the easiest way for us to keep track of your application and ensure you’re considered for this exciting opportunity.
How to prepare for a job interview at the University of Cambridge
✨Know Your Research Inside Out
Make sure you’re well-versed in the specifics of the research project on moral psychology and digital minds. Familiarise yourself with Dr Lucius Caviola's work and the key questions being explored. This will not only show your genuine interest but also help you engage in meaningful discussions during the interview.
✨Showcase Your Methodological Skills
Be prepared to discuss your experience with empirical research methods, especially in relation to quantitative analysis and experimental psychology. Highlight any relevant tools you've used, like Qualtrics and R, and be ready to provide examples of how you've applied these skills in past projects.
✨Demonstrate Your Understanding of AI Ethics
Since the role involves investigating perceptions of AI consciousness, it’s crucial to articulate your understanding of both the philosophical and psychological aspects of AI ethics. Prepare to discuss current debates in the field and how they relate to your research interests.
✨Engage with the Team Spirit
Collaboration is key in this role, so express your enthusiasm for working within a team at the Leverhulme Centre for the Future of Intelligence. Share examples of past collaborative projects and how you contributed to achieving common goals, as this will highlight your ability to fit into their research environment.