At a Glance
- Tasks: Join a cutting-edge project to develop an innovative VR app that responds to emotions.
- Company: Be part of the School of Digital Arts at Manchester Metropolitan University, a leader in creative technology.
- Benefits: Flexible working hours, collaborative environment, and opportunities for professional growth.
- Why this job: Make a real impact by merging art and technology in a revolutionary VR experience.
- Qualifications: PhD or equivalent experience in computer science, with Unity and C# programming skills.
- Other info: Work in a dynamic team of artists, psychologists, and AI researchers on a groundbreaking project.
The predicted salary is between £36,000 and £60,000 per year.
The School of Digital Arts is a purpose-built, interdisciplinary school at one of the UK’s leading universities, offering industry- and research-informed courses and specialist spaces with the latest technologies. The School of Digital Arts is a proud part of Manchester Metropolitan University. We build on the creative, science, tech and business strengths of a university whose research is rated as ‘world-leading’ and is changing the way we live, work, learn and play.
AI systems are increasingly able to detect a speaker’s emotions, leading to a new affective channel that can be explored in art. The controls available in standard Virtual Reality (VR) can be supplemented with speech recognition, natural language processing, and sentiment analysis. We aim to embody this potential in the front end of the Emote VR Voicer interface, which would translate detected meanings of vocal utterances to the morphing of abstract 3D animated shapes, enabling a radical new aesthetic experience. We are using iterative design cycles and ultimately aim to develop an interface that will improve the participant’s wellbeing.
About the role: Working within the School of Digital Arts (SODA) you will join state-of-the-art research on the AHRC-funded Emote VR Voicer project to develop a new, intelligently responsive VR app that incorporates speech recognition and meaning classification. You will help create a system that can detect live emotional content in the spoken (or sung) word, mapping this to visual animations based on sample banks of specially created 3D emotion shapes. You will be responsible for the VR development, working closely within a small project team of artists, a psychologist and AI researchers in an iterative development cycle.
You will use your programming skills and Unity experience to integrate AI models that detect and tag emotional meaning from audio and map these to steer real-time visuals in Unity. Live audio features will also be mapped to animate the graphics. Working closely with the project lead, you will bring together assets to create animation blend trees and combine these with procedural animation to create a system where the shapes are animated differently depending on which emotion the system detects. Image synthesis, procedural content generation and style transfer will further expand a bank of 3D graphics created specifically for this project. You will also be involved in some of the evaluation work and in writing up the research for publication(s).
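To illustrate the kind of mapping described above, here is a minimal, purely hypothetical Python sketch (not project code; the emotion labels and smoothing scheme are assumptions for illustration) of turning classifier emotion scores into normalised blend weights, with frame-to-frame smoothing to keep the shape morphs visually stable:

```python
def emotion_to_blend_weights(scores, smoothing=0.7, prev=None):
    """Map raw emotion-classifier scores to normalised blend weights.

    scores    -- dict of emotion label -> raw classifier score
    smoothing -- how much of the previous frame's weights to keep (0..1)
    prev      -- last frame's weights, or None on the first frame
    """
    total = sum(scores.values())
    target = {label: score / total for label, score in scores.items()}
    if prev is None:
        return target
    # Exponentially smooth towards the new target so the 3D shapes
    # morph gradually rather than jumping on every classifier update.
    return {label: smoothing * prev.get(label, 0.0)
            + (1 - smoothing) * target[label]
            for label in target}

# Example: the classifier is most confident the utterance sounds joyful.
weights = emotion_to_blend_weights({"joy": 0.6, "sadness": 0.1, "anger": 0.3})
```

In a Unity integration these weights would typically drive blend tree parameters (e.g. via `Animator.SetFloat`); the exact wiring would depend on the project's asset setup.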
The job will be for 2.5 days per week (0.5 FTE) on a fixed-term basis for 8 months. The working pattern will be mostly on-campus, with some remote working possible depending on the project stage.
Key skills:
- A good understanding of programming within the Unity game engine using C# and experience with VR application development.
Essential skills and experience:
- A PhD in computer science, software engineering or a similar technical field, or equivalent professional experience
- Experience developing projects with C#
- Hands‑on experience developing Metaverse/VR applications using Unity
- Proficiency with scripting for procedural animation generation
- Experience with writing and co‑writing research papers
- Experience with image and/or audio‑based projects
- Experience with real‑time system optimisation (e.g. low‑latency audio/visual feedback in VR)
- Experience with data backup systems
- Experience with working in interdisciplinary teams
- Excellent communication and interpersonal skills
- Creative problem‑solving skills
- Self-motivated and able to undertake independent research related to the brief
- Excellent ability to work to deadlines
Desirable:
- Experience with user testing or co‑design methods, in arts or health settings
- Knowledge of the peer review process for research projects and journal articles
- Experience with bringing Python models into Unity
- Proficiency with Autodesk Maya modelling, skinning and rigging
- Familiarity with research project timelines and productivity expectations
- Sensitivity to nuances in visual aesthetics
Application process:
To apply, please submit your CV, a cover letter explaining how you meet the criteria and include a link to previous relevant work, and two named references via our application portal. If you would like to discuss the role, please email Adinda at: A.vant.Klooster@mmu.ac.uk
EEO and Inclusivity Statement:
Manchester Metropolitan University fosters an inclusive culture of belonging that promotes equity and celebrates diversity. We value a diverse workforce for the innovation and diversity of thought it brings and welcome applications from all local and international communities, including Black, Asian and Minority Ethnic backgrounds, disabled people and LGBTQ+ individuals. We support a range of flexible working arrangements, including hybrid and tailored schedules, which can be discussed with your line manager. If you require reasonable adjustments during the recruitment process or in your role, please let us know so we can provide appropriate support. Our commitment to inclusivity includes mentoring programmes, accessibility resources and professional development opportunities to empower and support underrepresented groups. Manchester Met is a Disability Confident Leader and, under this scheme, aims to offer an interview to disabled people who apply for the role and meet the essential criteria as listed in the attached Job Description for that vacancy.
StudySmarter Expert Advice 🤫
We think this is how you could land Research Associate (Emote VR Voicer Virtual Reality) in Manchester
✨Tip Number 1
Get to know the team! Before your interview, do a bit of research on the people you'll be working with. Understanding their backgrounds and interests can help you connect during the chat and show that you're genuinely interested in the project.
✨Tip Number 2
Show off your skills! Bring along a portfolio or examples of your previous work that highlight your programming prowess in Unity and any VR projects you've tackled. This is your chance to shine and demonstrate how you can contribute to the Emote VR Voicer project.
✨Tip Number 3
Be ready to collaborate! Since this role involves working closely with artists and psychologists, be prepared to discuss how you can integrate your technical skills with their creative ideas. Highlight your openness to interdisciplinary collaboration during the interview.
✨Tip Number 4
Apply through our website! We encourage you to submit your application via our portal. It’s the best way to ensure your CV and cover letter get seen by the right people. Plus, it shows you’re serious about joining our innovative team!
Some tips for your application 🫡
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Make sure to highlight how your skills and experiences align with the role. We want to see your passion for interdisciplinary collaboration and how you can contribute to the Emote VR Voicer project.
Showcase Your Previous Work: Don’t forget to include a link to your previous relevant work! This is a great way for us to see your programming skills in action, especially with Unity and C#. Make it easy for us to see what you can bring to the table.
Tailor Your CV: Make sure your CV is tailored to the job description. Highlight your experience with VR application development, C#, and any research projects you've been involved in. We love seeing candidates who pay attention to detail!
Apply Through Our Website: Remember to apply through our application portal! It’s the best way for us to keep track of your application and ensure it gets the attention it deserves. Plus, it makes the process smoother for everyone involved.
How to prepare for a job interview at Manchester Metropolitan University
✨Know Your Tech
Make sure you brush up on your programming skills, especially in C# and Unity. Familiarise yourself with VR application development and be ready to discuss any relevant projects you've worked on. This will show that you're not just a good fit for the role but also passionate about the technology.
✨Show Your Collaborative Spirit
Since this role involves working closely with artists, psychologists, and AI researchers, highlight your experience in interdisciplinary teams. Share examples of how you've successfully collaborated in the past, as this will demonstrate your openness to diverse perspectives and teamwork.
✨Prepare for Creative Problem-Solving
Think of specific challenges you've faced in previous projects and how you overcame them. Be ready to discuss your creative problem-solving skills, especially in relation to real-time system optimisation and procedural animation generation. This will showcase your ability to think on your feet.
✨Research and Reflect
Before the interview, take some time to research the Emote VR Voicer project and its goals. Reflect on how your skills and experiences align with their vision. Being able to articulate this connection will impress the interviewers and show that you're genuinely interested in contributing to their mission.