At a Glance
- Tasks: Build and maintain complex data pipelines for AI applications and support data projects.
- Company: Join the world's largest university press, dedicated to advancing academic knowledge.
- Benefits: Enjoy 25 days holiday, flexible working, and a generous pension contribution.
- Why this job: Be part of a mission-driven team that values innovation and collaboration in a dynamic environment.
- Qualifications: Expertise in Python, AWS, and data pipeline management is essential; experience with AI tools is a plus.
- Other info: This is a 12-month fixed term contract with opportunities for professional growth.
The predicted salary is between £48,000 and £72,000 per year.
We are the world's largest university press. That means we serve the academic community as no other publisher can. We work in partnership with institutions and learned societies to bring a world of knowledge to the fingertips of students and researchers worldwide.
The goal is impact. Together with our academic communities, we curate and seamlessly connect the ideas that push their fields forward, so they can learn from them, add to them, and continue a virtuous cycle of scholarship.
And because we are a part of the academic community and guided in everything we do by our mission, we re-invest in our people, our publishing, and the world-leading research institution of which we are part.
About the Role
As our new Senior Data Engineer, you will be responsible for building, maintaining, and running complex data pipeline processes associated with integrated AI applications, to enable new product features and business benefits. You will investigate, diagnose, and correct data defects, as well as identify and report data processing issues.
You will also be responsible for integrating applications so they work effectively with data, thereby supporting colleagues on data projects and delivering against the strategic requirements of the New Ventures innovation programme. This involves working with a range of platforms, systems, and tools developed as standard applications by OUP's Technology teams or by third-party suppliers, integrating them with each other and with existing infrastructure, and configuring or customizing those applications to meet business needs.
As a senior team member, you will operate at an advanced technical level, applying expert programming skills and the ability to analyse and rearchitect existing data pipelines, as well as setting data engineering best practice.
We operate a hybrid working policy that requires a minimum of 2 days per week in the Oxford office.
About You
Essential:
- Experience building and optimising complex data pipelines, architectures and data sets.
- Experience of working with structured data in XML and/or JSON format as well as unstructured data.
- Practical experience with data pipelines and workflow management tools, with a preference for Jenkins.
- Expert proficiency in Python.
- Experience with AWS cloud services including EC2, S3, Lambda, and SageMaker.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Experience of working with a variety of data manipulation tools to identify and correct problems.
- Experience in identifying and implementing process improvements.
- Demonstrated understanding of AI tools, concepts, and related skills such as prompt engineering.
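To give a concrete sense of the structured-data work the essential criteria describe, here is a minimal, purely illustrative Python sketch that normalises an XML record and a JSON record into a common shape, as a pipeline step might. The record fields (`title`, `year`) and helper names are invented for this example, not taken from OUP's systems.

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical records in the two structured formats the role mentions.
XML_RECORD = "<article><title>On Scholarship</title><year>2023</year></article>"
JSON_RECORD = '{"title": "On Impact", "year": 2024}'

def from_xml(xml_text: str) -> dict:
    """Parse a single XML article record into a flat dict of tag -> text."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

def from_json(json_text: str) -> dict:
    """Parse a JSON article record, keeping only the fields we expect."""
    data = json.loads(json_text)
    return {"title": data["title"], "year": str(data["year"])}

# Normalise both sources into one list ready for a downstream load step.
records = [from_xml(XML_RECORD), from_json(JSON_RECORD)]
print(records)
```

Real pipelines would add schema validation and error handling around the defect-diagnosis duties the role describes, but the normalise-then-load pattern is the same.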
Desirable:
- Linux administration experience (e.g. bash scripting, sed, awk).
- Experience of working with linguistic data and NLP (natural language processing) tools.
- Experience of automated test methodologies and frameworks, including test-driven development and behaviour-driven development.
- Relational database and SQL skills.
- Proficiency in server-side scripting languages other than Python.
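The relational database and SQL bullet above can likewise be illustrated with a small, self-contained sketch using Python's built-in `sqlite3` module. The table name and columns are hypothetical, chosen only to show the load-and-query pattern.

```python
import sqlite3

# In-memory database so the example needs no external server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (title TEXT, year INTEGER)")
conn.executemany(
    "INSERT INTO articles (title, year) VALUES (?, ?)",
    [("On Scholarship", 2023), ("On Impact", 2024)],
)

# Query the loaded rows back with a simple filter and sort.
titles = [row[0] for row in conn.execute(
    "SELECT title FROM articles WHERE year >= 2024 ORDER BY title"
)]
print(titles)  # ['On Impact']
```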
Please note, this is a 12-month fixed term contract.
Benefits
We care about work/life balance here at OUP. With this in mind, we offer 25 days' holiday that rises with service, plus bank holidays and Christmas closure (3 days), and a 35-hour working week. We are open to discussing flexibility in respect of working patterns, dependent on role. We also have a great variety of active employee networks and societies.
We help make your money go further by contributing to your pension up to 12%, offering loans and savings schemes through our partnership with Salary Finance, in addition to travel to work schemes and access to a wide range of local discounts.
This role comes with the added benefit of a discretionary annual payment.
Please see our Rewards and Recognition page for more information.
Queries
Please contact grace.mcfadyen@oup.com with any queries relating to this role.
We are committed to supporting diversity in our workforce, and ensuring an inclusive environment where all individuals can thrive. We seek to employ a workforce representative of the markets that we serve and encourage applications from all.
Senior Data Engineer (Fixed Term Contract) employer: Oxford University Press
Contact Detail:
Oxford University Press Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer (Fixed Term Contract) role
✨Tip Number 1
Familiarise yourself with the specific data pipeline tools mentioned in the job description, particularly Jenkins. Having hands-on experience or even a project showcasing your skills with these tools can set you apart from other candidates.
✨Tip Number 2
Brush up on your Python programming skills, as expert proficiency is essential for this role. Consider contributing to open-source projects or creating your own data engineering projects to demonstrate your capabilities.
✨Tip Number 3
Network with professionals in the field of data engineering, especially those who have experience with AWS services. Engaging in discussions or attending meetups can provide insights and potentially lead to referrals.
✨Tip Number 4
Stay updated on the latest trends in AI tools and concepts, as understanding these will be crucial for the role. Consider taking online courses or certifications that focus on AI and data engineering to enhance your knowledge.
We think you need these skills to ace Senior Data Engineer (Fixed Term Contract)
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with building and optimising complex data pipelines, as well as your proficiency in Python and AWS services. Use specific examples to demonstrate your skills.
Craft a Strong Cover Letter: In your cover letter, explain why you are passionate about the role of Senior Data Engineer and how your background aligns with the company's mission. Mention your experience with AI tools and cross-functional teams.
Showcase Relevant Projects: If you have worked on projects involving data manipulation tools or AI applications, be sure to include these in your application. Describe your role and the impact of your contributions.
Proofread Your Application: Before submitting, carefully proofread your application for any spelling or grammatical errors. A polished application reflects your attention to detail, which is crucial for a data engineering role.
How to prepare for a job interview at Oxford University Press
✨Showcase Your Data Pipeline Expertise
Be prepared to discuss your experience in building and optimising complex data pipelines. Highlight specific projects where you successfully integrated AI applications, as this aligns closely with the role's requirements.
✨Demonstrate Technical Proficiency
Make sure to showcase your expert programming skills, particularly in Python. Be ready to explain how you've used AWS services like EC2, S3, and Lambda in past projects, as these are crucial for the position.
✨Discuss Cross-Functional Collaboration
Since the role involves working with cross-functional teams, prepare examples of how you've effectively collaborated with others in a dynamic environment. This will demonstrate your ability to support colleagues on data projects.
✨Prepare for Problem-Solving Scenarios
Expect questions about diagnosing and correcting data defects. Prepare to discuss specific instances where you've identified and resolved data processing issues, showcasing your analytical skills and attention to detail.