At a Glance
- Tasks: Build and maintain data pipelines, ensuring seamless data flow for analytics.
- Company: Ipsos is a leading global research company, dedicated to delivering top-quality insights.
- Benefits: Enjoy 25 days annual leave, flexible working options, and professional development opportunities.
- Why this job: Join a dynamic team shaping the future of data infrastructure with cutting-edge technology.
- Qualifications: 2-3 years' experience in data engineering; proficiency in Python, SQL, and cloud platforms.
- Other info: Hybrid work model; commitment to diversity and inclusion in the workplace.
The predicted salary is between £30,000 and £42,000 per year.
Job Description
Data Engineer – Audience Measurement Market Research
Make Your Mark at Ipsos
Ipsos CrossMedia is at a pivotal and exciting stage of growth, and we are looking for a Data Engineer to join our innovative Audience Measurement team. This is a fantastic opportunity to work on both high-profile, established projects for blue-chip clients and to contribute to brand-new, greenfield initiatives.
This role will suit candidates with 2 to 3 years of commercial experience.
The role is based in our Cambridge office, but we also welcome applications from candidates who can work from our central London office and are able to travel to Cambridge, where the main team is based.
What is in it for you?
We are constantly evolving our workflows and are committed to investing in cutting-edge technology. If you are passionate about building and deploying data-centric systems on a major cloud platform and want to make a tangible impact, you will thrive here. You will have the opportunity to contribute ideas and grow with a team that is shaping the future of our data infrastructure.
- Opportunity to Build and Maintain Data Pipelines: Your primary responsibility will be to build, maintain, and improve our data pipelines and ETL/ELT processes.
- Work with Data Warehousing Solutions: You will work with our data warehousing solutions, contributing to data models and optimizing queries to ensure data is accessible and performant for our analytics teams.
- Develop and Monitor Data Workflows: You will help develop, maintain, and monitor our data ingestion and delivery pipelines using modern orchestration tools, ensuring data flows seamlessly and reliably (see the sketch after this list).
- Uphold Data Quality: You will apply best practices for data quality, testing, and observability, helping to ensure the data delivered to stakeholders is accurate and trustworthy.
- Collaborate on Data-Driven Solutions: You will work closely with our talented Data Scientists and R&D teams, understanding their requirements to provide the clean and structured data needed to power their research.
- Support System Reliability: You will help monitor the health and performance of our data systems. When issues arise, you'll assist with root cause analysis, deploy fixes, and provide technical support.
- Contribute to Technical Excellence: You will continuously learn about new data technologies, help test and implement enhancements to our data platform, and contribute to technical documentation.
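To make the day-to-day concrete, here is a minimal, illustrative sketch of an orchestrated ETL pipeline of the kind described above, written with Apache Airflow (one of the orchestration tools named later in this posting). The DAG name, schedule, and task bodies are hypothetical placeholders, not Ipsos's actual pipelines.

```python
# Illustrative sketch only: a daily extract -> transform -> load pipeline
# expressed as an Airflow DAG. All names here are invented for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw audience-measurement records from a source system.
    print("extracting raw records")


def transform():
    # Placeholder: clean and reshape the records for the warehouse.
    print("transforming records")


def load():
    # Placeholder: write the transformed records to the warehouse.
    print("loading records")


with DAG(
    dag_id="audience_ingest",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Declare the dependency chain so the steps run in ETL order.
    extract_task >> transform_task >> load_task
```

In practice each task would call into tested, reusable Python modules rather than inline placeholders, but the dependency structure (extract, then transform, then load) is the core pattern.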
The Role:
As a key member of our team, you will help ensure the reliability, scalability, and efficiency of our data platform.
About you:
- Experience in Data Pipeline and ETL Development: Solid experience building and maintaining data pipelines, with a good understanding of ETL/ELT patterns.
- Proficiency in Python and SQL: Strong, hands-on experience using Python for data processing and automation, and solid SQL skills for querying and data manipulation.
- Understanding of Data Modeling and Warehousing: A good understanding of data modeling techniques and data warehousing concepts.
- Experience with Cloud Platforms: Experience with major cloud providers (GCP, AWS, or Azure) and their core data services. We primarily use GCP, so experience there is a significant plus.
- Familiarity with Big Data Technologies: Exposure to or experience with large-scale data processing frameworks (e.g., Spark).
- Workflow Orchestration: Familiarity with data workflow orchestration tools (e.g., Airflow).
- Infrastructure as Code (IaC): An interest in or exposure to IaC tools (e.g., Terraform).
- Containerization: Familiarity with container technologies like Docker and Kubernetes.
- CI/CD for Data: A basic understanding of how to apply continuous integration/delivery principles to data workflows.
- Data Quality and Testing: An interest in modern data quality and testing frameworks (see the sketch after this list).
- Version Control: Proficiency with version control systems like Git.
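As a rough illustration of the data quality and testing practices mentioned above, here is a small sketch of rule-based checks written in plain pandas rather than any particular framework. The column names and rules describe a hypothetical audience-viewing table, invented for illustration only.

```python
# Illustrative sketch only: simple completeness, uniqueness, and validity
# checks on a hypothetical audience-viewing dataset.
import pandas as pd


def check_viewing_data(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data quality failures (empty = clean)."""
    failures = []

    # Completeness: key identifier columns must not contain nulls.
    for col in ("panelist_id", "channel", "viewed_at"):
        if df[col].isna().any():
            failures.append(f"null values found in required column '{col}'")

    # Uniqueness: expect one row per panelist per timestamp.
    if df.duplicated(subset=["panelist_id", "viewed_at"]).any():
        failures.append("duplicate panelist/timestamp rows detected")

    # Validity: viewing durations must be non-negative.
    if (df["duration_seconds"] < 0).any():
        failures.append("negative viewing durations detected")

    return failures


if __name__ == "__main__":
    # Sample data containing a deliberate duplicate to show a failing check.
    sample = pd.DataFrame(
        {
            "panelist_id": [1, 2, 2],
            "channel": ["BBC One", "ITV", "ITV"],
            "viewed_at": ["2024-01-01T20:00", "2024-01-01T20:00", "2024-01-01T20:00"],
            "duration_seconds": [1800, 900, 900],
        }
    )
    for problem in check_viewing_data(sample):
        print("FAIL:", problem)
```

A framework such as Great Expectations or dbt tests would express similar rules declaratively and wire them into a pipeline, but the underlying idea, asserting completeness, uniqueness, and validity before data reaches stakeholders, is the same.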
Benefits:
We offer a comprehensive benefits package designed to support you as an individual. Our standard benefits include 25 days annual leave, pension contribution, income protection and life assurance. In addition, there is a range of health and wellbeing benefits, financial benefits, and professional development opportunities.
We realise you may have commitments outside of work and will consider flexible working applications – please highlight what you are looking for when you make your application. We have a hybrid approach to work and ask people to be in the office or with clients for 3 days per week.
We are committed to equality, treating people fairly, promoting a positive and inclusive working environment and ensuring we have diversity of people and views. We recognise that this is important for our business success – a more diverse workforce will enable us to better reflect and understand the world we research and ultimately deliver better research and insight to our clients. We are proud to be a member of the Disability Confident scheme, certified as Level 1 Disability Confident Committed. We are dedicated to providing an inclusive and accessible recruitment process.
Your application will be reviewed by someone from our Talent Team who will be in touch either way to let you know the outcome.
Ready to have an impact? Apply now!
About Us
Ipsos is one of the world's largest research companies and currently the only one primarily managed by researchers, ranking as a #1 full-service research organization for four consecutive years. With over 75 different data-driven solutions, and presence in 90 markets, Ipsos brings together research, implementation, methodological, and subject-matter experts from around the world, combining thematic and technical experts to deliver top-quality research and insights. Simply speaking, we help the biggest companies solve some of their biggest problems, serving more than 5000 clients across the globe by providing research, data, and insights on their target markets. And we're proud to share we've received our Great Place to Work Certification in 2022 & 2023!
Contact Detail:
Ipsos Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role
✨Tip Number 1
Familiarise yourself with the specific cloud platforms mentioned in the job description, especially GCP. Consider taking online courses or certifications that focus on Google Cloud services to demonstrate your commitment and expertise.
✨Tip Number 2
Engage with the data engineering community by joining relevant forums or attending meetups. Networking with professionals in the field can provide insights into the latest trends and technologies, which could be beneficial during interviews.
✨Tip Number 3
Prepare to discuss your experience with data pipelines and ETL processes in detail. Be ready to share specific examples of projects you've worked on, including challenges faced and how you overcame them, as this will showcase your problem-solving skills.
✨Tip Number 4
Research Ipsos and their work in audience measurement. Understanding their projects and clients will allow you to tailor your discussions and show genuine interest in contributing to their innovative initiatives.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in data pipeline and ETL development, as well as your proficiency in Python and SQL. Use specific examples to demonstrate your skills and achievements.
Craft a Compelling Cover Letter: In your cover letter, express your passion for data engineering and how your background aligns with the role at Ipsos. Mention any experience with cloud platforms like GCP, AWS, or Azure, and your familiarity with big data technologies.
Showcase Your Technical Skills: Clearly outline your technical skills related to data modeling, warehousing, and workflow orchestration tools. If you have experience with containerization or CI/CD principles, be sure to include that as well.
Highlight Your Collaborative Experience: Since the role involves working closely with Data Scientists and R&D teams, mention any past experiences where you collaborated on data-driven solutions. This will show your ability to work effectively in a team environment.
How to prepare for a job interview at Ipsos
✨Showcase Your Technical Skills
Be prepared to discuss your experience with data pipelines, ETL/ELT processes, and the specific technologies mentioned in the job description. Highlight your proficiency in Python and SQL, and be ready to provide examples of how you've used these skills in previous roles.
✨Demonstrate Problem-Solving Abilities
Expect questions that assess your ability to troubleshoot and resolve issues within data systems. Prepare to share specific instances where you identified a problem, conducted root cause analysis, and implemented a solution, particularly in relation to data quality and system reliability.
✨Familiarise Yourself with Their Tools
Research the cloud platforms and orchestration tools that Ipsos uses, especially GCP and Airflow. Showing familiarity with these technologies will demonstrate your readiness to contribute from day one and your commitment to continuous learning in the field.
✨Prepare for Collaborative Scenarios
Since the role involves working closely with Data Scientists and R&D teams, be ready to discuss how you approach collaboration. Think of examples where you successfully worked in a team to deliver data-driven solutions, and how you ensured that the data provided met their needs.