At a Glance
- Tasks: Design and automate data pipelines using cutting-edge tools like Kubernetes and Docker.
- Company: Join Capgemini, a global leader in tech transformation with a focus on sustainability.
- Benefits: Enjoy hybrid working, flexible hours, and access to 250,000 training courses.
- Why this job: Be part of a diverse team driving digital transformation and making a real-world impact.
- Qualifications: Proficiency in data orchestration tools and experience with Docker, Kubernetes, and CI/CD principles required.
- Other info: Must obtain Security Check clearance; ideal for those passionate about innovation and technology.
The predicted salary is between £36,000 and £60,000 per year.
The Cloud Data Platforms team is part of the Insights and Data Global Practice and has seen strong growth and continued success across a variety of projects and sectors. Cloud Data Platforms is the home of the Data Engineers, Platform Engineers, Solutions Architects and Business Analysts who are focused on driving our customers' digital and data transformation journey using modern cloud platforms. We specialise in the latest frameworks, reference architectures and technologies across AWS, Azure and GCP.
Hybrid working: The places that you work from day to day will vary according to your role, your needs, and those of the business. It will be a blend of Company offices, client sites, and your home, though you will not be able to work from home 100% of the time.
If you are successfully offered this position, you will go through a series of pre-employment checks, including: identity; nationality (single or dual) or immigration status; employment history going back 3 continuous years; and an unspent criminal record check (a Disclosure and Barring Service, or DBS, check).
The DataOps Engineer role focuses on designing, building, automating, and orchestrating data pipelines and applications within containerised environments, primarily Kubernetes. This role bridges the gap between traditional cloud data engineering and DevOps, emphasising automation and continuous delivery of data solutions.
Your work will include:
- Designing, building, automating and orchestrating data pipelines using tools such as Airflow, Prefect, or Dagster.
- Containerising data applications using Docker and deploying them to container platforms (e.g., Amazon EKS, Azure AKS, or other Kubernetes distributions).
- Implementing and managing CI/CD pipelines for data applications.
- Implementing and managing comprehensive monitoring and observability solutions using tools such as Grafana and Prometheus, ensuring data quality across the entire data flow.
- Working with Infrastructure as Code (IaC) tools (e.g., Terraform, Ansible) to provision and manage data infrastructure within pre-existing platforms.
- Optimising data processing for performance and scalability.
You can bring your whole self to work. At Capgemini, equity, diversity and inclusion are part of everyday life, and will be part of your working reality. We have built an inclusive and welcoming environment for everyone.
Your Skills and Experience:
- Proficiency in data pipeline orchestration tools (e.g., Airflow, Prefect, Dagster).
- Extensive experience with Docker and Kubernetes.
- Proficiency in CI/CD principles and tools.
- Familiarity with open-source data tools (e.g., Spark, Kafka, PostgreSQL).
- A solid understanding of IaC concepts (e.g., Terraform, Ansible).
- Understanding of data architecture principles.
- Experience with monitoring and observability tools like Grafana and Prometheus.
To be successfully appointed to this role, it is a requirement to obtain Security Check (SC) clearance. To obtain SC clearance, the successful applicant must have resided continuously within the United Kingdom for the last 5 years, along with other criteria and requirements. Throughout the recruitment process, you will be asked questions about your security clearance eligibility such as, but not limited to, country of residence and nationality. Some posts are restricted to sole UK Nationals for security reasons; therefore, you may be asked about your citizenship in the application process.
You will be encouraged to have a positive work-life balance. Our hybrid-first way of working means we embed hybrid working in all that we do and make flexible working arrangements the day-to-day reality for our people. All UK employees are eligible to request flexible working arrangements. You will be empowered to explore, innovate, and progress. You will benefit from Capgemini's 'learning for life' mindset, meaning you will have countless training and development opportunities, from think tanks to hackathons, and access to 250,000 courses with numerous external certifications from AWS, Microsoft, Harvard ManageMentor, cybersecurity qualifications and much more.
Capgemini is a global business and technology transformation partner, helping organisations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With a strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fuelled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
DataOps Engineer employer: 55 Redefined Ltd
Contact Detail:
55 Redefined Ltd Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the DataOps Engineer role
✨Tip Number 1
Familiarise yourself with the specific tools mentioned in the job description, such as Airflow, Docker, and Kubernetes. Having hands-on experience or projects showcasing your skills with these technologies can set you apart from other candidates.
✨Tip Number 2
Network with current employees or professionals in the DataOps field through platforms like LinkedIn. Engaging in conversations about their experiences at Capgemini can provide valuable insights and potentially lead to referrals.
✨Tip Number 3
Stay updated on the latest trends and advancements in cloud data engineering and DevOps practices. Demonstrating your knowledge of emerging technologies during interviews can show your commitment to continuous learning and innovation.
✨Tip Number 4
Prepare for questions related to security clearance and residency requirements, as these are crucial for this role. Being ready to discuss your eligibility can help streamline the hiring process and demonstrate your understanding of the job's prerequisites.
We think you need these skills to ace the DataOps Engineer role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience and skills that align with the DataOps Engineer role. Focus on your proficiency in data pipeline orchestration tools, Docker, Kubernetes, and CI/CD principles.
Craft a Compelling Cover Letter: Write a cover letter that showcases your passion for data engineering and DevOps. Mention specific projects where you've designed or automated data pipelines, and how you can contribute to the company's goals.
Highlight Relevant Skills: In your application, emphasise your familiarity with open-source data tools like Spark and Kafka, as well as your understanding of Infrastructure as Code concepts. This will demonstrate your technical fit for the role.
Prepare for Security Clearance Questions: Be ready to answer questions regarding your residency and nationality, as obtaining Security Check clearance is a requirement for this position. Ensure you have all necessary information at hand to facilitate this process.
How to prepare for a job interview at 55 Redefined Ltd
✨Showcase Your Technical Skills
Be prepared to discuss your experience with data pipeline orchestration tools like Airflow, Prefect, or Dagster. Highlight specific projects where you've successfully implemented these tools, and be ready to explain the challenges you faced and how you overcame them.
✨Demonstrate Containerisation Knowledge
Since the role involves containerising applications using Docker and deploying them on Kubernetes, make sure to share your hands-on experience with these technologies. Discuss any relevant projects where you optimised performance and scalability in a containerised environment.
✨Understand CI/CD Principles
Familiarise yourself with Continuous Integration and Continuous Delivery (CI/CD) practices. Be ready to explain how you've implemented CI/CD pipelines in past roles, including the tools you used and the impact they had on project delivery.
✨Prepare for Security Clearance Questions
As obtaining Security Check (SC) clearance is a requirement, be prepared to answer questions about your residency and nationality. Ensure you understand the criteria for SC clearance and have your information ready to demonstrate your eligibility.