At a Glance
- Tasks: Design and maintain data pipelines using Hadoop technologies in a hybrid work environment.
- Company: Join a forward-thinking company focused on enhancing operational data platforms.
- Benefits: Enjoy hybrid working, competitive salary, and opportunities for professional growth.
- Other info: Work in a collaborative environment with a focus on security and compliance.
- Why this job: Be part of a dynamic team driving innovation in data engineering and analytics.
- Qualifications: 5+ years in Hadoop and data engineering with strong Python skills required.
The predicted salary is between €48,000 and €72,000 per year.
Role Title: Hadoop Engineer / ODP Platform
Location: Birmingham / Sheffield - Hybrid working with 3 days onsite per week
End Date: 28/11/2025
Role Overview:
We are seeking a highly skilled Hadoop Engineer to support and enhance our Operational Data Platform (ODP) deployed in an on-premises environment.
The ideal candidate will have extensive experience in the Hadoop ecosystem, strong programming skills, and a solid understanding of infrastructure-level data analytics. This role focuses on building and maintaining scalable, secure, and high-performance data pipelines within enterprise-grade on-prem systems.
Key Responsibilities:
- Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure.
- Build and optimise workflows using Apache Airflow and Spark Streaming for real-time data processing.
- Develop robust data engineering solutions using Python for automation and transformation.
- Collaborate with infrastructure and analytics teams to support operational data use cases.
- Monitor and troubleshoot data jobs, ensuring reliability and performance across the platform.
- Ensure compliance with enterprise security and data governance standards.
Required Skills & Experience:
- Minimum 5 years of experience in Hadoop and data engineering.
- Strong hands-on experience with Python, Apache Airflow, and Spark Streaming.
- Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments.
- Exposure to data analytics, preferably involving infrastructure or operational data.
- Experience working with Linux systems, shell scripting, and enterprise-grade deployment tools.
- Familiarity with monitoring and logging tools relevant to on-prem setups.
Preferred Qualifications:
- Experience with enterprise ODP platforms or similar large-scale data systems.
- Knowledge of configuration management tools (e.g., Ansible, Puppet) and CI/CD in on-prem environments.
- Understanding of network and storage architecture in data centers.
- Familiarity with data security, compliance, and audit requirements in regulated industries.
Hadoop Engineer - ODP Platform in Coventry (employer: Experis)
Join a forward-thinking company that values innovation and collaboration, offering a dynamic work culture in Birmingham or Sheffield with hybrid working options. As a Hadoop Engineer, you will have the opportunity to work on cutting-edge data solutions while benefiting from professional development programmes and a supportive environment that encourages growth and creativity. Enjoy competitive benefits and the chance to make a meaningful impact within a team dedicated to excellence in data engineering.
StudySmarter Expert Advice 🤫
We think this is how you could land the Hadoop Engineer - ODP Platform role in Coventry
✨ Tip Number 1
Make sure to showcase your hands-on experience with Hadoop technologies during networking events or meetups. Engaging with professionals in the field can help you learn about potential job openings and get insider tips on what employers are looking for.
✨ Tip Number 2
Join online forums or communities focused on Hadoop and data engineering. Participating in discussions can not only enhance your knowledge but also connect you with industry experts who might refer you to job opportunities at companies like us.
✨ Tip Number 3
Consider contributing to open-source projects related to Hadoop or data engineering. This not only demonstrates your skills but also helps you build a portfolio that can impress potential employers when you apply through our website.
✨ Tip Number 4
Stay updated on the latest trends and technologies in the Hadoop ecosystem. Following relevant blogs, podcasts, or webinars can give you insights that you can discuss during interviews, showing your passion and commitment to the field.
We think you need these skills to ace the Hadoop Engineer - ODP Platform role in Coventry
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Hadoop, Python, and data engineering. Use specific examples from your past roles that demonstrate your skills in building data pipelines and working with the Hadoop ecosystem.
Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention your relevant experience and how it aligns with the responsibilities outlined in the job description, particularly your hands-on experience with Apache Airflow and Spark Streaming.
Showcase Relevant Projects: If you have worked on projects involving Hadoop or data engineering, include them in your application. Describe your role, the technologies used, and the impact of your work to demonstrate your capability in handling similar tasks.
Highlight Compliance Knowledge: Since the role requires understanding of data security and compliance, mention any relevant experience you have in these areas. This could include familiarity with regulations or standards you've worked with in previous positions.
How to prepare for a job interview at Experis
✨ Showcase Your Technical Skills
Be prepared to discuss your hands-on experience with Hadoop technologies, especially HDFS, Hive, and Spark Streaming. Bring examples of past projects where you designed and maintained data pipelines, as this will demonstrate your expertise in the field.
✨ Demonstrate Problem-Solving Abilities
Expect to face technical scenarios or case studies during the interview. Practice explaining how you would troubleshoot data jobs and ensure performance across the platform. Highlight your analytical thinking and how you've resolved issues in previous roles.
✨ Familiarise Yourself with the Company's Data Environment
Research the company's operational data platform and understand its architecture. Knowing how they utilise Hadoop and related technologies will help you tailor your responses and show genuine interest in their specific challenges and goals.
✨ Prepare for Collaboration Questions
Since the role involves working closely with infrastructure and analytics teams, be ready to discuss your experience in collaborative environments. Share examples of how you've worked with cross-functional teams to achieve common objectives, particularly in data engineering projects.