At a Glance
- Tasks: Design and maintain data pipelines using Hadoop technologies on an on-premises Operational Data Platform, with hybrid working.
- Company: Join a forward-thinking company focused on enhancing operational data platforms.
- Benefits: Enjoy hybrid working, competitive salary, and opportunities for professional growth.
- Why this job: Be part of a dynamic team that values innovation and collaboration in data engineering.
- Qualifications: 5+ years in Hadoop and data engineering with strong Python skills required.
- Other info: Work in a supportive environment with a focus on security and compliance.
The predicted salary is between £48,000 and £72,000 per year.
Role Title: Hadoop Engineer / ODP Platform
Location: Birmingham / Sheffield – Hybrid working with 3 days onsite per week
End Date: 28/11/2025
Role Overview:
We are seeking a highly skilled Hadoop Engineer to support and enhance our Operational Data Platform (ODP) deployed in an on-premises environment.
The ideal candidate will have extensive experience in the Hadoop ecosystem, strong programming skills, and a solid understanding of infrastructure-level data analytics. This role focuses on building and maintaining scalable, secure, and high-performance data pipelines within enterprise-grade on-prem systems.
Key Responsibilities:
- Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure.
- Build and optimise workflows using Apache Airflow and Spark Streaming for real-time data processing (see the Airflow sketch after this list).
- Develop robust data engineering solutions using Python for automation and transformation.
- Collaborate with infrastructure and analytics teams to support operational data use cases.
- Monitor and troubleshoot data jobs, ensuring reliability and performance across the platform.
- Ensure compliance with enterprise security and data governance standards.
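To give a feel for the orchestration work described above, here is a minimal sketch of an Airflow-managed pipeline in Python. It assumes Airflow 2.4 or newer; the DAG name, task names, and the extract/transform callables are hypothetical placeholders, not details taken from the platform described in this advert.

```python
# Minimal, illustrative Airflow DAG; dag_id, task names, and the
# extract/transform callables are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_hdfs(**context):
    # Placeholder: pull operational data from a source system and land it in HDFS.
    pass


def transform_with_spark(**context):
    # Placeholder: trigger a Spark job that cleans and aggregates the landed data.
    pass


with DAG(
    dag_id="odp_daily_pipeline",            # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                      # batch cadence; streaming runs separately
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_to_hdfs", python_callable=extract_to_hdfs)
    transform = PythonOperator(task_id="transform_with_spark", python_callable=transform_with_spark)

    # Run the transform only after extraction succeeds.
    extract >> transform
```

In practice the placeholder callables would be replaced by the platform's own extraction and Spark-submission logic, but the DAG structure, retries, and task ordering shown here are the core of day-to-day pipeline maintenance in this kind of role.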
Required Skills & Experience:
- Minimum 5 years of experience in Hadoop and data engineering.
- Strong hands-on experience with Python, Apache Airflow, and Spark Streaming (see the streaming sketch after this list).
- Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments.
- Exposure to data analytics, preferably involving infrastructure or operational data.
- Experience working with Linux systems, shell scripting, and enterprise-grade deployment tools.
- Familiarity with monitoring and logging tools relevant to on-prem setups.
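As a rough illustration of the Spark Streaming experience listed above, the sketch below uses PySpark Structured Streaming to land a stream of events in HDFS. The Kafka brokers, topic name, and HDFS paths are assumptions for the example only, and the spark-sql-kafka connector would need to be on the classpath; none of these details come from the job description.

```python
# Illustrative PySpark Structured Streaming job; broker addresses, topic name,
# and HDFS paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("odp-events-stream")  # hypothetical application name
    .getOrCreate()
)

# Read a stream of operational events from Kafka (assumed source).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")  # placeholder brokers
    .option("subscribe", "odp-events")                               # placeholder topic
    .load()
)

# Minimal transformation: decode the payload and stamp the processing time.
decoded = events.select(
    F.col("value").cast("string").alias("payload"),
    F.current_timestamp().alias("processed_at"),
)

# Write micro-batches to HDFS as Parquet, with a checkpoint for fault tolerance.
query = (
    decoded.writeStream
    .format("parquet")
    .option("path", "hdfs:///data/odp/events")                 # placeholder output path
    .option("checkpointLocation", "hdfs:///checkpoints/odp")   # placeholder checkpoint
    .outputMode("append")
    .start()
)

query.awaitTermination()
```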
Preferred Qualifications:
- Experience with enterprise ODP platforms or similar large-scale data systems.
- Knowledge of configuration management tools (e.g., Ansible, Puppet) and CI/CD in on-prem environments.
- Understanding of network and storage architecture in data centres.
- Familiarity with data security, compliance, and audit requirements in regulated industries.
Employer: Experis
Contact Details:
Experis Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Hadoop Engineer - ODP Platform role
✨Tip Number 1
Familiarise yourself with the specific Hadoop technologies mentioned in the job description, such as HDFS, Hive, and Spark Streaming. Being able to discuss your hands-on experience with these tools during an interview will demonstrate your expertise and suitability for the role.
✨Tip Number 2
Showcase your problem-solving skills by preparing examples of how you've monitored and troubleshot data jobs in previous roles. This will highlight your ability to ensure reliability and performance, which is crucial for the position.
✨Tip Number 3
Network with professionals in the data engineering field, especially those who have experience with on-premises environments. Engaging with industry peers can provide insights and potentially lead to referrals that could strengthen your application.
✨Tip Number 4
Stay updated on the latest trends and best practices in data governance and security, particularly in regulated industries. Being knowledgeable about compliance standards will set you apart as a candidate who understands the importance of data integrity.
We think you need these skills to ace your Hadoop Engineer - ODP Platform application
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Hadoop, Python, and data engineering. Use specific examples from your past roles that demonstrate your skills in building data pipelines and working with the Hadoop ecosystem.
Craft a Strong Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention your relevant experience with Apache Airflow and Spark Streaming, and how you can contribute to enhancing their Operational Data Platform.
Showcase Relevant Projects: If you have worked on projects involving Hadoop or data engineering, include them in your application. Describe your role, the technologies used, and the impact of your work on the project's success.
Highlight Compliance Knowledge: Since the role requires understanding of data security and compliance, mention any experience you have in these areas. This could include familiarity with regulations or standards you've worked with in previous positions.
How to prepare for a job interview at Experis
✨Showcase Your Technical Skills
Be prepared to discuss your hands-on experience with Hadoop technologies, especially HDFS, Hive, and Spark Streaming. Bring examples of past projects where you designed and maintained data pipelines, as this will demonstrate your expertise in the field.
✨Demonstrate Problem-Solving Abilities
Expect to face scenario-based questions that assess your troubleshooting skills. Prepare to explain how you've monitored and optimised data jobs in previous roles, highlighting your ability to ensure reliability and performance.
✨Familiarise Yourself with the Company’s Data Environment
Research the company's operational data platform and its specific requirements. Understanding their infrastructure and data governance standards will help you tailor your responses and show that you're genuinely interested in the role.
✨Prepare for Collaboration Questions
Since the role involves working closely with infrastructure and analytics teams, be ready to discuss your experience in collaborative environments. Share examples of how you've successfully worked with cross-functional teams to achieve common goals.