Hadoop Engineer - ODP Platform in Edinburgh

Edinburgh · Full-Time · €48,000 - €72,000 / year (est.) · No home office possible
Experis

At a Glance

  • Tasks: Design and maintain data pipelines using Hadoop technologies in a hybrid work environment.
  • Company: Join a forward-thinking company focused on enhancing operational data platforms.
  • Benefits: Enjoy hybrid working, competitive salary, and opportunities for professional growth.
  • Other info: Work in a collaborative environment with a focus on security and compliance.
  • Why this job: Be part of a dynamic team driving innovation in data engineering and analytics.
  • Qualifications: 5+ years in Hadoop and data engineering with strong Python skills required.

The predicted salary is between €48,000 and €72,000 per year.

Role Title: Hadoop Engineer / ODP Platform
Location: Birmingham / Sheffield - Hybrid working with 3 days onsite per week
End Date: 28/11/2025
Role Overview:
We are seeking a highly skilled Hadoop Engineer to support and enhance our Operational Data Platform (ODP) deployed in an on-premises environment.
The ideal candidate will have extensive experience in the Hadoop ecosystem, strong programming skills, and a solid understanding of infrastructure-level data analytics. This role focuses on building and maintaining scalable, secure, and high-performance data pipelines within enterprise-grade on-prem systems.
Key Responsibilities:

  • Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure.
  • Build and optimise workflows using Apache Airflow and Spark Streaming for real-time data processing.
  • Develop robust data engineering solutions using Python for automation and transformation.
  • Collaborate with infrastructure and analytics teams to support operational data use cases.
  • Monitor and troubleshoot data jobs, ensuring reliability and performance across the platform.
  • Ensure compliance with enterprise security and data governance standards.

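To illustrate the kind of work the responsibilities above describe, here is a minimal, hypothetical sketch of a Python transformation step. In a real deployment a function like this would be wired into an Airflow DAG (for example via a `PythonOperator`) and read from HDFS or Hive; the function name, record format, and sample data below are all illustrative assumptions, and the sketch is kept standalone with no Airflow or Spark dependencies.

```python
# Hypothetical sketch of a transformation step such a pipeline might run.
# Record format ('host,status,latency_ms') and names are assumptions for
# illustration, not taken from any real ODP schema.
from collections import defaultdict


def aggregate_status_counts(records):
    """Count records per (host, status), skipping malformed lines."""
    counts = defaultdict(int)
    for line in records:
        parts = line.strip().split(",")
        if len(parts) != 3:
            continue  # malformed record: drop it, as a real job might quarantine it
        host, status, _latency = parts
        counts[(host, status)] += 1
    return dict(counts)


sample = [
    "node01,OK,12",
    "node01,ERROR,340",
    "node02,OK,8",
    "bad line",
    "node01,OK,15",
]
print(aggregate_status_counts(sample))
# {('node01', 'OK'): 2, ('node01', 'ERROR'): 1, ('node02', 'OK'): 1}
```

The same shape of logic would apply whether the job is a batch Airflow task or a Spark Streaming micro-batch: parse, validate, aggregate, and let the scheduler handle retries and monitoring.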

Required Skills & Experience:

  • Minimum 5 years of experience in Hadoop and data engineering.
  • Strong hands-on experience with Python, Apache Airflow, and Spark Streaming.
  • Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments.
  • Exposure to data analytics, preferably involving infrastructure or operational data.
  • Experience working with Linux systems, shell scripting, and enterprise-grade deployment tools.
  • Familiarity with monitoring and logging tools relevant to on-prem setups.


Preferred Qualifications:

  • Experience with enterprise ODP platforms or similar large-scale data systems.
  • Knowledge of configuration management tools (e.g., Ansible, Puppet) and CI/CD in on-prem environments.
  • Understanding of network and storage architecture in data centres.
  • Familiarity with data security, compliance, and audit requirements in regulated industries.


Hadoop Engineer - ODP Platform in Edinburgh employer: Experis

Join a forward-thinking company that values innovation and collaboration, offering a dynamic work culture in Birmingham or Sheffield with hybrid working options. As a Hadoop Engineer, you will have the opportunity to work on cutting-edge data engineering projects while benefiting from professional development programmes and a supportive environment that encourages growth and creativity. Enjoy competitive benefits and the chance to make a meaningful impact within a team dedicated to excellence in data analytics.

Experis

Contact Detail:

Experis Recruiting Team

StudySmarter Expert Advice🤫

We think this is how you could land the Hadoop Engineer - ODP Platform role in Edinburgh

Tip Number 1

Familiarise yourself with the specific Hadoop technologies mentioned in the job description, such as HDFS, Hive, and Spark Streaming. Being able to discuss your hands-on experience with these tools during an interview will demonstrate your suitability for the role.

Tip Number 2

Showcase your problem-solving skills by preparing examples of how you've monitored and troubleshot data jobs in previous roles. This will highlight your ability to ensure reliability and performance, which is crucial for the position.

Tip Number 3

Network with professionals in the data engineering field, especially those who have experience with on-premises environments. Engaging with industry peers can provide insights and potentially lead to referrals that could strengthen your application.

Tip Number 4

Stay updated on the latest trends and best practices in data governance and security, particularly in regulated industries. Being knowledgeable about compliance standards will set you apart as a candidate who understands the importance of data integrity.

We think you need these skills to ace the Hadoop Engineer - ODP Platform role in Edinburgh

Hadoop Ecosystem Expertise
Python Programming
Apache Airflow
Spark Streaming
Data Pipeline Development
HDFS Knowledge
Hive Proficiency

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights your experience with Hadoop, Python, and data engineering. Use specific examples from your past roles that demonstrate your skills in building data pipelines and working with Apache Airflow and Spark Streaming.

Craft a Compelling Cover Letter: In your cover letter, explain why you are passionate about the role of Hadoop Engineer and how your background aligns with the responsibilities outlined in the job description. Mention your experience with on-premises environments and any relevant projects you've worked on.

Showcase Relevant Projects: If you have worked on specific projects involving Hadoop or data engineering, include these in your application. Describe your role, the technologies used, and the outcomes achieved to give the hiring team a clear picture of your capabilities.

Highlight Compliance Knowledge: Since the role requires an understanding of data security and compliance, make sure to mention any experience you have in these areas. This could include familiarity with regulations or standards you've worked with in previous positions.

How to prepare for a job interview at Experis

Showcase Your Technical Skills

Be prepared to discuss your hands-on experience with Hadoop technologies, especially HDFS, Hive, and Spark Streaming. Bring examples of data pipelines you've built or optimised, and be ready to explain the challenges you faced and how you overcame them.

Demonstrate Problem-Solving Abilities

Expect technical questions that assess your troubleshooting skills. Prepare to walk through scenarios where you've monitored and resolved data job issues, ensuring reliability and performance. Highlight your analytical thinking and how you approach problem-solving.

Understand the Company’s Data Needs

Research the company’s operational data use cases and be ready to discuss how your experience aligns with their needs. Show that you understand the importance of compliance with security and data governance standards in their environment.

Prepare for Collaboration Questions

Since this role involves working closely with infrastructure and analytics teams, be ready to discuss your experience collaborating on projects. Share examples of how you’ve effectively communicated and worked with cross-functional teams to achieve common goals.