Hadoop Engineer - ODP Platform in Aberdeen

Aberdeen Full-Time 48000 - 72000 € / year (est.) No home office possible
Experis

At a Glance

  • Tasks: Design and maintain data pipelines using Hadoop technologies in a hybrid work environment.
  • Company: Join a forward-thinking company focused on enhancing operational data platforms.
  • Benefits: Enjoy hybrid working, competitive salary, and opportunities for professional growth.
  • Other info: Work in a collaborative environment with a focus on security and compliance.
  • Why this job: Be part of a dynamic team driving innovation in data engineering and analytics.
  • Qualifications: 5+ years in Hadoop and data engineering with strong Python skills required.

The predicted salary is between €48,000 and €72,000 per year.

Role Title: Hadoop Engineer / ODP Platform
Location: Birmingham / Sheffield - Hybrid working with 3 days onsite per week
End Date: 28/11/2025
Role Overview:
We are seeking a highly skilled Hadoop Engineer to support and enhance our Operational Data Platform (ODP) deployed in an on-premises environment.
The ideal candidate will have extensive experience in the Hadoop ecosystem, strong programming skills, and a solid understanding of infrastructure-level data analytics. This role focuses on building and maintaining scalable, secure, and high-performance data pipelines within enterprise-grade on-prem systems.
Key Responsibilities:

  • Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure.
  • Build and optimise workflows using Apache Airflow and Spark Streaming for real-time data processing.
  • Develop robust data engineering solutions using Python for automation and transformation.
  • Collaborate with infrastructure and analytics teams to support operational data use cases.
  • Monitor and troubleshoot data jobs, ensuring reliability and performance across the platform.
  • Ensure compliance with enterprise security and data governance standards.
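
The monitoring and reliability responsibilities above are about keeping pipeline jobs running despite transient faults. As a purely illustrative sketch (the platform's code is not public, and every name here is hypothetical), a Python retry wrapper with exponential backoff, of the kind such a role might write around a flaky extract step, could look like:

```python
import logging
import time

def run_with_retries(task, max_attempts=3, base_delay=1.0):
    """Run a pipeline task, retrying with exponential backoff on failure."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception as exc:
            logging.warning("attempt %d/%d failed: %s", attempt, max_attempts, exc)
            if attempt == max_attempts:
                raise
            # Back off 1x, 2x, 4x ... the base delay between attempts
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical flaky extract step: fails twice, then succeeds
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient HDFS read error")
    return ["record-1", "record-2"]

records = run_with_retries(flaky_extract, max_attempts=5, base_delay=0.01)
print(records)  # ['record-1', 'record-2'] after two retried failures
```

In a real deployment this logic typically lives in the orchestrator (e.g. Airflow task `retries` and `retry_delay` settings) rather than hand-rolled code, but the underlying pattern is the same.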


Required Skills & Experience:

  • Minimum 5 years of experience in Hadoop and data engineering.
  • Strong hands-on experience with Python, Apache Airflow, and Spark Streaming.
  • Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments.
  • Exposure to data analytics, preferably involving infrastructure or operational data.
  • Experience working with Linux systems, shell scripting, and enterprise-grade deployment tools.
  • Familiarity with monitoring and logging tools relevant to on-prem setups.


Preferred Qualifications:

  • Experience with enterprise ODP platforms or similar large-scale data systems.
  • Knowledge of configuration management tools (e.g., Ansible, Puppet) and CI/CD in on-prem environments.
  • Understanding of network and storage architecture in data centers.
  • Familiarity with data security, compliance, and audit requirements in regulated industries.

Hadoop Engineer - ODP Platform in Aberdeen employer: Experis

Join a forward-thinking company that values innovation and collaboration, offering a hybrid working model in the vibrant cities of Birmingham and Sheffield. As a Hadoop Engineer, you will thrive in a supportive work culture that prioritises employee growth through continuous learning opportunities and cutting-edge projects. Enjoy the unique advantage of working on enterprise-grade data solutions while being part of a team that champions diversity and inclusion.

Experis

Contact Detail:

Experis Recruiting Team

StudySmarter Expert Advice🤫

We think this is how you could land the Hadoop Engineer - ODP Platform role in Aberdeen

Tip Number 1

Familiarise yourself with the specific Hadoop technologies mentioned in the job description, such as HDFS, Hive, and Spark Streaming. Being able to discuss your hands-on experience with these tools during an interview will demonstrate your suitability for the role.

Tip Number 2

Showcase your problem-solving skills by preparing examples of how you've monitored and troubleshot data jobs in previous roles. This will highlight your ability to ensure reliability and performance, which is crucial for the position.

Tip Number 3

Network with professionals in the data engineering field, especially those who have experience with on-premises environments. Engaging in discussions about best practices and challenges can provide valuable insights that you can bring up during your interview.

Tip Number 4

Research the company’s current projects and initiatives related to their Operational Data Platform. Being knowledgeable about their work will allow you to tailor your responses and show genuine interest in contributing to their goals.

We think you need these skills to ace the Hadoop Engineer - ODP Platform role in Aberdeen

Hadoop Ecosystem Expertise
Python Programming
Apache Airflow
Spark Streaming
Data Pipeline Development
HDFS Knowledge
Hive Proficiency

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights your experience with Hadoop, Python, and data engineering. Use specific examples from your past roles that demonstrate your skills in building data pipelines and working with on-premises environments.

Craft a Strong Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention your relevant experience with Hadoop technologies and how you can contribute to enhancing their Operational Data Platform.

Showcase Relevant Projects: If you have worked on projects involving Apache Airflow, Spark Streaming, or similar technologies, be sure to include these in your application. Describe your role and the impact of your contributions.

Highlight Compliance Knowledge: Since the role requires understanding of data security and compliance, mention any experience you have in these areas. This could include familiarity with regulations or standards you've worked with in previous positions.

How to prepare for a job interview at Experis

Showcase Your Technical Skills

Be prepared to discuss your hands-on experience with Hadoop technologies, especially HDFS, Hive, and Spark Streaming. Bring examples of data pipelines you've built or optimised, and be ready to explain the challenges you faced and how you overcame them.

Demonstrate Problem-Solving Abilities

Expect technical questions that assess your troubleshooting skills. Prepare to walk through scenarios where you've monitored and resolved data job issues, ensuring reliability and performance. Highlight your analytical thinking and how you approach problem-solving.

Familiarise Yourself with the Company’s Data Environment

Research the company's operational data platform and understand its architecture. Knowing how they utilise Hadoop and related technologies will help you tailor your responses and show genuine interest in their specific environment.

Emphasise Collaboration and Communication

Since the role involves working with various teams, be ready to discuss your experience collaborating with infrastructure and analytics teams. Share examples of how you’ve communicated complex technical concepts to non-technical stakeholders.