Senior Data Engineer

Kidlington · Full-Time · £43,200 - £72,000 / year (est.) · Hybrid working

At a Glance

  • Tasks: Lead the design and implementation of scalable data pipelines using Python and SQL.
  • Company: Join Elysia - Battery Intelligence, a forward-thinking tech company focused on battery solutions.
  • Benefits: Enjoy hybrid working, competitive salary, and opportunities for professional growth.
  • Why this job: Be part of a mission-driven team that values innovation and collaboration in data engineering.
  • Qualifications: Master’s degree or equivalent experience with 5+ years in cloud-based data engineering required.
  • Other info: Mentorship opportunities available for junior engineers and a chance to work with cutting-edge technologies.

The predicted salary is between £43,200 and £72,000 per year.

Senior Data Engineer – Elysia – Battery Intelligence from Fortescue

Job Title: Senior Data Engineer

Reports To: Principal Data Engineer

Department: Digital – Elysia

Direct Reports: As required (not immediately)

Position Type: Permanent

Location: Kidlington (Oxford) or Central London

Onsite policy: Hybrid working offered, 3 days onsite / 2 days remote

Role Purpose: In this “Senior Data Engineer” role within the Elysia Battery Intelligence team, you will lead the design and implementation of scalable, production-grade data pipelines across a range of sources such as Automotive, Stationary Storage (ESS), Battery Testing Facilities, and R&D environments. You will take ownership of architectural decisions, define and enforce data modelling and engineering standards, and mentor junior engineers in best practices. Leveraging tools like AWS, Snowflake, Dagster, and Python, you’ll drive the delivery of automated, secure, and observable pipelines that support mission-critical analytics and product features. This role is pivotal in aligning engineering workflows with scientific, regulatory, and business needs, ensuring high data fidelity, pipeline efficiency, and operational resilience.

Key Responsibilities

  • Lead the design and implementation of robust data pipelines utilising Python and SQL (a brief illustrative sketch follows this list).
  • Architect scalable ingestion strategies for high‑volume telemetry and time‑series data.
  • Implement robust data quality checks and monitoring systems to ensure the accuracy, consistency, and reliability of data.
  • Define data modelling standards and enforce schema governance in database solutions.
  • Collaborate with product, analytics, and cloud teams to define SLAs, metrics, and data contracts.
  • Mentor and support junior data engineers via code reviews, pairing, and design sessions.
  • Identify inefficient data processes, engineer solutions to improve operational efficiency, performance, and scalability, and create accessible data models that support analytics and business functions.
  • Actively participate in improving customer onboarding data pipelines and ensure data integrity and security.
  • Collaborate with cross‑functional teams to understand data requirements and ensure data pipelines meet business needs.
  • Contribute to roadmap planning, tool evaluation, and architectural decisions.
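
For illustration only, and not part of the formal job description: below is a minimal sketch of the kind of pipeline the responsibilities above describe, written as Dagster assets in Python with a simple data quality gate. The data source, column names, and rejection threshold are assumptions invented for the example, not details of the actual role or Elysia's systems.

# Illustrative sketch only (hypothetical names and thresholds).
# Two Dagster assets: one ingests a small batch of battery telemetry,
# the other applies a basic completeness check before publishing.
from dagster import Definitions, asset
import pandas as pd

@asset
def battery_telemetry() -> pd.DataFrame:
    # In a real pipeline this would read from Snowflake, Kafka, or S3;
    # here we build a tiny in-memory frame so the sketch is runnable.
    return pd.DataFrame(
        {
            "pack_id": ["A1", "A1", "B2"],
            "timestamp": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-01"]),
            "voltage_v": [3.71, 3.68, None],
        }
    )

@asset
def validated_telemetry(battery_telemetry: pd.DataFrame) -> pd.DataFrame:
    # Simple quality gate: drop rows with missing voltage and fail the
    # run loudly if more than half of the batch is rejected.
    clean = battery_telemetry.dropna(subset=["voltage_v"])
    if len(clean) < 0.5 * len(battery_telemetry):
        raise ValueError("Too many rows failed the voltage completeness check")
    return clean

defs = Definitions(assets=[battery_telemetry, validated_telemetry])

In practice the ingestion asset would pull from the streaming or warehouse sources named in this posting, the checks would be far more extensive, and schedules, sensors, and alerting would sit around the assets; the sketch only shows the shape of an asset-based pipeline with an explicit quality step.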

Qualifications & Experience

  • Master’s degree in a relevant field (Engineering, Physics, Mathematics, Computer Science, or similar) or equivalent experience.
  • 5+ years’ experience in data engineering on cloud-based systems.
  • Strong expertise in Python and SQL for data engineering.
  • Strong skills in data modelling and ETL/ELT processes.
  • Experience with modern data warehouse platforms (e.g., Snowflake, BigQuery).
  • Experience with RDBMS or time-series databases.
  • Experience with streaming data sources such as Kafka, MQTT, or AWS Kinesis.
  • Experience with coding best practices and source code management.
  • Experience with data engineering and data science modules in the Python ecosystem.
  • Strong experience with version control platforms (GitHub/GitLab) and CI/CD workflows.
  • Ability to communicate ideas and solutions to colleagues and customers.
  • Ability to stay up to date with industry trends and emerging technologies.
  • Ability to present solutions to management and stakeholders.
  • Ability to work both independently and collaboratively.

Beneficial

  • Experience using modern data orchestration tools (e.g. Dagster, Airflow, Prefect, AiiDA).
  • Experience with AWS services (EC2, ECS, Lambda, Glue, Athena, DynamoDB)
  • Experience in creating/managing containerized applications.

This job description is not exhaustive, and the job holder will be required from time to time to carry out tasks in addition to the above that are both reasonable and within their capabilities.

Senior Data Engineer employer: Elysia - Battery Intelligence from Fortescue

Elysia - Battery Intelligence from Fortescue is an exceptional employer, offering a dynamic work culture that fosters innovation and collaboration in the heart of Kidlington or Central London. With a strong emphasis on employee growth, you will have the opportunity to mentor junior engineers while working with cutting-edge technologies like AWS and Snowflake, all within a hybrid work model that promotes work-life balance. Join us to be part of a mission-driven team dedicated to advancing battery intelligence and making a meaningful impact in the industry.

Contact Detail:

Elysia - Battery Intelligence from Fortescue Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Senior Data Engineer role

✨Tip Number 1

Familiarise yourself with the specific tools mentioned in the job description, such as AWS, Snowflake, and Dagster. Having hands-on experience or projects showcasing your skills with these technologies can set you apart from other candidates.

✨Tip Number 2

Network with current or former employees of Elysia or Fortescue. Engaging with them on platforms like LinkedIn can provide you with insider knowledge about the company culture and expectations, which can be invaluable during interviews.

✨Tip Number 3

Prepare to discuss your experience with data modelling and ETL/ELT processes in detail. Be ready to share specific examples of how you've improved data pipelines or ensured data quality in previous roles, as this aligns closely with the responsibilities of the position.

✨Tip Number 4

Showcase your mentoring experience. Since the role involves supporting junior engineers, highlighting any past experiences where you've guided others or led teams can demonstrate your leadership capabilities and fit for the role.

We think you need these skills to ace the Senior Data Engineer role

Python Programming
SQL Proficiency
Data Pipeline Design
ETL/ELT Processes
Data Modelling
Cloud-Based Data Engineering
Experience with Snowflake or BigQuery
Streaming Data Handling (Kafka, MQTT, AWS Kinesis)
Version Control (GitHub/GitLab)
CI/CD Workflows
Data Quality Assurance
Schema Governance
Collaboration with Cross-Functional Teams
Mentoring Junior Engineers
Problem-Solving Skills
Knowledge of Data Orchestration Tools (Dagster, Airflow, Prefect)
Familiarity with AWS Services (EC2, ECS, Lambda, Glue, Athena, DynamoDB)
Containerization Experience

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights relevant experience in data engineering, particularly with Python and SQL. Emphasise your expertise in cloud-based systems and any experience with tools like Snowflake or AWS.

Craft a Compelling Cover Letter: In your cover letter, explain why you're passionate about the role at Elysia and how your background aligns with their mission. Mention specific projects or achievements that demonstrate your ability to lead data pipeline design and implementation.

Showcase Your Technical Skills: Include a section in your application that lists your technical skills, especially those mentioned in the job description, such as data modelling, ETL processes, and experience with orchestration tools like Dagster or Airflow.

Highlight Mentorship Experience: If you have experience mentoring junior engineers, be sure to include this in your application. Discuss how you've supported others in best practices and contributed to team development.

How to prepare for a job interview at Elysia - Battery Intelligence from Fortescue

✨Showcase Your Technical Skills

Be prepared to discuss your experience with Python, SQL, and data engineering tools like Snowflake and Dagster. Bring examples of past projects where you designed and implemented data pipelines, highlighting your problem-solving skills and technical expertise.

✨Understand the Company’s Mission

Research Elysia - Battery Intelligence and understand their focus on battery technology and data solutions. Be ready to explain how your skills align with their goals and how you can contribute to their mission of improving data fidelity and operational resilience.

✨Prepare for Scenario-Based Questions

Expect questions that assess your ability to handle real-world data challenges. Think about scenarios where you've improved data processes or mentored junior engineers, and be ready to discuss your approach and the outcomes.

✨Demonstrate Collaboration Skills

Since the role involves working with cross-functional teams, be prepared to talk about your experience collaborating with product, analytics, and cloud teams. Highlight any successful projects where teamwork was key to achieving results.
