At a Glance
- Tasks: Lead the design and implementation of scalable data pipelines using Python and SQL.
- Company: Join Elysia - Battery Intelligence, a forward-thinking tech company focused on battery solutions.
- Benefits: Enjoy hybrid working, competitive salary, and opportunities for professional growth.
- Why this job: Be part of a mission-driven team that values innovation and collaboration in data engineering.
- Qualifications: Master’s degree or equivalent experience with 5+ years in cloud-based data engineering required.
- Other info: Mentorship opportunities available for junior engineers and a chance to work with cutting-edge technologies.
The predicted salary is between £43,200 and £72,000 per year.
Senior Data Engineer – Elysia – Battery Intelligence from Fortescue
Job Title: Senior Data Engineer
Reports To: Principal Data Engineer
Department: Digital – Elysia
Direct Reports: As required (not immediately)
Position Type: Permanent
Location: Kidlington (Oxford) or Central London
Onsite policy: Hybrid working offered, 3 days onsite / 2 days remote
Role Purpose: In this Senior Data Engineer role within Elysia Battery Intelligence, you will lead the design and implementation of scalable, production-grade data pipelines across a range of sources such as Automotive, Stationary Storage (ESS), Battery Testing Facilities, and R&D environments. You will take ownership of architectural decisions, define and enforce data modelling and engineering standards, and mentor junior engineers in best practices. Leveraging tools like AWS, Snowflake, Dagster, and Python, you’ll drive the delivery of automated, secure, and observable pipelines that support mission‑critical analytics and product features. This role is pivotal in aligning engineering workflows with scientific, regulatory, and business needs, ensuring high data fidelity, pipeline efficiency, and operational resilience.
Key Responsibilities
- Lead the design and implementation of robust data pipelines using Python and SQL.
- Architect scalable ingestion strategies for high‑volume telemetry and time‑series data.
- Implement robust data quality checks and monitoring systems to ensure the accuracy, consistency, and reliability of data.
- Define data modelling standards and enforce schema governance in database solutions.
- Collaborate with product, analytics, and cloud teams to define SLAs, metrics, and data contracts.
- Mentor and support junior data engineers via code reviews, pairing, and design sessions.
- Identify inefficient data processes, engineer solutions to improve operational efficiencies, performance, and scalability, and create accessible data models supporting analytics business functions.
- Actively participate in improving customer onboarding data pipelines and ensure data integrity and security.
- Collaborate with cross‑functional teams to understand data requirements and ensure data pipelines meet business needs.
- Contribute to roadmap planning, tool evaluation, and architectural decisions.
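To illustrate the kind of work the responsibilities above describe, here is a minimal, purely illustrative sketch of a Python + SQL pipeline with a basic data quality gate. It uses only the standard library (sqlite3); the table, column names, and validity thresholds are hypothetical examples, not details from the posting.

```python
import sqlite3

def load_telemetry(rows):
    """Ingest (cell_id, ts, voltage) rows, rejecting records that fail
    a simple quality check. Returns the connection and a rejection count."""
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE telemetry (cell_id TEXT, ts INTEGER, voltage REAL)"
    )
    # Data quality gate: drop rows with a missing or implausible voltage.
    clean = [r for r in rows if r[2] is not None and 0.0 < r[2] < 5.0]
    conn.executemany("INSERT INTO telemetry VALUES (?, ?, ?)", clean)
    conn.commit()
    return conn, len(rows) - len(clean)

conn, rejected = load_telemetry([
    ("cell-1", 1, 3.7),   # passes the check
    ("cell-1", 2, None),  # missing reading -> rejected
    ("cell-2", 1, 9.9),   # outside plausible range -> rejected
])
count = conn.execute("SELECT COUNT(*) FROM telemetry").fetchone()[0]
```

In a production setting this pattern would be scaled out with an orchestrator (such as the Dagster tooling the posting mentions) and a cloud warehouse rather than an in-memory database, with rejection counts feeding the monitoring systems described above.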
Qualifications & Experience
- Master’s degree in a relevant field (Engineering, Physics, Mathematics, Computer Science, or similar) or equivalent experience.
- 5+ years’ experience in data engineering on cloud‑based systems
- Strong expertise in Python and SQL for data engineering
- Strong skills in data modelling and ETL/ELT processes
- Experience with modern data warehouse platforms (e.g., Snowflake, BigQuery)
- Experience with RDBMS or time‑series databases
- Experience with streaming data sources such as Kafka, MQTT, or AWS Kinesis
- Experience with coding best practices and source code management
- Experience with data engineering and data science modules in the Python ecosystem
- Strong experience in version control frameworks (GitHub/GitLab) and CI/CD workflows
- Ability to communicate ideas and solutions to colleagues and customers
- Ability to stay up to date with industry trends and emerging technologies
- Ability to present solutions to management and stakeholders
- Ability to work both independently and collaboratively
Beneficial
- Experience using modern data orchestration tools (e.g. Dagster, Airflow, Prefect, AiiDA).
- Experience with AWS services (EC2, ECS, Lambda, Glue, Athena, DynamoDB)
- Experience in creating/managing containerized applications.
This job description is not exhaustive, and the job holder will be required, from time to time, to carry out tasks in addition to the above that are both reasonable and within their capabilities.
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer role
✨Tip Number 1
Familiarise yourself with the specific tools mentioned in the job description, such as AWS, Snowflake, and Dagster. Having hands-on experience or projects showcasing your skills with these technologies can set you apart from other candidates.
✨Tip Number 2
Network with current or former employees of Elysia or Fortescue. Engaging with them on platforms like LinkedIn can provide you with insider knowledge about the company culture and expectations, which can be invaluable during interviews.
✨Tip Number 3
Prepare to discuss your experience with data modelling and ETL/ELT processes in detail. Be ready to share specific examples of how you've improved data pipelines or ensured data quality in previous roles, as this aligns closely with the responsibilities of the position.
✨Tip Number 4
Showcase your mentoring experience. Since the role involves supporting junior engineers, highlighting any past experiences where you've guided others or led teams can demonstrate your leadership capabilities and fit for the role.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in data engineering, particularly with Python and SQL. Emphasise your expertise in cloud-based systems and any experience with tools like Snowflake or AWS.
Craft a Compelling Cover Letter: In your cover letter, explain why you're passionate about the role at Elysia and how your background aligns with their mission. Mention specific projects or achievements that demonstrate your ability to lead data pipeline design and implementation.
Showcase Your Technical Skills: Include a section in your application that lists your technical skills, especially those mentioned in the job description, such as data modelling, ETL processes, and experience with orchestration tools like Dagster or Airflow.
Highlight Mentorship Experience: If you have experience mentoring junior engineers, be sure to include this in your application. Discuss how you've supported others in best practices and contributed to team development.
How to prepare for a job interview at Elysia - Battery Intelligence from Fortescue
✨Showcase Your Technical Skills
Be prepared to discuss your experience with Python, SQL, and data engineering tools like Snowflake and Dagster. Bring examples of past projects where you designed and implemented data pipelines, highlighting your problem-solving skills and technical expertise.
✨Understand the Company’s Mission
Research Elysia - Battery Intelligence and understand their focus on battery technology and data solutions. Be ready to explain how your skills align with their goals and how you can contribute to their mission of improving data fidelity and operational resilience.
✨Prepare for Scenario-Based Questions
Expect questions that assess your ability to handle real-world data challenges. Think about scenarios where you've improved data processes or mentored junior engineers, and be ready to discuss your approach and the outcomes.
✨Demonstrate Collaboration Skills
Since the role involves working with cross-functional teams, be prepared to talk about your experience collaborating with product, analytics, and cloud teams. Highlight any successful projects where teamwork was key to achieving results.