At a Glance
- Tasks: Design and maintain a modern Data Lakehouse while developing scalable data pipelines.
- Company: Join a dynamic, globally-minded tech team with offices in the UK, US, and Australia.
- Benefits: Enjoy flexible working, global impact, and a commitment to diversity and inclusion.
- Other info: Hybrid working with opportunities for career growth and a supportive team environment.
- Why this job: Make a real difference in data solutions and support analytics on a global scale.
- Qualifications: Experience in data engineering, ETL/ELT development, and Azure Data Platform expertise.
The predicted salary is between £50,000 and £65,000 per year.
In this role, you are part of the Data Engineering team, helping design, build, and maintain a modern Data Lakehouse and its associated data solutions. This is a hands‑on position where you directly contribute to both project delivery and day‑to‑day operations. You develop scalable, high‑performance data pipelines and ETL/ELT processes that integrate and transform large volumes of structured and unstructured data, ensuring solutions meet business needs and support analytics and reporting. You also play an important role in maintaining high development standards by contributing to code quality, testing, documentation, and governance. You work closely with stakeholders, analysts, and cross‑functional teams to understand requirements and deliver effective data solutions. Alongside this, you support Agile ways of working—managing sprints, backlogs, and incidents—while contributing to the ongoing development of the Azure‑based data platform and supporting other team members to deliver consistent, high‑quality results.
Please note this role is a 6‑month FTC with the potential for extension.
Required Skills and Experience
- Data engineering experience in large‑scale, high‑volume environments
- ETL/ELT Development – strong experience building scalable data pipelines, ideally metadata‑driven
- Azure Data Platform expertise with PySpark development, Azure Data Factory, Synapse, and Data Lake
- Knowledge of Medallion Architecture
- Advanced T‑SQL skills and solid experience in data modelling (3NF, dimensional)
- Experience with SCRUM, CI/CD, Git, and Azure DevOps for delivery and releases
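The list above mentions metadata‑driven ETL/ELT pipelines and Medallion Architecture (bronze → silver → gold layers). As a rough illustration of the metadata‑driven idea, the sketch below declares pipeline steps as configuration data rather than hard‑coded logic; all table names, columns, and transform rules here are hypothetical, and a real implementation would use PySpark DataFrames rather than plain Python lists.

```python
# Minimal sketch of a metadata-driven pipeline: each step is declared as
# data (the "metadata"), so adding a new feed means adding a config entry,
# not writing new code. Table and column names are made up for illustration.

def clean_text(rows, column):
    """Trim whitespace in the given column (a bronze -> silver style cleanup)."""
    return [{**r, column: r[column].strip()} for r in rows]

def aggregate_count(rows, key):
    """Count rows per key value (a silver -> gold style summarisation)."""
    counts = {}
    for r in rows:
        counts[r[key]] = counts.get(r[key], 0) + 1
    return [{key: k, "count": v} for k, v in counts.items()]

# The metadata: each entry names a source table, a target table, and a transform.
PIPELINE_CONFIG = [
    {"source": "bronze_customers", "target": "silver_customers",
     "transform": clean_text, "args": {"column": "name"}},
    {"source": "silver_customers", "target": "gold_customer_counts",
     "transform": aggregate_count, "args": {"key": "region"}},
]

def run_pipeline(tables, config):
    """Apply each configured step in order, writing results back into `tables`."""
    for step in config:
        rows = tables[step["source"]]
        tables[step["target"]] = step["transform"](rows, **step["args"])
    return tables

tables = {"bronze_customers": [
    {"name": " Ada ", "region": "UK"},
    {"name": "Grace", "region": "US"},
    {"name": "Alan ", "region": "UK"},
]}
result = run_pipeline(tables, PIPELINE_CONFIG)
print(result["gold_customer_counts"])
```

The same pattern scales up in Azure Data Factory or Synapse, where a control table of source/target/transform entries drives parameterised pipelines.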
- Global Impact: With offices in the UK, US, and Australia, and plans for further expansion, you'll be part of a dynamic, globally‑minded team, with opportunities to explore new markets and make a difference on a global scale.
- Flexible Working: Embrace the freedom to work from anywhere in the world for up to 30 days a year. We prioritise work‑life balance, recognising that your well‑being matters.
- Commitment to Diversity and Inclusion: We celebrate our diverse culture and value individuals irrespective of background, disability, religion, gender identity, sexuality, or ethnicity. Join a team where diversity is not just welcomed but celebrated as a key driver of growth and innovation.
Hybrid working typically means two days in the office location listed on this advert and three days working at home each week. Some occasional travel to our other offices may be required.
Senior Data Engineer - 6 month FTC
Employer: SmartestEnergy Business
Contact Detail:
SmartestEnergy Business Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer - 6 month FTC role
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field and let them know you're on the lookout for opportunities. A friendly chat can lead to referrals or insider info about job openings.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your best data pipelines and projects. This is your chance to demonstrate your expertise in ETL/ELT processes and Azure Data Platform—make it shine!
✨Tip Number 3
Prepare for interviews by brushing up on Agile methodologies and your experience with SCRUM. Be ready to discuss how you've managed sprints and backlogs, as this will show you’re a great fit for our team.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who take that extra step to connect with us directly.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experience mentioned in the job description. Highlight your data engineering experience, especially with Azure and ETL/ELT processes, to show us you’re the right fit for the role.
Craft a Compelling Cover Letter: Use your cover letter to tell us why you’re passionate about data engineering and how your background aligns with our needs. Be specific about your experience with scalable data pipelines and working in Agile environments.
Showcase Your Projects: If you've worked on relevant projects, don’t hesitate to mention them! We love seeing real examples of your work, especially if they involve high-volume data environments or Azure technologies.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for this exciting opportunity with SmartestEnergy Business!
How to prepare for a job interview at SmartestEnergy Business
✨Know Your Data Stuff
Make sure you brush up on your data engineering skills, especially around ETL/ELT processes and Azure Data Platform tools like PySpark and Data Factory. Be ready to discuss specific projects where you've built scalable data pipelines and how you tackled challenges in large-scale environments.
✨Showcase Your Agile Experience
Since this role involves Agile methodologies, be prepared to talk about your experience with SCRUM, managing sprints, and using tools like Azure DevOps. Share examples of how you've contributed to team backlogs and incident management to demonstrate your hands-on approach.
✨Understand the Business Needs
Familiarise yourself with how data solutions support analytics and reporting. During the interview, highlight your ability to work closely with stakeholders and cross-functional teams to gather requirements and deliver effective data solutions that meet business objectives.
✨Emphasise Quality and Governance
Discuss your commitment to maintaining high development standards, including code quality, testing, and documentation. Bring examples of how you've contributed to governance in previous roles, as this will show your dedication to delivering consistent, high-quality results.