At a Glance
- Tasks: Build and manage data pipelines for impactful business reporting and analytics.
- Company: Join a high-growth tech company focused on data capabilities.
- Benefits: Remote-first role with high ownership and career growth opportunities.
- Why this job: Shape the future of data engineering and make a real impact.
- Qualifications: Experience in Data Engineering, strong SQL skills, and cloud data environment knowledge.
- Other info: Collaborative team culture with modern tools like AWS and Databricks.
The predicted salary is between £28,800 and £48,000 per year.
A great opportunity to join a high-growth, product-led technology business that’s investing in its data capabilities as it scales. You’ll join a small, friendly and supportive data function, helping to build and improve the pipelines and datasets that sit behind business reporting and analytics. The team is lean right now, so this role comes with genuine ownership and the chance to shape how data engineering is done going forward.
Working pattern: This is a remote-first role.
The Why? (Top 3)
- High ownership, early-stage data function: You’ll be one of the key people helping define standards, patterns, and best practice as the data capability grows.
- BI work with visible business impact: You’ll build pipelines that directly enable reporting and decision-making across the organisation.
- Modern tooling and strong engineering approach: Hands-on work with AWS and Databricks, focusing on scalable, maintainable data pipelines.
The What…
As a Data Engineer, you’ll help build and manage the data platform and tooling that supports reporting and analytics. You’ll work cross-functionally with engineers and stakeholders, contribute to technical direction, and help embed robust engineering practices across the data estate.
Core responsibilities include:
- Building and maintaining data pipelines that support BI reporting (AWS QuickSight).
- Developing and optimising SQL-first transformations (this role leans more SQL than Python).
- Working with Databricks / Spark for scalable batch processing (and some streaming exposure where relevant).
- Partnering with stakeholders to deliver reliable, well-structured datasets and reporting outputs.
- Contributing to documentation, data quality, version control and deployment best practice.
- Supporting the evolution of the data platform as the team grows.
What you’ll bring:
Essential experience:
- Commercial experience in a Data Engineering (or similar) role.
- Strong SQL and confidence working in a cloud data environment.
- Experience with Databricks and/or Spark.
- Exposure to AWS.
- Experience with GitHub and a solid grasp of modern engineering workflows.
Nice to have (not required):
- AWS QuickSight.
- dbt.
- Structured Streaming or real-time pipeline exposure.
- Infrastructure as Code (Terraform or CDK).
Data Engineer employer: FORT
Contact Details:
FORT Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role
✨Tip Number 1
Network like a pro! Reach out to folks in the industry on LinkedIn or at meetups. A friendly chat can lead to opportunities that aren’t even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your data pipelines and projects. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your SQL and cloud data environment knowledge. Be ready to discuss your past experiences and how they relate to the role you’re applying for.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who are proactive!
Some tips for your application 🫡
Show Your Passion for Data: When writing your application, let us see your enthusiasm for data engineering! Share specific examples of projects or experiences that highlight your skills and how they relate to the role. We love seeing candidates who are genuinely excited about shaping data practices.
Tailor Your CV and Cover Letter: Make sure to customise your CV and cover letter to reflect the job description. Highlight your experience with SQL, Databricks, and AWS, as these are key for us. A tailored application shows us you’ve done your homework and understand what we’re looking for.
Be Clear and Concise: Keep your application clear and to the point. Use bullet points where possible to make it easy for us to read through your qualifications and experiences. We appreciate straightforward communication, especially when it comes to technical skills!
Apply Through Our Website: Don’t forget to apply through our website! It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it gives you a chance to explore more about our company culture and values.
How to prepare for a job interview at FORT
✨Know Your SQL Inside Out
Since this role leans more towards SQL than Python, make sure you brush up on your SQL skills. Be prepared to discuss your experience with SQL transformations and how you've optimised queries in past projects. Practising common SQL interview questions can really help you stand out.
✨Familiarise Yourself with Databricks and AWS
Get comfortable with Databricks and AWS, as these are key tools for the role. If you have any hands-on experience, be ready to share specific examples of how you've used them to build or manage data pipelines. Showing that you understand the modern tooling will impress the interviewers.
✨Demonstrate Your Ownership Mindset
This position offers high ownership, so be prepared to discuss times when you've taken initiative in previous roles. Share examples of how you've contributed to defining standards or best practices in data engineering, as this aligns perfectly with what they're looking for.
✨Engage with Stakeholders
Since you'll be working cross-functionally, it's important to show that you can effectively partner with stakeholders. Think of examples where you've collaborated with others to deliver reliable datasets or reporting outputs. Highlighting your communication skills will show you're a team player.