At a Glance
- Tasks: Design and build data pipelines to transform messy data into actionable insights.
- Company: Join Qflow, a pioneering tech company focused on sustainable construction.
- Benefits: Competitive salary, remote work, generous leave, and career development opportunities.
- Why this job: Make a real impact in reducing waste and carbon emissions in the construction industry.
- Qualifications: 4+ years in data engineering, strong skills in Azure, Python, and SQL.
- Other info: Dynamic startup culture with a focus on innovation and collaboration.
Predicted salary: £55,000 – £75,000 per year.
Who are we? Qflow was founded in 2018 with a bold mission to empower the world to build responsibly. Our cutting‑edge platform gives construction and development teams real‑time, data‑driven insights into materials, waste, cost, carbon, and quality right at the source, helping them cut waste and reduce carbon emissions. Using AI, machine learning, and smart integrations, our tech makes sense of chaotic data streams by extracting information from messy receipts and documents, auditing it against project requirements, flagging risks instantly, and feeding insights directly into reporting workflows. Our mobile app makes it effortless for workers on the ground to capture data in seconds, turning construction sites into real‑time, data‑rich ecosystems. The impact? Less waste. Lower carbon. Smarter decisions. A construction industry that uses only what it needs, building a future that works for people and the planet. We’re backed by leading climate‑tech and construction‑tech investors, including Systemiq Capital, Greensoil PropTech Ventures, and Suffolk Technologies, and have raised £11.2M in Series A funding to accelerate our international growth.
You should apply if … You’re a Data Engineer who wants to use your skills for impact, helping make one of the world’s most polluting industries more efficient, sustainable, and data‑driven. You enjoy building robust, well‑engineered data pipelines that turn messy, real‑world data into reliable foundations for AI and analytics. You care about data quality, not just data movement. You’re curious, collaborative, and want to work in a place where what you build directly shapes how construction companies improve quality, reduce waste, and make smarter decisions at scale.
Your team and your role: We’re looking for a Data Engineer to join our Data group, a cross‑functional, high‑impact team at the intersection of data engineering, machine learning, and data quality. The team designs, develops, and operates the scalable data infrastructure that powers Qflow’s platform and AI capabilities. Reporting to our Senior Engineering Manager and working closely with ML Engineers and Data Quality experts, you’ll own the pipelines that get the right data, in the right shape, to the right place.
Here’s what you’ll do day to day:
- Design and build pipelines that ingest data from multiple sources into our data infrastructure (currently 100M+ rows and growing).
- Work with Azure Cosmos DB, Microsoft Fabric, and relational databases to model, store, and serve data at scale.
- Build and manage data lake layers in Microsoft Fabric, including ingestion, transformation, and serving patterns that support both ML and analytical workloads.
- Collaborate with ML Engineers to ensure training data is clean, versioned, and correctly structured — including pipelines that feed generative AI features.
- Partner with Data Quality experts to implement validation, monitoring, and lineage tracking that give the team confidence in what flows through our systems.
- Optimise pipeline performance, reliability, and cost; debug failures quickly and build resilience in.
- Contribute to data governance practices, including schema management, access controls, and documentation.
- Maintain high standards in code quality, testing, and reproducibility, and share knowledge across the team.
- Make informed trade‑off decisions to manage the cost of Fabric compute.
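To make the pipeline work above concrete, here is a minimal sketch of the kind of bronze‑to‑silver cleaning step such a pipeline might perform. This is plain Python for illustration only; the record fields and rules are hypothetical, and a production version would run on the team's actual stack (e.g. PySpark in Microsoft Fabric):

```python
from datetime import datetime

# Hypothetical raw "bronze" records, as they might arrive from a site receipt feed.
bronze = [
    {"material": " Concrete ", "qty": "12.5", "unit": "m3", "captured_at": "2024-05-01T09:30:00Z"},
    {"material": "steel", "qty": "not-a-number", "unit": "t", "captured_at": "2024-05-01T10:00:00Z"},
]

def to_silver(record):
    """Normalise one bronze record; return None if it fails basic validation."""
    try:
        qty = float(record["qty"])
    except (KeyError, ValueError):
        return None  # quarantine invalid rows rather than silently passing them downstream
    return {
        "material": record["material"].strip().lower(),
        "qty": qty,
        "unit": record["unit"],
        "captured_at": datetime.fromisoformat(record["captured_at"].replace("Z", "+00:00")),
    }

silver = [r for r in (to_silver(rec) for rec in bronze) if r is not None]
rejected = len(bronze) - len(silver)
print(f"{len(silver)} clean rows, {rejected} rejected")
```

Counting and surfacing rejected rows, rather than dropping them silently, is what lets downstream monitoring and alerting reason about pipeline health.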
The Data Experience Team works with Python and SQL for data processing, Azure and Terraform for cloud infrastructure, and modern ML/AI tools such as OpenAI and Gemini. Our data infrastructure centres on Azure Cosmos DB, Microsoft Fabric, and relational databases. We’re continuously raising our engineering standards through robust testing, CI/CD, and shared code quality practices. We value curiosity and innovation, always exploring new technologies to stay ahead of the curve.
Your Skills: We’re looking for a mid‑to‑senior engineer who is comfortable taking ownership of complex data infrastructure in a fast‑moving startup. You bring engineering rigour to data problems, and you understand that the quality of what goes in determines the quality of what comes out. What matters most is your ability to build reliable, production‑grade pipelines that real AI and analytics products depend on.
- 4+ years of experience in a data engineering role, ideally in a product or SaaS environment.
- Ability to think about data quality from an end‑user perspective – i.e. the value of our data and customer trust in it – as well as an internal perspective (validity, uniqueness, etc.).
- Hands‑on experience with Azure Cosmos DB, including data modelling for document‑oriented workloads.
- Experience with PySpark.
- Strong working knowledge of Microsoft Fabric or Azure Data Lake Storage, including experience designing medallion or equivalent layered architectures.
- Solid SQL skills and experience with relational databases (PostgreSQL or similar).
- Proficiency in Python for pipeline development, transformation logic, and orchestration.
- Experience building data pipelines that feed ML or generative AI workflows — understanding of what ‘good’ training and inference data looks like.
- Familiarity with data quality practices: validation, monitoring, alerting, and lineage.
- Working knowledge of Azure cloud infrastructure and services; exposure to Terraform or infrastructure‑as‑code is a plus.
- Exposure to CI/CD practices and containerisation with Docker or similar.
- Experience using AI coding tools to accelerate development while maintaining the ability to audit and correct LLM output for performance at scale.
- Excellent communication skills, able to work across engineering, product, and non‑technical stakeholders.
- Comfortable with ambiguity and incremental delivery in a startup environment.
Nice to have: experience with Retool.
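To illustrate the data‑quality practices the skills list refers to (validation, monitoring, alerting), here is a minimal, hypothetical batch‑validation sketch. The check names, fields, and report shape are invented for illustration, not taken from Qflow's systems:

```python
def run_checks(rows, checks):
    """Run named per-row validation checks over a batch; return a report suitable for alerting."""
    report = {}
    for name, predicate in checks.items():
        failures = [r for r in rows if not predicate(r)]
        report[name] = {"passed": not failures, "failure_count": len(failures)}
    return report

rows = [
    {"id": 1, "carbon_kg": 42.0},
    {"id": 2, "carbon_kg": -5.0},   # invalid: negative emissions
    {"id": 2, "carbon_kg": 10.0},   # duplicate id
]

checks = {
    "carbon_non_negative": lambda r: r["carbon_kg"] >= 0,
}

report = run_checks(rows, checks)

# Uniqueness is a batch-level property, so it is checked across the whole batch:
ids = [r["id"] for r in rows]
report["id_unique"] = {"passed": len(ids) == len(set(ids)),
                      "failure_count": len(ids) - len(set(ids))}
print(report)
```

Separating per‑row checks (validity) from batch‑level checks (uniqueness, completeness) mirrors the internal‑perspective dimensions the role description mentions.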
Our offer: Basic salary of £55,000 – £75,000, depending on experience. Remote‑first team with bi‑weekly engineering team gatherings and company‑wide gatherings once a quarter in our London HQ. Work travel expenses covered by Qflow. 25 days annual leave + 3 days company closure at Christmas + bank holidays. Critical illness and life insurance. Pension contribution up to 7%. Enhanced family policy. We allow our employees to work abroad for up to 90 days. We’ll offset your annual carbon footprint on your behalf via Ecologi. Learning & development and career progression opportunities. Company social events (online and in person!). Company laptop and tools.
Creating an environment where everyone feels valued, respected and heard is at the forefront of everything we do. We are committed to providing equal employment opportunities regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or veteran status.
Data Engineer employer: Qualis Flow (Qflow)
Contact Detail:
Qualis Flow (Qflow) Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role
✨Tip Number 1
Network like a pro! Reach out to people in the industry, attend meetups, and connect with Qflow employees on LinkedIn. A personal connection can make all the difference when it comes to landing that interview.
✨Tip Number 2
Show off your skills! Create a portfolio or GitHub repository showcasing your data engineering projects. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for the interview by brushing up on your technical skills and understanding Qflow's mission. Be ready to discuss how your experience aligns with their goals of reducing waste and improving sustainability in construction.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re genuinely interested in joining the Qflow team.
Some tips for your application 🫡
Tailor Your Application: Make sure to customise your CV and cover letter for the Data Engineer role. Highlight your experience with Azure Cosmos DB, Python, and data quality practices, as these are key to what we’re looking for at Qflow.
Showcase Your Impact: We love seeing how your work has made a difference in previous roles. Share specific examples of how you’ve built robust data pipelines or improved data quality, especially in a fast-paced environment.
Be Clear and Concise: When writing your application, keep it straightforward. Use clear language and avoid jargon where possible. We want to understand your skills and experiences without having to decipher complex terms.
Apply Through Our Website: Don’t forget to submit your application through our website! It’s the best way for us to receive your details and ensures you’re considered for the role. We can’t wait to see what you bring to the table!
How to prepare for a job interview at Qualis Flow (Qflow)
✨Know Your Data Tools
Familiarise yourself with Azure Cosmos DB, Microsoft Fabric, and the data processing tools mentioned in the job description. Be ready to discuss your hands-on experience with these technologies and how you've used them to build robust data pipelines.
✨Showcase Your Problem-Solving Skills
Prepare examples of how you've tackled data quality issues in the past. Highlight your understanding of validation, monitoring, and lineage tracking, and be ready to explain how you ensure the integrity of data flowing through systems.
✨Demonstrate Collaboration
Since the role involves working closely with ML Engineers and Data Quality experts, think of instances where you've successfully collaborated across teams. Emphasise your communication skills and how you’ve contributed to shared goals in previous projects.
✨Be Curious and Innovative
Qflow values curiosity and innovation, so come prepared with ideas on new technologies or methodologies that could enhance their data infrastructure. Show your enthusiasm for continuous learning and how you stay ahead of industry trends.