At a Glance
- Tasks: Lead the design and take ownership of our data platform, ensuring robust data systems.
- Company: Join Plentific, a proptech leader transforming property management with innovative technology.
- Benefits: Enjoy competitive pay, flexible work options, health care, and generous holiday leave.
- Why this job: Make a real impact in the proptech revolution while shaping the future of property management.
- Qualifications: Proven experience in data engineering, strong Python skills, and architectural mindset required.
- Other info: Dynamic team culture with opportunities for professional growth and global impact.
The predicted salary is between £48,000 and £84,000 per year.
About Us
At Plentific, we’re redefining property management in real time. Our mission is to lead real estate through the transformative journey into “The World of Now,” empowering property professionals through our innovative, cloud-based platform. We harness cutting-edge technology and data-driven insights to streamline operations for landlords, letting agents, and property managers, enabling them to optimise maintenance, manage repairs, and make informed decisions instantly. Our platform is designed to create seamless, real-time workflows that turn traditional property management into a dynamic, digital experience.
Backed by a world-class group of investors—including Noa, Highland Europe, Brookfields, Mubadala, RXR Digital Ventures, and Target Global—Plentific is at the forefront of the proptech revolution. Headquartered in London with a global outlook, we’re continually expanding our reach and impact. We’re looking for forward-thinking, passionate professionals who are ready to contribute to our mission and drive industry innovation. If you’re excited about making an immediate impact and shaping the future of property management, explore career opportunities with us at Plentific.
The Role
We are looking for a Tech Lead - Data Engineering to serve as the primary architect and owner of our data platform. Reporting to the Head of Engineering, you will own the end-to-end technical direction of our data ecosystem and act as the most senior individual contributor in this domain. This role sits at the intersection of data engineering and system design. You will define how data is ingested, modelled, stored, transformed, and exposed across the company, with an emphasis on robust pipelines, clear data contracts, and reliable operation at scale.
We are looking for someone who goes beyond building pipelines and focuses on designing durable, well-architected data systems. The large volumes of transactional data we generate form the foundation for machine learning and other AI-driven solutions that we are actively building and evolving. Your focus will be on designing and evolving data systems that are reliable, maintainable, and fit for long-term use, applying strong software engineering principles to how data is structured, integrated, and operated at scale.
Responsibilities
- Own the Data Platform: Take end-to-end ownership of the data platform, including ingestion, storage, transformation, and exposure layers. This includes setting technical direction and being accountable for system reliability, performance, and cost.
- System Architecture: Lead the design of distributed data systems, ensuring clean integration between backend services, external APIs, event streams, and data storage layers.
- ML-powered Product Enablement: Work closely with product and engineering teams to design and lead data foundations for machine-learning-powered product features, ensuring data quality, traceability, and production readiness.
- Data Modelling & Strategy: Act as the lead architect for data models and contracts. Design schemas for both structured and unstructured data, balancing flexibility, performance, and long-term maintainability.
- Engineering Standards & Artefacts: Set and uphold engineering standards across the data domain. Produce and maintain architecture diagrams, design documents, and Architecture Decision Records (ADRs). Champion best practices including version control, CI/CD, modular design, backwards compatibility, and automated testing.
- Pipeline & ETL/ELT Design: Architect and implement high-scale, fault-tolerant data pipelines. Make deliberate trade-offs around latency, freshness, cost, and complexity, selecting fit-for-purpose tools rather than defaulting to trends (see the pipeline sketch after this list).
- Hands-on Delivery: Spend a significant portion of your time building and maintaining core pipelines, schemas, and services in production. This is a hands-on role with direct responsibility for critical systems.
- Technical Leadership: Define the technical roadmap for data, perform deep code reviews, and mentor engineers on system design, SQL, and Python.
- Workflow Automation: Design and implement automated workflows (using tools such as n8n or custom Python services) to bridge operational gaps and reduce manual processes.
- Governance & Security: Design enterprise-grade governance frameworks covering access control, data lineage, observability, and data integrity.
- Production Ownership: Be accountable for production incidents, data quality issues, and cost regressions within the data platform.
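As a rough illustration of the pipeline work described in the Pipeline & ETL/ELT Design item, here is a minimal sketch using Airflow's TaskFlow API with retries enabled. The schedule, record shape, and field names are hypothetical assumptions for illustration only and are not drawn from Plentific's actual stack.

```python
from datetime import timedelta

import pendulum
from airflow.decorators import dag, task


@dag(
    schedule="@hourly",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
    # Retry transient failures rather than failing the whole run immediately.
    default_args={"retries": 3, "retry_delay": timedelta(minutes=5)},
)
def transactions_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull a batch of transactional records from a hypothetical upstream source.
        return [{"work_order_id": 1, "amount_pence": 12500, "status": "completed"}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Enforce a simple data contract: keep only records with all required fields.
        required = {"work_order_id", "amount_pence", "status"}
        return [r for r in records if required.issubset(r)]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder: write the validated batch to the warehouse (e.g. Snowflake).
        print(f"loading {len(records)} records")

    load(transform(extract()))


transactions_pipeline()
```

The same structure extends to real sources and sinks; the deliberate part is where the data contract is enforced and how failures are retried, not the choice of orchestrator.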
Experience and Qualifications
- Architectural Mindset: Proven experience as a Tech Lead, Principal Engineer, or System Architect designing and owning complex, distributed systems.
- Strong Software Engineering Foundations: A software-engineer-first mindset with deep experience in Python and production-grade engineering practices. Experience with libraries such as Pandas or Polars is expected, but architectural thinking matters more than specific tools.
- Machine Learning Exposure: Hands-on experience working with machine learning systems and tooling (e.g. Hugging Face, feature stores, model inference pipelines, or similar), with an emphasis on enabling ML in production rather than research experimentation.
- Database & Storage Expertise: Advanced SQL skills and hands-on experience with modern cloud data warehouses (e.g. Snowflake or equivalent), alongside solutions for unstructured or semi-structured data.
- ETL/ELT & Orchestration: Experience designing and operating modern data pipelines using tools such as dbt, Airflow, or equivalent orchestration and transformation frameworks.
- Engineering Rigor: Deep experience with Git-based workflows, CI/CD pipelines, automated testing, and maintaining long-lived systems in production.
- Engineering Judgement: Demonstrated ability to make and defend trade-offs—when to model data, when not to ingest data, and how to balance correctness, performance, and cost.
- Analytical Depth: Ability to interrogate and analyse data directly to validate system behaviour and ensure high levels of data quality (a minimal example follows this list).
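To make that last point concrete, here is a minimal sketch of the kind of direct data interrogation the role involves, written with Polars. The table shape, column names, and checks are purely illustrative assumptions, not a description of Plentific's data.

```python
import polars as pl


def work_order_quality_report(df: pl.DataFrame) -> dict[str, int]:
    """Return simple data-quality counters for a hypothetical work_orders extract."""
    return {
        "rows": df.height,
        "null_ids": df.filter(pl.col("work_order_id").is_null()).height,
        "duplicate_ids": df.height - df["work_order_id"].n_unique(),
        "negative_amounts": df.filter(pl.col("amount_pence") < 0).height,
    }


if __name__ == "__main__":
    # Small synthetic sample: one null id, one duplicate id, one negative amount.
    sample = pl.DataFrame(
        {
            "work_order_id": [1, 2, 2, None],
            "amount_pence": [12500, -50, 800, 900],
        }
    )
    print(work_order_quality_report(sample))
```

In practice, checks like this would run as part of the pipeline rather than ad hoc, feeding the data-quality and observability work described in the responsibilities above.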
Desirable
- Experience with Analytics-as-Code platforms such as Looker/LookML.
- Experience building internal platforms that enable, rather than directly deliver, BI and reporting.
- Experience with automation platforms such as n8n for connecting operational systems.
- Experience designing systems for multimodal data (text, images, video, documents).
Benefits
We are progressing quickly with our ambitious plans and are eager to grow our team of doers to achieve our vision of managing over 2 million properties through our platform across multiple countries. You can help us shape the future of property management across the globe. Here’s what we offer:
- A competitive compensation package
- 25 days annual holiday + 1 additional day for every year served, up to 5 years
- Flexible working environment including the option to work abroad
- Private health care for you and immediate family members, with discounted gym membership and optical, dental and private GP cover
- Enhanced parental leave
- Life insurance (4x salary)
- Employee assistance program
- Company volunteering day and charity salary sacrifice scheme
- Learning management system powered by Udemy
- Referral bonus and charity donation if someone you introduce joins the company
- Season ticket loan, Cycle to Work, electric vehicle and Techscheme programmes
- Pension scheme
- Work abroad scheme
- Company-sponsored lunches, dinners and social gatherings
- Fully stocked kitchen with drinks, snacks, fruit, breakfast cereal, etc.
Tech Lead - Data Engineering in London
Employer: Plentific
Contact: Plentific Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Tech Lead - Data Engineering role in London
✨Tip Number 1
Network like a pro! Get out there and connect with folks in the industry. Attend meetups, webinars, or even just grab a coffee with someone who works at Plentific. Building relationships can open doors that a CV just can't.
✨Tip Number 2
Show off your skills! If you’ve got a portfolio or some projects that highlight your data engineering prowess, make sure to share them. A hands-on demonstration of what you can do speaks volumes more than just words on a page.
✨Tip Number 3
Prepare for the interview like it’s the Super Bowl! Research Plentific, understand their tech stack, and think about how your experience aligns with their mission. Be ready to discuss how you can contribute to their innovative platform.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re genuinely interested in joining the Plentific team.
Some tips for your application 🫡
Show Your Passion: When you're writing your application, let your enthusiasm for the role shine through! We want to see that you're genuinely excited about redefining property management and how you can contribute to our mission at Plentific.
Tailor Your CV: Make sure your CV is tailored to the Tech Lead - Data Engineering role. Highlight your experience with data systems, machine learning, and any relevant projects you've worked on. We love seeing how your skills align with what we're looking for!
Be Clear and Concise: Keep your application clear and to the point. Use straightforward language to describe your experiences and achievements. We appreciate a well-structured application that makes it easy for us to see your qualifications.
Apply Through Our Website: Don't forget to apply through our website! It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it gives you a chance to explore more about Plentific and our culture.
How to prepare for a job interview at Plentific
✨Know Your Data Inside Out
Before the interview, dive deep into your understanding of data engineering principles. Be ready to discuss how you would design robust data pipelines and manage large volumes of transactional data. Familiarise yourself with the specific technologies mentioned in the job description, like Python and SQL, as well as any relevant tools for ETL/ELT processes.
✨Showcase Your Architectural Mindset
Prepare to demonstrate your architectural thinking by discussing past projects where you designed complex, distributed systems. Highlight your decision-making process regarding trade-offs between performance, cost, and maintainability. This will show that you can think critically about system design, which is crucial for the Tech Lead role.
✨Emphasise Collaboration Skills
Since this role involves working closely with product and engineering teams, be ready to share examples of how you've successfully collaborated in the past. Discuss how you’ve enabled machine learning features or improved data quality through teamwork. This will illustrate your ability to lead and mentor others while driving innovation.
✨Prepare for Technical Challenges
Expect to face technical questions or challenges during the interview. Brush up on your problem-solving skills and be prepared to walk through your thought process. You might be asked to design a data model or troubleshoot a hypothetical data pipeline issue, so practice articulating your approach clearly and confidently.