At a Glance
- Tasks: Lead the design of, and take ownership of, our data platform, ensuring robust data systems.
- Company: Join Plentific, a pioneering proptech company transforming property management.
- Benefits: Enjoy competitive pay, flexible work, health care, and generous holiday leave.
- Why this job: Make a real impact in the proptech revolution with cutting-edge technology.
- Qualifications: Proven experience in data engineering and strong software development skills required.
- Other info: Dynamic team culture with opportunities for growth and innovation.
The predicted salary is between £48,000 and £84,000 per year.
About Us
At Plentific, we're redefining property management in real time. Our mission is to lead real estate through the transformative journey into "The World of Now," empowering property professionals through our innovative, cloud-based platform. We harness cutting-edge technology and data-driven insights to streamline operations for landlords, letting agents, and property managers, enabling them to optimize maintenance, manage repairs, and make informed decisions instantly. Our platform creates seamless, real-time workflows that transform traditional property management into a dynamic, digital experience. Backed by a world-class group of investors, including Noa, Highland Europe, Brookfields, Mubadala, RXR Digital Ventures, and Target Global, Plentific is at the forefront of the proptech revolution. Headquartered in London with a global outlook, we're continually expanding our reach and impact.
We're looking for forward-thinking, passionate professionals who are ready to contribute to our mission and drive industry innovation. If you're excited about making an immediate impact and shaping the future of property management, explore career opportunities with us at Plentific.
The Role
We are looking for a Tech Lead - Data Engineering to serve as the primary architect and owner of our data platform. Reporting to the Head of Engineering, you will own the end-to-end technical direction of our data ecosystem and act as the most senior individual contributor in this domain. This role sits at the intersection of data engineering and system design. You will define how data is ingested, modelled, stored, transformed, and exposed across the company, with an emphasis on robust pipelines, clear data contracts, and reliable operation at scale.
This role is for someone who goes beyond building pipelines to design durable, well-architected data systems. The large volumes of transactional data we generate form the foundation for the machine learning and other AI-driven solutions we are actively building and evolving. You will apply strong software engineering principles to how data is structured, integrated, and operated at scale, creating data systems that are reliable, maintainable, and fit for long-term use.
Responsibilities
- Own the Data Platform: Take end-to-end ownership of the data platform, including ingestion, storage, transformation, and exposure layers. This includes setting technical direction and being accountable for system reliability, performance, and cost.
- System Architecture: Lead the design of distributed data systems, ensuring clean integration between backend services, external APIs, event streams, and data storage layers.
- ML-powered Product Enablement: Work closely with product and engineering teams to design and lead data foundations for machine-learning-powered product features, ensuring data quality, traceability, and production readiness.
- Data Modelling & Strategy: Act as the lead architect for data models and contracts. Design schemas for both structured and unstructured data, balancing flexibility, performance, and long-term maintainability.
- Engineering Standards & Artefacts: Set and uphold engineering standards across the data domain. Produce and maintain architecture diagrams, design documents, and Architecture Decision Records (ADRs). Champion best practices including version control, CI/CD, modular design, backwards compatibility, and automated testing.
- Pipeline & ETL/ELT Design: Architect and implement high-scale, fault-tolerant data pipelines. Make deliberate trade-offs around latency, freshness, cost, and complexity, selecting fit-for-purpose tools rather than defaulting to trends.
- Hands-on Delivery: Spend a significant portion of your time building and maintaining core pipelines, schemas, and services in production. This is a hands-on role with direct responsibility for critical systems.
- Technical Leadership: Define the technical roadmap for data, perform deep code reviews, and mentor engineers on system design, SQL, and Python.
- Workflow Automation: Design and implement automated workflows (using tools such as n8n or custom Python services) to bridge operational gaps and reduce manual processes.
- Governance & Security: Design enterprise-grade governance frameworks covering access control, data lineage, observability, and data integrity.
- Production Ownership: Be accountable for production incidents, data quality issues, and cost regressions within the data platform.
Experience and Qualifications
- Architectural Mindset: Proven experience as a Tech Lead, Principal Engineer, or System Architect designing and owning complex, distributed systems.
- Strong Software Engineering Foundations: A software-engineer-first mindset with deep experience in Python and production-grade engineering practices. Experience with libraries such as Pandas or Polars is expected, but architectural thinking matters more than specific tools.
- Machine Learning Exposure: Hands-on experience working with machine learning systems and tooling (e.g. Hugging Face, feature stores, model inference pipelines, or similar), with an emphasis on enabling ML in production rather than research experimentation.
- Database & Storage Expertise: Advanced SQL skills and hands-on experience with modern cloud data warehouses (e.g. Snowflake or equivalent), alongside solutions for unstructured or semi-structured data.
- ETL/ELT & Orchestration: Experience designing and operating modern data pipelines using tools such as dbt, Airflow, or equivalent orchestration and transformation frameworks.
- Engineering Rigor: Deep experience with Git-based workflows, CI/CD pipelines, automated testing, and maintaining long-lived systems in production.
- Engineering Judgement: Demonstrated ability to make and defend trade-offs—when to model data, when not to ingest data, and how to balance correctness, performance, and cost.
- Analytical Depth: Ability to interrogate and analyse data directly to validate system behaviour and ensure high levels of data quality.
Desirable
- Experience with Analytics-as-Code platforms such as Looker/LookML.
- Experience building internal platforms that enable, rather than directly deliver, BI and reporting.
- Experience with automation platforms such as n8n for connecting operational systems.
- Experience designing systems for multimodal data (text, images, video, documents).
Benefits
As you can see, we are quickly progressing with our ambitious plans and are eager to grow our team of doers to achieve our vision of managing over 2 million properties through our platform across various countries. You can help us shape the future of property management across the globe. Here's what we offer:
- A competitive compensation package
- 25 days' annual holiday, plus 1 additional day for every year served (up to 5 years)
- Flexible working environment including the option to work abroad
- Private health care for you and immediate family members with discounted gym membership, optical, dental and private GP
- Enhanced parental leave
- Life insurance (4x salary)
- Employee assistance program
- Company volunteering day and charity salary sacrifice scheme
- Learning management system powered by Udemy
- Referral bonus and charity donation if someone you introduce joins the company
- Season ticket loan, Cycle to work, Electric vehicle and Techscheme programs
- Pension scheme
- Work abroad scheme
- Company-sponsored lunches, dinners and social gatherings
- Fully stocked kitchen with drinks, snacks, fruit, breakfast cereal etc.
Employer: Story Terrace Inc. (Tech Lead - Data Engineering, London)
Contact Detail: Story Terrace Inc. Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Tech Lead - Data Engineering role in London
✨Tip Number 1
Network like a pro! Reach out to folks in the industry, attend meetups, and connect with Plentific employees on LinkedIn. A personal touch can make all the difference when it comes to landing that interview.
✨Tip Number 2
Show off your skills! Create a portfolio or GitHub repository showcasing your data engineering projects. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for the technical interview by brushing up on your system design and data modelling skills. Practice explaining your thought process clearly, as communication is key in tech roles like this one.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application is seen by the right people at Plentific, and it shows you’re genuinely interested in joining the team.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Tech Lead - Data Engineering role. Highlight your experience with data systems, architecture, and any relevant projects that showcase your skills in Python and machine learning.
Craft a Compelling Cover Letter: Your cover letter should tell us why you're excited about joining Plentific and how your background aligns with our mission. Be genuine and let your passion for property management and tech shine through!
Showcase Your Technical Skills: In your application, don’t forget to mention specific tools and technologies you’ve worked with, especially those related to data pipelines and cloud data warehouses. We love seeing hands-on experience!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way to ensure your application gets into the right hands and shows us you’re serious about joining our team!
How to prepare for a job interview at Story Terrace Inc.
✨Know Your Data Inside Out
Before the interview, dive deep into your understanding of data engineering principles. Be ready to discuss how you would design robust data pipelines and manage large volumes of transactional data. Familiarise yourself with the specific tools mentioned in the job description, like Python and SQL, as well as any relevant libraries.
✨Showcase Your Architectural Mindset
Prepare to demonstrate your experience in designing complex, distributed systems. Think about examples from your past roles where you took ownership of a data platform or led a project that involved system architecture. Be ready to explain your decision-making process and how you balance performance, cost, and maintainability.
✨Emphasise Collaboration with Product Teams
Since this role involves working closely with product and engineering teams, come prepared with examples of how you've collaborated in the past. Discuss how you’ve contributed to machine-learning-powered features and ensured data quality. Highlight your ability to communicate technical concepts to non-technical stakeholders.
✨Prepare for Technical Challenges
Expect to face technical questions or challenges during the interview. Brush up on your problem-solving skills and be ready to tackle scenarios related to data ingestion, transformation, and storage. Practising coding problems or system design questions can help you feel more confident and articulate during the interview.