At a Glance
- Tasks: Design and build data pipelines to improve neighbour wellbeing and operational performance.
- Company: Birchgrove, a unique build-to-rent operator for older adults in the UK.
- Benefits: Competitive pay, flexible working options, and impactful projects.
- Why this job: Make a real difference in people's lives through data-driven insights.
- Qualifications: Experience in data engineering, API integration, and cloud technologies.
- Other info: Join us at an exciting stage of our data journey with growth opportunities.
The predicted salary is between £10,417 and £12,500 per month.
Location: Hybrid or remote
Term: 3 month fixed term contract
Remuneration: £12,500 for the 3-month term
About Birchgrove
Birchgrove is the only build-to-rent operator in the UK exclusively for older adults. Our mission is to enrich the lives of our neighbours and add healthy years to their lives. We operate neighbourhoods rather than care homes, placing independence, dignity and community at the heart of what we do. We’re a forward-thinking organisation using data to improve neighbour wellbeing, operational performance and long-term decision-making.
The Opportunity
We’re looking for an experienced Data Engineer to join Birchgrove on a 3-month contract to deliver several clearly defined, high-impact data integration projects. This is a hands-on, delivery-focused role. You’ll design, build and document reliable, production-grade ETL/ELT pipelines that integrate operational systems into our cloud data warehouse, enabling improved reporting and analytics across the business. You’ll be joining at an exciting stage in our data journey, helping us move from early foundations to a more connected, scalable and dependable data platform.
Key Project Deliverables
- Fall detection system integration
  - Ingest data from a fall detection platform using APIs and webhooks (a minimal webhook sketch follows this section)
  - Land and model the data in Snowflake
  - Implement reliability best practices: monitoring, alerting, logging, retries, and clear documentation
- Resident management system integration
  - Extract and ingest data from our resident management system
  - Design robust data models to support reporting on neighbour wellbeing and operations
  - Ensure maintainable transformations and clear data definitions
- Facilities management integration
  - Design and build an API-based integration between two facilities management systems
  - Enable joined-up reporting across maintenance, safety and operational data
  - Deliver clean, consistent datasets suitable for analytics and dashboards
- Marketing platform integration
  - Ingest data from our marketing platform using APIs
  - Land and model the data in Snowflake
These projects will directly support improved insight, faster decision-making and better outcomes for our neighbours and team.
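For illustration only, here is a minimal sketch of the webhook ingestion pattern behind the fall detection deliverable: a small receiver that verifies an HMAC signature and appends raw events to a staging file. The endpoint path, header name, secret and payload shape are assumptions invented for the example, and the local file simply stands in for a Snowflake stage or raw landing table.

```python
"""
Illustrative only: a minimal webhook receiver for event-driven capture,
e.g. fall detection alerts. All names here are assumptions, not a real vendor's API.
"""
import hashlib
import hmac
import json

from flask import Flask, request

app = Flask(__name__)
WEBHOOK_SECRET = b"replace-with-shared-secret"   # from a secret store in practice
STAGING_FILE = "events.ndjson"                   # stand-in for a cloud stage / raw table


def signature_is_valid(payload: bytes, signature: str) -> bool:
    """Check the provider's HMAC-SHA256 signature so forged events are rejected."""
    expected = hmac.new(WEBHOOK_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)


@app.post("/webhooks/fall-events")
def receive_event():
    signature = request.headers.get("X-Signature", "")
    if not signature_is_valid(request.get_data(), signature):
        return {"error": "invalid signature"}, 401

    event = request.get_json(force=True)
    # Append the raw event as one JSON line; a downstream job loads it into the warehouse.
    with open(STAGING_FILE, "a", encoding="utf-8") as fh:
        fh.write(json.dumps(event) + "\n")

    return {"status": "accepted"}, 202


if __name__ == "__main__":
    app.run(port=8000)
```

In practice the landing step would write to cloud storage or a raw Snowflake table, with a stable event identifier carried through for idempotent downstream loads.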
Tools & Technology Stack
You’ll work with and help establish best practice around the following tools:
- Snowflake (cloud data warehouse)
- Fivetran (managed ingestion)
- Airbyte (custom & API-based integrations)
- dbt (transformations, testing and documentation)
- Power BI (analytics and dashboards)
We’re particularly keen to speak with candidates who are highly confident with:
- API-driven pipeline design (authentication, pagination, rate limiting, incremental loads); a minimal sketch follows this list
- Webhook ingestion patterns and event-driven data capture
- Building reliable, well-monitored pipelines with clear documentation and ownership
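As a rough, non-authoritative sketch of the API-driven patterns listed above, the snippet below pages through a hypothetical REST endpoint, pulls only records changed since a high-water mark, and backs off when rate limited. The URL, auth header, query parameters and response shape are assumptions made up for the example, not any particular vendor's API.

```python
"""
Illustrative only: an incremental, paginated extraction loop against a
hypothetical REST API. Endpoint, auth and field names are assumptions.
"""
import time

import requests

BASE_URL = "https://api.example.com/v1/events"   # hypothetical endpoint
API_TOKEN = "..."                                # injected from a secret store in practice


def fetch_incremental(updated_since: str) -> list[dict]:
    """Page through records updated after `updated_since`, honouring rate limits."""
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {API_TOKEN}"

    records, page = [], 1
    while True:
        resp = session.get(
            BASE_URL,
            params={"updated_since": updated_since, "page": page, "per_page": 100},
            timeout=30,
        )
        if resp.status_code == 429:              # rate limited: back off, retry same page
            time.sleep(int(resp.headers.get("Retry-After", "5")))
            continue
        resp.raise_for_status()

        batch = resp.json().get("data", [])
        if not batch:                            # empty page signals the end of the result set
            break
        records.extend(batch)
        page += 1

    return records


if __name__ == "__main__":
    # The high-water mark would normally come from warehouse state, not a literal.
    rows = fetch_incremental("2024-01-01T00:00:00Z")
    print(f"Fetched {len(rows)} records")
```

Reading the high-water mark from warehouse state means each run picks up where the last one finished, which is what makes the load incremental rather than a full refresh.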
About You
- Proven experience as a Data Engineer, delivering pipelines end-to-end in modern cloud stacks
- Strong hands-on skills with APIs, webhooks, and pipeline-based ETL/ELT
- Confident using Python for data integration and automation
- Comfortable implementing practical reliability patterns (e.g., idempotency, retries, monitoring, alerting); a short sketch of these patterns follows this list
- Strong data modelling and transformation experience (ideally with dbt)
- Able to work independently, but collaborate closely with non-technical stakeholders
- Motivated by purpose-driven work and using data to improve real lives
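To make the reliability bullet concrete, here is a small illustrative sketch (not Birchgrove's actual tooling) of two of the patterns it names: retries with exponential backoff and an idempotent, key-based load that is safe to replay. The function names, event shape and in-memory "target" are stand-ins invented for the example.

```python
"""
Illustrative only: two common pipeline reliability patterns sketched in plain Python.
"""
import logging
import time
from typing import Callable, TypeVar

T = TypeVar("T")
logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")


def with_retries(fn: Callable[[], T], attempts: int = 3, base_delay: float = 2.0) -> T:
    """Retry a failure-prone step with exponential backoff, logging each attempt."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception:
            log.exception("Attempt %d/%d failed", attempt, attempts)
            if attempt == attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))
    raise RuntimeError("unreachable")


def upsert_events(events: list[dict], target: dict[str, dict]) -> None:
    """Idempotent load: keying on event_id means replaying a batch cannot create duplicates."""
    for event in events:
        target[event["event_id"]] = event   # in a warehouse this would be a MERGE on event_id


if __name__ == "__main__":
    store: dict[str, dict] = {}
    batch = [{"event_id": "abc-1", "type": "fall_detected"}]
    with_retries(lambda: upsert_events(batch, store))
    with_retries(lambda: upsert_events(batch, store))   # safe to replay: still one row
    log.info("Rows in target: %d", len(store))
```

In a warehouse the same idempotency is usually achieved with a MERGE keyed on a stable identifier, so reruns and late-arriving duplicates cannot inflate the target table.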
How to Apply
If you’re an experienced Data Engineer looking for a short-term contract where you can deliver meaningful work with real-world impact, we’d love to hear from you.
Location
Cobham, Surrey
Employer: Birchgrove
Contact: Birchgrove Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role in Cobham, Surrey
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field and let them know you're on the lookout for opportunities. You never know who might have a lead or can put in a good word for you.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your past projects, especially those involving ETL/ELT pipelines and data integration. This will give potential employers a clear idea of what you can bring to the table.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering questions and scenarios. Be ready to discuss your experience with APIs, webhooks, and tools like Snowflake and dbt. Confidence is key!
✨Tip Number 4
Don't forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you're serious about joining Birchgrove and making a difference.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with APIs, ETL/ELT processes, and any relevant tools like Snowflake or dbt. We want to see how your skills match what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Share why you're passionate about using data to improve lives and how your experience aligns with our mission at Birchgrove. Keep it concise but impactful!
Showcase Your Projects: If you've worked on similar projects before, don’t hold back! Include specific examples of your work with data integration, monitoring, and documentation. We love seeing real-world applications of your skills.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for this exciting opportunity. Don’t miss out!
How to prepare for a job interview at Birchgrove
✨Know Your Tech Stack
Make sure you’re familiar with the tools mentioned in the job description, like Snowflake, Fivetran, and dbt. Brush up on your API-driven pipeline design skills and be ready to discuss how you've used these technologies in past projects.
✨Showcase Your Problem-Solving Skills
Prepare to share specific examples of how you've tackled challenges in data integration or pipeline reliability. Think about times when you implemented monitoring or alerting systems and how that improved your projects.
✨Understand Their Mission
Birchgrove is all about enriching lives through data. Familiarise yourself with their mission and think about how your work as a Data Engineer can contribute to improving neighbour wellbeing. This will show your genuine interest in the role.
✨Ask Insightful Questions
Prepare some thoughtful questions about their current data projects or future goals. This not only shows your enthusiasm but also helps you gauge if the company aligns with your values and career aspirations.