At a Glance
- Tasks: Drive automation and design innovative data frameworks to enhance our data platform.
- Company: Join Perch Group, a leader in ethical debt resolution through technology.
- Benefits: Competitive salary of £65,000 plus bonuses, flexible working, and professional development opportunities.
- Other info: Dynamic work environment with opportunities for growth and learning.
- Why this job: Make a real impact in the data engineering field while empowering customers.
- Qualifications: 5+ years in T-SQL, ADF, and Databricks with a focus on automation and frameworks.
The predicted salary is between £65,000 and £78,000 per year.
Perch are looking for a Senior Data Engineer to drive automation and metadata-driven frameworks. At Perch Group, our vision is clear: to lead the UK debt purchase and collection industry by harnessing cutting-edge technology to drive ethical, efficient, and data-driven debt resolution. Each year, our mission is to empower hundreds of thousands of customers to positively engage with and resolve their outstanding debts, through the empathetic, customer-centric approach that is at the heart of our success.
We’re hiring a Senior Data Engineer to drive the next evolution of our data platform by introducing metadata‑driven, configurable, and repeatable engineering frameworks. This role is for someone who has outgrown building pipelines one at a time and is ready to industrialise how a modern data platform operates. Your focus will be on designing patterns, frameworks, and automation that fundamentally change our speed and quality of delivery. This is not a “pipeline factory” role - it’s an engineering leadership position for someone who can solve problems at the framework and architecture level.
So, what will your day-to-day look like?
- Design and implement metadata‑driven and config‑driven frameworks that replace one‑off pipelines with reusable, scalable patterns.
- Develop templates, orchestration patterns, and reusable components that minimise manual engineering effort.
- Introduce engineering standards that prioritise automation, maintainability, and lifecycle governance.
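To illustrate the pattern the bullets above describe, here is a minimal, hypothetical Python sketch of a config-driven ingestion loop: table-level settings live in declarative metadata, and a single generic routine turns them into load steps, rather than each table getting a hand-built pipeline. All names (source paths, target tables, load modes) are invented for the example and do not reflect Perch's actual framework.

```python
# Hypothetical metadata-driven ingestion sketch. Each config entry describes
# one dataset; the generic build_plan() function expands the config into an
# ordered list of load steps instead of hard-coding one pipeline per table.

PIPELINE_CONFIG = [
    {"source_path": "raw/customers.csv", "target_table": "dim_customer", "load_mode": "full"},
    {"source_path": "raw/payments.csv", "target_table": "fact_payment", "load_mode": "incremental"},
]

def build_plan(config):
    """Turn declarative metadata into an ordered list of load-step descriptions."""
    plan = []
    for entry in config:
        plan.append(
            f"{entry['load_mode']} load: {entry['source_path']} -> {entry['target_table']}"
        )
    return plan

if __name__ == "__main__":
    for step in build_plan(PIPELINE_CONFIG):
        print(step)
```

Adding a new dataset then means adding one config entry, not writing a new pipeline, which is the core of the "reusable, scalable patterns" idea above.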
Core Engineering Delivery:
- Build and optimise robust ingestion, transformation, and publishing pipelines using ADF, Databricks (PySpark/Delta Lake) and T‑SQL — with automation principles embedded from the start.
- Assist in maturing the enterprise data warehouse architecture and data models.
- Ensure all designs align with best practices around performance, optimisation, logging, monitoring, and quality.
- Partner with architects and product teams to identify high‑value automation opportunities.
- Mentor engineers and uplift engineering discipline through coaching, patterns, and code reviews.
- Strengthen DevOps, CI/CD, and automated testing approaches for data workloads.
Does this sound like you?
- 5+ years advanced T‑SQL with strong optimisation, dynamic SQL, and engineering‑oriented design patterns.
- 5+ years ADF: parameterisation, custom frameworks, and template‑based orchestration (not just building pipelines).
- 5+ years Databricks including PySpark optimisation, Delta Lake architecture, and DABs.
- Demonstrable experience building metadata‑driven or config‑driven frameworks — not just talking about them.
- 5+ years data modelling, including designing dimensional models that support enterprise‑scale consumption.
- End‑to‑end delivery experience across data warehouse or data platform programmes.
- Experience with dynamic schema detection, inferred metadata processing, or multi‑tenant ingestion patterns.
- Advanced orchestration approaches (Databricks Workflows, Functions, event‑driven patterns).
- Practical CI/CD experience using Azure DevOps for data engineering (not just familiarity).
- Experience in regulated industries where quality, lineage, and governance matter.
- Understanding of data management frameworks (DAMA, DCAM, etc.).
The Application Timeline:
- A first stage video call with the internal recruitment team (15 minutes).
- A technical test (up to 2 hours).
- A face-to-face interview with the hiring manager (1 hour).
- The full process typically takes 2-3 weeks for successful applicants.
Please note that we will close this role once we have received enough applications for the next stages, so please submit your application as soon as possible to avoid disappointment. We may experience a high volume of applications; if you have not received a response within 3 weeks of applying, please assume you have been unsuccessful.
Why you should be EXCITED to apply…
- £65,000 base salary, plus a bonus of up to 20% of your annual salary.
- This role can be based at our Blackpool or Manchester office.
- 37.5 hours per week.
- We offer flexible and hybrid working between our core hours of 8am-6pm, Monday to Friday.
- The opportunity to complete formal qualifications and learn on the job in a successful, growing organisation.
- And many more benefits to support your wellbeing and professional development.
So, what are you waiting for? Submit your application today. We’re an equal opportunity employer. All applicants will be considered for employment without attention to age, ethnicity, religion, sex, sexual orientation, gender identity, family or parental status, national origin, or veteran, neurodiversity or disability status. If you have any questions or suggestions of how we can assist you in your application due to disability or personal reasons, please email recruitment@perchgroup.co.uk.
PLEASE NOTE - All new employees must undergo a full DBS and Credit Check upon acceptance of a job offer with Perch Group.
Senior Data Engineer employer: Perch Group
Contact Detail: Perch Group Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer role
✨Tip Number 1
Get your networking game on! Connect with professionals in the data engineering field on LinkedIn or at industry events. We can’t stress enough how valuable personal connections can be when it comes to landing that dream job.
✨Tip Number 2
Prepare for those interviews like a pro! Research common technical questions related to T-SQL, ADF, and Databricks. Practising your answers will help you feel more confident and ready to showcase your skills.
✨Tip Number 3
Don’t forget to highlight your leadership experience! Since this role is about engineering leadership, make sure to share examples of how you've mentored others or led projects. We want to see your impact!
✨Tip Number 4
Apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, you’ll be one step closer to joining a company that values innovation and ethical practices in data engineering.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with metadata-driven frameworks and automation, as these are key to what we're looking for. Use specific examples that showcase your skills in ADF, Databricks, and T-SQL.
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Share your passion for data engineering and how you can contribute to our mission at Perch. Mention any relevant projects or achievements that demonstrate your ability to drive automation and improve data processes.
Showcase Your Problem-Solving Skills: In your application, don’t just list your skills—show us how you've used them to solve real problems. Talk about specific challenges you've faced in previous roles and how you tackled them, especially in relation to building scalable data solutions.
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of applications and ensures you get all the updates directly. Plus, it shows you're keen on joining our team at Perch!
How to prepare for a job interview at Perch Group
✨Know Your Tech Inside Out
Make sure you’re well-versed in ADF, Databricks, and T-SQL. Brush up on your metadata-driven frameworks and be ready to discuss specific examples of how you've implemented them in the past. This role is all about engineering leadership, so show them you can think at a higher level than just building pipelines.
✨Showcase Your Problem-Solving Skills
Prepare to discuss how you've tackled complex data challenges in previous roles. Think about instances where you’ve designed scalable patterns or introduced automation that improved efficiency. They want to see your ability to solve problems at the framework and architecture level, so come armed with real-life examples.
✨Emphasise Collaboration and Mentorship
This position involves partnering with architects and mentoring other engineers. Be ready to talk about your experience working in teams, sharing knowledge, and uplifting others through coaching. Highlight any past experiences where you’ve contributed to a collaborative environment or led initiatives that improved team performance.
✨Prepare for Technical Assessments
Since there’s a technical test involved, practice coding challenges related to data engineering. Familiarise yourself with common scenarios they might present, especially around automation and orchestration. Being well-prepared will not only boost your confidence but also demonstrate your commitment to the role.