Senior Data Engineer

Full-Time · £48,000 – £84,000 / year (est.)
Comcarde Ltd

At a Glance

  • Tasks: Design and build a cutting-edge data platform for analytics and insights.
  • Company: Join a dynamic fintech company focused on data-driven growth.
  • Benefits: Enjoy flexible remote work, 33 days holiday, and comprehensive healthcare.
  • Why this job: Make a real impact in a greenfield data engineering space.
  • Qualifications: Proven experience with data pipelines, SQL, and AWS platforms.
  • Other info: Collaborative culture with rapid career progression and investment in your development.

The predicted salary is between £48,000 and £84,000 per year.

Benefits

  • Flexible and remote working
  • Remote working allowance
  • 33 days holiday including public holidays
  • Your birthday as a day off
  • Family healthcare
  • Life insurance
  • Employee assistance programme
  • A culture that champions rapid career progression
  • Investment in your learning and development
  • Regular team events & socials
  • A collaborative culture where documentation is treated as a first-class product

Why this role exists

Data is becoming a critical part of BR-DGE’s next growth phase, powering internal analytics as well as customer-facing insights and monitoring. The data engineering space is largely greenfield: we need a production-grade data platform that can ingest, transform, validate, and monitor data from core systems and operational tooling. The robustness, scalability, and governance of our data architecture affect our ability to grow safely and meet regulatory expectations. This role owns the insights data platform, partnering closely with Analytics, Product, and Engineering to ensure it delivers trusted datasets and timely signals.

What You Will Do

  • Design and ship a tiered data platform that supports multiple latency needs, including low-latency pipelines for operational monitoring and customer-facing insights, plus batch pipelines for reporting and deeper analysis.
  • Build and own end-to-end ingestion patterns across batch, micro-batch, and selected near-real-time use cases, with strong orchestration and dependency management.
  • Implement schema evolution, data contracts, and approaches for late-arriving and corrected data so consumers can trust the outputs.
  • Treat curated datasets as products that are well defined, documented, reliable, and safe to use for both internal and external consumers.
  • Set platform standards for idempotent ingestion, deduplication, data quality, lineage, and observability.
  • Ensure the platform meets regulated fintech and payments expectations for access control, security, and governance while staying cost-efficient as volumes grow.
  • Partner with Product and Engineering on event and domain modelling.
  • Decide what data gets emitted, and what latency and granularity are needed for analytics and product goals.
  • Support Data Science with reliable, feature-ready datasets and pragmatic collaboration, without owning reporting or business analysis.
  • Evolve the current lightweight tooling into a more observable, structured platform.
  • Improve standards without creating unnecessary platform complexity.
  • Automate data infrastructure and workflows using infrastructure-as-code and CI/CD practices.
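
As a minimal sketch of the idempotent-ingestion and deduplication standard named above: replaying a batch must leave the store unchanged, and a late correction should only win if it is newer. The `Event` shape, `version` field, and `ingest` function here are illustrative assumptions, not BR-DGE's actual schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    event_id: str   # stable natural key from the source system (assumed)
    payload: str
    version: int    # increments when the source corrects a record (assumed)

def ingest(store: dict, batch: list[Event]) -> dict:
    """Idempotent upsert: replaying the same batch is a no-op, and
    late-arriving corrections apply only if their version is newer."""
    for ev in batch:
        current = store.get(ev.event_id)
        if current is None or ev.version > current.version:
            store[ev.event_id] = ev
    return store
```

In a PostgreSQL-backed pipeline the same pattern would typically map to `INSERT ... ON CONFLICT (event_id) DO UPDATE ... WHERE excluded.version > target.version`, so replays and stale corrections are rejected at the database rather than in application code.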

What We Are Looking For

Must have

  • Proven experience designing, building, and operating production-grade data pipelines and platforms.
  • Strong SQL, specifically PostgreSQL, plus at least one programming language such as Python or Java.
  • Experience with data processing or orchestration tooling such as Spark, Airflow, or Kafka.
  • Experience designing data models for analytics and reporting workloads.
  • Practical knowledge of data quality, testing, observability, lineage, and governance patterns.
  • Strong experience with AWS-based data platforms, with hands-on use of services such as S3, Glue, Athena, Redshift, Kinesis, EMR, or MSK.
  • Infrastructure-as-code experience using Terraform or CloudFormation, and comfort operating systems in production.
  • Ability to collaborate across Engineering, Product, Analytics, and Data Science, and drive standards through influence.

Nice to have

  • Experience building customer-facing data products where latency and correctness affect user outcomes.
  • Experience in regulated fintech or payments environments, especially around access control and auditability.
  • Experience with cost and performance optimisation at scale in AWS data stacks.

Tech context

This role works across ingestion, orchestration, modelling, governance, and observability in an AWS-centric environment, with PostgreSQL and modern data tooling. Current tooling is intentionally lightweight, and the platform is evolving as BR-DGE grows. You do not need to be hands-on in every area day to day, but you must be fluent enough to make strong technical decisions and review work.
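
As one illustration of the data-contract and quality patterns mentioned in this context, a lightweight contract check might look like the following; the contract fields and types are hypothetical, chosen only to show the shape of such a check.

```python
# Hypothetical data contract: required fields and their expected types.
CONTRACT = {"order_id": str, "amount_pence": int, "currency": str}

def validate(row: dict) -> list[str]:
    """Return a list of contract violations for one row (empty list = valid)."""
    errors = [f"missing field: {f}" for f in CONTRACT if f not in row]
    errors += [
        f"bad type for {f}: expected {t.__name__}"
        for f, t in CONTRACT.items()
        if f in row and not isinstance(row[f], t)
    ]
    return errors
```

In practice this kind of check would sit at the ingestion boundary, so that contract violations are quarantined and surfaced to observability tooling rather than silently propagated to curated datasets.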

What We Offer

  • Flexible, remote-first working
  • 33 days holiday, including public holidays
  • Birthday off
  • Family healthcare
  • Life insurance
  • Employee assistance programme
  • Investment in learning and development
  • Regular team events and off-sites
  • A collaborative culture where documentation is treated as a first-class product

Senior Data Engineer employer: Comcarde Ltd

At BR-DGE, we pride ourselves on being an exceptional employer that fosters a collaborative and innovative work culture. With flexible remote working options, generous holiday allowances, and a strong commitment to employee development, we empower our team members to thrive both personally and professionally. Our focus on rapid career progression and investment in learning ensures that you will have the opportunity to grow your skills while contributing to a cutting-edge data platform in the dynamic fintech space.

Contact Detail:

Comcarde Ltd Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Senior Data Engineer role

✨Tip Number 1

Network like a pro! Reach out to folks in the industry, attend meetups, and connect with potential colleagues on LinkedIn. You never know who might have the inside scoop on job openings or can put in a good word for you.

✨Tip Number 2

Show off your skills! Create a portfolio showcasing your data projects, especially those that highlight your experience with SQL, Python, or AWS. This gives you a tangible way to demonstrate your expertise beyond just a CV.

✨Tip Number 3

Prepare for interviews by practising common data engineering questions and scenarios. Think about how you would design a data pipeline or tackle data quality issues. The more prepared you are, the more confident you'll feel!

✨Tip Number 4

Don't forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you're genuinely interested in joining our team at BR-DGE.

We think you need these skills to ace the Senior Data Engineer role

Data Pipeline Design
Production Grade Data Platforms
SQL (PostgreSQL)
Python or Java Programming
Data Processing Tooling (Spark, Airflow, Kafka)
Data Model Design for Analytics
Data Quality and Governance Patterns
AWS Data Platforms (S3, Glue, Athena, Redshift, Kinesis, EMR, MSK)
Infrastructure as Code (Terraform, CloudFormation)
Collaboration Across Teams (Engineering, Product, Analytics, Data Science)
Event and Domain Modelling
Observability and Lineage
Cost and Performance Optimisation in AWS

Some tips for your application 🫡

Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with data pipelines, SQL, and any relevant tools like Spark or Airflow. We want to see how your skills match what we're looking for!

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Share your passion for data engineering and how you can contribute to our team. Don’t forget to mention any experience in regulated fintech environments if you have it!

Showcase Your Projects: If you've worked on any cool data projects, make sure to include them! We love seeing real-world applications of your skills, especially if they involve building customer-facing data products or using AWS services.

Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s super easy, and we can’t wait to see your application come through!

How to prepare for a job interview at Comcarde Ltd

✨Know Your Data Inside Out

Make sure you’re well-versed in the data engineering concepts mentioned in the job description. Brush up on your experience with SQL, PostgreSQL, and any relevant programming languages like Python or Java. Be ready to discuss specific projects where you've designed and built production-grade data pipelines.

✨Showcase Your Problem-Solving Skills

Prepare to talk about how you've tackled challenges in data ingestion, orchestration, and governance. Think of examples where you implemented schema evolution or improved data quality. This will demonstrate your ability to handle the complexities of a greenfield data platform.

✨Collaborate Like a Pro

Since this role involves working closely with Product, Engineering, and Analytics teams, be ready to share experiences where you’ve successfully collaborated across different departments. Highlight how you’ve influenced standards and driven projects forward through teamwork.

✨Stay Current with Tech Trends

Familiarise yourself with the latest tools and technologies in the data engineering space, especially those mentioned like AWS services, Spark, and Airflow. Showing that you’re proactive about learning and adapting will impress your interviewers and align with the company’s culture of investment in development.
