Senior Data Engineer

Full-Time · £36,000 - £60,000 / year (est.) · Remote work possible
BR‑DGE

At a Glance

  • Tasks: Design and build a cutting-edge data platform for analytics and insights.
  • Company: Join BR‑DGE, an award-winning FinTech based in Edinburgh.
  • Benefits: Enjoy flexible remote work, 33 days holiday, and family healthcare.
  • Why this job: Make a real impact in a growing data engineering space.
  • Qualifications: Proven experience with data pipelines, SQL, and AWS platforms.
  • Other info: Collaborative culture with a focus on learning and development.

The predicted salary is between £36,000 and £60,000 per year.

Employment type: Permanent, full time.

Location: Remote with the option to use the Edinburgh office.

About BR‑DGE

BR‑DGE is an award‑winning FinTech founded in Edinburgh. Our platform enables enterprise e‑commerce and technology businesses to design, optimise, and scale complex payment infrastructures with confidence. We operate in a regulated, technically demanding environment where reliability and execution quality matter. Our products increasingly rely on high‑quality data to support both internal decision making and merchant‑facing insight experiences, and compliance is built into how we work.

Why this role exists

Data is becoming a critical part of BR‑DGE’s next growth phase, powering internal analytics as well as customer‑facing insights and monitoring. The data engineering space is largely greenfield. We need a production‑grade data platform that can ingest, transform, validate, and monitor data from core systems and operational tooling. The robustness, scalability, and governance of our data architecture impact our ability to grow safely and meet regulatory expectations. This role owns the insights data platform, partnering closely with Analytics, Product, and Engineering to ensure the platform delivers trusted datasets and timely signals.

What you will do

  • Design and ship a tiered data platform that supports multiple latency needs, including low‑latency pipelines for operational monitoring and customer‑facing insights, plus batch pipelines for reporting and deeper analysis.
  • Build and own end‑to‑end ingestion patterns across batch, micro‑batch, and selected near‑real‑time use cases, with strong orchestration and dependency management.
  • Implement schema evolution, data contracts, and approaches for late‑arriving and corrected data so consumers can trust the outputs.
  • Treat curated datasets as products that are well defined, documented, reliable, and safe to use for both internal and external consumers.
  • Set platform standards for idempotent ingestion, deduplication, data quality, lineage, and observability.
  • Ensure the platform meets the expectations of a regulated fintech and payments environment for access control, security, and governance while staying cost efficient as volumes grow.
  • Partner with Product and Engineering on event and domain modelling.
  • Decide what data gets emitted and what latency and granularity are needed for analytics and product goals.
  • Support Data Science with reliable feature‑ready datasets and pragmatic collaboration, without owning reporting or business analysis.
  • Evolve the current lightweight tooling into a more observable, structured platform.
  • Improve standards without creating unnecessary platform complexity.
  • Automate data infrastructure and workflows using infrastructure as code and CI/CD practices.
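To make the ingestion bullets concrete, here is a minimal sketch of idempotent, deduplicated ingestion with support for late‑arriving corrections. It is an illustration only, not BR‑DGE’s actual stack: Python’s built‑in sqlite3 stands in for PostgreSQL (the ON CONFLICT upsert syntax is similar), and the table and column names are hypothetical.

```python
import sqlite3

def ingest(conn, batch):
    """Upsert records keyed by event_id.

    Replaying the same batch is a no-op (idempotent), and a corrected
    record with a newer updated_at overwrites the earlier row, so
    duplicates never accumulate.
    """
    conn.executemany(
        """
        INSERT INTO events (event_id, amount, updated_at)
        VALUES (:event_id, :amount, :updated_at)
        ON CONFLICT(event_id) DO UPDATE SET
            amount = excluded.amount,
            updated_at = excluded.updated_at
        WHERE excluded.updated_at >= updated_at
        """,
        batch,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (event_id TEXT PRIMARY KEY, amount REAL, updated_at TEXT)"
)

batch = [
    {"event_id": "e1", "amount": 10.0, "updated_at": "2024-01-01"},
    {"event_id": "e2", "amount": 20.0, "updated_at": "2024-01-01"},
]
ingest(conn, batch)
ingest(conn, batch)  # replayed batch: no duplicate rows
# Late-arriving correction for e1 overwrites the stale value.
ingest(conn, [{"event_id": "e1", "amount": 12.5, "updated_at": "2024-01-02"}])

rows = conn.execute(
    "SELECT event_id, amount FROM events ORDER BY event_id"
).fetchall()
print(rows)  # [('e1', 12.5), ('e2', 20.0)]
```

The key design choice is a stable natural key plus an upsert guarded by a recency check, which makes pipeline retries and backfills safe by construction.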

What we are looking for

Must have
  • Proven experience designing, building, and operating production‑grade data pipelines and platforms.
  • Strong SQL, specifically PostgreSQL, plus at least one programming language such as Python or Java.
  • Experience with data processing or orchestration tooling such as Spark, Airflow, or Kafka.
  • Experience designing data models for analytics and reporting workloads.
  • Practical knowledge of data quality, testing, observability, lineage, and governance patterns.
  • Strong experience with AWS‑based data platforms, with hands‑on use of services like S3, Glue, Athena, Redshift, Kinesis, EMR, or MSK.
  • Infrastructure as code experience using Terraform or CloudFormation, and comfort operating systems in production.
  • Ability to collaborate across Engineering, Product, Analytics, and Data Science, and drive standards through influence.
Nice to have
  • Experience building customer‑facing data products where latency and correctness affect user outcomes.
  • Experience in regulated fintech or payments environments, especially around access control and auditability.
  • Experience with cost and performance optimisation at scale in AWS data stacks.

Tech context

The role spans ingestion, orchestration, modelling, governance, and observability in an AWS‑centric environment, with PostgreSQL and modern data tooling. Current tooling is intentionally lightweight, and the platform is evolving as BR‑DGE grows. You do not need to be hands‑on with every layer day to day, but you must be fluent enough to make strong technical decisions and review work.

What we offer

  • Flexible, remote‑first working
  • 33 days holiday, including public holidays
  • Birthday off
  • Family healthcare
  • Life insurance
  • Employee assistance programme
  • Investment in learning and development
  • Regular team events and off‑sites
  • A collaborative culture where documentation is treated as a first‑class product

Senior Data Engineer employer: BR‑DGE

BR‑DGE is an exceptional employer that champions a flexible, remote-first work culture while offering the option to collaborate in our Edinburgh office. With a strong focus on employee growth through investment in learning and development, we provide a supportive environment where documentation is valued as a key asset. Our commitment to work-life balance is reflected in generous benefits such as 33 days of holiday, family healthcare, and regular team events, making BR‑DGE a rewarding place to build your career in the dynamic FinTech sector.

Contact Detail:

BR‑DGE Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land Senior Data Engineer

✨Tip Number 1

Network like a pro! Reach out to folks in the industry, attend meetups, and connect with people on LinkedIn. You never know who might have the inside scoop on job openings or can put in a good word for you.

✨Tip Number 2

Show off your skills! Create a portfolio or GitHub repository showcasing your data engineering projects. This gives potential employers a taste of what you can do and sets you apart from the crowd.

✨Tip Number 3

Prepare for interviews by brushing up on your technical skills and understanding the company’s tech stack. Practice common data engineering questions and be ready to discuss how you’ve tackled challenges in past roles.

✨Tip Number 4

Don’t forget to apply through our website! It’s the best way to ensure your application gets seen. Plus, we love seeing candidates who are genuinely interested in joining our team at StudySmarter.

We think you need these skills to ace Senior Data Engineer

Data Pipeline Design
Production-Grade Data Platforms
SQL (PostgreSQL)
Python or Java Programming
Data Processing Tools (Spark, Airflow, Kafka)
Data Model Design for Analytics
Data Quality and Governance
AWS Services (S3, Glue, Athena, Redshift, Kinesis, EMR, MSK)
Infrastructure as Code (Terraform, CloudFormation)
Collaboration Across Teams (Engineering, Product, Analytics, Data Science)
Event and Domain Modelling
Observability and Monitoring
Cost and Performance Optimisation in AWS

Some tips for your application 🫡

Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with data pipelines, SQL, and any relevant tools like Spark or Airflow. We want to see how your skills match what we're looking for!

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how you can contribute to our team at BR‑DGE. Keep it concise but impactful – we love a good story!

Showcase Your Projects: If you've worked on any cool data projects, don’t forget to mention them! Whether it's a personal project or something from a previous job, we want to see your hands-on experience and creativity in action.

Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way to ensure your application gets into the right hands. Plus, it shows us you're keen on joining our awesome team!

How to prepare for a job interview at BR‑DGE

✨Know Your Data Inside Out

Make sure you’re well-versed in the data engineering concepts relevant to the role. Brush up on your experience with SQL, PostgreSQL, and any data processing tools like Spark or Airflow. Be ready to discuss specific projects where you designed and built production-grade data pipelines.

✨Showcase Your Problem-Solving Skills

Prepare to share examples of how you've tackled challenges in data architecture and governance. Think about scenarios where you had to ensure data quality and observability, and be ready to explain your thought process and the outcomes.

✨Understand the FinTech Landscape

Familiarise yourself with the regulatory environment of FinTech, especially around access control and auditability. Being able to discuss how your work aligns with compliance and security standards will show that you understand the industry's demands.

✨Collaborate and Communicate

Since this role involves working closely with various teams, practice articulating how you’ve successfully collaborated with Product, Engineering, and Data Science in the past. Highlight your ability to influence standards and drive projects forward through teamwork.

Land your dream job quicker with Premium

  • You’re marked as a top applicant with our partner companies
  • Individual CV and cover letter feedback, including tailoring to specific job roles
  • Be among the first applications for new jobs with our AI application
  • 1:1 support and career advice from our career coaches
Go Premium

Money-back guarantee if you don't land a job within 6 months
