At a Glance
- Tasks: Design and build a cutting-edge data platform for analytics and insights.
- Company: Join BR-DGE, a forward-thinking fintech company with a collaborative culture.
- Benefits: Enjoy flexible remote work, 33 days holiday, and family healthcare.
- Why this job: Make a real impact in a greenfield data engineering space.
- Qualifications: Proven experience in data pipelines, SQL, and AWS platforms.
- Other info: Rapid career progression and investment in your learning and development.
The predicted salary is between £48,000 and £72,000 per year.
All BR-DGE Builders Receive The Following Benefits:
- Flexible and remote working
- Remote working allowance
- 33 days holiday including public holidays
- Your birthday as a day off
- Family healthcare
- Life insurance
- Employee assistance programme
- A culture that champions rapid career progression
- Investment in your learning and development
- Regular team events & socials
Why this role exists:
Data is becoming a critical part of BR-DGE's next growth phase, powering internal analytics and customer-facing insights and monitoring. The data engineering space is largely greenfield: we need a production-grade data platform that can ingest, transform, validate, and monitor data from core systems and operational tooling. The robustness, scalability, and governance of our data architecture directly affect our ability to grow safely and meet regulatory expectations. This role owns the insights data platform while partnering closely with Analytics, Product, and Engineering to ensure the platform delivers trusted datasets and timely signals.
What You Will Do:
- Design and ship a tiered data platform that supports multiple latency needs, including low-latency pipelines for operational monitoring and customer-facing insights, plus batch pipelines for reporting and deeper analysis.
- Build and own end-to-end ingestion patterns across batch, micro-batch, and selected near-real-time use cases, with strong orchestration and dependency management.
- Implement schema evolution, data contracts, and approaches for late-arriving and corrected data so consumers can trust the outputs.
- Treat curated datasets as products that are well-defined, documented, reliable, and safe to use for both internal and external consumers.
- Set platform standards for idempotent ingestion, deduplication, data quality, lineage, and observability.
- Ensure the platform meets regulated fintech and payments expectations for access control, security, and governance while staying cost-efficient as volumes grow.
- Partner with Product and Engineering on event and domain modelling, deciding what data gets emitted and what latency and granularity are needed for analytics and product goals.
- Support Data Science with reliable, feature-ready datasets and pragmatic collaboration, without owning reporting or business analysis.
- Evolve the current lightweight tooling into a more observable, structured platform, improving standards without creating unnecessary platform complexity.
- Automate data infrastructure and workflows using infrastructure as code and CI/CD practices.
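To make the responsibilities above concrete, here is a minimal sketch of idempotent ingestion with deduplication and handling of late-arriving corrections. It uses Python's built-in SQLite as a stand-in for PostgreSQL; the table, column names, and timestamps are illustrative assumptions, not part of BR-DGE's actual stack.

```python
import sqlite3

# Stand-in for a PostgreSQL target table. Schema and names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE payments (
        payment_id   TEXT PRIMARY KEY,  -- natural key used for deduplication
        amount_pence INTEGER NOT NULL,
        status       TEXT NOT NULL,
        updated_at   TEXT NOT NULL      -- ISO-8601 event timestamp
    )
""")

def ingest(batch):
    """Idempotently upsert a batch of records.

    Replaying the same batch is a no-op, and a late-arriving correction
    only overwrites the stored row if its timestamp is newer.
    """
    conn.executemany(
        """
        INSERT INTO payments (payment_id, amount_pence, status, updated_at)
        VALUES (:payment_id, :amount_pence, :status, :updated_at)
        ON CONFLICT(payment_id) DO UPDATE SET
            amount_pence = excluded.amount_pence,
            status       = excluded.status,
            updated_at   = excluded.updated_at
        WHERE excluded.updated_at > payments.updated_at
        """,
        batch,
    )
    conn.commit()

batch = [{"payment_id": "p-1", "amount_pence": 1200, "status": "settled",
          "updated_at": "2024-01-01T10:00:00"}]
ingest(batch)
ingest(batch)  # replayed delivery: still exactly one row, unchanged

# Late-arriving correction with a newer timestamp wins.
ingest([{"payment_id": "p-1", "amount_pence": 1250, "status": "settled",
         "updated_at": "2024-01-01T10:05:00"}])
rows = conn.execute("SELECT COUNT(*), MAX(amount_pence) FROM payments").fetchone()
```

In PostgreSQL the same pattern maps directly onto `INSERT ... ON CONFLICT ... DO UPDATE`, keyed on a natural or surrogate key, which keeps pipeline retries and at-least-once delivery safe.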
What We Are Looking For:
Must have:
- Proven experience designing, building, and operating production-grade data pipelines and platforms.
- Strong SQL, specifically PostgreSQL, plus at least one programming language such as Python or Java.
- Experience with data processing or orchestration tooling such as Spark, Airflow, or Kafka.
- Experience designing data models for analytics and reporting workloads.
- Practical knowledge of data quality, testing, observability, lineage, and governance patterns.
- Strong experience with AWS-based data platforms, with hands-on use of services such as S3, Glue, Athena, Redshift, Kinesis, EMR, or MSK.
- Infrastructure-as-code experience using Terraform or CloudFormation, and comfort running systems in production.
- Ability to collaborate across Engineering, Product, Analytics, and Data Science, and drive standards through influence.
Nice to have:
- Experience building customer-facing data products where latency and correctness affect user outcomes.
- Experience in regulated fintech or payments environments, especially around access control and auditability.
- Experience with cost and performance optimisation at scale in AWS data stacks.
Tech context:
This role will work across ingestion, orchestration, modelling, governance, and observability in an AWS-centric environment, with PostgreSQL and modern data tooling. Current tooling is intentionally lightweight, and the platform is evolving as BR-DGE grows. In some areas you do not need to be hands-on day to day, but you must be fluent enough to make strong technical decisions and review work.
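As an illustration of the data quality and observability patterns the role mentions, here is a small sketch of a batch-level quality gate that an orchestrator could log or alert on. The field names, checks, and record shape are assumptions for the example, not a description of BR-DGE's tooling.

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str

def run_checks(rows):
    """Run simple quality checks over a batch of dict records and return
    structured results suitable for logging, metrics, or alerting."""
    results = []

    # Completeness: no record may be missing its primary key.
    missing = sum(1 for r in rows if not r.get("payment_id"))
    results.append(CheckResult("no_null_keys", missing == 0,
                               f"{missing} records missing payment_id"))

    # Uniqueness: primary keys must not repeat within the batch.
    keys = [r["payment_id"] for r in rows if r.get("payment_id")]
    dupes = len(keys) - len(set(keys))
    results.append(CheckResult("unique_keys", dupes == 0,
                               f"{dupes} duplicate keys"))

    # Validity: amounts must be non-negative integers.
    bad = sum(1 for r in rows
              if not isinstance(r.get("amount_pence"), int)
              or r["amount_pence"] < 0)
    results.append(CheckResult("valid_amounts", bad == 0,
                               f"{bad} invalid amounts"))
    return results

batch = [
    {"payment_id": "p-1", "amount_pence": 1200},
    {"payment_id": "p-1", "amount_pence": -50},  # duplicate key, bad amount
]
results = run_checks(batch)
failed = [r.name for r in results if not r.passed]
```

The same idea scales up to dedicated frameworks, but the core contract is identical: checks produce structured pass/fail signals that gate downstream consumers instead of letting bad data flow silently.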
What We Offer:
- Flexible, remote-first working
- 33 days holiday, including public holidays
- Birthday off
- Family healthcare
- Life insurance
- Employee assistance programme
- Investment in learning and development
- Regular team events and off-sites
- A collaborative culture where documentation is treated as a first-class product
Apply for this role.
Employer: Comcarde Ltd (Senior Data Engineer, Edinburgh)
Contact: Comcarde Ltd Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer role in Edinburgh
✨Tip Number 1
Network like a pro! Reach out to folks in your industry on LinkedIn or at events. A friendly chat can lead to opportunities that aren’t even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio or GitHub repo showcasing your data projects. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by practising common questions and scenarios related to data engineering. We recommend doing mock interviews with friends or using online platforms to boost your confidence.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who are proactive!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with data pipelines, SQL, and any relevant tools like Spark or Airflow. We want to see how your skills match what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Share your passion for data engineering and how you can contribute to our team. Be sure to mention any experience in fintech or building customer-facing data products if you have it.
Showcase Your Projects: If you've worked on any cool data projects, don’t hold back! Include links or descriptions of your work that demonstrate your ability to design and operate production-grade data platforms. We love seeing real-world applications of your skills.
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of your application and ensures you get all the updates directly from us. Plus, it’s super easy!
How to prepare for a job interview at Comcarde Ltd
✨Know Your Data Inside Out
Make sure you’re well-versed in the data engineering concepts relevant to the role. Brush up on your experience with SQL, PostgreSQL, and any data processing tools like Spark or Airflow. Be ready to discuss specific projects where you designed and built production-grade data pipelines.
✨Showcase Your Problem-Solving Skills
Prepare to share examples of how you've tackled challenges in data ingestion and orchestration. Think about scenarios where you implemented schema evolution or ensured data quality. This will demonstrate your ability to handle the complexities of a greenfield data platform.
✨Understand the Business Impact
Familiarise yourself with how data impacts business decisions, especially in a fintech context. Be prepared to discuss how your work can drive insights and support product goals. Showing that you understand the bigger picture will set you apart from other candidates.
✨Collaborate and Communicate
Since this role involves working closely with various teams, think of examples that highlight your collaboration skills. Be ready to discuss how you’ve influenced standards across Engineering, Product, and Analytics. Good communication is key, so practice articulating your thoughts clearly.