At a Glance
- Tasks: Design and build cloud-based data pipelines using AWS tools for financial transformation.
- Company: Join Capco, a leader in tech solutions for Tier 1 financial institutions.
- Benefits: Enjoy competitive salary, health insurance, flexible holidays, and continuous learning opportunities.
- Why this job: Make a real impact on digital transformation in the financial services industry.
- Qualifications: Proficient in Python, Scala, or Java with experience in Big Data technologies.
- Other info: Collaborative culture with a focus on diversity, equity, and inclusivity.
The predicted salary is between £48,000 and £72,000 per year.
Location: London (Hybrid) | Practice Area: Technology & Engineering | Type: Permanent
Engineer future-ready data platforms that power financial transformation.
The Role
We’re looking for a Senior Data Engineer with AWS expertise to join our growing team of engineering experts driving next-gen transformation for Tier 1 financial services clients. You’ll play a pivotal role in designing, building, and deploying cloud-based data pipelines. Working across greenfield projects and enterprise-scale platforms, you’ll directly shape the way data is ingested, transformed, and served at scale across the financial services industry.
What You’ll Do
- Design and build end-to-end data pipelines leveraging AWS-native tools and modern data architectures.
- Collaborate with clients to gather requirements, define solutions, and deliver production-grade systems.
- Apply AWS Well-Architected Principles to ensure scalability, security, and resilience.
- Lead the development of robust, tested, and fault-tolerant data engineering solutions.
- Support and mentor junior engineers, contributing to knowledge sharing across the team.
What We’re Looking For
- Proficiency in one of Python, Scala, or Java, with strong experience in Big Data technologies such as Spark and Hadoop.
- Practical knowledge of building real-time event streaming pipelines (e.g., Kafka, Spark Streaming, Kinesis).
- Proficiency in AWS cloud environments.
- Proven experience developing modern data architectures including Data Lakehouse and Data Warehousing.
- A solid understanding of CI/CD practices, DevOps tooling, and data governance, including GDPR compliance.
Bonus Points For
- Expertise in Data Modelling, schema design, and handling both structured and semi-structured data.
- Familiarity with distributed systems such as Hadoop, Spark, HDFS, Hive, Databricks.
- Exposure to AWS Lake Formation and automation of ingestion and transformation layers.
- Background in delivering solutions for highly regulated industries.
- Passion for mentoring and enabling data engineering best practices across teams.
Why Join Capco
- Deliver high-impact technology solutions for Tier 1 financial institutions.
- Work in a collaborative, flat, and entrepreneurial consulting culture.
- Access continuous learning, training, and industry certifications.
- Be part of a team shaping the future of digital transformation across financial services and energy.
Benefits
- Core Benefits: Discretionary bonus, competitive pension, health insurance, life insurance and critical illness cover.
- Mental Health: Easy access to CareFirst, Unmind, Aviva consultations, and in-house first aiders.
- Family-Friendly: Maternity, adoption, shared parental leave, plus paid leave for sickness, pregnancy loss, fertility treatment, menopause, and bereavement.
- Family Care: 8 complimentary backup care sessions for emergency childcare or elder care.
- Holiday Flexibility: 5 weeks of annual leave with the option to buy or sell holiday days based on your needs.
- Continuous Learning: Minimum 40 hours of training annually and a Business Coach from Day One.
- Extra Perks: Gympass, travel insurance, Tastecard, season ticket loans, Cycle to Work, and dental insurance.
Inclusion at Capco
We’re committed to a barrier-free, inclusive recruitment process. If you need any adjustments at any stage, just let us know – we’ll be happy to help. We welcome applicants from all backgrounds. At Capco, we value the difference you make, and the differences that make you. Our #BeYourselfAtWork culture champions diversity, equity and inclusivity, and we bring a collaborative mindset to our partnerships with clients and colleagues. #BeYourselfAtWork is the cornerstone of our success and a value that our employees live and breathe every day.
Senior Data Engineer - AWS employer: Capco
Contact Details:
Capco Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Senior Data Engineer - AWS
✨Tip Number 1
Network like a pro! Reach out to your connections in the industry, attend meetups, and engage with online communities. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving AWS and data engineering. This gives potential employers a tangible look at what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on common technical questions related to AWS and data pipelines. Practice explaining your thought process clearly, as communication is key in collaborative environments.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re genuinely interested in joining our team.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with AWS, data pipelines, and any relevant Big Data technologies. We want to see how your skills align with what we’re looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about data engineering and how you can contribute to our team. Don’t forget to mention any specific projects or achievements that showcase your expertise.
Showcase Your Technical Skills: Be sure to highlight your proficiency in Python, Scala, or Java, along with your experience in building real-time event streaming pipelines. We love seeing practical examples of your work, so include any relevant projects or case studies!
Apply Through Our Website: We encourage you to apply through our website for a smoother application process. It’s the best way for us to receive your application and ensure it gets the attention it deserves. We can’t wait to hear from you!
How to prepare for a job interview at Capco
✨Know Your AWS Inside Out
Make sure you brush up on your AWS knowledge, especially the tools and services relevant to data engineering. Be prepared to discuss how you've applied AWS Well-Architected Principles in your previous projects, as this will show your understanding of scalability, security, and resilience.
✨Showcase Your Coding Skills
Since proficiency in Python, Scala, or Java is key, be ready to demonstrate your coding skills. You might be asked to solve a problem on the spot, so practice coding challenges related to data pipelines and big data technologies like Spark and Hadoop.
✨Prepare for Real-Time Scenarios
Expect questions about building real-time event streaming pipelines. Brush up on your experience with Kafka, Spark Streaming, or Kinesis, and be ready to explain how you've implemented these technologies in past projects.
✨Emphasise Mentorship and Collaboration
As a senior role, they'll want to see your leadership qualities. Prepare examples of how you've mentored junior engineers or contributed to team knowledge sharing. Highlight your collaborative approach to problem-solving and how you’ve worked with clients to deliver solutions.