At a Glance
- Tasks: Lead the re-architecture of big data systems and migrate services to Google Cloud.
- Company: Join Deutsche Bank, a global leader in banking with a diverse and innovative culture.
- Benefits: Enjoy a hybrid work model and collaborate with top-tier professionals in a supportive environment.
- Why this job: Make a real impact on critical systems while solving complex data challenges in a respected institution.
- Qualifications: Strong Scala skills and experience with big data technologies like Apache Spark and GCP are essential.
- Other info: This role is a 6-month contract with opportunities for growth and development.
The predicted salary is between £43,200 and £72,000 per year.
Job Title: Senior Big Data Engineer/Architect – Scala – Distributed Systems
Location: London

AMS is the world's leading provider of Talent Acquisition and Management Services. Our Contingent Workforce Solutions (CWS) service partners with Deutsche Bank to support contingent recruitment processes. On behalf of Deutsche Bank, we are looking for a Senior Big Data Engineer/Architect with Scala for an initial 6-month contract on a hybrid basis in London.

Deutsche Bank is a global banking business with strong roots in Germany and operations in over 70 countries. Its large but focused footprint gives it an established position in Europe plus a significant presence in the Americas and Asia Pacific. There are four business divisions: the Corporate Bank, the Investment Bank, the Private Bank and the Asset Manager DWS. There are also a number of highly skilled functions performing key management tasks. 'Together we're sharing new perspectives and transforming what it means to be a bank.'

Purpose of the role:
This is a high-profile technology role within a globally recognised Tier-1 investment bank. Deutsche Bank is seeking a hands-on technologist with deep experience in big data engineering to support a major modernisation effort across its credit risk platform. The role sits at the heart of the bank's stress testing function, ensuring the business is well prepared for macroeconomic shocks and meets capital adequacy guidelines (technology experience is key; domain knowledge is not required). This is an opportunity to work at enterprise scale on mission-critical systems, with a clear business purpose and the chance to influence the end-to-end design. You'll take the lead in a cross-functional engineering team focused on re-architecting distributed systems, implementing scalable solutions, and migrating services from on-prem to GCP using best-in-class tooling.
If you're someone who enjoys solving complex data problems, writing robust code, and bringing clarity to legacy environments, all while actively influencing architecture decisions, this is a rare chance to do that inside one of the world's most respected financial institutions.

Key Responsibilities
- Build, modernise, and re-architect enterprise data systems.
- Migrate on-prem systems to Google Cloud Platform (GCP), leveraging Dataproc, BigQuery, and other GCP-native tooling.
- Use technologies such as Apache Spark, Hadoop, and Scala to process large-scale distributed datasets.
- Contribute to infrastructure automation (CI/CD) and hybrid cloud deployment pipelines using tools such as GitHub Actions, Renovate, CodeQL, and Quality Gate.
- Apply engineering principles to select the optimal reengineering approach (lift-and-shift vs redesign).
- Collaborate with architects and senior engineers to define scalable data processing models.
- Participate in design reviews, code walkthroughs, and performance optimisations.
- Take a methodical, thoughtful approach to coding, focusing on solution quality over speed of delivery.
- Be willing to work hands-on across the full delivery lifecycle and, if required, step up to contribute to architecture and system design.

Essential Technical Criteria:
- Strong Scala development skills with experience in large-scale Big Data environments.
- Proficiency in Apache Spark; working knowledge of Hadoop.
- Familiarity with GCP (Dataproc, BigQuery) or other public cloud platforms (AWS, Azure).
- Experience with Kubernetes or OpenShift (on-prem or hybrid environments).
- Understanding of CI/CD pipelines and tools such as GitHub, CodeQL, QualityGate, Renovate.

Nice to Have:
- Experience in banking, credit risk, or financial services domains.
- Experience with stress testing, risk platforms, or 'what-if' scenario-based data modelling.

Deutsche Bank's values define the working environment they strive to create: diverse, supportive and welcoming of different views.
They embrace a culture reflecting a variety of perspectives, insights and backgrounds to drive innovation. They build talented and diverse teams to drive business results and encourage their people to develop to their full potential. The Bank promotes good working relationships and encourages high standards of conduct and work performance. They welcome applications from talented people from all cultures, countries, races, genders, sexual orientations, disabilities, beliefs, and generations, and are committed to providing a working environment free from harassment, discrimination and retaliation.

This client will only accept workers operating via a PAYE engagement model. AMS's payroll service is in partnership with Giant; we have worked with them for many years and have good processes in place to ensure you get the best service. If you are successful in your application for this role, your contract will be via Giant. For more information on Giant, please follow this link: https://ams-giant-paye-introduction.

AMS, a Recruitment Process Outsourcing company, may in the delivery of some of its services be deemed to operate as an Employment Agency or an Employment Business.
Employer: Deutsche Bank AG, Frankfurt am Main
Contact Detail:
Deutsche Bank AG, Frankfurt am Main Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Senior Big Data Engineer/Architect – Scala – Distributed Systems
✨Tip Number 1
Familiarise yourself with the specific technologies mentioned in the job description, such as Scala, Apache Spark, and GCP. Having hands-on experience or projects that showcase your skills in these areas can significantly boost your chances.
✨Tip Number 2
Network with professionals in the banking and finance sector, especially those who work with Deutsche Bank or similar institutions. Engaging in conversations about their experiences can provide valuable insights and potentially lead to referrals.
✨Tip Number 3
Prepare to discuss your approach to solving complex data problems during interviews. Be ready to share examples of how you've tackled similar challenges in past roles, particularly in big data environments.
✨Tip Number 4
Showcase your understanding of CI/CD pipelines and infrastructure automation tools. Being able to articulate your experience with tools like GitHub Actions and CodeQL will demonstrate your readiness for the role's responsibilities.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Scala, Apache Spark, and big data environments. Focus on specific projects where you've built or modernised data systems, especially in cloud environments like GCP.
Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention how your skills align with the responsibilities outlined in the job description, particularly your experience with distributed systems and CI/CD pipelines.
Showcase Relevant Projects: Include examples of relevant projects in your application that demonstrate your ability to solve complex data problems and your hands-on experience with technologies mentioned in the job description, such as Kubernetes or OpenShift.
Highlight Soft Skills: Don't forget to mention your soft skills, such as collaboration and communication, which are essential for working in cross-functional teams. Emphasise your methodical approach to coding and your commitment to quality.
How to prepare for a job interview at Deutsche Bank AG, Frankfurt am Main
✨Showcase Your Scala Expertise
Make sure to highlight your strong Scala development skills during the interview. Be prepared to discuss specific projects where you've used Scala in large-scale Big Data environments, and how you tackled challenges around performance and scalability.
✨Demonstrate Cloud Knowledge
Familiarity with Google Cloud Platform (GCP) is essential for this role. Brush up on your knowledge of GCP services like Dataproc and BigQuery, and be ready to explain how you've leveraged cloud technologies in past projects, especially in migrating on-prem systems.
✨Discuss Distributed Systems Experience
Since the role involves re-architecting distributed systems, be prepared to discuss your experience with technologies like Apache Spark and Hadoop. Share examples of how you've designed or optimised data processing models in a distributed environment.
✨Emphasise Collaboration Skills
This position requires working closely with cross-functional teams. Highlight your ability to collaborate with architects and senior engineers, and provide examples of how you've contributed to design reviews or code walkthroughs in previous roles.