At a Glance
- Tasks: Design and develop scalable data processing solutions using Spark and Scala.
- Company: Join a leading global financial services firm with a dynamic team.
- Benefits: Competitive daily rate, hybrid work model, and opportunities for professional growth.
- Why this job: Make an impact in a high-performing team while working with cutting-edge technologies.
- Qualifications: 8+ years of Scala experience and strong knowledge of Apache Spark.
- Other info: Collaborative environment with a focus on continuous improvement and innovation.
Location: London, England, United Kingdom
Salary: £350–£390 per day
Job Type: Contract
Date Posted: February 2nd, 2026
Important: Due to contractual restrictions, candidates who have been employed by the client within the past 12 months cannot be considered.
Must-Have Skills
- Strong hands-on experience with Scala (8+ years)
- Extensive experience with Apache Spark (Spark Core & Spark SQL)
- Proven background in designing and building large-scale distributed data pipelines
- Solid understanding of data structures, ETL concepts, and data warehousing
- Strong experience with SQL and database concepts (SQL/NoSQL)
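For a sense of the day-to-day work these core skills describe, here is a minimal, illustrative sketch of a Spark SQL batch job in Scala. It is an assumed example only: the object name, input/output paths, and column names (trade_id, desk, notional, trade_date) are hypothetical and not taken from the client's systems.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyTradeAggregation {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-trade-aggregation")
      .getOrCreate()

    // Read raw trade records (hypothetical schema: trade_id, desk, notional, trade_date)
    val trades = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/data/raw/trades") // hypothetical input path

    // Typical ETL-style transformation: total notional per desk per day
    val dailyTotals = trades
      .groupBy(col("desk"), col("trade_date"))
      .agg(sum("notional").as("total_notional"))

    // Write the result as partitioned Parquet (hypothetical output path)
    dailyTotals.write
      .mode("overwrite")
      .partitionBy("trade_date")
      .parquet("/data/curated/daily_trade_totals")

    spark.stop()
  }
}
```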
Nice-to-Have Skills
- Spark Streaming
- Hadoop, HDFS
- Hive, Impala
- Sqoop
- UNIX/Linux shell scripting
Role Responsibilities
- Design, develop, and maintain scalable Spark-based data processing solutions
- Write clean, efficient, and maintainable Scala code following best practices
- Work in an Agile/Scrum environment (stand-ups, sprint planning, retrospectives)
- Collaborate with global stakeholders and upstream/downstream teams
- Troubleshoot and resolve complex data and performance issues
- Contribute to continuous improvement and adoption of new technologies
What We’re Looking For
- Strong analytical and problem-solving skills
- Excellent verbal and written communication
- Experience working in global delivery environments
- Ability to work effectively in diverse, multi-stakeholder teams
Contract: 12 months
Openings: 2
We are looking for experienced Spark–Scala Developers to join a high-performing data engineering team working on large-scale, distributed data platforms within a leading global financial services environment.
Spark–Scala Developer employer: N Consulting Limited
Contact Detail:
N Consulting Limited Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Spark–Scala Developer role
✨Network Like a Pro
Get out there and connect with folks in the industry! Attend meetups, webinars, or even just grab a coffee with someone who’s already in the game. You never know when a casual chat might lead to your next big opportunity.
✨Show Off Your Skills
Don’t just tell them you’re great at Scala and Spark; show them! Create a portfolio of projects or contribute to open-source work that highlights your expertise. This way, you can back up your claims with real evidence.
✨Ace the Interview
Prepare for those interviews by brushing up on common questions related to data pipelines and distributed systems. Practice coding challenges and be ready to discuss your past projects in detail. Confidence is key!
✨Apply Through Us
Make sure to apply through our website for the best chance at landing that Spark–Scala Developer role. We’re here to help you every step of the way, so don’t hesitate to reach out if you need any tips or guidance!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your strong hands-on experience with Scala and Apache Spark. We want to see how your skills match the job description, so don’t be shy about showcasing your relevant projects!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re the perfect fit for the Spark–Scala Developer role. We love seeing your personality come through, so keep it engaging and relevant.
Showcase Your Problem-Solving Skills: In your application, mention specific examples where you've tackled complex data issues or improved processes. We’re looking for those strong analytical skills, so don’t hold back on sharing your successes!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it’s super easy!
How to prepare for a job interview at N Consulting Limited
✨Know Your Tech Inside Out
Make sure you brush up on your Scala and Apache Spark skills. Be ready to discuss your hands-on experience with these technologies, especially how you've designed and built large-scale distributed data pipelines. Prepare examples that showcase your problem-solving abilities in real-world scenarios.
✨Showcase Your Agile Experience
Since the role involves working in an Agile/Scrum environment, be prepared to talk about your experiences with stand-ups, sprint planning, and retrospectives. Highlight how you've collaborated with teams and contributed to continuous improvement in past projects.
✨Communicate Clearly
Strong verbal and written communication skills are a must. Practice explaining complex technical concepts in simple terms, as you'll need to collaborate with global stakeholders. Think of examples where your communication made a difference in a project.
✨Prepare for Problem-Solving Questions
Expect to face questions that test your analytical and problem-solving skills. Prepare to discuss specific challenges you've encountered in data processing and how you resolved them. This will demonstrate your ability to troubleshoot and handle performance issues effectively.