Cloud Data Engineer – Spark, Airflow & AWS

Full-Time · £50,000 – £65,000 per year (est.) · Home office (partial)
Scrumconnect Limited

At a Glance

  • Tasks: Build and maintain data pipelines using Spark, Airflow, and AWS.
  • Company: Join Scrumconnect Limited, a forward-thinking tech company in London.
  • Benefits: Hybrid work model, competitive salary, and opportunities for professional growth.
  • Other info: Must have active Security Check clearance and enjoy a dynamic work environment.
  • Why this job: Make an impact in cloud data engineering while working with cutting-edge technologies.
  • Qualifications: Strong analytical skills and familiarity with data governance required.

The predicted salary is between £50,000 and £65,000 per year.

Scrumconnect Limited is hiring a Data Engineer based in London, United Kingdom. The role involves building and maintaining data pipelines for a cloud-based data programme using technologies such as Apache Spark, PySpark, and AWS services.

Candidates must have active Security Check (SC) clearance and the ability to work in a hybrid model, travelling to the office three days a week.

The ideal candidate possesses strong analytical skills and familiarity with data governance in a regulated environment.

Cloud Data Engineer – Spark, Airflow & AWS employer: Scrumconnect Limited

Scrumconnect Limited is an excellent employer that fosters a dynamic and inclusive work culture, offering employees the chance to engage in innovative projects while utilising cutting-edge technologies like Apache Spark and AWS. With a strong emphasis on professional development, employees are encouraged to grow their skills in a supportive environment, all while enjoying the flexibility of a hybrid working model in the vibrant city of London.

Contact Detail:

Scrumconnect Limited Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land Cloud Data Engineer – Spark, Airflow & AWS

Tip Number 1

Network like a pro! Reach out to folks in the industry, especially those already working at Scrumconnect. A friendly chat can sometimes lead to insider info about the role or even a referral.

Tip Number 2

Show off your skills! If you’ve got experience with Apache Spark, PySpark, or AWS, make sure to highlight specific projects or achievements in your conversations. We want to see how you can bring value to the team!

Tip Number 3

Prepare for the interview by brushing up on data governance and compliance topics. Since this role is in a regulated environment, being able to discuss these areas confidently will set you apart from the competition.

Tip Number 4

Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re genuinely interested in joining the team at Scrumconnect.

We think you need these skills to ace Cloud Data Engineer – Spark, Airflow & AWS

Apache Spark
PySpark
AWS Services
Data Pipeline Development
Data Governance
Analytical Skills
Security Clearance (SC)
Hybrid Work Model Adaptability

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights your experience with Apache Spark, PySpark, and AWS. We want to see how your skills align with the role, so don’t be shy about showcasing relevant projects or achievements!

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about data engineering and how your analytical skills can contribute to our cloud-based data programme. Keep it engaging and personal!

Showcase Your Security Clearance: Since active Security Check (SC) clearance is a must, make sure to mention it clearly in your application. We need to know you’re ready to hit the ground running without any delays!

Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you don’t miss out on any important updates from us!

How to prepare for a job interview at Scrumconnect Limited

Know Your Tech Inside Out

Make sure you’re well-versed in Apache Spark, PySpark, and AWS services. Brush up on your knowledge of data pipelines and be ready to discuss how you've used these technologies in past projects. The more specific examples you can provide, the better!

Understand Data Governance

Since the role involves working in a regulated environment, it’s crucial to understand data governance principles. Be prepared to talk about how you’ve ensured compliance in previous roles and how you would approach data governance in this position.

Show Off Your Analytical Skills

Scrumconnect Limited is looking for strong analytical skills, so think of instances where you’ve solved complex problems using data. Prepare to explain your thought process and the impact of your solutions on previous projects.

Hybrid Work Readiness

As the role requires a hybrid model, be ready to discuss how you manage your time and productivity when working remotely. Share any tools or strategies you use to stay connected with your team and ensure smooth collaboration.
