Senior Data Engineer - PySpark & Lakehouse (Remote) in London

London · Full-Time · Mostly remote (one day per week in Central London)
Primus Connect

At a Glance

  • Tasks: Build efficient data pipelines and orchestrate workflows using Airflow.
  • Company: Dynamic data solutions company focused on innovative cloud technologies.
  • Benefits: Competitive pay, remote work flexibility, and minimal in-office requirements.
  • Why this job: Join a cutting-edge team and shape the future of data engineering.
  • Qualifications: Strong PySpark and SQL skills with cloud data engineering experience.
  • Other info: Work remotely, with just one day a week in Central London.

A data solutions company is seeking a Senior Data Engineer with strong PySpark skills to build a Lakehouse platform in GCP. This hands-on role involves building efficient data pipelines and orchestrating workflows with Airflow. Candidates should have solid cloud data engineering experience, especially with PySpark and SQL. The role is almost fully remote, requiring only one day per week in Central London, and offers competitive pay of £500–£550 per day, outside IR35.

Senior Data Engineer - PySpark & Lakehouse (Remote) in London employer: Primus Connect

Join a forward-thinking data solutions company that values innovation and collaboration, offering a dynamic work culture where your expertise in PySpark and cloud data engineering will thrive. With flexible remote working arrangements and the opportunity to connect with colleagues in Central London just once a week, you can enjoy a healthy work-life balance while contributing to cutting-edge projects. We prioritise employee growth through continuous learning opportunities and competitive compensation, making this an ideal environment for those seeking meaningful and rewarding employment.

Contact Details:

Primus Connect Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land Senior Data Engineer - PySpark & Lakehouse (Remote) in London

✨Tip Number 1

Network like a pro! Reach out to your connections in the data engineering field, especially those who work with PySpark and GCP. A friendly chat can lead to insider info about job openings that aren't even advertised yet.

✨Tip Number 2

Show off your skills! Create a portfolio showcasing your best data pipelines and projects using PySpark. This will give potential employers a taste of what you can do and set you apart from the crowd.

✨Tip Number 3

Prepare for technical interviews by brushing up on your SQL and Airflow knowledge. We recommend doing mock interviews with friends or using online platforms to get comfortable with the types of questions you might face.

✨Tip Number 4

Don’t forget to apply through our website! We’ve got loads of opportunities that match your skills, and applying directly can sometimes give you an edge over other candidates. Plus, it’s super easy!

We think you need these skills to ace Senior Data Engineer - PySpark & Lakehouse (Remote) in London

PySpark
Lakehouse Architecture
GCP (Google Cloud Platform)
Data Pipeline Development
Workflow Orchestration
Airflow
Cloud Data Engineering
SQL
Remote Work Adaptability

Some tips for your application 🫡

Show Off Your PySpark Skills: Make sure to highlight your experience with PySpark in your application. We want to see how you've used it to build data pipelines, plus any cool projects that showcase your skills.

Talk About Your Cloud Experience: Since this role is all about cloud data engineering, don’t forget to mention your experience with GCP and any other cloud platforms. We love seeing how you’ve tackled challenges in the cloud!

Be Clear and Concise: When writing your application, keep it clear and to the point. We appreciate a well-structured application that gets straight to the good stuff without unnecessary fluff.

Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s super easy, and we can’t wait to see what you bring to the table!

How to prepare for a job interview at Primus Connect

✨Know Your PySpark Inside Out

Make sure you brush up on your PySpark skills before the interview. Be ready to discuss specific projects where you've used PySpark to build data pipelines. Prepare to explain your thought process and the challenges you faced, as this will show your hands-on experience.
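To make that concrete in your own prep, here's a minimal sketch of the kind of PySpark pipeline step you could walk an interviewer through. The bucket paths, table, and column names are purely illustrative, not taken from this role:

```python
# Minimal PySpark sketch: read raw events, clean them, and write a
# partitioned table. All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-cleaning").getOrCreate()

# Hypothetical GCS location for raw data
raw = spark.read.parquet("gs://example-bucket/raw/events/")

cleaned = (
    raw.dropDuplicates(["event_id"])                     # de-duplicate on a key
       .filter(F.col("event_ts").isNotNull())            # drop rows missing a timestamp
       .withColumn("event_date", F.to_date("event_ts"))  # derive a partition column
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("gs://example-bucket/curated/events/"))  # hypothetical output path
```

Being able to explain each step (why de-duplicate, why partition by date) tends to land better than reciting API names.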

✨Showcase Your Cloud Experience

Since the role involves working with GCP, highlight any relevant cloud data engineering experience you have. Be prepared to talk about how you've leveraged cloud technologies in past projects, especially in relation to building a Lakehouse platform.

✨Familiarise Yourself with Airflow

As orchestrating workflows using Airflow is part of the job, make sure you understand its core functionalities. You might be asked to describe how you've used Airflow in previous roles, so think of examples that demonstrate your ability to manage complex data workflows.
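As a quick refresher, a minimal DAG might look like the sketch below, assuming Airflow 2.4+; the DAG id, schedule, and task bodies are placeholders rather than anything from this role:

```python
# Minimal Airflow 2.x sketch: one daily DAG with two dependent tasks.
# DAG id, schedule, and task logic are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extract step")  # placeholder for a real extraction

def load():
    print("load step")  # placeholder for a real load

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```

If you can talk through dependencies, scheduling, retries, and backfills around an example like this, you'll cover most of what interviewers probe on Airflow.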

✨Prepare Questions for Them

Interviews are a two-way street! Prepare insightful questions about their data architecture, team dynamics, or future projects. This not only shows your interest in the role but also helps you gauge if the company is the right fit for you.
