Python Developer (PySpark, AWS Glue)

London · Full-Time · £36,000–£54,000 / year (est.) · No home office possible

At a Glance

  • Tasks: Join our team to design and maintain scalable data pipelines using PySpark and AWS Glue.
  • Company: Be part of a dynamic organisation focused on high-impact data engineering projects.
  • Benefits: Enjoy a competitive salary, flexible working options, and ongoing training opportunities.
  • Why this job: Work with cutting-edge technology on exciting data-driven projects in a supportive environment.
  • Qualifications: 2+ years of Python experience, strong PySpark skills, and AWS knowledge required.
  • Other info: Remote work options available; perfect for tech-savvy individuals eager to grow.

The predicted salary is between £36,000 and £54,000 per year.

Salary: Up to £45,000 per annum

Location: London (M25)

Job Type: Permanent

Start Date: ASAP

About the Role

We're looking for a talented Python Developer with strong experience in PySpark and AWS Glue to join our growing data engineering team. This is a fantastic opportunity for someone looking to work on modern data pipelines in the cloud, contributing to high-impact projects across a dynamic organisation.

Key Responsibilities

  • Design, develop, and maintain scalable data pipelines using PySpark and AWS Glue
  • Work closely with data engineers, analysts, and architects to understand business requirements and implement efficient ETL processes
  • Automate data ingestion, transformation, and integration workflows
  • Optimise performance of large-scale data processing jobs
  • Write clean, reusable, and testable Python code
  • Monitor and troubleshoot pipeline performance and data quality issues
  • Ensure best practices in code quality, testing, and documentation
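As a sketch of the "clean, reusable, and testable Python code" bullet above: one common pattern is to keep transformations as pure functions, independent of Spark, so they can be unit-tested locally and later wrapped in a PySpark UDF or Glue job. All names here (`Event`, `normalise_amount`, the rate table) are illustrative assumptions, not part of any actual codebase.

```python
from dataclasses import dataclass

# Hypothetical record type for an ingested event; fields are illustrative.
@dataclass
class Event:
    user_id: str
    amount: float
    currency: str

def normalise_amount(event: Event, rates: dict[str, float]) -> Event:
    """Convert an event's amount to GBP using a supplied rate table.

    Keeping the transform pure (no I/O, no globals) is what makes it
    easy to unit-test and, later, to apply row-by-row inside a Spark
    or Glue job.
    """
    rate = rates[event.currency]
    return Event(event.user_id, round(event.amount * rate, 2), "GBP")

def transform(events: list[Event], rates: dict[str, float]) -> list[Event]:
    """Apply the normalisation to a batch, dropping unknown currencies."""
    return [normalise_amount(e, rates) for e in events if e.currency in rates]

rates = {"USD": 0.79, "EUR": 0.85, "GBP": 1.0}
batch = [Event("u1", 100.0, "USD"), Event("u2", 50.0, "XYZ")]
print(transform(batch, rates))
```

Because the logic lives in plain functions, a pytest suite can exercise it without spinning up a Spark cluster.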

Requirements

Essential:

  • 2+ years of professional Python development experience
  • Strong hands-on experience with PySpark for distributed data processing
  • Proven experience with AWS Glue and other AWS data services (e.g. S3, Athena, Lambda)
  • Familiarity with data lake architecture and modern ETL practices
  • Experience working with structured and semi-structured data (e.g. Parquet, JSON)
  • Git version control and CI/CD familiarity
  • Strong communication and collaboration skills
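An everyday example of the semi-structured-data work listed above is flattening nested JSON into dotted column names before writing to a tabular format such as Parquet. This is a generic sketch; the record shape is invented for illustration:

```python
import json

def flatten(record: dict, prefix: str = "") -> dict:
    """Flatten a nested JSON-style dict into dotted column names,
    e.g. {"user": {"id": 1}} -> {"user.id": 1}. This is the kind of
    shape-wrangling that semi-structured sources often need before
    they fit a tabular sink."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

raw = json.loads('{"user": {"id": 1, "geo": {"country": "GB"}}, "amount": 42}')
print(flatten(raw))
```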

Desirable:

  • Experience with other AWS services like Redshift, EMR, Step Functions
  • Knowledge of SQL and data modelling principles
  • Exposure to containerisation tools (e.g. Docker) or orchestration (e.g. Airflow)
  • Understanding of data governance, lineage, and quality tools

What's on Offer

  • Competitive salary up to £45,000
  • Flexible working (Hybrid/Remote options available)
  • Ongoing training and development opportunities
  • Supportive team environment and modern tech stack
  • Opportunity to work on exciting data-driven projects at scale

Python Developer (PySpark, AWS Glue) employer: Intelligent Resourcing Solutions Ltd

Join a forward-thinking organisation in London that values innovation and collaboration, making it an excellent employer for Python Developers. With a competitive salary of up to £45,000, flexible working options, and a supportive team environment, you'll have the opportunity to grow your skills through ongoing training while contributing to impactful data-driven projects. Embrace a culture that prioritises modern technology and employee development, ensuring you thrive in your career.

Contact Detail:

Intelligent Resourcing Solutions Ltd Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land Python Developer (PySpark, AWS Glue)

✨Tip Number 1

Familiarise yourself with the latest features and best practices of PySpark and AWS Glue. This will not only boost your confidence during interviews but also demonstrate your commitment to staying updated in a rapidly evolving field.

✨Tip Number 2

Engage with the data engineering community through forums, webinars, or local meetups. Networking can lead to valuable insights and connections that might help you land the job.

✨Tip Number 3

Prepare to discuss specific projects where you've implemented ETL processes using PySpark and AWS Glue. Being able to share real-world examples will set you apart from other candidates.

✨Tip Number 4

Showcase your collaboration skills by discussing how you've worked with cross-functional teams in the past. Highlighting your ability to communicate effectively with data engineers and analysts will resonate well with the hiring team.

We think you need these skills to ace Python Developer (PySpark, AWS Glue)

Python Development
PySpark
AWS Glue
AWS Services (S3, Athena, Lambda)
ETL Processes
Data Pipeline Design
Data Ingestion Automation
Performance Optimisation
Clean Code Practices
Data Quality Monitoring
Version Control (Git)
CI/CD Familiarity
Structured and Semi-Structured Data Handling
Communication Skills
Collaboration Skills
Data Lake Architecture
SQL Knowledge
Data Modelling Principles
Containerisation Tools (Docker)
Orchestration Tools (Airflow)
Data Governance Understanding

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights your experience with Python, PySpark, and AWS Glue. Use specific examples of projects where you've designed or maintained data pipelines to demonstrate your skills.

Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention how your background aligns with their needs, particularly in data engineering and cloud technologies.

Showcase Relevant Projects: If you have worked on relevant projects, include them in your application. Describe your role, the technologies used, and the impact of your work, especially in optimising data processing jobs.

Highlight Collaboration Skills: Since the role involves working closely with data engineers and analysts, emphasise your communication and collaboration skills. Provide examples of how you've successfully worked in teams to achieve project goals.

How to prepare for a job interview at Intelligent Resourcing Solutions Ltd

✨Showcase Your Technical Skills

Be prepared to discuss your experience with Python, PySpark, and AWS Glue in detail. Bring examples of projects you've worked on that demonstrate your ability to design and maintain data pipelines, as well as your understanding of ETL processes.

✨Understand the Company’s Data Needs

Research the company’s data architecture and the types of projects they are involved in. This will help you tailor your answers to show how your skills can directly benefit their operations and contribute to high-impact projects.

✨Prepare for Problem-Solving Questions

Expect to face technical challenges during the interview. Practice solving common data processing problems using PySpark and be ready to explain your thought process clearly. This will showcase your analytical skills and ability to troubleshoot.
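For instance, a classic warm-up problem is keeping only the most recent record per key. In PySpark this would typically be a window partitioned by key and ordered by timestamp descending; the plain-Python sketch below (field names made up for illustration) shows the underlying logic you should be able to explain:

```python
def latest_per_key(rows: list[dict], key: str, ts: str) -> list[dict]:
    """Keep only the most recent row per key.

    In PySpark the equivalent is a Window.partitionBy(key)
    ordered by the timestamp descending, taking the first row;
    here plain Python demonstrates the same idea in one pass.
    """
    best = {}
    for row in rows:
        k = row[key]
        if k not in best or row[ts] > best[k][ts]:
            best[k] = row
    return list(best.values())

rows = [
    {"id": "a", "ts": 1, "v": 10},
    {"id": "a", "ts": 3, "v": 30},
    {"id": "b", "ts": 2, "v": 20},
]
print(latest_per_key(rows, "id", "ts"))
```

Being able to state the complexity (one pass, O(n) time, O(distinct keys) memory) and contrast it with the shuffle a Spark window incurs is exactly the kind of reasoning interviewers look for.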

✨Emphasise Collaboration and Communication

Since the role involves working closely with data engineers, analysts, and architects, highlight your teamwork and communication skills. Share examples of how you've successfully collaborated on projects in the past to achieve common goals.
