Freelance Data Engineer (AWS / PySpark) - Trading

Freelance · £36,000–£60,000 / year (est.) · Home office possible

At a Glance

  • Tasks: Enhance and maintain a cloud-based data platform using Python and PySpark.
  • Company: Dynamic trading company based in London with a focus on innovation.
  • Benefits: 100% remote work, flexible hours, and long-term contract opportunities.
  • Other info: Collaborate with international teams and grow your skills in a supportive environment.
  • Why this job: Join a cutting-edge team and make an impact in the trading industry.
  • Qualifications: Experience in data engineering, Python, PySpark, and AWS required.

The predicted salary is between £36,000 and £60,000 per year.

We are looking for a Data Engineer to join the data platform team of a trading company based in London. You will work alongside the Lead Data Engineer already in place and focus mainly on enhancing the data platform. The environment is strongly cloud-oriented (AWS), with Python and PySpark at the core of all data pipelines and processing. You will help keep mission-critical trading and risk data reliable, performant, and available to users across the business.

Key Responsibilities

  • Monitor and operate existing data pipelines and jobs (Python / PySpark / AWS).
  • Perform bug fixes, small evolutions, and refactoring on existing codebases.
  • Work closely with the Lead Data Engineer and business stakeholders to understand issues and prioritize fixes.
  • Maintain documentation for pipelines, workflows, and operational procedures.
  • Contribute to the development of new data ingestion and transformation pipelines in Python / PySpark on AWS.
  • Implement new features and small data products on top of the existing data platform.
  • Participate in code reviews and help improve engineering best practices (testing, CI/CD, observability).

Required Skills & Experience

  • Several years of experience as a Data Engineer (or similar role).
  • Strong hands-on skills in Python for data processing (ETL/ELT, APIs, automation).
  • Solid experience with PySpark in a production context (batch and/or streaming).
  • Professional experience on AWS (data-oriented services such as S3, Glue, EMR, Lambda, ECS, Redshift, etc.).
  • Good knowledge of SQL and relational databases.
  • Experience operating data pipelines in production: monitoring, alerting, troubleshooting, optimisation.
  • Comfortable working in remote, international teams, communicating in English (written and spoken).
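The operational experience listed above (monitoring, alerting, troubleshooting) often comes down to disciplined retry and alerting logic around each pipeline step. As a stdlib-only sketch, with function and parameter names invented for illustration (a real system would raise an alert, e.g. via a CloudWatch alarm, rather than just log):

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retries(step, max_attempts=3, backoff_seconds=1.0):
    """Run a pipeline step, retrying on failure with exponential backoff.

    On final failure, log an error (where a production system would page
    or trigger an alert) and re-raise the exception.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception as exc:
            if attempt == max_attempts:
                log.error("step failed after %d attempts: %s", attempt, exc)
                raise
            log.warning("attempt %d failed (%s); retrying", attempt, exc)
            time.sleep(backoff_seconds * 2 ** (attempt - 1))

# Usage: a flaky step that succeeds on the second attempt.
calls = {"n": 0}

def flaky_step():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient S3 read error")
    return "ok"

result = run_with_retries(flaky_step, backoff_seconds=0.01)
```

The same pattern extends naturally to per-step metrics and dead-letter handling once a pipeline runs unattended.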

Nice to Have

  • Experience in trading, financial markets, or commodities environments.
  • Knowledge of data orchestration tools (e.g. Airflow, Step Functions) and CI/CD practices.
  • Familiarity with microservices / APIs (FastAPI or similar) and containerization (Docker).
  • Exposure to data governance, security, and role-based access control in cloud environments.

Soft Skills

  • Strong sense of ownership and reliability on production systems.
  • Pragmatic and solution-oriented, able to work on both small fixes and incremental improvements.
  • Good communication skills with both technical and non-technical stakeholders.
  • Team player, comfortable pairing with a more senior engineer and following existing architecture and standards.

Practical Information

  • Location: 100% Remote (European time zones).
  • Environment: Data platform for trading & risk use cases.
  • Start date: ASAP.
  • Contract type: Long-term mission / consulting.

Freelance Data Engineer (AWS / PySpark) - Trading employer: Darwin Partners

Join a dynamic trading company that values innovation and collaboration, offering a fully remote work environment tailored for European time zones. As a Freelance Data Engineer, you'll have the opportunity to enhance a cutting-edge data platform while working alongside experienced professionals, fostering both personal and professional growth. With a strong emphasis on cloud technologies and a culture that encourages continuous learning, this role provides a meaningful chance to contribute to mission-critical data solutions in the fast-paced world of trading.

Contact Detail:

Darwin Partners Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land Freelance Data Engineer (AWS / PySpark) - Trading

✨Tip Number 1

Network like a pro! Reach out to your connections in the data engineering field, especially those who work with AWS and PySpark. A friendly chat can lead to opportunities that aren’t even advertised yet.

✨Tip Number 2

Show off your skills! Create a portfolio showcasing your projects, especially those involving data pipelines and cloud services. This gives potential employers a taste of what you can do and sets you apart from the crowd.

✨Tip Number 3

Prepare for interviews by brushing up on your technical knowledge and soft skills. Be ready to discuss your experience with Python, PySpark, and AWS, and don’t forget to highlight your problem-solving abilities!

✨Tip Number 4

Apply through our website! We’re always on the lookout for talented individuals like you. Plus, it’s a great way to ensure your application gets the attention it deserves.

We think you need these skills to ace Freelance Data Engineer (AWS / PySpark) - Trading

Data Engineering
Python
PySpark
AWS
ETL/ELT
APIs
SQL
Relational Databases
Data Pipeline Monitoring
Troubleshooting
Optimisation
Data Orchestration Tools
CI/CD Practices
Microservices
Containerization

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights your experience with Python, PySpark, and AWS. We want to see how your skills align with the key responsibilities mentioned in the job description, so don’t hold back on showcasing relevant projects!

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re excited about this role and how your background makes you a perfect fit. We love seeing genuine enthusiasm for data engineering and trading environments.

Showcase Your Problem-Solving Skills: In your application, mention specific examples where you've tackled challenges in data pipelines or optimised processes. We appreciate candidates who can demonstrate their ownership and reliability in production systems.

Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for this exciting opportunity. Don’t miss out!

How to prepare for a job interview at Darwin Partners

✨Know Your Tech Stack

Make sure you brush up on your Python and PySpark skills before the interview. Be ready to discuss specific projects where you've used these technologies, especially in a production context. Highlight any experience with AWS services like S3 or Glue, as they'll want to see how you can enhance their data platform.

✨Showcase Your Problem-Solving Skills

Prepare examples of how you've monitored and operated data pipelines in the past. Think about times when you had to troubleshoot issues or optimise performance. Being able to articulate your thought process during these situations will demonstrate your reliability and ownership of production systems.

✨Communicate Effectively

Since you'll be working with both technical and non-technical stakeholders, practice explaining complex concepts in simple terms. This will show that you can bridge the gap between different teams. Also, be prepared to discuss how you maintain documentation for workflows and operational procedures.

✨Be a Team Player

Emphasise your ability to collaborate with others, especially with the Lead Data Engineer. Share experiences where you've paired with senior engineers or contributed to code reviews. This will highlight your willingness to learn and improve engineering best practices within the team.
