At a Glance
- Tasks: Design and implement ETL pipelines while collaborating with data teams.
- Company: Athsai is a leading technical recruitment firm focused on the UK, Europe, and California.
- Benefits: Enjoy remote work flexibility and a competitive salary of £75k.
- Why this job: Join us to make a real impact on data initiatives and enhance your tech skills.
- Qualifications: Experience with big data technologies and programming in Python, Scala, and SQL required.
- Other info: Eligibility for SC clearance is necessary for this role.
The predicted salary is between £52,500 and £105,000 per year.
About the Role
We are seeking a talented and driven Data Engineer to join a forward-thinking organisation where data sits at the heart of decision-making. You will design and build scalable, high-performance data pipelines that power analytics and business insights across the company.
This is an exciting opportunity to work with modern cloud technologies, contribute to mission-critical systems, and make a real impact in a fast-paced, collaborative environment.
What You’ll Be Doing
- Designing and implementing robust ETL pipelines to move and transform data at scale (a brief sketch follows this list)
- Building cloud-native solutions on AWS to ensure reliability and performance
- Working with PySpark, Python, and SQL to process large datasets efficiently
- Orchestrating workflows using Apache Airflow
- Leveraging AWS services such as S3, RDS, Redshift, and Lambda
- Deploying infrastructure using Terraform and Infrastructure-as-Code best practices
- Automating releases through CI/CD pipelines with GitHub Actions
- Collaborating closely with engineering and analytics teams to deliver high-quality solutions
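To give a flavour of the day-to-day work described above, here is a minimal sketch of a daily ETL job: a PySpark transform that reads raw data from S3 and writes curated Parquet, orchestrated by an Apache Airflow DAG. The bucket names, column names, and schedule are illustrative assumptions, not details taken from the role.

```python
# Minimal sketch of a daily PySpark ETL job orchestrated with Apache Airflow.
# Bucket names, columns, and schedule are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_etl():
    # Import PySpark inside the task so the DAG file parses on schedulers
    # that do not have pyspark installed.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_orders_etl").getOrCreate()

    # Extract: read raw CSV files from S3 (hypothetical bucket and path)
    orders = spark.read.option("header", True).csv("s3a://example-raw-bucket/orders/")

    # Transform: keep completed orders and aggregate revenue per day
    daily_revenue = (
        orders.filter(F.col("status") == "completed")
        .groupBy("order_date")
        .agg(F.sum(F.col("amount").cast("double")).alias("revenue"))
    )

    # Load: write Parquet for downstream analytics (e.g. querying from Redshift)
    daily_revenue.write.mode("overwrite").parquet(
        "s3a://example-curated-bucket/daily_revenue/"
    )
    spark.stop()


# Airflow 2.4+ style DAG definition; a single task runs the PySpark job daily
with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="run_etl", python_callable=run_etl)
```

In practice, a pipeline like this would sit alongside Terraform-managed infrastructure and be released through a GitHub Actions CI/CD workflow, as the responsibilities above describe.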
Required Skills
- Strong experience with PySpark and AWS
- Proven background building and maintaining ETL pipelines
- Solid programming ability in Python, SQL, and Spark
- Hands-on experience with Apache Airflow
- Deep understanding of AWS data services
- Experience using Terraform for cloud deployments
- CI/CD workflow experience using GitHub Actions
Contact Detail:
Athsai Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land this Data Engineer role
✨Tip Number 1
Make sure to showcase your hands-on experience with big data technologies during any conversations. Be ready to discuss specific projects where you've designed and implemented ETL pipelines, as this will demonstrate your practical knowledge.
✨Tip Number 2
Familiarise yourself with the specific ETL and orchestration tools mentioned in the job description, such as PySpark and Apache Airflow. Being able to speak confidently about these tools and how you've used them in past roles can set you apart from other candidates.
✨Tip Number 3
Highlight your collaborative skills by preparing examples of how you've worked with stakeholders, data architects, and data scientists in previous roles. This will show that you're not just technically skilled but also a team player.
✨Tip Number 4
If you have experience with AWS services like S3, Lambda, and Redshift, be sure to mention it in discussions. Understanding cloud-native services is crucial for this role, so demonstrating your expertise here will be beneficial.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your hands-on experience with big data technologies. Emphasise your skills in ETL tools like PySpark and Apache Airflow, as well as your programming expertise in Python, Scala, and SQL.
Craft a Strong Cover Letter: In your cover letter, express your enthusiasm for the role and how your experience aligns with the company's needs. Mention specific projects where you've designed or implemented ETL pipelines and your familiarity with AWS services.
Showcase Collaborative Skills: Since collaboration is key in this role, include examples in your application that demonstrate your ability to work effectively with stakeholders, data architects, and data scientists to enhance data quality.
Highlight SC Clearance Eligibility: If you have SC clearance or are eligible for it, make sure to mention this in your application. This can be a significant advantage and shows your readiness for the role.
How to prepare for a job interview at Athsai
✨Showcase Your Technical Skills
Be prepared to discuss your hands-on experience with big data technologies. Highlight specific projects where you've implemented ETL pipelines using tools like Spark and Airflow. This will demonstrate your technical expertise and problem-solving abilities.
✨Demonstrate Collaboration
Since the role involves working closely with stakeholders, data architects, and data scientists, be ready to share examples of how you've successfully collaborated in past projects. Emphasise your communication skills and ability to work as part of a team.
✨Familiarise Yourself with AWS Services
Given the importance of AWS in this role, make sure you understand key services like S3, RDS, Redshift, and Lambda. Be prepared to discuss how you've used these services in previous roles to optimise data processes.
✨Prepare for SC Clearance Questions
As eligibility for SC clearance is required, be ready to discuss your background and any existing clearances. Understand the implications of this requirement and be honest about your eligibility status during the interview.