AWS Data Engineer

Full-Time · £75,000 / year · Hybrid working (London)

At a Glance

  • Tasks: Join a dynamic team to enhance data platforms and develop advanced data pipelines.
  • Company: A leader in consumer behaviour analytics, driving innovation in data solutions.
  • Benefits: Enjoy hybrid working, competitive salary, and exciting referral schemes.
  • Why this job: Challenge yourself with cutting-edge technologies in a collaborative environment.
  • Qualifications: Proven AWS experience, strong SQL skills, and coding expertise in Python required.
  • Other info: Candidates must have the right to live and work in the UK; sponsorship not available.

A leader in consumer behaviour analytics seeks a driven AWS Data Engineer to guide its data infrastructure architecture, working alongside a small, talented team of engineers, analysts, and data scientists. In this role, you will enhance the data platform, develop advanced data pipelines, and integrate cutting-edge technologies such as DataOps and Generative AI, including Large Language Models (LLMs).

This is an exciting opportunity for someone looking to challenge themselves in a collaborative environment, working with diverse data and cutting-edge technologies. You will have proven experience developing AWS Cloud platforms end to end, orchestrating data using Dagster or a similar tool, and coding in Python and SQL.

Key Responsibilities:
  • Develop and optimise ETL/ELT processes to support data transformation and integrity for analytics.
  • Explore and evaluate new data warehousing solutions, including Snowflake, to improve data accessibility and scalability.
  • Partner with product and engineering teams to define data architecture and best practices for reporting.
  • Ensure data security, compliance, and governance across data systems.
  • Implement and maintain CI/CD pipelines to automate data workflows and enhance system reliability.
  • Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, and redesigning infrastructure for greater scalability and performance.
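The ETL/ELT work described above follows a familiar shape: extract raw records, clean and aggregate them, then load the result into a warehouse table. A minimal sketch in Python with Pandas (the table name, columns, and the in-memory SQLite target are illustrative assumptions; in this role the source might be S3 via Glue or Athena, and the target Snowflake or RDS):

```python
import sqlite3
import pandas as pd

# Illustrative raw events; in practice these might land in S3 and be read via Athena or Glue.
raw = pd.DataFrame({
    "user_id": [1, 1, 2, None],
    "amount": ["10.5", "3.0", "7.25", "1.0"],
    "ts": ["2024-01-01", "2024-01-02", "2024-01-02", "2024-01-03"],
})

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Clean and aggregate: drop rows missing a user, coerce types, sum spend per user."""
    df = df.dropna(subset=["user_id"]).copy()
    df["user_id"] = df["user_id"].astype(int)
    df["amount"] = pd.to_numeric(df["amount"])
    return df.groupby("user_id", as_index=False)["amount"].sum()

# Load into a warehouse table (SQLite stands in for Snowflake/RDS here).
conn = sqlite3.connect(":memory:")
transform(raw).to_sql("user_spend", conn, index=False)
print(conn.execute("SELECT user_id, amount FROM user_spend ORDER BY user_id").fetchall())
# → [(1, 13.5), (2, 7.25)]
```

The same clean-then-aggregate step could equally be pushed down into SQL; keeping transformations in one well-tested function (or one SQL model) is what preserves the data integrity the role asks for.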
Essential Skills and Experience:
  • Hands-on experience with AWS services, including Lambda, Glue, Athena, RDS, and S3.
  • Strong SQL skills for data transformation, cleaning, and loading.
  • Strong coding experience with Python and Pandas.
  • Experience with any flavour of data pipeline and workflow management tools: Dagster, Celery, Airflow, etc.
  • Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Strong communication skills to collaborate with remote teams (US, Canada).
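At its core, the dependency and workload management listed above means running pipeline tasks in an order that respects their dependencies, which is exactly what orchestrators like Dagster and Airflow automate. A tool-agnostic sketch with hypothetical task names, using only the standard library:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline steps: each task maps to the set of tasks it depends on.
deps = {
    "extract": set(),
    "clean": {"extract"},
    "aggregate": {"clean"},
    "load": {"aggregate"},
    "report": {"load"},
}

def run_order(dependencies: dict[str, set[str]]) -> list[str]:
    """Return an execution order that satisfies every dependency (raises on cycles)."""
    return list(TopologicalSorter(dependencies).static_order())

print(run_order(deps))  # → ['extract', 'clean', 'aggregate', 'load', 'report']
```

Real orchestrators add scheduling, retries, and observability on top of this ordering, but being able to reason about a pipeline as a dependency graph is the transferable skill.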
Nice to Have:
  • Familiarity with LLMs including fine-tuning and RAG.
  • Knowledge of Statistics.
  • Knowledge of DataOps best practices, including CI/CD for data workflows.

Additional Requirements: Candidates must have an existing and future right to live and work in the UK. Sponsorship at any point is not available.

If this sounds like the role for you, please apply today! Alternatively, you can refer a friend or colleague through our referral schemes.

AWS Data Engineer employer: Datatech

As a leader in consumer behaviour analytics, our company offers an exceptional work environment that fosters collaboration and innovation. With a hybrid working model based in London, employees enjoy the flexibility of working from home while being part of a talented team dedicated to advancing data infrastructure. We prioritise employee growth through continuous learning opportunities and encourage the exploration of cutting-edge technologies, making us an attractive employer for those seeking meaningful and rewarding careers.

Contact Details:

Datatech Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the AWS Data Engineer role

✨Tip Number 1

Familiarise yourself with the specific AWS services mentioned in the job description, such as Lambda, Glue, and S3. Having hands-on experience with these tools will not only boost your confidence but also demonstrate your capability to potential employers.

✨Tip Number 2

Engage with online communities or forums related to AWS Data Engineering. Networking with professionals in the field can provide insights into the latest trends and technologies, which could be beneficial during interviews.

✨Tip Number 3

Consider working on personal projects that involve building data pipelines using Python and SQL. Showcasing these projects in discussions or interviews can highlight your practical skills and problem-solving abilities.

✨Tip Number 4

Brush up on your communication skills, especially for collaborating with remote teams. Being able to articulate your ideas clearly will be crucial in a hybrid working environment, so practice discussing technical concepts in simple terms.

We think you need these skills to ace the AWS Data Engineer role

  • AWS Services (Lambda, Glue, Athena, RDS, S3)
  • SQL for Data Transformation
  • Python Programming
  • Pandas Library
  • Data Pipeline Management (Dagster, Celery, Airflow)
  • ETL/ELT Process Development
  • Data Warehousing Solutions (Snowflake)
  • Data Security and Compliance
  • CI/CD Pipeline Implementation
  • Data Architecture Best Practices
  • Cross-Functional Team Collaboration
  • Strong Communication Skills
  • DataOps Best Practices
  • Familiarity with Large Language Models (LLMs)

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights your experience with AWS services, Python, SQL, and any relevant data pipeline tools like Dagster. Use specific examples to demonstrate your skills in developing ETL/ELT processes and working with cross-functional teams.

Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention how your background aligns with their needs, particularly your experience with data architecture and CI/CD pipelines. Show that you understand the importance of data security and governance.

Showcase Relevant Projects: If you have worked on projects involving data transformation, data warehousing solutions, or automation of data workflows, be sure to include these in your application. Highlight any experience with Generative AI or LLMs if applicable.

Proofread Your Application: Before submitting, carefully proofread your CV and cover letter for any spelling or grammatical errors. A polished application reflects your attention to detail, which is crucial for a role that involves data integrity and compliance.

How to prepare for a job interview at Datatech

✨Showcase Your AWS Expertise

Make sure to highlight your hands-on experience with AWS services like Lambda, Glue, and S3. Be prepared to discuss specific projects where you utilised these tools, as this will demonstrate your practical knowledge and ability to contribute to the team.

✨Demonstrate Your Coding Skills

Since strong coding experience in Python and SQL is essential, be ready to talk about your previous coding projects. You might even want to prepare for a coding challenge or technical questions that assess your problem-solving skills using these languages.

✨Discuss Data Pipeline Management

Familiarise yourself with data pipeline and workflow management tools like Dagster or Airflow. Be prepared to explain how you've used these tools in past roles to optimise ETL/ELT processes and improve data accessibility.

✨Communicate Effectively

Strong communication skills are crucial, especially when collaborating with remote teams. Practice articulating your thoughts clearly and concisely, and be ready to provide examples of how you've successfully worked with cross-functional teams in dynamic environments.
