
AWS Data Engineer

Stretford · Full-Time · £43,200 – £72,000 / year (est.)

At a Glance

  • Tasks: Join our Data Engineering team to develop and maintain innovative data pipelines.
  • Company: We're a leading market research company focused on cutting-edge data solutions.
  • Benefits: Enjoy 25 days paid holiday, life insurance, performance bonuses, and a hybrid work model.
  • Why this job: Tackle complex data challenges in a fun environment using the latest Big Data technologies.
  • Qualifications: Strong AWS and PySpark experience; knowledge of SQL, Python, and agile practices required.
  • Other info: You'll receive one-on-one coaching and support for training programs.

The predicted salary is between £43,200 and £72,000 per year.

A world market research company is looking for a passionate Data Engineer to join their team. You will be working in the Data Engineering team, whose main function is developing, maintaining and improving the end-to-end data pipeline: real-time data processing; extract, transform, load (ETL) jobs; artificial intelligence; and data analytics on a complex and large dataset. You must have strong AWS and PySpark experience. Your role will primarily involve building PySpark or Scala data transformations and maintaining data pipelines on AWS infrastructure, developing innovative solutions to effectively scale and maintain the data platform (an illustrative PySpark sketch of this kind of work appears after the requirements below). You will be working on complex data problems in a challenging and fun environment, using some of the latest open-source Big Data technologies like Apache Spark, as well as Amazon Web Services technologies including Elastic MapReduce (EMR), Athena and Lambda, to develop scalable data solutions. The role is hybrid.

Great benefits

  • 25 days paid holiday plus bank holidays
  • Purchase/sale of up to 5 leave days per year, after 2 years' service
  • Life insurance
  • Workplace pension with employer contribution
  • Performance-based bonus scheme
  • Informal dress code
  • Cycle to work scheme
  • Branded company merchandise
  • New company laptop
  • One-to-one learning and development coaching sessions
  • Support and budget available for training programmes
  • 'Giving back' to charities

Ideal Data Engineer

  • Knowledge of serverless technologies, frameworks and best practices.
  • Experience using AWS CloudFormation or Terraform for infrastructure automation.
  • Knowledge of Scala or PySpark, or another language such as Java or C#.
  • SQL or Python development experience.
  • High-quality coding and testing practices.
  • Willingness to learn new technologies and methodologies.
  • Knowledge of agile software development practices, including continuous integration, automated testing, and working with software engineering requirements and specifications.
  • Good interpersonal skills and a positive attitude; willing to help other members of the team.
  • Experience debugging and dealing with failures on business-critical systems.

Preferable:

  • Exposure to Apache Spark, Apache Trino, or another big data processing system.
  • Knowledge of streaming data principles and best practices.
  • Understanding of database technologies and standards.
  • Experience working on large and complex datasets.
  • Exposure to data engineering practices used in machine learning training and inference.
  • Experience using Git, Jenkins and other CI/CD tools.
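To give a flavour of the day-to-day work described above, here is a minimal PySpark ETL sketch. It is illustrative only: the bucket paths, column names and aggregation are assumptions, not details taken from the role.

```python
# A minimal PySpark batch ETL sketch of the kind of transformation work the
# posting describes. Paths and column names are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw events from S3 (hypothetical bucket and layout).
events = spark.read.parquet("s3://example-bucket/raw/events/")

# Transform: drop malformed rows and aggregate events per day and type.
daily_counts = (
    events
    .filter(F.col("event_type").isNotNull())
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("events"))
)

# Load: write partitioned results back to S3 for downstream tools such as Athena.
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_event_counts/"
)
```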

AWS Data Engineer employer: Ikhoi Recruitment

Join a leading world market research company that values innovation and collaboration, offering a dynamic work culture where your contributions as an AWS Data Engineer will directly impact the development of cutting-edge data solutions. Enjoy a comprehensive benefits package including generous paid leave, life insurance, and a performance-based bonus scheme, alongside opportunities for personal growth through one-on-one coaching and training support. With a hybrid work model and a focus on the latest Big Data technologies, this role provides a unique chance to tackle complex data challenges in a fun and supportive environment.

Contact Details:

Ikhoi Recruitment Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the AWS Data Engineer role

✨Tip Number 1

Familiarize yourself with the specific AWS services mentioned in the job description, such as Elastic MapReduce and Lambda. Having hands-on experience or projects that showcase your skills with these technologies can set you apart from other candidates.

✨Tip Number 2

Engage with the Data Engineering community online, especially around Pyspark and Scala. Participating in forums, contributing to open-source projects, or even sharing your own insights on platforms like GitHub can demonstrate your passion and expertise.

✨Tip Number 3

Prepare to discuss your experience with infrastructure automation tools like AWS CloudFormation or Terraform. Be ready to share specific examples of how you've used these tools to streamline processes or improve efficiency in past projects.
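If you want a concrete artefact to anchor that discussion, a minimal sketch like the one below, written with the AWS CDK for Python (which synthesizes CloudFormation templates), illustrates the infrastructure-as-code idea; the stack and bucket names are hypothetical.

```python
# A minimal infrastructure-as-code sketch using the AWS CDK for Python,
# which compiles to a CloudFormation template. Stack and bucket names are
# hypothetical examples, not details from the posting.
import aws_cdk as cdk
from aws_cdk import aws_s3 as s3
from constructs import Construct

class DataPlatformStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # A versioned S3 bucket for pipeline outputs (illustrative resource).
        s3.Bucket(self, "PipelineOutputBucket", versioned=True)

app = cdk.App()
DataPlatformStack(app, "DataPlatformStack")
app.synth()  # emits the CloudFormation template to cdk.out/
```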

✨Tip Number 4

Showcase your problem-solving skills by preparing for technical discussions. Think of complex data challenges you've faced and how you approached them, especially in relation to big data processing and real-time data analytics.

We think you need these skills to ace the AWS Data Engineer role

AWS CloudFormation
Terraform
PySpark
Scala
SQL
Python
Apache Spark
Elastic MapReduce
Athena
Lambda
Serverless technologies
Agile software development
Continuous integration
Automated testing
Debugging
Interpersonal skills
Data Engineering practices
Machine Learning
Git
Jenkins

Some tips for your application 🫡

Understand the Role: Make sure you fully understand the responsibilities and requirements of the AWS Data Engineer position. Familiarize yourself with key technologies mentioned in the job description, such as AWS services, PySpark, and Scala.

Tailor Your CV: Customize your CV to highlight relevant experience and skills that align with the job description. Emphasize your knowledge of AWS, data pipelines, and any experience with big data technologies like Apache Spark.

Craft a Compelling Cover Letter: Write a cover letter that showcases your passion for data engineering and your ability to solve complex data problems. Mention specific projects or experiences that demonstrate your expertise in AWS and PySpark.

Highlight Soft Skills: In addition to technical skills, emphasize your interpersonal skills and positive attitude. The company values teamwork and a willingness to help others, so make sure to convey these traits in your application.

How to prepare for a job interview at Ikhoi Recruitment

✨Showcase Your AWS Expertise

Make sure to highlight your experience with AWS services, especially those mentioned in the job description like Elastic MapReduce and Lambda. Be prepared to discuss specific projects where you utilized these technologies.
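As a concrete refresher, the following minimal boto3 sketch runs a query through Athena, one of the AWS services named in the posting; the region, database, query and output location are hypothetical assumptions.

```python
# A minimal boto3 sketch of running a query through Athena. The region,
# database, table and S3 output location are hypothetical examples.
import time
import boto3

athena = boto3.client("athena", region_name="eu-west-2")

response = athena.start_query_execution(
    QueryString="SELECT event_type, COUNT(*) AS events FROM events GROUP BY event_type",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query finishes, then print the first page of results.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```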

✨Demonstrate Your PySpark Skills

Since strong PySpark experience is a must, come ready to talk about your past work with data transformations using PySpark or Scala. If possible, share examples of complex data problems you've solved using these tools.

✨Discuss Your Approach to Data Pipelines

Be ready to explain how you develop, maintain, and improve data pipelines. Discuss your understanding of real-time data processing and any relevant frameworks or best practices you've implemented in previous roles.
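If it helps to have a concrete example in mind, the following self-contained Spark Structured Streaming sketch illustrates the real-time concepts mentioned above (windowed aggregation with a watermark for late data). The built-in rate source stands in for a real stream such as Kinesis, and all parameters are illustrative.

```python
# A self-contained Spark Structured Streaming sketch: a windowed count over
# an unbounded source. The built-in "rate" source stands in for a real
# stream; window sizes and rates are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("streaming-example").getOrCreate()

# The rate source emits (timestamp, value) rows continuously.
stream = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# Count events per 30-second window, tolerating 1 minute of late data.
counts = (
    stream
    .withWatermark("timestamp", "1 minute")
    .groupBy(F.window("timestamp", "30 seconds"))
    .count()
)

query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination(60)  # run for about a minute (demo only)
query.stop()
```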

✨Emphasize Team Collaboration

The company values good interpersonal skills and a positive attitude. Share examples of how you've worked effectively in teams, helped colleagues, or contributed to a collaborative environment in your past experiences.

Similar positions in other companies
Data Infrastructure Engineer

Formula Recruitment

Manchester Full-Time
AWS Data Engineer

Adecco

Full-Time · £60,000 – £100,000 / year (est.)