At a Glance
- Tasks: As an AWS Data Engineer, you'll design and maintain data solutions on the AWS Cloud.
- Company: Join IBM, a leader in tech innovation, working with clients globally to drive change.
- Benefits: Enjoy flexible work options, generous leave policies, and a supportive culture focused on growth.
- Why this job: Be part of a collaborative environment that values creativity and offers real impact on clients' journeys.
- Qualifications: A Bachelor's degree and hands-on experience with AWS tools are essential for this role.
- Other info: This position is full-time and requires continuous UK residency for the last 10 years.
The predicted salary is between £28,000 and £32,000 per year.
In this role, you’ll work in one of our IBM Consulting Client Innovation Centres (Delivery Centres), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centres offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You’ll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.
Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you’ll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in ground-breaking impact for a wide network of clients. Our culture of evolution and empathy centres on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.
We Offer
- Regular promotion and progression opportunities to ensure you can drive and develop your career with us
- Feedback and checkpoints throughout the year; no one-off annual reviews here
- A multitude of training opportunities, from classroom to e-learning, mentoring and coaching programmes, as well as the chance to gain industry-recognised certifications
- Diversity & Inclusion as an essential and authentic component of our culture through our policies and processes, as well as our Employee Champion teams and support networks
- A culture where your ideas for growth and innovation are always welcome
- Internal recognition programs for peer-to-peer appreciation as well as from manager to employees
- Tools and policies to support your work-life balance, from flexible working approaches and sabbatical programmes to one month of paid paternity leave, 16 weeks of fully paid maternity leave, and an innovative maternity returners scheme
- More traditional benefits, such as 25 days' holiday, a gym discount scheme, online shopping discounts, an Employee Assistance Programme, and a group personal pension plan to which we contribute an additional 5% of your base salary each month to save for your future
Your Role And Responsibilities
A data engineer with expertise in the AWS toolset advises on, develops, and maintains data engineering solutions on the AWS Cloud. In this role, you will:
- Design, build, and operate batch and real-time data pipelines using AWS services such as Amazon EMR, AWS Glue, the Glue Data Catalog, and Amazon Kinesis
- Create data layers on Amazon Redshift, Amazon Aurora, and Amazon DynamoDB
- Migrate data using AWS DMS, working across AWS Data Platform components including S3, Redshift, Redshift Spectrum, AWS Glue with Spark, AWS Glue with Python, Lambda functions with Python, the AWS Glue Data Catalog, and AWS Glue DataBrew
- Develop batch and real-time data pipelines for data warehouses and data lakes, utilising Amazon Kinesis and Amazon Managed Streaming for Apache Kafka (MSK)
- Use open-source technologies such as Apache Airflow, dbt, and Spark with Python or Scala on the AWS platform
- Schedule and manage data services on the AWS platform, ensuring seamless integration and operation of data engineering solutions
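As a flavour of the day-to-day work described above, here is a minimal, hypothetical sketch of a record-normalisation step of the kind that might run inside an AWS Glue Python shell job or a Lambda consumer of a Kinesis stream. The field names, timestamp format, and cleaning rules are illustrative assumptions, not details taken from this posting:

```python
import json
from datetime import datetime, timezone

def transform_record(raw: bytes) -> dict:
    """Parse one raw JSON event (e.g. decoded from a Kinesis batch)
    and normalise it into a row suitable for a warehouse table."""
    event = json.loads(raw)
    return {
        "event_id": event["id"],
        "user_id": event.get("user_id", "unknown"),
        # Normalise epoch-millisecond timestamps to ISO-8601 UTC strings
        "event_time": datetime.fromtimestamp(
            event["ts_ms"] / 1000, tz=timezone.utc
        ).isoformat(),
        # Coerce string amounts to rounded floats
        "amount_gbp": round(float(event.get("amount", 0)), 2),
    }

# Example micro-batch, shaped like records a Kinesis consumer might hand over
batch = [b'{"id": "e1", "user_id": "u42", "ts_ms": 1700000000000, "amount": "19.99"}']
rows = [transform_record(r) for r in batch]
```

In a real pipeline this transform would sit between an ingestion stage (Kinesis, MSK) and a load stage (Redshift, S3), with orchestration handled by a tool such as Apache Airflow.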
Preferred Education
Bachelor's Degree
Required Technical And Professional Expertise
Current hands-on skills: AWS (Lambda, S3, DynamoDB, etc.), CloudFormation, and JavaScript.
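To illustrate how the Lambda, S3, and DynamoDB skills listed above typically fit together, here is a minimal, hypothetical Lambda handler that turns S3 object-created event records into items destined for a DynamoDB audit table. The attribute names and event shape are illustrative assumptions; a real deployment would write the items with `put_item` via boto3:

```python
def lambda_handler(event, context=None):
    """Hypothetical handler for an S3 object-created notification:
    build one DynamoDB-style item per record in the event batch.
    (A deployed version would call table.put_item via boto3.)"""
    items = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        items.append({
            "pk": s3["bucket"]["name"],       # partition key: bucket name
            "sk": s3["object"]["key"],        # sort key: object key
            "size_bytes": s3["object"].get("size", 0),
        })
    return {"batch_size": len(items), "items": items}

# Sample event mirroring the S3 notification structure Lambda receives
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "raw-data"},
                "object": {"key": "2024/01/orders.json", "size": 1024}}}
    ]
}
result = lambda_handler(sample_event)
```

Infrastructure like this (the bucket, the function, the table) would usually be declared in a CloudFormation template rather than created by hand, which is why that skill appears alongside the services themselves.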
As an equal opportunities employer, we welcome applications from individuals of all backgrounds. However, to be eligible for this role, you must have a valid right to work in the UK. Unfortunately, we do not offer visa sponsorship and have no future plans to do so. You must be resident in the UK and have been living continuously in the UK for the last 10 years. You must be able to hold or gain a UK government security clearance.
Preferred Technical And Professional Experience
Cypress testing, OpenShift containers.
AWS Data Engineer employer: IBM
Contact Detail:
IBM Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the AWS Data Engineer role
✨Tip Number 1
Familiarise yourself with the specific AWS services mentioned in the job description, such as AWS EMR, Glue, and Kinesis. Having hands-on experience or projects showcasing these tools can set you apart during discussions.
✨Tip Number 2
Network with current or former IBM employees on platforms like LinkedIn. Engaging with them can provide insights into the company culture and the role, plus they might even refer you internally, increasing your chances of landing an interview.
✨Tip Number 3
Prepare to discuss real-world scenarios where you've implemented data engineering solutions. Be ready to explain your thought process and the impact of your work, as this aligns with IBM's focus on meaningful change for clients.
✨Tip Number 4
Stay updated on the latest trends in cloud computing and data engineering. Being able to discuss recent developments or innovations in the field during your conversations can demonstrate your passion and commitment to continuous learning.
Some tips for your application 🫡
Understand the Role: Before applying, make sure you fully understand the responsibilities and requirements of the AWS Data Engineer position at IBM. Familiarise yourself with the specific AWS tools and technologies mentioned in the job description.
Tailor Your CV: Customise your CV to highlight relevant experience and skills that align with the job description. Emphasise your hands-on experience with AWS services, data engineering solutions, and any relevant projects you've worked on.
Craft a Compelling Cover Letter: Write a cover letter that showcases your passion for data engineering and your understanding of IBM's culture and values. Mention how your skills can contribute to their innovative projects and client relationships.
Proofread and Edit: Before submitting your application, carefully proofread your CV and cover letter for any spelling or grammatical errors. A polished application reflects your attention to detail and professionalism.
How to prepare for a job interview at IBM
✨Showcase Your AWS Expertise
Make sure to highlight your hands-on experience with AWS services like Lambda, S3, and DynamoDB. Be prepared to discuss specific projects where you've implemented these tools, as this will demonstrate your practical knowledge and problem-solving skills.
✨Demonstrate Your Data Pipeline Skills
Since the role involves building batch and real-time data pipelines, be ready to explain your experience with AWS Glue, Kinesis, and other relevant technologies. Discuss any challenges you faced and how you overcame them to ensure smooth data flow.
✨Emphasise Your Curiosity and Innovation
IBM values curiosity and a quest for knowledge. Share examples of how you've challenged the norm or explored new ideas in your previous roles. This could include learning new technologies or suggesting improvements to existing processes.
✨Prepare for Technical Questions
Expect technical questions related to data engineering and AWS. Brush up on your knowledge of CloudFormation, Apache Airflow, and any programming languages mentioned in the job description, such as Python or Scala. Being well-prepared will help you feel more confident during the interview.