At a Glance
- Tasks: Design and build data pipelines using AWS for analytics and reporting.
- Company: Leading National Security Consultancy in London with a focus on cloud transformation.
- Benefits: Up to £85,000 salary, bonuses, and opportunities for professional growth.
- Why this job: Join a dynamic team and work with cutting-edge cloud technologies.
- Qualifications: Strong AWS skills and programming experience in Python, Java, or Scala.
- Other info: Full-time on-site role with a collaborative and innovative environment.
The predicted salary is between £60,000 and £85,000 per year.
New permanent opportunity for an eDV-cleared Data Engineer with AWS cloud experience at a leading National Security Consultancy in London. Up to £85,000 DoE plus bonuses. Active eDV required. London location – full-time on-site when required.
Expertise required: AWS, Data Pipelines, ETL, Data Storage, and DevOps methodologies.
Role Overview: This role sits within our client’s rapidly growing Cloud Data Platforms team, part of the Insights and Data Global Practice. You will join a multidisciplinary group of data and platform specialists who deliver modern cloud-based transformation for clients across a range of sectors. In this role, you will design and build data pipelines, develop ETL/ELT processes, and create innovative data solutions using the latest cloud technologies and frameworks across AWS.
Some responsibilities include:
- Build data pipelines to ingest, process and transform data for analytics and reporting.
- Develop ETL/ELT workflows to move data efficiently into data warehouses, data lakes and lakehouses using open-source and AWS tooling (a minimal PySpark sketch follows this list).
- Apply DevOps practices, including CI/CD, infrastructure as code and automation, to improve and streamline data engineering processes.
- Design effective data solutions that meet complex business needs and support informed decision-making.
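As a rough illustration of the pipeline work described above, here is a minimal PySpark sketch of an ETL step that reads raw JSON from S3, cleans it, and writes partitioned Parquet to a curated zone. The bucket paths and column names are placeholders, not details from the advert.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical bucket paths and column names, for illustration only.
RAW_PATH = "s3://example-raw-zone/events/"
CURATED_PATH = "s3://example-curated-zone/events/"

spark = SparkSession.builder.appName("events-etl").getOrCreate()

# Extract: read raw JSON events landed by an upstream ingestion job.
raw = spark.read.json(RAW_PATH)

# Transform: drop duplicates, normalise the timestamp, derive a partition column.
curated = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet for downstream analytics and reporting.
curated.write.mode("overwrite").partitionBy("event_date").parquet(CURATED_PATH)
```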
Experience Required:
- Strong AWS expertise, including tools such as Glue, Lambda, Kinesis, EMR, Athena, DynamoDB, CloudWatch, SNS and Step Functions.
- Skilled in modern programming languages and frameworks, particularly Python, Java, Scala and PySpark.
- Solid knowledge of data storage and big data technologies, including data warehouses, databases, Redshift, RDS and Hadoop.
- Experience building and managing AWS data lakes on S3 for both structured and unstructured data (see the boto3 sketch below).
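To give a flavour of the AWS tooling listed above, here is a minimal boto3 sketch that runs an Athena query over a table catalogued on top of an S3 data lake and polls for the result. The database, table, region and output location are invented placeholders.

```python
import time
import boto3

# Placeholder names; a real data lake would use your own catalogue, bucket and region.
DATABASE = "example_lake_db"
OUTPUT = "s3://example-athena-results/"

athena = boto3.client("athena", region_name="eu-west-2")

# Start a query against a table built over S3 data (e.g. catalogued by a Glue crawler).
execution = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) AS events FROM events GROUP BY event_date",
    QueryExecutionContext={"Database": DATABASE},
    ResultConfiguration={"OutputLocation": OUTPUT},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes, then fetch the first page of results.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(rows)
```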
What happens next? To apply, please either apply online or email directly to henry.clay-davies@searchability.com. For further information, please call 07719065951 or 0161 4166800.
Data Engineer in City of London – employer: Searchability NS&D
Contact Detail:
Searchability NS&D Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role in the City of London
✨Network Like a Pro
Get out there and connect with people in the industry! Attend meetups, webinars, or even just grab a coffee with someone who works in data engineering. Building relationships can lead to job opportunities that aren’t even advertised.
✨Show Off Your Skills
Create a portfolio showcasing your projects, especially those involving AWS and data pipelines. Share it on platforms like GitHub or your personal website. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Ace the Interview
Prepare for technical interviews by brushing up on your AWS knowledge and coding skills. Practice common data engineering problems and be ready to discuss your past projects. Confidence and preparation are key to landing that job!
✨Apply Through Our Website
Don’t forget to apply through our website! It’s the best way to ensure your application gets noticed. Plus, you can find more roles that match your skills and interests. Let’s get you that dream job!
We think you need these skills to ace the Data Engineer role in the City of London:
AWS (Glue, Lambda, Kinesis, EMR, Athena, DynamoDB, Redshift, S3), Python, Java, Scala, PySpark, ETL/ELT, data warehousing and data lakes, and DevOps practices such as CI/CD and infrastructure as code.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your AWS experience and any relevant projects you've worked on. We want to see how your skills match what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how you can contribute to our Cloud Data Platforms team. Keep it engaging and personal.
Showcase Your Technical Skills: Don’t forget to showcase your technical skills, especially in AWS and programming languages like Python or Java. We love seeing specific examples of how you've used these skills in past roles or projects.
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of applications and ensures you don’t miss out on any important updates from us!
How to prepare for a job interview at Searchability NS&D
✨Know Your AWS Inside Out
Make sure you brush up on your AWS knowledge, especially the tools mentioned in the job description like Glue, Lambda, and Kinesis. Be ready to discuss how you've used these tools in past projects and how they can be applied to build efficient data pipelines.
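If a concrete example helps, a minimal sketch of how Lambda and Kinesis can fit together in a pipeline might look like the handler below, which decodes Kinesis records and writes them to a DynamoDB table. The table name and record fields are assumptions for illustration only.

```python
import base64
import json
import boto3

# Hypothetical table name; assumes each event carries an "event_id" field used
# as the partition key. Note that DynamoDB expects Decimal rather than float
# for numeric attributes.
table = boto3.resource("dynamodb").Table("example-events")

def handler(event, context):
    """Triggered by a Kinesis stream; each record is a base64-encoded JSON event."""
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        table.put_item(Item=payload)
    return {"processed": len(event["Records"])}
```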
✨Showcase Your Programming Skills
Prepare to demonstrate your programming prowess, particularly in Python, Java, or Scala. You might be asked to solve a coding problem or explain your approach to building ETL processes, so practice articulating your thought process clearly.
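A common warm-up exercise is a small, testable transform step. As a rough example of the kind of thing you might be asked to write (the field names here are invented):

```python
from datetime import datetime, timezone

def transform(records):
    """De-duplicate raw records by id and normalise timestamps to UTC ISO-8601."""
    seen = set()
    cleaned = []
    for rec in records:
        if rec["id"] in seen:
            continue
        seen.add(rec["id"])
        ts = datetime.fromtimestamp(rec["ts"], tz=timezone.utc)
        cleaned.append({"id": rec["id"], "ts": ts.isoformat(), "value": float(rec["value"])})
    return cleaned

# Quick check with a duplicated input record.
print(transform([{"id": 1, "ts": 0, "value": "3.5"}, {"id": 1, "ts": 0, "value": "3.5"}]))
```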
✨Understand Data Storage Solutions
Familiarise yourself with various data storage technologies, including data lakes and warehouses. Be prepared to discuss how you've managed structured and unstructured data, and how you would design effective data solutions for complex business needs.
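One useful talking point is how data is physically laid out in a lake. Below is a minimal sketch, assuming pandas with pyarrow is available and using made-up data, of writing a Hive-style partitioned Parquet dataset; partitioning by date keeps typical analytical queries cheap.

```python
import pandas as pd

# Invented example data; a real pipeline would read this from an upstream source.
df = pd.DataFrame(
    {
        "event_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
        "user_id": [1, 2, 3],
        "amount": [9.99, 4.50, 12.00],
    }
)

# Hive-style partitioning by event_date; swap the local path for an
# "s3://bucket/prefix/" URI (with s3fs installed) to write straight to a lake bucket.
df.to_parquet("curated/payments/", partition_cols=["event_date"], index=False)
```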
✨Emphasise DevOps Practices
Since the role involves applying DevOps methodologies, be ready to talk about your experience with CI/CD, infrastructure as code, and automation. Share specific examples of how you've streamlined data engineering processes in previous roles.
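Infrastructure as code is an easy area to demonstrate with a small example. The sketch below uses the AWS CDK for Python with a placeholder bucket name, purely for illustration, to declare a versioned, encrypted S3 bucket that a CI/CD pipeline could roll out with `cdk deploy`.

```python
from aws_cdk import App, Stack, RemovalPolicy
from aws_cdk import aws_s3 as s3
from constructs import Construct

class DataLakeStack(Stack):
    """Declares a versioned, encrypted S3 bucket for a raw data-lake zone."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        s3.Bucket(
            self,
            "RawZone",  # placeholder logical name
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            removal_policy=RemovalPolicy.RETAIN,
        )

app = App()
DataLakeStack(app, "data-lake-dev")  # a CI/CD job would synth and deploy this stack
app.synth()
```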