At a Glance
- Tasks: Develop and enhance data pipelines on the Nexus Platform in a dynamic team.
- Company: Join a forward-thinking company focused on innovative data solutions.
- Benefits: Enjoy competitive pay, flexible working options, and opportunities for growth.
- Why this job: Make an impact by working with cutting-edge technologies in data engineering.
- Qualifications: Experience with data integration, programming in Python or Scala, and CI/CD processes.
- Other info: Collaborative Agile environment with excellent career advancement potential.
The predicted salary is between £36,000 and £60,000 per year.
You will deliver:
- Hands-on development on the Nexus Platform
- Monitoring daily BAU data pipelines and ensuring our data solution is refreshed every day
- Enhancing the daily BAU process to make it easier to monitor and less likely to fail, plus hands-on development on Data Lake builds, changes and defect fixes
- Building new data pipelines using existing frameworks and patterns
- Working with the team in an Agile framework, using Agile tooling that controls our development and CI/CD release processes
- Contributing to the new Data Lake technology across the organisation to address a broad set of use cases across data science and data warehousing
Skills and Experience
Essential
- Experience with data solution BAU processes (ETL, table refresh etc.)
- Experience with integration of data from multiple data sources
- Experience with Big Data integration technologies such as Spark, Scala and Kafka
- Experience in programming languages such as Python or Scala
- Experience using AWS, DBT and Snowflake
- Analytical and problem-solving skills applied to data solutions
- Experience of CI/CD
- Good grasp of multi-threading and concurrency concepts
- Familiarity with Linux shell scripting fundamentals
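The multi-threading and concurrency point above often comes up in interviews for roles like this. As a minimal sketch (hypothetical table names, standard library only, with the actual refresh stubbed out), refreshing several tables concurrently with a thread pool might look like:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def refresh_table(table: str) -> str:
    # Stand-in for an I/O-bound refresh (e.g. an ETL job or warehouse call).
    return f"{table}: refreshed"

# Hypothetical table names for illustration only.
tables = ["customers", "transactions", "balances"]

# Threads suit I/O-bound BAU work: each refresh waits on external systems,
# so running them concurrently shortens the overall refresh window.
with ThreadPoolExecutor(max_workers=3) as pool:
    futures = [pool.submit(refresh_table, t) for t in tables]
    results = sorted(f.result() for f in as_completed(futures))

print(results)
```

Being able to explain why a thread pool (rather than multiprocessing) fits I/O-bound pipeline work is the kind of discussion the "concurrency concepts" requirement tends to invite.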
Desirable
- Proficiency with ETL technologies (e.g. Talend, Informatica, Ab Initio)
- AWS exposure (Athena, Glue, EMR, Step Functions)
- Exposure to owning data solution BAU monitoring and enhancement
- Exposure to building applications for a cloud environment
Data Engineer in City of London employer: NewDay Ltd
Contact Detail:
NewDay Ltd Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role in the City of London
✨Tip Number 1
Network like a pro! Reach out to folks in the industry, attend meetups, and connect with other Data Engineers on LinkedIn. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your data pipelines, projects, or any cool stuff you've built using Spark, Scala, or Python. This will give potential employers a taste of what you can do and set you apart from the crowd.
✨Tip Number 3
Prepare for those interviews! Brush up on your knowledge of AWS, CI/CD processes, and data integration technologies. Practise common interview questions and be ready to discuss how you've tackled challenges in your previous roles.
✨Tip Number 4
Don't forget to apply through our website! We love seeing applications come in directly, and it gives you a better chance to stand out. Plus, you'll get all the latest updates on your application status!
We think you need these skills to ace the Data Engineer role in the City of London
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with data solutions, ETL processes, and any relevant technologies like Spark or AWS. We want to see how your skills match what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background makes you a great fit for our team. Keep it concise but engaging – we love a good story!
Showcase Your Projects: If you've worked on any cool data projects, make sure to mention them! Whether it's building data pipelines or enhancing BAU processes, we want to know what you've done and how it relates to the role. Don’t be shy about your achievements!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it’s super easy – just follow the prompts and submit your materials!
How to prepare for a job interview at NewDay Ltd
✨Know Your Data Tools
Make sure you brush up on your knowledge of the tools mentioned in the job description, like Spark, Scala, and AWS. Be ready to discuss how you've used these technologies in past projects, as this will show your hands-on experience.
✨Showcase Your Problem-Solving Skills
Prepare examples that highlight your analytical and problem-solving abilities. Think of specific challenges you've faced in data integration or pipeline monitoring and how you overcame them. This will demonstrate your capability to enhance BAU processes.
✨Familiarise Yourself with Agile Methodologies
Since the role involves working within an Agile framework, it’s crucial to understand Agile principles and practices. Be prepared to discuss your experience with Agile tooling and how it has helped streamline your development processes.
✨Ask Insightful Questions
At the end of the interview, don’t forget to ask questions! Inquire about the team’s current data challenges or how they envision the Data Lake technology evolving. This shows your genuine interest in the role and helps you gauge if it's the right fit for you.