At a Glance
- Tasks: Monitor and enhance data pipelines, ensuring daily updates and stability.
- Company: Join a forward-thinking tech company with a focus on innovation.
- Benefits: Full-time role with competitive salary and opportunities for growth.
- Why this job: Be at the forefront of data engineering and make a real impact.
- Qualifications: Experience with ETL processes, Big Data technologies, and programming skills.
- Other info: Collaborative Agile environment with mentorship opportunities.
The predicted salary is between £30,000 and £50,000 per year.
Responsibilities:
- Monitor daily BAU data pipelines and ensure the data solution is refreshed every day.
- Work with the team within an Agile framework, using the tooling that governs our development and CI/CD release processes.
- Get up to speed with existing data infrastructure, tools, and processes.
- Monitor and respond to IT incidents.
- Build familiarity with key pipelines, processes, and supporting documentation, and enhance that documentation where needed.
- Begin proactive monitoring and suggest quick wins for stability and efficiency.
- Work on incident root cause analysis (RCA) alongside other engineers in the team.
- Create and maintain technical documentation, RCA reports, and knowledge base articles.
- Implement monitoring dashboards and alerts for key jobs and data quality checks.
- Mentor junior engineers or act as a technical go-to for pipeline and cloud topics.
Essential Qualifications:
- Experience with data solution BAU processes (ETL, table refresh, etc.).
- Experience with Big Data integration technologies such as Spark and Kafka.
- Experience in a programming language such as Python.
- Experience using AWS (Athena, Glue, EMR, Step Functions, CloudWatch), dbt, and Snowflake.
- Analytical and problem‑solving skills applied to data solutions.
- Experience with CI/CD.
- Experience handling IT incidents and with incident management processes.
- Basic understanding of event streaming.
- Basic knowledge of Scala (nice to have).
- Prior ownership of BAU monitoring and enhancement for a data solution.
- Exposure to Grafana.
Employment type: Full‑time
Data Engineer in City of London employer: NewDay
Contact Detail:
NewDay Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role in the City of London
✨Tip Number 1
Get familiar with the company’s data infrastructure before your interview. Dive into their tech stack and understand how they use tools like AWS, Spark, and Kafka. This will show you’re genuinely interested and ready to hit the ground running!
✨Tip Number 2
Network with current employees or alumni who work in similar roles. They can provide insider tips on the interview process and what the team values most. Plus, it’s a great way to get your name out there!
✨Tip Number 3
Prepare for technical questions by brushing up on your programming skills, especially in Python. Practice common data engineering problems and be ready to discuss your past experiences with ETL processes and incident management.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who take that extra step to connect with us directly.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with data solutions and technologies mentioned in the job description. We want to see how your skills align with our needs, so don’t be shy about showcasing your expertise in ETL processes, AWS, and Big Data tools!
Craft a Compelling Cover Letter: Your cover letter is your chance to tell us why you’re the perfect fit for the Data Engineer role. Share specific examples of your past experiences, especially those that relate to monitoring data pipelines and incident management. Let your personality shine through!
Showcase Your Problem-Solving Skills: In your application, highlight instances where you've tackled challenges in data solutions or improved processes. We love candidates who can think critically and suggest quick wins for stability and efficiency, so make sure to include those stories!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it gives you a chance to explore more about StudySmarter and what we stand for!
How to prepare for a job interview at NewDay
✨Know Your Data Tools
Make sure you brush up on your knowledge of the data tools mentioned in the job description, like Spark, Kafka, and AWS services. Be ready to discuss how you've used these technologies in past projects and how they can be applied to the role.
✨Showcase Your Problem-Solving Skills
Prepare examples of how you've tackled data-related challenges in previous roles. Highlight your analytical skills and how you've implemented solutions that improved efficiency or stability in data processes.
✨Familiarise Yourself with Agile Methodologies
Since the role involves working within an Agile framework, it’s crucial to understand Agile principles and practices. Be prepared to discuss your experience with Agile tooling and how it has helped you in your previous projects.
✨Prepare for Technical Questions
Expect technical questions related to ETL processes, CI/CD, and incident management. Brush up on your understanding of these concepts and be ready to explain them clearly, as well as how you've applied them in real-world scenarios.