At a Glance
- Tasks: Monitor and enhance data pipelines while collaborating in an Agile team.
- Company: Join a forward-thinking tech company focused on data solutions.
- Benefits: Full-time role with competitive salary and opportunities for growth.
- Why this job: Make a real impact by optimising data processes and mentoring others.
- Qualifications: Experience with ETL, Big Data tools, and programming languages like Python.
- Other info: Dynamic environment with a focus on innovation and teamwork.
The predicted salary is between £36,000 and £60,000 per year.
Responsibilities
- Monitor daily BAU data pipelines and ensure our data solution is kept up to date every day.
- Work within the team's Agile framework, using the tooling that governs our development and CI/CD release processes.
- Get up to speed with existing data infrastructure, tools, and processes.
- Monitor and respond to IT incidents.
- Build familiarity with key pipelines, processes, and supporting documentation, and enhance that documentation where needed.
- Begin proactive monitoring and suggest quick wins for stability and efficiency.
- Work on incident RCA along with other engineers within the team.
- Create and maintain technical documentation, RCA reports, and knowledge base articles.
- Implement monitoring dashboards and alerts for key jobs and data quality checks.
- Mentor junior engineers or act as a technical go-to for pipeline and cloud topics.
Essential Qualifications
- Experience with data solution BAU processes (ETL, table refresh, etc.).
- Experience with Big Data integration technologies such as Spark and Kafka.
- Experience in a programming language such as Python.
- Experience using AWS (Athena, Glue, EMR, Step Functions, CloudWatch), dbt, and Snowflake.
- Analytical and problem‑solving skills applied to data solutions.
- Experience with CI/CD.
- Experience handling IT incidents and familiarity with IT incident management processes.
- Basic understanding of event streaming.
- Basic knowledge of Scala (nice to have).
- Experience with ETL technologies.
- Previous experience owning BAU monitoring and enhancement of a data solution.
- Exposure to Grafana.
Employment type: Full‑time
Data Engineer employer: NewDay
Contact Details:
NewDay Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role
✨Tip Number 1
Get familiar with the company’s data infrastructure before your interview. Dive into their tech stack and understand how they use tools like Spark, Kafka, and AWS. This will show you’re genuinely interested and ready to hit the ground running.
✨Tip Number 2
Practice your problem-solving skills! Be prepared to discuss how you've tackled data issues in the past. Think of specific examples where you’ve improved data pipelines or handled IT incidents – this is your chance to shine!
✨Tip Number 3
Network with current employees or join relevant online communities. Engaging with others in the field can give you insider tips about the company culture and what they value in a Data Engineer. Plus, it might just lead to a referral!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who take that extra step to connect with us directly.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with data solutions and technologies mentioned in the job description. We want to see how your skills align with our needs, so don’t be shy about showcasing your expertise in ETL processes, AWS, and Big Data tools!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about data engineering and how you can contribute to our team. We love seeing enthusiasm and a good understanding of our Agile framework, so let that personality come through!
Showcase Problem-Solving Skills: In your application, highlight specific examples where you've tackled data-related challenges. We’re keen on analytical minds, so share any experiences with incident management or enhancing data pipelines that demonstrate your problem-solving prowess.
Apply Through Our Website: We encourage you to apply directly through our website for the best chance of getting noticed. It’s the easiest way for us to keep track of your application and ensures you’re considered for the role. Plus, we love seeing candidates who take that extra step!
How to prepare for a job interview at NewDay
✨Know Your Data Tools
Make sure you brush up on your knowledge of the data tools mentioned in the job description, like Spark, Kafka, and AWS services. Being able to discuss your experience with these technologies will show that you're ready to hit the ground running.
✨Showcase Your Problem-Solving Skills
Prepare examples of how you've tackled data-related challenges in the past. Whether it's an incident you managed or a process you improved, having concrete examples will demonstrate your analytical skills and ability to think on your feet.
✨Familiarise Yourself with Agile Methodologies
Since the role involves working within an Agile framework, it’s crucial to understand Agile principles and practices. Be ready to discuss how you've applied Agile methodologies in previous roles and how they can enhance team collaboration and efficiency.
✨Prepare for Technical Questions
Expect technical questions related to ETL processes, CI/CD, and data monitoring. Brush up on your programming knowledge, especially in Python, and be prepared to explain your thought process when solving technical problems.