At a Glance
- Tasks: Support and enhance our Enterprise Data Warehouse while automating data processes.
- Company: Join Bottomline, a leader in leveraging data for strategic insights and business performance.
- Benefits: Enjoy flexible working arrangements, competitive salary, and opportunities for professional growth.
- Why this job: Be part of a collaborative team that values innovation and offers impactful work in data engineering.
- Qualifications: 3+ years in data warehouse support; expertise in ETL tools and cloud services required.
- Other info: Remote or hybrid work options available from our Theale office.
The predicted salary is between £36,000 and £60,000 per year.
Bottomline is looking for a Data Warehouse/DevOps Technical Support Engineer to grow with us, either remotely or in a hybrid arrangement from our Theale, UK office! We are a leading company dedicated to harnessing the power of data to drive strategic insights and enhance business performance, and we are seeking a skilled Data Warehouse Technical Support Engineer to join our team and support our data infrastructure.
Position Overview: We are seeking a proactive and technically skilled DevOps / Technical Support Engineer to join our Data Engineering & Analytics team. This role is critical in ensuring the reliable delivery, scalability, and operational excellence of our Enterprise Data Warehouse (EDW) platforms, data ingestion pipelines, and reporting solutions. You will work closely with Data Engineers, Architects, and business stakeholders to support, automate, and optimize the data ecosystem using DevOps principles, Infrastructure as Code (IaC), and modern CI/CD practices.
How you’ll contribute:
- EDW Development & Support: Support and enhance the Enterprise Data Warehouse (EDW) on Snowflake or similar platforms, including development, performance tuning, maintenance, and schema management. Design, build, and maintain ETL/ELT pipelines using Talend or equivalent tools, ensuring efficient data ingestion, transformation, and delivery. Develop and manage CI/CD pipelines for ETL/ELT jobs, automating deployment, testing, and delivery processes. Deliver and maintain Disaster Recovery (DR) processes and solutions, including EDW backup, restore, and failover capabilities. Collaborate on root cause analysis (RCA) and permanent resolution of EDW and pipeline-related incidents.
- Infrastructure & Automation: Implement and manage Infrastructure as Code (IaC) for data platform components using tools such as Terraform, AWS CloudFormation, or similar. Provision, configure, and maintain cloud resources (AWS EC2, S3, IAM, RDS, etc.) to support EDW and data processing workloads. Ensure scalability, availability, and cost-efficiency of cloud infrastructure in alignment with business needs. Support and enhance data mesh platforms like Denodo, Starburst, or equivalents for federated data access.
- Monitoring, Reliability & Incident Management: Design and implement monitoring, logging, and alerting frameworks for data pipelines and EDW systems to ensure high availability and reliability (a brief illustrative sketch follows this list). Lead or contribute to incident response, performing root cause analysis (RCA), corrective action, and continuous improvement initiatives. Maintain and enforce SLAs and operational best practices for EDW and reporting platforms.
- Reporting & Analytics Support: Support business intelligence tools (e.g., Power BI, Tableau) by ensuring reliable and accurate data availability from EDW sources. Optimize queries, reporting datasets, and integrations to improve performance and usability for data consumers.
- Documentation & Knowledge Sharing: Document technical processes, CI/CD workflows, deployment runbooks, incident response procedures, and environment configurations. Conduct knowledge transfer sessions and training for development, support, and operations teams.
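To give a flavour of the monitoring and automation work described above, here is a minimal sketch of a data-freshness check that publishes a custom CloudWatch metric an alarm could page on. It assumes a hypothetical ETL_AUDIT.LOAD_LOG table in Snowflake, hypothetical pipeline and metric names, and credentials supplied via environment variables; it illustrates the kind of tooling involved, not Bottomline's actual stack.

```python
"""Illustrative data-freshness check (table, metric, and pipeline names are hypothetical)."""
import os

import boto3
import snowflake.connector


def minutes_since_last_load(pipeline: str) -> float:
    # Connect to Snowflake using credentials from the environment.
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="REPORTING_WH",   # hypothetical warehouse
        database="EDW",             # hypothetical database
        schema="ETL_AUDIT",         # hypothetical audit schema
    )
    try:
        cur = conn.cursor()
        # LOAD_LOG is a hypothetical audit table written by the ETL/ELT jobs.
        cur.execute(
            "SELECT DATEDIFF('minute', MAX(loaded_at), CURRENT_TIMESTAMP()) "
            "FROM LOAD_LOG WHERE pipeline_name = %s",
            (pipeline,),
        )
        return float(cur.fetchone()[0])
    finally:
        conn.close()


def publish_lag_metric(pipeline: str, lag_minutes: float) -> None:
    # Push the lag as a custom CloudWatch metric; an alarm on this metric
    # can notify the on-call channel when ingestion falls behind its SLA.
    boto3.client("cloudwatch").put_metric_data(
        Namespace="EDW/Pipelines",  # hypothetical namespace
        MetricData=[{
            "MetricName": "IngestionLagMinutes",
            "Dimensions": [{"Name": "Pipeline", "Value": pipeline}],
            "Value": lag_minutes,
            "Unit": "Count",
        }],
    )


if __name__ == "__main__":
    lag = minutes_since_last_load("orders_daily")  # hypothetical pipeline name
    publish_lag_metric("orders_daily", lag)
```

In practice a check like this would run on a schedule (for example from a CI/CD job or a small scheduled task) so that freshness breaches surface as alerts rather than as user-reported incidents.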
What will make you successful:
- Strong hands-on experience with Talend, Informatica, or other ETL/ELT tools.
- Expertise in Snowflake or equivalent cloud data warehouse platforms.
- Proficiency in AWS cloud services (EC2, S3, CloudWatch, RDS, IAM) and IaC tools such as Terraform or CloudFormation (see the illustrative sketch after this list).
- Knowledge of CI/CD pipelines for data workloads using tools like GitLab CI/CD, Jenkins, or AWS CodePipeline.
- Experience supporting and automating EDW backup, restore, and disaster recovery processes.
- Solid understanding of DevOps best practices, including automation, configuration management, and continuous delivery.
- Familiarity with Data Mesh technologies (Denodo, Starburst) is a plus.
- Strong analytical and troubleshooting skills for performing root cause analysis (RCA) of incidents.
- Good working knowledge of Power BI or similar BI tools.
- Strong communication and documentation skills to support cross-functional collaboration.
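The IaC experience listed above would typically be demonstrated in Terraform or CloudFormation; since those are outside the scope of a short Python example, the sketch below uses the AWS CDK for Python to show the same declarative idea: provisioning a versioned, encrypted S3 landing bucket for raw EDW ingestion files. All resource and stack names are hypothetical.

```python
"""Illustrative IaC sketch using the AWS CDK for Python (names are hypothetical)."""
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_s3 as s3
from constructs import Construct


class EdwLandingZoneStack(Stack):
    """Declares a versioned, encrypted S3 bucket for raw ingestion files."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        s3.Bucket(
            self,
            "RawLandingBucket",
            versioned=True,                             # keep prior file versions for replay
            encryption=s3.BucketEncryption.S3_MANAGED,  # server-side encryption at rest
            removal_policy=RemovalPolicy.RETAIN,        # never delete raw data with the stack
        )


app = App()
EdwLandingZoneStack(app, "EdwLandingZone")  # hypothetical stack name
app.synth()  # emits a CloudFormation template, e.g. via `cdk synth`
```

The same bucket could equally be expressed as a Terraform resource or a CloudFormation template; the point is that infrastructure is declared in version-controlled code and deployed through a pipeline rather than configured by hand.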
Preferred Certifications:
- AWS Certified Solutions Architect / DevOps Engineer.
- Snowflake or Talend certifications.
- ITIL Foundation or relevant service management certification.
Education and Experience:
- Bachelor’s degree in Computer Science, Information Technology, Data Engineering, or a related field.
- 3+ years of experience in data warehouse support, data engineering, or a similar role.
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication skills, capable of explaining technical concepts to non-technical audiences.
- Ability to work independently and manage multiple tasks in a fast-paced environment.
What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional growth and advancement.
- A collaborative and innovative work environment.
- Flexible working arrangements.
Data Warehouse/DevOps Technical Support Engineer employer: Bottomline
Contact Detail:
Bottomline Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Warehouse/DevOps Technical Support Engineer role
✨Tip Number 1
Familiarise yourself with the specific tools mentioned in the job description, such as Snowflake, Talend, and AWS services. Having hands-on experience or even personal projects showcasing your skills with these technologies can set you apart from other candidates.
✨Tip Number 2
Engage with the data engineering community online. Join forums, attend webinars, or participate in relevant discussions on platforms like LinkedIn. This not only helps you stay updated on industry trends but also allows you to network with professionals who might refer you to opportunities.
✨Tip Number 3
Prepare to discuss your problem-solving skills in detail. Be ready to share specific examples of how you've handled incidents or optimised processes in previous roles. This will demonstrate your analytical abilities and your fit for the technical support aspect of the role.
✨Tip Number 4
Showcase your understanding of DevOps principles and CI/CD practices during any interviews or networking opportunities. Being able to articulate how you've applied these concepts in real-world scenarios will highlight your readiness for the responsibilities of this position.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in data warehousing, DevOps, and technical support. Use keywords from the job description to demonstrate that you meet the specific requirements of the role.
Craft a Compelling Cover Letter: Write a cover letter that showcases your passion for data engineering and your understanding of the company's mission. Mention specific projects or experiences that align with the responsibilities outlined in the job description.
Showcase Technical Skills: In your application, clearly list your technical skills related to ETL/ELT tools, cloud services, and CI/CD practices. Provide examples of how you've used these skills in previous roles to solve problems or improve processes.
Highlight Soft Skills: Don't forget to mention your communication and collaboration skills. The role requires working closely with various teams, so emphasise your ability to explain technical concepts to non-technical audiences and your experience in cross-functional teamwork.
How to prepare for a job interview at Bottomline
✨Showcase Your Technical Skills
Be prepared to discuss your hands-on experience with ETL/ELT tools like Talend and cloud platforms such as Snowflake. Highlight specific projects where you implemented these technologies, focusing on the challenges you faced and how you overcame them.
✨Understand DevOps Principles
Familiarise yourself with DevOps best practices, especially around CI/CD pipelines. Be ready to explain how you've applied these principles in past roles, particularly in automating deployment and ensuring operational excellence.
✨Prepare for Scenario-Based Questions
Expect questions that assess your problem-solving skills, such as how you would handle a data pipeline failure or optimise an ETL process. Use the STAR method (Situation, Task, Action, Result) to structure your responses effectively.
✨Communicate Clearly and Confidently
Since the role involves collaboration with various stakeholders, practice explaining technical concepts in simple terms. This will demonstrate your ability to bridge the gap between technical and non-technical audiences, which is crucial for success in this position.