At a Glance
- Tasks: Join our Data Platform Team to build and maintain a world-class data platform.
- Company: Checkout.com is a leading fintech empowering businesses in the digital economy.
- Benefits: Enjoy hybrid working, amazing snacks, and a supportive community.
- Why this job: Be part of a diverse team innovating for the future of payments.
- Qualifications: Strong engineering background with experience in streaming technologies and cloud-based stacks.
- Other info: We promote equal opportunities and support your success in a collaborative environment.
The predicted salary is between £48,000 and £72,000 per year.
Checkout.com is one of the most exciting fintechs in the world. Our mission is to enable businesses and their communities to thrive in the digital economy. We are the strategic payments partner for some of the best-known, fast-moving brands globally, such as Wise, Hut Group, Sony Electronics, Homebase, Henkel, Klarna and many others. Purpose-built with performance and scalability in mind, our flexible cloud-based payments platform helps global enterprises launch new products and create experiences customers love. We empower passionate problem-solvers to collaborate, innovate and do their best work. That is why we are on the Forbes Cloud 100 list and a Great Place to Work accredited company. We are building diverse and inclusive teams around the world because that is how we create even better experiences for our merchants and our partners.
Checkout.com is looking for an ambitious Senior Data Engineer to join our Data Platform Team. Our team's mission is to build a world-class data platform that powers our products and analytics. The Data Platform team ensures internal stakeholders can easily collect, store, process and utilise data to build reports or products aimed at solving business problems. Our focus is on maximising the time business stakeholders spend solving business problems and minimising the time they spend on the technical details of implementing, deploying, and monitoring their solutions.
The core tech stack we use is based on AWS, using:
- Kafka as our message transport
- Flink for (near) real time processing
- DataHub as our catalog
- Snowflake as our warehouse
- Airflow for scheduling
- dbt for data transformation
- Monte Carlo for monitoring
The platform covers the data lifecycle end-to-end, first for real-time/streaming use cases and also for our analytical/warehouse needs. We are building for scale: much of what we design and implement today will be the technology and infrastructure that serves hundreds of teams and petabyte-level volumes of data.
Key Responsibilities
- You will work as part of the team to build enablement components across the platform, as well as monitor and support the capabilities we offer.
- Develop and maintain documentation for data systems and processes.
- Participate in code and design reviews and provide constructive feedback.
- Wherever possible, automate workflows and processes; we aim for the platform to be as self-sustaining as possible.
- Stay up-to-date with the latest data and streaming engineering technologies and trends.
- Use that knowledge and subject matter expertise to mentor the more junior members of the team, and work with other application teams to provide guidance and best practice.
- Build lightweight tooling and associated reference patterns to foster the adoption of the platform by enabling upstream teams and systems to easily publish and manipulate data and deploy applications using industry best practices.
- Implement all the necessary infrastructure to enable end users to build, host, monitor and deploy their own applications.
- Provide consultancy across the technology organisation to drive the adoption of the platform and unlock use-cases.
- Promote data quality and governance as first-class citizens of the platform.
Qualifications
- Strong engineering background with a track record of implementing and owning components of a data platform.
- Experience working with streaming technologies, ideally Kafka, though Kinesis, Pulsar or similar would also be applicable.
- Experience designing and implementing stream processing applications (Kafka Streams, ksqlDB, Flink, Spark Streaming).
- Experience with data warehousing tools like Snowflake, BigQuery or Databricks, and building pipelines on them.
- Experience working with modern cloud-based stacks such as AWS, Azure or GCP.
- Excellent programming skills with at least one of Python, Java, Scala or C#.
- You are a mentor, raising the bar for your colleagues.
- You are a collaborator, always ready to dive in and partner to solve tough problems.
- You are a listener, and seek to understand the underlying problems before pitching solutions.
- You are able to drive through best practices by taking teams and organisations as a whole with you.
- You are a thought leader; we would love to see articles, podcasts, meetups or conference talks if you have done them.
Additional Information
Hybrid Working Model: Across all of our offices globally, we work onsite three days per week (Tuesday, Wednesday, and Thursday). This enables teams to work collaboratively in the same space while also partnering with colleagues globally. During your days at the office, we offer amazing snacks, breakfast, and lunch options in all of our locations.
We believe in equal opportunities. We work as one team. Wherever you come from, however you identify, and whichever payment method you use. Our clients come from all over the world - and so do we. Hiring hard-working people and giving them a community to thrive in is critical to our success. When you join our team, we will empower you to unlock your potential so you can do your best work. We would love to hear how you think you could make a difference here with us. We want to set you up for success and make our process as accessible as possible. So let us know in your application, or tell your recruiter directly, if you need anything to make your experience or working environment more comfortable. We will be happy to support you.
Senior Data Engineer - Streaming employer: Checkout.com
Contact Detail:
Checkout.com Recruiting Team
StudySmarter Expert Advice
We think this is how you could land Senior Data Engineer - Streaming
Tip Number 1
Familiarise yourself with the core tech stack mentioned in the job description, especially AWS, Kafka, and Flink. Having hands-on experience or projects that showcase your skills with these technologies will make you stand out.
Tip Number 2
Engage with the data engineering community by attending meetups or webinars focused on streaming technologies. This not only helps you stay updated but also allows you to network with professionals who might have insights into the hiring process at Checkout.com.
Tip Number 3
Showcase your mentoring abilities by sharing your knowledge through blogs or talks. Highlighting your experience in guiding junior team members can demonstrate your leadership qualities, which is a key aspect of the role.
Tip Number 4
Research Checkout.comβs culture and values, particularly their focus on collaboration and innovation. Be prepared to discuss how your personal values align with theirs during any interviews or conversations with the team.
We think you need these skills to ace Senior Data Engineer - Streaming
Some tips for your application
Tailor Your CV: Make sure your CV highlights relevant experience in data engineering, particularly with streaming technologies like Kafka and Flink. Use specific examples that demonstrate your skills in building and maintaining data platforms.
Craft a Compelling Cover Letter: In your cover letter, express your passion for data engineering and how your background aligns with Checkout.com's mission. Mention any experience you have with cloud-based stacks and your approach to mentoring junior team members.
Showcase Your Projects: If you've worked on notable projects related to data platforms or streaming applications, include them in your application. Highlight your role, the technologies used, and the impact of your work.
Demonstrate Continuous Learning: Mention any recent courses, certifications, or conferences you've attended that relate to data engineering or streaming technologies. This shows your commitment to staying updated with industry trends and best practices.
How to prepare for a job interview at Checkout.com
Showcase Your Technical Expertise
Be prepared to discuss your experience with streaming technologies like Kafka, Flink, and data warehousing tools such as Snowflake. Highlight specific projects where you've implemented these technologies and the impact they had on the business.
Demonstrate Problem-Solving Skills
During the interview, share examples of how you've tackled complex data engineering challenges. Emphasise your approach to understanding underlying problems before proposing solutions, as this aligns with the company's collaborative culture.
Emphasise Mentorship and Collaboration
Since the role involves mentoring junior team members, be ready to discuss your experience in guiding others. Share instances where you've collaborated with cross-functional teams to drive best practices and improve processes.
Stay Current with Industry Trends
Show your passion for the field by discussing recent trends in data engineering and streaming technologies. Mention any articles, podcasts, or conferences you've engaged with, as this demonstrates your commitment to continuous learning and thought leadership.