At a Glance
- Tasks: Build and maintain core datasets, APIs, and data pipelines for impactful insights.
- Company: Join Kpler, a dynamic tech company transforming global trade information.
- Benefits: Competitive salary, inclusive culture, and opportunities for professional growth.
- Why this job: Leverage cutting-edge technology to make a real difference in the commodities and energy sectors.
- Qualifications: 6-8 years in data engineering, strong programming skills, and experience with APIs.
- Other info: Embrace a supportive environment that values diverse perspectives and innovation.
The predicted salary is between £36,000 and £60,000 per year.
At Kpler, we are dedicated to helping our clients navigate complex markets with ease. By simplifying global trade information and providing valuable insights, we empower organisations to make informed decisions in commodities, energy, and maritime sectors. Since our founding in 2014, we have focused on delivering top-tier intelligence through user-friendly platforms. Our team of over 700 experts from 35+ countries works tirelessly to transform intricate data into actionable strategies, ensuring our clients stay ahead in a dynamic market landscape. Join us to leverage cutting-edge innovation for impactful results and experience unparalleled support on your journey to success.
Build and maintain Kpler's core datasets (vessel characteristics, companies, geospatial data). You will create and maintain REST APIs, streaming pipelines (Kafka Streams), and Spark batch pipelines, taking end-to-end ownership of development work: understanding assigned tickets and requirements, designing and implementing functionality across APIs and data-processing components, and ensuring deployments to development environments are tested and reviewed by peers and product stakeholders.
The role emphasizes strong testing practices (unit, integration, and functional tests aligned with defined scenarios) along with thorough validation to ensure compliance and quality. After delivery, you will monitor performance, alerts, and SLOs to ensure the functionality operates reliably and optimally in production.
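To give a flavour of the streaming side of this work, here is a minimal Kafka Streams topology sketched in Scala with the kafka-streams-scala DSL. The topic names and the trivial cleaning step are hypothetical, invented for illustration; they do not describe Kpler's actual pipelines.

```scala
import java.util.Properties

import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.serialization.Serdes._
import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.{KafkaStreams, StreamsConfig}

object VesselPositionCleaner extends App {
  val props = new Properties()
  props.put(StreamsConfig.APPLICATION_ID_CONFIG, "vessel-position-cleaner")
  props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")

  val builder = new StreamsBuilder()

  // Read raw position events, drop empty records, normalise whitespace,
  // and publish the cleaned stream to a downstream topic.
  builder
    .stream[String, String]("vessel-positions-raw")   // hypothetical topic
    .filter((_, value) => value != null && value.nonEmpty)
    .mapValues(_.trim)
    .to("vessel-positions-clean")                     // hypothetical topic

  val streams = new KafkaStreams(builder.build(), props)
  streams.start()
  sys.ShutdownHookThread { streams.close() }
}
```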
Responsibilities
- Deliver well-documented, maintainable code following Test-Driven Development (TDD) principles, ensuring comprehensive unit, integration, and end-to-end testing.
- Provide technical leadership within the team, helping to elevate engineering capabilities.
- Design, operate, and document versioned RESTful APIs using FastAPI and JVM-based frameworks, ensuring secure and scalable service delivery.
- Implement and enforce data schema evolution and versioning strategies to support reliable data integration across systems.
- Develop and maintain batch and streaming data pipelines using technologies such as Kafka and Spark, managing backpressure, orchestration, retries, and data quality.
- Take ownership of system performance by improving latency and throughput, applying effective partitioning strategies for databases and Parquet/Iceberg files, defining indexing approaches, and tuning queries for efficient execution plans (see the sketch after this list).
- Ensure system reliability by instrumenting services with metrics, logs, and traces; contributing to CI/CD pipelines, automated testing, incident response, and root cause analysis.
- Collaborate closely with Product and cross-functional teams to deliver business outcomes, define test scenarios, and contribute to roadmap planning.
- Uphold engineering quality through clean code and sound architecture, active participation in code reviews, and adherence to Agile development practices.
- Provide technical mentorship and guidance to team members, supporting knowledge sharing and engineering excellence.
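As a sketch of the partitioning work mentioned in the performance bullet above, the following Spark batch job in Scala writes date-partitioned Parquet so downstream readers can prune files. The paths, bucket, and column names are hypothetical placeholders, not Kpler's real data layout.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, to_date}

object VesselDailySnapshot {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("vessel-daily-snapshot")
      .getOrCreate()

    // Read raw events, derive a date column, and write Parquet
    // partitioned by event_date so downstream queries can prune files.
    spark.read
      .parquet("s3://example-bucket/raw/vessel_events/")   // hypothetical path
      .withColumn("event_date", to_date(col("event_ts")))
      .repartition(col("event_date"))   // co-locate rows per output partition
      .write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/curated/vessel_events/")

    spark.stop()
  }
}
```

Partitioning by a low-cardinality date column is a common default; Iceberg tables layer hidden partitioning and schema evolution on top of the same idea.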
Requirements
- 6-8 years of experience in data-focused software engineering roles.
- Strong programming skills in Scala (or another JVM language); experience with Python preferred.
- Proven experience designing and operating RESTful APIs, including secure and versioned interfaces.
- Solid understanding of data modeling, schema evolution, versioning, and serialization technologies such as Avro or Protobuf (see the sketch after this list).
- Hands-on experience with SQL and NoSQL databases, including query optimization and performance tuning.
- Experience building and maintaining batch or streaming data systems, with knowledge of streaming patterns and reliability concerns.
- Familiarity with caching strategies, CI/CD pipelines, and modern monitoring and alerting practices.
- Proficiency with Git-based workflows, code reviews, and Agile development methodologies.
- Strong sense of ownership, with pragmatic problem-solving skills, a constructive approach to critique, and the ability to deliver end-to-end solutions.
- Excellent communication skills and fluency in English, with the ability to collaborate effectively across product and engineering teams.
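To make the schema-evolution requirement concrete, here is a small sketch using Apache Avro's compatibility checker from Scala; the Vessel record and its fields are invented for illustration and are not Kpler's schemas.

```scala
import org.apache.avro.{Schema, SchemaCompatibility}

object SchemaEvolutionCheck extends App {
  // v1: the schema already used by producers (the "writer" schema).
  val v1 = new Schema.Parser().parse(
    """{"type":"record","name":"Vessel","fields":[
      |  {"name":"imo","type":"string"}
      |]}""".stripMargin)

  // v2: adds an optional field with a default, so consumers upgraded to
  // v2 (the "reader" schema) can still decode records written with v1.
  val v2 = new Schema.Parser().parse(
    """{"type":"record","name":"Vessel","fields":[
      |  {"name":"imo","type":"string"},
      |  {"name":"flag","type":["null","string"],"default":null}
      |]}""".stripMargin)

  // Backward compatibility: can a v2 reader read data written with v1?
  val result = SchemaCompatibility.checkReaderWriterCompatibility(v2, v1)
  println(result.getType) // COMPATIBLE
}
```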
Nice to have
- Experience with Apache Airflow for workflow orchestration.
- Exposure to cloud platforms (preferably AWS) and infrastructure as code using Terraform.
- Experience with Docker and Kubernetes in production environments.
- Hands-on knowledge of Kafka and event-driven or microservices architectures.
- Familiarity with JVM build and tooling ecosystems such as Gradle or Maven.
We are a dynamic company dedicated to nurturing connections and innovating solutions to tackle market challenges head-on. If you thrive on customer satisfaction and turning ideas into reality, then you’ve found your ideal destination. Are you ready to embark on this exciting journey with us?
We make things happen. We act decisively and with purpose, going the extra mile. We build together. We foster relationships and develop creative solutions to address market challenges. We are here to help. We are accessible and supportive to colleagues and clients with a friendly approach.
Our People Pledge
Don't meet every single requirement? Research shows that women and people of colour are less likely than others to apply if they feel they don't match 100% of the job requirements. Don't let the confidence gap stand in your way; we'd love to hear from you! We understand that experience comes in many different forms and are dedicated to adding new perspectives to the team.
Kpler is committed to providing a fair, inclusive and diverse work environment. We believe that different perspectives lead to better ideas, and better ideas allow us to better understand the needs and interests of our diverse, global community. We welcome people of different backgrounds, experiences, abilities and perspectives and are an equal opportunity employer.
Senior Data Engineer employer: Kpler group
Contact Details:
Kpler group Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer role
✨Tip Number 1
Network like a pro! Reach out to current employees at Kpler on LinkedIn or other platforms. Ask them about their experiences and any tips they might have for your application process. Personal connections can make a huge difference!
✨Tip Number 2
Prepare for the interview by brushing up on your technical skills. Since you’ll be working with REST APIs, Kafka, and Spark, make sure you can discuss your experience confidently. Practice coding challenges and system design questions to impress the interviewers.
✨Tip Number 3
Showcase your problem-solving skills during interviews. Be ready to discuss past projects where you tackled complex data challenges. Use the STAR method (Situation, Task, Action, Result) to structure your answers and highlight your impact.
✨Tip Number 4
Don’t forget to follow up after your interview! A simple thank-you email expressing your appreciation for the opportunity can leave a lasting impression. It shows your enthusiasm for the role and keeps you fresh in their minds.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with REST APIs, data pipelines, and any relevant technologies like Kafka and Spark. We want to see how your skills align with what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how you can contribute to Kpler's mission. Be sure to mention any specific projects or achievements that showcase your expertise.
Showcase Your Problem-Solving Skills: In your application, don’t just list your skills—show us how you've used them to solve real-world problems. Whether it's improving system performance or implementing TDD practices, we love seeing concrete examples of your impact.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way to ensure your application gets into the right hands. Plus, it shows us you're serious about joining our team at Kpler!
How to prepare for a job interview at Kpler group
✨Know Your Tech Stack
Make sure you’re well-versed in the technologies mentioned in the job description, especially Scala, RESTful APIs, and data pipelines. Brush up on your knowledge of Kafka and Spark, as these are crucial for the role. Being able to discuss your hands-on experience with these tools will show that you’re ready to hit the ground running.
✨Demonstrate Problem-Solving Skills
Prepare to discuss specific challenges you've faced in previous roles and how you tackled them. Use the STAR method (Situation, Task, Action, Result) to structure your answers. This will help you showcase your pragmatic problem-solving skills and your ability to deliver end-to-end solutions.
✨Emphasise Testing Practices
Since the role highlights the importance of testing, be ready to talk about your experience with Test-Driven Development (TDD) and the types of tests you've implemented. Share examples of how thorough testing has improved the reliability and performance of your projects.
✨Show Your Collaborative Spirit
Kpler values teamwork, so highlight your experience working with cross-functional teams. Discuss how you’ve collaborated with product managers and other engineers to achieve business outcomes. This will demonstrate that you can contribute positively to their team culture.