At a Glance
- Tasks: Join our Data Engineering team to design and develop scalable data systems for trading operations.
- Company: GSR is a leading market-making firm in the dynamic world of cryptocurrency trading.
- Benefits: Enjoy a competitive salary, healthcare, 30 days holiday, and free lunches in the office.
- Why this job: Be part of a fast-paced environment shaping the future of cryptocurrency with innovative technology.
- Qualifications: 8+ years in data engineering, strong skills in Java, Python, SQL; cloud experience required.
- Other info: We value diversity and operate a meritocracy, ensuring equal opportunities for all.
The predicted salary is between £54,000 and £84,000 per year.
Founded in 2013, GSR is a leading market-making and programmatic trading company in the exciting and fast-evolving world of cryptocurrency trading. With more than 200 employees in 5 countries, we provide billions of dollars of liquidity to cryptocurrency protocols and exchanges on a daily basis. We build long-term relationships with cryptocurrency communities and traditional investors by offering exceptional service, expertise and trading capabilities tailored to their specific needs. GSR works with token issuers, traders, investors, miners, and more than 30 cryptocurrency exchanges around the world.
In volatile markets we are a trusted partner to crypto-native builders and to those exploring the industry for the first time. Our team of veteran finance and technology executives from Goldman Sachs, Two Sigma, and Citadel, among others, has developed one of the world’s most robust trading platforms, designed to navigate issues unique to the digital asset markets. We have continuously improved our technology throughout our history, allowing our clients to scale and execute their strategies with the highest level of efficiency. Working at GSR is an opportunity to be deeply embedded in every major sector of the cryptocurrency ecosystem.
About the role: This role sits within GSR’s global Data Engineering team, where you’ll contribute to the design and development of scalable data systems that support our trading and business operations. You’ll work closely with stakeholders across the firm to build and maintain pipelines, manage data infrastructure, and ensure data is reliable, accessible, and secure. It’s a hands-on engineering position with scope to shape the way data is handled across the business, working with modern tools in a fast-moving, high-performance environment.
Your responsibilities may include:
- Build and maintain scalable, efficient ETL/ELT pipelines for both real-time and batch processing.
- Integrate data from APIs, streaming platforms, and legacy systems, with a focus on data quality and reliability.
- Design and manage data storage solutions, including databases, warehouses, and lakes.
- Leverage cloud-native services and distributed processing tools (e.g., Apache Flink, AWS Batch) to support large-scale data workloads.
- Monitor, troubleshoot, and optimize data pipelines to ensure performance and cost efficiency.
- Implement data governance, access controls, and security measures in line with best practices and regulatory standards.
- Develop observability and anomaly detection tools to support Tier 1 systems.
- Work with engineers and business teams to gather requirements and translate them into technical solutions.
- Maintain documentation, follow coding standards, and contribute to CI/CD processes.
- Stay current with new technologies and help improve the team’s tooling and infrastructure.
What We’re Looking For:
- 8+ years of experience in data engineering or a related field.
- Strong programming skills in Java, Python and SQL; familiarity with Rust is a plus.
- Proven experience designing and maintaining scalable ETL/ELT pipelines and data architectures.
- Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services.
- Comfortable with big data tools and distributed processing frameworks such as Apache Flink or AWS Batch.
- Strong understanding of data governance, security, and best practices for data quality.
- Effective communicator with the ability to work across technical and non-technical teams.
Additional Strengths:
- Experience with orchestration tools like Apache Airflow.
- Knowledge of real-time data processing and event-driven architectures.
- Familiarity with observability tools and anomaly detection for production systems.
- Exposure to data visualization platforms such as Tableau or Looker.
- Relevant cloud or data engineering certifications.
What we offer:
- A collaborative and transparent company culture founded on Integrity, Innovation and Performance.
- Competitive salary with two discretionary bonus payments a year.
- Benefits such as healthcare, dental, vision, retirement planning, 30 days holiday and free lunches when in the office.
- Regular Town Halls, team lunches and drinks.
- A Corporate and Social Responsibility programme, as well as charity fundraising matching and volunteer days.
GSR is proudly an Equal Employment Opportunity employer. We do not discriminate based upon any applicable legally protected characteristics such as race, religion, colour, country of origin, sexual orientation, gender, gender identity, gender expression or age. We operate a meritocracy: all aspects of people engagement, from the decision to hire or promote to our performance management process, are based on business needs and individual merit and competence in the role.
Data Engineer (London, Singapore) | Employer: GSR Markets Limited
Contact Detail:
GSR Markets Limited Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer (London, Singapore) role
✨Tip Number 1
Familiarise yourself with the latest data engineering tools and technologies mentioned in the job description, such as Apache Flink and AWS services. Being able to discuss these tools confidently during your interview will show that you're proactive and well-prepared.
✨Tip Number 2
Network with current or former employees of GSR on platforms like LinkedIn. Engaging with them can provide you with insider knowledge about the company culture and expectations, which can be invaluable during interviews.
✨Tip Number 3
Prepare to discuss specific projects where you've built scalable ETL/ELT pipelines. Be ready to explain your thought process, challenges faced, and how you overcame them, as this will demonstrate your hands-on experience and problem-solving skills.
✨Tip Number 4
Stay updated on trends in the cryptocurrency market and how they impact data engineering. Showing your understanding of the industry during your interview can set you apart from other candidates and highlight your genuine interest in the role.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in data engineering, particularly with ETL/ELT pipelines and cloud platforms. Use keywords from the job description to demonstrate your fit for the role.
Craft a Strong Cover Letter: Write a cover letter that showcases your passion for cryptocurrency and data engineering. Mention specific projects or experiences that align with GSR's needs, such as your work with big data tools or cloud-native services.
Showcase Technical Skills: In your application, clearly outline your programming skills in Java, Python, and SQL. If you have experience with Rust or orchestration tools like Apache Airflow, be sure to include that as well.
Highlight Communication Skills: Since the role requires collaboration across technical and non-technical teams, emphasise your communication skills. Provide examples of how you've successfully worked with diverse teams in previous roles.
How to prepare for a job interview at GSR Markets Limited
✨Know Your Data Engineering Fundamentals
Make sure you have a solid understanding of data engineering principles, especially around ETL/ELT processes. Be prepared to discuss your experience with building scalable data pipelines and how you've tackled challenges in previous roles.
✨Familiarise Yourself with Relevant Technologies
Since the role involves working with tools like Apache Flink and AWS, brush up on these technologies. Be ready to explain how you've used them in past projects and how they can be applied to GSR's operations.
✨Demonstrate Problem-Solving Skills
Prepare to discuss specific examples where you've monitored, troubleshot, or optimised data pipelines. Highlight your analytical skills and how you approach problem-solving in high-pressure environments.
✨Communicate Effectively
As the role requires collaboration with both technical and non-technical teams, practice articulating complex concepts in simple terms. Show that you can bridge the gap between different stakeholders and understand their needs.