At a Glance
- Tasks: Design and build a cutting-edge Databricks lakehouse platform for advanced analytics.
- Company: Fast-growing organisation on a mission to become data-driven.
- Benefits: Hybrid work model, competitive salary, and opportunities for continuous learning.
- Why this job: Shape data infrastructure that directly impacts real-world outcomes in a greenfield environment.
- Qualifications: Experience with Databricks, Python, SQL, and a collaborative mindset.
- Other info: Join a dynamic team and contribute to innovative data solutions.
Hybrid (Cambridge, 1 day/week on-site)
We’re working with an innovative, fast-growing organisation that is on a mission to become truly data-driven. They’re looking for a Senior Data Engineer to play a key role in designing and building a new Databricks-based lakehouse platform — the foundation for unlocking advanced analytics and AI solutions across the business.
This is a fantastic opportunity for a product-minded engineer who thrives in a greenfield environment and wants to shape data infrastructure that will directly impact real-world outcomes.
What you’ll be doing:
Designing and shaping the foundations of a Databricks lakehouse platform (Unity Catalog, dimensions, facts, governance).
Writing clean, performant Python and SQL, with strong use of Spark/PySpark.
Integrating 3rd-party tools and connectors (CRM systems, discovery platforms) into a cohesive ecosystem.
Taking ownership of components from inception to production — ensuring reliability and scalability.
Championing best practices: CI/CD, testing, modular code, and clean architecture.
Collaborating with stakeholders across R&D, Commercial, and beyond to solve real-world problems with elegant data solutions.
Creating clear, concise technical documentation and contributing to continuous improvement initiatives.
What we’re looking for:
Hands-on experience building Databricks lakehouses in cloud environments (AWS preferred).
Strong background in data engineering within scaling or greenfield settings.
A “data-as-a-product” mindset: quality checks, version control, documentation, and measurable business value.
Solid engineering skills: Git, CI, automated tests, alerting, and monitoring.
Someone collaborative, adaptable, and motivated by continuous learning.
Excited to establish foundational patterns that others will follow.
If you’re excited about building trusted, scalable data platforms from the ground up and want to see your work make a visible impact, we’d love to hear from you.
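To make the “data-as-a-product” mindset above concrete: in practice it often means adding a small validation gate so only records that pass quality rules are published, while failures are quarantined for review. The sketch below is purely illustrative (the record shape, field names, and rules are hypothetical, not taken from this role); in a real Databricks lakehouse the same idea would typically be expressed with PySpark or Delta Live Tables expectations.

```python
def check_quality(records, required_fields=("id", "amount")):
    """Split records into (valid, rejected) using simple quality rules.

    Illustrative only: real pipelines would express these rules as
    PySpark filters or Delta Live Tables expectations.
    """
    valid, rejected = [], []
    for record in records:
        # Rule 1: every required field must be present and non-null.
        missing = [f for f in required_fields if record.get(f) is None]
        # Rule 2: amounts, where present, must be non-negative.
        bad_amount = record.get("amount") is not None and record["amount"] < 0
        if missing or bad_amount:
            rejected.append(record)   # quarantine for review
        else:
            valid.append(record)      # safe to publish downstream
    return valid, rejected

valid, rejected = check_quality([
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": -5.0},    # negative amount -> rejected
    {"id": None, "amount": 3.0},  # missing id -> rejected
])
```

The design choice worth noting is that rejected records are kept rather than silently dropped, so data quality becomes measurable over time.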
Employer: SoCode Limited
Contact Detail:
SoCode Limited Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer | Cambridge | Greenfield Project role:
✨Tip Number 1
Network like a pro! Reach out to people in your industry on LinkedIn or at local meetups. You never know who might have the inside scoop on job openings or can put in a good word for you.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those related to Databricks and data engineering. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by practising common questions and scenarios specific to data engineering. Think about how you would approach building a lakehouse platform and be ready to discuss your thought process.
✨Tip Number 4
Apply through our website! We’re always on the lookout for talented individuals like you. Plus, it’s a great way to ensure your application gets seen by the right people.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experiences that match the Senior Data Engineer role. Highlight your hands-on experience with Databricks and any relevant projects you've worked on, especially in greenfield settings.
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about data engineering and how you can contribute to our mission. Share specific examples of how you've tackled challenges in previous roles, particularly around building scalable data solutions.
Showcase Your Technical Skills: Don’t forget to mention your proficiency in Python, SQL, and Spark/PySpark. We want to see how you've applied these skills in real-world scenarios, so include any relevant projects or achievements that demonstrate your expertise.
Apply Through Our Website: We encourage you to apply directly through our website for the best chance of getting noticed. It’s the easiest way for us to keep track of your application and ensure it reaches the right people!
How to prepare for a job interview at SoCode Limited
✨Know Your Databricks Inside Out
Make sure you brush up on your knowledge of Databricks and lakehouse architecture. Be ready to discuss how you've used it in past projects, especially in cloud environments like AWS. This will show that you’re not just familiar with the tools but can also leverage them effectively.
✨Showcase Your Coding Skills
Prepare to demonstrate your Python and SQL skills during the interview. You might be asked to solve a problem or write some code on the spot, so practice writing clean, efficient code beforehand. Highlight any experience you have with Spark/PySpark as well!
✨Emphasise Collaboration and Communication
Since this role involves working with various stakeholders, be ready to share examples of how you've successfully collaborated in the past. Discuss how you’ve communicated complex technical concepts to non-technical team members, as this will showcase your ability to bridge gaps between teams.
✨Prepare for Real-World Problem Solving
Think about specific challenges you've faced in previous roles and how you solved them. Be prepared to discuss how you would approach real-world problems using data solutions, as this aligns perfectly with the company's mission to become data-driven.