At a Glance
- Tasks: Design secure integrations between AWS, Databricks, and enterprise data platforms.
- Company: Join a forward-thinking tech company with a hybrid work culture.
- Benefits: Competitive pay, flexible working, and opportunities for professional growth.
- Other info: Dynamic role with excellent career advancement potential.
- Why this job: Be at the forefront of cloud technology and make a real impact.
- Qualifications: Experience with AWS, Databricks, and Terraform is essential.
The predicted salary is between £60,000 and £80,000 per year.
Job Type: Contract - Inside IR35
Location: Leeds and London - Hybrid
Job Description
Platform Engineer - AWS, Databricks and Unity Catalog
SC Eligible | Foundry experience: Desirable | Core skills: AWS, Databricks, Unity Catalog, Terraform
Role Overview
Support secure integration between AWS-hosted data sources, Databricks workspaces and downstream enterprise data platforms. The role covers data held in S3, registered through Databricks Hive Metastore, governed or exposed through Unity Catalog, and consumed through approved connector-based patterns.
Key Responsibilities
- Design and support secure integration across AWS S3, Databricks and downstream data platforms.
- Configure least-privilege access to S3 using IAM roles, bucket policies, KMS permissions and approved access controls.
- Support Databricks access patterns where S3-backed data is registered through Hive Metastore and exposed or migrated into Unity Catalog.
- Configure and validate Unity Catalog objects, including catalogs, schemas, tables, views, grants, storage credentials and external locations.
- Support downstream connectivity to Databricks using SQL Warehouse, tables, views or connector-based integration patterns.
- Develop Terraform/IaC for AWS IAM, S3, KMS and Databricks-related configuration.
- Troubleshoot access, metadata, schema, query and connectivity issues across AWS, Databricks and downstream integrations.
- Document lineage across AWS S3, Hive Metastore, Unity Catalog and downstream consumption layers.
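As a rough illustration of the least-privilege S3 access pattern in the responsibilities above, the sketch below builds a read-only IAM policy document (list and read a single prefix, plus decrypt with the bucket's KMS key) as plain JSON. The bucket and key ARNs and the prefix are made-up placeholders, not values from this role.

```python
import json

# Hypothetical ARNs for illustration only -- substitute real values.
BUCKET_ARN = "arn:aws:s3:::example-data-bucket"
KMS_KEY_ARN = "arn:aws:kms:eu-west-2:123456789012:key/example-key-id"

def read_only_s3_policy(bucket_arn: str, prefix: str, kms_key_arn: str) -> dict:
    """Build a least-privilege IAM policy document: list + read one S3
    prefix, plus decrypt with the bucket's KMS key. No write access."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "ListPrefix",
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": bucket_arn,
                "Condition": {"StringLike": {"s3:prefix": [f"{prefix}/*"]}},
            },
            {
                "Sid": "ReadObjects",
                "Effect": "Allow",
                "Action": ["s3:GetObject"],
                "Resource": f"{bucket_arn}/{prefix}/*",
            },
            {
                "Sid": "DecryptWithBucketKey",
                "Effect": "Allow",
                "Action": ["kms:Decrypt", "kms:DescribeKey"],
                "Resource": kms_key_arn,
            },
        ],
    }

policy = read_only_s3_policy(BUCKET_ARN, "curated/sales", KMS_KEY_ARN)
print(json.dumps(policy, indent=2))
```

In practice a policy like this would be attached to the IAM role that the Databricks workspace's instance profile or storage credential assumes, rather than to individual users.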
Required Skills and Experience
- Strong hands-on experience with AWS IAM, S3, KMS and secure cloud access patterns.
- Hands-on Databricks on AWS experience, including Hive Metastore, Delta tables, SQL Warehouse and workspace access controls.
- Practical Unity Catalog experience, including grants, external locations, storage credentials and governed data access.
- Terraform or similar Infrastructure as Code experience.
- Good understanding of Delta, Parquet, CSV and JSON data formats.
- Strong troubleshooting, documentation and stakeholder communication skills.
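To make the Unity Catalog side of these skills concrete, the sketch below generates the kind of SQL involved in exposing an S3-backed dataset through UC: a storage-credential-backed external location, then grants to a consuming group. All object and group names are invented for illustration; in practice these statements would run against a Databricks SQL warehouse.

```python
# Sketch of Unity Catalog statements for governed access to S3-backed data:
# external location (backed by a storage credential) -> grants to consumers.

def uc_external_location_ddl(name: str, s3_url: str, credential: str) -> str:
    """DDL for an external location pointing at an S3 path."""
    return (
        f"CREATE EXTERNAL LOCATION IF NOT EXISTS {name} "
        f"URL '{s3_url}' WITH (STORAGE CREDENTIAL {credential})"
    )

def uc_grant(privilege: str, securable: str, principal: str) -> str:
    """A single Unity Catalog GRANT statement."""
    return f"GRANT {privilege} ON {securable} TO `{principal}`"

statements = [
    uc_external_location_ddl(
        "sales_raw_loc", "s3://example-data-bucket/curated/sales", "sales_cred"
    ),
    uc_grant("USE CATALOG", "CATALOG sales", "data-consumers"),
    uc_grant("SELECT", "TABLE sales.curated.orders", "data-consumers"),
]
for s in statements:
    print(s)
```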
Desirable Skills
- Palantir Foundry experience, especially Databricks connector, Data Connection, datasets, syncs, projects and permissions.
- Secure data-sharing, data governance, lineage and access approval experience.
- Python, PySpark or SQL development experience.
- CI/CD experience using GitLab, GitHub Actions, Azure DevOps or similar.
Key Deliverables
- Secure AWS S3 and Databricks access design.
- Hive Metastore to Unity Catalog integration approach.
- Terraform/IaC modules or scripts for repeatable configuration.
- Source-to-target lineage and access-control documentation.
- Operational runbook for access, schema, sync and connectivity issues.
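One possible shape for the Hive Metastore to Unity Catalog deliverable above is to script Databricks `SYNC` statements that upgrade external HMS schemas into a UC catalog, dry-running first. The catalog and schema names below are made up; treat this as a sketch of the approach, not a finished migration tool.

```python
# Generate Databricks SYNC statements that upgrade external Hive Metastore
# schemas into a Unity Catalog catalog. DRY RUN lets you validate first.

def hms_to_uc_sync(uc_catalog: str, schemas: list[str], dry_run: bool = True) -> list[str]:
    """Return one SYNC SCHEMA statement per Hive Metastore schema."""
    suffix = " DRY RUN" if dry_run else ""
    return [
        f"SYNC SCHEMA {uc_catalog}.{schema} FROM hive_metastore.{schema}{suffix}"
        for schema in schemas
    ]

for stmt in hms_to_uc_sync("enterprise_data", ["sales", "finance"]):
    print(stmt)
```

Running with `dry_run=False` would produce the real upgrade statements once the dry-run output has been reviewed.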
Employer: StackStudio Digital Ltd.
Contact Detail:
StackStudio Digital Ltd. Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Platform Engineer AWS, Databricks and Unity Catalog role
✨Tip Number 1
Network like a pro! Reach out to folks in the industry, attend meetups, and connect with people on LinkedIn. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Show off your skills! Create a portfolio or GitHub repository showcasing your projects, especially those involving AWS, Databricks, and Terraform. This gives potential employers a taste of what you can do beyond your CV.
✨Tip Number 3
Prepare for interviews by brushing up on common technical questions related to AWS, Databricks, and Unity Catalog. Practise explaining your thought process clearly, as communication is key in tech roles.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with AWS, Databricks, and Unity Catalog. We want to see how your skills match the job description, so don’t be shy about showcasing relevant projects or achievements!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about this role and how your background makes you a perfect fit. We love seeing enthusiasm and a personal touch!
Showcase Your Technical Skills: When filling out your application, be specific about your hands-on experience with Terraform, IAM, and other key technologies mentioned in the job description. We’re looking for concrete examples that demonstrate your expertise.
How to prepare for a job interview at StackStudio Digital Ltd.
✨Know Your Tech Stack
Make sure you’re well-versed in AWS, Databricks, and Unity Catalog. Brush up on your knowledge of IAM roles, S3 configurations, and Terraform. Being able to discuss specific projects where you've used these technologies will show your hands-on experience.
✨Prepare for Scenario Questions
Expect questions that ask how you would handle specific integration scenarios or troubleshoot issues. Think about past experiences where you’ve designed secure access or resolved connectivity problems, and be ready to explain your thought process.
✨Showcase Your Documentation Skills
Since documentation is key in this role, be prepared to discuss how you document lineage and access controls. Bring examples of your previous work if possible, as this will demonstrate your attention to detail and organisational skills.
✨Communicate Clearly
Strong communication skills are essential, especially when discussing technical concepts with stakeholders. Practise explaining complex ideas in simple terms, and be ready to engage in discussions about data governance and secure access patterns.