At a Glance
- Tasks: Design and manage cutting-edge data platforms using cloud-native architecture.
- Company: Join a forward-thinking tech company in Basildon, UK.
- Benefits: Competitive contract pay and the chance to work onsite with a dynamic team.
- Other info: Opportunity for professional growth in a fast-paced, tech-driven setting.
- Why this job: Make an impact by architecting innovative data solutions in a collaborative environment.
- Qualifications: Strong experience in solution architecture and hands-on expertise in Snowflake.
The predicted salary is between £60,000 and £80,000 per year.
Location: Dublin / Nenagh, Ireland or Basildon, United Kingdom
Work Mode: Fully Onsite
Engagement Type: Contract
Key Responsibilities / Skill Requirements:
- Strong experience in Solution Architecture and Cloud-Native Architecture Design
- Hands-on expertise in Snowflake Architecture
- Experience with Data Lakehouse architectures, including Apache Iceberg
- Proficiency in designing and managing S3-based Data Lakes
- Experience with Apache Airflow / MWAA for workflow orchestration
- Strong knowledge of dbt (data build tool) for data transformation
- Experience with EMR / PySpark for large-scale data processing
- Ability to create Architecture Design Documents (ADDs)
- Deep understanding of cloud-native design patterns
- Experience in performance tuning across:
  - Snowflake
  - Airflow
  - Iceberg
Employer: N Consulting Limited
Contact Detail:
N Consulting Limited Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Platform Solution Architect role
✨Tip Number 1
Network like a pro! Reach out to people in the industry on LinkedIn or at local meetups. Sometimes it’s not just what you know, but who you know, that lands you that dream job.
✨Tip Number 2
Prepare for those interviews by practising common questions related to Solution Architecture and Cloud-Native Design. We recommend doing mock interviews with friends or using online platforms to get comfortable with your responses.
✨Tip Number 3
Showcase your skills! Create a portfolio or GitHub repository that highlights your projects, especially those involving Snowflake or Data Lakehouse architectures. We want to see your hands-on expertise in action!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who are proactive about their job search.
We think you need these skills to ace the Data Platform Solution Architect role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Solution Architecture and Cloud-Native Design. We want to see how your skills align with the key responsibilities listed in the job description.
Showcase Relevant Projects: Include specific projects where you've worked with Snowflake, Apache Iceberg, or any of the other technologies mentioned. This helps us understand your hands-on expertise and how you can contribute to our team.
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about data platforms and how your background makes you a great fit for this role. Be genuine and let your personality shine through!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you don’t miss out on any important updates from us!
How to prepare for a job interview at N Consulting Limited
✨Know Your Architecture
Make sure you brush up on your knowledge of Solution Architecture and Cloud-Native Architecture Design. Be ready to discuss your hands-on experience with Snowflake Architecture and Data Lakehouse architectures, especially Apache Iceberg. Prepare examples from your past work that showcase your expertise.
✨Showcase Your Workflow Skills
Since experience with Apache Airflow or MWAA is crucial, be prepared to talk about how you've used these tools for workflow orchestration. Bring specific examples of projects where you implemented them and the impact they had on performance and efficiency.
✨Demonstrate Data Transformation Proficiency
Highlight your proficiency in dbt (data build tool) for data transformation. Discuss how you've used it to streamline processes and improve data quality. If possible, share metrics or outcomes that illustrate your success in this area.
✨Prepare for Technical Questions
Expect technical questions related to performance tuning across Snowflake, Airflow, and Iceberg. Brush up on best practices and be ready to explain your approach to ensuring platform reliability, scalability, and observability. This will show your deep understanding of cloud-native design patterns.