Cloud Data Architect (AWS)

London | Full-Time | £72,000 - £108,000 / year (est.) | No home office possible
At a Glance

  • Tasks: Design and implement cloud data solutions using AWS and other technologies.
  • Company: Join a leading tech firm focused on innovative cloud solutions.
  • Benefits: Enjoy competitive pay, training opportunities, and a collaborative work environment.
  • Why this job: Be part of a dynamic team shaping the future of data architecture.
  • Qualifications: 16-18+ years in DWBI, Big Data, and Cloud Technologies required.
  • Other info: This is a senior-level role based in London, with a focus on presales activities.

The predicted salary is between £72,000 and £108,000 per year.

16-18+ years of total experience in DWBI, Big Data, Cloud Technologies.

Implementation and hands-on experience with at least two of the following cloud technologies: Azure, AWS, GCP, Snowflake, Databricks.

Must have hands-on experience with at least two hyperscalers (GCP/AWS/Azure), specifically with Big Data processing services (Apache Spark, Beam, or equivalent).

In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, and Dataflow/Airflow/ADF.

Excellent ETL and data modelling skills.

Ability to define monitoring, alerting, and deployment strategies for various services.

Experience providing solutions for resiliency, failover, and monitoring.

Good to have: working knowledge of Jenkins, Terraform, Stackdriver, or other DevOps tools.

Design and implement effective database solutions and models to store and retrieve data.

Examine and identify database structural requirements by evaluating client operations, applications, and programming.

Ability to recommend solutions to improve new and existing database systems.

Assess data implementation procedures to ensure they comply with internal and external regulations.

Prepare accurate database design and architecture reports for management and executive teams.

Oversee the migration of data from legacy systems to new solutions.

Educate staff members through training and individual support.

Strong knowledge of database structure systems and data mining.

Knowledge of systems development, including system development life cycle, project management approaches and requirements, design and testing techniques.

Proficiency in data modeling and design, including SQL development and database administration.

Ability to implement common data management and reporting technologies, as well as columnar and NoSQL databases, data visualization, unstructured data, and predictive analytics.

Mandatory Skills [at least 2 hyperscalers]: GCP, AWS, Azure; Big Data; Apache Spark/Beam; BigQuery/Redshift/Synapse; Pub/Sub/Kinesis/MQ/Event Hubs; Kafka; Dataflow/Airflow/ADF.

Designing Databricks-based solutions for Azure/AWS; Jenkins, Terraform, Stackdriver, or other DevOps tools.

This is a senior-level, in-house opportunity focused on Data Architecture, Cloud technologies, and presales activities, including contributions to RFPs and RFQs. The role requires working five days a week from the client's London office.

Cloud Data Architect (AWS) employer: Gazelle Global

As a leading employer in the tech industry, we offer an exceptional work environment that fosters innovation and collaboration, particularly for our Cloud Data Architect role based in London. Our commitment to employee growth is evident through continuous training opportunities and a culture that encourages knowledge sharing, while our competitive benefits package ensures a rewarding work-life balance. Join us to be part of a dynamic team that values your expertise and empowers you to make impactful contributions in the rapidly evolving field of cloud technologies.

Contact Detail:

Gazelle Global Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Cloud Data Architect (AWS) role

✨Tip Number 1

Network with professionals in the cloud data architecture field, especially those who have experience with AWS and GCP. Attend industry meetups or webinars to connect with potential colleagues and learn about the latest trends and technologies.

✨Tip Number 2

Showcase your hands-on experience with big data processing services like Apache Spark or Beam by contributing to open-source projects or creating your own projects. This practical experience can set you apart from other candidates.

✨Tip Number 3

Familiarise yourself with the specific tools mentioned in the job description, such as Jenkins, Terraform, and various database solutions. Consider taking online courses or certifications to deepen your knowledge and demonstrate your commitment to continuous learning.

✨Tip Number 4

Prepare to discuss your previous projects in detail during interviews, focusing on your role in designing and implementing database solutions. Highlight your problem-solving skills and how you've contributed to improving existing systems.

We think you need these skills to ace Cloud Data Architect (AWS)

Cloud Technologies (AWS, GCP, Azure)
Big Data Processing (Apache Spark, Beam)
Data Warehousing and Business Intelligence (DWBI)
ETL Skills
Data Modelling Skills
Database Design and Architecture
Monitoring and Alerting Strategies
Resiliency and Failover Solutions
DevOps Tools (Jenkins, Terraform, StackDriver)
SQL Development
Database Administration
Data Migration from Legacy Systems
Data Visualization
Unstructured Data Management
Predictive Analytics
Project Management Approaches
System Development Life Cycle (SDLC)
Communication Skills
Training and Support for Staff

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights your experience with Cloud technologies, especially AWS and any other hyperscalers you've worked with. Emphasise your hands-on experience with Big Data processing services and relevant tools like Apache Spark and Databricks.

Craft a Strong Cover Letter: In your cover letter, explain why you're passionate about Cloud Data Architecture and how your skills align with the job requirements. Mention specific projects where you've implemented solutions using the technologies listed in the job description.

Showcase Relevant Projects: Include a section in your application that details specific projects you've worked on that relate to the role. Highlight your experience with data migration, database design, and any training you've provided to staff members.

Highlight Soft Skills: Don't forget to mention your soft skills, such as communication and teamwork. Since this role involves educating staff and working closely with management, showcasing these abilities can set you apart from other candidates.

How to prepare for a job interview at Gazelle Global

✨Showcase Your Cloud Expertise

Make sure to highlight your hands-on experience with AWS and at least one other hyperscaler like GCP or Azure. Be prepared to discuss specific projects where you've implemented cloud technologies, especially in Big Data processing.

✨Demonstrate ETL and Data Modelling Skills

Prepare examples of your ETL processes and data modelling techniques. Discuss how you've designed effective database solutions and the impact they had on previous projects, focusing on your ability to improve existing systems.

✨Discuss Monitoring and Resiliency Strategies

Be ready to explain how you define monitoring and alerting strategies for cloud services. Share your experiences with ensuring resiliency and failover in your past roles, as this is crucial for the position.

✨Prepare for Technical Questions

Expect technical questions related to tools like Apache Spark, Kafka, and data migration strategies. Brush up on your knowledge of these technologies and be ready to provide detailed answers or even walk through a problem-solving scenario.
