AWS Cloud Data Architect

London · Full-Time · £72,000 – £108,000 / year (est.) · No home office possible

At a Glance

  • Tasks: Design and implement cloud data solutions using AWS and other technologies.
  • Company: Join a leading tech firm focused on innovative cloud solutions.
  • Benefits: Enjoy a competitive salary, training opportunities, and a vibrant office culture.
  • Why this job: Be part of a dynamic team shaping the future of data architecture.
  • Qualifications: 16-18+ years in DWBI, Big Data, and Cloud Technologies required.
  • Other info: This is a senior-level role based in London, with a focus on presales activities.

The predicted salary is between £72,000 and £108,000 per year.

16-18+ years of total experience in DWBI, Big Data, Cloud Technologies.

Implementation experience and hands-on experience in at least two of the following cloud technologies: Azure, AWS, GCP, Snowflake, Databricks.

Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically with Big Data processing services (Apache Spark, Beam, or equivalent).

In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, and Dataflow/Airflow/ADF.

Excellent ETL and data modelling skills.

Ability to define monitoring, alerting, and deployment strategies for various services.

Experience providing solutions for resiliency, failover, monitoring, etc.

Good to have: working knowledge of Jenkins, Terraform, Stackdriver, or other DevOps tools.

Design and implement effective database solutions and models to store and retrieve data.

Examine and identify database structural necessities by evaluating client operations, applications, and programming.

Ability to recommend solutions to improve new and existing database systems.

Assess data implementation procedures to ensure they comply with internal and external regulations.

Prepare accurate database design and architecture reports for management and executive teams.

Oversee the migration of data from legacy systems to new solutions.

Educate staff members through training and individual support.

Strong knowledge of database structure systems and data mining.

Knowledge of systems development, including system development life cycle, project management approaches and requirements, design and testing techniques.

Proficiency in data modeling and design, including SQL development and database administration.

Ability to implement common data management and reporting technologies, as well as columnar and NoSQL databases, data visualisation, unstructured data, and predictive analytics.

Mandatory Skills: GCP, AWS, Azure, Big Data, Apache Spark, Beam, BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF.

This is a senior-level, in-house opportunity focused on Data Architecture, Cloud technologies, and Presales activities, including contributions to RFPs and RFQs. The role requires working five days a week from the client's London office.

AWS Cloud Data Architect employer: Gazelle Global

As an esteemed employer, we offer a dynamic work environment in our London office that fosters innovation and collaboration among talented professionals. Our commitment to employee growth is evident through continuous training opportunities and involvement in cutting-edge projects, particularly in cloud technologies and data architecture. With a strong focus on work-life balance and a culture that values diversity and inclusion, we ensure that our team members thrive both personally and professionally.

Contact Detail:

Gazelle Global Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land AWS Cloud Data Architect

✨ Tip Number 1

Network with professionals in the cloud data architecture field, especially those who have experience with AWS and GCP. Attend industry meetups or webinars to connect with potential colleagues and learn about the latest trends and technologies.

✨ Tip Number 2

Showcase your hands-on experience with specific tools mentioned in the job description, like Apache Spark or Databricks, during informal conversations or networking events. This can help you stand out as a candidate who is not only knowledgeable but also practically skilled.

✨ Tip Number 3

Familiarise yourself with the company's projects and values by researching its recent initiatives. This will allow you to tailor your discussions and demonstrate how your skills align with their goals during interviews.

✨ Tip Number 4

Prepare to discuss your experience with data migration and database design in detail. Be ready to share specific examples of challenges you've faced and how you overcame them, as this will highlight your problem-solving abilities and expertise in the field.

We think you need these skills to ace AWS Cloud Data Architect

Cloud Technologies (AWS, GCP, Azure)
Big Data Processing (Apache Spark, Beam)
Data Warehousing and Business Intelligence (DWBI)
ETL Skills
Data Modelling Skills
Database Design and Architecture
Monitoring and Alerting Strategies
Data Migration from Legacy Systems
SQL Development
Database Administration
Knowledge of BigQuery, Redshift, Synapse
Experience with Pub/Sub, Kinesis, MQ, Event Hubs
Familiarity with Kafka, Dataflow, Airflow, ADF
DevOps Tools (Jenkins, Terraform, Stackdriver)
Data Management and Reporting Technologies
Predictive Analytics
Strong Communication Skills
Project Management Approaches
Training and Support for Staff

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights your experience with AWS, GCP, and Azure. Emphasise your hands-on experience with Big Data processing services like Apache Spark and your proficiency in ETL and data modelling.

Craft a Compelling Cover Letter: In your cover letter, explain why you are passionate about cloud technologies and data architecture. Mention specific projects where you've successfully implemented solutions using the required technologies.

Showcase Relevant Projects: Include examples of past projects that demonstrate your ability to design and implement database solutions. Highlight any experience with migration from legacy systems and your role in ensuring compliance with regulations.

Highlight Soft Skills: Since this role involves educating staff and contributing to presales activities, make sure to mention your communication skills and any experience in training or mentoring others in cloud technologies.

How to prepare for a job interview at Gazelle Global

✨ Showcase Your Hands-On Experience

Make sure to highlight your practical experience with AWS and at least one other hyperscaler. Be prepared to discuss specific projects where you implemented cloud technologies, especially in Big Data processing services like Apache Spark or Beam.

✨ Demonstrate Your ETL and Data Modelling Skills

Prepare examples that showcase your expertise in ETL processes and data modelling. Discuss how you've designed effective database solutions and the impact of your work on previous projects.

✨ Understand the Business Needs

Research the company and understand their business model. Be ready to explain how your skills can help them improve their database systems and meet compliance regulations, as well as how you can contribute to their presales activities.

✨ Prepare for Technical Questions

Expect technical questions related to cloud technologies, data architecture, and DevOps tools. Brush up on your knowledge of tools like Jenkins, Terraform, and data visualisation techniques to demonstrate your comprehensive understanding of the field.
