At a Glance
- Tasks: Build enterprise-scale data platforms and pipelines for analytics and AI.
- Company: Join bigspark, a fast-growing company transforming businesses with data and AI.
- Benefits: Enjoy competitive salary, generous leave, bonuses, and private medical cover.
- Why this job: Make a real impact by harnessing data to drive positive change.
- Qualifications: 3+ years in data engineering with strong programming skills in Python, Scala, or Java.
- Other info: Hybrid work model with opportunities for career growth in a dynamic environment.
The predicted salary is between £36,000 and £60,000 per year.
We are creating a world of opportunity for businesses by responsibly harnessing data and AI to enable positive change. We adapt to our clients' needs and then bring our engineering, development and consultancy expertise. Our people and our solutions ensure they head into the future equipped to succeed.
Our clients include Tier 1 banking and insurance institutions, and we have also been listed in the Sunday Times Top 100 Fastest Growing Private Companies.
The Role
We're looking for a Data Engineer to develop enterprise-scale data platforms and pipelines that power analytics, AI, and business decision-making. You'll work in a hybrid capacity, which may require up to 2 days per week on client premises.
What You'll Do
- Develop highly available, scalable batch and streaming pipelines (ETL/ELT) using modern orchestration frameworks (see the illustrative sketch after this list).
- Integrate and process large, diverse datasets across hybrid and multi-cloud environments.
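To give a flavour of this kind of work, here is a minimal sketch of a daily batch ETL pipeline using Apache Airflow's TaskFlow API, one of the orchestration frameworks named later in this advert. The DAG name, tasks, and data are illustrative placeholders rather than anything from bigspark's actual platform, and it assumes a recent Airflow 2.x release.

```python
# Minimal sketch of a daily batch ETL pipeline with Airflow's TaskFlow API.
# The DAG, task names, and data are illustrative placeholders only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_orders_etl():
    @task
    def extract() -> list[dict]:
        # In practice this would pull records from a source system or landing zone.
        return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 80.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # A stand-in business rule: keep orders of 100 or more.
        return [r for r in rows if r["amount"] >= 100.0]

    @task
    def load(rows: list[dict]) -> None:
        # A real pipeline would write to a warehouse or lakehouse table here.
        print(f"Loading {len(rows)} rows")

    load(transform(extract()))


daily_orders_etl()
```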
What You'll Bring
- 3+ years' commercial data engineering experience.
- Strong programming skills in Python, Scala, or Java, with clean coding and testing practices.
- Big Data & Analytics Platforms: Hands-on experience with Apache Spark (core, SQL, streaming), Databricks, Snowflake, Flink, Beam (a short streaming sketch follows this list).
- Data Lakehouse & Storage Formats: Expert knowledge of Delta Lake, Apache Iceberg, Hudi, and file formats like Parquet, ORC, Avro.
- Streaming & Messaging: Experience with Kafka (including Schema Registry & Kafka Streams), Pulsar, AWS Kinesis, or Azure Event Hubs.
- Data Modelling & Virtualisation: Knowledge of dimensional, Data Vault, and semantic modelling; tools like Denodo or Starburst/Trino.
- Cloud Platforms: Strong AWS experience (Glue, EMR, Athena, S3, Lambda, Step Functions), plus awareness of Azure Synapse, GCP BigQuery.
- Databases: Proficient with SQL and NoSQL stores (PostgreSQL, MySQL, DynamoDB, MongoDB, Cassandra).
- Orchestration & Workflow: Experience with Autosys/CA7/Control-M, Airflow, Dagster, Prefect, or managed equivalents.
- Observability & Lineage: Familiarity with OpenLineage, Marquez, Great Expectations, Monte Carlo, or Soda for data quality.
- DevOps & CI/CD: Proficient in Git (GitHub/GitLab), Jenkins, Terraform, Docker, Kubernetes (EKS/AKS/GKE, OpenShift).
- Security & Governance: Experience with encryption, tokenisation (e.g., Protegrity), IAM policies, and GDPR compliance.
- Linux administration skills and strong infrastructure-as-code experience.
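As a rough illustration of how several of the technologies above fit together, the sketch below streams events from Kafka into a Delta Lake table with PySpark Structured Streaming. The broker address, topic name, and paths are placeholders, and it assumes the Kafka and Delta Lake connectors are available on the Spark classpath.

```python
# Minimal sketch: stream events from a Kafka topic into a Delta Lake table
# using PySpark Structured Streaming. Broker, topic, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Read raw events from Kafka as an unbounded streaming DataFrame.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
)

# Kafka delivers keys and values as bytes; cast the payload to a string so
# downstream jobs can parse it (e.g. with from_json against a registered schema).
parsed = events.select(col("value").cast("string").alias("payload"))

# Append the stream to a Delta table, tracking progress via a checkpoint directory.
query = (
    parsed.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/orders")
    .outputMode("append")
    .start("/tmp/delta/orders")
)

query.awaitTermination()
```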
In return, you will receive:
- Competitive salary
- Generous Annual Leave
- Discretionary Annual Bonus
- Pension Scheme
- Life Assurance
- Private Medical Cover (inc family)
- Permanent Health Insurance Cover / Income Protection
- Employee Assistance Programme
- A Perkbox account
Data Engineer in Glasgow employer: bigspark
Contact Details:
bigspark Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role in Glasgow
✨ Tip Number 1
Network like a pro! Reach out to folks in the industry on LinkedIn or at local meetups. You never know who might have the inside scoop on job openings or can put in a good word for you.
✨ Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving data pipelines and analytics. This gives potential employers a taste of what you can do beyond just your CV.
✨ Tip Number 3
Prepare for interviews by brushing up on common data engineering questions and practical tests. We recommend practising coding challenges on platforms like LeetCode or HackerRank to sharpen your skills.
✨ Tip Number 4
Apply through our website! It's the best way to ensure your application gets seen by the right people. Plus, it shows you're genuinely interested in joining our team at bigspark.
We think you need these skills to ace the Data Engineer role in Glasgow
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experience mentioned in the job description. Highlight your data engineering experience, especially with tools like Apache Spark and AWS, to show us you're the right fit!
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about data engineering and how your background aligns with our mission at bigspark. A personal touch can really make you stand out!
Showcase Your Projects: If you've worked on any relevant projects, whether in a professional or personal capacity, be sure to mention them. We love seeing practical examples of your skills in action!
Apply Through Our Website: We encourage you to apply directly through our website. It's the best way for us to receive your application and ensures you don't miss out on any important updates from us!
How to prepare for a job interview at bigspark
✨ Know Your Tech Stack
Make sure you're well-versed in the technologies mentioned in the job description. Brush up on your skills in Python, Scala, or Java, and be ready to discuss your experience with Apache Spark, Kafka, and cloud platforms like AWS. Being able to talk confidently about these tools will show that you're a great fit for the role.
✨ Showcase Your Problem-Solving Skills
Prepare to discuss specific challenges you've faced in previous roles and how you overcame them. Use the STAR method (Situation, Task, Action, Result) to structure your answers. This will help demonstrate your analytical thinking and ability to develop scalable data pipelines.
✨ Understand the Company's Vision
Research bigspark and their approach to harnessing data and AI. Be ready to explain how your skills can contribute to their mission of enabling positive change for businesses. Showing that you align with their values will make a strong impression.
✨ Ask Insightful Questions
Prepare thoughtful questions about the team, projects, and company culture. This not only shows your interest in the role but also helps you gauge if the company is the right fit for you. Questions about their data governance practices or future tech stack developments can spark engaging conversations.