At a Glance
- Tasks: Design and develop innovative data architectures to drive AI and analytics.
- Company: Bechtel, a global leader in infrastructure and engineering.
- Benefits: Flexible telework options, competitive pay, and career growth opportunities.
- Why this job: Shape the future of data-driven infrastructure and make a real impact.
- Qualifications: 5+ years in data architecture, strong skills in Python or Scala, and Azure experience.
- Other info: Join a diverse team committed to innovation and inclusivity.
The predicted salary is between £36,000 and £60,000 per year.
Extraordinary teams building inspiring projects: Since 1898, we have helped customers complete more than 25,000 projects in 160 countries on all seven continents that have created jobs, grown economies, improved the resiliency of the world's infrastructure, increased access to energy, resources, and vital services, and made the world a safer, cleaner place. Differentiated by the quality of our people and our relentless drive to deliver the most successful outcomes, we align our capabilities to our customers' objectives to create a lasting positive impact. We serve the Infrastructure; Nuclear, Security & Environmental; Energy; Mining & Metals; and Manufacturing & Technology markets. Our services span from initial planning and investment through start-up and operations.
The Infrastructure AI and Data program is a cornerstone initiative designed to transform how our Global Business Unit (GBU) manages, governs, and leverages data to drive innovation and operational excellence. As we scale digital capabilities across complex engineering and delivery environments, the ability to harness data effectively is critical to enabling advanced analytics, AI-driven insights, and seamless collaboration across diverse stakeholders.
The Data Solutions Architect will play a pivotal role in shaping the Unified Data Platform (UDP) built on Azure Databricks, the data foundation that underpins this transformation. This position is not just about building pipelines: it is about architecting a robust, scalable, and secure data ecosystem that empowers decision-making and accelerates AI adoption across the business. By integrating heterogeneous data sources into a unified lakehouse architecture and ensuring governance at every layer, this role will help unlock the full potential of data for predictive analytics, machine learning, and digital delivery solutions.
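For context, the sketch below shows what a minimal bronze-to-silver-to-gold (Medallion) flow of this kind can look like in PySpark on Databricks. The catalog, schema, table, path, and column names are illustrative assumptions rather than the actual UDP schema, and the snippet assumes a Databricks environment with Delta Lake and Unity Catalog available.

```python
# Minimal Medallion (bronze -> silver -> gold) sketch in PySpark.
# All names below (udp catalog, engineering schema, landing path, columns)
# are hypothetical stand-ins for the real UDP objects.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # pre-created in Databricks notebooks

# Bronze: land raw project events as-is, preserving source fidelity.
raw = spark.read.format("json").load("/mnt/landing/project_events/")
raw.write.format("delta").mode("append").saveAsTable("udp.engineering.bronze_project_events")

# Silver: deduplicate, conform types, and drop records that fail basic checks.
silver = (
    spark.table("udp.engineering.bronze_project_events")
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("project_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("udp.engineering.silver_project_events")

# Gold: aggregate into analytics-ready metrics for reporting and ML features.
gold = (
    silver.groupBy("project_id", F.to_date("event_ts").alias("event_date"))
    .agg(F.count("*").alias("daily_event_count"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("udp.engineering.gold_daily_project_activity")
```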
Operating within the PIIM (Project Innovation & Information Management) functional team and reporting to the Innovation & AI Manager, the Data Solutions Architect also engages closely with functional SMEs within the Infrastructure GBU (e.g. Engineering, Construction, Estimating) to understand the data requirements of the business. If you are passionate about designing modern data platforms, enabling AI-driven workflows, and influencing the future of infrastructure delivery through data innovation, this is your opportunity to make a lasting impact.
This position is designated as part-time telework per our global telework policy and may require at least three days of in-person attendance per week at the assigned office or project. Weekly in-person schedules will be determined by the individual and their supervisor, in consultation with functional or project leadership.
Major Responsibilities:
- Lead the design, development, and deployment of enterprise data architectures and pipelines within the Unified Data Platform (UDP) on Azure Databricks.
- Collaborate with functional SMEs, data engineers, and AI specialists to integrate heterogeneous data sources and enable AI-driven use cases.
- Implement Medallion Architecture and real-time data synchronization to improve accuracy, timeliness, and scalability of project data exchanges.
- Establish robust governance frameworks, including data quality checks, lineage tracking, and semantic standardization, using Unity Catalog and Bechtel’s Clarity tool (see the governance sketch after this list).
- Drive innovation in data modeling and architecture to support advanced analytics, machine learning, and digital delivery solutions.
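As a rough illustration of the governance responsibility above, the snippet below adds a data-quality constraint and an access grant to a silver table through Unity Catalog SQL. The catalog, schema, constraint, and group names are assumptions for the example, and integration with Bechtel's Clarity tool is not shown.

```python
# Hypothetical governance sketch: a Delta CHECK constraint plus a Unity Catalog
# grant on the silver table from the earlier Medallion example.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # pre-created in Databricks notebooks

# Data quality: reject writes where project_id is missing.
spark.sql("""
    ALTER TABLE udp.engineering.silver_project_events
    ADD CONSTRAINT valid_project_id CHECK (project_id IS NOT NULL)
""")

# Access control: limit reads to a governed analyst group; Unity Catalog also
# records table-level lineage for queries that run through it.
spark.sql("""
    GRANT SELECT ON TABLE udp.engineering.silver_project_events
    TO `infrastructure-analysts`
""")
```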
Qualifications:
- Bachelor’s degree in computer science or a related field.
- 5+ years of experience in data architecture, data engineering, or a related role.
- Expertise in modern data platforms for structured and semi-structured data, preferably lakehouse architectures such as Databricks.
- Strong programming skills in Python or Scala and experience with Azure cloud services.
- Proven ability to design and optimize scalable data pipelines and architectures.
- Working knowledge of stream processing, queueing systems, and highly scalable data stores.
- Ability to coordinate enterprise solution deployment across diverse stakeholders.
- Hands-on experience with Databricks or Spark, including Medallion Architecture implementation.
- Experience with ML frameworks (TensorFlow, PyTorch, scikit-learn) and MLOps pipelines (MLflow, AzureML); a minimal MLflow tracking sketch follows this list.
- Familiarity with EPC (Engineering, Procurement & Construction) sector data and ontology-driven governance models.
- Knowledge of graph-based models (e.g., Neo4j) for linking entities and relationships in AI contexts.
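To make the MLOps expectation above concrete, here is a minimal sketch of experiment tracking with MLflow and scikit-learn. The synthetic dataset, hyperparameters, and experiment path are illustrative assumptions, not project specifics.

```python
# Hedged MLflow tracking sketch: train a model, log params, metrics, and the
# fitted artifact so runs are reproducible and comparable in the MLflow UI.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic regression data standing in for real project features.
X, y = make_regression(n_samples=500, n_features=10, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("/Shared/udp-demo")  # hypothetical experiment path

with mlflow.start_run():
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)

    mae = mean_absolute_error(y_test, model.predict(X_test))

    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("mae", mae)
    mlflow.sklearn.log_model(model, "model")
```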
For decades, Bechtel has worked to inspire the next generation of employees and beyond! Because our teams face some of the world's toughest challenges, we offer robust benefits to ensure our people thrive. Whether it is advancing careers, delivering programs to enhance our culture, or providing time to recharge, Bechtel has the benefits to build a legacy of sustainable growth.
Diverse teams build the extraordinary: As a global company, Bechtel has long been home to a vibrant multitude of nationalities, cultures, ethnicities, and life experiences. This diversity has made us a more trusted partner, more effective problem solvers and innovators, and a more attractive destination for leading talent. We are committed to being a company where every colleague feels that they belong: where colleagues feel part of 'One Team,' respected and rewarded for what they bring, supported in pursuing their goals, invested in our values and purpose, and treated equitably.
Bechtel is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity and expression, age, national origin, disability, citizenship status (except as authorized by law), protected veteran status, genetic information, and any other characteristic protected by federal, state or local law.
We’re creating a space where all people, including those with disabilities, can advance their careers and optimize their possibilities through a fair and inclusive workplace. If reasonable adjustments are needed to apply for an open position, please contact us to ensure we can provide an environment where each and every candidate can thrive.
Data Solutions Architect employer: Bechtel Oil, Gas & Chemicals Incorporated
Contact Detail:
Bechtel Oil, Gas & Chemicals Incorporated Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Solutions Architect role
✨Tip Number 1
Network like a pro! Reach out to people in your industry, especially those who work at Bechtel or similar companies. A friendly chat can open doors and give you insights that job descriptions just can't.
✨Tip Number 2
Show off your skills in real-time! If you get the chance, participate in hackathons or data challenges. This not only sharpens your skills but also gives you something tangible to discuss during interviews.
✨Tip Number 3
Prepare for those tricky interview questions! Research common questions for Data Solutions Architects and practice your answers. We want you to feel confident and ready to impress!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you're genuinely interested in joining our team.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Solutions Architect role. Highlight your experience with Azure Databricks, data architectures, and any relevant projects that showcase your skills in building scalable data ecosystems.
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data innovation and how your background aligns with Bechtel's mission. Be genuine and let your personality come through!
Showcase Relevant Projects: If you've worked on projects involving AI-driven workflows or advanced analytics, make sure to mention them. We love seeing real-world applications of your skills, so don’t hold back on the details!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it’s super easy to do!
How to prepare for a job interview at Bechtel Oil, Gas & Chemicals Incorporated
✨Know Your Data Inside Out
As a Data Solutions Architect, you’ll need to demonstrate your expertise in data architectures and pipelines. Brush up on your knowledge of Azure Databricks and Medallion Architecture. Be ready to discuss specific projects where you've implemented these technologies and how they contributed to the success of the project.
✨Showcase Your Collaboration Skills
This role involves working closely with functional SMEs and other stakeholders. Prepare examples of how you've successfully collaborated in past projects, especially in integrating diverse data sources. Highlight your ability to communicate complex technical concepts to non-technical team members.
✨Prepare for Technical Questions
Expect to face technical questions related to data modelling, machine learning frameworks, and cloud services. Review key concepts in Python or Scala programming, as well as your experience with MLOps pipelines. Practising coding challenges can also help you feel more confident during the interview.
✨Understand the Company’s Vision
Familiarise yourself with Bechtel's mission and values, especially their commitment to innovation and sustainability. Be prepared to discuss how your skills and experiences align with their goals, particularly in driving AI adoption and enhancing operational excellence through data.