
Cloud Data Architect - Remote

Location: Beaverton, OR 97005 - United States
Work Type: Contract/Temp
Positions: 1 Position
Salary Range: US$80 - 90 per hour
Published At: 5 days ago
  • Architect
  • Data Modelling
  • Cloud Platforms

Remote, US - Must work EST hours.

Category: Technology
  • Innovative Technology; High Quality Products, Self-Empowerment
  • Globally Responsible; Sustainable Products, Diversity of Thought
  • Celebration of Sports; If You Have a Body, You Are an Athlete

Title: Cloud Data Architect

Location: Beaverton, OR

Duration: 10 month contract

NIKE, Inc. does more than outfit the world's best athletes. It is a place to explore potential, obliterate boundaries and push out the edges of what can be. The company looks for people who can grow, think, dream and create. Its culture thrives by embracing diversity and rewarding imagination. The brand seeks achievers, leaders and visionaries. At Nike, it’s about each person bringing skills and passion to a challenging and constantly evolving game.

Job Summary

We are seeking a highly skilled Cloud Data Architect to design, implement, and optimize comprehensive data warehousing solutions on the cloud, with a strong focus on Databricks Lakehouse. The ideal candidate will possess a deep understanding of cloud architecture, data warehousing principles, dimensional modeling, and big data technologies. This role requires strategic thinking, technical expertise, and the ability to collaborate with cross-functional teams to translate business and non-functional requirements into analytics data models and technical solutions aligned with the technology strategy. This includes dimensional models, data flows, and other solution details that ensure data integrity, performance, and scalability while aligning with business objectives.

Key Responsibilities

Strategic Data Architecture:

  • Develop and maintain the data and solution architecture for robust, scalable, and secure cloud-based data warehousing solutions, including Databricks Lakehouse.
  • Lead the strategic design of data models, schemas, and data flows to support enterprise business intelligence and advanced analytics needs.
  • Ensure the architecture aligns with business goals, delivering high availability, disaster recovery, and security.

Solution Implementation and Management:

  • Oversee the implementation of comprehensive data warehousing solutions using cloud data platforms and Databricks Lakehouse.
  • Architect and implement ETL/ELT processes for efficient data ingestion, transformation, and loading, particularly within the Databricks environment.
  • Optimize the performance of data warehouses and Lakehouse solutions to handle large-scale data processing and complex data consumption patterns.
  • Design and implement operational observability, and deliver operational metrics for data products.

Technical Leadership and Collaboration:

  • Act as a subject matter expert and advisor to data engineers, analysts, and business stakeholders.
  • Lead and mentor technical teams in best practices, architectural standards, and emerging technologies.
  • Communicate and present architecture solutions and strategies to both technical and non-technical stakeholders.

Continuous Innovation and Improvement:

  • Stay abreast of the latest advancements in cloud technologies, data warehousing, and Databricks Lakehouse architecture.
  • Conduct regular performance tuning, capacity planning, and cost optimization.
  • Implement robust data governance, quality, and compliance measures to maintain data integrity and security.

Skills and Qualifications

Technical Expertise:

  • Cloud Data Analytics Platforms: Deep expertise in designing data and solution architectures using the Lakehouse pattern on cloud platforms such as Databricks, AWS, Google Cloud Platform, or Microsoft Azure; AWS preferred.
  • Data Warehousing: Advanced proficiency in cloud-based data warehousing technologies (e.g., Amazon Redshift, Snowflake, BigQuery) and the Databricks Lakehouse.
  • ETL/ELT Tools: Strong experience with ETL/ELT tools such as Databricks Delta Live Tables (DLT) and Spark-based ETL.
  • Database Management: Strong knowledge of SQL and Lakehouse databases, with expertise in logical and dimensional data modeling, schema design, and normalization/denormalization techniques.

Analytical and Strategic Thinking:

  • Strong analytical and problem-solving skills to diagnose issues and design innovative solutions.
  • Ability to analyze complex business requirements and translate them into scalable and efficient technical specifications.

Leadership and Soft Skills:

  • Excellent leadership, communication, and interpersonal skills to effectively collaborate with cross-functional teams and stakeholders.
  • Strong project management skills to oversee multiple projects, ensure timely delivery, and meet organizational goals.
  • Ability to work independently and lead teams in a fast-paced, dynamic environment.

Experience:

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Minimum of 10 years of experience in data warehousing, with at least 5 years in cloud environments.
  • Proven track record of designing and implementing complex cloud data warehousing solutions, including extensive experience with Databricks Lakehouse.

Certifications (Preferred):

  • AWS Certified Solutions Architect
  • AWS Certified Data Engineer
  • Databricks Certification (e.g., Databricks Certified Data Engineer Associate/Professional)

Published on 02 Jul 2024, 4:50 PM