
Data Engineer (Remote)

Location: Beaverton, OR 97005 - United States
Work Type: Contract/Temp
Positions: 2
Salary Range: US$61 - 66 per hour
  • SQL
  • AWS
  • DevOps
  • ETL
  • Azure
  • Data Engineering
  • Jira
  • Airflow
  • NoSQL
  • Hackolade

Remote, US

Category: Technology
  • Innovative Technology; High Quality Products, Self-Empowerment
  • Globally Responsible; Sustainable Products, Diversity of Thought
  • Celebration of Sports; If You Have a Body, You are an Athlete

Title: Data Engineer

Location: Beaverton, OR

Duration: 2 Year contract

NIKE, Inc. does more than outfit the world's best athletes. It is a place to explore potential, obliterate boundaries and push out the edges of what can be. The company looks for people who can grow, think, dream and create. Its culture thrives by embracing diversity and rewarding imagination. The brand seeks achievers, leaders and visionaries. At Nike, it’s about each person bringing skills and passion to a challenging and constantly evolving game.

WHAT YOU WILL WORK ON

  • Establishes database management systems, standards, guidelines and quality assurance for database deliverables
  • Develops the conceptual design, logical database design, capacity plan, external data interface specification, data loading plan, data maintenance plan and security policy
  • Documents and communicates database design
  • Evaluates and installs database management systems
  • Codes complex programs and derives logical processes on technical platforms
  • Builds windows, screens and reports
  • Assists in the design of user interface and business application prototypes
  • Participates in quality assurance and develops test application code in client server environment
  • Provides expertise in devising, negotiating and defending the tables and fields provided in the database
  • Adapts business requirements, developed by modeling/development staff and systems engineers, and develops the data, database specifications, and table and element attributes for an application
  • Determines appropriateness of data for storage and optimum storage organization
  • Determines how tables relate to each other and how fields interact within the tables for a relational model
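The last two responsibilities above concern relational modeling: deciding how tables relate and how fields interact across them. A minimal sketch of that idea, using Python's built-in sqlite3 module and an illustrative product/order schema (the table and column names are hypothetical, not from this posting):

```python
import sqlite3

# Illustrative schema: how two tables relate through a key in a relational model.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# Parent table: one row per product.
conn.execute("""
    CREATE TABLE product (
        product_id INTEGER PRIMARY KEY,
        name       TEXT NOT NULL
    )
""")

# Child table: each order line references a product via a foreign key,
# which is how a field in one table interacts with another table.
conn.execute("""
    CREATE TABLE order_line (
        line_id    INTEGER PRIMARY KEY,
        product_id INTEGER NOT NULL REFERENCES product(product_id),
        quantity   INTEGER NOT NULL
    )
""")

conn.execute("INSERT INTO product VALUES (1, 'Running Shoe')")
conn.execute("INSERT INTO order_line VALUES (10, 1, 2)")

# Joining across the relationship recovers the combined view.
row = conn.execute("""
    SELECT p.name, o.quantity
    FROM order_line AS o
    JOIN product AS p ON p.product_id = o.product_id
""").fetchone()
print(row)  # ('Running Shoe', 2)
```

The foreign-key constraint is what makes "appropriateness of data for storage" enforceable: an order line for a nonexistent product is rejected by the database itself.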

WHAT YOU BRING

  • Bachelor’s degree or higher in Computer Science, or a combination of relevant education, experience, and training
  • 6+ years of experience in Data Engineering
  • 4+ years of experience working with Python for data processing, with proficiency in Python libraries such as pandas, NumPy, PySpark, pyodbc, pymssql, Requests, boto3, simple-salesforce, and json
  • 3+ years of experience in Data Warehouse technologies – Databricks and Snowflake
  • Strong Data Engineering Fundamentals (ETL, Modelling, Lineage, Governance, Partitioning & Optimization, Migration)
  • Strong Databricks-specific skills (Apache Spark, DB SQL, Delta Lake, Delta Share, Notebooks, Workflows, RBAC, Unity Catalog, Encryption & Compliance)
  • Strong SQL skills (query performance, Stored Procedures, Triggers, schema design) and knowledge of one or more RDBMS and NoSQL databases such as MSSQL/MySQL and DynamoDB/MongoDB/Redis
  • Cloud Platform Expertise: AWS and/or Azure
  • Experience in one or more ETL tools like Apache Airflow/AWS Glue/Azure Data Factory/Talend/Alteryx
  • Excellent knowledge of coding and architectural design patterns
  • Passion for troubleshooting, investigation and performing root-cause analysis
  • Excellent written and verbal communication skills
  • Ability to multitask in a high-energy environment
  • Experience with Agile methodologies and knowledge of Git, Jenkins, GitLab, Azure DevOps, and tools like Jira/Confluence

Nice to have:

  • Tools like Collibra and Hackolade
  • Migration Strategy and Tooling
  • Data Migration Tools: Experience with migration tools, frameworks, or custom-built solutions that automate moving data from Snowflake to Databricks
  • Testing and Validation: Ensuring data consistency and validation post-migration with testing strategies like checksums, row counts, and query performance benchmarks
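The validation bullet above names row counts and checksums as post-migration consistency checks. A minimal sketch of both, using illustrative in-memory row sets standing in for the Snowflake source and Databricks target (the helper names are hypothetical, not a named migration framework):

```python
import hashlib

def table_checksum(rows):
    """Order-independent checksum: hash each row, XOR the digests together.
    (Caveat: exact duplicate rows cancel in pairs under XOR.)"""
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
    return acc

def validate_migration(source_rows, target_rows):
    """Return (row_counts_match, checksums_match) for two row sets."""
    counts_ok = len(source_rows) == len(target_rows)
    checksums_ok = table_checksum(source_rows) == table_checksum(target_rows)
    return counts_ok, checksums_ok

# Same rows in a different order still validate, since the checksum is
# order-independent; a missing or altered row would fail.
source = [(1, "alpha"), (2, "beta"), (3, "gamma")]
target = [(3, "gamma"), (1, "alpha"), (2, "beta")]
print(validate_migration(source, target))  # (True, True)
```

Query performance benchmarks, the third strategy mentioned, would be timed against representative workloads on each platform rather than checked row by row.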
Published on 17 Oct 2024, 4:58 AM