Remote, US
Category: Data and Analytics
- Innovative Technology, High-Quality Products, Self-Empowerment
- Globally Responsible, Sustainable Products, Diversity of Thought
- Celebration of Sports; If You Have a Body, You are an Athlete
WHAT YOU WILL DO
- Developing data pipelines with Spark, SQL, and Python
- Creating and maintaining code, scripts, configurations, dependencies, and infrastructure needed for reliable data pipelines
- Converting SQL scripts to PySpark programs
- Building workflows and DAGs with orchestration tools like Airflow
- Managing testing and quality assurance processes
- Leading data migration initiatives
- Supporting production data pipelines and systems
- Continuously improving data infrastructure and pipelines
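As a small illustration of the workflow-and-DAG concept behind orchestrators like Airflow, here is a minimal sketch using only the Python standard library. The task names are hypothetical, and a real Airflow workflow would use its `DAG` and operator classes rather than a plain dict; this only shows how dependencies determine run order.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks: each task maps to the set of tasks it
# depends on (names are illustrative, not from the posting).
pipeline = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
    "data_quality_check": {"load_warehouse"},
}

def execution_order(dag: dict[str, set[str]]) -> list[str]:
    """Return a valid run order for the tasks (raises CycleError on cycles)."""
    return list(TopologicalSorter(dag).static_order())

if __name__ == "__main__":
    print(execution_order(pipeline))
```

Every task appears after all of its dependencies, which is exactly the guarantee an orchestrator's scheduler provides when it executes a DAG.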
WHAT YOU BRING
- Experience building scalable data processing systems with Spark and SQL
- Proficiency in Python and PySpark
- Experience with Airflow or similar workflow orchestration tools
- Familiarity with infrastructure-as-code tools like Terraform
- Experience with CI/CD pipelines and GitHub
- Experience using monitoring platforms to track pipeline health
- Experience leading testing and quality assurance processes
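To make the testing and pipeline-health items above concrete, here is a minimal standard-library sketch of a batch quality check. The column names, thresholds, and failure messages are hypothetical; a production pipeline would feed results like these into a monitoring platform rather than return them from a function.

```python
def check_batch(rows: list[dict], required: list[str], min_rows: int = 1) -> list[str]:
    """Return human-readable data quality failures (empty list = healthy batch)."""
    failures = []
    # Row-count check: catch empty or truncated loads.
    if len(rows) < min_rows:
        failures.append(f"row count {len(rows)} below minimum {min_rows}")
    # Null checks: required columns must be populated in every row.
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        if nulls:
            failures.append(f"column '{col}' has {nulls} null value(s)")
    return failures
```

For example, `check_batch([{"id": 1, "amount": None}], ["id", "amount"])` reports one null-value failure for `amount`, while a fully populated batch returns an empty list.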
- Published on 03 Apr 2025, 5:40 PM