
ETL Developer (On-Site)

Location: Melbourne VIC, Australia
Work Type: Any Employment
Positions: 2
  • SQL
  • ETL
  • Jenkins
  • CI/CD
  • Unix
Job no: DEYAM

Are you a passionate ETL Developer with expertise in cloud-based data engineering, automation, and large-scale data processing?

Join a dynamic team where you'll play a key role in designing, developing, and optimizing enterprise-level data solutions.

  • Work on cutting-edge technologies in cloud, big data, and automation.
  • Collaborate with a high-performing team in a fast-paced environment.
  • Shape the future of data engineering with innovative solutions.

What You’ll Do

  • Design, develop, test, and maintain ETL/ELT data pipelines.
  • Develop high-quality, scalable code in Python, SQL, Unix Shell, and PySpark.
  • Work with AWS & Azure, leveraging services like S3, EC2, EMR, Lambda, Redshift, Databricks Delta Lake, and Terraform.
  • Implement CI/CD pipelines using Jenkins, Docker, Kubernetes, Ansible, and GitHub.
  • Optimize data models and database performance for Data Warehouses and Data Lakes.
  • Collaborate with cross-functional teams and mentor junior developers.
  • Troubleshoot, debug, and enhance existing systems for better efficiency.

What We’re Looking For

  • Must have 8+ years of technical experience in banking (broader financial services experience is a plus).
  • Experience and skills in Home Lending (desirable).
  • Strong background in ETL/ELT, data extraction, transformation, and integration.
  • Hands-on expertise in cloud-based data engineering (AWS & Azure).
  • Experience with big data technologies (Hadoop, Spark, Hive, Snowflake, Redshift).
  • Solid knowledge of SQL, PL/SQL, Postgres, MySQL, Oracle, or DB2.
  • Familiarity with Data Modelling (Star Schema, Data Vault 2.0).
  • Data migration experience, including SSIS and SSRS (desirable).
  • Strong problem-solving, analytical, and communication skills.
  • Experience working in Agile development environments.
  • Experience with Power BI and data visualization tools.
  • Exposure to streaming data processing and serverless architectures.
  • Hands-on experience writing Python, SQL, Unix Shell, and PySpark scripts in a complex enterprise environment.
  • Experience with Terraform, Kubernetes, and Docker.
  • Experience with source control tools (GitHub or Bitbucket).

Published on 24 Apr 2025, 5:07 AM