Lead Engineer

Contract Type:

Contract

Location:

Sydney, New South Wales, Australia

Industry:

Information & Communication Technology (ICT)

Salary:

$90.00 - $100.00 Hourly

Contact Email:

katrinag@whizdom.com.au

Date Published:

08-Dec-2025


Data Engineer – Data Platforms

 

About our client:
Our client is a leader in professional services and consulting, providing a wide range of services including strategy, consulting, technology and systems integration, and operations to clients across Government, Financial Services, Utilities and more.


About the role:
An opportunity exists for a Big Data Engineer to work with one of their Big 4 banking clients.

The Data Engineer is responsible for building and supporting the big data platform. Strong DDEP experience and previous experience working for a Big 4 bank are essential, as is strong Apache Spark experience.


Key Responsibilities
  • Design, develop, test and support future-ready data solutions for customers across industry verticals
  • Produce detailed solution designs and support data engineers and testers as the technical lead on data projects
  • Support and maintain data engineering solutions on the Azure cloud ecosystem
  • Develop, test and support end-to-end batch and near real-time data flows/pipelines
  • Apply business intelligence and database expertise, including database development, data modelling, ETL processes, and BI reporting
  • Develop and demonstrate proofs of concept and working demos
  • Support and collaborate with other internal/external consultants across consulting, workshop and delivery engagements

Mandatory Requirements:
  • Strong hands-on experience in Python and PySpark
  • Experience working with DataFrames
  • Experience developing data engineering solutions on the Azure cloud ecosystem
  • Hands-on coding experience in Kafka and Scala would be an added advantage
  • Must demonstrate leadership qualities within the project team and beyond
  • Must understand and navigate a large organisation's governance and processes with ease
  • Must have exceptional interpersonal skills and the ability to present solution designs with confidence in large technical forums
  • Experience designing and optimising Azure Data Factory pipelines, ensuring seamless data integration and automation for financial and operational reporting
  • Experience implementing Azure Synapse Analytics and Blob Storage to improve data accuracy and remove manual processes
  • Experience optimising SQL queries and data models
  • Minimum of three years' hands-on development, testing and administration experience with Big Data/Data Lake services in cloud and on-premises environments
  • Understanding of various data modelling concepts and techniques
  • Solid understanding of containerisation, virtualisation, infrastructure and networking
  • Diverse experience in software development methodologies and project delivery frameworks such as Agile Sprints, Kanban, Waterfall, etc.

What's on offer?
This contract is available for an initial 6-month term with likely extension on a long-term project.

Located in Sydney, this role offers a hybrid working arrangement.


How to Apply:
 
Please upload your resume to apply. We will be in touch with further instructions for suitably skilled candidates. Please note that you will be required to address selection criteria to finalise your application for this role.

Call Katrina Gabriel on 0489 923 756 or email katrinag@whizdom.com.au for any further information.

Candidates will need to be willing to undergo pre-employment screening checks which may include, ID and work rights, security clearance verification and any other client requested checks.