Data Engineer
Job Title: Data Engineer
Contract Type: Contract
Location: Canberra
Salary: Competitive
Start Date: 24 April 2020
Reference: V-37404
Contact Name: Joanne Finchett
Contact Email: joanne@whizdom.com.au
Job Published: 26 April 2020, 08:30
Job Description
The Role:
The Data Engineer will demonstrate an understanding of big data technologies, with a focus on streaming data ingestion pipelines that normalise, enrich and process complex data types.
They will have broad experience operating and supporting Hadoop-related technologies, specialising in data normalisation, serialisation and storage, and will be comfortable working as a hands-on data engineer within an agile scrum context.
The successful applicant must be a motivated self-starter with strong technical abilities and initiative. They will operate as a peer in a high-performing technical team, which calls for strong organisational skills, attention to detail, and comfort working in a dynamic and agile environment.
The Data Engineer’s major responsibilities include:
- Contribute to the optimisation and maintenance of data ingestion pipelines including both stream and batch technologies.
- Develop and manage data storage table and message schemas for various event formats.
- Support Hadoop technologies to ensure data is stored and accessed efficiently.
- Ensure the effective implementation of role-based access controls across data lake components through technical enforcement and audit.
- Identify, report and remediate platform issues through proactive monitoring of health and performance metrics.
- Identify process improvements and contribute to data ingestion standards and definition of best practice.
- Communicate and report data ingestion and analysis related risks and issues to the Project Manager.
- Work under broad direction as a motivated self-starter.
Skills and Experience Required:
Essential Criteria
- Demonstrated technical experience working with Hadoop environments or other similar technologies - 15%
- Experience ensuring high-quality data characteristics (complete, valid, timely, unique, accurate, consistent) - 15%
- Working experience with common cyber security datasets including the ability to identify, extract and enrich features - 15%
- Able to rapidly acquire new technical skills in an ever-changing environment - 10%
- Knowledge of data lake technologies to ensure the efficient storage and analysis of various datasets - 15%
- A high level of proficiency with the Linux Operating System - 10%
- Experience with data ingestion strategies for data lake components - 10%
- Excellent analytical thinking and troubleshooting skills - 10%
Desirable Criteria
- Experience in streaming platform technologies such as Kafka, Flink and Spark - 20%
- Experience developing software in languages such as Java and Python - 10%
- Familiarity with common events generated and ingested to support cyber security use cases - 20%
- Experience in a range of ICT technologies; client and server software, infrastructure components, and deployment technologies (CI/CD tools) - 10%
- Extensive experience using Jira and Confluence for documentation and tasking - 10%
- Experience leading or managing highly technical teams to achieve business outcomes under pressure - 10%
- Able to manage multiple, concurrent tasks with competing demands - 10%
- High attention to detail with solid documentation skills and good client communication skills - 10%
Canberra / offsite based. 12-month contract with 1 x 12-month extension option.
Security Requirements:
Must be an Australian Citizen. An NV1 security clearance is required.
How to Apply:
Applications close 7 May 2020.
Please upload your resume to apply. Note that you may need to address selection criteria as part of the application process. We will be in touch with instructions for suitably skilled candidates.
Call Jo Finchett on 1300 944 936 for any further information.