Big Data/Hadoop Developer (Adelaide based)



Work type: Information & Communication Technology


Good hourly/daily rate!


Contact: Kapil Arora

  • Australian Citizens only - Federal Government agency
  • Adelaide based, long-term role
  • Must have strong Hadoop skills
Our client, a large Australian Federal Government department, is currently looking for an experienced Big Data Architect/Hadoop Developer to help lead the design and development of data flows and several related components. This is a long-term contract opportunity on a large project based in Adelaide.

Essential requirements of the role are:
  • Experience in leading design and development of data flows using Hadoop ecosystem components
  • Ability to rapidly design, prototype and implement architectural patterns on Big Data Platforms
  • Ability to work in multi-disciplinary teams alongside subject matter experts to deliver data-driven solutions
  • Significant experience and background in developing in Java/Scala, Python, Kafka, Spark, HBase, Impala/Hive
  • Experience with source code management tools (e.g. GitLab), test automation, and continuous integration technologies and methodologies
  • Cloudera platform administration, engineering and management experience (optional)
  • 2+ years developing solutions in Big Data platforms (Cloudera stack highly desirable)

If this role sounds of interest, please apply by clicking the "APPLY FOR THIS ROLE" button.
