Hadoop Developer

Location: Victoria
Work type: Full Time
Classification: Information & Communication Technology
Salary: Attractive salary package + Benefits
Job reference: JO-1704-508765
Start date: 2017-08-09
Contact email: ronald.tran@talentinternational.com
Advertiser: Ronald Tran

  • Exciting opportunity to work on something different every day
  • Dynamic and fast growing organisation
  • Diverse role with opportunity to make a significant impact
We are seeking a Big Data Developer (Hadoop) to work on large-scale Data & Analytics for major enterprises. Our client is building a team to support their next-generation data platform initiatives and is looking for a Big Data Developer (Hadoop) who can lead the charge and take their existing architecture, design, and cloud-based data engineering to the next level.
 
In this role, you will have the opportunity to shape the future of Advanced Data Analytics.

As the Big Data Developer, you will be responsible for owning the design and architecture strategy of the big data platform. This is a critical role focused on delivering scalable solutions that unify and link all of the company's existing data assets.
 
In this role you will provide technical leadership and mentor others on the team. You will be expected to have experience in designing and delivering scalable solutions using technologies in the Hadoop ecosystem (Spark, Kafka, Flume, Oozie, HBase).

JOB RESPONSIBILITIES
  • Partnering closely with business analysts and data scientists to identify data sources relevant to solving business problems, and helping design the optimal combination of data sources and analytical techniques for each problem.
  • Working with large data sets from multiple sources utilising big data tools and techniques to prepare data sources for efficient analysis and insight generation.
  • Understanding the quality of sourced data and its management, and liaising with data scientists and analysts to manage the impact of data quality issues.
  • Driving the collection of new data and the refinement of existing data sources.
  • Developing best practices for instrumentation and experimentation and communicating these to solution delivery teams.
JOB REQUIREMENTS
  • Bachelor's degree in Computer Science or related field.
  • Minimum 2 years' experience developing on the Hadoop platform.
  • Minimum of 1.5 years' experience working with the Apache Hadoop framework.
  • Extensive data modelling and data analysis experience.
  • Experience with SQL on Massively Parallel Processing (MPP) relational databases.
  • Solid experience with Apache Spark and Hive.
  • Experience using Sqoop, Kafka, Flume, Oozie, HBase, and Apache Phoenix.
  • Strong communication skills.
  • Ability to work independently or collaboratively.
  • Detail oriented with strong organization and prioritization skills.
  • Demonstrated ability to make decisions in technology selection and implementation approach based on long-term strategic objectives, while taking into consideration short-term implications for ongoing or planned implementations.
  • Demonstrated ability to apply technology in solving business problems and proven ability to communicate with both technical and non-technical audiences.
  • In-depth knowledge of software development technology, principles, methods, tools, and practices and industry standards and trends in Big Data space.

If you possess the above skills and are interested in a challenging and rewarding opportunity with an exciting and fast-growing organisation, please apply below or call Ronald Tran on 03 9236 7737.
