Big Data Hadoop Architect - Brisbane

Location:

Victoria

Work type:

Full Time

Classification:

Information & Communication Technology

Salary:

Attractive salary and benefits

Job reference:

JO-1708-511752

Start date:

2017-10-10

Contact email:

peter.li@talentinternational.com

Advertiser:

Peter Li

We are seeking a Big Data Architect (Hadoop) for large-scale Data & Analytics projects at a large Tier 1 enterprise. Our client is building a team to support their next-generation data platform initiatives. We are looking for a Big Data Architect (Hadoop) who can lead the charge and take their existing architecture, design, and cloud-based data engineering to the next level.
 
In this role, you will have the opportunity to shape the future of advanced data analytics. We are seeking highly skilled technologists who can map technology capabilities to business objectives.
 
As the Big Data Architect, you will be responsible for owning the design and architecture strategy for the big data platform. This is a critical role focused on delivering scalable solutions that unify and link all of the company's existing data assets.
 
This role will provide technical leadership and mentor others on the team. You will be expected to have experience designing and delivering scalable solutions using technologies from the Hadoop ecosystem (Spark, Kafka, Flume, Oozie, HBase).
  • Exciting opportunities to tackle varied work every day
  • Dynamic and fast-growing organisation
  • Diverse role with opportunity to make a significant impact
JOB RESPONSIBILITIES
  • Architect and lead the development of a highly scalable next-generation data platform.
  • Lead, manage, and mentor the WB Data Engineering team in Big Data technologies.
  • Work with other teams to ensure delivery targets are met.
  • Design and implement complex, scalable statistical models, including but not limited to recommendation and classification models.
  • Design and implement solutions that meet security compliance guidelines.
JOB REQUIREMENTS
  • Bachelor's degree in Computer Science or a related field.
  • Minimum 2 years' experience architecting and delivering solutions on the Hadoop platform.
  • Minimum 1.5 years' experience working with the Apache Hadoop framework.
  • Solid experience with Apache Spark and Hive.
  • Experience using Sqoop, Kafka, Flume, Oozie, HBase, and Apache Phoenix.
  • Strong communication skills.
  • Ability to work independently or collaboratively.
  • Detail oriented with strong organization and prioritization skills.
  • Demonstrated ability to make decisions in technology selection and implementation approach based on long-term strategic objectives, while taking into consideration short-term implications for ongoing or planned implementations.
  • Demonstrated ability to apply technology in solving business problems and proven ability to communicate with both technical and non-technical audiences.
  • In-depth knowledge of software development technology, principles, methods, tools, and practices and industry standards and trends in Big Data space.
If you possess the above skills and are interested in a challenging and rewarding opportunity with an exciting and fast-growing organisation, please apply below or call Peter Li on 03 9602 4222.
