
Big Data Engineer

Location:

Melbourne CBD

Work type:

Full Time

Classification:

Information & Communication Technology

Salary:

$100,000 - $180,000 per annum

Job reference:

JO-1810-524796

Start date:

2018-10-31

Contact email:

melissa.haddad@talentinternational.com

Advertiser:

Melissa Haddad

About the role

The Big Data Engineer will expand and optimise our client's data and data pipeline architecture, as well as optimise data flow and collection for cross-functional teams. Your responsibilities include:

* Build robust, efficient and reliable data pipelines that ingest and process data from diverse sources into an AWS-based data lake platform


* Design and develop real-time streaming and batch processing pipeline solutions


* Assemble large, complex data sets that meet functional / non-functional business requirements.


* Design, develop and implement data pipelines for data migration & collection, data analytics and other data movement solutions.


* Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.


* Build DevOps pipelines


* Work with data and analytics experts to strive for greater functionality in our data systems

About the candidate

The successful candidate will have the ability to optimise data systems and build them from the ground up. They will support software developers, database architects, data analysts and data scientists on data initiatives, and will ensure an optimal data delivery architecture is applied consistently across ongoing projects.


Essential skills and experience:


* 2+ years' proven working experience as a Big Data Engineer, preferably building data lake solutions by ingesting and processing data from various source systems


* Experience with multiple Big Data technologies and concepts such as HDFS, Hive, MapReduce, Spark, Spark Streaming, and NoSQL databases such as HBase


* Experience with specific AWS technologies (such as S3, Redshift, EMR, and Kinesis)


* Hands-on experience with Big Data ETL tools such as Informatica Big Data Management or Talend Big Data Management is a strong plus


* Experience in one or more of Java, Scala, Python and Bash


* Ability to work in a team within a diverse, multi-stakeholder environment


* Experience in working in a fast-paced Agile environment


* BS in Computer Science, Statistics, Informatics, Information Systems or another quantitative field


Preferred skills and experience:


* Knowledge of and/or experience with Big Data integration and streaming technologies (e.g. Kafka, Flume, etc.)


* Experience in building data ingestion frameworks for an enterprise data lake is highly desirable


* Experience with CI/CD pipelines using Jenkins


* Knowledge of building self-contained applications using Docker, Kubernetes or similar technologies




What we can offer


Our client is a world leader in technology-enabled change. We can offer:


* Formal training with industry-recognised certifications


* The ability to interact with peers in a sharing and inclusive community


* Exciting and challenging projects


* A well-structured and tailored career framework


* A culture of collaboration and recognition


* Excellent remuneration

If this sounds like you, apply now or call Melissa Haddad on (03) 9236 7732.
