Educational requirements: Bachelor's degree
English requirements: Competent English
Skilled employment experience required: 3-5 years
Required residence status: Temporary visa, Permanent resident, Citizen
Remote work: not accepted
Minimum of 3-5 years of hands-on Big Data experience working with Hadoop MapReduce (Java), Hadoop Streaming, Hive, Pig, and Oozie in any Hadoop distribution
Ability to provide direction specific to enterprise needs
Experience standing up and using Big Data environments to implement data analytics
Define technical requirements and design the technical and data architectures for the big data ecosystem on AWS
Ingest and transform data from multiple sources
Work with vendor platform providers and engineering peers to keep abreast of trends, products, frameworks, and applications
Experience in the engineering, development, and operation of big data ecosystems and tools
Strong understanding of operating platform stacks, including Red Hat Linux, Windows, and OpenStack
Experience in developing strategies, roadmaps, and designs for large-scale organisations, enabling rapid enterprise growth through scalable solutions that do not require extensive manual intervention
In-depth knowledge of scripting tools and configuration management software (Python, PowerShell, Perl, SaltStack, etc.) to enable extensive automation of our products and technologies for provisioning and managing systems
Effectively identify and manage stakeholder engagement and impacts across the enterprise
Engage executive stakeholders appropriately to review progress and obtain input, validation, and approval of key decisions
Proven ability to collaborate across a large organisation to effectively realise outcomes