Educational requirements: Bachelor's degree
English requirements: Competent English
Skilled employment experience required: 3-5 years
Required residence status: Temporary visa, Permanent resident, Citizen
Remote work: not accepted
Job Description
Hands-on experience in Batch (must) and Streaming (preferable) processing on the Cloudera data platform tool stack:
- Must have: Hadoop, Spark, Hive, Oozie, and Python/PySpark as the programming language
- Preferable: Scala, HBase, Java, shell scripting
- Min 3 years of hands-on project experience on a Big Data platform, preferably Cloudera/Hortonworks
- Min 2 years of cloud experience, preferably on Azure
- Min 2 end-to-end data lake project deliveries
- A clear understanding of data warehousing processes, methodologies, and supporting capabilities
- Able to propose solutions and clearly articulate the pros and cons of various technologies and platforms in the Big Data world
- Able to benchmark systems and analyse system bottlenecks
- Able to document use cases, solutions, and recommendations
- Excellent written and verbal communication skills
- Able to perform detailed analysis of business problems and technical environments and use this in designing the solution
- Able to work in teams, as a big data environment is developed by a team of employees with different disciplines
- Able to work in a fast-paced agile development environment