Educational requirements: Bachelor's degree
English requirements: Competent English
Skilled employment experience required: 1-3 years
Required residence status: Temporary visa, Permanent resident, Citizen
Remote work accepted: No
As a Pipeline Engineer, you will be involved in:
• Cataloging and governing the data, enabling access to trusted and compliant data at scale across the enterprise.
• Ingesting data from various sources such as on-premises databases or data warehouses, SaaS applications, IoT sources, and streaming applications into a cloud data lake.
• Integrating the data by cleansing, enriching, and transforming it across zones such as a landing zone, an enrichment zone, and an enterprise zone (illustrated in the sketch after this list).
• Applying data quality rules to cleanse and manage data while making it available across the organization to support DataOps.
• Preparing the data so that refined and cleansed data moves to a cloud data warehouse, enabling self-service analytics and data science use cases.
• Processing streams to derive insights from real-time data arriving from sources such as Kafka, then moving it to a cloud data warehouse for analytics consumption.
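For context only, the zoned flow described above can be pictured with a minimal Python sketch. The paths, column names, and cleansing rules below are hypothetical examples, and pandas with local folders stands in for a real cloud data lake and warehouse:

```python
from pathlib import Path
import pandas as pd

# Hypothetical local directories standing in for data-lake zones.
LANDING = Path("lake/landing")
ENRICHMENT = Path("lake/enrichment")
ENTERPRISE = Path("lake/enterprise")
for zone in (LANDING, ENRICHMENT, ENTERPRISE):
    zone.mkdir(parents=True, exist_ok=True)

# Landing zone: raw data exactly as ingested (fabricated sample rows here).
raw = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "email": ["a@example.com", "B@EXAMPLE.COM", "B@EXAMPLE.COM", "c@example.com"],
    "spend": ["10.5", "20", "20", "bad"],
})
raw.to_csv(LANDING / "customers_raw.csv", index=False)

# Enrichment zone: apply simple data-quality rules
# (normalise case, coerce types, drop invalid rows, de-duplicate).
df = pd.read_csv(LANDING / "customers_raw.csv")
df["email"] = df["email"].str.lower()
df["spend"] = pd.to_numeric(df["spend"], errors="coerce")
df = df.dropna(subset=["customer_id", "spend"]).drop_duplicates()
df.to_csv(ENRICHMENT / "customers_clean.csv", index=False)

# Enterprise zone: curated, analytics-ready output
# (in a real pipeline this would be loaded into a cloud data warehouse).
df.to_csv(ENTERPRISE / "customers_curated.csv", index=False)
print(df)
```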
We are looking for experience in the following skills:
Required:
• 3 years of experience in ETL development or data engineering
• Ability to understand and articulate requirements to technical and non-technical audiences
• Stakeholder management and communication skills, including prioritizing, problem solving and interpersonal relationship building
• Demonstrated experience deploying and processing data on a cloud ecosystem such as Amazon Web Services (AWS), Microsoft Azure, Google Cloud or Snowflake, for both batch and real-time processing
• Strong SQL skills
• Experience in market-leading ETL software (e.g. Datastage, Informatica, SSIS or Talend)
Desirable:
• Experience with document databases
• Experience with AWS PaaS and scripting in Python, R or Scala
• Experience in data modelling
• Excellent presentation and communication skills, both written and verbal
• Ability to problem-solve and architect solutions in an environment with unclear requirements
• Experience in AWS PaaS environment and its native services such as Glue, Lambda and S3
Job Qualifications
• Bachelor’s degree in IT, Computer Science, Software Engineering, Engineering or a related field, or relevant work experience.
• A client-centric, outcome-driven and quality-focused team player.
• Ability to work effectively both independently and in a team environment.
• Enjoy problem-solving across different domains and industries.
• Australian Citizenship would be required for some project roles to comply with security clearance requirements