Educational requirements: Bachelor's degree
English requirements: Proficient English
Skilled employment experience required: 5-8 years
Required residence status: Permanent resident or citizen
Remote work: Accepted at all times
Relevant skills:
• Bachelor’s degree and/or relevant work experience.
• Active certification in any GCP track.
• Several years of experience designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Hive, Databricks, Cloud Dataproc, Cloud Dataflow, Apache Beam, Cloud Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub.
• Ability to transform large, complex relational datasets using languages such as SQL.
• Build ETL pipelines using GCP services such as Dataflow, Cloud Storage, and BigQuery, together with Python (see the pipeline sketch after this list); build and deploy datasets using Terraform.
• Working knowledge of Apache Airflow (a minimal DAG sketch follows this list).
• Several years of experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP.
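As a rough sketch of the Dataflow/Python ETL skills referenced above, the following minimal Apache Beam pipeline reads CSV files from Cloud Storage, applies a simple transformation, and writes the result to BigQuery. The project, bucket, dataset, table, and schema names are illustrative placeholders, not details from this posting.

```python
# Minimal Apache Beam ETL sketch: Cloud Storage -> transform -> BigQuery.
# All resource names below (project, bucket, dataset, table) are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line: str) -> dict:
    """Split one CSV line into a dict matching the BigQuery schema below."""
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}


def run():
    options = PipelineOptions(
        runner="DataflowRunner",          # use "DirectRunner" for local testing
        project="example-project",        # placeholder project ID
        region="us-central1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
            | "Parse" >> beam.Map(parse_row)
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:example_dataset.transactions",
                schema="user_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```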
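For the Apache Airflow requirement, a minimal illustrative DAG might look like the sketch below. It assumes Airflow 2.4 or later, and the DAG ID, schedule, and extract/load callables are placeholders rather than anything specified in this posting.

```python
# Minimal Apache Airflow sketch: a daily DAG with two dependent tasks.
# DAG ID, dates, and the extract/load callables are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("extracting source data")       # stand-in for a real extract step


def load():
    print("loading into the warehouse")   # stand-in for a real load step


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ parameter
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # extract must finish before load runs
```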