Educational requirements: Bachelor's degree
English requirements: Competent English
Skilled employment experience required: 3-5 years
Required residence status: Temporary visa, Permanent resident, Citizen
Remote work: not accepted
Our business is focused on developing our people and offerings across the following areas:
- Cloud & Technology: Cloud Services (AWS, Azure, Google Cloud Platform), DevOps and Automation (including HashiCorp)
- Data & Analytics: BI/DWH, Big Data Analytics, Data Engineering, Data Science and Advanced Analytics (Snowflake)
- Digital: Full-Stack Development, UI/UX Design, Mobile Development
- Customer Engagement: Marketing Campaigns, Campaign Analysis, Customer Insights
- Artificial Intelligence: Natural Language Processing, Computer Vision, Machine Learning, Deep Learning
- Cyber Security: DevSecOps, Penetration Testing, Multi-Cloud Security, Security Posture Analysis
The Role
- Developing end-to-end data pipelines; Google Cloud, Azure, AWS or Snowflake experience is of particular interest.
- Advising on data architecture, data models, data migration, integration and pipelines, and data analysis and visualisation.
- Implementing solutions that establish data management capabilities, including data models and structures, database and data storage infrastructure, master and metadata management, data quality, data integration, data warehousing, data transformation, data analysis and data governance.
- Developing and executing data migrations.
- Supporting pre-sales activity to promote Servian, our capabilities and our value to current and prospective clients.
We are looking for Consultants with SOME OR ALL of the following experience:
- A strong analytical thinker and problem solver with thought leadership and commercial awareness.
- Experienced in building end-to-end data pipelines on on-premise or cloud-based data platforms.
- Experienced in hands-on delivery of solutions involving databases, advanced SQL and software development in languages such as Python, Scala, Java, T-SQL or PL/SQL.
- Knowledgeable in relational and big data architectures, data warehousing, data integration, data modelling, data optimisation and data analysis techniques.
- Interested and knowledgeable in Big Data and Apache ecosystem technologies such as Beam, Spark, Kafka, Hive, Airflow and NiFi, along with databases, integration, master data management, quality assurance, data wrangling and data governance technologies.
- Knowledgeable in cloud data warehouse services; experience in Google BigQuery, Snowflake, AWS Redshift, Azure SQL DWH or Azure Databricks is highly desirable.
- Experienced with public cloud platforms and cloud infrastructure (essential).
- Exposed to ETL/ELT and governance tools (including Talend, Informatica, Matillion, Fivetran, IBM DataStage, Collibra).
- Interested in AI and ML technologies and principles.
- Able to migrate and transform large, complex datasets from diverse sources, structures and formats, modelled to support analysis and deliver quality, actionable insights.
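To give candidates a feel for the end-to-end pipeline work described above (extract, transform with data-quality checks, load), here is a minimal sketch in Python. It uses the standard-library sqlite3 module purely as a stand-in for a cloud warehouse such as BigQuery or Snowflake; the table, field names and sample records are all hypothetical.

```python
import sqlite3

# Hypothetical raw records, standing in for an extracted source feed.
RAW_ORDERS = [
    {"order_id": 1, "amount": "120.50", "region": "apac"},
    {"order_id": 2, "amount": "80.00", "region": "emea"},
    {"order_id": 3, "amount": "not-a-number", "region": "apac"},  # bad row
]

def extract():
    """Extract step: yield raw records from the source."""
    yield from RAW_ORDERS

def transform(rows):
    """Transform step: type-cast, normalise, and drop rows failing quality checks."""
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # data-quality rule: skip unparseable amounts
        yield (row["order_id"], amount, row["region"].upper())

def load(conn, rows):
    """Load step: write cleaned rows into the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    """Wire the three stages together end to end."""
    load(conn, transform(extract()))

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    run_pipeline(conn)
    total, = conn.execute("SELECT SUM(amount) FROM orders").fetchone()
    print(total)  # 200.5 -- the unparseable row was filtered out
```

In production the same stage boundaries would typically be orchestrated by a tool such as Airflow, with each stage targeting managed cloud storage and warehouse services rather than an in-memory database.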