Educational requirements: Bachelor
English requirements: Competent English
Skilled employment experience required: 1-3 years
Required residence status: Temporary visa, Permanent resident, Citizen
Remote work: not accepted
Position Summary:
We are currently looking for talented Data Engineers to help our clients deliver automation solutions.
You will be working on exciting and interesting projects, being involved in implementing strategies and delivering on cloud solutions in Google Cloud, Azure and AWS.
Our consultants are natural problem solvers - they are curious, ambitious and experts in their fields. You will have the opportunity to collaborate with and learn from them, working with our clients to understand their needs and to help shape the engagements that allow Cognizant to provide services that meet those needs.
If you have a passion for learning new technologies, want to be tech agnostic, and want to work with like-minded technologists, Cognizant is the place to grow your career and have fun in the process!
You will be involved in:
- Development of end-to-end data pipelines; we are particularly interested in Google Cloud, Azure, AWS or Snowflake experience.
- Advising on data architecture, data models, data migration, integration and pipelines, and data analysis and visualization.
- Implementing solutions for the establishment of data management capabilities, including data models and structures, database and data storage infrastructure, master and metadata management, data quality, data integration, data warehousing, data transformation, data analysis and data governance.
- Development and execution of data migrations.
- Supporting pre-sales activity to promote Cognizant, our capabilities and value to current and prospective clients.
You will be:
- A strong analytical thinker and problem solver with thought leadership and commercial awareness.
- Experienced in building end-to-end data pipelines using on-premise or cloud-based data platforms.
- Experienced with hands-on delivery of solutions that include databases, advanced SQL and software development in languages such as Python, Scala, Java, T-SQL and PL/SQL.
- Knowledgeable in relational and big data architectures, data warehousing, data integration, data modelling, data optimisation and data analysis techniques.
- Interested and knowledgeable in Big Data and Apache ecosystem technologies such as Beam, Spark, Kafka, Hive, Airflow and NiFi, as well as databases, integration, master data management, quality assurance, data wrangling and data governance technologies.
- Knowledgeable in cloud data warehouse services; experience in Google BigQuery, Snowflake, AWS Redshift, Azure SQL DWH or Azure Databricks is highly desirable.
- Experienced with public cloud platforms and cloud infrastructure (essential).
- Exposed to ETL/ELT and governance tools (including Talend, Informatica, Matillion, Fivetran, IBM DataStage, Collibra).
- Interested in AI and ML technologies and principles.
- Able to migrate and transform large, complex datasets from diverse sources, structures and formats, modelled to support analysis and provide access to quality, actionable insights.