Educational requirements: Bachelor's degree
English requirements: Competent English
Skilled employment experience required: 3-5 years
Required residence status: Temporary visa, Permanent resident, Citizen
Remote work: not accepted
We are specifically searching for Data Architects (Enterprise Data Architects, Data Governance Architects and Data Platform Architects) and have several assignments that can further your career and that you’d be proud to put your name to.
New tools you get to play with: Delta Lake (Databricks), Apache Iceberg & Apache Hudi
Successful Architects and Engineers in our team are passionate about unlocking and exploring data and helping our customers understand the possibilities it holds. Technically, we are searching for Big Data Architects who have experience working with Hadoop and AWS/GCP/Azure, and who know how to design cloud architectures for Big Data. Please apply if you have experience with most of the following (a short illustrative sketch follows the list):
- Designing and implementing data architecture, including event-driven capabilities
- Exposure to CI/CD tools to test and deploy data pipelines
- Familiarity with AWS/Azure/GCP core services
- Proficiency in Hadoop and related technologies, including HDFS, Spark, Impala and Hive
- Coding in Java, Scala and/or Python
- Experience with 2nd-gen open source Big Data tools (preferred)
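To give a flavour of the day-to-day work, here is a minimal, illustrative PySpark sketch of an event-ingestion step that lands curated data as a Delta Lake table. It is a sketch under assumptions, not part of the requirements: the storage paths, column names and app name are hypothetical, and it assumes a Spark environment with the Delta Lake library available (for example Databricks).

```python
# Illustrative sketch only: paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("event-ingest-sketch")
    # These two settings enable Delta Lake on a plain open source Spark build;
    # on Databricks they are already configured.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read raw JSON events from object storage (hypothetical bucket).
raw_events = spark.read.json("s3://example-bucket/raw/events/")

# Aggregate events per day and type.
daily_counts = (
    raw_events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date", "event_type")
    .count()
)

# Write the aggregate as a Delta table so downstream consumers
# get ACID guarantees and time travel.
(daily_counts.write
    .format("delta")
    .mode("overwrite")
    .save("s3://example-bucket/curated/daily_event_counts/"))
```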