HRO Digital
What will you be doing?
- Work with a collaborative team of varied disciplines, skills, and experience
- Work on a new project to enable the existing platform on another cloud provider
- Analyse the existing GCP and BigQuery based solutions
- Design and implement Azure Databricks based solutions
- Work on integration mechanisms for copying large volumes of data between cloud providers
- Build and execute complex ETL workflows on the Azure based platform, with extensive use of Azure Databricks
- Work on an automation tool for migrating Big Data artefacts between cloud providers
- Generate synthetic data for scaled performance testing
- Use Python and a variety of OSS tools to implement a smooth end-to-end migration process and utilities
- Work closely with developers, product owners, and other stakeholders to ensure quality standards
- Identify performance bottlenecks and optimise system performance
Who are we looking for?
- Experience working with at least one cloud provider, preferably MS Azure
- Experience building complex ETL pipelines
- Deep understanding of Big Data technologies, ideally Spark SQL and/or BigQuery SQL
- Experience working with the Databricks data warehouse, ideally Azure Databricks
- Experience in Python coding
- Good understanding of CI/CD processes and automation tools
- A strong understanding of the technical architecture of complex ETL solutions
- Seasoned problem-solving skills and flexibility
- Ability to work under time pressure
- Experience working in Agile teams
- Strong self-organisation and inter-team communication
Published | 4 months ago |
Expires | in 2 days |
Work mode | Full-time |