Data Engineers - Remote Poland or Romania

Strategicsiq

Remote work

Poland (Remote)
Bucharest
B2B
Data Engineer
ETL
ELT
GCP
Data Vault
API Integration
BigQuery
Event-Driven Architecture
ML Engineering
SQL

Hexjobs Insights

Position as Senior Data Engineer/ML Engineer focused on scalable ELT/ETL pipelines and cloud architectures on GCP. Key skills include Data Vault modeling and API integrations. Remote work available.


Role Overview

We are seeking a highly skilled Senior Data Engineer / ML Engineer with strong expertise in building scalable ELT/ETL pipelines, cloud-native data architectures, and API-driven integrations. The ideal candidate will have hands-on experience with Google Cloud Platform (GCP) and Data Vault modeling, and the ability to translate complex business logic into robust engineering pipelines and data models.

Key Responsibilities

- Design, develop, and maintain scalable ELT/ETL data pipelines on GCP.
- Build and optimize data engineering pipelines for structured, semi-structured, and streaming data.
- Implement and manage ML engineering workflows for production deployment.
- Develop and maintain REST/SOAP APIs for system integrations.
- Architect and implement Data Vault models for enterprise data warehousing.
- Design event-driven architectures using Pub/Sub and streaming frameworks.
- Integrate third-party systems into enterprise data ecosystems.
- Collaborate with business stakeholders to translate business requirements into scalable data models and pipelines.
- Ensure adherence to security best practices (IAM, VPC Service Controls, encryption, service accounts).
- Optimize SQL queries and data performance in BigQuery.
- Support reporting and analytics solutions (Power BI preferred).

Required Skills & Expertise

- Strong experience building scalable ELT/ETL pipelines, ML engineering workflows, and API integrations (REST/SOAP).
- Hands-on expertise within the GCP ecosystem: Dataform, Cloud Run, Pub/Sub, BigQuery (advanced SQL), GCS, and Firestore.
- Solid understanding of Data Vault modeling and enterprise data architecture principles.
- Experience designing event-driven and streaming architectures (Kafka, Pub/Sub).
- Working knowledge of GKE, Spark, and Java-based data processing solutions.
- Familiarity with MCP integration and TMF protocols.
- Strong ability to translate business requirements into scalable data models and engineering pipelines.
- Experience implementing cloud security best practices, including IAM, VPC Service Controls, service accounts, and encryption.
- Exposure to Power BI for reporting and analytics is an added advantage.
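The Data Vault modeling requirement above typically centers on hash-keyed hubs, links, and satellites. A minimal sketch of hub hash-key generation follows; the function names, the `||` delimiter, and the normalization rules (trim + uppercase) are illustrative assumptions for this posting, not a prescribed standard:

```python
import hashlib
from datetime import datetime, timezone


def hub_hash_key(*business_keys: str, delimiter: str = "||") -> str:
    """Derive a deterministic Data Vault hash key from one or more
    business keys: trim, uppercase, join with a delimiter, then MD5."""
    normalized = delimiter.join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()


def hub_record(*business_keys: str, record_source: str) -> dict:
    """Assemble a hub row with the standard Data Vault metadata columns
    (hash key, business key, load timestamp, record source)."""
    return {
        "hash_key": hub_hash_key(*business_keys),
        "business_key": "||".join(business_keys),
        "load_dts": datetime.now(timezone.utc).isoformat(),
        "record_source": record_source,
    }


# The same business key always yields the same hash key, so repeated
# loads of the same entity stay idempotent at the hub level.
row = hub_record("CUST-001", record_source="crm_api")
```

Because the hash is computed from normalized business keys alone, the same function can be applied consistently in batch ELT jobs and in streaming consumers, keeping hub keys stable across pipelines.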

Views: 16
Published: 10 days ago
Expires: in 2 months
Contract type: B2B
