Data Engineer - Location 360

Digital Hub Warsaw at Bayer

Warszawa, Ochota
21000 zł/mth.
Hybrid
Python
Go
Kafka
AWS SQS
Google Cloud Pub/Sub
GitHub
AWS
GCP

Requirements

Expected technologies

Python

Go

Kafka

AWS SQS

Google Cloud Pub/Sub

GitHub

AWS

GCP

Our requirements

  • Bachelor's degree in Computer Science, Engineering, or relevant job experience
  • Strong proficiency in Python and/or Go programming languages
  • Strong experience with PostgreSQL/PostGIS and BigQuery, including schema design and query optimization.
  • Hands-on experience with event-driven and streaming data architectures using platforms and services such as Kafka, AWS SQS, and Google Cloud Pub/Sub
  • Experience with building RESTful APIs using common frameworks
  • Experience with utilizing Docker to build and deploy containerized applications
  • Experience with cloud platforms such as GCP and AWS, including native data and compute services such as BigQuery, Aurora, GCS/S3, GKE/EKS, GCE/EC2, Cloud Functions/Lambda
  • Experience with version control and collaboration platforms such as GitHub
  • Excellent problem-solving skills and the ability to work effectively in a fast-paced, collaborative environment.
  • Strong communication skills and the ability to articulate technical concepts to non-technical stakeholders.
Preferred:
  • Highly proficient in Golang or Python with a strong track record of building and maintaining production data pipelines and backend systems
  • Hands-on experience working with Kubernetes (K8s) for orchestrating and managing containerized data services and workflows.
  • Familiarity with CI/CD practices and tools such as GitHub Actions, Terraform, Google Cloud Build, ArgoCD
  • Experience with object-oriented design, coding and testing patterns, and implementing complex data projects in a large-scale data infrastructure.
  • Solid understanding of geospatial data concepts. Experience with data processing and analysis using geospatial libraries and tools.
  • Experience with monitoring and logging tools such as Grafana, Prometheus, ELK stack, or equivalent.
  • Familiarity with cloud-based machine learning services and platforms such as Google Cloud Vertex AI or AWS SageMaker. Experience with deploying and invoking model endpoints.
  • Solid understanding of networking concepts, security principles, and best practices for cloud environments.
  • Experience working with customers and developers to deliver full-stack development solutions; the ability to translate customer requirements into technical requirements in an Agile environment.

Your responsibilities

  • Design, build, deploy and support cloud-based and open-source solutions for geospatial data handling
  • Implement scalable data pipelines in Python and Go for ingestion, transformation, and delivery of structured and unstructured geospatial data.
  • Build highly scalable APIs for accessing geospatial data and initiating geospatial processing and analysis.
  • Develop event-driven data processing solutions using Kafka, AWS SQS, and Google Cloud Pub/Sub to orchestrate multi-stage spatial workflows.
  • Integrate and manage data flows across cloud platforms such as AWS and GCP, databases such as PostgreSQL/PostGIS and BigQuery, and cloud storage such as AWS S3 and Google Cloud Storage
  • Leverage Kubernetes (K8s) for deploying and managing containerized applications and workflows.
  • Work closely with SREs and platform engineers to integrate cloud infrastructure with CI/CD pipelines and deployment workflows.
  • Collaborate with data engineers and SREs to optimize and monitor data pipelines and services for performance, reliability, scalability, and cost-effectiveness.
  • Provide technical support, including incident response, troubleshooting and resolution for production issues in data pipelines and API services.
  • Ensure compliance with company and industry standards and best practices for data security and regulatory requirements.
  • Stay updated on emerging data engineering technologies and data infrastructures; evaluate their potential impact and application in our systems and processes.
  • Provide technical leadership and mentorship to junior data engineers. Foster a culture of knowledge sharing and continuous learning.
Published: 10 days ago
Expires: in 2 days
Work mode: Hybrid