Mid-Level Data Engineer – Cloud Data Pipelines

ITDS Polska Sp. z o.o.

21000 - 24150 PLN / MONTH
Kraków, Lesser Poland
Hybrid
B2B
Python
Java
Google Dataflow
ETL
Google Cloud
SQL
NoSQL
JSON
Parquet
data governance

Hexjobs Insights

Position: Data Engineer. Responsibilities: building and optimizing data pipelines. Requirements: 4+ years of experience, knowledge of Python, Java, and Google Dataflow. Benefits: medical package, flexible hours, career growth in the financial industry.

Your responsibilities

  • Develop and implement efficient data pipelines for collecting, transforming, and storing data across various platforms, ensuring reliable data flow.
  • Integrate data from a range of sources including cloud platforms, databases, APIs, and external services.
  • Troubleshoot and optimize existing pipelines for performance and scalability.
  • Implement ETL processes to convert raw data into valuable insights for analytics and reporting.
  • Collaborate with cross-functional teams to understand data needs and support application requirements, including weekend or non-office hours support.
  • Build scalable, automated workflows capable of handling large data volumes with high reliability and low latency.
  • Set up monitoring and alert systems to minimize downtime and maximize pipeline performance.
  • Document data flows, architecture, and processing logic to ensure maintainability and transparency.
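To illustrate the kind of ETL work the responsibilities above describe, here is a minimal sketch in Python: extract raw JSON records, transform them (type and casing normalization), and load them into a SQL table. This is purely illustrative and not code from ITDS or the project; the record fields, the `payments` table, and the SQLite target are assumptions.

```python
import json
import sqlite3

# Raw "extracted" records, e.g. pulled from an API or message queue (illustrative data).
RAW = [
    '{"id": 1, "amount": "12.50", "currency": "pln"}',
    '{"id": 2, "amount": "99.99", "currency": "eur"}',
]

def transform(line: str) -> tuple:
    """Parse one JSON record and normalize types and casing (the T in ETL)."""
    rec = json.loads(line)
    return rec["id"], float(rec["amount"]), rec["currency"].upper()

def load(rows, conn: sqlite3.Connection) -> None:
    """Insert transformed rows into a SQL table (the L in ETL)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments "
        "(id INTEGER PRIMARY KEY, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load((transform(line) for line in RAW), conn)
print(conn.execute("SELECT currency, amount FROM payments ORDER BY id").fetchall())
# → [('PLN', 12.5), ('EUR', 99.99)]
```

In a production pipeline the same extract/transform/load shape would typically run on a managed runner such as Google Dataflow, with BigQuery or another warehouse as the sink instead of SQLite.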

Our requirements

  • 4+ years of experience as a Dataflow Engineer, Data Engineer, or similar, working with large datasets and distributed systems.
  • Proficiency in programming languages such as Python and Java.
  • Hands-on experience with data pipeline orchestration tools, especially Google Dataflow.
  • Experience working with cloud data platforms like Google Cloud (BigQuery, Dataflow).
  • Strong expertise in ETL frameworks, real-time data streaming, and processing.
  • Familiarity with data formats like JSON and Parquet.
  • Knowledge of SQL and NoSQL databases, along with best practices in data governance, quality, and security.
  • Excellent troubleshooting skills for complex data issues.
  • Strong communication skills to effectively collaborate with both technical and non-technical stakeholders.

Optional

  • Certifications or additional experience with Google Cloud services or data engineering tools.

What we offer

  • Stable, long-term cooperation on very good terms
  • Opportunity to enhance your skills and build expertise in the financial industry
  • Work on some of the most strategic projects on the market
  • Define your career roadmap and grow quickly by delivering strategic projects for different ITDS clients over several years
  • Participation in social events and training in an international environment
  • Access to an attractive medical package
  • Access to the Multisport program
  • Access to Pluralsight
  • Flexible working hours

#GETREADY to meet with us!


Views: 6
Published: 29 days ago
Expires: in 1 day
Contract type: B2B
Work mode: Hybrid