Big Data Engineer

Big Data Engineer (Remote work)

Allegro

Warsaw
14 200 - 20 200 PLN / month
PERMANENT, B2B
📊 Big Data
GCP
Spark
Kafka
🐍 Python
Scala
Java
📊 data ingestion
Linux
clean code
TDD

Summary

Big Data Engineer wanted at Allegro in Warsaw. Flexible working hours, salary of PLN 14 200 - 20 200 or PLN 18 400 - 25 450 depending on experience, annual bonus.

Keywords

Big Data, GCP, Spark, Kafka, Python, Scala, Java, data ingestion, Linux, clean code, TDD

Benefits

  • Flexible working hours
  • Annual bonus
  • Opportunity to learn backend and AI technologies
  • Cafeteria benefits plan
  • Co-funding for language courses
  • Modern office

Job description

Important things for you

  • Flexible working hours in an office-first model (4/1) that depend on you and your team. Starting later or finishing earlier? No problem! Work hours keep pace with our lifestyles and can start between 7 a.m. and 10 a.m.
  • The salary range for this position, depending on the skill set, is as follows (contract of employment, tax-deductible cost):
    Data Engineer: PLN 14 200 - 20 200
    Senior Data Engineer: PLN 18 400 - 25 450
  • Annual bonus (depending on your annual assessment and the company's results)
  • Our team is based in Warsaw

About the team

As part of the Data & AI area, we implement projects based on practical data science and artificial intelligence applications on a scale unprecedented in Poland. Data & AI is a group of over 150 experienced engineers organized into over a dozen teams with various specializations. Some of them build dedicated tools for creating and launching Big Data processes or deploying ML models for the entire organization. Others work closer to the customer and are responsible for the search engine, recommendations, buyer profiles, and an experimentation platform. The area also includes research teams whose aim is to solve non-trivial problems that require machine learning.

We are looking for Big Data engineers who want to build highly scalable and fault-tolerant data ingestion for millions of Allegro customers. The platform collects 5 billion clickstream events every day (up to 150k/sec) from all Allegro sites and Allegro mobile applications. It is a hybrid solution using a mix of on-premise and Google Cloud Platform (GCP) services such as Spark, Kafka, Beam, BigQuery, Pub/Sub, and Dataflow; a minimal sketch of such a pipeline appears below.
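To give a flavor of this kind of pipeline, the sketch below shows a minimal Spark Structured Streaming job that reads clickstream events from Kafka. It is an illustration only, not Allegro's actual code; the broker address, topic name, and storage paths are hypothetical placeholders.

    // Minimal sketch of a streaming ingestion job, assuming Spark with the
    // spark-sql-kafka-0-10 connector on the classpath. All names below
    // (brokers, topic, paths) are hypothetical placeholders.
    import org.apache.spark.sql.SparkSession

    object ClickstreamIngestion {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("clickstream-ingestion")
          .getOrCreate()

        // Each Kafka record's value carries one raw clickstream event.
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "kafka:9092") // hypothetical brokers
          .option("subscribe", "clickstream-events")       // hypothetical topic
          .load()
          .selectExpr("CAST(value AS STRING) AS payload", "timestamp")

        // Checkpointing makes the stream restartable after failures, which is
        // the basis of fault tolerance in Structured Streaming.
        events.writeStream
          .format("parquet")
          .option("path", "/data/clickstream")             // hypothetical sink
          .option("checkpointLocation", "/checkpoints/clickstream")
          .start()
          .awaitTermination()
      }
    }

On GCP, a comparable job might instead read from Pub/Sub and write to BigQuery via a Beam pipeline running on Dataflow; the streaming source-to-sink pattern is analogous.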
We are looking for people who

  • Program in languages such as Scala, Java, or Python
  • Have a strong understanding of distributed systems, data storage, and processing frameworks such as dbt, Spark, or Apache Beam
  • Have knowledge of GCP (especially Dataflow and Composer) or other public cloud environments such as Azure or AWS
  • Use good practices (clean code, code review, TDD, CI/CD); a small TDD-style example appears at the end of this description
  • Navigate efficiently within Unix/Linux systems
  • Possess a positive attitude and team-working skills
  • Are eager for personal development and keep their knowledge up to date
  • Know English at B2 level

What we offer

  • The possibility to learn and work with backend (Spring, Kotlin) and AI technologies within the team
  • Well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
  • A wide selection of varied benefits in a cafeteria plan – you choose what you like (e.g. medical, sports, or lunch packages, insurance, purchase vouchers)
  • English classes that we pay for, related to the specific nature of your job
  • MacBook Pro / Air (depending on the role) or Dell with Windows (if you don't like Macs) and other gadgets that you may need
  • Working in a team you can always count on – we have on board top-class specialists and experts in their areas of expertise
  • A high degree of autonomy in organizing your team's work; we encourage you to develop continuously and try out new things
  • Hackathons, team tourism, a training budget, and an internal educational platform (including training courses on work organization, means of communication, motivation to work, and various technologies and subject-matter issues)

If you want to learn more, check it out.

Why it is worth working with us

  • At Allegro, you will be responsible for processing petabytes of data and billions of events daily
  • You will take part in one of the largest projects building a data platform on GCP
  • Your development will align with the latest technological trends based on open-source principles (data mesh, data streaming)
  • You will have a real impact on the direction of product development and technology choices; we use the latest and best available technologies, selected according to our own needs
  • You will have the opportunity to work in a team of experienced engineers and big data specialists who are eager to share their knowledge, including publicly through allegro.tech
  • Once a year, or more often if there is an internal business need, you can take the opportunity to work in a different team (known as team tourism)

Send in your CV and see why it is #dobrzetubyć (#goodtobehere)
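On the good-practices point above, the snippet below sketches the TDD style mentioned: a small, isolated unit test written against a tiny parsing function. EventParser is a hypothetical example for illustration (ScalaTest syntax assumed), not part of Allegro's codebase.

    // Hypothetical example of a TDD-style unit test, assuming ScalaTest 3.x.
    import org.scalatest.funsuite.AnyFunSuite

    // A tiny function under test: extract the "type" field from a raw payload.
    object EventParser {
      private val TypeField = """"type"\s*:\s*"([^"]+)"""".r

      def eventType(raw: String): Option[String] =
        TypeField.findFirstMatchIn(raw).map(_.group(1))
    }

    final class EventParserTest extends AnyFunSuite {
      test("extracts the event type from a JSON payload") {
        assert(EventParser.eventType("""{"type":"click","page":"home"}""") == Some("click"))
      }

      test("returns None when the payload has no type field") {
        assert(EventParser.eventType("""{"page":"home"}""") == None)
      }
    }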


Views: 17
Published: about a month ago
Expires: in about 2 months
Contract type: PERMANENT, B2B
