Excellent experience across the data engineering lifecycle. You will have created data pipelines that take data through every layer, from generation and ingestion to transformation and serving. Senior stakeholder management skills.
Experience of modern software engineering principles and of creating clean, well-tested applications.
Strong experience with on-premises Hadoop distributions, especially Cloudera.
Strong experience with one or more cloud service providers: AWS, Azure or GCP (GCP experience is preferable).
Extensive experience using Python, PySpark and the wider Python ecosystem, with good exposure to common Python libraries.
Strong experience with SQL and with building data analytics solutions.
Proven experience building robust production data pipelines; Airflow experience preferred.
Optional
Experience developing in other languages, e.g. Scala or Java.
Your responsibilities
Work with one of the largest banks in the world on engaging projects that will transform the financial services industry.
Develop new and enhance existing financial and data solutions, with the opportunity to work on exciting greenfield projects as well as established Tier 1 bank applications used by millions of users.
Automate and optimise data engineering processes, and develop robust, fault-tolerant data solutions for both cloud and on-premises deployments.
You’ll be involved in digital and data transformation processes through a continuous delivery model.
Learn and work with specialised data and cloud technologies to widen your skill set.