We are looking to strengthen our engineering unit in Zurich to support our banking client in a hybrid working environment.
In this role, you will be responsible for designing, implementing and maintaining data-driven applications, engineering robust data pipelines and contributing to all stages of the software development lifecycle.
This role focuses on understanding and bridging the architectural and functional gaps between legacy systems and the new data platform. It requires a combination of data engineering expertise and solution architecture capabilities, particularly around lineage, compliance and integration strategy.
Would you like to be a part of this ambitious and innovative project? We look forward to receiving your application.
Responsibilities
Lead and execute gap analysis between Informatica PowerCenter-based data flows and the new S3/Kafka/Hive platform
Understand application-level integration, system interfaces and data exchange logic
Design data ingestion and transformation strategies aligned with metadata, historization and business domains
Model and document data flows, lineage and mapping from legacy DWH to the new architecture
Collaborate with business and platform stakeholders to define target state architecture for key use cases
Support transition of data products, schemas and pipelines into scalable, governed frameworks
Evaluate opportunities to simplify, consolidate and modernize across data zones and ETL flows
Recommend future-proof design choices based on architecture, security and platform readiness
Requirements
Solid experience in data engineering and data architecture within complex on-premise environments
Knowledge of Informatica PowerCenter, ETL architecture and integration patterns
Hands-on understanding of Kafka, S3-based data lakes, Hive Metastore and Postgres
Deep knowledge of schema evolution, historization and data governance models
Good knowledge of SQL, Python and data mapping/documentation using industry best practices
Familiarity with data modeling frameworks, metadata-driven ingestion and common data models
Working experience in regulated environments (preferably banking or finance)
Understanding of storage lifecycle and network segmentation (C3/C4)
Fluent English is a must; German language skills are a significant advantage
Nice to have
Experience with Trino, Apache Ranger or Dataiku
Experience with data mesh, data fabric or federated governance design
Background in legacy system modernization (e.g., mainframe decommissioning, host data feed migration)
Prior exposure to gap/impact analysis frameworks, business requirements analysis and stakeholder workshops
Working knowledge of Airflow or cloud-native alternatives
We offer
5 weeks of vacation
EPAM Employee Stock Purchase Plan (ESPP)
Enhanced parental leave
Extended pension plan
Daily sickness allowance insurance
Employee assistance program
Global business travel medical and accident insurance
Learning and development opportunities including in-house training and coaching, professional certifications, over 22,000 courses on LinkedIn Learning Solutions and much more
+ All benefits and perks are subject to certain eligibility requirements
Please note that any offers will be subject to appropriate background checks
We do not accept CVs from recruiting or staffing agencies
For this position, we are able to consider applications from the following:
+ Swiss nationals
+ EU/EFTA nationals
+ Third-country nationals based in Switzerland with an appropriate work permit
+ Displaced people from Ukraine who are currently in Switzerland and hold, or have already applied for, S permits
Beware of fraud agents! Do not pay money to get a job.
MNCJobs.ch will not be responsible for any payment made to a third party. All Terms of Use are applicable.