Are you an experienced Big Data professional eager to tackle complex challenges in a dynamic banking environment? EPAM is seeking a dedicated Senior Big Data System Engineer to maintain, optimize and enhance distributed systems and data pipelines for one of our strategic clients in the financial sector. In this role, you will operate Global Data Platform components, manage applications and integrate cutting-edge tools to deliver scalable and resilient solutions.
We offer a hybrid work model with a mix of remote and on-site work at our client's office in Zurich.
Responsibilities
Operate and maintain core Global Data Platform components, including VM servers, Kubernetes and Kafka, to ensure smooth system performance and functionality
Manage data and analytics applications, such as Apache stack components, Dataiku, Collibra and other essential tools
Automate infrastructure configurations, security components and CI/CD pipelines to drive efficiency and eliminate manual intervention in data pipelines
Develop robust solutions to enhance platform resiliency, implement health checks, monitoring, alerting and self-recovery mechanisms for data operations
Focus on improving data pipeline quality by addressing accuracy, timeliness and recency in ELT/ETL execution
Embed Agile and DevSecOps best practices into delivery processes, ensuring iterative development and deployment of integrated solutions
Collaborate with stakeholders such as enterprise security, digital engineering and cloud operations teams to align on effective solution architectures
Track and evaluate emerging technologies across the Big Data ecosystem to deliver innovative features and capabilities
Requirements
5+ years of demonstrated experience building or designing fault-tolerant, large-scale distributed systems, with a proven ability to manage complex infrastructure and operations
Mastery of distributed data technologies such as data lakes, Delta Lake, data meshes, data lakehouses and real-time streaming platforms
Expert knowledge of tools like Kafka, Kubernetes and Spark, as well as storage and file formats such as S3 and Parquet, for scalable data solutions
Proficiency in Python and Java programming, or alternatives such as Scala/R, along with strong Linux/Unix scripting skills
Experience in managing Docker (Harbor), VM setup/scaling, Kubernetes pod management and CI/CD pipelines
Knowledge of configuration management tools such as Jinja templates and Puppet scripts, as well as best practices for firewall rule setup
Strong understanding of DevOps practices for scalable deployment and automation strategies
Fluency in English is essential; German language skills are considered an asset for collaboration
Familiarity with financial services and their unique data and regulatory challenges is an advantage
We offer
5 weeks of vacation
EPAM Employee Stock Purchase Plan (ESPP)
Enhanced parental leave
Extended pension plan
Daily sickness allowance insurance
Employee assistance program
Global business travel medical and accident insurance
Learning and development opportunities including in-house training and coaching, professional certifications, over 22,000 courses on LinkedIn Learning Solutions and much more
+ All benefits and perks are subject to certain eligibility requirements
Please note that any offers will be subject to appropriate background checks
We do not accept CVs from recruiting or staffing agencies
For this position, we are able to consider applications from the following:
+ Swiss nationals
+ EU/EFTA nationals
+ Third-country nationals based in Switzerland with an appropriate work permit
+ Displaced people from Ukraine who are currently in Switzerland and hold, or have already applied for, S permits