We are looking for a Data Engineer with strong expertise in enterprise data platforms and lakehouse management. In this role, you'll design, build, and operate scalable data architectures that enable analytics, machine learning, and data-driven decision-making.
Your Responsibilities
Design, implement, and optimize enterprise-scale lakehouse platforms
Develop data ingestion pipelines (event mesh, ETL/ELT) and support data warehousing (Snowflake, Redshift, BigQuery)
Work with big data frameworks (Hadoop, Spark) for large-scale processing
Support data science and ML use cases (TensorFlow, PyTorch, Scikit-learn)
Integrate lakehouse solutions across cloud platforms (Databricks, AWS, Azure, GCP, OCI)
Apply Infrastructure as Code (Terraform, CloudFormation)
Manage and optimize SQL/NoSQL databases for performance and scalability
Ensure adherence to security and governance standards and best practices
Your Profile
Proven experience in data engineering and ETL/ELT processes
Strong knowledge of data warehousing and big data ecosystems
Hands-on experience with cloud data platforms (Databricks, AWS, Azure, GCP)
Skilled in Python, R, SQL, and IaC tools (Terraform, CloudFormation)
Certifications (e.g. Databricks, AWS, Azure, GCP) are a plus
German: C1 (professional fluency); English: B1 (technical)