About Us
ERGO Technology & Services S.A. (ET&S), a member of the Munich Re and ERGO Group, delivers integrated IT and business services to international markets. Our expertise lies in advanced IT services, with a focus on modern, business-driven technology solutions. On the business side, we also support the Group in end-to-end insurance processes, including finance, operations, and underwriting. With offices in Warsaw and Gdansk, and strong global partnerships, we foster a dynamic, multicultural environment that promotes diversity and international opportunities.
About the role
Join the Data Integration Team and help shape a next-generation integration platform at ERGO Technology & Services. This is not a traditional integration role. You will operate in a true DevOps ownership model: you design, build, and run your own solutions.
Our platform is fully cloud-native on Azure, leveraging Kubernetes (AKS), event-driven architecture, serverless components, and modern workflow orchestration. You will work on business-critical integrations that connect applications, services, and data streams across global insurance operations.
This role combines hands-on engineering, architectural thinking, and stakeholder collaboration. You will not only implement solutions but actively influence how integration is done across the organization.
How you will get the job done
- building and evolving a cloud-native integration platform at enterprise scale
- working on high-impact, mission-critical workflows across global business units
- operating in a true “you build it, you run it” environment with end-to-end ownership
- shaping integration standards, patterns, and architecture, not just implementations
- being part of a modern engineering culture focused on automation, quality, and continuous improvement
- contributing to future-facing integration use cases, including AI and data-driven workflows
- designing, building, and operating scalable, resilient integration solutions using Azure services, Kubernetes, and messaging platforms
- developing and orchestrating workflow-driven integrations using DAG-based frameworks such as Argo Workflows
- implementing event-driven architectures using technologies like Azure Service Bus, Event Hub, or Kafka
- ensuring observability, reliability, and performance of data and process flows
- collaborating closely with engineers, architects, and business stakeholders to translate requirements into robust technical solutions
- acting as a technical advisor, guiding teams on integration patterns, orchestration strategies, and best practices
- communicating complex technical concepts clearly to non-technical stakeholders
- contributing to a DevOps-first environment with infrastructure-as-code, automated deployments, and continuous delivery
- continuously improving platform capabilities, engineering practices, and operational excellence
- having end-to-end ownership: you design it, you build it, you run it
- focusing on automation, reliability, and scalability
- working in a collaborative, international, and agile environment with a culture of continuous learning, feedback, and improvement
Skills and experience you will need
- fluent English
- event-driven and workflow-centric architecture mindset
- strong experience with Kubernetes (AKS preferred) and cloud-native architectures
- solid hands-on experience with Azure, especially integration and messaging services
- proven experience with workflow orchestration systems (Argo Workflows, Temporal, Airflow) and DAG-based patterns
- experience with event streaming and messaging technologies (Azure Service Bus, Kafka, Event Hub, or similar)
- strong Python skills for automation, workflow development, and scripting
- good understanding of DevOps practices and toolchains (CI/CD, IaC, monitoring, GitOps)
- ability to communicate effectively with both technical and non-technical stakeholders
Nice to have
- experience with observability tools (e.g., Prometheus, Grafana, OpenTelemetry)
- exposure to serverless architectures and microservices design
- experience in large-scale enterprise environments
- interest or experience in AI-driven integration or data platform use cases