Kafka DevOps Engineer
For our international banking client, we are looking for a Senior DevOps Data Engineer (Kafka)
… to build with us the Strategic Data Exchange between Lending core systems and surrounding systems. The primary focus of this team is on data delivery to Lending internal applications, to the Wholesale Bank Data Lake, and on data delivery for Regulations.
Data is becoming more important every day. Your contribution to the Strategic Data Integration will be critical to realizing our envisioned Lending data platform, with high-quality and timely data availability, moving from batch to real-time. This will enable excellent data consumption possibilities to meet our ever-increasing client and regulatory demands on data.
We need your help in designing and building this new exchange, and in building bridges towards other squads in order to realize end-to-end delivery across Lending and other Tribes. We value Agile working, self-organization and craftsmanship. We are driven professionals who enjoy shaping the future of this place.
Needed skills & experience
We are looking for someone with an easy-to-work-with, mature and no-nonsense mentality. Someone who is an open and honest communicator, who values working as part of a team, who is willing and able to coach or train other developers and who is aware of developments and trends in the industry and corporate ecosystem.
Are you passionate about a (not so distant) future in which most data processing is done in a streaming fashion? Are you not scared off by complex data, and do you enjoy developing complex components in Java? Then please read on.
On the more technical side, you must have 9+ years of relevant experience in data engineering, and in particular experience in the following fields:
1. Agile / Scrum.
2. Track record in building larger corporate systems.
3. Kafka Streams API.
4. Kafka, Schema Registry and Kafka Connect, using the Confluent framework.
5. Java 8 or higher backend development.
6. CI/CD tooling: Azure DevOps, Maven, Checkmarx, Git, Ansible.
7. Running and managing a Kafka cluster and related components.
8. Linux (bash) scripting capabilities.
9. Data Integration techniques.
10. Oracle SQL, 12c or higher.
In addition to these must-haves, we appreciate knowledge of the following:
1. Oracle RDBMS 12c or higher.
2. Database Change Data Capture.
3. Logging and monitoring with Grafana, Elastic, Kibana, Prometheus or Logstash.
4. Data modelling.
5. Oracle Data Integrator 12c.
6. Experience in a complex, corporate environment.
7. Experience with Lending and financial systems.
8. Issue trackers like JIRA, ServiceNow.
9. Collaboration tooling like Confluence.
What we offer to you
• Work on something that has great significance to the bank.
• Being part of the squad shaping the future way of development.
• An enthusiastic team in an informal, dynamic environment.