Job Description:
Job Title- Partner Data, Big Data/Cloud Engineer
Location- Pune
Role Description
Our team is part of Technology, Data and Innovation (TDI) Private Bank. Within TDI, Partnerdata is the central client reference data system in Germany. As a core banking system, it is integrated with many banking processes and applications and communicates via more than 2,000 interfaces. From a technical perspective, we focus on the mainframe but also build solutions on an on-premises cloud, RESTful services and an Angular frontend. In addition to maintenance and the implementation of new CTB requirements, the content focus lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Big Data and Google Cloud area.
What we’ll offer you
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy
- Best in class leave policy
- Gender-neutral parental leave
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive Hospitalization Insurance for you and your dependents
- Accident and Term life Insurance
- Complimentary health screening for employees aged 35 years and above
Your key responsibilities
- You are responsible for implementing project requirements on the enterprise analytics platform (Cloudera, Hadoop, Spark) across the whole SDLC chain
- You are responsible for designing solutions to the business requirements, establishing the process architecture and defining enhancements to current functionalities
- You support the migration of current functionalities to Google Cloud
- You are responsible for the stability of the application landscape and support software releases
- You also support L3 topics and application governance
- In the CTM area, you are responsible for coding as part of an agile team (Java, Scala, Spring Boot)
Your skills and experience
- You have experience with databases and storage systems (HDFS, BigQuery, etc.) and with development, preferably on Big Data/GCP technologies
- Your architectural skills for big data solutions, especially interface architecture, allow a fast start
- You have experience with at least: Spark, Maven, Artifactory, the Hadoop ecosystem, TeamCity, Bitbucket
- You have knowledge of customer reference data, customer opening processes and, preferably, regulatory topics around know-your-customer (KYC) processes
- You work very well in teams but also independently, and you are constructive and target-oriented
- Your English skills are very good and you can communicate both professionally and informally in small talk with the team
How we’ll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
About us and our teams
Please visit our company website for further information:
https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.
Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.
We welcome applications from all people and promote a positive, fair and inclusive work environment.