Deutsche Bank Hiring Data Engineer Salary : 22 LPA - 27 LPA Freshers / Experienced Candidates Can Apply

Company : Deutsche Bank 

Job Category : Data Engineer





Job Title : Data Tribe | Cloud Data Engineer

Location : Pune, India




Role Description

Our Technology Data and Innovation (TDI) teams are responsible for the bank's complete information technology (IT) infrastructure as well as for developing, implementing and protecting the software required to support the bank's entire business. We move more than €1.6 trillion through the bank's platforms every day. Every day, we enable millions of customers to use our broad range of banking services across all contact channels.

Our technology platforms - with award-winning trading systems and mobile banking apps - help Deutsche Bank deliver high-quality products to its customers.

The Data tribe is part of TDI Private Bank Germany and covers the complete data value chain: not only the mobilization, integration, enrichment and storage of data, but also the tools, services and data analytics that enable new use cases and business capabilities, directly creating value for Deutsche Bank.

Data products support our strategic goals: driving the cloud migration of complex BI infrastructure, reducing the number of warehouses, simplifying our architecture and bringing down costs, while at the same time developing cutting-edge data technology such as automated near-time machine learning processes.

As part of our technology transformation, we are moving our on-premises data platforms to Google Cloud in order to offer more stability and scalability and to extract more value from data, which will further drive our business growth.

Your team will focus on the delivery and maintenance of applications that enable frontends by providing intelligent data deliveries, e.g. KPI calculation logic, and even the operationalization of machine learning algorithms together with analytical squads. Your team will be involved in the entire data journey: identifying data sources, analyzing data sources, data sourcing, ETL using GCP services, data warehousing, data transformation, data cataloging, data governance and dashboarding.



What we’ll offer you

As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
Best in class leave policy
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for Industry relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive Hospitalization Insurance for you and your dependents
Accident and Term life Insurance
Complimentary health screening for employees aged 35 and above



Your key responsibilities
Get involved in the full lifecycle of software products and services.
Support and consult the feature squads from design through go-live, including the operation.
Participate in system and architecture design including non-functional requirements.
Build and improve monitoring solutions and aim to automate fully to achieve zero-ops and utmost reliability.
Continuously explore the market w.r.t. new technological approaches to achieve the goals of the site reliability squad and the entire tribe.
Flexibility to work in APAC/EMEA shifts, along with on-call duties as required.



Your skills and experience

Mandatory Skills
Hands-on development work building scalable data engineering pipelines and other data engineering/modelling work using Java/Python.
Excellent knowledge of SQL and NoSQL databases.
Experience working in a fast-paced and Agile work environment.
Working knowledge of public cloud environment.
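To give a flavor of the SQL and pipeline skills listed above, here is a minimal, self-contained sketch of a KPI-style aggregation. It uses Python's built-in sqlite3 in place of a production warehouse, and the table and column names are illustrative assumptions only:

```python
import sqlite3

# In-memory SQLite stands in for a real warehouse (illustration only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [("alice", 120.0), ("bob", 80.0), ("alice", 50.0)],
)

# A typical KPI-style aggregation: total spend per customer.
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM transactions "
    "GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # [('alice', 170.0), ('bob', 80.0)]
```

In production the same query shape would run against a service such as BigQuery rather than SQLite.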



Preferred Skills
Experience in Dataflow (Apache Beam), Cloud Functions, Cloud Run or similar services in public cloud.
Knowledge of workflow management tools such as Apache Airflow/Composer.
Demonstrated ability to write clear code that is well-documented and stored in a version control system (GitHub).
Knowledge of GCS buckets, Google Pub/Sub, BigQuery or similar services in public cloud.
Knowledge about ETL processes in the Data Warehouse environment/Data Lake and how to automate them.
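The ETL automation mentioned above can be sketched as a tiny extract-transform-load flow. This is pure standard-library Python standing in for managed GCP services (Dataflow, Composer), and the data, field names and filter rule are all illustrative assumptions:

```python
import csv
import io

# Extract: read raw CSV (stand-in for an object in a GCS bucket).
raw = "id,amount\n1,100\n2,250\n3,75\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: type conversion plus a simple filter rule.
cleaned = [
    {"id": int(r["id"]), "amount": float(r["amount"])}
    for r in rows
    if float(r["amount"]) >= 100  # keep only large transactions
]

# Load: into an in-memory "warehouse" table (a plain dict of lists).
warehouse = {"transactions": cleaned}
print(warehouse["transactions"])
```

In a real pipeline, each of these three steps would typically become a task in an orchestrator such as Airflow/Composer so the flow can be scheduled and retried automatically.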



Nice to have
Knowledge of provisioning cloud resources using Terraform.
Knowledge of Shell Scripting.
Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
Knowledge of Google Cloud Monitoring and Alerting.
Knowledge of Dataform and Cloud Spanner.
Knowledge of the Data Vault 2.0 data warehouse modelling approach.
Knowledge of New Relic.
Excellent analytical and conceptual thinking.
Excellent communication skills, strong independence and initiative, ability to work in agile delivery teams.
Experience working with distributed teams (especially across Germany and India).



How we’ll support you
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs






This is Gayathri Ramachandran, admin of Gayatimes, a free job-portal platform for job seekers.





CLICK HERE TO APPLY
