Senior Data Engineer
Deutsche Bank AG
All India, Pune • 2 months ago
Experience: 5 to 9 Yrs
Job Description
Role Overview:
You will work as a Senior Data Engineer, AVP in the Transaction Monitoring and Data Controls team, where you will design, implement, and operationalize Java components. Your primary focus will be on designing, building, and maintaining scalable and reliable PySpark/DBT/BigQuery data pipelines on Google Cloud Platform (GCP) that process high-volume transaction data for regulatory and internal compliance monitoring. Additionally, you will implement data quality frameworks and monitoring solutions to ensure data accuracy, completeness, and timeliness within critical transaction monitoring systems. Collaboration with areas across Technology, Data, and Innovation (TDI), such as Cloud Platform, Security, Data, and Risk & Compliance, will be essential to create optimal solutions for the business.
Key Responsibilities:
- Design, build, and maintain scalable and reliable PySpark/DBT/BigQuery data pipelines on Google Cloud Platform (GCP)
- Implement robust data quality frameworks and monitoring solutions
- Contribute to DevOps capabilities for maximum automation of applications
- Collaborate across TDI areas to create optimal solutions for the business
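The data quality responsibility above can be illustrated with a minimal sketch of completeness and timeliness checks on a transaction batch. The function names (`check_completeness`, `check_timeliness`) and the record shape are hypothetical, invented for illustration only; a production framework at this scale would typically run such rules inside the pipeline itself (e.g. as PySpark or DBT tests) rather than in plain Python:

```python
from datetime import datetime, timedelta, timezone

def check_completeness(records, required_fields):
    """Return records where any required field is missing or None."""
    return [r for r in records if any(r.get(f) is None for f in required_fields)]

def check_timeliness(records, max_age, now):
    """Return records whose 'event_time' is older than max_age."""
    return [r for r in records if now - r["event_time"] > max_age]

# Synthetic transaction batch for demonstration.
now = datetime(2026, 3, 1, tzinfo=timezone.utc)
batch = [
    {"txn_id": "T1", "amount": 100.0, "event_time": now - timedelta(hours=1)},
    {"txn_id": "T2", "amount": None,  "event_time": now - timedelta(days=3)},
]

incomplete = check_completeness(batch, ["txn_id", "amount", "event_time"])
stale = check_timeliness(batch, max_age=timedelta(days=1), now=now)
```

Here `T2` fails both checks: its `amount` is missing (completeness) and its `event_time` is three days old against a one-day threshold (timeliness).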
Qualifications Required:
- Expert hands-on data engineering using Java/Scala/Kotlin or Python in toolsets such as Apache Spark, PySpark, or Dataflow/Apache Beam
- Professional experience with data warehousing technologies, ideally Google BigQuery
- Hands-on experience building DevOps pipelines with CI/CD tools such as TeamCity, Jenkins, or GitHub Actions
- Experience in software design and architecture considering non-functional requirements
- Proficiency in engineering within a secure, enterprise hybrid cloud environment
- Experience working with globally distributed teams and excellent communication skills
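The CI/CD qualification above might look like the following minimal GitHub Actions workflow for a Python pipeline repository. This is a generic sketch, not the team's actual setup; the job name, file paths, and test command are assumptions:

```yaml
# .github/workflows/ci.yml — hypothetical pipeline CI sketch
name: pipeline-ci
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: pytest tests/   # unit tests for pipeline transforms
```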
Additional Company Details:
The company strives for a culture where employees are empowered to excel together every day. Training, development, coaching, and support from experts in the team are provided to help employees excel in their careers. Flexible benefits are also offered to tailor to individual needs. For further information, please visit the company website.
Posted on: March 1, 2026