
Mastercard

Data Engineer II

Posted Yesterday
Hybrid
Gurugram, Haryana
Mid level
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title and Summary
Data Engineer II
Overview:
Our Purpose
We work to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. We cultivate a culture of inclusion for all employees that respects their individual strengths, views, and experiences. We believe that our differences enable us to be a better team - one that makes better decisions, drives innovation, and delivers better business results.
Job Title
Data Engineer II
Who is Mastercard?
Mastercard is a global technology company in the payments industry. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.
Our Team:
Product Data Analytics has pioneered a transaction-based revenue methodology to inform and guide Mastercard's strategy, product design, and innovation. We build internal analytics partnerships, strengthening our partners' focus on the health of the business, portfolio and revenue optimization opportunities, initiative tracking, new product development, and go-to-market strategies.
• Are you excited about data assets and the value they bring to an organization?
• Are you an evangelist for data-driven decision making?
• Are you motivated to be part of a global analytics team that builds large-scale analytical capabilities supporting end users across six continents?
• Do you want to be the go-to resource for data science and analytics in the company?
The Role:
Work with the Global Pricing and Interchange Strategy team to understand requirements, then design, build, and optimize robust data pipelines at scale. You'll work across Python/Scala/Spark, modern cloud platforms, and big data ecosystems to enable reliable data movement, transformation, and analysis for high-impact business use cases. If you enjoy solving complex data problems, tuning performance, and modernizing legacy data flows, this role is for you.
Key Responsibilities:
• Design & Build Pipelines: Develop resilient, scalable data pipelines using Spark (PySpark), Hive/Impala, SQL, and orchestration tools (e.g., Airflow, NiFi); a minimal PySpark sketch follows this list.
• Orchestration & Scheduling: Implement and maintain job schedules, SLAs, retries, alerts, and dependencies using Airflow/NiFi or similar schedulers (an Airflow sketch follows the Tools & Environment list below).
• Data Integration: Create robust ingestion patterns for batch and streaming from diverse sources: files, APIs, RDBMS, NoSQL.
• Data Modeling & Flow Analysis: Analyze existing pipelines and underlying data models to understand lineage, dependencies, and flow; document and improve them.
• Troubleshooting & Root Cause: Independently investigate and resolve data pipeline failures, data quality issues, and performance bottlenecks.
• Performance Optimization: Identify Spark performance bottlenecks and tune them (e.g., partitioning, caching, join strategy, shuffle optimization, broadcast hints).
• Code Modernization: Understand and refactor legacy pipelines; propose alternatives aligned with best practices and modern platforms.
• Data Formats & Transformation: Handle CSV, JSON, XML, and multi-layered transformation approaches (bronze/silver/gold).
• Analytics Enablement: Build Spark/Python/Impala applications for ingestion and data analysis; enable downstream consumption in SQL/NoSQL stores and BI tools.
• Collaboration: Partner with data scientists, analysts, and platform teams to operationalize models and enable self-service analytics.
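To make the pipeline and performance-tuning expectations concrete, here is a minimal, hypothetical PySpark sketch of a bronze-to-silver step with a broadcast join and date-partitioned output. Paths, table names, and columns (raw_txns, dim_merchant, txn_date, merchant_id, amount) are illustrative placeholders, not actual Mastercard schemas.

  from pyspark.sql import SparkSession, functions as F
  from pyspark.sql.functions import broadcast

  spark = SparkSession.builder.appName("pricing_pipeline_sketch").getOrCreate()

  # Bronze: raw files landed by an upstream feed, read as-is.
  bronze = spark.read.option("header", True).csv("/data/bronze/raw_txns/")

  # Silver: typed, cleaned, and enriched with a small dimension table.
  dim_merchant = spark.read.parquet("/data/silver/dim_merchant/")
  silver = (
      bronze
      .withColumn("txn_date", F.to_date("txn_date"))
      .withColumn("amount", F.col("amount").cast("double"))
      .join(broadcast(dim_merchant), "merchant_id", "left")  # broadcast hint avoids a large shuffle
  )

  # Partition the output by date so downstream queries can prune partitions.
  (silver.repartition("txn_date")
         .write.mode("overwrite")
         .partitionBy("txn_date")
         .parquet("/data/silver/txns/"))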
Required Qualifications:
• 4-6 years of data engineering experience.
• Strong programming skills in Python with production-grade Spark (RDD/DataFrame) development.
• Solid experience with the Hadoop ecosystem (HDFS, YARN), Hive, and Impala.
• Advanced SQL skills with query optimization for large datasets (e.g., window functions, CTEs, partition pruning); see the SQL sketch after this list.
• Hands-on with job scheduling/orchestration tools such as Apache Airflow and/or Apache NiFi.
• Experience building pipelines via Alteryx and/or SSIS (or willingness to modernize from these).
• Exposure to cloud platforms (Databricks) and modern data warehouses.
• Proven ability to analyze data pipelines and debug data quality/lineage issues end to end.
• Experience working with structured, semi-structured, and unstructured data.
• Familiarity with NoSQL databases and building Spark apps on top of Hadoop/SQL/NoSQL stores.
• Data visualization using Power BI/Tableau.
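As an illustration of the advanced SQL expected here, the hypothetical query below combines a CTE, a window function, and a filter on the partition column so the engine can prune partitions; the silver.txns table and its columns are placeholders.

  from pyspark.sql import SparkSession

  spark = SparkSession.builder.getOrCreate()

  # CTE + window function, filtered on the partition column (txn_date) for pruning.
  query = """
  WITH daily AS (
      SELECT merchant_id,
             txn_date,
             SUM(amount) AS daily_amount
      FROM silver.txns
      WHERE txn_date >= DATE '2024-01-01'   -- partition pruning on txn_date
      GROUP BY merchant_id, txn_date
  )
  SELECT merchant_id,
         txn_date,
         daily_amount,
         RANK() OVER (PARTITION BY txn_date ORDER BY daily_amount DESC) AS day_rank
  FROM daily
  """
  top_merchants = spark.sql(query).filter("day_rank <= 10")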
Preferred / Good to Have:
• AI/ML knowledge
Tools & Environment:
• Big Data: Hadoop, Spark, Hive, Impala
• Scheduling: Apache Airflow/NiFi
• ETL: Alteryx, SSIS
• Data Stores: SQL Server, other SQL/NoSQL
• Cloud: Databricks
• Languages: Python, SQL
• Visualization: Power BI/Tableau
• Formats: CSV, JSON, XML, Parquet/ORC
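For the orchestration and scheduling responsibilities above, a minimal, hypothetical Airflow DAG might wire up a daily schedule, retries, failure alerts, SLAs, and a task dependency as below; the DAG id, commands, and alert address are placeholders, not a real deployment.

  from datetime import datetime, timedelta
  from airflow import DAG
  from airflow.operators.bash import BashOperator

  default_args = {
      "owner": "data-engineering",
      "retries": 3,                          # automatic retries on failure
      "retry_delay": timedelta(minutes=10),
      "email": ["data-alerts@example.com"],  # placeholder alert address
      "email_on_failure": True,
      "sla": timedelta(hours=2),             # per-task SLA
  }

  with DAG(
      dag_id="pricing_daily_load_sketch",
      start_date=datetime(2024, 1, 1),
      schedule_interval="0 3 * * *",         # daily at 03:00
      catchup=False,
      default_args=default_args,
  ) as dag:
      ingest = BashOperator(task_id="ingest_raw", bash_command="spark-submit ingest.py")
      build_silver = BashOperator(task_id="build_silver", bash_command="spark-submit transform.py")
      ingest >> build_silver                 # dependency: transform runs after ingest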
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks carry inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
  • Abide by Mastercard's security policies and practices;
  • Ensure the confidentiality and integrity of the information being accessed;
  • Report any suspected information security violation or breach; and
  • Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

Top Skills

Alteryx
Apache Airflow
Apache NiFi
Databricks
Hadoop
Hive
Impala
Power BI
Python
Scala
Spark
SQL
SSIS
Tableau

Mastercard Gurugram, Haryana, IND Office

Mehrauli Gurgaon Road, Gurugram, Gurugram, India, 122002

Similar Jobs at Mastercard

Yesterday
Hybrid
Gurugram, Haryana, IND
Mid level
Blockchain • Fintech • Payments • Consulting • Cryptocurrency • Cybersecurity • Quantum Computing
The Associate Managing Consultant in Performance Analytics will manage client deliverables, develop analytics strategies, create predictive models, and mentor junior consultants while leveraging data analytics to provide insights.
Top Skills: Hadoop, Hive, Impala, Power BI, PySpark, Python, R, SAS, SQL, Tableau
Yesterday
Hybrid
Gurugram, Haryana, IND
Senior level
Blockchain • Fintech • Payments • Consulting • Cryptocurrency • Cybersecurity • Quantum Computing
The FP&A Senior Analyst supports financial planning and reporting for Services, assisting in revenue forecasting, financial report preparation, and business process improvements.
Top Skills: Hyperion, Oracle
Yesterday
Hybrid
Gurugram, Haryana, IND
Senior level
Blockchain • Fintech • Payments • Consulting • Cryptocurrency • Cybersecurity • Quantum Computing
The FP&A Senior Analyst supports financial planning, analysis, reporting, revenue forecasting, and business process improvements using financial tools and models.
Top Skills: Hyperion, Oracle

What you need to know about the Delhi Tech Scene

Delhi, India's capital city, is a place where tradition and progress co-exist. While Old Delhi is known for its rich history and bustling markets, New Delhi is defined by its modern architecture. It's clear the region places a strong emphasis on preserving its cultural heritage while embracing technological advancements, particularly in artificial intelligence, which plays a central role in shaping the city's tech landscape, fueled by investments in research and development.
