
Mastercard

Data Engineer II

Posted 5 Days Ago
Hybrid
Pune, Maharashtra
Mid level
The Data Engineer II will design and maintain big data solutions using the Hadoop ecosystem, focusing on development and optimization of data pipelines and processing jobs with Apache Spark and other related tools.
The summary above was generated by AI
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title and Summary
Data Engineer II
Job Overview
• The Data Engineer II will contribute to the design, development, and maintenance of big data solutions within the Hadoop ecosystem. The role involves working with distributed data processing frameworks and supporting the creation of reliable, high-performance data pipelines.
• The engineer will be responsible for hands-on development and troubleshooting using the Cloudera distribution, along with related ecosystem tools such as Apache Ozone, Apache Iceberg, Apache Airflow, and Apache NiFi. The position requires hands-on experience with Apache Spark to support large-scale, massively parallel data processing tasks.
• The Data Engineer II will collaborate closely with senior team members to implement best practices, optimize data workflows, and ensure seamless integration of components across the data platform.
 Role
• Contribute to the design, development, and enhancement of data engineering solutions within the Hadoop ecosystem, ensuring robust, scalable, and efficient data processing.
• Work hands-on with the Cloudera distribution, implementing and maintaining data pipelines and platform components using tools such as Apache Ozone, Apache Iceberg, Apache Airflow, and Apache NiFi.
• Develop and optimize large-scale data processing jobs using Apache Spark, leveraging massively parallel processing frameworks for high-performance workloads.
• Collaborate closely with senior engineers, architects, and cross-functional teams to implement best practices, improve workflow reliability, and support platform-level enhancements.
• Participate in performance tuning, troubleshooting, and root cause analysis of distributed systems and data pipelines in production environments.
• Contribute to the evaluation and adoption of emerging technologies in data storage, orchestration, and distributed processing to continuously improve system performance and efficiency.
• Ensure adherence to operational standards, including reliability, quality, and security requirements, while meeting project timelines and SLAs.
• Support documentation, automation, and incremental improvements to engineering processes and data platform components.
 Education
• Bachelor's degree in Information Technology, Computer Science, or Management Information Systems, or equivalent work experience.
 Knowledge / Experience:
• Experience in a related field; around 2 years of experience delivering secure solutions in the Financial Services sector is preferred.
• Thorough knowledge and understanding of software engineering concepts and methodologies is required.
• Demonstrates MC Core Competencies.
 About You
• You are proactive, detail-oriented, and able to work independently while collaborating effectively within a team environment.
• You have a strong ability to learn and adapt to new big data technologies, tools, and frameworks, continuously improving your technical depth.
• You bring 3+ years of overall experience, including 2+ years of hands-on experience on Hadoop ecosystem projects with the Cloudera distribution.
• You have practical experience building and maintaining distributed data pipelines using Apache Spark for massively parallel processing.
• You have hands-on knowledge of Apache Ozone and Apache Iceberg, with experience implementing distributed storage and modern table formats.
• You have experience creating and managing orchestration workflows using Apache Airflow and Apache NiFi for complex data movement and integration.
• You possess strong analytical and debugging skills, with the ability to troubleshoot distributed systems and resolve production issues effectively.
• You communicate clearly, work well in collaborative, multi-location teams, and maintain strong organizational discipline.
• You are adaptable, self-driven, and eager to contribute to the continuous improvement of data engineering practices and platform capabilities.
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
  • Abide by Mastercard's security policies and practices;
  • Ensure the confidentiality and integrity of the information being accessed;
  • Report any suspected information security violation or breach; and
  • Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

Top Skills

Apache Airflow
Apache Iceberg
Apache NiFi
Apache Ozone
Spark
Cloudera
Hadoop

Mastercard Gurugram, Haryana, IND Office

Mehrauli Gurgaon Road, Gurugram, Gurugram, India, 122002

Similar Jobs at Mastercard

Yesterday
Hybrid
Pune, Maharashtra, IND
Mid level
Blockchain • Fintech • Payments • Consulting • Cryptocurrency • Cybersecurity • Quantum Computing
The Data Engineer II will develop workflows for datasets to support reporting and analytics projects, requiring SQL expertise and collaboration with various teams.
Top Skills: Alteryx, Essbase, Hyperion, MS Excel, Power BI, SQL
9 Days Ago
Hybrid
Pune, Maharashtra, IND
Mid level
Blockchain • Fintech • Payments • Consulting • Cryptocurrency • Cybersecurity • Quantum Computing
The Data Engineer will manage data projects, implementing and automating ETL processes, ensuring data quality, and collaborating on global client engagements.
Top Skills: .NET, Microsoft SQL Server, Perl, Python, SSIS Stack, VB Script
10 Days Ago
Hybrid
Pune, Maharashtra, IND
Mid level
Blockchain • Fintech • Payments • Consulting • Cryptocurrency • Cybersecurity • Quantum Computing
This role involves designing and maintaining data infrastructure for Analysts and Data Scientists, supporting large-scale data ETL processes and enhancing data quality and architecture.
Top Skills: Athena, AWS, CloudFormation, Data Pipeline, Datadog, EMR, Glue, JavaScript, Jenkins, Kinesis, PHP, PSQL, Python, Redshift, S3, SageMaker, Splunk, SQL, Tableau

