
Capco

QE (Scala, Kafka, NiFi, JSON)

Hybrid
Pune, Maharashtra
Mid level

Job Title: Data QA Engineer (Kafka, NiFi, Scala Ecosystem)

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and have been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry: projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their businesses. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: Data QA Engineer (Kafka, NiFi, Scala Ecosystem)
 
Experience: 4–7 Years
Location: Pune (Hybrid – Client Office)
 
Job Overview
 
We are looking for a Data QA Engineer with strong experience in data pipeline and streaming validation across technologies like Apache Kafka, Apache NiFi, and Scala-based systems. The role requires deep expertise in data testing, ETL validation, and real-time data processing, ensuring high data quality, integrity, and reliability across complex distributed systems.
 
Key Responsibilities

- Perform end-to-end data testing across batch and real-time data pipelines
- Validate data ingestion, transformation, and streaming workflows built on Kafka and NiFi
- Verify JSON payloads, event streams, and message integrity across producers and consumers
- Design and execute test cases covering data accuracy, completeness, reconciliation, edge cases, and failure scenarios
- Perform data validation using complex SQL queries across large datasets and data warehouses
- Validate Scala-based backend processing logic impacting data transformations
- Monitor and test Kafka topics, partitions, offsets, and message flows
- Ensure data consistency across upstream and downstream systems
- Identify, log, and track data quality issues and pipeline defects to closure
- Collaborate with data engineers, developers, and business stakeholders on requirement validation and issue resolution
- Participate actively in Agile ceremonies and provide timely, accurate status updates

Required Skills & Qualifications

- 4–8 years of experience in data testing / ETL / data QA roles
- Strong hands-on experience with Apache Kafka (event validation, topic monitoring, consumers/producers)
- Experience in Apache NiFi pipeline testing and data flow validation
- Proficiency in SQL for data analysis and validation
- Experience working with JSON payloads and event-driven architectures
- Understanding of Scala-based data processing systems (basic to intermediate level)
- Strong knowledge of data warehousing and ETL concepts
- Experience in testing real-time/streaming data pipelines
- Familiarity with Agile/Scrum practices

Key Expectations

- Strong ownership and accountability in delivering data validation tasks within timelines
- Consistent participation in daily stand-ups and effective communication of progress and risks
- Ability to build deep functional and data understanding of systems under test
- Focus on edge cases, data anomalies, and negative scenarios to ensure robust validation
- Maintain discipline in status reporting, coordination, and collaboration

Good to Have

- Experience with big data technologies (Spark, Hadoop ecosystem)
- Exposure to data quality frameworks and automation tools
- Knowledge of CI/CD pipelines and DevOps practices
- Domain experience in Banking / Financial Services (BFSI)
- Familiarity with cloud platforms (AWS/Azure/GCP)

Key Competencies

- Strong analytical and data validation skills
- Attention to detail in high-volume data environments
- Proactive communication and collaboration
- Ability to work in fast-paced, distributed systems
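To give a flavour of the reconciliation-style checks described above, here is a minimal, self-contained Scala sketch of comparing a source extract against a target load for completeness and accuracy. All names in it (`ReconcileSketch`, `Record`) are illustrative only and not part of any Capco or client toolchain; real pipelines would read from Kafka topics or warehouse tables rather than in-memory sequences.

```scala
// Illustrative sketch of two basic data-reconciliation checks a Data QA
// engineer might script: completeness (keys missing downstream) and
// accuracy (values that disagree between source and target).
object ReconcileSketch {
  // Simplified record keyed by a business identifier (hypothetical shape).
  final case class Record(id: String, amount: BigDecimal)

  // Completeness check: ids present in the source but absent from the target.
  def missingInTarget(source: Seq[Record], target: Seq[Record]): Set[String] =
    source.map(_.id).toSet -- target.map(_.id).toSet

  // Accuracy check: ids whose amounts disagree between source and target.
  def mismatched(source: Seq[Record], target: Seq[Record]): Set[String] = {
    val targetById = target.map(r => r.id -> r.amount).toMap
    source.collect {
      case r if targetById.get(r.id).exists(_ != r.amount) => r.id
    }.toSet
  }
}
```

In practice the same two checks are usually expressed as anti-join and value-comparison SQL against the warehouse; the in-memory version above just makes the logic explicit.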

Top Skills

Apache Kafka
Apache NiFi
Data Warehousing
ETL
JSON
Scala
SQL


