
Capco

Data Engineer

Sorry, this job was removed at 03:00 p.m. (IST) on Thursday, Apr 03, 2025
Remote
Hybrid
Hiring Remotely in India


Job Title: Jr. Data Engineer 

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and have been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry, on projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing the energy and financial services sectors.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.


JOB SUMMARY:

  • Position: Jr Consultant
  • Location: Pune / Bangalore / Hyderabad
  • Band: M1/M2 (3 to 7 years)

Role Description:

Must-have Requirements:

  • PySpark or Scala development and design.
  • Experience using scheduling tools such as Airflow.
  • Experience with most of the following technologies: Apache Hadoop, PySpark, Apache Spark, YARN, Hive, Python, ETL frameworks, MapReduce, SQL, and RESTful services.
  • Sound knowledge of working on Unix/Linux platforms.
  • Hands-on experience building data pipelines using Hadoop components such as Hive, Spark, and Spark SQL.
  • Experience with industry-standard version control tools (Git, GitHub), automated deployment tools (Ansible and Jenkins), and requirement management in JIRA.
  • Understanding of big data modelling using relational and non-relational techniques.
  • Experience debugging code issues and publishing the highlighted differences to the development team.

Good-to-have Requirements:

  • Experience with Elasticsearch.
  • Experience developing Java APIs.
  • Experience with data ingestion.
  • Understanding of or experience with cloud design patterns.
  • Exposure to DevOps and Agile project methodologies such as Scrum and Kanban.

WHY JOIN CAPCO? 

You will work on engaging projects with some of the largest banks in the world, on projects that will transform the financial services industry.

We offer:

  • A work culture focused on innovation and creating lasting value for our clients and employees
  • Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
  • A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
  • A diverse, inclusive, meritocratic culture


#LI-Hybrid

