
Capco

PySpark Azure Synapse Data Engineer - Bangalore/ Chennai/ Pune/ Hyderabad/ Mumbai/ Gurgaon

Sorry, this job was removed at 11:23 a.m. (IST) on Wednesday, Oct 09, 2024
India
Internship

Job Title: Data Engineer with PySpark and Azure Synapse

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry. These projects will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their businesses. Together with our clients and industry partners, we deliver disruptive work that is changing the energy and financial services sectors.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.


Key Skills: Data Engineering, PySpark, ADLS, Synapse, Hadoop, CI/CD

Technical Requirements:

Role: Data Engineer / Azure Data Engineer

Responsibilities

  • Design, build, and maintain data pipelines and infrastructure on Microsoft Azure.
  • Extract, transform, and load data from various sources including databases, APIs, and flat files.
  • Stay up to date with the latest advancements in Azure data technologies and best practices.
  • Propose suitable data migration sets to the relevant stakeholders.
  • Assist teams with processing the data migration sets as required.
  • Assist with planning, tracking, and coordinating the data migration team, and with maintaining the migration run book and the scope for each customer.
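The responsibilities above describe a classic extract-transform-load (ETL) pipeline. In a Capco engagement this would typically be built with PySpark and Azure services such as Data Factory, ADLS, and Synapse; the minimal stdlib sketch below only illustrates the shape of the three stages, and the feed, field names, and data-quality rule are all hypothetical.

```python
import csv
import io
import json

# Hypothetical raw feed standing in for a source-system extract.
RAW_CSV = """account_id,balance,currency
A001,1200.50,USD
A002,,USD
A003,310.00,EUR
"""

def extract(raw: str) -> list:
    """Extract: read rows from a CSV source into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    """Transform: drop rows with missing balances and cast types."""
    return [
        {"account_id": r["account_id"],
         "balance": float(r["balance"]),
         "currency": r["currency"]}
        for r in rows
        if r["balance"]  # data-quality rule: balance must be present
    ]

def load(rows: list) -> str:
    """Load: serialise the curated rows. A real pipeline would write
    Parquet to ADLS via a Synapse or Data Factory sink instead."""
    return json.dumps(rows)

curated = load(transform(extract(RAW_CSV)))
```

In PySpark the same stages map onto `spark.read`, DataFrame transformations, and `DataFrame.write`, with the quality rule expressed as a filter.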

Role Requirements

  • Strong Data Analysis experience in Financial Services
  • Knowledge of and experience using data models and data dictionaries in a Banking and Financial Markets context
  • Demonstrate a continual desire to implement “strategic” or “optimal” solutions and where possible, avoid workarounds or short-term tactical solutions.
  • Work with stakeholders to ensure that negative customer and business impacts are avoided.
  • Manage stakeholder expectations and ensure that robust communication and escalation mechanisms are in place across the project portfolio.
  • Good understanding of the control requirements surrounding data handling

Experience/Skillset

  • Proven experience working with Microsoft Azure Data Services, including Data Factory, Data Lake Storage, Synapse Analytics, and Azure SQL Database.
  • Strong knowledge of data warehousing, data modeling, and ETL/ELT principles.
  • Hands-on experience with Airflow and Elastic
  • Perform data cleansing, transformation, and integration tasks to ensure data quality and consistency.
  • Develop and maintain data models for data warehousing, data lakes, and other data storage solutions.
  • Proficiency in programming languages like Python, PySpark, Scala, or SQL.
  • Knowledge of and experience using data models and data dictionaries in a Banking and Financial Markets context
  • Knowledge of CI/CD processes for application software integration and deployment, including Maven, Git, and Jenkins
  • Implement and maintain security best practices for data storage and access.
  • Monitor and troubleshoot data pipelines and data quality issues.
  • Knowledge of COBOL copybooks is preferred, for migrating COBOL copybook-based solutions to an Azure data solution.
  • Document and communicate technical solutions effectively.
  • Enthusiastic and energetic problem solver to join an ambitious team.
  • Business analysis skills, defining and understanding requirements.
  • Attention to detail.
  • Good knowledge of SDLC and formal Agile processes, a bias towards TDD and a willingness to test products as part of the delivery cycle.
  • Ability to communicate effectively in a multi-program environment across a range of stakeholders.
  • Strong verbal and written communication skills
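One skill above concerns migrating COBOL copybook solutions to Azure. A copybook defines fixed-width record layouts, so migration usually begins by parsing those records into structured rows before they are loaded into a lake or warehouse. The sketch below uses a hypothetical copybook fragment and field names; real layouts also involve packed decimal (COMP-3) fields and EBCDIC encoding, which this deliberately does not handle.

```python
# A hypothetical COBOL copybook fragment and one fixed-width record:
#   01 CUSTOMER-REC.
#      05 CUST-ID    PIC X(6).
#      05 CUST-NAME  PIC X(12).
#      05 BALANCE    PIC 9(7)V99.
# PIC 9(7)V99 means nine digits with an implied decimal point before
# the last two, so "000123450" decodes to 1234.50.

LAYOUT = [            # (field name, width, decoder)
    ("cust_id", 6, str.strip),
    ("cust_name", 12, str.strip),
    ("balance", 9, lambda s: int(s) / 100),
]

def parse_record(line: str) -> dict:
    """Slice one fixed-width record according to the copybook layout."""
    out, pos = {}, 0
    for name, width, decode in LAYOUT:
        out[name] = decode(line[pos:pos + width])
        pos += width
    return out

record = parse_record("C00042JANE DOE    000123450")
```

In a PySpark migration, a parser like this would run inside a UDF or `substring` expressions over the raw mainframe extract, producing typed columns for the Azure target.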

If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.


