
Equifax Inc.

Data Engineer

Posted 8 Days Ago
Trivandrum, Thiruvananthapuram, Kerala
Junior

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

What You Will Do:

  • Design and Build Data Pipelines: Develop and maintain robust data pipelines using Apache Beam in Java and Apache Spark to process large datasets efficiently across Google Cloud Platform (GCP).

  • Data Integration: Collaborate with data scientists, analysts, and other stakeholders to design data integration solutions for various business needs, ensuring data quality and reliability.

  • Cloud Infrastructure Management: Utilize GCP services like DataProc, BigQuery, and Cloud Storage to manage cloud-based data processing and storage solutions.

  • Performance Tuning: Identify and optimize performance bottlenecks in data processing workflows, ensuring high efficiency and low latency in pipeline execution.

  • Monitoring and Maintenance: Establish monitoring, logging, and alerting mechanisms for data pipelines to ensure operational excellence.

  • Documentation: Create technical documentation outlining data flows, architecture design, and maintenance procedures for future reference.

  • Stay Current with Technologies: Keep abreast of the latest developments in data engineering tools and technologies, particularly within the GCP ecosystem, to continuously improve processes and solutions.

What Experience You Need:

  • Experience with GCP: 1+ years of experience working with cloud platform services, particularly in data processing and analytics.

  • Apache Beam in Java: Demonstrated expertise in building ETL processes using Apache Beam in Java.

  • Spark and DataProc: Hands-on experience with Apache Spark, including optimizing jobs in DataProc for performance and efficiency.

  • SQL Proficiency: Strong skills in Hive SQL and familiarity with data warehousing concepts, along with experience in writing complex queries for data manipulation and analysis.

  • Data Loading and Transformation: Experience in managing and transforming large data sets, and knowledge of data storage solutions and frameworks.

  • CI/CD Practices: Familiarity with version control tools (like Git) and CI/CD methodologies for code deployment.

  • Bachelor’s degree in Computer Science, Engineering, or a related technical field.

What Could Set You Apart:

  • Strong Programming Skills: Proficiency in additional programming languages (Python, Scala, etc.) and frameworks that support data engineering tasks.

  • Big Data Technologies: Experience with other big data frameworks or tools (e.g., Hadoop, Kafka, Airflow) that complement data engineering efforts.

  • Cloud Certifications: Relevant GCP certifications (e.g., Google Cloud Professional Data Engineer) that demonstrate your commitment and expertise in the field.

  • Architectural Knowledge: Understanding of data architecture principles, including data lakes, data warehousing, and the concepts of batch and stream processing.

  • Active Participation in the Community: Contributions to open-source projects, speaking engagements at conferences, or involvement in data engineering forums can enhance your profile.

  • Business Acumen: Ability to translate technical requirements into actionable business solutions and insights.

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks.

Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Primary Location:

IND-Trivandrum-Equifax Analytics-PEC

Function:

Tech Dev and Client Services

Schedule:

Full time

Top Skills

Java
Python
Scala


