
Zensar Technologies

Data Engineer

Posted 2 Days Ago
Remote
Hiring Remotely in India
Senior level
Design and build ETL pipelines using Apache Spark/PySpark, implement Apache Iceberg table operations, and validate the analytical star schema. Requires strong data engineering experience and collaboration with stakeholders.

**Key responsibilities:**

- Design and build the Apache Spark/PySpark ETL pipeline (Bronze → Silver → Gold medallion architecture)
- Implement Apache Iceberg table operations (MERGE, UPSERT, SCD Type 2 logic, incremental loads)
- Design and validate the analytical star schema (fact/dimension tables, conformed dimensions)
- Define and execute three-tier data quality rules, dead-letter handling, and validation logic
- Build business logic connectors, transformation helpers, and custom derivations
- Collaborate with stakeholders to clarify KPIs, query patterns, and analytical use cases
- Write comprehensive unit, integration, and end-to-end tests
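A minimal plain-Python sketch of the SCD Type 2 logic named above (the row-level semantics an Iceberg `MERGE INTO` would implement; the function name, row layout, and column names here are hypothetical illustrations, not part of any actual codebase):

```python
from datetime import date

def scd2_upsert(dimension, incoming, key, attrs, load_date):
    """Expire changed current rows and append new versions (SCD Type 2)."""
    result = list(dimension)
    # Index the currently-active version of each business key.
    current = {row[key]: row for row in result if row["is_current"]}
    for rec in incoming:
        existing = current.get(rec[key])
        if existing is not None:
            if all(existing[a] == rec[a] for a in attrs):
                continue  # no tracked attribute changed; keep the current row
            existing["is_current"] = False        # expire the old version...
            existing["effective_to"] = load_date
        result.append({                           # ...and append the new one
            key: rec[key],
            **{a: rec[a] for a in attrs},
            "effective_from": load_date,
            "effective_to": None,
            "is_current": True,
        })
    return result
```

In production this would typically be a single `MERGE INTO ... WHEN MATCHED ... WHEN NOT MATCHED` statement against the Silver-layer Iceberg table, but the semantics are the same: a changed key gets its current row closed out and a new current row appended.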



**Required skills:**

- 5+ years of data engineering, backend development, or full-stack analytics platform work
- Expert-level PySpark, SQL, and cloud data warehousing (Snowflake/Databricks/similar)
- Apache Iceberg experience, or strong willingness to learn advanced table formats
- Dimensional modelling fundamentals (Kimball star schema, slowly changing dimensions)
- Python for data transformation and microservices
- API design, error handling, and testing discipline
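As a rough illustration of the dimensional-modelling fundamentals listed above, a Gold-layer query resolves a fact table's surrogate keys against its conformed dimensions. A plain-Python sketch (table and column names are hypothetical):

```python
def join_fact_to_dim(fact_rows, dim_rows, surrogate_key):
    """Enrich fact rows with dimension attributes via a surrogate-key lookup
    (the in-memory equivalent of a star-schema join)."""
    dim_by_key = {row[surrogate_key]: row for row in dim_rows}
    enriched = []
    for fact in fact_rows:
        dim = dim_by_key[fact[surrogate_key]]
        # Keep the fact's measures and key; pull in the dimension's attributes.
        enriched.append({**fact, **{k: v for k, v in dim.items() if k != surrogate_key}})
    return enriched
```

In Spark this is an ordinary equi-join on the surrogate key; keeping dimensions conformed means the same lookup works unchanged across every fact table that references them.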


About Us

At Zensar, we’re “experience-led everything”. We are committed to conceptualizing, designing, engineering, marketing, and managing digital solutions and experiences for over 130 leading enterprises. We are a company driven by a bold purpose: Together, we shape experiences for better futures. Whether for our clients, our people, or the world around us, this belief powers everything we do. At the heart of our culture is ONE with Client - a set of four core values that reflect who we are and how we work: One Zensar, Nurturing, Empowering, and Client Focus.
Part of the $4.8 billion RPG Group, we’re a community of 10,000+ innovators across 30+ global locations, including Milpitas, Seattle, Princeton, Cape Town, London, Zurich, Singapore, and Mexico City. Explore Life at Zensar and join us to Grow. Own. Achieve. Learn. to be the best version of yourself.
We believe the best work happens when individuality is celebrated, growth is encouraged, and well-being is prioritized. We are an equal employment opportunity (EEO) and affirmative action employer, committed to creating an inclusive workplace. All qualified applicants will be considered without regard to race, creed, color, ancestry, religion, sex, national origin, citizenship, age, sexual orientation, gender identity, disability, marital status, family medical leave status, or protected veteran status.


