Design and develop ETL/ELT pipelines for Snowflake and PostgreSQL, optimize data models, manage Snowflake environments, and collaborate on data solutions.
Key Responsibilities:
- Design and develop robust ETL/ELT pipelines to ingest, transform, and load data into Snowflake and PostgreSQL.
- Optimize and maintain data models, schemas, and queries for performance and scalability.
- Collaborate with data analysts, scientists, and business stakeholders to understand data needs and deliver high-quality solutions.
- Manage and monitor Snowflake environments, including storage, compute resources, and user access.
- Implement best practices for data governance, data quality, and data security.
- Develop and maintain documentation for data pipelines, models, and processes.
- Troubleshoot and resolve data issues in a timely manner.
- Participate in code reviews, architecture discussions, and continuous improvement initiatives.
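The first responsibility above is the heart of the role: building ETL/ELT pipelines that ingest, transform, and load data into Snowflake and PostgreSQL. A minimal sketch of that extract–transform–load shape in Python (the posting's named language) is below; the source data, field names, and the simple data-quality rule are all hypothetical, and a real pipeline would load via the Snowflake or PostgreSQL client libraries rather than returning a row count.

```python
from dataclasses import dataclass
from typing import Iterable


@dataclass
class Order:
    order_id: int
    amount_cents: int
    currency: str


def extract() -> list[dict]:
    # Stand-in for reading from a source system (API, S3 object, etc.).
    return [
        {"order_id": 1, "amount_cents": 1999, "currency": "usd"},
        {"order_id": 2, "amount_cents": -50, "currency": "USD"},  # fails quality check
        {"order_id": 3, "amount_cents": 4500, "currency": "eur"},
    ]


def transform(rows: Iterable[dict]) -> list[Order]:
    # Normalize currencies and apply one data-quality rule (non-negative amounts).
    return [
        Order(r["order_id"], r["amount_cents"], r["currency"].upper())
        for r in rows
        if r["amount_cents"] >= 0
    ]


def load(orders: list[Order]) -> int:
    # In a real pipeline this would run an INSERT or COPY against
    # Snowflake or PostgreSQL; here it just reports the loaded row count.
    return len(orders)


loaded = load(transform(extract()))
```

In practice each stage would be a task in an orchestrator such as Apache Airflow, with the transform step often expressed declaratively in dbt instead of Python.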
Required Skills & Qualifications:
- 6+ years of professional experience in Data Engineering or a related field.
- Strong expertise in Snowflake – including data warehousing concepts, performance tuning, SnowSQL, and security controls.
- Deep understanding of PostgreSQL – advanced SQL, indexing, query optimization, and data modelling.
- Proficient in programming/scripting languages such as Python or Scala for data processing.
- Experience with ETL tools (e.g., dbt, Apache Airflow, Talend, Informatica).
- Familiarity with cloud platforms (AWS, GCP, or Azure) and services like S3, GCS, or Blob Storage.
- Knowledge of data lake architectures and data integration best practices.
- Excellent problem-solving and communication skills.
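The PostgreSQL requirement above calls out indexing and query optimization specifically. The sketch below illustrates the core idea, that an index turns a full table scan into an index lookup, using Python's built-in sqlite3 module as a self-contained stand-in for PostgreSQL; the table, column, and index names are invented, and in PostgreSQL you would inspect plans with EXPLAIN / EXPLAIN ANALYZE rather than SQLite's EXPLAIN QUERY PLAN.

```python
import sqlite3

# SQLite stands in for PostgreSQL so the example needs no running server;
# the scan-vs-index concept carries over directly.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)"
)
conn.executemany(
    "INSERT INTO events (user_id, ts) VALUES (?, ?)",
    [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)


def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN rows carry the plan text in their last column.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))


query = "SELECT * FROM events WHERE user_id = 42"
before = plan(query)  # typically a full scan, e.g. "SCAN events"
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
after = plan(query)   # now an index lookup via idx_events_user
```

The same reasoning drives day-to-day tuning work: read the plan first, then decide whether an index, a rewritten predicate, or a schema change is the right fix.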
Preferred Qualifications:
- Experience with data cataloguing and lineage tools (e.g., Alation, Collibra).
- Hands-on experience with streaming data (e.g., Kafka, Spark Streaming).
- Familiarity with CI/CD pipelines for data engineering workflows.
- Exposure to DevOps or DataOps practices.
- Certifications in Snowflake, PostgreSQL, or cloud platforms (AWS/GCP/Azure).
Top Skills
Apache Airflow
AWS
Azure
dbt
GCP
Informatica
PostgreSQL
Python
Scala
Snowflake
Talend