
Synechron

Data Engineer (Python, PySpark, Snowflake, Cloud) | Scalable Data Pipelines & Cloud Migration Support

Posted 3 Hours Ago
Remote
Hiring Remotely in Hinjawadi, Pune, Maharashtra, IND
Senior level
The Data Engineer is responsible for managing and optimizing cloud data pipelines, automating workflows, supporting data migration, and ensuring compliance and performance across cloud platforms.

Job Summary
Synechron is seeking a highly experienced Snowflake Support Engineer with expertise in Python, Databricks, and PySpark to support enterprise data solutions. This role involves managing and optimizing data pipelines, supporting Cloud data platforms, and ensuring system security, reliability, and performance. The successful candidate will collaborate with cross-functional teams to implement best practices, improve operational efficiency, and ensure data governance and compliance in a cloud environment.

Software Requirements
 

Required Software Proficiency:

  • Python (version 3.7+): extensive experience scripting and automating data workflows

  • Databricks: hands-on experience managing data pipelines and performing data transformations within the Databricks environment

  • PySpark: strong skills in processing large datasets and developing scalable data transformations

  • Snowflake: proficiency in designing, deploying, and supporting data warehousing solutions, including data loading and performance tuning

  • SQL: advanced query writing and data validation skills for data reconciliation and monitoring

  • Cloud Platform (AWS, Azure, or GCP): experience supporting cloud data services, deployment, and integration (preferred)
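
As an illustration of the SQL-based validation and reconciliation work the role describes, the sketch below compares row counts between a staging table and a warehouse table after a load. It uses SQLite as a lightweight stand-in for Snowflake, and the table names (`staging_orders`, `dw_orders`) are hypothetical.

```python
import sqlite3

def reconcile_counts(conn, source_table, target_table):
    # Compare row counts between a source and target table -- a basic
    # post-load validation step. Table names are trusted, hypothetical
    # identifiers here; in production they would come from config.
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return {"source": src, "target": tgt, "match": src == tgt}

# Demo with an in-memory database standing in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_orders (id INTEGER);
    CREATE TABLE dw_orders (id INTEGER);
    INSERT INTO staging_orders VALUES (1), (2), (3);
    INSERT INTO dw_orders VALUES (1), (2), (3);
""")
print(reconcile_counts(conn, "staging_orders", "dw_orders"))
```

Against Snowflake, the same check would run over a connector session instead of sqlite3, but the reconciliation logic is identical.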

Preferred Software Skills:

  • CI/CD tools: Jenkins, Azure DevOps, or GitLab for automation of deployment workflows

  • Monitoring tools: Prometheus, Grafana, CloudWatch for system health and performance monitoring

  • Data governance and security tools supporting encryption, access controls, and compliance standards (GDPR, HIPAA)

Overall Responsibilities

  • Support and optimize scalable, secure, and high-performance data pipelines built within Snowflake and Databricks environments

  • Automate data workflows and pipeline operations to improve efficiency and reduce manual intervention

  • Troubleshoot and resolve data ingestion, transformation, and performance issues promptly

  • Collaborate with data engineers, data scientists, and business teams to implement data validation, security, and reconciliation processes

  • Support data migration activities, schema updates, and cloud deployments aligned with enterprise data strategies

  • Monitor system performance, manage incident resolution, and optimize resource use for cost efficiency

  • Maintain comprehensive documentation of data pipelines, architecture, security protocols, and operational procedures

  • Support automation and infrastructure-as-code initiatives for scalable data environments
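
One common way to reduce manual intervention on transient pipeline failures, as the responsibilities above call for, is a retry wrapper with exponential backoff. The sketch below is minimal and platform-agnostic; the `flaky_load` task and its failure mode are invented for illustration.

```python
import time

def run_with_retry(task, retries=3, base_delay=1.0):
    # Retry a flaky pipeline step with exponential backoff before
    # escalating. `task` is any zero-argument callable.
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            if attempt == retries:
                raise  # out of retries: surface the failure for incident handling
            delay = base_delay * 2 ** (attempt - 1)
            print(f"attempt {attempt} failed ({exc}); retrying in {delay:.0f}s")
            time.sleep(delay)

# Demo: a task that fails once, then succeeds.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient connection error")
    return "loaded"

print(run_with_retry(flaky_load, retries=3, base_delay=0))  # prints "loaded"
```

In a real ingestion job the wrapped task would be a Snowflake COPY or a Databricks job submission, with the final exception routed to an alerting channel rather than re-raised silently.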

Technical Skills (By Category)

  • Languages & Data Tools (Essential):

    • Python (3.7+), PySpark, SQL — core skills for scripting, data processing, and validation

    • Snowflake — data warehousing, schema design, and query optimization

  • Databases & Data Management:

    • Snowflake and relational databases (Oracle, SQL Server, PostgreSQL) for enterprise data management

    • Data governance, metadata management, and lineage tools that support compliance frameworks

  • Cloud & Infrastructure:

    • AWS, Azure, or GCP cloud services for data integration, storage, and migration (preferred)

    • Infrastructure as Code: Terraform, CloudFormation (preferred)

  • Monitoring & Automation:

    • Prometheus, Grafana, CloudWatch for system monitoring and alerting

    • CI/CD pipelines (Jenkins, Azure DevOps) for automation of deployment and testing

  • Security & Data Governance:

    • Knowledge of encryption protocols, role-based access control, and compliance standards such as GDPR, HIPAA
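
Role-based access control, listed above, can be sketched as a simple mapping from roles to permitted actions. The roles and permissions below are illustrative only and not tied to Snowflake or any specific platform.

```python
# A minimal role-based access control (RBAC) sketch.
# Role names and permissions are hypothetical examples.
ROLE_PERMISSIONS = {
    "analyst": {"read"},
    "engineer": {"read", "write"},
    "admin": {"read", "write", "grant"},
}

def is_allowed(role, action):
    # Unknown roles get an empty permission set, i.e. deny by default.
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("engineer", "write"))  # True
print(is_allowed("analyst", "write"))   # False
```

In Snowflake itself this pattern is expressed declaratively with `GRANT ... TO ROLE` statements rather than application code, but the deny-by-default principle is the same.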

Experience Requirements

  • 6+ years supporting enterprise cloud data platforms, big data pipelines, and data warehousing solutions

  • Proven experience designing, deploying, and optimizing scalable data pipelines in cloud environments built on Snowflake and Databricks

  • Strong understanding of cloud migration, data security, and compliance practices in regulated industries (preferred)

  • Hands-on experience with data validation, reconciliation, and automation of data workflows

  • Demonstrated ability to troubleshoot, optimize, and support high-availability data systems in enterprise environments

Day-to-Day Activities

  • Develop, test, and support scalable data pipelines for analytics, migration, and compliance initiatives

  • Collaborate with data science, application, and business teams to refine data workflows and support ongoing improvements

  • Troubleshoot and resolve performance bottlenecks, data inconsistencies, and security issues

  • Support cloud data migration, schema updates, and operational automation processes

  • Monitor system health, generate operational reports, and implement security and compliance measures

  • Maintain technical documentation, architecture diagrams, and operational runbooks

  • Automate deployment pipelines, infrastructure provisioning, and system updates using Terraform, Jenkins, or similar tools
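
A basic form of the system-health monitoring described above is a threshold check over collected metrics, feeding an alerting tool such as Prometheus or CloudWatch. The metric names and alert limits in this sketch are hypothetical.

```python
def check_thresholds(metrics, limits):
    # Return the (sorted) names of metrics that exceed their alert limits.
    # Metrics without a configured limit are ignored.
    return sorted(name for name, value in metrics.items()
                  if name in limits and value > limits[name])

# Hypothetical pipeline health snapshot and alert limits.
metrics = {"pipeline_latency_s": 340, "failed_rows": 12, "cpu_pct": 55}
limits = {"pipeline_latency_s": 300, "failed_rows": 100, "cpu_pct": 90}

print(check_thresholds(metrics, limits))  # ['pipeline_latency_s']
```

A real deployment would pull these values from a metrics backend on a schedule and page on the breached names; the comparison logic stays this simple.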

Qualifications

  • Bachelor’s or Master’s degree in Data Engineering, Computer Science, or related field

  • 6+ years supporting enterprise data platforms, cloud data migration, and scalable data pipeline deployment

  • Preferred certifications: SnowPro, AWS Certified Data Analytics, GCP Professional Data Engineer, or Azure Data Engineer

  • Proven track record of managing large-scale, compliant, and secure data environments supporting critical business operations

Professional Competencies

  • Strong analytical and troubleshooting skills for complex data workflows and security issues

  • Leadership qualities to guide junior team members and support best practices

  • Effective communication skills to engage stakeholders and document processes clearly

  • Adaptability to evolving cloud technologies, security standards, and industry regulations

  • Focus on operational security, data quality, and system reliability

SYNECHRON’S DIVERSITY & INCLUSION STATEMENT

Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality, diversity, and an environment that is respectful to all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants of all backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and abilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.

All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
