
dentsu

Lead Data Engineer

Sorry, this job was removed at 06:18 p.m. (IST) on Sunday, Mar 23, 2025
6 Locations

The Lead GCP Data Engineer will design, implement, and maintain scalable, reliable, and efficient data solutions on Google Cloud Platform (GCP). The role focuses on enabling data-driven decision-making by developing ETL/ELT pipelines, managing large-scale datasets, and optimizing data workflows. The ideal candidate is a proactive problem-solver with strong technical expertise in GCP, a passion for data engineering, and a commitment to delivering high-quality solutions aligned with business needs.

Job Description:

Key Responsibilities:

Data Engineering & Development:

  • Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data.
  • Implement enterprise-level data solutions using GCP services such as BigQuery, Dataform, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer.
  • Develop and optimize data architectures that support real-time and batch data processing.
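To give a rough sense of the ETL/ELT work described above, here is a minimal sketch of an extract/transform/load flow in plain Python. All record fields, names, and thresholds are invented for illustration; in a real pipeline the extract step would read from Cloud Storage or Pub/Sub and the load step would write to BigQuery.

```python
from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class Event:
    user_id: str
    amount_cents: int


def extract(raw_rows: Iterable[dict]) -> List[dict]:
    # In production this would pull from Cloud Storage or a Pub/Sub
    # subscription; here we accept pre-fetched rows for illustration.
    return list(raw_rows)


def transform(rows: List[dict]) -> List[Event]:
    # Drop malformed records and normalise the amount field to cents.
    events = []
    for row in rows:
        if not row.get("user_id") or row.get("amount") is None:
            continue
        events.append(
            Event(user_id=row["user_id"],
                  amount_cents=round(float(row["amount"]) * 100))
        )
    return events


def load(events: List[Event]) -> int:
    # A real pipeline would stream these into BigQuery (e.g. via the
    # google-cloud-bigquery client); here we just report the row count.
    return len(events)


raw = [
    {"user_id": "u1", "amount": "12.50"},
    {"user_id": "", "amount": "3.00"},    # malformed: empty user_id
    {"user_id": "u2", "amount": None},    # malformed: missing amount
    {"user_id": "u3", "amount": "0.99"},
]
events = transform(extract(raw))
written = load(events)
print(written)  # 2
```

The same clean/normalise/write structure applies whether the pipeline runs as a Dataflow job, a Cloud Function, or a Composer task.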

Cloud Infrastructure Management:

  • Manage and deploy GCP infrastructure components to enable seamless data workflows.
  • Ensure data solutions are robust, scalable, and cost-effective, leveraging GCP best practices.

Collaboration and Stakeholder Engagement:

  • Work closely with cross-functional teams, including data analysts, data scientists, DevOps, and business stakeholders, to deliver data projects aligned with business goals.
  • Translate business requirements into scalable, technical solutions while collaborating with team members to ensure successful implementation.

Quality Assurance & Optimization:

  • Implement best practices for data governance, security, and privacy, ensuring compliance with organizational policies and regulations.
  • Conduct thorough quality assurance, including testing and validation, to ensure the accuracy and reliability of data pipelines.
  • Monitor and optimize pipeline performance to meet SLAs and minimize operational costs.
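As an illustration of the kind of validation gate mentioned above, the sketch below checks a batch for required fields before it would be published. The field names, thresholds, and message format are invented; real pipelines often express such checks in a dedicated framework instead.

```python
from typing import List


def validate_batch(rows: List[dict], required_fields: List[str],
                   max_null_rate: float = 0.0) -> List[str]:
    """Return a list of human-readable QA failures for a batch of rows."""
    failures = []
    if not rows:
        failures.append("batch is empty")
        return failures
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) in (None, ""))
        rate = nulls / len(rows)
        if rate > max_null_rate:
            failures.append(
                f"{field}: null rate {rate:.0%} exceeds {max_null_rate:.0%}"
            )
    return failures


batch = [{"id": 1, "ts": "2025-01-01"}, {"id": 2, "ts": None}]
issues = validate_batch(batch, required_fields=["id", "ts"])
print(issues)  # ['ts: null rate 50% exceeds 0%']
```

A pipeline would typically fail fast (or alert) when the returned list is non-empty, rather than publishing a partially valid table.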

Qualifications and Certifications:

  • Education:
    • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • Experience:
    • Minimum of 5 years of experience in data engineering, with at least 3 years working on GCP cloud platforms.
    • Proven experience designing and implementing data workflows using GCP services like BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer.
  • Certifications:
    • Google Cloud Professional Data Engineer certification preferred.

Key Skills:

  • Mandatory Skills:
    • Advanced proficiency in Python for data pipelines and automation.
    • Strong SQL skills for querying, transforming, and analyzing large datasets.
    • Expertise in GCP services such as BigQuery, Dataform, Cloud Functions, Cloud Storage, Dataflow, and Kubernetes (GKE).
    • Hands-on experience with CI/CD tools like Jenkins, Git, or Bitbucket for automation, deployment, and version control.
    • Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer.
  • Nice-to-Have Skills:
    • Experience with other cloud platforms like AWS or Azure.
    • Knowledge of data visualization tools (e.g., Looker, Tableau).
    • Understanding of machine learning workflows and their integration with data pipelines.
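To illustrate the SQL skills listed above, here is a small aggregation in the group/filter/order pattern common in analytics work. It uses SQLite purely so the example is self-contained; BigQuery has its own SQL dialect and client, and the table and column names here are invented.

```python
import sqlite3

# In-memory database stands in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("u1", 10.0), ("u1", 5.0), ("u2", 7.5)],
)

# Typical transform: per-user totals, filtered and ordered.
rows = conn.execute("""
    SELECT user_id, SUM(amount) AS total
    FROM orders
    GROUP BY user_id
    HAVING total > 6
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('u1', 15.0), ('u2', 7.5)]
```

The same query shape (aggregate, filter on the aggregate, order) translates directly to BigQuery Standard SQL.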

Soft Skills:

  • Strong problem-solving and critical-thinking abilities.
  • Excellent communication skills to collaborate with technical and non-technical stakeholders.
  • Proactive attitude towards innovation and learning.
  • Ability to work independently and as part of a collaborative team.

Location:

Bengaluru

Brand:

Merkle

Time Type:

Full time

Contract Type:

Permanent


