Job Description:
The Lead GCP Data Engineer will design, implement, and maintain scalable, reliable, and efficient data solutions on Google Cloud Platform (GCP). The role focuses on enabling data-driven decision-making by developing ETL/ELT pipelines, managing large-scale datasets, and optimizing data workflows. The ideal candidate is a proactive problem-solver with strong technical expertise in GCP, a passion for data engineering, and a commitment to delivering high-quality solutions aligned with business needs.
Key Responsibilities:
Data Engineering & Development:
- Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data.
- Implement enterprise-level data solutions using GCP services such as BigQuery, Dataform, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer.
- Develop and optimize data architectures that support both real-time and batch data processing (a minimal pipeline sketch follows this list for illustration).
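For illustration only, not a requirement of this posting: the sketch below shows the kind of streaming pipeline these responsibilities describe, a minimal Apache Beam job, runnable on Dataflow, that reads JSON events from Cloud Pub/Sub and appends them to a BigQuery table. The project, topic, bucket, table, and schema names are hypothetical placeholders.

```python
# A minimal, illustrative streaming pipeline: Cloud Pub/Sub -> BigQuery.
# All resource names (project, topic, bucket, table) are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        project="my-gcp-project",            # hypothetical project ID
        region="asia-south1",
        temp_location="gs://my-bucket/tmp",  # hypothetical staging bucket
        runner="DataflowRunner",
        streaming=True,
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-gcp-project:analytics.events",
                schema="event_id:STRING,user_id:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline shape covers batch processing: swapping the Pub/Sub source for a bounded input such as files in Cloud Storage, and disabling streaming, turns it into a batch job.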
Cloud Infrastructure Management:
- Manage and deploy GCP infrastructure components to enable seamless data workflows.
- Ensure data solutions are robust, scalable, and cost-effective, leveraging GCP best practices.
Collaboration and Stakeholder Engagement:
- Work closely with cross-functional teams, including data analysts, data scientists, DevOps, and business stakeholders, to deliver data projects aligned with business goals.
- Translate business requirements into scalable technical solutions, collaborating with team members to ensure successful implementation.
Quality Assurance & Optimization:
- Implement best practices for data governance, security, and privacy, ensuring compliance with organizational policies and regulations.
- Conduct thorough quality assurance, including testing and validation, to ensure the accuracy and reliability of data pipelines (a minimal test sketch follows this list for illustration).
- Monitor and optimize pipeline performance to meet SLAs and minimize operational costs.
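As a sketch of the testing and validation mentioned above, the hypothetical pytest example below unit-tests a single record-parsing step of a pipeline; parse_event is an invented stand-in for a real pipeline step, not a library function.

```python
# A minimal, illustrative pipeline validation test using pytest.
# parse_event is a hypothetical stand-in for one pipeline transform.
import json

import pytest


def parse_event(raw: bytes) -> dict:
    """Hypothetical pipeline step: decode and validate one message."""
    event = json.loads(raw.decode("utf-8"))
    if "event_id" not in event:
        raise ValueError("event_id is required")
    return event


def test_parse_event_accepts_valid_record():
    raw = b'{"event_id": "e-1", "amount": 42}'
    assert parse_event(raw)["event_id"] == "e-1"


def test_parse_event_rejects_missing_id():
    with pytest.raises(ValueError):
        parse_event(b'{"amount": 42}')
```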
Qualifications and Certifications:
Education:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
Experience:
- Minimum of 5 years of experience in data engineering, including at least 3 years on Google Cloud Platform (GCP).
- Proven experience designing and implementing data workflows using GCP services such as BigQuery, Dataform, Dataflow, Cloud Pub/Sub, and Cloud Composer.
Certifications:
- Google Cloud Professional Data Engineer certification preferred.
Key Skills:
Mandatory Skills:
- Advanced proficiency in Python for data pipelines and automation.
- Strong SQL skills for querying, transforming, and analyzing large datasets.
- Expertise in GCP services such as BigQuery, Dataform, Cloud Functions, Cloud Storage, Dataflow, and Google Kubernetes Engine (GKE).
- Hands-on experience with CI/CD and version-control tools such as Jenkins, Git, or Bitbucket for automated deployment.
- Familiarity with workflow orchestration tools such as Apache Airflow or Cloud Composer (a minimal DAG sketch follows this list for illustration).
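For context, the sketch below shows a minimal orchestration workflow of the kind Cloud Composer runs: an Airflow DAG (assuming Airflow 2.4+, where the schedule argument is available) that executes one BigQuery job daily via BigQueryInsertJobOperator from the Google provider package. The DAG, dataset, and table names are hypothetical.

```python
# A minimal, illustrative Cloud Composer workflow: an Airflow DAG that runs
# one BigQuery job daily. DAG, dataset, and table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_rollup",   # hypothetical DAG name
    schedule="0 2 * * *",          # run daily at 02:00 (Airflow 2.4+ syntax)
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    rebuild_rollup = BigQueryInsertJobOperator(
        task_id="rebuild_daily_rollup",
        configuration={
            "query": {
                # Hypothetical rollup: rebuild a daily aggregate table.
                "query": (
                    "CREATE OR REPLACE TABLE analytics.daily_sales AS "
                    "SELECT order_date, SUM(amount) AS total_amount "
                    "FROM analytics.orders GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
    )
```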
Nice-to-Have Skills:
- Experience with other cloud platforms like AWS or Azure.
- Knowledge of data visualization tools (e.g., Looker, Tableau).
- Understanding of machine learning workflows and their integration with data pipelines.
Soft Skills:
- Strong problem-solving and critical-thinking abilities.
- Excellent communication skills to collaborate with technical and non-technical stakeholders.
- Proactive attitude towards innovation and learning.
- Ability to work independently and as part of a collaborative team.
Location: Bengaluru
Brand: Merkle
Time Type: Full time
Contract Type: Permanent