The Senior Product Support Engineer will design and optimize Hadoop and Spark-based systems, troubleshoot performance issues, and support migrations while working in a 24/7 environment.
Position Summary:
We are seeking a Product Specialist - Hadoop SME (Subject Matter Expert) responsible for designing, optimising, migrating, and scaling Hadoop and Spark-based data processing systems. The role requires hands-on experience with Hadoop and other core data operations, with a focus on building resilient, high-performance distributed data systems. You will collaborate with customer engineering teams to deliver high-throughput Hadoop and Spark applications, solve complex data challenges across migrations, upgrades, and reliability, and optimise post-migration system performance.
If you’re passionate about working in fast-paced, dynamic environments and want to be part of the cutting edge of data solutions, this role is for you.
We’re looking for someone who can:
- Design &amp; Optimisation: Design and optimise distributed Hadoop-based applications, ensuring low-latency, high-throughput performance for big data workloads.
- Troubleshooting: Provide expert-level support for data and performance issues in Hadoop and Spark jobs and clusters.
- Data Processing Expertise: Work extensively with large-scale data pipelines built on Hadoop and Spark's core components.
- Performance Tuning: Conduct deep-dive performance analysis, debugging, and optimisation of Impala and Spark jobs to reduce processing time and resource consumption.
- Cluster Management: Collaborate with DevOps and infrastructure teams to manage Impala and Spark clusters on platforms such as Hadoop/YARN, Kubernetes, or cloud services (AWS EMR, GCP Dataproc, etc.).
- Migration: Provide dedicated support to ensure the stability and reliability of the new ODP Hadoop environment during and after migration, promptly addressing evolving technical challenges and optimising system performance after the move to ODP.
- Shift Flexibility: This role requires flexibility to work rotational shifts, based on team coverage needs and customer demand. Candidates should be comfortable supporting operations in a 24/7 environment and willing to adjust their working hours accordingly.
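The performance-tuning work described above often starts at the job-submission level. As a minimal sketch only (the job name, paths, and cluster sizing below are hypothetical, not taken from this posting), resource and shuffle settings for a Spark job on YARN might look like:

```shell
# Hypothetical example: sizing executors and shuffle parallelism
# for a large aggregation job on a YARN-managed Hadoop cluster.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 20 \
  --executor-cores 4 \
  --executor-memory 8g \
  --conf spark.sql.shuffle.partitions=400 \
  --conf spark.sql.adaptive.enabled=true \
  etl_job.py --input hdfs:///data/raw --output hdfs:///data/curated
```

Tuning of this kind trades off executor count, per-executor memory, and shuffle partition counts against the workload's data volume and skew; adaptive query execution lets Spark adjust shuffle partitioning at runtime.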
Top Skills
AWS EMR
GCP Dataproc
Hadoop
Impala
Kubernetes
Spark