The Big Data Engineer will design and develop scalable data solutions, build data pipelines, ensure data quality, and collaborate across teams.
Müller’s Solutions is looking for an experienced Big Data Engineer to join our innovative team. In this role, you will design and develop scalable big data solutions that enable advanced analytics and insights across the organization. You will work with large datasets and use modern technologies to keep data processing efficient and reliable.
Key Responsibilities:
- Design, build, and maintain scalable data pipelines using big data technologies such as Apache Spark, Hadoop, and Kafka.
- Collaborate with data scientists, analysts, and IT teams to identify data requirements and deliver solutions that meet business needs.
- Ensure high data quality and integrity by implementing robust data validation and testing processes.
- Optimize data storage solutions for performance and cost-efficiency across cloud and on-premises environments.
- Monitor and troubleshoot data processing workflows to ensure timely data delivery.
- Document data architectures, processes, and workflows for knowledge sharing and compliance.
- Stay ahead of industry trends and best practices in big data technologies and methodologies.
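To illustrate the kind of data-quality work described above, here is a minimal sketch of a validation step that splits incoming records into accepted and rejected sets. The schema and rules are hypothetical, chosen only for illustration; a production pipeline would typically express similar checks in Spark or another framework listed in the requirements.

```python
from dataclasses import dataclass

# Hypothetical record schema, for illustration only.
@dataclass
class Event:
    user_id: str
    amount: float

def validate(records):
    """Apply simple quality rules, returning (valid, rejected) lists."""
    valid, rejected = [], []
    for r in records:
        # Rule 1: user_id must be non-empty.
        # Rule 2: amount must be non-negative.
        if r.user_id and r.amount >= 0:
            valid.append(r)
        else:
            rejected.append(r)
    return valid, rejected

good, bad = validate([
    Event("u1", 9.5),   # passes both rules
    Event("", 3.0),     # rejected: empty user_id
    Event("u2", -1.0),  # rejected: negative amount
])
```

Routing rejected records to a quarantine table rather than dropping them is a common design choice, since it preserves evidence for root-cause analysis.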
Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience as a Big Data Engineer or in a similar data engineering role.
- Proficiency in big data technologies such as Apache Spark, Hadoop, and Apache Kafka.
- Strong coding skills in programming languages like Java, Scala, or Python.
- Familiarity with data processing frameworks and ETL tools.
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and data storage solutions (e.g., HDFS, S3, BigQuery).
- Analytical mindset and excellent problem-solving abilities.
- Strong communication skills to collaborate effectively with technical and non-technical stakeholders.
- A self-motivated and proactive approach to work, with the ability to manage multiple tasks and deadlines.
Preferred Qualifications:
- Experience with machine learning frameworks and algorithms.
- Knowledge of data governance and security best practices.
- Familiarity with containerization technologies and orchestration tools like Docker and Kubernetes.
Benefits
- Attractive package.
- Family benefits.
- Visa.
- Air tickets.
Top Skills
Spark
AWS
Azure
BigQuery
Docker
GCP
Hadoop
HDFS
Java
Kafka
Kubernetes
Python
S3
Scala