
Mindera

Senior Data Engineer

India
Senior level

Description

We are looking for an experienced Data Engineer to join our dynamic team. The ideal candidate will have deep expertise in big data technologies, ETL/ELT workflows, and data modeling techniques. This role focuses on designing and optimizing data pipelines, maintaining data integrity, and supporting our analytics initiatives.

Requirements

We are seeking an experienced Data Engineer with 5 to 7+ years of relevant experience. The ideal candidate will have a strong background in big data technologies, ETL/ELT processes, and data modeling. This role focuses on building and refining data pipelines, ensuring data fidelity, and supporting our analytics efforts.

Key Responsibilities:

  • Design, develop, and maintain scalable ETL/ELT pipelines using PySpark and Databricks to support data ingestion and processing (see the sketch after this list).
  • Implement and enhance data streaming solutions for real-time data processing.
  • Improve Spark job performance by addressing memory management, partitioning strategies, and implementing efficient data storage formats.
  • Collaborate with data scientists and analysts to gather data requirements and provide reliable datasets for analysis.
  • Create and refine complex SQL queries for data extraction, transformation, and analysis.
  • Maintain data quality and integrity through automated testing and validation methods.
  • Document data workflows and maintain metadata for governance purposes.
  • Research and adopt new data engineering technologies and methods to enhance efficiency and scalability.
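
For illustration only, a minimal sketch of the kind of PySpark batch pipeline described above; the paths, column names, and output location are hypothetical placeholders, not part of any actual codebase for this role.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

    # Ingest raw JSON landed by an upstream process (hypothetical path).
    raw = spark.read.json("/mnt/raw/orders/2024-01-01/")

    # Basic cleansing and typing: drop malformed rows, normalize types, deduplicate.
    orders = (
        raw.dropna(subset=["order_id", "order_ts"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
           .dropDuplicates(["order_id"])
    )

    # Write partitioned Parquet for downstream analytics (a Delta table would be similar on Databricks).
    orders.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/orders/")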

Mandatory Skills:

  • PySpark: Proficient in using PySpark for data processing and ETL workflows.
  • Azure Databricks: Experience with the Databricks platform, including cluster setup and management.
  • Data Streaming: Knowledge of streaming data processing with frameworks such as Spark Streaming (see the sketch after this list).
  • Python: Strong programming skills in Python for scripting and automation tasks.
  • SQL: Advanced skills in SQL for querying and managing relational databases.
  • Spark Optimization: Experience in optimizing Spark applications for enhanced performance.
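
As a rough illustration of the streaming skill above, a minimal Spark Structured Streaming sketch assuming a Kafka source; the broker address, topic, and output paths are hypothetical, and the spark-sql-kafka connector is assumed to be available on the cluster.

    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.appName("events_stream").getOrCreate()

    event_schema = StructType([
        StructField("event_id", StringType()),
        StructField("user_id", StringType()),
        StructField("value", DoubleType()),
    ])

    # Read a Kafka topic as an unbounded streaming DataFrame and parse the JSON payload.
    events = (
        spark.readStream.format("kafka")
             .option("kafka.bootstrap.servers", "broker:9092")
             .option("subscribe", "events")
             .load()
             .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
             .select("e.*")
    )

    # Continuously append parsed events to Parquet; the checkpoint enables fault-tolerant recovery.
    query = (
        events.writeStream.format("parquet")
              .option("path", "/mnt/curated/events/")
              .option("checkpointLocation", "/mnt/checkpoints/events/")
              .outputMode("append")
              .start()
    )
    query.awaitTermination()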

Optional Skills:

  • Snowflake: Familiarity with Snowflake for data warehousing and query optimization.
  • Cloud Platforms: Understanding of cloud services (AWS, Azure, GCP) for data storage and processing.
  • ETL/ELT Concepts: Knowledge of ETL/ELT processes, data modeling, and data warehousing best practices.
  • Big Data Tools: Familiarity with tools and frameworks such as Kafka, Hadoop, and Hive.
  • CI/CD Practices: Understanding of CI/CD for automated deployment and version control using tools like Git, Jenkins, etc.

Benefits

We offer:
  • Flexible working hours (self-managed)
  • Competitive salary
  • Annual bonus, subject to company performance
  • Access to Udemy online training and opportunities to learn and grow within the role

At Mindera we use technology to build products we are proud of, with people we love.

Software Engineering Applications, including Web and Mobile, are at the core of what we do at Mindera.

We partner with our clients, to understand their products and deliver high-performance, resilient and scalable software systems that create an impact on their users and businesses across the world.

You get to work with a bunch of great people, and the whole team owns the project together.

Our culture reflects our lean, self-organising attitude.

We encourage our colleagues to take risks, make decisions, collaborate and talk to everyone to enhance communication. We are proud of our work and love to keep learning while navigating an Agile, Lean and collaborative environment.

Check out our Blog and our Handbook.

Our offices are located: Porto, Portugal | Aveiro, Portugal | Coimbra, Portugal | Leicester, UK | San Diego, USA | Chennai, India | Bengaluru, India
