
Dario

Data Engineer

Posted 17 Days Ago
IN
Mid level

Description

At Dario, Every Day is a New Opportunity to Make a Difference.

We are on a mission to make better health easy. Every day our employees contribute to this mission and help hundreds of thousands of people around the globe improve their health. How cool is that? We are looking for passionate, smart, and collaborative people who have a desire to do something meaningful and impactful in their career.

We are seeking a highly skilled and experienced Data Engineer to join our team. In this role, you will be responsible for designing, developing, and maintaining our data infrastructure, ensuring the availability, scalability, and reliability of our data pipelines and systems. You will collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to support data-driven decision making and enable advanced analytics. The ideal candidate is a proficient problem solver with a strong background in data engineering, ETL processes, data warehousing, and cloud technologies.

Responsibilities 

  • Develop and maintain DarioHealth's data infrastructure.
  • Optimize and enhance existing data infrastructure to improve performance, efficiency, and reliability. 
  • Define and build future data infrastructure for both big-data and near-real-time workloads.
  • Collaborate with R&D, analysts, and stakeholders to understand data requirements, implement data models, and support analytical initiatives. 
  • Develop and maintain documentation, including data dictionaries, technical specifications, and standard operating procedures. 
  • Work in an Agile methodology, focusing on delivery and quality.
  • Stay up to date with emerging trends and advancements in data engineering and relevant technologies. 



Requirements

  • 3+ years of hands-on data engineering experience.
  • Solid understanding of data warehousing concepts, dimensional modeling, and schema design.
  • At least 2 years of proven experience with Python.
  • Experience developing data pipelines with Airflow or a similar orchestration tool.
  • Experience with multiple database engines (relational, key-value, big data, etc.).
  • Experience with Agile-based development.
  • Proven experience with Git and CI/CD.
  • Experience with cloud-based data technologies, preferably AWS.
  • Experience with Snowflake.
  • Experience establishing data quality and testing standards.
  • B.Tech/B.Sc. in industrial/information systems engineering, computer science, statistics, or equivalent experience.

Preferred Qualifications: 

  • Experience with NoSQL databases such as MongoDB or Redis.
  • Experience with dbt.
  • Experience with real-time pipelines.
  • Experience with Spark, Kafka, or BigQuery.
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Experience with data governance, data privacy, and security practices.

DarioHealth promotes diversity of thought, culture and background, which connects the entire Dario team. We believe that every member of our team enriches our diversity by exposing us to a broad range of ways to understand and engage with the world, identify challenges, and discover, design and deliver solutions. We are passionate about building and sustaining inclusive and equitable working and learning environments for all people, and do not discriminate against any employee or job candidate.


Top Skills

Airflow
AWS
BigQuery
CI/CD
dbt
Git
Kafka
NoSQL
Power BI
Python
Snowflake
Spark
Tableau

Dario Gurugram, Haryana, IND Office

Gurugram, Haryana, India, 122001
