Zuru

Data Engineer

Posted 7 Hours Ago
Ahmedabad, Gujarat
Mid level
As a Data Engineer at Zuru Tech, you will develop and maintain complex data pipelines and ETL processes, ensuring the integration of data across various cloud platforms. Responsibilities include building scalable data transformation frameworks, working on data migration to cloud warehouses, and collaborating with technical teams to enhance data architecture and resolve pipeline issues.

About Us

Zuru Tech is digitalizing the construction process of buildings all around the world. We have a multi-national team developing the world’s first digital building fabrication platform: you design it, we build it!

We at ZURU develop the Zuru Home app, BIM software meant for the general public, architects, and engineers. From here, anyone can buy, design, and send to manufacturing any type of building with complete design freedom. Welcome to the future!

What are you Going to do? 


📌Be responsible for developing and maintaining complex data pipelines, ETL, data models, and standards for various data integration and data warehousing projects, from sources to sinks, on SaaS/PaaS platforms such as Snowflake, Databricks, Azure Synapse, Redshift, BigQuery, etc.

📌Develop scalable, secure, and optimized data transformation pipelines and integrate them with downstream sinks.

📌Support technical solutions from a data flow design and architecture perspective, ensure the right direction, and propose resolutions to potential data pipeline problems.

📌Participate in developing proofs of concept (PoCs) of key technology components for project stakeholders.

📌Develop scalable and reusable frameworks for ingesting geospatial datasets.

📌Develop connectors to extract data from sources and use event/streaming services to persist and process data into sinks.

📌Collaborate with other project team members (Architects, Senior Data Engineers) to support the delivery of additional project components (such as API interfaces, search, and visualization).

📌Participate in evaluating data integration tools on the market and creating proofs of value (PoVs) around their performance against customer requirements.

📌Work within an Agile delivery/DevOps methodology to deliver proofs of concept and production implementations in iterative sprints.

What are we Looking for?


✔ Must have 3+ years of experience as a Data Engineer on cloud transformation projects involving data management solutions such as AWS, Azure, GCP, Snowflake, or Databricks.

✔ Must have hands-on experience in at least one end-to-end implementation of a data lake/warehouse project using PaaS and SaaS offerings such as Snowflake, Databricks, Redshift, Synapse, or BigQuery, plus 6+ months of on-premise data warehouse/data lake implementation experience.

✔ Experience in data pipeline development and ETL/ELT, implementing complex stored procedures, and standard DWH and ETL concepts.

✔ Experience in setting up resource monitors, RBAC controls, warehouse sizing, query performance tuning, IAM policies, and cloud networking (VPC, Virtual Network, etc.).

✔ Experience in data migration from on-premise RDBMSs to cloud data warehouses.

✔ Good understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modeling).

✔ Hands-on experience in Python, PySpark, and programming for data integration projects.

✔ Good experience in cloud data storage (Blob, S3, object stores, MinIO), data pipeline services, data integration services, and data visualization.

✔ Ability to help resolve a wide range of complex data pipeline problems, both proactively and as issues surface.


Required Skills


✔ AWS / Azure / GCP (any one of the three) native data engineering/architecture

✔ Apache Kafka, ELK, Grafana

✔ Snowflake / Databricks / Redshift

✔ Hadoop


 What do we Offer? 

💰 Competitive compensation

💰 Annual Performance Bonus

⌛️ 5 Working Days with Flexible Working Hours

🌎 Annual trips & Team outings

🚑 Medical Insurance for self & family

🚩 Training & skill development programs

🤘🏼 Work with a global team and make the most of its diverse knowledge

🍕 Several discussions over multiple pizza parties


A lot more! Come and discover us! 

Top Skills

PySpark
Python
