
Delta Exchange

Senior Data Engineer

Posted 2 Days Ago
Remote
Hiring Remotely in India
Senior level
As a Senior Data Engineer, you will manage the ETL lifecycle, build batch pipelines, transform data using SQL and Python, and ensure data quality and performance for analytics.

About the Company

At Delta, we are reimagining and rebuilding the financial system. Join our team to make a positive impact on the future of finance.

🎯 Mission Driven: Re-imagine and rebuild the future of finance.

💡 Most innovative cryptocurrency derivatives exchange, with a daily traded volume of ~$3.5 billion and growing. Delta is bigger than all Indian crypto exchanges combined.

📈 We offer the widest range of derivative products and have been serving traders across the globe since 2018, growing fast.

💪🏻 The founding team comprises IIT and ISB graduates. The business co-founders previously worked at Citibank, UBS and GIC; the tech co-founder is a serial entrepreneur who previously co-founded TinyOwl and Housing.com.

💰 Funded by top crypto funds (Sino Global Capital, CoinFund, Gumi Cryptos) and crypto projects (Aave and Kyber Network).

Role Summary:

Support our analytics team by owning the full ETL lifecycle—from master data to analytics-ready datasets. You will build and maintain daily batch pipelines that process 1–10 million master-data rows per run (and scale up to tens or hundreds of millions of rows), all within sub-hourly SLAs. Extract from OLTP and time-series sources, apply SQL/stored-procedure logic or Python transformations, then load into partitioned, indexed analytics tables. Reads run exclusively on read-only replicas to guarantee zero impact on the production master DB. You’ll also implement monitoring, alerting, retries, and robust error handling to ensure near-real-time dashboard refreshes.
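For illustration only, here is a minimal sketch of what one such daily batch run might look like, assuming a PostgreSQL read-only replica, a hypothetical `trades` source table, and a hypothetical `analytics.daily_trade_summary` target. Connection strings, table names, and column names are placeholders, not details from this posting.

```python
# Minimal daily batch ETL sketch: extract from a read-only replica,
# transform in pandas, load into an analytics table.
# All connection strings, table and column names below are hypothetical.
from datetime import date, timedelta

import pandas as pd
from sqlalchemy import create_engine, text

# Reads go to the replica only, so the production master DB is never touched.
replica = create_engine("postgresql+psycopg2://etl_ro@replica-host:5432/master")
analytics = create_engine("postgresql+psycopg2://etl_rw@analytics-host:5432/analytics")

def run_daily_batch(run_date: date) -> None:
    day_start = run_date
    day_end = run_date + timedelta(days=1)

    # Extract: pull one day's slice of master data from the replica.
    df = pd.read_sql(
        text("SELECT user_id, symbol, qty, price, created_at "
             "FROM trades WHERE created_at >= :lo AND created_at < :hi"),
        replica,
        params={"lo": day_start, "hi": day_end},
    )

    # Transform: aggregate to an analytics-ready grain.
    summary = (
        df.assign(notional=df["qty"] * df["price"])
          .groupby(["user_id", "symbol"], as_index=False)
          .agg(trades=("qty", "size"), volume=("notional", "sum"))
    )
    summary["trade_date"] = run_date

    # Load: append into a (presumed) partitioned, indexed analytics table.
    summary.to_sql("daily_trade_summary", analytics, schema="analytics",
                   if_exists="append", index=False, method="multi")

if __name__ == "__main__":
    run_daily_batch(date.today() - timedelta(days=1))
```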


Requirements

Required Skills & Experience: 

* 4+ years in data engineering or analytics roles, building daily batch ETL pipelines at 1–10 M rows/run scale (and up to 100 M+).

* Expert SQL skills, including stored procedures and query optimisation on PostgreSQL, MySQL, or similar RDBMS.

* Proficient in Python for data transformation (pandas, NumPy, SQLAlchemy, psycopg2).

* Hands-on with CDC/incremental load patterns and batch schedulers (Airflow, cron); see the sketch after this list.

* Deep understanding of replicas, partitioning, and indexing strategies.

* Strong computer-science fundamentals and deep knowledge of database internals—including storage engines, indexing mechanisms, query execution plans and optimisers for MySQL and time-series DBs like TimescaleDB.

* Experience setting up monitoring and alerting (Prometheus, Grafana, etc.).
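As a rough sketch of the incremental-load and scheduling patterns listed above: the snippet below uses a simple high-water-mark query (a simplified stand-in for full CDC, ignoring updates and deletes) and wraps it in a minimal Airflow DAG. The DAG id, connection strings, and table names are hypothetical.

```python
# Sketch: watermark-based incremental load scheduled with Airflow.
# DAG id, connection strings, and table names are hypothetical placeholders.
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator
from sqlalchemy import create_engine, text

replica = create_engine("postgresql+psycopg2://etl_ro@replica-host:5432/master")
analytics = create_engine("postgresql+psycopg2://etl_rw@analytics-host:5432/analytics")

def incremental_load(**_):
    # Watermark: the newest source timestamp already loaded into analytics.
    with analytics.connect() as conn:
        watermark = conn.execute(
            text("SELECT COALESCE(MAX(updated_at), '1970-01-01') "
                 "FROM analytics.orders_fact")
        ).scalar()

    # Pull only rows changed since the watermark (incremental slice, not a full scan).
    changed = pd.read_sql(
        text("SELECT * FROM orders WHERE updated_at > :wm ORDER BY updated_at"),
        replica,
        params={"wm": watermark},
    )
    if not changed.empty:
        changed.to_sql("orders_fact", analytics, schema="analytics",
                       if_exists="append", index=False, method="multi")

with DAG(
    dag_id="orders_incremental_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # nightly batch window
    catchup=False,
) as dag:
    PythonOperator(task_id="incremental_load", python_callable=incremental_load)
```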

Key Responsibilities:

1. Nightly Batch Jobs: Schedule and execute ETL runs.

2. In-Database Transformations: Write optimised SQL and stored procedures.

3. Python Orchestration: Develop Python scripts for more complex analytics transformations.

4. Data Loading & Modelling: Load cleansed data into partitioned, indexed analytics schemas designed for fast querying.

5. Performance SLAs: Deliver end-to-end sub-hourly runtimes.

6. Monitoring & Resilience: Implement pipeline health checks, metrics, alerting, automatic retries, and robust error handling; see the sketch after this list.

7. Stakeholder Collaboration: Work closely with analysts to validate data quality and ensure timely delivery of analytics-ready datasets.
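As one possible shape for the monitoring and resilience responsibility above, the sketch below wraps an ETL step in bounded retries with backoff and pushes a success/failure metric to a Prometheus Pushgateway that Grafana alerts could watch. The gateway address, job name, and wrapped step are assumptions, not part of the posting.

```python
# Sketch: bounded retries with backoff, plus a success/failure metric pushed
# to a Prometheus Pushgateway (hypothetical address and job name).
import logging
import time

from prometheus_client import CollectorRegistry, Gauge, push_to_gateway

log = logging.getLogger("etl")

def run_with_retries(step, max_attempts=3, backoff_seconds=60):
    registry = CollectorRegistry()
    ok = Gauge("etl_last_run_success", "1 if the last ETL run succeeded", registry=registry)
    ts = Gauge("etl_last_run_timestamp", "Unix time of the last ETL attempt", registry=registry)

    for attempt in range(1, max_attempts + 1):
        ts.set_to_current_time()
        try:
            step()                      # e.g. run_daily_batch(...) from the earlier sketch
            ok.set(1)
            break
        except Exception:
            log.exception("ETL attempt %d/%d failed", attempt, max_attempts)
            ok.set(0)
            if attempt < max_attempts:
                time.sleep(backoff_seconds * attempt)   # linear backoff between retries
    push_to_gateway("pushgateway:9091", job="daily_etl", registry=registry)
```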

Top Skills

Airflow
Grafana
MySQL
Postgres
Prometheus
Python
SQL
TimescaleDB

