
Terrascope

Data Engineer

Sorry, this job was removed at 04:16 a.m. (IST) on Thursday, May 22, 2025
Hybrid
3 Locations

Empowering enterprises to keep the planet habitable for all, Terrascope aspires to be the easiest carbon measurement and decarbonization platform for companies in the land, nature, and net-zero economy sectors.

Terrascope is a leading decarbonisation software platform designed specifically for the Land, Nature (LAN), and Net-Zero Economy (NZE) sectors. As the easiest-to-use platform for these sectors, it blends deep industry expertise with advanced climate science, data science, and machine learning, enabling companies to effectively manage emissions across their supply chains.

Our integrated platform offers solutions for Product and Corporate Carbon Footprinting, addressing Scope 3 and land-based emissions, SBTi FLAG & GHG Protocol LSR reporting, and supporting enterprise decarbonisation goals.

Publicly launched in June 2022, Terrascope works with customers across sectors, from agriculture, food & beverages, manufacturing, retail and luxury, to transportation, real estate, and TMT.

Terrascope is globally headquartered in Singapore and operates in major markets across APAC, North America, and EMEA. It is a partner of the Monetary Authority of Singapore’s ESG Impact Hub, a CDP Gold Accredited software provider, and a signatory of The Climate Pledge to achieve Net Zero by 2040, and has been independently assured by Ernst & Young.


We are seeking a Senior Data Engineer to design, build, and optimize our data infrastructure, ensuring scalable and reliable data pipelines for ingestion, transformation, and analytics.

This role blends Data Engineering (building robust data pipelines, optimizing data models, and managing infrastructure) with Analytics Engineering (shaping business-ready datasets, supporting BI tools, and guiding the organization’s data modeling strategy).

This role is ideal for candidates who thrive in a startup environment, are passionate about data architecture and analytics, and are eager to solve real-world sustainability challenges.

Key Responsibilities: Data Engineering

  • Build and optimize scalable data pipelines for ingestion, transformation, and storage.
  • Work with structured and unstructured data, handling diverse file formats such as CSV, Excel, JSON, and PDFs.
  • Extract, parse, and integrate emission factor databases (EFDBs) from external sources, ensuring they are structured for efficient ingestion, retrieval, and analysis.
  • Use orchestration tools such as Apache Airflow, Dagster, or equivalent to manage and automate data workflows (a minimal sketch follows this list).
  • Ensure data integrity, consistency, and governance while working with high-volume datasets.
  • Develop and optimize MongoDB and SQL queries for high-performance data access.
  • Manage and deploy data infrastructure on cloud providers such as AWS.
  • Consolidate data models written in Node.js and Python across different systems, ensuring a single source of truth for internal applications.
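
To make the orchestration bullet above concrete, here is a minimal sketch of a daily Airflow DAG that ingests a raw emission factor CSV, normalizes it with pandas, and loads it into PostgreSQL. The file paths, table name, and connection string are illustrative assumptions, not a description of Terrascope’s actual pipelines.

```python
# Minimal sketch; paths, table name, and DSN are hypothetical.
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task
from sqlalchemy import create_engine


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False, tags=["emission-factors"])
def emission_factor_ingestion():
    @task
    def extract() -> str:
        # Hypothetical landing path for a raw emission factor database (EFDB) export.
        return "/data/raw/emission_factors.csv"

    @task
    def transform(raw_path: str) -> str:
        df = pd.read_csv(raw_path)
        # Normalize column names and drop rows with no factor value.
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        df = df.dropna(subset=["factor_value"])
        staged_path = "/data/staged/emission_factors.parquet"
        df.to_parquet(staged_path, index=False)
        return staged_path

    @task
    def load(staged_path: str) -> None:
        df = pd.read_parquet(staged_path)
        # In practice the DSN would come from an Airflow connection, not a literal.
        engine = create_engine("postgresql+psycopg2://user:pass@warehouse:5432/analytics")
        df.to_sql("emission_factors", engine, if_exists="replace", index=False)

    load(transform(extract()))


emission_factor_ingestion()
```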

Analytics Engineering

  • Design and implement robust data models to support analytics, business intelligence (BI), and data science.
  • Advise on data modeling strategies to optimize performance, maintainability, and scalability.
  • Enable BI reporting and self-service analytics by preparing analytics-ready datasets.
  • Work with BI tools such as Tableau, GoodData, Power BI, Looker, Metabase, or equivalent to build visualizations and dashboards.
  • Optimize query performance, materialized views, and aggregation strategies for efficient reporting (see the sketch after this list).
  • Collaborate closely with data scientists, Product, Implementations, and Sales teams to provide actionable insights.
  • Ensure that datasets are properly indexed and structured for fast and efficient access.
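
One way to read the materialized-view bullet above: pre-aggregate the heaviest reporting query once, then let BI tools hit the small result. The sketch below creates and refreshes a PostgreSQL materialized view of monthly emissions per supplier; the table, column, and connection names are hypothetical.

```python
# Minimal sketch; table/column names and the DSN are hypothetical.
import psycopg2

CREATE_VIEW_SQL = """
CREATE MATERIALIZED VIEW IF NOT EXISTS supplier_monthly_emissions AS
SELECT
    supplier_id,
    date_trunc('month', activity_date) AS month,
    SUM(co2e_kg) AS total_co2e_kg
FROM supply_chain_emissions
GROUP BY supplier_id, date_trunc('month', activity_date);
"""

# An index on the view keeps dashboard filters on supplier and month fast.
CREATE_INDEX_SQL = """
CREATE INDEX IF NOT EXISTS idx_supplier_month
ON supplier_monthly_emissions (supplier_id, month);
"""

REFRESH_SQL = "REFRESH MATERIALIZED VIEW supplier_monthly_emissions;"


def refresh_reporting_view(dsn: str = "postgresql://user:pass@warehouse:5432/analytics") -> None:
    """Create the view on first run, then refresh it so dashboards read current data."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(CREATE_VIEW_SQL)
        cur.execute(CREATE_INDEX_SQL)
        cur.execute(REFRESH_SQL)
```

A scheduled refresh (for example, a task in the Airflow DAG sketched earlier) keeps the view current without every dashboard query scanning the raw fact table.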

What We Are Looking For

  • 5 to 8 years of experience in Data Engineering and/or Analytics Engineering roles.
  • Strong knowledge of Python (including the pandas library) and/or Node.js, plus SQL.
  • Proven experience with SQL (PostgreSQL) and/or NoSQL databases (MongoDB or AWS DocumentDB), including indexing, partitioning, and query optimization.
  • Experience with data modeling (designing database schemas, tables, entity-relationship diagrams, etc.).
  • Strong data structures and algorithms knowledge and understanding of time and space complexity.
  • Experience working with scheduler tools like Apache Airflow, Dagster, or similar frameworks.
  • Ability to parse and process unstructured data, including PDFs, Excel, and other file formats (a minimal sketch follows this list).
  • Experience working with large-scale data systems and optimizing query performance.
  • Hands-on experience with AWS, GCP, or Azure for data storage, compute, databases, and security.
  • Knowledge of DevOps and Infrastructure-as-Code practices (Terraform, GitOps, Kubernetes, and CI/CD tools such as GitHub Actions and Argo CD).
  • Passion or willingness to learn about sustainability and carbon emissions data.
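
As an illustration of the unstructured-data requirement above, here is a minimal sketch that normalizes CSV and Excel emission factor files into a single schema with pandas. The expected source column names and the canonical schema are assumptions made for the example.

```python
# Minimal sketch; the source column names and canonical schema are hypothetical.
from pathlib import Path

import pandas as pd

# Map of recognized source headers (lower-cased) to the canonical schema.
CANONICAL_COLUMNS = {"activity": "activity", "unit": "unit", "co2e per unit": "co2e_per_unit"}


def load_emission_factors(path: str | Path) -> pd.DataFrame:
    """Read a CSV or Excel emission factor file and return it in the canonical schema."""
    path = Path(path)
    suffix = path.suffix.lower()
    if suffix == ".csv":
        df = pd.read_csv(path)
    elif suffix in {".xls", ".xlsx"}:
        df = pd.read_excel(path)  # .xlsx support needs the openpyxl extra installed
    else:
        raise ValueError(f"Unsupported file format: {suffix}")

    # Normalize headers, keep only recognized columns, and rename to canonical names.
    df.columns = [c.strip().lower() for c in df.columns]
    df = df[[c for c in CANONICAL_COLUMNS if c in df.columns]].rename(columns=CANONICAL_COLUMNS)

    # Coerce the factor column to numeric and drop rows that cannot be parsed.
    df["co2e_per_unit"] = pd.to_numeric(df["co2e_per_unit"], errors="coerce")
    return df.dropna(subset=["co2e_per_unit"])
```

PDF sources typically need an additional table-extraction step before the same normalization can apply; that step is omitted here.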

Nice to Have

  • Experience with other NoSQL and SQL databases beyond MongoDB.
  • Hands-on experience with streaming data architectures such as Kafka.
  • Exposure to distributed computing frameworks such as Spark.
  • Knowledge and practical experience in AWS EC2, S3, Glue, IAM and CloudWatch.
  • Experience in data security, governance, and compliance best practices.
  • Background in carbon accounting methodologies or sustainability-related data processing.
  • Experience working in a SaaS-product company and/or startups, and comfortable with change and ambiguity.

Your Privacy and Fairness in Our Recruitment Process
We are committed to protecting your data, ensuring fairness, and adhering to workplace fairness principles in our recruitment process. To enhance hiring efficiency and minimize bias, we use AI-powered tools to assist with tasks such as resume screening and candidate matching. These tools are designed and deployed in compliance with internationally recognized AI governance frameworks. We continuously validate our systems to uphold fairness, transparency, and accountability, ensuring they do not result in unfair or discriminatory outcomes. Your personal data is handled securely and transparently, and final hiring decisions are made by our recruitment team to ensure a human-centered approach. If you have questions about how your data is processed or wish to report concerns about fairness, please contact us at [email protected].

We're committed to creating an inclusive environment for our strong and diverse team. We value diversity and foster a community where everyone can be their authentic self.

