
Astreya

Data Engineer

Posted 13 Days Ago
Remote
Hiring Remotely in India
Senior level
The Data Engineer will build and maintain ETL/ELT pipelines, implement automated data quality checks, optimize data storage, and create interactive visualizations. They will manage CI/CD pipelines and deliver production-ready data solutions.
  • Years of experience: 8 to 10 years

Focus: Reliability, Automation, and Data Quality. The Engineer turns the architect's blueprint into a working, automated reality.

Key Responsibilities:

  • Build and maintain ETL/ELT pipelines to move data from source to storage.
  • Implement automated data quality testing and observability alerts.
  • Optimize data storage formats (e.g., Parquet, Delta Lake) for high-speed querying.
  • Manage CI/CD pipelines for data code deployment.
  • Create interactive visualizations with actionable insights by persona.

Deliverables: Production-ready, optimized data pipelines; cleaned datasets; interactive and intelligent dashboards.
Tech Stack (AWS):

  • Data Ingestion & Processing: AWS API Gateway, AWS Lambda, AWS Glue ETL, AWS Glue Crawler
  • Data Storage & Analytics: Amazon S3, Amazon Redshift (Data Warehouse), Amazon Athena
  • Governance & Security: AWS Lake Formation, AWS IAM, AWS CloudTrail, Amazon CloudWatch
  • AI & Analytics: AWS Bedrock
  • Visualization: Amazon QuickSight
  • Visa Requirement: A USA B1 business visa with a minimum validity of 2 years is preferred.
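As a flavor of the "automated data quality testing" responsibility above, here is a minimal sketch of a batch-level quality gate in plain Python. The record shape and field names (`id`, `amount`) are hypothetical, purely for illustration; in practice these checks would run inside a Glue job or a dedicated framework.

```python
def check_batch(rows):
    """Return a list of human-readable quality violations for a batch of records.

    Hypothetical checks: completeness (required fields non-null), uniqueness
    (no repeated primary key within the batch), and validity (non-negative amounts).
    """
    violations = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Completeness: required fields must be present and non-null.
        for field in ("id", "amount"):
            if row.get(field) is None:
                violations.append(f"row {i}: missing {field}")
        # Uniqueness: the primary key must not repeat within the batch.
        row_id = row.get("id")
        if row_id is not None and row_id in seen_ids:
            violations.append(f"row {i}: duplicate id {row_id}")
        seen_ids.add(row_id)
        # Validity: amounts must be non-negative.
        amount = row.get("amount")
        if isinstance(amount, (int, float)) and amount < 0:
            violations.append(f"row {i}: negative amount")
    return violations
```

An observability alert, as the posting describes, would then fire (e.g., via CloudWatch) whenever this list is non-empty, instead of letting bad records flow downstream.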

Top Skills

Amazon Athena
Amazon CloudWatch
Amazon QuickSight
Amazon Redshift
Amazon S3
AWS API Gateway
AWS Bedrock
AWS CloudTrail
AWS Glue Crawler
AWS Glue ETL
AWS IAM
AWS Lake Formation
AWS Lambda

Similar Jobs

8 Hours Ago
Remote or Hybrid
India
Mid level
Fintech • Information Technology • Insurance • Financial Services • Big Data Analytics
You will build and implement data ingestion processes using big data tools and collaborate with a team on machine learning projects. Responsibilities include performance monitoring and defining data security principles.
Top Skills: Databricks, Delta Lake, Flume, HDFS, Hive, Kafka, Python, Scala, Spark, SQL
12 Days Ago
Remote or Hybrid
Expert/Leader
Cloud • Fintech • Information Technology • Machine Learning • Software • App development • Generative AI
Lead architecture, design, development, and delivery of enterprise data platform components. Build and optimize ETL/ELT pipelines, data models, and warehouses (Snowflake). Mentor engineers, drive data governance, ensure performance, security, and scalability, and collaborate cross-functionally to adopt best practices and new tools.
Top Skills: Cloud-native (Google Cloud preferred), CTEs, data warehouses, ELT/ETL tools, Java, NoSQL, Python, RDBMS, Snowflake, SQL, stored procedures, UDFs
2 Hours Ago
Remote
India
Senior level
Software
Seeking experienced Data Engineers with 5-7 years of experience in Databricks, PySpark, and Data Fabric concepts to aid in data transformation initiatives.
Top Skills: Azure Data Factory, Databricks, PySpark

What you need to know about the Delhi Tech Scene

Delhi, India's capital city, is a place where tradition and progress co-exist. While Old Delhi is known for its rich history and bustling markets, New Delhi is defined by its modern architecture. The region places a strong emphasis on preserving its cultural heritage while embracing technological advancement, particularly in artificial intelligence, which plays a central role in shaping the city's tech landscape and is fueled by investments in research and development.
