
Yassir

Senior Data Engineer

Remote or Hybrid
8 Locations
Senior level
Yassir is the leading super app in the Maghreb region, set on changing the way daily services are provided. It currently operates in 45 cities across Algeria, Morocco and Tunisia, with recent expansions into France, Canada and Sub-Saharan Africa, and is backed by ~$200M in funding from VCs in Silicon Valley, Europe and other parts of the world.
We offer on-demand services such as ride-hailing and last-mile delivery. Building on this infrastructure, we are now introducing financial services to help our users pay, save and borrow digitally, helping usher the continent into the digital economy era.
We're not just about serving people - we're about creating a marketplace that brings people what they need while infusing social values.

Responsibilities

  • Build a centralized data lake on GCP data services by integrating diverse data sources from across the enterprise.
  • Develop, maintain, and optimize Spark-powered batch and streaming data processing pipelines. Leverage GCP data services for complex data engineering tasks and ensure smooth integration with other platform components (a minimal pipeline sketch follows this list).
  • Design and implement data validation and quality checks to ensure the accuracy, completeness, and consistency of data as it flows through the pipelines.
  • Work with the Data Science and Machine Learning teams on advanced analytics.
  • Collaborate with cross-functional teams, including data analysts, business users, operational and marketing teams, to extract insights and value from data.
  • Collaborate with the product team to design, implement, and maintain the data models for analytical use cases.
  • Design, develop, and upkeep data dashboards for various teams using Looker Studio.
  • Engage in technology exploration, research and development, and proofs of concept (POCs), and conduct deep investigations and troubleshooting.
  • Design and manage ETL/ELT processes, ensuring data integrity, availability, and performance.
  • Troubleshoot data issues and conduct root-cause analysis when reported data is called into question.
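
For illustration only, here is a minimal sketch of the kind of pipeline described above: a PySpark batch job that reads raw events from Cloud Storage, applies a simple completeness check, and loads the result into BigQuery via the spark-bigquery connector bundled with Dataproc. The bucket, table and column names and the 5% threshold are hypothetical placeholders, not part of Yassir's actual stack.

    from pyspark.sql import SparkSession, functions as F

    # Hypothetical locations -- adjust to your own project layout.
    RAW_PATH = "gs://example-raw-bucket/rides/*.json"         # placeholder source bucket
    TARGET_TABLE = "example_project.analytics.rides_cleaned"  # placeholder BigQuery table
    TEMP_BUCKET = "example-dataproc-temp"                     # staging bucket for the connector

    spark = SparkSession.builder.appName("rides-batch-pipeline").getOrCreate()

    # 1. Ingest raw JSON events from Cloud Storage.
    rides = spark.read.json(RAW_PATH)

    # 2. Basic validation: drop records missing mandatory fields and fail on a high drop rate.
    required = ["ride_id", "driver_id", "requested_at"]
    clean = rides.dropna(subset=required)
    drop_rate = 1 - clean.count() / max(rides.count(), 1)
    if drop_rate > 0.05:  # arbitrary threshold, for illustration only
        raise ValueError(f"Too many incomplete records dropped: {drop_rate:.1%}")

    # 3. Light transformation: derive a partition date column.
    clean = clean.withColumn("ride_date", F.to_date("requested_at"))

    # 4. Load into BigQuery (requires the spark-bigquery connector).
    (clean.write.format("bigquery")
          .option("table", TARGET_TABLE)
          .option("temporaryGcsBucket", TEMP_BUCKET)
          .mode("append")
          .save())

In practice a job like this would typically run on a Dataproc cluster or a Dataproc Serverless batch, with the quality step replaced or supplemented by a Great Expectations suite.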

Required Technical Skills

  • PySpark  Batch and Streaming
  • GCP  Dataproc, Dataflow, DataStream, Dataplex, Pub/Sub, BigQuery and
  • Cloud Storage
  • NoSQL (preferably MongoDB
  • Programming languages: Scala/Python
  • Great Expectations or a similar data quality (DQ) framework
  • Familiarity with workflow management tools such as Airflow, Prefect or Luigi (an orchestration sketch follows this list)
  • Understanding of Data Governance, Data Warehousing and Data Modelling
  • Good SQL knowledge
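
As a further illustration of the orchestration point above, here is a minimal daily DAG using the Airflow TaskFlow API (Airflow 2.4+). The DAG name, task bodies and paths are hypothetical placeholders; an equivalent flow could be expressed in Prefect or Luigi.

    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(
        schedule="@daily",               # run once per day
        start_date=datetime(2024, 1, 1),
        catchup=False,
        tags=["example"],
    )
    def daily_rides_pipeline():
        """Hypothetical ingest -> validate -> publish flow."""

        @task
        def ingest() -> str:
            # Placeholder: submit the Spark batch job (e.g. via a Dataproc operator).
            return "gs://example-staging/rides/latest"  # hypothetical staging path

        @task
        def validate(staging_path: str) -> str:
            # Placeholder: run data quality checks (e.g. Great Expectations) on the staged data.
            return staging_path

        @task
        def publish(validated_path: str) -> None:
            # Placeholder: load the validated data into the serving dataset in BigQuery.
            print(f"Publishing {validated_path}")

        publish(validate(ingest()))

    daily_rides_pipeline()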

Business

  • Able to communicate effectively and distill technical knowledge into digestible messages in a succinct, visual way.
  • Proactively identify and contribute to team development initiatives, and support junior members.

Good-to-Have Skills

  • Infrastructure-as-Code, preferably Terraform
  • Docker and Kubernetes
  • Looker
  • AI / ML engineering knowledge
  • Data lineage, or relevant tools (e.g. Atlan)
  • dbt

At Yassir, we believe in the power of diversity and the importance of an inclusive culture. So, if you're ready to bring your unique perspective and experiences to the table, then we're excited to listen.

Don't just apply for a job, come and be a part of our journey. Let's create a better tomorrow together.

We look forward to receiving your application!

Best of luck,
Your Yassir TA Team


