
Findem

Software Engineer- Data (Big data and Data Pipelines)- Delhi (preferred)/ Bangalore

Posted 5 Days Ago
Hybrid
2 Locations
Mid level
As a Big Data Engineer at Findem, you will build and manage data pipelines and processing solutions using big data and ETL technologies. Responsibilities include data extraction from various sources, building analytical tools, creating data models, and researching new technologies. Strong proficiency in Python and experience with big data tools like Spark and Kafka are essential.

What is Findem:


Findem is the only talent data platform that combines 3D data with AI. It automates and consolidates top-of-funnel activities across your entire talent ecosystem, bringing together sourcing, CRM, and analytics into one place. Only 3D data connects people and company data over time, making an individual's entire career instantly accessible in a single click, removing the guesswork, and unlocking insights about the market and your competition no one else can. Powered by 3D data, Findem's automated workflows across the talent lifecycle are the ultimate competitive advantage. Enabling talent teams to deliver continuous pipelines of top, diverse candidates while creating better talent experiences, Findem transforms the way companies plan, hire, and manage talent. Learn more at www.findem.ai.


Experience: 4-7 years


We are looking for an experienced Big Data Engineer who will be responsible for building, deploying, and managing data pipelines, data lakes, and big data processing solutions using big data and ETL technologies.


Location: Delhi (preferred) / Bangalore (candidates should be based in one of these locations or ready to relocate)

Hybrid: 3 days onsite

Role and Responsibilities

  • Build data pipelines, Big data processing solutions and data lake infrastructure using various Big data and ETL technologies
  • Assemble and process large, complex data sets that meet functional and non-functional business requirements
  • ETL data from a wide variety of sources such as MongoDB, S3, server-to-server transfers, and Kafka, and process it using SQL and big data technologies
  • Build analytical tools to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
  • Build interactive and ad-hoc query self-serve tools for analytics use cases
  • Build data models and data schemas from a performance, scalability, and functional-requirements perspective
  • Build processes supporting data transformation, metadata, dependency and workflow management
  • Research, experiment with, and prototype new tools and technologies, and take successful ones into production
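As a rough illustration of the extract-transform-load pattern these responsibilities describe, here is a toy sketch in plain Python. The field names ("user_id", "plan", "signup_ts") are hypothetical, and a production pipeline at this scale would typically read from sources like S3, MongoDB, or Kafka and run on an engine such as Spark rather than in-process lists.

```python
import json

# Toy ETL sketch: extract JSON records, transform them, load into a target list.
# All field names below are hypothetical, for illustration only.

RAW_RECORDS = [
    '{"user_id": 1, "plan": "pro", "signup_ts": "2024-01-05"}',
    '{"user_id": 2, "plan": "free", "signup_ts": "2024-02-11"}',
    '{"user_id": 3, "plan": "pro", "signup_ts": "2024-03-20"}',
]

def extract(raw_lines):
    """Parse raw JSON strings into dicts (the 'extract' step)."""
    return [json.loads(line) for line in raw_lines]

def transform(records):
    """Keep paying users and derive a signup year (the 'transform' step)."""
    return [
        {"user_id": r["user_id"], "signup_year": int(r["signup_ts"][:4])}
        for r in records
        if r["plan"] == "pro"
    ]

def load(records, target):
    """Append transformed rows to the target store (the 'load' step)."""
    target.extend(records)
    return target

warehouse = []
load(transform(extract(RAW_RECORDS)), warehouse)
print(warehouse)  # two 'pro' rows, each with a derived signup_year
```

The same three-stage shape carries over directly to Spark jobs, where extract becomes a source read, transform a chain of DataFrame operations, and load a write to the data lake.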

Must-have Skills

  • Strong in Python/Scala
  • Experience with big data technologies such as Spark, Hadoop, Athena/Presto, Redshift, Kafka, etc.
  • Experience with various file formats such as Parquet, JSON, Avro, ORC, etc.
  • Experience with workflow management tools like Airflow
  • Experience with batch processing, streaming and message queues
  • Experience with visualization tools like Redash, Tableau, Kibana, etc.
  • Experience in working with structured and unstructured data sets
  • Strong problem-solving skills
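The batch, streaming, and message-queue experience listed above boils down to a producer/consumer pattern. A minimal sketch using only the Python standard library (in practice the queue would be a broker such as Kafka, and the "processing" step far richer):

```python
import queue
import threading

# Minimal producer/consumer sketch of message-queue processing.
# queue.Queue stands in for a real broker like Kafka; the event names
# and the uppercase "processing" step are purely illustrative.

def producer(q, events):
    for event in events:
        q.put(event)          # publish each event to the queue
    q.put(None)               # sentinel: signal end of stream

def consumer(q, out):
    while True:
        event = q.get()
        if event is None:     # stop on the sentinel
            break
        out.append(event.upper())  # toy "processing" step

q = queue.Queue()
processed = []
worker = threading.Thread(target=consumer, args=(q, processed))
worker.start()
producer(q, ["click", "view", "purchase"])
worker.join()
print(processed)  # ['CLICK', 'VIEW', 'PURCHASE']
```

The sentinel-based shutdown shown here mirrors how streaming consumers distinguish a drained batch from an ongoing stream; real brokers replace the sentinel with offsets and consumer-group semantics.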

Good-to-have Skills

  • Exposure to NoSQL databases like MongoDB
  • Exposure to Cloud platforms like AWS, GCP, etc
  • Exposure to Microservices architecture
  • Exposure to Machine learning techniques

The role is full-time and comes with full benefits. We are globally headquartered in the San Francisco Bay Area with our India headquarters in Bengaluru.


Equal Opportunity


As an equal opportunity employer, we do not discriminate on the basis of race, color, religion, national origin, age, sex (including pregnancy), physical or mental disability, medical condition, genetic information, gender identity or expression, sexual orientation, marital status, protected veteran status or any other legally-protected characteristic.

Top Skills

Python
Scala


