
Level AI

Principal Engineer - Data Platform (India)

Reposted 11 Days Ago
Remote
Hybrid
Hiring Remotely in India
Expert/Leader
The Principal Software Engineer will lead the design and development of the data warehouse and analytics platform, focusing on data modeling, ETL processes, and collaboration with team members. Responsibilities include designing analytical platforms, maintaining data schemas, and ensuring scalability and accuracy in data propagation.
The summary above was generated by AI

Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionizes customer engagement by transforming contact centers into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organizations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry.


Position Overview: We seek an experienced Principal Software Engineer to lead the design and development of our data warehouse and analytics platform, and to help raise the engineering bar for the entire technology stack at Level AI, including applications, platform, and infrastructure.

This engineer will collaborate actively with team members and the wider Level AI engineering community to develop highly scalable and performant systems. As a technical thought leader, they will help solve complex problems, both current and future, by designing and building simple, elegant technical solutions. They will coach and mentor junior engineers, drive engineering best practices, and work closely with product managers and other stakeholders both inside and outside the team.


Competencies:


Data Modeling: Skilled in designing data warehouse schemas (e.g., star and snowflake schemas), with experience in fact and dimension tables, as well as normalization and denormalization techniques.

Data Warehousing & Storage Solutions: Proficient with platforms such as Snowflake, Amazon Redshift, Google BigQuery, and Azure Synapse Analytics.

ETL/ELT Processes: Expertise in ETL/ELT tools (e.g., Apache NiFi, Apache Airflow, Informatica, Talend, dbt) to facilitate data movement from source systems to the data warehouse.

SQL Proficiency: Advanced SQL skills for complex queries, indexing, and performance tuning.

Programming Skills: Strong in Python or Java for building custom data pipelines and handling advanced data transformations.

Data Integration: Experience with real-time data integration tools like Apache Kafka, Apache Spark, AWS Glue, Fivetran, and Stitch.

Data Pipeline Management: Familiar with workflow automation tools (e.g., Apache Airflow, Luigi) to orchestrate and monitor data pipelines (a minimal orchestration sketch follows this list).

APIs and Data Feeds: Knowledgeable in API-based integrations, especially for aggregating data from distributed sources.
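
For context on the orchestration and ETL/ELT competencies above, here is a minimal Apache Airflow sketch of an extract-transform-load pipeline. The DAG name, schedule, and task bodies are illustrative assumptions for this posting, not a description of Level AI's actual pipelines.

```python
# A minimal, hypothetical Airflow DAG sketching ELT orchestration.
# All names (DAG id, tasks, schedule) are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_interactions():
    # Placeholder: pull raw interaction records from a source system.
    print("extracting raw records from the source database")


def transform_interactions():
    # Placeholder: clean and reshape records into warehouse-friendly rows.
    print("transforming records into fact/dimension rows")


def load_to_warehouse():
    # Placeholder: load transformed rows into the analytics warehouse.
    print("loading rows into the data warehouse")


with DAG(
    dag_id="interactions_elt",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",        # assumed cadence
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_interactions)
    transform = PythonOperator(task_id="transform", python_callable=transform_interactions)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    # Run the three steps in order: extract, then transform, then load.
    extract >> transform >> load
```

In practice, the placeholder callables would move data from source systems into one of the warehouse platforms listed above (e.g., Snowflake or BigQuery).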

Responsibilities:

  • Design and implement analytical platforms that provide insightful dashboards to customers.
  • Develop and maintain data warehouse schemas, such as star schemas, fact tables, and dimensions, to support efficient querying and data access (a minimal schema sketch follows this list).
  • Oversee data propagation processes from source databases to warehouse-specific databases/tools, ensuring data accuracy, reliability, and timeliness.
  • Ensure the architectural design is extensible and scalable to adapt to future needs.
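
As a companion to the schema-related responsibilities above, below is a minimal star-schema sketch in Python using SQLite: one fact table joined to two dimension tables, plus a typical dashboard-style aggregate query. All table and column names are hypothetical examples, not Level AI's actual warehouse schema.

```python
# A minimal star-schema sketch using SQLite.
# Table and column names are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes about agents.
cur.execute("""
    CREATE TABLE dim_agent (
        agent_id INTEGER PRIMARY KEY,
        agent_name TEXT,
        team TEXT
    )
""")

# Dimension table: one row per calendar date.
cur.execute("""
    CREATE TABLE dim_date (
        date_id INTEGER PRIMARY KEY,
        full_date TEXT,
        month TEXT
    )
""")

# Fact table: one row per customer interaction, keyed to the dimensions.
cur.execute("""
    CREATE TABLE fact_interaction (
        interaction_id INTEGER PRIMARY KEY,
        agent_id INTEGER REFERENCES dim_agent(agent_id),
        date_id INTEGER REFERENCES dim_date(date_id),
        duration_seconds INTEGER,
        sentiment_score REAL
    )
""")

# A typical dashboard-style query: aggregate the fact table by dimension attributes.
cur.execute("""
    SELECT a.team, d.month, AVG(f.sentiment_score)
    FROM fact_interaction f
    JOIN dim_agent a ON f.agent_id = a.agent_id
    JOIN dim_date d ON f.date_id = d.date_id
    GROUP BY a.team, d.month
""")
conn.close()
```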

Requirements:

  • Qualification: B.E/B.Tech/M.E/M.Tech/PhD from tier 1/2 engineering institutes, with relevant work experience at a top technology company.
  • 9+ years of backend and infrastructure experience, with a strong track record in development, architecture, and design.
  • Hands-on experience with large-scale databases, high-scale messaging systems, and real-time job queues.
  • Experience navigating and understanding large-scale systems, complex codebases, and architectural patterns.
  • Proven experience building high-scale data platforms.
  • Strong expertise in data warehouse schema design (star schema, fact tables, dimensions).
  • Experience with data movement, transformation, and integration tools for data propagation across systems.
  • Ability to evaluate and implement best practices in data architecture for scalable solutions.
  • Experience mentoring and providing technical leadership to other engineers in the team.

  • Nice to have:

  • Experience with Google Cloud, Django, Postgres, Celery, Redis.
  • Some experience with AI Infrastructure and Operations.

To learn more, visit: https://thelevel.ai/

Funding: https://www.crunchbase.com/organization/level-ai

LinkedIn: https://www.linkedin.com/company/level-ai/

Our AI platform: https://www.youtube.com/watch?v=g06q2V_kb-s

Top Skills

APIs and Data Feeds
Data Integration
Data Modeling
Data Pipeline Management
Data Warehousing & Storage Solutions
ETL/ELT Processes
Programming Skills
SQL Proficiency


