HighLevel

Senior Database Engineer

Posted 14 Days Ago
Remote
Hiring Remotely in Connaught Place, New Delhi, Delhi
Senior level
The Senior Database Engineer will design, implement, and maintain high-performance analytical databases, particularly using ClickHouse. Responsibilities include optimizing database performance, developing data models, ensuring minimal downtime, and collaborating with teams to integrate databases with data pipelines.

About HighLevel:

HighLevel is a cloud-based, all-in-one white-label marketing and sales platform that empowers marketing agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. With a focus on streamlining marketing efforts and providing comprehensive solutions, HighLevel helps businesses of all sizes achieve their marketing goals. We currently have ~1200 employees across 15 countries, working remotely as well as in our headquarters, which is located in Dallas, Texas. Our goal as an employer is to maintain a strong company culture, foster creativity and collaboration, and encourage a healthy work-life balance for our employees wherever they call home.


Our Website - https://www.gohighlevel.com/

YouTube Channel - https://www.youtube.com/channel/UCXFiV4qDX5ipE-DQcsm1j4g

Blog Post - https://blog.gohighlevel.com/general-atlantic-joins-highlevel/


Our Customers:

HighLevel serves a diverse customer base, including over 60K agencies & entrepreneurs and 500K businesses globally. Our customers range from small and medium-sized businesses to enterprises, spanning various industries and sectors.


Scale at HighLevel:

We operate at scale, managing over 40 billion API hits and 120 billion events monthly, with more than 500 micro-services in production. Our systems handle 200+ terabytes of application data and 6 petabytes of storage.


About the Role:

We are seeking a highly skilled Senior Database Engineer with expertise in ClickHouse and other columnar databases. The ideal candidate will have a deep understanding of database performance optimization, query tuning, data modeling, and large-scale data processing. You will be responsible for designing, implementing, and maintaining high-performance analytical databases that support real-time and batch processing.

Responsibilities:

  • Design, optimize, and maintain ClickHouse databases to support high-throughput analytical workloads
  • Develop and implement efficient data models for fast query performance and storage optimization
  • Monitor and troubleshoot database performance issues, ensuring minimal downtime and optimal query execution
  • Work closely with data engineers, software developers, and DevOps teams to integrate ClickHouse with data pipelines
  • Optimize data ingestion processes, ensuring efficient storage and retrieval of structured and semi-structured data
  • Implement partitioning, sharding, and indexing strategies for large-scale data processing (an illustrative sketch follows this list)
  • Evaluate and benchmark ClickHouse against other columnar databases such as Apache Druid, Apache Pinot, or Snowflake
  • Establish best practices for backup, replication, high availability, and disaster recovery
  • Automate database deployment, schema migrations, and performance monitoring using infrastructure-as-code approaches
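
As a purely illustrative sketch of the partitioning and sorting-key decisions described above, the hypothetical ClickHouse table below shows one reasonable shape for a high-throughput events workload; the table, column names, and retention period are assumptions, not details taken from HighLevel's systems.

    -- Hypothetical example; names and the 12-month retention are illustrative only.
    CREATE TABLE events
    (
        event_time  DateTime,
        account_id  UInt64,
        event_type  LowCardinality(String),
        payload     String
    )
    ENGINE = MergeTree
    PARTITION BY toYYYYMM(event_time)    -- monthly partitions simplify TTL expiry and backfills
    ORDER BY (account_id, event_time)    -- sorting key drives the sparse primary index and compression
    TTL event_time + INTERVAL 12 MONTH;  -- example retention policy

Matching the ORDER BY key to the columns most queries filter on is usually the single biggest lever for query speed in a table like this.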

Requirements:

  • 5+ years of experience working with high-performance databases, with a focus on ClickHouse or similar columnar databases
  • Strong knowledge of SQL, query optimization techniques, and database internals (a brief illustration follows this list)
  • Experience handling large-scale data (TBs to PBs) and optimizing data storage & retrieval
  • Hands-on experience with ETL/ELT pipelines, streaming data ingestion, and batch processing
  • Proficiency in at least two scripting/programming languages like NodeJS, Python, Go, or Java for database automation
  • Familiarity with Kafka, Apache Spark, or Flink for real-time data processing
  • Experience in Kubernetes, Docker, Terraform, or Ansible for database deployment & orchestration is a plus
  • Strong understanding of columnar storage formats (Parquet, ORC, Avro) and their impact on performance
  • Knowledge of cloud-based ClickHouse deployments (AWS, GCP, or Azure) is a plus
  • Excellent problem-solving skills, ability to work in a fast-paced environment, and a passion for performance tuning
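
As a brief, hedged illustration of the query-optimization work listed above, the sketch below pre-aggregates the hypothetical events table from the earlier example into a materialized view, a common ClickHouse technique for keeping dashboard-style queries fast; the view and column names are assumptions, not part of the posting.

    -- Hypothetical pre-aggregation; all identifiers are illustrative only.
    CREATE MATERIALIZED VIEW daily_event_counts
    ENGINE = SummingMergeTree
    ORDER BY (account_id, event_date)
    AS
    SELECT
        account_id,
        toDate(event_time) AS event_date,
        count() AS events
    FROM events
    GROUP BY account_id, event_date;

Queries that would otherwise scan the raw events table can then read this much smaller, pre-summed view.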

Preferred Skills:

  • Experience with alternative columnar databases like Apache Druid, Apache Pinot, or Snowflake
  • Background in big data analytics, time-series databases, or high-performance data warehousing
  • Prior experience working with distributed systems and high-availability architectures

EEO Statement:

At HighLevel, we value diversity. In fact, we understand it makes our organization stronger. We are committed to inclusive hiring/promotion practices that evaluate skill sets, abilities, and qualifications without regard to any characteristic unrelated to performing the job at the highest level. Our objective is to foster an environment where really talented employees from all walks of life can be their true and whole selves, cherished and welcomed for their differences while providing excellent service to our clients and learning from one another along the way! Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.


#LI-Remote #LI-NJ1


