
Exinity

Data Engineer

Posted 4 Days Ago
Remote
Hiring Remotely in IN
Senior level
The Data Engineer will design and maintain data pipelines using Azure Databricks, enhance performance and scalability, and support production environments in a fast-paced, Agile setting.
Company Description

We believe the power of risk-driven investing can help anyone achieve a better life. And we’re here to make this potential a reality for an emerging global generation.  

We provide individuals in the world’s fast-developing economies with guidance, tools, and easy market access so they can trade and invest with confidence. We aim to make our clients their own wealth manager: empowered to create investment strategies and make investment decisions effortlessly, in their own time, on their own terms, in their own way.  

Our story goes back to the first days of online Forex trading - a pioneer of the MetaTrader platform, Alpari, expanded rapidly in the world’s emerging markets and was joined in 2011 by our sister brand, FXTM. Now, we are adding further brands to our portfolio. Together, both brands have built a leading global presence in online trading, serving over two million clients in 150 countries from regulated centres across four continents.

Exinity is an energetic and diverse company with offices across Europe, Asia and Africa, and we’re always looking for talented individuals to join us. ‘Freedom to Succeed’ is not just a promise we make to our clients and partners, but to our people too. We’ll help you develop a range of skills, take on early responsibility, and enjoy a rewarding and fulfilling career with a fast-growing, dynamic company. 

Job Description

We are seeking a Data Engineer with strong expertise in Azure Databricks. This role focuses on building, supporting, and administering scalable, high-performance data pipelines that power real-time and batch analytics for trading, risk, and operational use cases. The ideal candidate has a deep background in Databricks data engineering and administration, experience with capital markets data, and thrives in an Agile, fast-paced environment.

Key Responsibilities:

  • Design, develop, and maintain robust data pipelines using Azure Databricks, Confluent, Delta Live Tables (DLT), Spark pipelines, and Delta Lake to support trading and market data workflows (see the sketch after this list).
  • Take ownership of existing data pipelines, studying and enhancing them to ensure continuity, scalability, and improved performance.
  • Provide production pipeline support, including job monitoring, incident resolution, and performance tuning in production environments.
  • Administer Databricks workspaces and Unity Catalog, including cluster configuration, job scheduling, access control, and workspace optimization.
  • Build and maintain CI/CD pipelines using GitLab, enabling automated testing, deployment, and versioning of data engineering code.
  • Follow and enforce best practices in code management, including modular design, code reviews, and documentation using GitLab workflows.
  • Collaborate with fellow team members, business analysts, and data architects to understand data requirements and deliver high-quality solutions.
  • Build reusable components and frameworks to accelerate development and ensure consistency across data platforms.
  • Actively participate in Agile ceremonies (e.g., sprint planning, stand-ups, retrospectives) and contribute to continuous improvement of team processes.
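For illustration only, here is a minimal sketch of the kind of Delta Lake pipeline this role describes: a PySpark batch job that ingests raw trade records and upserts them into a Delta table. It assumes a Databricks runtime where a `spark` session and Delta Lake are pre-configured; the landing path, table name (`markets.trades_silver`), and columns (`trade_id`, etc.) are hypothetical, not details from this posting.

```python
# Hypothetical batch pipeline sketch: ingest raw trades, deduplicate,
# and merge into a Delta table. Assumes a Databricks runtime where
# `spark` and Delta Lake are already available.
from pyspark.sql import functions as F
from delta.tables import DeltaTable

raw = (
    spark.read.format("json")
    .load("/mnt/landing/trades/")           # hypothetical landing zone
    .withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["trade_id"])           # basic idempotency guard
)

target = DeltaTable.forName(spark, "markets.trades_silver")

# Merge (upsert) so job reruns stay idempotent - a common Delta pattern.
(
    target.alias("t")
    .merge(raw.alias("s"), "t.trade_id = s.trade_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

The merge-based upsert is one reasonable way to meet the continuity and production-support responsibilities above, since a retried job run will not duplicate records.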

Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in data engineering, with at least 2 years working with Azure Databricks.
  • Strong proficiency in PySpark, SQL, and Python.
  • Experience supporting production pipelines, including monitoring, alerting, and troubleshooting.
  • Experience with GitLab CI/CD, including pipeline configuration, runners, and integration with cloud services.
  • Familiarity with the capital markets domain, including market data feeds, order books, trade execution, and risk metrics.
  • Proven ability to work effectively in Agile development environments.
  • Azure certifications (e.g., Azure Data Engineer Associate).
  • Experience with real-time data processing using Kafka or Event Hubs (see the streaming sketch after this list).
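As a similarly hedged sketch of the real-time qualification, the snippet below uses Spark Structured Streaming to read a Kafka topic into a Delta table (Azure Event Hubs exposes a Kafka-compatible endpoint, so the same reader applies). The broker address, topic, schema, checkpoint path, and table name are assumptions for illustration.

```python
# Hypothetical streaming sketch: market ticks from Kafka into Delta.
# Assumes a Databricks runtime with `spark` available.
from pyspark.sql import functions as F
from pyspark.sql.types import (
    StructType, StructField, StringType, DoubleType, TimestampType,
)

schema = StructType([
    StructField("symbol", StringType()),
    StructField("price", DoubleType()),
    StructField("event_time", TimestampType()),
])

ticks = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
    .option("subscribe", "market-ticks")                # hypothetical topic
    .load()
    # Kafka values arrive as bytes; parse the JSON payload into columns.
    .select(F.from_json(F.col("value").cast("string"), schema).alias("tick"))
    .select("tick.*")
)

# Checkpointing gives exactly-once delivery into the Delta sink across restarts.
query = (
    ticks.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/market_ticks")
    .outputMode("append")
    .toTable("markets.ticks_bronze")
)
```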

Additional Information

What you will love about Exinity:
“Freedom to succeed” is our core belief. It’s not just a promise we make to our clients and partners, but to our people too. We want our people to LEAP and so in this role you will… 

  • Learn from each other and from new projects.
  • Exchange information and best practices in an open-minded environment.
  • Advance by developing new skills and accepting greater responsibilities as your career progresses and diversifies.
  • Prosper by acquiring skills and nurturing a team.

Exinity is an equal opportunities employer and positively encourages applications from suitably qualified and eligible candidates regardless of gender, sexual orientation, marital or civil partner status, gender reassignment, race, colour, nationality, ethnic or national origin, religion or belief, disability or age. 

Top Skills

Azure Databricks
CI/CD
Confluent
Delta Lake
DLT
Event Hubs
GitLab
Kafka
PySpark
Python
Spark
SQL
