Design, build, and optimize data pipelines on Azure. Implement event-driven architectures and microservices, ensuring reliability and performance. Collaborate with teams to maintain SLAs and document solutions.
Requisition Number: 2344704
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and to enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
Primary Responsibilities:
- Design, build, and optimize data ingestion and transformation pipelines on Azure Databricks (Delta Lake/Spark) with an eye for performance, cost, and operational simplicity
- Implement streaming and batch patterns; leverage CDC, eventing (Service Bus/Event Grid), and orchestration (Azure Data Factory, Functions) to move and process data reliably
- Engineer high-quality PySpark code and SQL, enforce coding standards, and enable reuse via libraries and notebooks
- Build/extend APIs and microservices that expose curated data for real-time and near-real-time use cases; ensure backward compatibility and observability
- Contribute to CI/CD (Azure Pipelines or equivalent), infrastructure as code, secrets management (Key Vault), and automated testing for data and services
- Partner with architects/SMEs to migrate mainframe workloads (batch & inquiry APIs) to cloud-native paradigms with clear SLAs and error budgets
- Instrument solutions with end-to-end monitoring/alerting; drive MTTR down through robust runbooks, dashboards, and noise-free alerts
- Collaborate with business & platform teams to onboard new consumers, model data contracts, and scale for rapid growth in volume and users
- Contribute to data governance (catalog, lineage, access controls/Unity Catalog) and ensure compliance with enterprise standards
- Participate in on-call/production support rotations to maintain 24×7 reliability and SLA adherence; perform root cause analysis and preventive hardening
- Document designs, patterns, and operational guides that enable fast onboarding and consistent delivery across teams
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
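Several of the eventing responsibilities above come down to replay-safe processing: Service Bus and Event Grid deliver at-least-once, so a consumer must treat redelivery of the same message as a no-op. A minimal sketch of that idea in plain Python (hypothetical names, no Azure SDK; in practice a durable store such as a Delta table would replace the in-memory set):

```python
# Idempotent event handling sketch: at-least-once delivery means the same
# message can arrive more than once, so processing is keyed on a message ID
# and duplicates are skipped. Illustrative only, not production code.

class IdempotentConsumer:
    def __init__(self):
        self._seen = set()   # stand-in for a durable processed-ID store
        self.results = []    # stand-in for the downstream sink

    def handle(self, message_id: str, payload: dict) -> bool:
        """Apply a message exactly once; return False for duplicates."""
        if message_id in self._seen:
            return False     # redelivery: already applied, skip safely
        self.results.append(payload)
        self._seen.add(message_id)
        return True

consumer = IdempotentConsumer()
consumer.handle("evt-1", {"claim": 42})
consumer.handle("evt-1", {"claim": 42})   # duplicate delivery, ignored
consumer.handle("evt-2", {"claim": 43})
```

The same pattern appears in Delta Lake as a `MERGE` keyed on a business identifier, which makes pipeline re-runs harmless.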
Required Qualifications:
- Bachelor's degree in Software Engineering and 4+ years of relevant experience
- Hands-on experience with Azure Databricks (Spark, Delta Lake) and strong PySpark and SQL skills for large-scale data processing
- Experience building event-driven and batch data flows using Azure Data Factory, Service Bus/Event Grid, and Azure Functions
- API engineering (REST/gRPC), contract versioning, and service observability (logs, metrics, traces)
- CI/CD exposure on Azure (Pipelines/GitHub), including test automation and deployment strategies for data & services
- Familiarity with secrets and access management via Azure Key Vault, and role-based access in data platforms (Unity Catalog/ACLs)
- Production mindset: on-call readiness, incident response, and proactive post-incident improvements to protect SLAs for a large consumer base
- Proven ability to design for reliability (idempotency, retries, transactional writes, schema evolution) and to operate at scale (performance & cost tuning)
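The reliability techniques named above (retries alongside idempotent writes) can be sketched briefly. This is plain Python with hypothetical names, not the team's actual tooling; the backoff delay defaults to zero purely so the example runs instantly:

```python
# Retry-with-exponential-backoff sketch: wraps a flaky call (throttled API,
# transient network fault) so temporary failures are retried before the
# error surfaces. Pairs with idempotent writes so retries are safe.
import time

def with_retries(fn, attempts=4, base_delay=0.0):
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise                              # attempts exhausted
            time.sleep(base_delay * (2 ** attempt))  # 1x, 2x, 4x, ...

calls = {"n": 0}
def flaky_write():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "committed"

result = with_retries(flaky_write)   # succeeds on the third attempt
```

Because the wrapped write is idempotent, retrying after an ambiguous failure (did the write land or not?) cannot corrupt the target.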
Preferred Qualifications:
- Experience migrating mainframe data and workloads to cloud; healthcare claims domain exposure
- Delta Live Tables, Structured Streaming, or Kafka ingestion patterns
- Data quality frameworks (expectations, unit tests), and lineage/catalog tools
- Performance engineering for Spark (partitioning, caching, joins, skew mitigation) and SQL Warehouse optimization
- Secure engineering practices (PII handling, encryption at rest/in transit, network isolation)
Top Skills
Azure Data Factory
Azure Databricks
Azure Functions
Azure Key Vault
Azure Pipelines
CI/CD
Delta Lake
PySpark
Service Bus
Spark
SQL