The DevOps Engineer will design, implement, and maintain CI/CD pipelines, manage cloud infrastructure, deploy applications, and support data pipeline processing while ensuring reliability, scalability, and security.
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
We are seeking a DevOps Engineer with solid experience in cloud infrastructure, automation, and CI/CD, along with hands-on exposure to React frontend applications and Python/PySpark-based data or backend workloads. The ideal candidate will support end-to-end application and data pipeline deployments in a scalable and reliable environment.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and to enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
Primary Responsibilities:
- Design, implement, and maintain CI/CD pipelines for React, Python, and PySpark applications
- Deploy and manage React frontend applications and Python/PySpark backend services or data processing jobs
- Automate infrastructure provisioning using Infrastructure as Code (IaC) tools
- Manage and optimize cloud infrastructure (AWS, Azure, or GCP)
- Build, deploy, and manage containerized applications using Docker and Kubernetes
- Support data pipelines and batch/stream processing using PySpark
- Ensure system reliability, scalability, performance, and security
- Implement monitoring, logging, and alerting solutions
- Troubleshoot deployment and production issues across environments
- Collaborate with development and data engineering teams
- Apply DevSecOps best practices across pipelines and infrastructure
- Document architecture, deployment processes, and operational runbooks
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications:
- Bachelor's or advanced degree in a related technical field
- 5+ years of experience in DevOps or Infrastructure Engineering
- Solid experience with cloud platforms (AWS preferred; Azure or GCP acceptable)
- Hands-on experience with CI/CD tools (Jenkins, GitHub Actions, GitLab CI, Azure DevOps)
- Experience with Docker and Kubernetes
- Hands-on experience with PySpark for data processing or analytics workloads
- Experience with Infrastructure as Code (Terraform, CloudFormation, ARM)
- Working knowledge of React application build and deployment processes
- Familiarity with monitoring or logging tools (Prometheus, Grafana, ELK, Datadog, etc.)
- Proficiency in Python for automation and backend support
- Solid scripting skills (Bash, Python)
Preferred Qualifications:
- Cloud certifications (AWS, Azure, or GCP)
- Experience with big data platforms (EMR, Databricks, Spark on Kubernetes)
- Experience with DevSecOps and security scanning tools
- Experience working in Agile or Scrum environments
- Knowledge of microservices architecture
- Familiarity with data orchestration tools (Airflow, Step Functions)
Soft Skills:
- Solid communication and collaboration skills
- Ability to work across DevOps, frontend, backend, and data teams
- Ownership mindset and proactive problem-solving attitude
Top Skills
ARM, AWS, Azure, Azure DevOps, Bash, CI/CD, CloudFormation, Datadog, Docker, ELK, GCP, GitHub Actions, GitLab CI, Grafana, Jenkins, Kubernetes, Prometheus, PySpark, Python, React, Terraform