
Optum

AI ML Engineer (SG25, 26, 27, 28, 29)

Posted Yesterday
In-Office
5 Locations
Mid level
Design, develop, and maintain ETL solutions, execute T-SQL queries, enhance SSIS packages, monitor ETL processes, and collaborate with teams to ensure data solutions meet business needs.
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
  • Design, develop, and maintain high-performance ETL solutions using SSIS to support business intelligence and data warehousing needs
  • Write advanced and optimized T-SQL queries for data extraction, transformation, and loading
  • Troubleshoot and enhance existing SSIS packages to ensure reliability and performance
  • Monitor and support ETL processes across QA, UAT, and Production environments, including job monitoring and issue resolution
  • Integrate data from diverse sources such as RDBMS, APIs, FTPs, and cloud storage in various formats (CSV, JSON, XML, Parquet, etc.)
  • Utilize C# and Regex for complex data transformation tasks within ETL workflows (see the sketch after this list)
  • Analyze business requirements and deliver fully automated, scalable ETL solutions
  • Contribute to the development and refinement of ETL best practices, including data quality and cleansing standards
  • Provide accurate work estimates and deliver solutions in alignment with project timelines
  • Communicate effectively with the Data Engineering Manager regarding project status, risks, and dependencies
  • Collaborate with cross-functional teams to ensure data solutions meet business needs
  • Foster a culture of knowledge sharing and continuous improvement within the team
  • Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
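
For context on the C#-and-Regex bullet above: in practice this kind of transformation is often a small cleansing routine called from an SSIS Script Component during the data flow. The standalone C# sketch below illustrates the pattern only; the identifier field, the nine-digit padding rule, and the Normalize helper are hypothetical and not taken from this posting.

    // Illustrative sketch only: the kind of Regex-based cleansing an SSIS Script
    // Component might call during a data flow. Field name and rules are hypothetical.
    using System;
    using System.Text.RegularExpressions;

    class MemberIdCleanser
    {
        // Hypothetical rule: strip every non-digit character, then left-pad to nine digits.
        static readonly Regex NonDigits = new Regex(@"\D", RegexOptions.Compiled);

        static string Normalize(string raw)
        {
            if (string.IsNullOrWhiteSpace(raw)) return null;
            string digits = NonDigits.Replace(raw, "");
            return digits.Length == 0 ? null : digits.PadLeft(9, '0');
        }

        static void Main()
        {
            // Quick check of the rule against a few representative raw values.
            foreach (var raw in new[] { " 12-345 ", "ABC987654321", "" })
                Console.WriteLine($"'{raw}' -> '{Normalize(raw) ?? "<null>"}'");
        }
    }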

Required Qualifications:
  • Hands-on experience with Snowflake, Databricks, or Kafka
  • Experience in data engineering, with solid expertise in SSMS, SSIS, and SQL
  • Experience in business/data analysis
  • Experience with version control systems like TFS or GitHub
  • Solid understanding of database design, lifecycle management, and performance tuning
  • Proven ability to build and maintain DevOps pipelines for data workflows
  • Proven excellent communication skills and ability to work across teams and stakeholders

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Top Skills

C#
Databricks
Git
Kafka
Snowflake
SQL
SSIS
T-SQL
TFS
