
McAfee

Data Curation Engineer - Remote

Posted 4 Days Ago
Remote
Hiring Remotely in India
Mid level

Role Overview:

We are seeking a seasoned data engineer to join our Enterprise Analytics team as a Data Curation Engineer, reporting to the Capability Lead – BI & Data Curation. As a key contributor to McAfee’s enterprise data curation strategy, you will be responsible for developing high-quality, certified, curated data products on the Databricks platform using complex analytical modeling.
This is a Remote position located in India. We are only considering candidates located in India and are not providing relocation at this time.

About the role:

  • Develop high-quality, curated data products including tables, schemas, and aggregates via SQL/SparkSQL
  • Understand data product requirements and liaise with Technical Data Stewards and business end-users to ensure alignment with business needs
  • Ensure data products meet curation specifications, guidelines, governance standards, and business requirements, in line with the Lakehouse medallion architecture
  • Work with data catalogs such as Unity Catalog and Collibra
  • Follow Agile sprint methodology
  • Implement change requests to curated production data products
  • Ensure effective documentation and cataloging of data products
  • Ensure clear handoffs of data products to technical stewards
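To illustrate the kind of curated-aggregate work described above, here is a minimal sketch of promoting silver-layer records to a certified gold-layer aggregate. It uses Python's built-in sqlite3 in place of Databricks SparkSQL, and the table and column names (`silver_events`, `gold_daily_detections`, etc.) are hypothetical, not taken from McAfee's actual stack:

```python
import sqlite3

# Hypothetical silver-layer table; on Databricks this would be a Delta table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE silver_events (
        event_date TEXT,
        product    TEXT,
        detections INTEGER
    )
""")
conn.executemany(
    "INSERT INTO silver_events VALUES (?, ?, ?)",
    [
        ("2024-01-01", "antivirus", 10),
        ("2024-01-01", "antivirus", 5),
        ("2024-01-01", "vpn", 2),
        ("2024-01-02", "antivirus", 7),
    ],
)

# Gold-layer curated aggregate: certified daily totals per product,
# ready for handoff to BI consumers and technical stewards.
conn.execute("""
    CREATE TABLE gold_daily_detections AS
    SELECT event_date,
           product,
           SUM(detections) AS total_detections
    FROM silver_events
    GROUP BY event_date, product
""")

rows = conn.execute(
    "SELECT * FROM gold_daily_detections ORDER BY event_date, product"
).fetchall()
print(rows)
```

The same `CREATE TABLE ... AS SELECT` pattern carries over almost verbatim to SparkSQL on Databricks, where the result would typically be registered in a catalog and documented as part of the handoff.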

About you: 

  • A Bachelor’s degree in computer science, engineering, statistics, or a related field is preferred
  • Bring 3+ years of experience in data engineering, data modeling, and ETL/ELT
  • Have hands-on experience with, and an understanding of, the following:
    • SQL/SparkSQL
    • Python or another programming language
    • Databricks or another cloud data management platform
    • Data aggregation, relational database modeling, and common data structures
    • Design and implementation of logical and physical data models
    • ETL/ELT tools such as Apache Airflow
    • Data quality guidelines
    • Data governance and data curation models
    • Master data management
    • Data profiling
  • Bring proven experience in development on a Lakehouse data platform

#LI-Remote


Company Overview

McAfee is a leader in personal security for consumers. Focused on protecting people, not just devices, McAfee consumer solutions adapt to users’ needs in an always-online world, empowering them to live securely through integrated, intuitive solutions that protect their families and communities with the right security at the right moment.

Company Benefits and Perks:

We work hard to embrace diversity and inclusion and encourage everyone at McAfee to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours and family-friendly benefits to all of our employees.

  • Bonus Program
  • Pension and Retirement Plans
  • Medical, Dental and Vision Coverage
  • Paid Time Off
  • Paid Parental Leave
  • Support for Community Involvement

We're serious about our commitment to diversity, which is why McAfee prohibits discrimination based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation, or any other legally protected status.

Top Skills

Python
SparkSQL
SQL


