
Sysco

Senior Software Engineer - Data Integrations

Remote
Hiring Remotely in Sri Lanka
Senior level
JOB DESCRIPTION

Senior Software Engineer – Data Integrations 

 

The Big Picture 

 

Sysco LABS is the Global In-House Center of Sysco Corporation (NYSE: SYY), the world’s largest foodservice company. Sysco ranks 56th in the Fortune 500 list and is the global leader in the trillion-dollar foodservice industry.    

 

Sysco employs over 75,000 associates, operates 337 smart distribution facilities worldwide, and runs over 14,000 IoT-enabled trucks serving 730,000 customer locations. For fiscal year 2025, which ended June 29, 2025, the company generated sales of more than $81.4 billion.

 

Sysco LABS Sri Lanka delivers the technology that powers Sysco's end-to-end operations. Its enterprise technology spans the full foodservice journey: sourcing of food products, merchandising, storage and warehouse operations, order placement and pricing algorithms, delivery of food and supplies across Sysco's global network, and the in-restaurant dining experience of the end customer.

 

For more information visit: www.syscolabs.lk 

 

The Opportunity 

We are looking for Senior Software Engineers with integration experience to join our team and focus on designing, developing, and implementing inbound/outbound data flows and integration solutions for Sysco's ERP and legacy systems.

 

 

Responsibilities: 

  • Collaborating with business and technical stakeholders to understand data requirements and translate them into technical solutions 

  • Designing and implementing data inbound/outbound strategies for various systems, including AS400, SAP, and other ERP platforms 

  • Building event-driven architectures using Kafka for real-time data streaming

  • Creating and maintaining REST APIs for data access and consumption by other applications 

  • Ensuring high code quality by following software engineering best practices 

  • Adhering to Continuous Integration and Continuous Delivery practices when delivering solutions

  • Monitoring and optimizing integration performance and scalability 

  • Troubleshooting and resolving data integration issues, ensuring data quality and accuracy 

 

Requirements: 

  • A Bachelor’s Degree in Computer Science or equivalent, and 3+ years of experience developing production enterprise applications

  • Expertise with the Java/Spring stack, Python, RESTful APIs, microservices, performance optimizations, and complex enterprise integration patterns 

  • Hands-on experience with Google Cloud Platform (GCP) services and serverless offerings (e.g., BigQuery, Datastream, Dataflow, Pub/Sub, Cloud Functions, Cloud Run, Cloud Composer) for data processing and orchestration, including deploying and managing containerized (Docker/Kubernetes) and serverless applications

  • Hands-on experience with data ingestion and extraction from ERP systems like AS400, SAP, and Dynamics 365 

  • A deep understanding of relational databases, schema design, and data integration strategies 

  • Familiarity with data streaming concepts and related toolsets (Kafka, Apache Beam) 

  • Hands-on experience with DevOps tools (Jenkins), version control systems (GitHub), and Linux-based systems

  • Hands-on experience with Infrastructure as Code (Terraform preferred); automation via Python or Shell scripting is an added advantage 

  • Familiarity with Quality Engineering processes, unit testing frameworks, and tools like SonarQube and Veracode 

  • The ability to mentor junior developers, conduct code reviews, and enforce Quality Engineering processes (SonarQube, Veracode) 

  • The ability to monitor and troubleshoot applications (using tools like Datadog or equivalent) and provide continuous support during critical deployments

  • Familiarity with working in a Scrum Agile delivery environment 

  • Excellent communication and leadership skills, with the ability to drive technical decisions and collaborate with stakeholders 

  • Familiarity with GraphQL (added advantage) 

  • Experience with AWS or Azure will be an added advantage 

 

Benefits:  

  • US dollar-linked compensation  

  • Performance-based annual bonus  

  • Performance rewards and recognition  

  • Agile Benefits - special allowances for Health, Wellness & Academic purposes  

  • Paid birthday leave  

  • Team engagement allowance  

  • Comprehensive Health & Life Insurance Cover - extendable to parents and in-laws  

  • Overseas travel opportunities and exposure to client environments  

  • Hybrid work arrangement  

 

Sysco LABS is an Equal Opportunity Employer 

Top Skills

Apache Beam
BigQuery
Cloud Functions
Cloud Run
Dataflow
Datastream
Docker
Git
Google Cloud Platform
Java
Jenkins
Kafka
Kubernetes
Linux
Microservices
Pub/Sub
Python
REST APIs
Shell Scripting
SonarQube
Spring
Terraform
Veracode
