About the company
Why We Built Lyric: Supply chains are more critical and complex than ever. Every day, large enterprises navigate trillions of possible decisions, requiring powerful AI algorithms to optimize their supply chain operations. Yet most organizations struggle to leverage supply chain AI at scale. Traditional solutions present two flawed choices:
- Buying off-the-shelf point solutions, which are rigid and limited in scope.
- Building AI capabilities in-house, which demands immense investment and expertise.
That is—until now. (Cue dramatic music.)
Enter Lyric: Lyric is an enterprise AI platform built specifically for supply chains, offering the best of both worlds when companies weigh the decision to buy or build:
- Out-of-the-box AI solutions for optimizing networks, allocating inventory, scheduling routes, planning fulfillment capacity, promising orders, propagating demand, building predictions, analyzing scenarios, and more. (Buy)
- A platform-first approach that empowers both business and technical users with end-to-end composability—leveraging no-code tools, their own code, or even forking our code to build and refine decision intelligence. (Build)
With Lyric, enterprises no longer have to choose between flexibility and speed—they get both.
The Mission: We’re building a new era in supply chain with the team best equipped to lead it. With over 20 years at the intersection of supply chain and algorithms, we developed a deep conviction that global supply chains needed something like Lyric. Since our inception in December 2021, that conviction has been validated time and time again.
Today, a growing number of Fortune 500 companies—including Smurfit WestRock, Estée Lauder, Coca-Cola, Nike, and more—are innovating on their own terms with Lyric. After an incredible 2024, we expect 2025 to be an even bigger rocket ship, and we can’t wait to see what our customers—both current and future—are empowered to build with us next.
Job Overview: We are seeking a highly skilled and experienced Senior Data Scientist to join our team. The ideal candidate will have a strong background in time series analysis and a proven track record of applying these skills to solve complex problems in the supply chain domain. This role requires a minimum of 4 years of experience in data science, with a focus on time series forecasting, anomaly detection, and optimization.
Key Responsibilities:
- Develop, implement, and maintain time series models to forecast demand, inventory levels, and other key supply chain metrics.
- Identify and analyze anomalies in time series data to detect potential issues in supply chain processes.
- Collaborate with cross-functional teams to understand business requirements and translate them into data-driven solutions.
- Design and conduct experiments to evaluate the performance of different models and approaches.
- Communicate findings and insights to both technical and non-technical stakeholders through reports, dashboards, and presentations.
- Mentor junior data scientists and provide guidance on best practices in time series analysis.
- Stay current with the latest advancements in time series methods and technologies, and incorporate them into the team's workflow.
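To give candidates a concrete feel for the anomaly-detection work described above, here is a minimal, self-contained sketch of one common approach (a trailing-window z-score test). The function name, window size, and threshold are illustrative choices for this example only, not part of Lyric's actual platform:

```python
def rolling_zscore_anomalies(series, window=7, threshold=3.0):
    """Flag indices whose value deviates from the trailing-window mean
    by more than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(series)):
        prior = series[i - window:i]
        mean = sum(prior) / window
        variance = sum((x - mean) ** 2 for x in prior) / window
        std = variance ** 0.5
        if std > 0 and abs(series[i] - mean) > threshold * std:
            anomalies.append(i)
    return anomalies

# A flat daily-demand signal with one spike at index 10:
demand = [100, 101, 99, 100, 102, 98, 100, 101, 99, 100, 500, 100, 101]
print(rolling_zscore_anomalies(demand))  # → [10]
```

In practice this kind of baseline is a starting point; production work on seasonal supply chain data typically layers in deseasonalization or model-based residuals (e.g., from an ARIMA fit) before thresholding.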
Qualifications:
- Bachelor's or Master's degree in Data Science, Statistics, Mathematics, Computer Science, or a related field. A Ph.D. is a plus.
- Minimum of 4 years of experience in data science with a focus on time series analysis.
- Proficiency in programming languages such as Python or R.
- Experience with time series forecasting techniques (e.g., ARIMA, SARIMA, Prophet, LSTM).
- Strong understanding of statistical methods and machine learning algorithms.
- Experience with data visualization tools (e.g., Tableau, Power BI, matplotlib, seaborn).
- Familiarity with supply chain processes and challenges is highly desirable.
- Excellent problem-solving skills and the ability to think critically and analytically.
- Strong communication skills and the ability to work collaboratively in a team environment.
Preferred Qualifications:
- Experience with big data technologies (e.g., Hadoop, Spark).
- Knowledge of optimization techniques and tools (e.g., linear programming, mixed-integer programming).
- Experience with cloud platforms (e.g., AWS, GCP, Azure).