A 2024 lookback and 2025 preview of DecisionOps workflows

2024 was the year of the Python modeling experience, optimization integrations, and helping modelers find solutions even faster with parallel runs and interactive visualizations. In 2025, we look forward to combining ML & OR, more data integrations, and decision pipelines for smoother operations.

One thing is certain when looking back at 2024: the OR space is buzzing with new energy. From INFORMS presentation topics like democratizing decision optimization applications to the surge of knowledge sharing on social media, the drive to share experiences, innovate, and move OR into the limelight is bigger than ever. This past year, we connected with leaders and community members in the operations research space, including the developers from HiGHS, the creator of Seeker, the AMPL team, and many other wonderful folks at industry events and via virtual coffee chats. It was great to trade stories (and socks) and check out the latest optimization projects, including emerging opportunities with LLMs in OR, ML with OR, and native data integrations for OR.

We’ve always known that decision science teams are resourceful and resilient. We’ve seen them go to great lengths to create and maintain the tooling they need, sacrificing time they could otherwise spend on modeling and bringing value to the business. In the last year, we furthered our work to change that. What if OR operations tooling — what we call DecisionOps tooling — was as fully featured as the tooling for machine learning teams? How many more optimization projects would be launched? What if we could all advocate more easily for OR at every organization with prototypes that had a smooth path to production? Let’s find out together.

What we shipped in 2024

The team dug into making DecisionOps more accessible and powerful through features that enable safe collaboration, smarter analysis, and faster development. We recently presented our roadmap review and preview, which covers many of these features. We’ll dive into a few below.

A complete Python decision modeling experience

We built and shipped an end-to-end experience for building, deploying, testing, and managing Python decision models. What does that mean? You can start modeling in a notebook using a template, deploy and run your model remotely from the notebook, and manage the model in the Nextmv UI.

While building models internally and working with OR professionals in industry, we found that a lot of time is spent re-creating or re-building common model and testing components from scratch. Even models that serve very different purposes (e.g., routing vs. packing) have the same basic structure: consuming input from a source, consuming options or configuration, using solver technology to get a solution, and characterizing the solutions with statistics. Recreating that structure every time you build a model increases the chance of introducing errors. We wanted to make the process simpler and more repeatable, so we formalized a common pattern used to work with decision models – putting a structure around best practices for modeling in Python with helpful APIs.
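As a rough sketch, that shared structure can be expressed in plain Python. This is illustrative only, not the Nextmv SDK; the `Options`, `Output`, and `solve` names (and the toy assignment logic) are assumptions for the example:

```python
import json
from dataclasses import dataclass, field
from typing import Any


@dataclass
class Options:
    """Solver configuration; the fields here are illustrative."""
    duration_seconds: int = 30


@dataclass
class Output:
    """A solution plus the statistics that characterize it."""
    solution: dict[str, Any]
    statistics: dict[str, Any] = field(default_factory=dict)


def solve(input_data: dict[str, Any], options: Options) -> Output:
    """Placeholder decision logic: assign each task to its cheapest worker."""
    assignments = {
        task["id"]: min(task["costs"], key=task["costs"].get)
        for task in input_data["tasks"]
    }
    total_cost = sum(
        task["costs"][assignments[task["id"]]] for task in input_data["tasks"]
    )
    return Output(
        solution={"assignments": assignments},
        statistics={"total_cost": total_cost, "num_tasks": len(assignments)},
    )


def main() -> None:
    # Consume input from a source, apply options, solve, and emit the
    # solution together with its characterizing statistics.
    input_data = {"tasks": [{"id": "t1", "costs": {"w1": 3, "w2": 5}}]}
    output = solve(input_data, Options())
    print(json.dumps({"solution": output.solution, "statistics": output.statistics}))


if __name__ == "__main__":
    main()
```

The point of the pattern is that only `solve` changes between use cases; the input, options, and statistics plumbing stays the same.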

Take a look at our generic hello world Python template as well as our full list of decision apps that use the same structure for specific use cases like shift assignment.

Expanded integrations for popular solvers and modeling tools

We’re excited to offer more of your favorite modeling tools and solvers (e.g., Hexaly, AMPL, and Gurobi) as direct integrations so you can get the most out of the optimization tech you’re already using. As your operation scales in size and number of use cases, reach for the tools that work best for your business, all under one roof. See a full list of Nextmv integrations here.

Scenario testing updates for better analysis

Understand how your model performs in different scenarios with an interactive visualization of results, including custom statistics. Select from any of the KPIs you’ve exposed in your model and we’ll render them in a table that allows you to choose what you’d like to focus on when analyzing results. 

For example, in the results below, a heatmap is shown to highlight the magnitude of the differences between scenarios.
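To make concrete what such a comparison computes, here is a plain-Python sketch of per-KPI relative differences against a baseline scenario. The scenario names and KPIs are made up; this is not how the Nextmv console computes its visualization:

```python
def scenario_diffs(
    results: dict[str, dict[str, float]], baseline: str
) -> dict[str, dict[str, float]]:
    """Relative difference of each scenario's KPIs against a baseline
    scenario; these magnitudes are what a heatmap would shade."""
    base = results[baseline]
    return {
        name: {kpi: (value - base[kpi]) / base[kpi] for kpi, value in kpis.items()}
        for name, kpis in results.items()
        if name != baseline
    }


# Hypothetical scenario results keyed by scenario name.
results = {
    "base": {"cost": 100.0, "unassigned_stops": 4.0},
    "more_vehicles": {"cost": 115.0, "unassigned_stops": 1.0},
}
diffs = scenario_diffs(results, "base")
print(diffs)  # more_vehicles: cost up ~15%, unassigned_stops down ~75%
```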

Ensembling for finding the best run

Perform multiple runs in parallel and select the best run based on your predefined rules. Use ensembling to take into account randomness, test varying model configurations, or compare different models for a given input. You might want to see how your model performs under different conditions (e.g., solver, configuration) or simply pick the best run from multiple repetitions. The ensemble runs feature is one of our most exciting updates because it provides more confidence in the model and the decisions it’s making. 

Let’s take a closer look at what this looks like in practice. A standard run with Nextmv involves sending one input, creating one run from one instance of your executable, and returning one output.

Now with ensemble runs, you can send that one input in, but this time you provide an ensemble definition that says “run these N different run settings for this single input.”

For example, you can make one run using Gurobi, another with OR-Tools, and another with a greedy heuristic. The ensemble run will perform all those runs for the single input, collect the results, and decide which one to return to you based on your defined rules.
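A minimal sketch of the idea in plain Python, with stand-in solver callables running in parallel and a lowest-objective selection rule. None of this is the Nextmv ensemble API; the run names and toy objectives are assumptions:

```python
from concurrent.futures import ThreadPoolExecutor
from typing import Callable

Solver = Callable[[dict], float]


def ensemble_run(
    input_data: dict, runs: dict[str, Solver], better=min
) -> tuple[str, float]:
    """Execute every configured run on the same input in parallel and
    return the (name, objective) pair selected by the `better` rule."""
    with ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(solver, input_data) for name, solver in runs.items()}
        results = [(name, future.result()) for name, future in futures.items()]
    # Rule: lowest objective wins (minimization).
    return better(results, key=lambda r: r[1])


# Three stand-in "solvers" for the same input; in practice these would be
# e.g. a Gurobi run, an OR-Tools run, and a greedy heuristic.
runs = {
    "solver_a": lambda data: float(sum(data["costs"])),
    "solver_b": lambda data: sum(data["costs"]) * 1.05,
    "greedy": lambda data: sum(sorted(data["costs"])[:2]) * 2.0,
}
best = ensemble_run({"costs": [4, 2, 7]}, runs)
print(best)  # → ('greedy', 12.0)
```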

Interactive decision app dashboard

We shipped updates to the Nextmv UI (console) to provide more insight into the performance of your decision app over time. The new interactive visual on the app dashboard gives you a quick overview of your custom statistics and KPIs for historical runs – helpful for catching model drift. 

This is particularly helpful when point-to-point KPI comparisons over short periods show insignificant changes (within an acceptable range), but performance changes become significant over a longer period of time.
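One simple way to surface that effect is a rolling-mean check against a baseline window. The sketch below is illustrative, not the console’s actual drift logic; the window size and 5% threshold are assumptions:

```python
def detect_drift(
    values: list[float], window: int = 5, threshold: float = 0.05
) -> list[bool]:
    """Flag drift when the rolling mean departs from a baseline mean
    (taken from the first window) by more than `threshold`, relative."""
    baseline = sum(values[:window]) / window
    flags = []
    for i in range(window, len(values) + 1):
        rolling = sum(values[i - window : i]) / window
        flags.append(abs(rolling - baseline) / baseline > threshold)
    return flags


# A KPI (e.g. cost per order) that creeps upward across historical runs:
# each point-to-point change is small, but the rolling mean drifts.
history = [100, 101, 99, 100, 100, 103, 106, 109, 112, 115]
print(detect_drift(history))  # drift flagged only in the later windows
```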

Vehicle routing updates: route balancing and OSRM polylines

In addition to our third-party solver integrations, we offer a complete routing solution that’s available as a prebuilt app and is completely customizable. In recent updates to our routing app, we added route balancing as an out-of-the-box feature. Now you can configure this feature to balance workload across drivers. 

On the left, we see that the number of stops per driver ranges from 2 to 16, while on the right the balanced route solution gives each driver 10 stops.
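As a tiny illustration of the imbalance being corrected (this metric is for explanation only, not how the routing app implements balancing):

```python
def stop_spread(routes: dict[str, list[str]]) -> int:
    """Imbalance measure: stops on the busiest route minus the lightest."""
    counts = [len(stops) for stops in routes.values()]
    return max(counts) - min(counts)


# Hypothetical assignments mirroring the example: 16 vs. 2 stops before
# balancing, 10 vs. 10 after.
unbalanced = {"driver_1": [f"s{i}" for i in range(16)], "driver_2": ["s16", "s17"]}
balanced = {
    "driver_1": [f"s{i}" for i in range(10)],
    "driver_2": [f"s{i}" for i in range(10, 20)],
}
print(stop_spread(unbalanced), stop_spread(balanced))  # 14 0
```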

Another routing-based tool we offer is Nextplot, an open source route visualization tool. Nextplot now supports polyline creation using your own OSRM server so you can see what road-based routes look like on a map. With Nextplot and OSRM, you can quickly visualize routes and customize the output. 

Teams, roles, and permissions

We made it easier to work on models together in a shared space, whether you’re solving a routing problem for an enterprise logistics company or organizing an academic hackathon. Invite other users to collaborate and designate roles and permissions per user to ensure that each person has access to what they need. For instance, a developer can make changes to the app executable while an operator can run experiments without access to change the underlying executable code.

Looking ahead – 2025 

As we look at what’s on deck this year, we’re focusing on making the DecisionOps workflow easier to implement and manage. We’ll also bring ML and LLMs into the mix with integrations that improve decision modeling.

Decision pipelines for managing workflows

A core element of DecisionOps is creating and centrally managing repeatable processes that have traditionally been managed in different places. Decision pipelines will allow you to chain together and manage components like data processing scripts, forecasting models, optimization models, and visualization in one place. At each step in the pipeline, you’ll be able to see which steps succeeded or failed and which are still queued or in progress. This lays the foundation for further integrations, making Nextmv the central hub for your decision models.

A pipeline example (illustrated below): prepare the data, solve the model, enhance the results, and return the results.
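A minimal sketch of such a pipeline in plain Python, with per-step status tracking. The step names and toy transformations are hypothetical; this is not the Nextmv pipeline API:

```python
from typing import Any, Callable

Step = tuple[str, Callable[[dict], dict]]


def run_pipeline(data: dict, steps: list[Step]) -> tuple[dict, dict[str, str]]:
    """Run named steps in order, recording which succeeded or failed.
    Steps after a failure are never started (they stay "queued")."""
    status: dict[str, str] = {name: "queued" for name, _ in steps}
    for name, step in steps:
        try:
            data = step(data)
            status[name] = "succeeded"
        except Exception:
            status[name] = "failed"
            break
    return data, status


# Prepare the data, solve the model, then enhance the results.
steps: list[Step] = [
    ("prepare", lambda d: {**d, "demand": [max(0, x) for x in d["raw_demand"]]}),
    ("solve", lambda d: {**d, "plan": sorted(d["demand"], reverse=True)}),
    ("enhance", lambda d: {**d, "total": sum(d["plan"])}),
]
result, status = run_pipeline({"raw_demand": [3, -1, 5]}, steps)
print(status)  # every step succeeded
print(result["plan"], result["total"])  # [5, 3, 0] 8
```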

Decision pipelines are available now. Reach out to us to try them out.

A Snowflake data integration keeps your data where it is

Getting data into and out of your decision models will be even easier with our Snowflake data integration. Incorporate data steps directly into your decision pipeline so you can pull in data from Snowflake, perform data manipulation, pass the resulting data as input to an optimization model, and then pass the results of that model back to Snowflake. 

Rendering custom visualizations for operators and other stakeholders

In addition to data manipulation, users will be able to add custom visualizations to their decision pipelines. With every model run, your custom visualization will be rendered as part of that model container. That visualization may be routes on a map, containers packed, metrics plotted on a chart, or anything else your team builds. Incorporate any custom model visualization so that it lives in the context of your model in your Nextmv account. No more copy-paste from a script on your local machine to a SharePoint presentation just to share plots.

Acceptance testing updates for more control

Create acceptance tests that are based on a defined tolerance. Acceptance testing, which allows you to compare the output and statistics of two models (e.g., baseline and candidate), is often used during the QA process and in CI/CD workflows. When you’re updating your model, sometimes there are hard pass/fail lines (a certain metric must increase or decrease) and sometimes there’s an acceptable threshold for where that metric should land.

Below we can see that the candidate value fell outside the designated threshold and therefore did not pass the acceptance test.
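The tolerance rule itself is simple to state. Here is a sketch for a metric being minimized, such as cost; the function name and the 5% default are illustrative, not the product’s API:

```python
def passes_acceptance(
    baseline: float, candidate: float, tolerance: float = 0.05
) -> bool:
    """For a metric being minimized (e.g. cost), the candidate passes if
    it is no more than `tolerance` (relative) worse than the baseline."""
    return candidate <= baseline * (1 + tolerance)


print(passes_acceptance(100.0, 98.0))   # improvement: True
print(passes_acceptance(100.0, 104.0))  # within the 5% tolerance: True
print(passes_acceptance(100.0, 106.0))  # outside the tolerance: False
```

A maximized metric would flip the comparison; a hard pass/fail line is the special case `tolerance=0`.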

ML + OR connectors for smarter model management

Embed machine learning models into your optimization models while managing them separately. Incorporating predictive analytics into decision science isn’t new, but practitioners are now going beyond using static forecasts as data inputs and are embedding ML models directly into their OR models. However, managing and monitoring these models individually is nearly impossible with traditional approaches and can pose a challenge when different teams work on each model. Our ML + OR connectors will enable you to manage your ML models separately from your optimization models while connecting the tools and tech you need. 
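As a toy illustration of the pattern, here a naive forecast stands in for the separately managed ML model and a buffer rule stands in for the OR model. Both functions and their parameters are made up for the example:

```python
import math


def forecast_demand(history: list[float]) -> float:
    """Stand-in for a separately managed ML model: naive mean of the
    last three observations."""
    return sum(history[-3:]) / 3


def plan_order(forecast: float, safety_factor: float = 1.2) -> int:
    """Stand-in for the OR model: cover the forecast plus a buffer.
    A real model would solve e.g. a newsvendor or MIP formulation."""
    return math.ceil(forecast * safety_factor)


# The ML step feeds the OR step, but each can be versioned, tested,
# and monitored on its own.
order = plan_order(forecast_demand([90.0, 100.0, 110.0]))
print(order)  # 120
```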

LLM integration for decision model explainability

The boom in LLM development and availability makes interpreting optimization models and their output much simpler. Our LLM integration will allow users to directly ask questions about their model and its results. Interact with the LLM to better understand your model, generate inputs, decide when to apply different solver types, and more.

There’s more on the horizon

We’re eager to take on the coming year with product development that standardizes operational workflows, simplifies access to the latest optimization tech, and gets more projects launched safely into production. We’re eager to continue building a DecisionOps platform that lets the work of modelers and developers shine. And we’re eager to see more and more OR and decision science teams be successful with it.

There are great things to come and even greater optimization problems to solve. Join us! Get email updates to stay in the loop with all the latest product news and find out which events we’ll be attending throughout the year. Hope to see you at one soon!
