Just a LP, really?

At recent scientific conferences, I have seen people dismiss pure linear programming (LP) models as easy and primitive. I was often told, “oh, that’s just a LP”. Linear programming gets put in the box of solved problems, alongside network problems such as the (pure) shortest path or max flow. In fact, I think quite the opposite is true. For many reasons, LP is still an important area of research and deserves much interest.

For one, solving LPs very efficiently is crucial for solving mixed-integer linear programming (MILP) models. Solver developers have found many clever ways of tackling difficult MILPs while limiting the number of nodes that need to be explored, but branching and solving an LP relaxation at each node is still very much needed. My main body of research is in supply chain planning, where models tend to become very large very quickly.
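To see why those LP solves add up, here is a minimal sketch, in Python with SciPy (my own toy illustration, not any solver's actual implementation), of LP-based branch and bound on a tiny made-up maximization MILP. Every node processed in the loop is one LP relaxation solve; the data, tolerance, and branching rule are all invented for the example.

```python
# A minimal LP-based branch-and-bound sketch; all data are made up.
import math
from scipy.optimize import linprog

c = [-5.0, -4.0]                     # maximize 5x + 4y -> minimize -5x - 4y
A_ub = [[6.0, 4.0], [1.0, 2.0]]
b_ub = [24.0, 6.0]

best_val, best_sol = -math.inf, None
stack = [[(0, None), (0, None)]]     # one node = per-variable (lb, ub) bounds

while stack:
    bounds = stack.pop()
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    if not res.success or -res.fun <= best_val:
        continue                     # node infeasible or pruned by bound
    frac = [i for i, v in enumerate(res.x) if abs(v - round(v)) > 1e-6]
    if not frac:                     # integral solution: new incumbent
        best_val, best_sol = -res.fun, res.x
        continue
    i = frac[0]                      # branch on the first fractional variable
    lb, ub = bounds[i]
    down, up = list(bounds), list(bounds)
    down[i] = (lb, math.floor(res.x[i]))
    up[i] = (math.ceil(res.x[i]), ub)
    stack += [down, up]

print("optimal value:", best_val, "at", best_sol)
```

Even on this two-variable toy, several LP relaxations get solved; on an industrial MILP the node count can run into the millions, so every millisecond per LP counts.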

Second, LP in itself is very appropriate for modelling several real-world planning or control problems. For instance, the Quebec province’s office of the chief forester needs to solve several LPs to determine the maximum annual volume of wood that can be cut in each of the province’s 70+ management areas. Some of these models are quite large (say, 5 million variables and 1 million constraints), so the amount of sensitivity analysis the office can perform is limited if each LP takes several hours to solve.
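As a toy illustration of the kind of analysis involved (the real models are nothing this small, and every number below is invented), here is a tiny harvest LP solved with SciPy's HiGHS interface. The duals on the binding constraints are what drive the sensitivity analysis mentioned above.

```python
# A toy harvest LP (all numbers invented). Variable x[z]: hectares
# harvested in zone z.
import numpy as np
from scipy.optimize import linprog

yield_m3  = np.array([120.0, 95.0, 140.0])   # m^3 of wood per hectare
area_ha   = np.array([500.0, 800.0, 300.0])  # hectares available per zone
cost_kd   = np.array([1.0, 0.8, 1.3])        # harvest cost (k$) per hectare
budget_kd = 1000.0                           # total harvest budget (k$)

# Maximize total volume -> minimize its negative.
c = -yield_m3
A_ub = np.vstack([np.eye(3), cost_kd])       # x <= area, cost.x <= budget
b_ub = np.append(area_ha, budget_kd)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3, method="highs")
print("max volume (m^3):", -res.fun)
# Shadow prices on the constraints answer questions like "what is one
# more hectare, or one more budget dollar, worth?" (signs follow
# SciPy's convention for inequality duals).
print("duals:", res.ineqlin.marginals)
```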

For these reasons, I believe research efforts on better ways to solve LPs must continue.

OR in the field: two challenges

This post presents some thoughts about doing applied operations research together with companies. It’s a lot of work, to be sure, but it’s also a lot of fun. Some of my colleagues tease me about not often being at my office. While it’s true that I split my office time between the research center and the business school, I also spend a lot of time with logistics managers and planners in forest companies, as well as with government people. Most of my projects involve operations research in one way or another.

First, understand the problem

The first challenge is understanding the problem as seen from the user’s perspective. I have yet to meet someone whose problem is a straight-out-of-the-box CVRP. Logistics problems, in particular, are often interconnected, so inventory or production planning decisions often interact with other decisions such as transportation or sales. As I mentioned in an earlier post, many users aren’t interested in a logistics plan that aims solely at minimizing direct cost. This is especially true in tactical and strategic planning, where I do most of my work. Beyond cost minimization, what the user really wants from tactical planning is often a low-cost solution that offers a “good” compromise between a number of other factors.

Second, build the right model

Once I have a good idea of what is important in the problem, I usually try to build the simplest model that could possibly work. I do this for two reasons. First, the distribution of time needed to debug a complex model tends to be heavy-tailed; this is much less the case for simple models. Second, the flaws and shortcomings of simple models are easier to understand, explain and correct than those of bigger models. And yes, bigger models have flaws too. They are just trickier to find and fix.

Quite often, a simple model is the right first model to build. Then add (or remove) complexity and features one at a time, as in the sketch below.
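Here is a minimal sketch of that workflow in Python with PuLP (the library choice, the data, and the backlog penalty are all invented for illustration): a bare transportation LP first, then a single optional feature switched on only once the base model behaves.

```python
# A "simplest model first" sketch using PuLP; all data are made up.
from pulp import LpMinimize, LpProblem, LpVariable, lpSum, value

plants, depots = ["p1", "p2"], ["d1", "d2", "d3"]
cost = {("p1", "d1"): 4, ("p1", "d2"): 6, ("p1", "d3"): 9,
        ("p2", "d1"): 5, ("p2", "d2"): 4, ("p2", "d3"): 7}
supply = {"p1": 80, "p2": 60}
demand = {"d1": 40, "d2": 50, "d3": 30}

def build_model(allow_backlog=False):
    prob = LpProblem("ship", LpMinimize)
    x = {(p, d): LpVariable(f"x_{p}_{d}", lowBound=0)
         for p in plants for d in depots}
    # The one feature added after the base model is validated:
    # unmet demand allowed at a (made-up) penalty of 20 per unit.
    u = {d: LpVariable(f"u_{d}", lowBound=0) for d in depots} if allow_backlog else {}
    prob += (lpSum(cost[p, d] * x[p, d] for p in plants for d in depots)
             + lpSum(20 * u[d] for d in u))                   # objective
    for d in depots:
        prob += lpSum(x[p, d] for p in plants) + (u[d] if allow_backlog else 0) >= demand[d]
    for p in plants:
        prob += lpSum(x[p, d] for d in depots) <= supply[p]
    return prob

base = build_model()
base.solve()
print("base cost:", value(base.objective))
```

The point of the `allow_backlog` switch is that every added feature can be turned off again: when the richer model misbehaves, comparing it against the validated base model immediately isolates the culprit.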

When optimal is not enough

Currently I work with Professor Mikael Rönnqvist from Université Laval on an inventory and transportation management problem from a forest products company. Because it is a large-scale commodity products company, much of its focus is on reducing logistics … [Continue reading]

Exam design problem

For students, the final exam week is probably the most stressful time of a semester. They often have multiple exams over consecutive days and need to split their last-minute study and preparation time over these topics. I must confess that I also … [Continue reading]

The day I put my hand into the shredder

...or so to speak. Recently, I've been working (in collaboration with two people) on a production planning model for a network of sawmills. It is not yet customized for a particular company. We came to this particular model after a few meetings, and … [Continue reading]

Performance variability: consequences


If you work on difficult mixed-integer programming (MIP) models, you have experienced performance variability. This refers to the impact on performance of seemingly performance-neutral changes to the model or the solver. Part 1 of this series … [Continue reading]

Performance variability: good readings

If you work on difficult mixed-integer programming (MIP) models, you have experienced performance variability. This refers to the impact on performance of seemingly performance-neutral changes to the model or the solver. Part 1 of this series … [Continue reading]

Performance variability defined

These days, it seems that one of the simplest and most effective ways to tackle difficult mixed-integer programming (MIP) models is to run the model on a state-of-the-art solver with different random seeds. It is very easy to implement and often … [Continue reading]
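A minimal sketch of such a seed sweep with gurobipy (the model file name and the number of seeds are placeholders, and the snippet assumes each run solves to optimality):

```python
# A seed-sweep sketch using gurobipy; "model.mps" and range(5) are
# placeholders for illustration.
import gurobipy as gp

base = gp.read("model.mps")
for seed in range(5):
    m = base.copy()
    m.setParam("Seed", seed)   # Gurobi's random-seed parameter
    m.optimize()
    print(f"seed {seed}: {m.Runtime:.1f}s, obj {m.ObjVal:.2f}")
```

The spread of runtimes across seeds is itself a useful measurement: it tells you how much of any speedup you observe after a model change is real and how much is just variability.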

In-house testing of CPLEX distributed MIP

Distributed optimization is now part of standard CPLEX releases. In fact, it's been there since release 12.5. I decided to do some experiments and post on this after seeing Daniel Junglas' talk at the German OR Conference in September, but writing … [Continue reading]

In-house testing of the Gurobi 6 distributed MIP [dual]

Distributed optimization is now part of the Gurobi 6.0 release. In today's post, I talk about some tests I ran with the distributed MIP algorithm. The associated primal post discusses results presented by Gurobi at the INFORMS 2014 Annual conference. … [Continue reading]
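For readers who want to reproduce this kind of test, here is a rough sketch of launching a distributed MIP solve from gurobipy. The host names, port, job count, and model file are placeholders; each worker machine must be running Gurobi's distributed worker service, and the exact setup has changed across Gurobi versions, so check the documentation for yours.

```python
# A distributed MIP launch sketch; hosts and model file are placeholders.
import gurobipy as gp

m = gp.read("model.mps")
m.setParam("WorkerPool", "host1:61000,host2:61000")  # placeholder workers
m.setParam("DistributedMIPJobs", 2)  # split the MIP solve over two machines
m.optimize()
print("runtime:", m.Runtime)
```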