Integer Programming (1977) [pdf]
20 comments
September 4, 2025 · electric_muse
whatever1
The most useful feature in math programming is the optimality bound estimate it offers.
It is very important to know that, every single time, your solution is at worst (say) 1% away from the proven optimum. If it is not, you get a signal and can invoke mitigations.
This is where pure heuristics fail. They may work 99% of the time, giving you near-optimal solutions, but that 1% of the time they will fail you, and even worse, you will have no idea that they did.
Approximation algorithms also offer some bounds, but typically they are very loose, to the point that they are not useful to the bean counters. Nobody is running their fleet using (exclusively) the Christofides algorithm.
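The "optimality bound" idea above can be made concrete: a MIP solver compares the best feasible solution found so far (the incumbent) against a bound from a relaxation, and the relative difference is the optimality gap. A minimal sketch in Python, using a greedy fractional knapsack as a hypothetical stand-in for the relaxation bound and a greedy integer solution as the incumbent (real solvers compute both very differently):

```python
def fractional_bound(values, weights, capacity):
    # LP-style relaxation of 0/1 knapsack: fill by value density,
    # taking a fraction of the last item that fits. Always an upper bound.
    items = sorted(zip(values, weights), key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for v, w in items:
        if w <= capacity:
            capacity -= w
            total += v
        else:
            return total + v * capacity / w
    return total

def greedy_incumbent(values, weights, capacity):
    # A feasible (not necessarily optimal) integer solution: greedy by density.
    items = sorted(zip(values, weights), key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0
    for v, w in items:
        if w <= capacity:
            capacity -= w
            total += v
    return total

values, weights, cap = [10, 13, 7, 5], [3, 4, 2, 1], 4
ub = fractional_bound(values, weights, cap)   # proven upper bound
lb = greedy_incumbent(values, weights, cap)   # feasible solution value
gap = (ub - lb) / ub                          # relative optimality gap
```

Here the greedy solution is provably within `gap` (about 22%) of optimal, no matter what the true optimum is. That certificate is exactly what a pure heuristic cannot give you.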
hustwindmaple
There is some new research that applies transformers to MIPs. I'm looking forward to LLMs cracking MIPs at some point and beating those commercial solvers.
bheadmaster
I find the idea of a language model solving an integer programming problem odd. LLMs can barely do basic arithmetic. How do you imagine this happening?
mmaaz
Curious what you mean by this. Do you mean like an AlphaEvolve type thing?
abhishekbasu
I've always had the impression that mathematical programming, especially mixed-integer programming (MIP), is largely "unknown" outside of core engineering and operations research. It's an excellent framework for solving a whole host of problems that arise in business and elsewhere, which instead get solved with suboptimal (hah) heuristics.
Okay, maybe I was a bit harsh, but it definitely doesn't pop up as often as deep learning and statistical machine learning. For those who wish to get deeper into this, I highly recommend Optimization over Integers by Bertsimas and Weismantel.
CuriouslyC
Oh yeah, there are whole subfields of engineering that the current crop of AI deep learning engineers are mostly unfamiliar with. I've been able to find places where I can make significant advances on the state of the art in AI through incorporation of concepts from decision theory, control theory, process engineering, constraint optimization, etc.
taeric
The amusing ones, to me, are the people who know of the techniques but are convinced they can't apply to their problem.
Obviously not everything will be easy to map onto a classic optimization problem, and you may have a heuristic approach that is better for a given problem. But the general solvers out there have come a long, long way.
pedrosbmartins
First time I came across integer programming (and mathematical programming generally) was when studying hydroelectric power generation planning, for a masters I ended up not pursuing. Then, when selecting a masters in CS, I ended up working with an advisor who used mixed-integer programming applied to classic machine learning models (mainly optimal decision trees). A fascinating and widely applicable method, indeed!
timonoko
One thing I discovered on the 8080 was that 0FFFFH makes a good "infinity". Meaning it is better to produce this infinity than a zero-division error when the system is oscillating around zero. Otherwise you have to insert zero tests everywhere and waste precious clock cycles.
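For readers who haven't seen this trick: the idea is to saturate at the largest representable value instead of trapping on divide-by-zero, so callers never need a zero test. A hypothetical sketch in Python (the original would have been 8080 assembly):

```python
INF16 = 0xFFFF  # largest unsigned 16-bit value, treated as "infinity"

def sat_div16(num, den):
    # Saturating unsigned 16-bit divide: division by zero yields INF16,
    # and results that overflow 16 bits are clamped to INF16.
    if den == 0:
        return INF16
    return min(num // den, INF16)
```

Downstream code can then treat INF16 like any other large operand instead of handling an error path.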
eachro
Does anyone know what the state of the art industry solvers do for these problems? I had dabbled a bit in ml approaches to combinatorial optimization with great interest a few years back, but I don't think any of these rl based methods ended up being used in production.
mmaaz
The state of the art solvers are the proprietary ones like Gurobi, FICO, Cplex, Mosek, etc. A major contributor to the proprietary "sauce" is in the heuristics they use. For example, all solvers will have a "presolve" phase which attempts to eliminate redundant constraints/variables. There may be some ML they are using behind the scenes to derive these heuristics, I'm not sure, although I know it is a major research area.
Otherwise, the basic underlying algorithms are all the same, as in the textbook: branch-and-bound and so on.
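A minimal sketch of that textbook recipe, branch-and-bound with a fractional (LP-style) bound, on a toy 0/1 knapsack with positive weights. This is an illustration of the idea, not what the commercial solvers actually ship:

```python
def knapsack_bb(values, weights, capacity):
    # Branch-and-bound for 0/1 knapsack. Each node is bounded by the
    # greedy fractional relaxation; nodes whose bound cannot beat the
    # incumbent are pruned. Assumes all weights are positive.
    n = len(values)
    # Sort items by value density so the fractional bound is tight.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)

    def bound(i, value, room):
        # Relaxation bound: take items greedily, split the first that
        # doesn't fit. Always >= best completion of this node.
        for j in order[i:]:
            if weights[j] <= room:
                room -= weights[j]
                value += values[j]
            else:
                return value + values[j] * room / weights[j]
        return value

    best = 0
    def branch(i, value, room):
        nonlocal best
        if value > best:
            best = value  # new incumbent
        if i == n or bound(i, value, room) <= best:
            return  # prune: the relaxation cannot improve the incumbent
        j = order[i]
        if weights[j] <= room:                      # branch: take item j
            branch(i + 1, value + values[j], room - weights[j])
        branch(i + 1, value, room)                  # branch: skip item j

    branch(0, 0, capacity)
    return best
```

The presolve, cut generation, and heuristics the proprietary solvers layer on top all exist to shrink the tree this recursion would otherwise explore.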
__rito__
I know of only one such library, and it works great for toy problems: PuLP [0][1].
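For anyone who wants to try it, a minimal PuLP model for a toy 0/1 knapsack; this assumes PuLP's bundled CBC solver is available:

```python
import pulp

# Toy 0/1 knapsack: maximize value subject to a weight limit of 6.
values, weights = [10, 13, 7, 5], [3, 4, 2, 1]

prob = pulp.LpProblem("knapsack", pulp.LpMaximize)
x = [pulp.LpVariable(f"x{i}", cat="Binary") for i in range(len(values))]
prob += pulp.lpSum(v * xi for v, xi in zip(values, x))        # objective
prob += pulp.lpSum(w * xi for w, xi in zip(weights, x)) <= 6  # capacity

prob.solve(pulp.PULP_CBC_CMD(msg=False))
best = pulp.value(prob.objective)  # proven-optimal value
```

PuLP is really a modeling layer: the same model can be handed to CBC, GLPK, Gurobi, or CPLEX, which is where the "toy vs. industrial" distinction actually lives.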
whatever1
Or how to solve your 2^n problem in polynomial time (most of the time).
gcy
There is a textbook for this in case you don't know. https://link.springer.com/book/10.1007/978-3-319-11008-0
dgan
Do modern compilers (register allocation / instruction generation) involve some kind of integer programming or constraint solving? I vaguely remember some compilers using the Z3 solver.
ngruhn
Many tools use Z3 for type checking, e.g. F*, Liquid Haskell, and Dafny.
mxkopy
Nice, my undergrad thesis used this stuff. I truly believe that if P = NP, the proof will be a MILP solving HC or similar.
nimih
The copyright on this text seems to be 1977, not 2002. See: https://web.mit.edu/15.053/www/
dang
Ok, we'll put that year above instead. Thanks!
Integer programming is one of those things every CS undergrad hears about, then spends a career quietly rediscovering in different guises. In practice, integer constraints are where your “cute model” goes to die.
Branch-and-bound, cutting planes, implicit enumeration… they’re all ways of admitting “brute force, but try to be clever about it.”
That said, the irony is that half the problems that matter in the real world, like capital budgeting, scheduling, routing, and warehouse placement, are integer problems.
LP relaxations are nice, but you can’t run an airline with fractional pilots or build 2.3 hospitals. So you either lean on heuristics, or you call CPLEX/Gurobi and pray.
The fact that modern solvers still do as well as they do is, honestly, one of the underappreciated miracles of applied CS.