To be honest, I think this is somewhat asymmetric, and kind of implies that OpenAI are truer "Believers" than Microsoft.
If you believe in a hard takeoff, then ownership of assets post-AGI is pretty much meaningless; it does, however, protect Microsoft from an early declaration of AGI by OpenAI.
I love SAT solvers, but way more underappreciated by software engineers are MILP solvers.
MILPs (Mixed Integer Linear Programs) are basically sets of linear constraints, along with a linear objective function, where your variables can be either reals or ints.
Notably, you can easily encode any SAT problem as a MILP, but it's much easier to encode optimization problems, or problems with "count-y" constraints, as MILPs.
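For intuition, here's a minimal sketch of that encoding with PuLP (an open source Python modeling layer; the clause and variable names are just made up): a SAT clause like (x1 OR NOT x2 OR x3) becomes a linear constraint over binary variables.

from pulp import LpProblem, LpVariable, LpMinimize, lpSum

prob = LpProblem("sat_as_milp", LpMinimize)
x1 = LpVariable("x1", cat="Binary")
x2 = LpVariable("x2", cat="Binary")
x3 = LpVariable("x3", cat="Binary")

# A MILP wants an objective; for pure SAT any constant would do, but since it's
# a MILP you can just as easily minimize the number of variables set to true.
prob += lpSum([x1, x2, x3])

# Clause (x1 OR NOT x2 OR x3) becomes: x1 + (1 - x2) + x3 >= 1
prob += x1 + (1 - x2) + x3 >= 1

prob.solve()  # uses the bundled CBC solver by default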
Both are severely underused for sure. But it didn't help that for a long time open source MILP solvers were pretty mediocre.
HiGHS didn't exist, SCIP was "non-commercial", CBC was OK but its development has been struggling for a while, and GLPK was never even remotely close to the commercial offerings.
I think if something like Gurobi or Hexaly were open source, you'd see a lot more use, since both their capabilities and performance are way ahead of the open source solutions. But it was the commercial revenue that made them possible in the first place.
Using CP-SAT from Google OR-Tools as a fake MILP solver by scaling the real variables is pretty funny though, and it works unreasonably well (especially if the problem is highly combinatorial, since there's a SAT solver powering the whole thing).
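Roughly, the trick looks like this (a sketch only; the fixed scale factor and the toy model are my own choices, not anything OR-Tools prescribes):

from ortools.sat.python import cp_model

SCALE = 1000  # pretend reals with 3 decimal places of precision

model = cp_model.CpModel()
# x and y conceptually live in [0, 10] as reals; we actually model x*SCALE and y*SCALE.
x = model.NewIntVar(0, 10 * SCALE, "x_scaled")
y = model.NewIntVar(0, 10 * SCALE, "y_scaled")

# The constraint 2x + 3y <= 12 becomes 2*x_scaled + 3*y_scaled <= 12*SCALE.
model.Add(2 * x + 3 * y <= 12 * SCALE)
# Maximize x + 2y; the scaling cancels out in the argmax.
model.Maximize(x + 2 * y)

solver = cp_model.CpSolver()
if solver.Solve(model) == cp_model.OPTIMAL:
    print(solver.Value(x) / SCALE, solver.Value(y) / SCALE)

The obvious caveat is that you're fixing your numerical precision up front, which is exactly what a real MILP solver's floating point machinery spares you from.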
SCIP going Apache definitely improved the landscape, but Couenne (global MINLP), Bonmin (local MINLP), and IPOPT (local NLP, but e.g. [1] gets you MINLP) are solid and have been around for a long time. And anecdotally, I've seen a lot more issues with SCIP (presolvers and tolerances, mostly) than with other solvers. Still, it's replaced Couenne in my toolbox, and Minotaur has replaced Bonmin, but IPOPT remains king of its domain.
Many thanks to you and the parent comment for providing names to search when looking for implementations.
A basic question, before searching these: are they "input compatible"? I mean, can a problem be formulated once and then be solved by a variety of solvers? Or does each one of them use its own input language?
For MILP there isn't one single standard, but multiple competing solutions.
Nearly every solver supports the MPS format, but that's a really old format straight from the era of punchcards, and it sucks.
Many solvers support the nl format, which is a low level format spat out by the AMPL tool (commercial software for mathematical modeling).
Many solvers support the CPLEX lp format, which is a nice human readable and writable format.
Google OR-Tools includes an API for mathematical modeling that supports the relevant open source MIP solvers (plus Gurobi, I think) and its own CP solver. There are Python and Julia packages that try to do the same, although rather than calling the solver APIs directly they usually spit out a problem in nl format.
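To give a flavor of the OR-Tools route, here's a tiny sketch using its classic linear_solver wrapper (the toy model is made up, and "SCIP" can be swapped for whichever backend you have installed):

from ortools.linear_solver import pywraplp

solver = pywraplp.Solver.CreateSolver("SCIP")
x = solver.IntVar(0, 10, "x")
y = solver.NumVar(0.0, 10.0, "y")  # one continuous variable, so this is a proper MIP

solver.Add(3 * x + 2 * y <= 12)
solver.Maximize(x + 4 * y)

if solver.Solve() == pywraplp.Solver.OPTIMAL:
    print(x.solution_value(), y.solution_value())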
MiniZinc supports various open source MILP solvers plus various CP solvers. Very nice language, very high level.
For MINLP the only standard I know of is OSiL but support for it is spotty, mostly supported by open source tools I think.
This is a good list --- would also add Pyomo. There's plenty of nuance to algebraic modeling languages like Pyomo and JuMP, but at base you're just writing mathematical expressions in Python for Pyomo (or in Julia for JuMP) to parse and transform into the target format. E.g. taking the objective from the Weapon Target Assignment problem (https://en.wikipedia.org/wiki/Weapon_target_assignment_probl...):
from pyomo.environ import prod  # Pyomo's expression-level product, the analogue of sum()

def objective(model: WtaModel) -> SumExpression:
    # Expected surviving target value: for each target, its value times the product
    # of survival probabilities raised to the (binary) assignment variables.
    return sum(
        model.target_values[t_j]
        * prod(
            model.survival_rates[w_i][t_j]
            ** model.target_selected[w_i, t_j]
            for w_i in model.weapon_types
        )
        for t_j in model.targets
    )
Didn't know about Randomized rounding. Is there any solver with built-in support for that? To turn a strong NLP solver into a fast but approximate MINLP solver?
Not necessarily randomized rounding in particular, but many solvers use rounding methods internally, e.g. as part of a feasibility pump. Minotaur and SCIP definitely do this.
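Not a feasibility pump itself, but the naive "solve the relaxation, then round" step it builds on looks something like this (a sketch using scipy's HiGHS-backed LP solver; the toy problem is made up):

import numpy as np
from scipy.optimize import linprog

# maximize x0 + x1  s.t.  3*x0 + 2*x1 <= 7,  x in [0, 3]^2  (linprog minimizes, so negate)
c = [-1.0, -1.0]
A_ub = np.array([[3.0, 2.0]])
b_ub = np.array([7.0])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 3), (0, 3)], method="highs")

x_relaxed = res.x                 # fractional LP optimum
x_rounded = np.round(x_relaxed)   # naive rounding to integers
feasible = bool(np.all(A_ub @ x_rounded <= b_ub))

# A feasibility pump alternates between an LP-feasible point and a rounded integer
# point, re-solving with an objective that pulls them toward each other until they agree.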
SATs are cool but MILPs are cooler IMO.
Lol I've been trying to train a neural network over a finite field, not the reals and oh my god MILPs are God's gift to us.
If you get sick of MILPs, maybe you could use a representation of your finite field instead of the field itself? That way you could do everything in C^n, and preserve differentiability to use SGD or something like it.
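To make that concrete, here's a tiny sketch for a prime field Z/p using its one-dimensional representation k -> exp(2*pi*i*k/p) (the helper names and the choice of p are mine): addition mod p turns into multiplication of unit complex numbers, which is smooth in the angle and so plays nicely with gradient-based training.

import numpy as np

p = 7  # prime modulus of the field

def embed(k: int) -> complex:
    # Map an element of Z/p to a p-th root of unity on the unit circle.
    return np.exp(2j * np.pi * k / p)

def decode(z: complex) -> int:
    # Map a point near the unit circle back to the closest field element.
    return int(np.round(np.angle(z) * p / (2 * np.pi))) % p

a, b = 3, 6
assert decode(embed(a) * embed(b)) == (a + b) % p  # addition becomes multiplication

Caveat: this only captures the additive structure; field multiplication needs a more careful choice of representation.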
The hedonic treadmill really gets away from some people. I've had coworkers on 7 figures talk about how they couldn't possibly retire because the costs of living in (HCOL city) are far too high for that.
When you dig down into it, there's usually some insane luxury that they're completely unwilling to give up on.
If you're a software engineer in the United States, or in London, you can almost certainly FIRE.
Yup, it's insane to me. I am a software developer in Germany making 30k (after taxes) and I manage to save 600-700€ a month while still living really well (rural area, no car).
Absolutely not enough to retire early, but easily enough not to live paycheck to paycheck. Making 6 figures in the USA and not being able to afford life is baffling to me.
Add a family, and 100k after taxes in Munich is no big deal anymore. I could live alone in a car, but the kids might want their own rooms and their own beds.
100k is an unreachable dream for me unless I found a business myself and it actually succeeds. Munich is expensive; I've seen some of the prices there. I live near Denmark though, so Munich wouldn't be an option in the first place.
I could afford a house, considering my savings rate and current apartment rent. Not a big one, but it would be enough. I have no reason to buy one just for myself, though.
You can make over 100k if you freelance, but being a freelancer in Germany is risky in a lot of ways. Salaries in Berlin and Munich are approaching or exceeding 100k for leadership roles. The problem is that the cost of living in both cities is high, and in Berlin you basically can't get a flat anymore even if you can pay the rent on it.
Back in uni, we had to formulate a MIP to solve a flow problem; it was a nice and fun exercise.
Turns out you can save a lot of time and effort by just cutting infeasible solutions out one by one, and resuming the solver, as opposed to writing some tricky constraints.
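For anyone curious what that loop looks like in practice, here's a rough sketch with PuLP (the model and the "acceptable" side condition are made up; commercial solvers let you do the same thing more efficiently via lazy-constraint callbacks):

from pulp import LpProblem, LpVariable, LpMaximize, lpSum, value, PULP_CBC_CMD

n = 6
prob = LpProblem("toy", LpMaximize)
x = [LpVariable(f"x{i}", cat="Binary") for i in range(n)]
prob += lpSum((i + 1) * x[i] for i in range(n))  # objective
prob += lpSum(x) <= 3                            # an easy-to-write constraint

def acceptable(sol):
    # Stand-in for a condition that's awkward to express as linear constraints.
    chosen = {i for i in range(n) if sol[i] > 0.5}
    return 5 not in chosen or 0 in chosen

while True:
    prob.solve(PULP_CBC_CMD(msg=False))
    sol = [value(v) for v in x]
    if acceptable(sol):
        break
    # "No-good" cut: forbid exactly this assignment, then re-solve.
    ones = [i for i in range(n) if sol[i] > 0.5]
    zeros = [i for i in range(n) if sol[i] <= 0.5]
    prob += lpSum(x[i] for i in ones) - lpSum(x[i] for i in zeros) <= len(ones) - 1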
Yet those companies don't necessarily compete on performance comparisons; they compete for their own profit. If Nintendo makes a profit selling a device that runs a game at lower specs than Sony's, they're happy with it. Computing devices aren't driven by performance alone.
As someone who has a different last name from my child, I constantly encounter really weird issues where people assume that I don't exist, since "First name, child's last name" doesn't exist in the database.
Really? I am curious to hear an example, because I am struggling to imagine a situation where someone would try to find you by your kid's last name.
For me, it's mostly been in medical settings. I've had particular trouble with public health programs automatically signing me up under my child's last name.
I don't know if my local healthcare catchment just has their software set up wrong, but it's a continual annoyance.
Even the assumption that people have (something that can be used as) a last name is incorrect.
Currently I'm living in Indonesia, where a surprisingly large number of people have just one name (and when they do have several, they're more often than not completely arbitrary).
This was a very common practice up to the '90s. If you have a single name, they duplicate it in your passport, and you end up as "Soekarno Soekarno", which STILL raises eyebrows among ignorant airline employees in several western countries (and sometimes even immigration officers, though they're admittedly better educated about such issues).
Nowadays they proactively give at least two names to their children to match the western(-ized) system assumptions.
Sounds like a very unique experience :)