Suppose that you live in a society with arranged marriages and you've been elected the town matchmaker. You'd like to make everyone happy, so you request a list from each man and woman about who they would be willing to marry. Your job is then to find a set of pairings that maximizes the number of matches.

Okay, that's not too bad, it's a bipartite matching problem and can be solved with the Hopcroft-Karp algorithm.
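(For anyone following along: the core augmenting-path idea is short enough to sketch. This is the simple O(V·E) version often credited to Kuhn, not Hopcroft-Karp itself, which speeds things up to O(E·√V) by augmenting along many shortest paths at once. All the names below are illustrative.)

```python
def max_bipartite_matching(prefs):
    # prefs maps each woman to the list of men she is willing to marry.
    # Returns a maximum-size dict of woman -> man pairings.
    match_of = {}  # man -> woman currently paired with him

    def try_assign(woman, visited):
        for man in prefs[woman]:
            if man in visited:
                continue
            visited.add(man)
            # Take him if he is free, or if his current partner can re-match.
            if man not in match_of or try_assign(match_of[man], visited):
                match_of[man] = woman
                return True
        return False

    for woman in prefs:
        try_assign(woman, set())
    return {w: m for m, w in match_of.items()}
```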

Now suppose that it's also a polyandrous society, where every woman marries exactly two men. In addition to preferring certain men, each woman also wants a pair of men with complementary skills, but every woman has a slightly different opinion about what that entails. So, for example, we may have:

Alice is willing to marry (Martin and Nick) or (Martin and Otto) or (Nick and Paul).

Beatrice is willing to marry (Martin and Nick) or (Otto and Paul).

etc.

How do we determine the maximum matching? (in polynomial time)

I’ve had a similar problem in the back of my brain:

You have a number of village chores that need to be performed, the list of which fluctuates from week to week and contains repeats of frequently needed chores. You have a list of villagers and their preferences for each chore. How do you maximize the closeness of every villager’s to-do list with their preference list?

I assume that there are more chores than villagers and that every chore needs to be assigned, right?

Have every villager create a list ranking the chores from 1 to n, with their preferred chores near the top.

I’m assuming here that every chore is roughly equal. If one chore is 100 times worse than the others, then simply ranking them fails to capture that.

Compute m = ceil(chores / villagers).

Construct a graph as follows:

Each villager and each chore are nodes. There is a directed edge from each villager to each chore with a capacity of 1 and a cost equal to the rank the villager gave the chore.

There is a source node with an edge to each villager with a capacity of m and 0 cost (to ensure that everyone gets a roughly equal number of chores).

There is a sink node with an edge from each chore with a capacity of 1 and a 0 cost.

Now, run a min-cost max-flow algorithm. I couldn’t find a good online reference for this, but here’s the general idea:

Run a Maximum flow algorithm to get an initial assignment of chores so that every chore is assigned and the capacity constraints are not violated.

Then look for augmenting paths that reduce the total cost, while maintaining the same level of flow. When there are no more augmenting paths, you have an optimal solution.
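The whole construction fits in a few dozen lines. This is only a minimal sketch of the network described above, not production code: the villager names and rank tables are made up, and the flow routine pushes one unit along each cheapest augmenting path (Bellman-Ford on the residual graph, which tolerates the negative-cost reverse edges) rather than anything optimized.

```python
import math

def assign_chores(ranks, chores):
    # ranks[villager][chore] = rank the villager gave that chore (1 = best).
    villagers = list(ranks)
    V, C = len(villagers), len(chores)
    m = math.ceil(C / V)                  # per-villager quota
    S, T = 0, V + C + 1                   # source and sink node indices
    N = T + 1

    # Flat edge arrays; graph[u] holds edge indices. Indices (e, e^1) are
    # a forward edge and its residual reverse edge.
    graph = [[] for _ in range(N)]
    to, cap, cost = [], [], []

    def add_edge(u, v, c, w):
        graph[u].append(len(to)); to.append(v); cap.append(c); cost.append(w)
        graph[v].append(len(to)); to.append(u); cap.append(0); cost.append(-w)

    for i, v in enumerate(villagers):
        add_edge(S, 1 + i, m, 0)          # quota: at most m chores each
        for j, c in enumerate(chores):
            add_edge(1 + i, 1 + V + j, 1, ranks[v][c])
    for j in range(C):
        add_edge(1 + V + j, T, 1, 0)      # every chore assigned at most once

    while True:  # push one unit along the cheapest augmenting path
        dist, prev = [math.inf] * N, [-1] * N
        dist[S] = 0
        for _ in range(N - 1):            # Bellman-Ford on the residual graph
            for u in range(N):
                if dist[u] == math.inf:
                    continue
                for e in graph[u]:
                    if cap[e] > 0 and dist[u] + cost[e] < dist[to[e]]:
                        dist[to[e]] = dist[u] + cost[e]
                        prev[to[e]] = e
        if dist[T] == math.inf:
            break                         # no augmenting path left
        node = T
        while node != S:
            e = prev[node]
            cap[e] -= 1
            cap[e ^ 1] += 1
            node = to[e ^ 1]

    result = {v: [] for v in villagers}
    for i, v in enumerate(villagers):
        for e in graph[1 + i]:
            j = to[e] - 1 - V
            if 0 <= j < C and cap[e] == 0:  # saturated villager->chore edge
                result[v].append(chores[j])
    return result
```

With two villagers and two chores, each with opposite preferences, everyone gets their first choice, which is a quick sanity check on the construction.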

There are fewer chores than villagers (presently). Chores that are bad because they are time consuming are divided into manageable chunks. No chore is sufficiently onerous as to be universally despised (indeed, some villagers have expressed total indifference as to which chores they perform). One constraint I may have left unclear: No one wants to do the same chore every week. We gave everyone points to assign to various chores, creating a more organic ranking system, allowing me (for example) to list all the sweeping chores equally and all other chores equally beneath them in preference (since I love sweeping).

Speaking of algorithms, did you have that monte carlo reference I mentioned earlier?

You asked for good references. I only have bad references. Many of the advances in Monte Carlo techniques have been made by physicists looking for faster ways to compute complicated integrals. I go cross-eyed every time I look at them, trying to mentally translate their jargon into something more concrete like playing cards. I was fortunate enough to have a math professor suggest and explain Metropolis-Hastings to me when I was trying to sample peers from a peer-to-peer file-sharing network.
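For what it's worth, the core of Metropolis-Hastings really is just a few lines. This toy version (names and numbers are mine) samples from a discrete distribution known only up to a constant factor, using a uniform proposal; since that proposal is symmetric, the Hastings correction term cancels and only weight ratios remain.

```python
import random

def metropolis_hastings(weight, states, steps, seed=0):
    # Sample from the distribution proportional to weight(s) over `states`.
    # Only weight *ratios* are ever computed, which is the whole appeal
    # when the normalizing constant is expensive or unknown.
    rng = random.Random(seed)
    current = states[0]
    samples = []
    for _ in range(steps):
        proposal = rng.choice(states)  # uniform proposal (symmetric)
        # Accept with probability min(1, w(proposal) / w(current)).
        if rng.random() < weight(proposal) / weight(current):
            current = proposal
        samples.append(current)
    return samples

# A target with unnormalized weights 3:1 -- the chain should visit
# "hearts" roughly three times as often as "spades".
samples = metropolis_hastings(lambda s: {"hearts": 3, "spades": 1}[s],
                              ["hearts", "spades"], 20000)
```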

Nevertheless, here are some starting points:

Honestly though, a good university science library is probably more useful than the Internet.

On a related note, I noticed yesterday that if I assign each player in Poker Stove 20% (66+,A4s+,K8s+,Q9s+,J9s+,T9s,A9o+,KTo+,QTo+,JTo), Poker Stove is faster with 8 players (280,000 games/second) than with 7 players (62,000 games/second). Any idea why that might be? My best guess is that you have more than one algorithm in there and when I add the 8th player it switches over.

Thanks for the references. Monte Carlo is the red-headed stepchild in my code. I don’t really care about it, and it’s only provided for those who do. Still, I’d like something better than what I have now.

Which version of PokerStove are you using? In 1.23 I gradually shift from using one MC algorithm to another based on the number of collisions encountered.

1.23.

By the way, is there any way to run Poker Stove from the command line? I’d like to write some automated tests for Poker Crunch and it’d be great to be able to run Poker Stove and Poker Crunch programmatically and compare the results.

(This would undoubtedly unearth any lingering bugs in Poker Stove, too.)

I’m developing a new version right now. I’ll let you know when a usable CLI is ready.

The physics guys focus on parallel replication, so I’m not sure how useful their techniques will be for you. You can always browse the MILC code and see if anything there inspires you.

Since this is vaguely on-topic… do you have a link to a serious discussion of the computational complexity of a planned economy?

All I hear from opponents is “That problem is impossible!”

I’m given to understand there are no impossible problems in computer science… there are -expensive- problems in computer science, but not impossible ones…

> I’m given to understand there are no impossible problems in computer science… there are -expensive- problems in computer science, but not impossible ones…

There are, in fact, impossible problems.

For example, there’s the Halting problem.

Then there’s the result of Gödel’s Incompleteness Theorem: given a sufficiently non-trivial set of axioms, there exist theorems which are true but unprovable. A computer cannot find a proof because it’s impossible to do so.

Then there’s underspecified problems. A computer cannot answer “What is my favorite color?” without more information to work with.

Then, finally, there are the expensive problems. Some problems are so inherently intractable that while they are formally possible (an algorithm exists), they are practically impossible (i.e., the universe will end before the algorithm terminates or there are not enough particles in the universe to store all of the information that the algorithm requires). In some cases, there’s a proof that no faster algorithm exists. I think it is fair to call these problems impossible.

I’m not familiar with the planned economy problem, so for me personally it’s in the underspecified category. Do you have a mathematically rigorous definition of the problem?

Well, that is what I get for trying to be flip… I’ll try to frame the problem properly.

Given expressed (ranked) preferences for employment for which workers are qualified (including participation in education to meet future needs), known raw materials, and expressed preferences for material goods and services (in terms of survey information, past sales information from a market economy, etc), set prices for goods and services and wages to maximize the health and happiness of the citizens.

Like neuromancerzss said, finding the assignments that will maximize the goals might be impossible, but finding a good assignment of the variables is quite tractable.

… if you can provide a mathematically rigorous definition “health and happiness of the citizens” as a function of everything else, and get everyone to agree on the definition, that is.

This sounds fundamentally like the social welfare problem. Have you read Arrow?

It sounds like you are trying to construct a global social welfare function from local valuations. You can maximize it, but it’s not perfectly synthetic [that’s a term that I stole from language study] [someone is going to have their preferences disproportionately represented]. I guess the right way to say it is that the mapping is lossy.

No, I haven’t read that book, and I’m not trying to construct anything. I -am- trying to refute those who claim it is impossible.

I’m unclear why you feel someone is going to have their preferences disproportionately represented.

This isn’t a feel thing, it’s a know thing. You should REALLY read Arrow. Here’s a summary of his result. Good luck with the refutation: Arrow won the Nobel Prize for this work, so you have a tough row to hoe.

I don’t see where this applies. I said that they express their preferences and the computer algorithmically determines an optimum. There isn’t any voting… why do you feel Arrow applies?

‘Voting’ is a euphemism for the expression of preferences. If I said, “Here are 100 tokens, and a list of 12 things, assign relative weights to each based on their relative value to you,” that would be a voting mechanism. Mapping those individual rankings to a global set of preferences is the social welfare function. [And this, in some part, becomes a problem about the incomparability of personal utility functions; not that you can’t, or shouldn’t try, but it’s a challenge to do it]

I’m not trying to argue with you, so please don’t feel that way. Optima are available, and computable, but the point of Arrow is that there has to be some kind of sacrifice, either in the ways [the people] or the means [the mechanism design].

For example, if I said to a hundred people, “Rank these ten things in order of your preference,” and then I took the top three, then I wouldn’t care if the preference aggregation mechanism was broken for the bottom seven. I only care that it gets the top three completely and accurately. In principle, it is broken, and in practice it doesn’t matter. Expressing the does-or-doesn’t-matter-ness is an exercise in constraint definition.

Some problems are unsolvable, like the halting problem. Planning an economy doesn’t sound like any of those sorts of problems though. Especially if you’re looking for just “very good” rather than “absolutely perfect” efficiency.


By maximum here, you mean a matching which maximizes the number of arrangements. Do you care if there are multiple solutions, and if so which one is selected? Or do you want a list of all arrangements which satisfy the constraints?

> By maximum here, you mean a matching which maximizes the number of arrangements.

Yes.

> Do you care if there are multiple solutions, and if so which one is selected?

If there are multiple solutions, I don’t care which one is selected.

This feels like 2-SAT. Which means that deciding “all women can have some preference satisfied” or its negation is easy, but finding the maximum such arrangement in the case when not all women can be satisfied will be NP-Complete, just like MAX-2SAT.

Alternatively, perhaps everything looks like MAX-2SAT to me because of its recent occurrence in my dissertation.
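For the “easy” half: deciding whether a 2-CNF is satisfiable at all is linear-time via strongly connected components of the implication graph. Each clause (a ∨ b) becomes the implications ¬a → b and ¬b → a, and the formula is unsatisfiable exactly when some variable shares an SCC with its own negation. A sketch (the literal encoding and names are mine):

```python
def two_sat(num_vars, clauses):
    # Literals: i > 0 means "variable i is true", i < 0 means "variable -i
    # is false". Returns a satisfying assignment (list of bools) or None.
    N = 2 * num_vars
    node = lambda lit: 2 * (abs(lit) - 1) + (0 if lit > 0 else 1)

    graph = [[] for _ in range(N)]
    rgraph = [[] for _ in range(N)]
    for a, b in clauses:                 # (a or b) == (!a -> b) == (!b -> a)
        for u, v in ((node(a) ^ 1, node(b)), (node(b) ^ 1, node(a))):
            graph[u].append(v)
            rgraph[v].append(u)

    # Kosaraju, pass 1: order nodes by DFS finish time (iteratively).
    order, seen = [], [False] * N
    for s in range(N):
        if seen[s]:
            continue
        seen[s] = True
        stack = [(s, iter(graph[s]))]
        while stack:
            u, it = stack[-1]
            for v in it:
                if not seen[v]:
                    seen[v] = True
                    stack.append((v, iter(graph[v])))
                    break
            else:
                order.append(u)
                stack.pop()

    # Pass 2: label SCCs on the reverse graph, in topological order.
    comp, label = [-1] * N, 0
    for s in reversed(order):
        if comp[s] != -1:
            continue
        comp[s] = label
        stack = [s]
        while stack:
            for v in rgraph[stack.pop()]:
                if comp[v] == -1:
                    comp[v] = label
                    stack.append(v)
        label += 1

    if any(comp[2 * v] == comp[2 * v + 1] for v in range(num_vars)):
        return None                      # x and !x forced together: unsatisfiable
    # x is true iff !x's component precedes x's in topological order.
    return [comp[2 * v] > comp[2 * v + 1] for v in range(num_vars)]
```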

> This feels like 2-SAT. Which means that deciding “all women can have some preference satisfied” or its negation is easy, but finding the maximum such arrangement in the case when not all women can be satisfied will be NP-Complete, just like MAX-2SAT.

I would be delighted to have a polynomial algorithm that determines whether all women can be satisfied.

Do you see a way to translate the problem into a 2-CNF?

Well, we begin by setting up implications.

AMN -> !(AMO)

AMN -> !(ANP)

AMN -> !(BMN)

...

Then we include the singleton clauses A, B, C, etc…

Thus, we require that every girl gets married, and we also require that… hmm. Y’know what? This isn’t working. I’ve got it formulated as a 2-SAT, but I can’t figure out a way for “nobody gets married” to fail to satisfy the 2-SAT.

Would this work? I’m not a 2-SAT expert.

AMN -> A

AMO -> A

BMN -> B

!A -> !(AMN)

!A -> !(AMO)

!B -> !(BMN)

Hmm, no, you’re right. I can’t find a way either. Shoot.