I don't have a specific one in mind, just the intuition that recursive processes can have a much sharper cutoff than a less differentially-minded intuition would suggest.
You can also look at it probabilistically, rather than differentially. Consider even just: "If the probability of a margin call causing another margin call is X, and a margin call occurs, how many margin calls will occur in a chain?" The number goes up sharply as you raise the probability toward one; it doesn't just smoothly increase. It's even worse once you add to the model that the probabilities are not independent, but that as more calls occur, the probability of the next one also increases. Especially with that non-independence added, what you'll see is a phase change: a surprisingly sharp transition between "a margin call doesn't usually cause another one" and "a never-ending cascade of margin calls occurs", rather than a smooth one.
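(A quick back-of-the-envelope check on the independent case, assuming each call triggers the next with a fixed probability p: the chain length is geometric, so the expected length is p/(1-p), which already blows up as p approaches 1.)

# Expected chain length when each margin call independently triggers another
# with probability p: p + p^2 + p^3 + ... = p / (1 - p).
def expectedChainLength(p):
    return p / (1 - p)

# expectedChainLength(0.9)   ->   9
# expectedChainLength(0.99)  ->  99
# expectedChainLength(0.999) -> 999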
(I may post a model of this. Someone may beat me to it, too. It's not that hard.)
I'm not saying this is an accurate model of the financial system, just the sort of thing I was going for:
import random

def withProb(p):
    # True with probability p.
    return random.random() < p

def avg100(f):
    # Average of 100 runs of f.
    return sum(f() for _ in range(100)) / 100

# Independent probabilities; look what happens as you get close to 1.
def marginCallChain(prob):
    total = 0
    while withProb(prob) and total < 100000:
        total += 1
    return total

# Dependent probabilities: each call slightly raises the odds of the next one.
def marginCallDependent(prob):
    total = 0
    odds = 1 / (1 - prob)  # implied odds that one call triggers another
    while withProb(prob) and total < 100000:
        total += 1
        odds *= 1.01  # a call occurred, so nudge the odds of the next one up
        # Tracking the odds directly avoids dividing by (1 - prob) once prob rounds up to 1.0.
        prob = 1 - 1 / odds
    return total
If you play with that using something like "avg100(lambda: marginCallChain(.95))", you can see that the chain starts extending a lot as you get close to 1. It's not a terribly sharp phase change, though.
"avg100(lambda: marginCallDependent(.95))" shows more interesting behavior. That only slightly raises the probability of the next margin call based on the fact that one occurred, and what you can see is that around .93-.95, you start seeing that every once in a while, the probability manages to occasionally work itself up to effectively 1 and the chain hits the upper limit I set. As you raise up towards one, you start to see it more and more often; .96 still sometimes manages to have a run of 100 without a crash, but even .965 the odds that one will occur in that run of 100 start to approach 1. There's a phase change between where 100 runs of the model have almost no probability of having a runaway to where the probability of at least 1 runaway is quite probable, and it's more sudden than a linear understanding of the process would suggest.
Again, this is not a model of the financial system; this is a simple model of the point I was making.
Okay. I see your point. I just tend to think of "phase change" as something different. But if the idea is that margin calls beget more margin calls, I can't disagree.