Let’s say you make a trip to the store, making sure to lock the door behind you on the way out. When you return and try to let yourself in, you discover that you lost your keys somewhere along the way. Round-trip, the whole distance traveled is longish for hunting down a pair of lost keys: say, 1 km. They could be anywhere!
How should you go about finding your keys? Should you spend the whole cold day and night slowly scouring your path? That sounds awful. But reality isn’t going to do you any favors: there’s no way your keys are more likely to be in one place along the way than another. So, for example, if the space within ten meters of your door accounts for 2% of the whole round trip (you pass through it twice, leaving and returning), the probability of finding your keys within that space must be exactly 2%, no more and no less. Right?
Nope. It turns out that reality wants to do you a favor. There’s a good place to look for your keys.
Intuition says that they are as likely to be in one place along the way as any other. And intuition is right for the special case that your keys were definitely very secure and very unlikely to have fallen out on that particular trip. But they probably weren’t. After all, if it was so unlikely, they shouldn’t have fallen out. So we can’t just consider the world where the very unlikely happened. We have to consider several possible worlds of two rough types:
* The worlds in which your keys were very secure, but the very unlikely happened and they fell out anyway.
* The worlds in which your keys, on that particular trip, were unusually loose and bound to fall out.
So those are the two types of possible world we’re in, and we don’t have to consider them equally. The mere fact that your keys fell out means it’s more likely that you’re in the second type of world, that they were bound to fall out. And if they were bound to fall out, then they probably fell out right away. Why? We can take those worlds and divide them again, into those where your keys were likely but not too too likely to fall out, and those in which your keys were not just very likely, but especially very likely to fall out. And so on. Of the worlds in which your keys were bound to fall out, the ones that are most likely are the ones in which they fell out right away.
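The possible-worlds argument can be sketched as a quick simulation. Everything named here is an illustrative assumption of mine, not something from Damian’s or Austin’s notes: each “world” draws its own looseness rate `lam` (expected losses per km) from an exponential prior, the keys drop after an `Exp(lam)`-distributed distance, and we keep only the worlds where they actually fell out during the 1 km trip.

```python
import random

# Sketch of the possible-worlds argument: each world has its own key
# looseness, lam (expected losses per km), drawn from an assumed prior.
# The keys drop after an Exp(lam)-distributed distance.  Conditioning on
# the drop happening during the 1 km trip, early stretches beat late ones.
random.seed(0)
TRIP_KM = 1.0
first, last, lost = 0, 0, 0
for _ in range(200_000):
    lam = random.expovariate(1.0)      # how loose the keys are in this world
    drop = random.expovariate(lam)     # where they would fall out
    if drop <= TRIP_KM:                # keep only worlds where they fell out
        lost += 1
        if drop <= 0.1 * TRIP_KM:      # first tenth of the walk
            first += 1
        elif drop >= 0.9 * TRIP_KM:    # last tenth of the walk
            last += 1
print(first / lost, last / lost)       # roughly 0.18 vs 0.05
```

If the keys were equally likely to be anywhere, each tenth of the walk would get 10% of the probability; conditioning on the loss shifts the mass toward the door.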
So there it is. If you lost your keys somewhere along a long stretch, you don’t have to search every bit of it equally, because they most likely fell out right around your doorstep. The probability of finding your keys within 10 meters of the door is greater than 2%, possibly much greater.
What is the probability exactly? If you’d had several keys to lose, we might be able to better estimate which of the millions of specific worlds we’re in. But even with just one key lost, the mere fact that it got lost means it was most likely to have gotten lost immediately.
Why is it magic?
If you know the likelihood of losing your keys, that makes them impossible to find. If you have no idea of the chances that they fell out, then they’re more than likely near the door. It’s your uncertainty about how you lost them that causes them to be easy to find. It’s as if the Universe is saying “Aww, here you go, you pitiful ignorant thing.”
Solving the puzzle, with and without data
So you can’t get the actual probability without estimates of how often this trick works. But even without hard data, we can still describe the general pattern. The math behind this is tractable: someone who knows how to prove things can show that the distribution of your key’s location over the length of the route follows an exponential distribution, not a uniform one, with most of the probability mass near the starting point and a smooth fall-off as you get farther away. The exponential distribution commonly describes the waiting time until an event that becomes ever more likely to have happened at least once as time goes by. Here is my physicist friend, “quantitative epistemologist” Damian Sowinski, explaining how it is that your uncertainty about the world causes the world to put your keys close to your door.
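To see the shape involved, here is a minimal check in my own notation (not Damian’s): if the loss rate `lam` were known and fixed, the key’s location, given that it was lost, is an exponential distribution truncated to the route, nearly uniform when `lam` is small and piled up near the door when `lam` is large.

```python
import math

# For a FIXED loss rate lam (losses per km), the key's location, given that
# it was lost on the trip, follows an exponential density truncated to the
# route: heaviest near the start, falling off smoothly with distance.
def p_within(x, lam, trip=1.0):
    """P(keys within first x km | keys lost on a `trip`-km walk)."""
    return (1 - math.exp(-lam * x)) / (1 - math.exp(-lam * trip))

for lam in (0.1, 1.0, 10.0):
    # chance the keys sit in the first 10 meters of a 1 km route
    print(lam, round(p_within(0.01, lam), 4))
```

Well-secured keys (`lam = 0.1`) give about 1.05%, barely above the uniform 1%; very loose keys (`lam = 10`) give about 9.5%. The big boost near the door comes from not knowing which of these regimes you are in.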
If you get in this situation and try this trick, write me whether it worked or not and I’ll keep a record that we can use to solve for lambda in Damian’s notes.
In the meantime, we do have one real-world data point. This all happened to me recently on my way to and from the gym. I was panicking until I realized that if they fell out at all, they probably fell out right away. And like magic, I checked around my starting point And There They Were. It’s an absolutely magical feeling when mere logic helps you solve a real problem in the real world. I’ve never been so happy to have lost my keys.
UPDATE: How strong is the effect?
All of the above tells us that there’s a better than 2% chance of finding your keys in the first 10 meters. But how much better than 2%? 20% or 2.001%? If the latter, then we’re really talking intellectual interest more than a pro-tip; even if the universe is doing you a favor, it’s not exactly bending over backwards for you. To tackle this, we have mathematician Austin Shapiro. Backing him up I can add that, on the occasion on which this trick worked for me, my keys were super super loose, just like he predicts. A takeaway is going to be that if this trick works for you, you did a very bad job of securing your keys.
I read your blog post, including Damian’s note. I have some things to add, but to clearly explain where they fit in, let me try to delineate two separate “chapters” in the solution to your key problem.
In chapter 1, we narrow our set of models for the location of the keys to the exponential distributions. Damian gives a good account of how this can be justified from first principles. But after doing this, we still have an infinite set of models, because an exponential distribution depends on a parameter λ (the expected rate of key losses per kilometer walked, which may be high if the keys are loose and hanging out of your pocket, or low if they are well secured).
In chapter 2, we use conditional probability to select among the possible values of λ, or, as you put it in your blog post, try to figure out which world we are in. This is the part that interests me, and it’s also the part that still needs mathematical fleshing-out. All Damian says about it is “So what is the value of λ? That’s a question for experiment — one must measure it.” But as you say, we’ve already done one experiment: you observed that your keys did fall out during a 1 km walk. This is enough to put a posterior distribution on λ, if we posit a prior distribution.

However… what does a neutral prior for λ look like? I don’t know any principled way to choose. A uniform distribution between 0 and some finite ceiling is unsuitable, since according to such a model, if you’re ever very likely to lose your keys, you’re usually pretty likely to lose your keys.

Assigning λ itself an exponential prior distribution, with rate a, seems murkily more realistic, so I tried that. If λ ∼ Exp(a), then, if I did my math right, your probability of having lost your keys in the first x km of your walk works out to x(a+1)/(a+x), which is approximately (1+1/a)x for small x. So in this case, Bayesian reasoning boosts the chances that you lost your keys in the first, say, 10 meters, by a factor of 1+1/a. Observe that for this effect to be large, a has to be pretty small… and the smaller a is, the higher your average propensity to lose your keys (the mean of the exponential distribution is 1/a). Thus, for example, to achieve the result that the universe is helping you find your keys to the tune of a factor of 5 — i.e., that your chance of having lost your keys in the first 10 meters is 5% instead of the “intuitive” 1% — you need to assume that, a priori, you’re so careless with your keys as to lose them 4 times per kilometer on an average trip. That prior seems just as implausible as the uniform prior.

I can think of one kind of prior that could lead to a strong finding that the universe wants to help you find your keys.
That would be a bimodal prior, with a high probability that λ is close to 0 (key chained to both nipple rings) and a small probability that λ is very large (key scotch-taped to beard), with nothing in between. But I can’t think of any reason to posit such a prior that isn’t transparently circular reasoning, motivated by the answer we’re trying to prove.

So… while all the exponential models definitely give you a better chance of finding your keys near the beginning of your route than near the end, I’m not convinced the effect size is all that strong; or, if it is (and you do have one magical experience to suggest it is), I’m not convinced that math is the reason!

Au.
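Austin’s closed form can be sanity-checked numerically. The sketch below is mine, not his: writing λ for the per-km loss rate and assuming an Exp(a) prior with rate a = 1/4 on λ (his factor-of-5 scenario), it integrates over the prior and compares the result with x(a+1)/(a+x) for the first 10 meters of a 1 km trip.

```python
import math

# Numerical sanity check of the closed form x*(a+1)/(a+x): the posterior
# probability that the keys fell in the first x km of a 1 km trip, assuming
# an exponential prior with rate a on the per-km loss rate lam.
def prob_first_x(x, a, n=200_000, hi=50.0):
    dlam = hi / n
    num = den = 0.0
    for i in range(n):
        lam = (i + 0.5) * dlam                  # midpoint rule over the prior
        w = a * math.exp(-a * lam) * dlam       # prior weight on this lam
        num += w * (1 - math.exp(-lam * x))     # keys lost within first x km
        den += w * (1 - math.exp(-lam))         # keys lost somewhere on trip
    return num / den

a = 0.25                 # prior mean loss rate 1/a = 4 losses per km
x = 0.01                 # first 10 meters of a 1 km walk
numeric = prob_first_x(x, a)
closed = x * (a + 1) / (a + x)
print(numeric, closed)   # both near 0.048: about 5x the "intuitive" 1%
```

The two numbers agree, which supports both the algebra and Austin’s conclusion: you only get the factor-of-5 boost by assuming keys loose enough to be lost 4 times per km on an average trip.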