Well, I should like to see some examples. So far, our tally of actual examples of this alleged phenomenon seems to still be zero. All the examples proffered thus far… aren’t.
From https://siderea.dreamwidth.org/1477942.html:

“These boots,” I said gesturing at what I was trying on, on my feet, “cost $200. Given that I typically buy a pair for $20 every year, that means these boots have to last 10 years to recoup the initial investment.”
That was on January 17, 2005. They died earlier this month – that is in the first week of December, 2018. So: almost but not quite 14 years.
So, purely as an investment, they returned a bit under $80, which is a 40% ROI.
But… if we’re talking about this just as an investment, we need to compare to other investments. Let’s say the S&P 500 returns 7% consistently (I think that’s pessimistic—note, not adjusting for inflation because the $20 boots haven’t changed with inflation either).
In one world, Siderea buys $200 boots and invests $80. After 14 years, she has $80⋅1.07^14 ≈ $206 and no boots.
In another world, Siderea buys $20 boots and invests $260. A year later she withdraws $20 and buys boots, and so on. After 14 years, she has… finite geometric series, 20(r^1 + … + r^14) with r = 1.07, I think she has $483 and no boots.
So if we think of this as a purely financial investment, I guess it was a bad one?
(This is also often missing when people talk about buying versus renting. Yes, the mortgage is often lower than rent, and house value is likely higher at the end, but you gave up investing your deposit. How do those effects compare? Probably depends on time and place.)
In another world, Siderea buys $20 boots and invests $260. …
I think that your calculation is a bit off. After a year, she’ll have $258.20 (i.e., ($260 * 1.07) − $20). After two years, $256.27 (i.e., ($258.20 * 1.07) − $20). And so on. After 14 years, she’ll have $219.41.
Still better than buying the expensive boots—in purely financial terms.
(Inflation-adjustment is another important point, of course. That $200 in 2005 would be $265 in 2018 dollars.)
In short, yes, this is indeed a very poor example—ironic, as it’s a real-life version of the original example!
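For anyone who wants to check the arithmetic, here is a minimal sketch in Python, assuming the flat 7% annual return and the end-of-year $20 withdrawals described above (the function names are mine, purely for illustration):

```python
def expensive_boots(years=14, rate=1.07, invested=80):
    """$200 boots bought up front; the leftover $80 just compounds."""
    return invested * rate ** years

def cheap_boots(years=14, rate=1.07, invested=260):
    """Start from $260; at the end of each of the 14 years the balance earns
    interest and $20 is withdrawn for boots, as in the comment above."""
    balance = invested
    for _ in range(years):
        balance = balance * rate - 20
    return balance

print(round(expensive_boots(), 2))  # 206.28, i.e. the "$206 and no boots" figure
print(round(cheap_boots(), 2))      # 219.41, matching the figure above
```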
(This is also often missing when people talk about buying versus renting. Yes, the mortgage is often lower than rent, and house value is likely higher at the end, but you gave up investing your deposit. How do those effects compare? Probably depends on time and place.)
This is precisely what made renting come out far ahead, in my aforementioned calculation. (And this is without even considering the time value of money.)
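Here is a toy version of that comparison, purely to show the shape of the calculation; every number in it (house price, deposit, mortgage rate, rent, appreciation, investment return) is a hypothetical placeholder, and it ignores maintenance, taxes, rent increases, and so on:

```python
def mortgage_payment(principal, rate, years):
    """Level annual payment on a fixed-rate loan amortized over `years`."""
    return principal * rate / (1 - (1 + rate) ** -years)

def buy_vs_rent(years=14, price=300_000, deposit=60_000, mortgage_rate=0.04,
                rent=14_000, appreciation=0.02, invest_return=0.07):
    # Buyer: puts down the deposit, pays the mortgage each year, and (with the
    # loan term equal to the horizon) ends up owning the house outright.
    payment = mortgage_payment(price - deposit, mortgage_rate, years)
    buyer_wealth = price * (1 + appreciation) ** years

    # Renter: invests the deposit, pays rent, and invests the difference
    # between the buyer's mortgage payment and the rent each year, so both
    # sides spend the same amount of cash.
    balance = deposit
    for _ in range(years):
        balance = balance * (1 + invest_return) + (payment - rent)
    renter_wealth = balance

    return round(buyer_wealth), round(renter_wealth)

print(buy_vs_rent())  # which side wins depends entirely on the inputs
```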
Huh, thanks for the correction.

Smaller correction—I think you’ve had her buy an extra pair of boots. At $260 she’s already bought one pair, so we apply x↦1.07x−20 thirteen times, then multiply by 1.07 again for the final year’s interest, and she ends with no boots, so that’s $239.41. (Or start with $280 and apply x↦1.07(x−20) fourteen times.)
Not sure why my own result is wrong. Part of it is that I forgot to subtract the money actually spent on boots—I did “the $20 she spends after the first year gets one year’s interest, so that’s $21.40; the $20 she spends after the second year gets two years’ interest, so that’s $22.90...” but actually it’s only $1.40, $2.90 and so on. But even accounting for that, I get $222.58. So let’s see...
Suppose she only needs to buy two pairs of boots. According to your method she goes $40 → $21.40 → $1.50. (Or: $40 and no boots → $20 and boots → $21.40 and no boots a year later → $1.40 and boots → $1.50 and no boots a year later.) According to mine, of her original $40, $20 earns no interest and $20 earns a year’s interest. But that assumes the interest she earns in that first year is withdrawn: she gets to keep it, but it doesn’t keep earning interest. So that’s why I got the wrong answer.
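Spelling out both formulations as a quick check, under the same flat-7% assumption (an illustrative sketch only):

```python
def corrected_a():
    # $260 left after the first pair; thirteen yearly cycles of interest
    # followed by a $20 pair, then one final year of interest with no purchase.
    balance = 260
    for _ in range(13):
        balance = balance * 1.07 - 20
    return balance * 1.07

def corrected_b():
    # Equivalently: start from the full $280, buy boots at the start of each
    # of the fourteen years, and let the remainder earn that year's interest.
    balance = 280
    for _ in range(14):
        balance = (balance - 20) * 1.07
    return balance

print(round(corrected_a(), 2), round(corrected_b(), 2))  # 239.41 239.41
```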
I think you’ve had her buy an extra pair of boots.
Ah, true. So, $239.41, at the end.
(Of course, this all assumes that the cheap boots don’t get more expensive over the course of 14 years. Siderea does say that she spends $20 each year on boots, but that’s hard to take seriously over a decade-plus period…)
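One way to see how much that matters is to rerun the corrected calculation with the cheap boots’ price growing at some hypothetical rate; the 2% a year below is purely illustrative:

```python
def cheap_boots_with_rising_prices(growth=0.02, years=14, rate=1.07):
    balance = 280 - 20           # the first pair is bought immediately at $20
    price = 20
    for _ in range(years - 1):   # a new pair at the end of years 1 through 13
        price *= 1 + growth
        balance = balance * rate - price
    return balance * rate        # year 14: interest only, no new pair

print(round(cheap_boots_with_rising_prices(), 2))     # about 183.11
print(round(cheap_boots_with_rising_prices(0.0), 2))  # 239.41, the flat-price case
```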