Triple or nothing paradox

You are at a casino. You have $1. A table offers you a game: you have to bet all your money; a fair coin will be tossed; if it lands heads, you triple your money; if it lands tails, you lose everything.

In the first round, it is rational to take the bet, since its expected value is $1.50 (a 50% chance of $3 plus a 50% chance of $0), which is greater than the $1 you started with.

If you win the first round, you’ll have $3. In the next round, it is rational to take the bet again, since the expected value is $4.50, which is larger than $3.

If you win the second round, you’ll have $9. In the next round, it is rational to take the bet again, since the expected value is $13.50, which is larger than $9.

You get the idea. At every round, if you won the previous round, it is rational to take the next bet.
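
In general, if you arrive at a round with $x, betting it all has expected value 0.5 × $3x + 0.5 × $0 = $1.5x, which beats keeping the $x. A minimal sketch of that per-round calculation (the function name and the loop are just my illustration; the 0.5 and the 3× come from the game as stated):

```python
def expected_value_of_bet(bankroll, p_win=0.5, multiplier=3.0):
    """Expected dollar value of betting the whole bankroll once."""
    return p_win * (multiplier * bankroll) + (1 - p_win) * 0.0

# The per-round argument: the bet's EV always exceeds the current bankroll.
for bankroll in [1, 3, 9, 27]:
    print(bankroll, expected_value_of_bet(bankroll))  # 1.5, 4.5, 13.5, 40.5
```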

But if you follow this strategy, you are guaranteed (with probability 1) to eventually lose everything. You will go home with nothing. And that seems irrational.
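
To make “guaranteed” concrete: the chance of surviving n rounds is (1/2)^n, which goes to zero, even though the expected bankroll after n rounds is (3/2)^n. A quick simulation of the always-bet strategy, assuming the game exactly as stated (the round cap and trial count are arbitrary choices of mine):

```python
import random

def play_always_bet(max_rounds=200):
    """Follow the always-take-the-bet strategy until ruin (or a round cap)."""
    money = 1.0
    for _ in range(max_rounds):
        if random.random() < 0.5:  # heads: triple your money
            money *= 3.0
        else:                      # tails: lose everything
            return 0.0
    return money  # surviving all 200 rounds has probability 2**-200

trials = 100_000
ruined = sum(play_always_bet() == 0.0 for _ in range(trials))
print(f"went home with nothing in {ruined} of {trials} runs")
```

So the strategy that maximizes expected value at every step is also the strategy that almost surely bankrupts you, which is exactly what confuses me.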

Intuitively, it feels like the rational thing to do is to quit while you are ahead, but how do you get that prediction out of maximizing expected utility? Or does the above analysis only feel irrational because humans are loss-averse? Or is loss aversion somehow optimal here?
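
Here is one way I can make that last question concrete. The per-round argument above maximizes expected dollars, i.e. a utility that is linear in money; with a concave utility the same comparison can flip sign at every round. A sketch, where the square-root utility and the helper name are purely illustrative choices of mine (nothing in the setup forces them):

```python
import math

def expected_utility_of_bet(bankroll, utility, p_win=0.5, multiplier=3.0):
    """Expected utility of betting the whole bankroll once."""
    return p_win * utility(multiplier * bankroll) + (1 - p_win) * utility(0.0)

bankroll = 9.0
print(expected_utility_of_bet(bankroll, math.sqrt))  # about 2.6
print(math.sqrt(bankroll))                           # 3.0: the bet looks worse
# With log utility the losing branch is log(0) = minus infinity,
# so betting everything never looks good at all.
```

I am not claiming that this is the resolution, only that it seems to be where my quit-while-ahead intuition points.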

Anyway, please dissolve my confusion.