I’m a bit late to the party, but I’d like to mention Vimium. It’s a browser extension (Chrome, with a Firefox port) that adds vim-like hotkeys to every page.
Scroll with “hjkl”, search with “/”, jump to tabs and bookmarks with “T” or “B”. My favorite command is “f”, which puts a little box with letters next to everything clickable on the page. Type the letters and it clicks the element.
I’d estimate I use the mouse for browser navigation about 20-30% of the time. The activation energy for learning to use “f” in particular was very low, because it was almost immediately a better experience than using the mouse.
So, for optimizing a process with many variables (like tomato sauce), you estimate the direction in which each variable might improve the result and move a small amount in that direction, instead of exhaustively testing each variable independently? We know that approach actually works pretty well.
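The “estimate each variable’s direction of improvement, then take a small step” idea can be sketched as finite-difference gradient ascent. Everything here is invented for illustration: the `quality` function, the variable names, and the peak locations are all hypothetical stand-ins for “tasting the sauce.”

```python
# Hill-climbing on a made-up sauce-quality function by probing each
# variable with a tiny tweak and stepping in the direction that helped.

def quality(recipe):
    """Hypothetical tastiness score, peaking at salt=1.0, sugar=0.5."""
    salt, sugar = recipe
    return -(salt - 1.0) ** 2 - (sugar - 0.5) ** 2

def improve(recipe, step=0.1, probe=0.01, iters=50):
    recipe = list(recipe)
    for _ in range(iters):
        for i in range(len(recipe)):
            # Taste a slightly tweaked batch to estimate this variable's slope.
            tweaked = recipe.copy()
            tweaked[i] += probe
            slope = (quality(tweaked) - quality(recipe)) / probe
            # Move a small amount in the direction that improved the sauce.
            recipe[i] += step * slope
    return recipe

salt, sugar = improve([0.0, 0.0])
print(round(salt, 1), round(sugar, 1))  # ends up near the peak at 1.0, 0.5
```

Note that this only ever tastes two batches per variable per round, instead of Bob-style exhaustive testing of every combination.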
There are some things Alice does that a gradient descent optimizer doesn’t, though, and they might also be important: she recognizes which variables are likely to affect which features, and she adds a new variable (carrot) from a rather large search space.
I wonder if Alice is vulnerable to a local optimum trap—she might converge on pretty good tomato sauce that she can’t improve upon, while Bob exhaustively searches for (and might eventually find) a perfect tomato sauce. I agree with the point, though—if you try Bob’s strategy, you’ll be eating a lot of bad sauce in the process of exploring all possible ingredient combinations.
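The local-optimum worry is easy to demonstrate on a toy one-variable example (the quality function below is invented for the sake of the illustration): Alice’s hill-climbing stalls on the smaller bump it starts near, while Bob’s exhaustive grid search finds the taller peak, at the cost of tasting every candidate.

```python
import math

def quality(x):
    # Two bumps: a local peak near x=1 and a taller global peak near x=4.
    return math.exp(-(x - 1) ** 2) + 2 * math.exp(-(x - 4) ** 2)

# Alice: hill-climb from x=0 via small finite-difference steps.
alice_x, step, probe = 0.0, 0.1, 1e-4
for _ in range(200):
    slope = (quality(alice_x + probe) - quality(alice_x)) / probe
    alice_x += step * slope
# She converges on the nearby bump at x≈1 and stops improving.

# Bob: exhaustively taste 501 candidate recipes across [0, 5].
bob_x = max((i * 0.01 for i in range(501)), key=quality)
# He finds the global peak at x≈4 — after 501 tastings, many of them bad.

print(round(alice_x, 1), round(bob_x, 1))
```

Adding Alice-style restarts from a few different starting points is a common middle ground: far fewer tastings than Bob, much less chance of getting stuck on the first bump.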