Iteration Fixed Point Exercises

This is the third of three sets of fixed point exercises. The first post in this sequence is here, giving context.

Note: Questions 1-5 form a coherent sequence and questions 6-10 form a separate coherent sequence. You can jump between the sequences.

  1. Let $(X,d)$ be a complete metric space. A function $f:X\to X$ is called a contraction if there exists a $q<1$ such that for all $x,y\in X$, $d(f(x),f(y))\leq q\cdot d(x,y)$. Show that if $f$ is a contraction, then for any $x\in X$, the sequence $(f^n(x))_{n\in\mathbb{N}}$ converges. Show further that it converges exponentially quickly (i.e. the distance between the $n$th term and the limit point is bounded above by $c\cdot q^n$ for some constant $c$). (A numerical sketch of this iteration appears after the list.)

  2. (Banach contraction mapping theorem) Show that if $X$ is a complete metric space and $f:X\to X$ is a contraction, then $f$ has a unique fixed point.

  3. If we only require that $d(f(x),f(y))<d(x,y)$ for all $x\neq y$, then we say $f$ is a weak contraction. Find a complete metric space $X$ and a weak contraction $f:X\to X$ with no fixed points.

  4. A function $f:\mathbb{R}^n\to\mathbb{R}$ is convex if $f(tx+(1-t)y)\leq t\,f(x)+(1-t)\,f(y)$ for all $x,y\in\mathbb{R}^n$ and $t\in[0,1]$. A function is strongly convex if you can subtract a positive paraboloid from it and it is still convex (i.e. $f$ is strongly convex if $x\mapsto f(x)-\varepsilon\|x\|^2$ is convex for some $\varepsilon>0$). Let $f$ be a strongly convex smooth function from $\mathbb{R}^n$ to $\mathbb{R}$, and suppose that the magnitude of the second derivative $\nabla^2 f$ is bounded. Show that there exists an $\varepsilon>0$ such that the function $g:\mathbb{R}^n\to\mathbb{R}^n$ given by $g(x)=x-\varepsilon\nabla f(x)$ is a contraction. Conclude that gradient descent with a sufficiently small constant step size converges exponentially quickly on a strongly convex smooth function. (The first sketch after the list illustrates this in one dimension.)

  5. A finite stationary Markov chain is a finite set $S$ of states, along with a probabilistic rule $A:S\to\Delta S$ for transitioning between the states, where $\Delta S$ represents the space of probability distributions on $S$. Note that the transition rule has no memory, and depends only on the previous state. If for any pair of states $s,t\in S$, the probability of passing from $s$ to $t$ in one step is positive, then the Markov chain is ergodic. Given an ergodic finite stationary Markov chain, use the Banach contraction mapping theorem to show that there is a unique distribution over states which is fixed under application of the transition rule. Show that, starting from any state $s$, the limit distribution $\lim_{n\to\infty}A^n(s)$ exists and is equal to the stationary distribution. (A small worked example appears after the list.)

  6. A function $f$ from a partially ordered set to another partially ordered set is called monotonic if $x\leq y$ implies that $f(x)\leq f(y)$. Given a partially ordered set $P$ with finitely many elements, and a monotonic function $f$ from $P$ to itself, show that if $f(x)\geq x$ or $f(x)\leq x$, then $f^n(x)$ is a fixed point of $f$ for all $n\geq |P|$. (The last sketch after the list checks this on a small example.)

  7. A complete lattice is a partially ordered set in which each subset of elements has a least upper bound and a greatest lower bound. Under the same hypotheses as the previous exercise, but with the finite poset replaced by a complete lattice $L$, extend the notion of $f^n$ for natural numbers $n$ to $f^\alpha$ for ordinals $\alpha$, and show that $f^\alpha(x)$ is a fixed point of $f$ for all $x$ with $f(x)\geq x$ or $f(x)\leq x$ and all $\alpha>|L|$ (here $\alpha\leq |L|$ means there is an injection from $\alpha$ to $L$, and $\alpha>|L|$ means there is no such injection).

  8. (Knaster-Tarski fixed point theorem) Show that the fixed points of a monotonic function on a complete lattice themselves form a complete lattice. (Note that since the empty set is always a subset, and must therefore have a least upper bound, a complete lattice must be nonempty.)

  9. Show that for any set $A$, the power set $\mathcal{P}(A)$ ordered by inclusion forms a complete lattice, and that any injective function from $A$ to $B$ defines a monotonic function from $\mathcal{P}(A)$ to $\mathcal{P}(B)$. Given injections $f:A\to B$ and $g:B\to A$, construct a subset $X$ of $A$ and a subset $Y$ of $B$ such that $f(X)=Y$ and $g(B\setminus Y)=A\setminus X$.

  10. (Cantor–Schröder–Bernstein theorem) Given sets $A$ and $B$, show that if $A\leq B$ and $B\leq A$, then $A\cong B$ ($A\leq B$ means there is an injection from $A$ to $B$, and $A\cong B$ means there is a bijection between $A$ and $B$).
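
Since questions 1 and 4 make claims about concrete rates of convergence, here is a short numerical sketch of both. It is illustration rather than solution: the functions $\cos$ and $(x-2)^2$ are hypothetical examples chosen for this demo, and watching the error shrink geometrically is exactly the behaviour exercises 1 and 2 ask you to prove in general.

```python
import math

def iterate_map(f, x0, steps):
    """Iterate x_{n+1} = f(x_n) and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(f(xs[-1]))
    return xs

def report(xs):
    limit = xs[-1]  # after many steps, a stand-in for the true fixed point
    for n in range(0, 25, 5):
        print(f"  n = {n:2d}   |x_n - x*| = {abs(xs[n] - limit):.3e}")

# Exercise 1: cos is a contraction on [0, 1], since |sin(x)| <= sin(1) < 1
# there and cos maps [0, 1] into itself, so the iterates converge
# exponentially to its unique fixed point (exercise 2).
print("x_{n+1} = cos(x_n):")
report(iterate_map(math.cos, 1.0, 100))

# Exercise 4 in one dimension: for f(x) = (x - 2)^2, the gradient step
# g(x) = x - eps * f'(x) = x - 2 * eps * (x - 2) is a contraction with
# factor |1 - 2 * eps| < 1 whenever 0 < eps < 1, and its unique fixed
# point is the minimizer x = 2.
print("gradient step on f(x) = (x - 2)^2 with eps = 0.3:")
report(iterate_map(lambda x: x - 0.6 * (x - 2), 5.0, 100))
```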

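For question 5, here is a minimal worked example with a made-up 3-state chain (the entries of the matrix `A` are arbitrary positive probabilities, so the chain is ergodic in the sense of the exercise). The script applies the transition rule repeatedly and checks that the result is fixed; proving that this always happens is the content of the exercise.

```python
import numpy as np

# A[i, j] is the probability of stepping from state i to state j.
# Every entry is positive, so the chain is ergodic.
A = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])

# Start from a point mass on state 0 and apply the transition rule
# repeatedly; the exercise asks you to show (via Banach) that this
# converges to the unique stationary distribution pi with pi A = pi.
dist = np.array([1.0, 0.0, 0.0])
for _ in range(100):
    dist = dist @ A

print("limit distribution:", np.round(dist, 6))
print("fixed under the transition rule:", np.allclose(dist @ A, dist))
```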

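For question 6, the sketch below checks the claim on a small example. The finite poset is the lattice of subsets of $\{0,1,2\}$ under inclusion (so $|P|=8$), and the monotonic map, which adds $1$ to any set containing $0$, is a hypothetical choice made purely for illustration; since $X \subseteq f(X)$ for every $X$, the exercise predicts that $f^{|P|}(X)$ is always a fixed point.

```python
from itertools import chain, combinations

ground = frozenset({0, 1, 2})
# P: all subsets of {0, 1, 2}, i.e. the elements of the poset.
P = [frozenset(s) for s in chain.from_iterable(
    combinations(sorted(ground), r) for r in range(len(ground) + 1))]

def f(X):
    """A monotonic map on subsets: add 1 to any set that contains 0."""
    return X | frozenset({1}) if 0 in X else X

for X in P:
    Z = X
    for _ in range(len(P)):   # apply f |P| times
        Z = f(Z)
    assert f(Z) == Z          # f^|P|(X) is a fixed point, as claimed
    print(f"f^{len(P)} applied to {sorted(X)} gives fixed point {sorted(Z)}")
```
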
Please use the spoilers feature (the symbol ‘>’ followed by ‘!’ followed by a space) in your comments to hide all solutions, partial solutions, and other discussions of the math. The comments will be moderated strictly to hide spoilers!

I recommend putting all the object-level points in spoilers and including metadata outside of the spoilers, like so: “I think I’ve solved problem #5, here’s my solution <spoilers>” or “I’d like help with problem #3, here’s what I understand <spoilers>” so that people can choose what to read.


Tomorrow’s AI Alignment Forum Sequences post will be “Approval-directed agents: overview” by Paul Christiano in the sequence Iterated Amplification.

The next post in this sequence will be released on Saturday 24th November, and will be ‘Fixed Point Discussion’.