I’d honestly rephrase this as “if there are two points, they are the same point.”
the distance between them…equals the distance between their images…which is at most some fraction K < 1 of the distance between them…so the distance between them must be 0
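spelled out (a sketch, writing p and q for the two supposed fixed points and K < 1 for the contraction constant):

```latex
d(p, q) = d(f(p), f(q)) \le K \, d(p, q)
\;\Longrightarrow\; (1 - K)\, d(p, q) \le 0
\;\Longrightarrow\; d(p, q) = 0
\;\Longrightarrow\; p = q
```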
Existence
remember these? it all goes to the same place, no matter where you start from. here transformation ‘f’ could be ‘0.5s pass’…
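a toy numerical sketch of that picture (my example, not the notes': f = cos, which maps R into [-1, 1] and contracts there with constant sin 1 ≈ 0.84):

```python
import math

def iterate(f, x0, n=30):
    """Apply f repeatedly, returning the final iterate."""
    x = x0
    for _ in range(n):
        x = f(x)
    return x

# cos maps R into [-1, 1] and contracts there (|cos'| <= sin 1 < 1),
# so every starting point gets dragged to the same fixed point
for x0 in (-10.0, 0.0, 0.5, 100.0):
    print(x0, "->", iterate(math.cos, x0))
# all four land near 0.739085, the unique fixed point of cos
```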
the distance between adjacents is bounded by a dampened distance between the originals
the distance between Cauchy-spinoffs is at most the sum of distances between adjacents, which in turn is bounded by a dampened distance between the originals
loosening the bound: CK^m ≤ CK^N for m ≥ N (since K < 1; spelled out just below)
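putting those three steps together (a sketch; x_{n+1} = f(x_n), and C := d(x_1, x_0)/(1 - K), which I believe is the proof's C):

```latex
d(x_{i+1}, x_i) \le K^i \, d(x_1, x_0)
\quad\Longrightarrow\quad
d(x_n, x_m) \le \sum_{i=m}^{n-1} d(x_{i+1}, x_i)
            \le d(x_1, x_0) \sum_{i=m}^{n-1} K^i
            \le \frac{d(x_1, x_0)}{1 - K} \, K^m
            = C K^m \le C K^N
```

for n > m ≥ N. given ε > 0, pick N with CK^N < ε and the iterates are Cauchy; completeness of the space then hands us the limit, which is the fixed point.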
i quite enjoyed this proof. they spelt out some steps where others might not have! i was pleasantly surprised by how helpful the introduction of the constant C was, etc.
I started sketching a visualization of this proof, as I’m wont to do, but honestly it’s feeling complicated enough that I couldn’t quite work out what to get down onto the page.
I’m able to reproduce the proof from memory! …I just don’t know how I’d visually represent this one quickly, and I need to move on.
I also spent some time with the proof that l2 is complete, which I omit here (also for reasons of time!). Need to move on to connectedness :’D
happy with today’s metaphor
We’ll usually use the sup norm on C[a,b] when completeness is what we’re after
C[0,1] is a closed subset of the complete space B[0,1], so it’s complete. to show C[0,1] is closed in B[0,1]: convergence fn → f in the sup norm is exactly uniform convergence, and a uniform limit of continuous functions is continuous. this works nicely because C[0,1] inherits the sup norm from B[0,1]: you can only use ‘closed subset of a complete space is complete’ when both spaces carry the same norm
the ‘closed subset of a complete space is complete’ lemma works in the L1 or L2 norm too, as long as the ambient space is complete in that norm! (C[0,1] itself isn’t complete in L1 or L2, though; see the sketch below)
this is pretty interesting. sup norm is like a gatekeeping community organizer: it won’t overlook / turn a blind eye to any point of difference. so you don’t get cases like MS at lighthaven, who lived within the space but converged to something outside it
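a numerical sketch of that gatekeeping (my own example: ramps f_n rising from 0 to 1 over [1/2, 1/2 + 1/n], squeezing toward a step function that lies outside C[0,1]):

```python
import numpy as np

x = np.linspace(0.0, 1.0, 200_001)
step = (x > 0.5).astype(float)  # discontinuous, so outside C[0,1]

def ramp(n):
    """Continuous: 0 up to 1/2, then rises to 1 with slope n."""
    return np.clip(n * (x - 0.5), 0.0, 1.0)

for n in (10, 100, 1000):
    diff = np.abs(ramp(n) - step)
    l1 = diff.mean()   # grid estimate of the integral over [0,1]
    sup = diff.max()
    print(f"n={n:5d}  L1 dist ~ {l1:.5f}  sup dist ~ {sup:.3f}")
# L1 distance ~ 1/(2n) -> 0: in the L1 norm the ramps sneak out of
# C[0,1] toward the discontinuous step. the sup distance stays ~ 1:
# the sup norm won't overlook the jump at x = 1/2
```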
There are lots of norms we can use on finite-dimensional R^n, and they’re all equivalent, so completeness doesn’t depend on which one we pick
In sequence spaces, we generally specify the space and then its norm, and each is complete wrt its own norm
Polynomial space is not complete
i should check out 2021 Q1(c): Completeness of polynomial space with different metrics
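a quick sketch of why it’s not complete (my example: the Taylor partial sums of e^x are polynomials, Cauchy in the sup norm on [0,1], but their limit e^x is not a polynomial):

```python
import math
import numpy as np

x = np.linspace(0.0, 1.0, 1001)
target = np.exp(x)

def taylor_partial(n):
    """p_n(x) = sum_{k=0}^{n} x^k / k!, a degree-n polynomial."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

for n in (2, 5, 10):
    sup_dist = np.abs(taylor_partial(n) - target).max()
    print(f"degree {n:2d}: sup distance to e^x ~ {sup_dist:.1e}")
# the sup distances vanish, so the p_n are sup-norm Cauchy, yet the
# limit e^x is not a polynomial: the limit escapes the space
```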
R^n with the l2-norm is a closed subspace of l2! (embed each vector as the sequence that’s zero from coordinate n+1 onward)
(any vector of R^n, viewed as such a sequence with finitely many nonzero terms, is automatically in l1, l2, and l∞)
Recently I’ve been thinking about a neural network as a point in extraordinarily high-dimensional space, moving around the space as you prompt/query it differently.
l2 is not closed or complete in the l∞ norm
not closed
not complete
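one example that I think covers both bullets (my notation): take x^(n) = (1, 1/√2, …, 1/√n, 0, 0, …), which is in l2 since it has finitely many nonzero terms. then

```latex
\|x^{(n)} - x^{(m)}\|_\infty = \tfrac{1}{\sqrt{m+1}} \;\;(n > m) \longrightarrow 0,
\qquad
\|x^{(n)} - x\|_\infty = \tfrac{1}{\sqrt{n+1}} \longrightarrow 0
\;\text{ where } x = \bigl(\tfrac{1}{\sqrt{k}}\bigr)_{k \ge 1},
\qquad
\sum_k \tfrac{1}{k} = \infty \;\Rightarrow\; x \notin \ell^2
```

so the sequence is l∞-Cauchy inside l2, but its l∞-limit lies outside l2: the limit point escapes (not closed) and the Cauchy sequence has no limit in the space (not complete).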
The Newton-Raphson sequence: a Cauchy sequence in Q that escapes Q
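a sketch in code (the classic instance: Newton-Raphson for x^2 - 2 = 0, i.e. x_{n+1} = (x_n + 2/x_n)/2, kept in exact rationals via Python’s fractions module):

```python
from fractions import Fraction

x = Fraction(1)            # start in Q
for n in range(5):
    x = (x + 2 / x) / 2    # Newton step for x^2 - 2 = 0; stays in Q
    err = abs(float(x * x - 2))
    print(f"step {n + 1}: x = {x}  (x^2 - 2 ~ {err:.1e})")
# every iterate is an exact rational (3/2, 17/12, 577/408, ...) and
# the sequence is Cauchy, but its limit sqrt(2) is irrational: the
# sequence escapes Q, so Q is not complete
```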
Proving l2 is complete
i remember when i first encountered sequences of sequences. it was quite a moment :)
we assemble the coordinate-wise limits into one limit sequence.
we keep using a property of the whole sequence to make a claim about a single coordinate! in this case, we use a bound on the whole sequence’s l2-norm to bound a single coordinate’s absolute value: |x_k| ≤ ||x||_2. earlier, we used a bound on the l2-distance between two sequences to bound the distance between their corresponding coordinates: |x_k − y_k| ≤ ||x − y||_2
we pass from the l2-distance between whole sequences, to distances between corresponding coordinates, to the distance between each coordinate and its limit, and finally back up to the l2-distance between the full sequence and the limit
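the chain, written out (a sketch; x^(n) is the Cauchy sequence of sequences, x the assembled coordinate-wise limit):

```latex
% 1. l2-distance between sequences bounds each coordinate:
|x^{(n)}_k - x^{(m)}_k| \le \|x^{(n)} - x^{(m)}\|_2 < \varepsilon \quad (n, m \ge N)
% 2. so each coordinate sequence is Cauchy in R, and R's completeness
%    gives x_k := \lim_n x^{(n)}_k, assembled into x = (x_1, x_2, \dots)
% 3. fix a finite truncation M and let m \to \infty:
\sum_{k=1}^{M} |x^{(n)}_k - x^{(m)}_k|^2 < \varepsilon^2
\;\Longrightarrow\;
\sum_{k=1}^{M} |x^{(n)}_k - x_k|^2 \le \varepsilon^2
% 4. let M \to \infty: \|x^{(n)} - x\|_2 \le \varepsilon, so
%    x^{(n)} \to x in l2, and x = x^{(n)} + (x - x^{(n)}) is in l2
%    by the triangle inequality
```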
maybe i’ll develop “proving l2 is complete [using Cauchy properties]” into a pretty visual: it’s a great candidate for it. might also be a good problem/proof to send C