Suppose V and W are finite-dimensional and that U is a subspace of V.
Prove that there exists T∈L(V,W) such that null T=U if and only if dimU≥dimV−dimW.
Let v1,…,vr be a basis of U and define Tvj=0 for all 1≤j≤r. This guarantees U⊆null T; we now want to finish defining T without adding anything to the null space.
Extend v1,…,vr to a basis v1,…,vn of V using 2.33 and let w1,…,wm be a basis of W.
I'd like to define T(vr+j)=wj for j=1,…,n−r, but this requires n−r≤m so that W has room for n−r linearly independent vectors. Rearranging that condition gives r≥n−m, which is just
dimU≥dimV−dimW
To see that null T is not larger than U, suppose T(a1v1+⋯+anvn)=ar+1w1+⋯+anwn−r=0; since w1,…,wn−r are linearly independent, this forces ar+1=⋯=an=0, so null T=U exactly. This shows the condition is sufficient.
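As a sanity check, the construction above can be carried out numerically. Here is a minimal sketch, assuming V = R^4, W = R^3, and U spanned by the first r = 2 vectors of a random basis (the dimensions and names are illustrative, not part of the exercise):

```python
import numpy as np

n, m, r = 4, 3, 2                         # dim V, dim W, dim U; note r >= n - m
rng = np.random.default_rng(0)

# A basis v1,…,vn of V whose first r columns span U (the extend-a-basis step).
B = rng.standard_normal((n, n))
while abs(np.linalg.det(B)) < 1e-6:       # re-draw until the columns are a basis
    B = rng.standard_normal((n, n))

# Define T on the basis: T vj = 0 for j <= r, T(vr+j) = wj (standard basis of W).
TB = np.zeros((m, n))                     # column j holds the image of v_{j+1}
for j in range(n - r):
    TB[j, r + j] = 1.0

# The matrix of T in the standard basis: T = TB @ B^{-1}.
T = TB @ np.linalg.inv(B)

assert np.allclose(T @ B[:, :r], 0)       # U ⊆ null T
print(np.linalg.matrix_rank(T))           # rank = n - r = 2, so dim null T = r = dim U
```

By rank-nullity the printed rank n−r confirms that the null space is exactly r-dimensional, i.e. no larger than U.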
To see that null T=U is impossible when dimU<dimV−dimW, simply apply 3.22:
dim V = dim null T + dim range T = dim U + dim range T < (dim V − dim W) + dim range T,
which implies dim W < dim range T; this is impossible since range T is a subspace of W.
This shows the condition is necessary, completing the proof.
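The dimension count above can also be checked empirically. A small sketch, assuming V = R^3 and W = R^1 (so dim V − dim W = 2) and sampling random maps:

```python
import numpy as np

# When dim U < dim V - dim W, no T in L(V, W) can have null T = U.
# With V = R^3 and W = R^1, rank-nullity (3.22) forces nullity >= 2,
# so a 1-dimensional U can never be the whole null space.
rng = np.random.default_rng(1)
for _ in range(100):
    T = rng.standard_normal((1, 3))          # an arbitrary member of L(R^3, R^1)
    nullity = 3 - np.linalg.matrix_rank(T)   # dim null T = dim V - dim range T
    assert nullity >= 2
```

Sampling proves nothing, of course; the point is only that every sampled T's null space is at least (dim V − dim W)-dimensional, exactly as the inequality predicts.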