Suppose $v_1, \dots, v_m$ is linearly independent in $V$ and $w \in V$. Prove that if $v_1 + w, \dots, v_m + w$ is linearly dependent, then $w \in \text{span}(v_1,\dots,v_m)$.


Suppose $v_1+w, \dots, v_m+w$ is linearly dependent. Then there exist scalars $a_1, \dots, a_m$, not all zero, such that

$$a_1(v_1+w) + \dots + a_m(v_m+w) = 0.$$

Distributing and moving the $w$ terms to one side gives

$$a_1v_1 + \dots + a_mv_m = -(a_1+\dots+a_m)w.$$

Since the $v$'s are linearly independent and not all of the $a_j$ are zero, we must have $a_1v_1 + \dots + a_mv_m \ne 0$. Therefore $-(a_1+\dots+a_m)w \ne 0$, so in particular $a_1+\dots+a_m \ne 0$, and we may divide both sides by $-(a_1+\dots+a_m)$ to get

w=b1v1++bmvmw = b_1v_1 + \dots + b_m v_m

where $b_k = -a_k/(a_1 + \dots + a_m)$. Thus $w \in \text{span}(v_1,\dots,v_m)$.
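
As a quick sanity check (with vectors chosen here purely for illustration, not taken from the exercise), consider $V = \mathbb{R}^2$ with $v_1 = (1,0)$, $v_2 = (0,1)$, and $w = (-\tfrac{1}{2}, -\tfrac{1}{2})$. Then $v_1 + w = (\tfrac{1}{2}, -\tfrac{1}{2})$ and $v_2 + w = (-\tfrac{1}{2}, \tfrac{1}{2})$ are linearly dependent, since $1\cdot(v_1+w) + 1\cdot(v_2+w) = 0$. Here $a_1 = a_2 = 1$, so $a_1 + a_2 = 2$ and $b_1 = b_2 = -\tfrac{1}{2}$, and indeed $w = -\tfrac{1}{2}v_1 - \tfrac{1}{2}v_2 \in \text{span}(v_1, v_2)$.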