Definition: Linear independence. A subset $\{v_1, \dots, v_n\} \subseteq V$ is called linearly independent if for any scalars $a_1, \dots, a_n$, $a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0$ implies that every $a_i$ must be equal to 0.
Essentially, if there is no way to combine the vectors in the set to get back to zero without multiplying them all by zero, they are linearly independent.
Examples of linearly independent sets include the vectors of any basis, among others.
Non-examples are sets of vectors in which two or more vectors are collinear (one is a scalar multiple of another), as the worked non-example below shows.
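For instance, take $v_1 = (1, 2)$ and $v_2 = (2, 4) = 2v_1$ in $\mathbb{R}^2$ (vectors chosen here purely for illustration). The nontrivial combination
\[
2v_1 - v_2 = (2, 4) - (2, 4) = (0, 0)
\]
reaches zero with nonzero coefficients, so $\{v_1, v_2\}$ is not linearly independent.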
Example: Say we have some $v \in V$ with $v \neq 0$. Suppose that $cv = 0$. The only solution for this is $c = 0$. Thus, $\{v\}$ is linearly independent.
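The same kind of check extends to more than one vector. As an illustrative example (these particular vectors are not part of the definition above), take $u = (1, 0)$ and $w = (1, 1)$ in $\mathbb{R}^2$ and suppose $au + bw = 0$. Then
\[
(a + b, \; b) = (0, 0),
\]
so $b = 0$, and then $a + b = 0$ forces $a = 0$. The only solution is $a = b = 0$, so $\{u, w\}$ is linearly independent.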
There is a special set of linearly independent vectors: the standard basis vectors $e_1, e_2, \dots, e_n \in \mathbb{R}^n$. They are the vectors where the $i$th element is 1 and the rest are zero.
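For example, in $\mathbb{R}^3$ these are
\[
e_1 = (1, 0, 0), \quad e_2 = (0, 1, 0), \quad e_3 = (0, 0, 1),
\]
and any combination $a_1 e_1 + a_2 e_2 + a_3 e_3 = (a_1, a_2, a_3)$ equals zero only when $a_1 = a_2 = a_3 = 0$, so they are linearly independent.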
Remark: If $S$ is linearly independent, then $0 \notin S$.
This makes sense because any real number times the zero vector equals the zero vector. For a set of vectors to be linearly independent, the only way to reach zero must be through multiplying every vector by zero, and a set containing $0$ fails this requirement, as shown below.
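Concretely, if $0 \in S$ and $v_1, \dots, v_k$ are the other vectors in $S$, the combination
\[
1 \cdot 0 + 0 \cdot v_1 + \cdots + 0 \cdot v_k = 0
\]
sums to zero even though the coefficient on the zero vector is $1 \neq 0$, so $S$ is not linearly independent.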
Claim: Suppose some $v \in S$ and $v \in \operatorname{span}(S \setminus \{v\})$. Then $S$ is not linearly independent.
This claims that if there is a vector $v$ in $S$, and we can make that vector through a linear combination of the other vectors in the set, then the set is not linearly independent.
Proof: As $v \in \operatorname{span}(S \setminus \{v\})$, there exist vectors $w_1, \dots, w_k \in S \setminus \{v\}$ and scalars $a_1, \dots, a_k$ such that $v = a_1 w_1 + \cdots + a_k w_k$. We want to see if $0$ can be made by these combinations: rearranging gives $a_1 w_1 + \cdots + a_k w_k - v = 0$.
Since the coefficient of $v$ in this combination is $-1 \neq 0$, and the total sums to $0$, we can conclude that $S$ is not linearly independent.
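To see the argument in a concrete case (an illustrative choice of vectors), suppose $S = \{v_1, v_2, v_3\}$ with $v_3 = v_1 + v_2$. Rearranging gives
\[
1 \cdot v_1 + 1 \cdot v_2 + (-1) \cdot v_3 = 0,
\]
a combination summing to zero with the nonzero coefficient $-1$ on $v_3$, so $S$ is not linearly independent.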
Remark: If $S' \subseteq S$ and $S'$ is not linearly independent, then $S$ is also not linearly independent.
This is just saying that if a subset of a set is not linearly independent, then any larger set that contains it is also not linearly independent.
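For example (again an illustrative choice), $S' = \{v, 2v\}$ is not linearly independent, since $2 \cdot v + (-1) \cdot (2v) = 0$. Any superset such as $S = \{v, 2v, w\}$ inherits this: extending the same combination with a zero coefficient on $w$,
\[
2 \cdot v + (-1) \cdot (2v) + 0 \cdot w = 0,
\]
shows that $S$ is not linearly independent either.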