Basis and Dimension
MIT 18.06SC Linear Algebra, Fall 2011
View the complete course: ocw.mit.edu/18-06SCF11
Instructor: Ana Rita Pires
A teaching assistant works through a problem on basis and dimension.
License: Creative Commons BY-NC-SA
More information at ocw.mit.edu/terms
More courses at ocw.mit.edu
Comments: 45
These MIT lectures are too good to be true. Thanks to all behind these videos.
Thanks a lot for showing the case of these vectors as columns. I had solved the matrix for pivots and chosen the first three columns of the echelon matrix as the basis. But clearly, as you pointed out, I was wrong. Awesome tutorial.
@bridge5189
4 years ago
But, in the next lecture #10, at kzread.info/dash/bejne/oHygp5l-l62slNY.html Prof. Gilbert Strang says that the basis is the PIVOT COLUMNS!!
@dHnd2j1u
4 years ago
@@bridge5189 Actually, a few seconds before that, Prof. Strang clarifies: "the pivot columns I'm interested in are columns of A, the ORIGINAL A" kzread.info/dash/bejne/oHygp5l-l62slNY.html
@sachinbs3961
3 years ago
@Indrajeet It is possible to pick the last columns if we performed column elimination, since then we would only have taken linear combinations of columns. Of course, that would be the same as writing the vectors as rows and doing row elimination, which was the first method explained in the video.
@NicolasAumar
A year ago
@@bridge5189 The basis could be the pivot columns of the initial matrix, not the pivot columns of the matrix after elimination.
@hoangduy500
A year ago
@@bridge5189 "Pivot columns" here means the columns of the initial matrix at the pivot positions, not the columns after elimination.
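To illustrate the point in this thread, here is a minimal sketch in plain Python (the 3x3 matrix is made up for the illustration, it is not the matrix from the video): row-reduce, record the pivot positions, and take the basis from the ORIGINAL columns at those positions, not from the columns of the echelon form.

```python
from fractions import Fraction

def rref(M):
    """Reduced row echelon form of M; returns (R, pivot_column_indices)."""
    R = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(R), len(R[0])
    pivots, r = [], 0
    for c in range(cols):
        pr = next((i for i in range(r, rows) if R[i][c] != 0), None)
        if pr is None:
            continue                        # no pivot in this column
        R[r], R[pr] = R[pr], R[r]           # swap pivot row into place
        R[r] = [x / R[r][c] for x in R[r]]  # scale pivot entry to 1
        for i in range(rows):
            if i != r and R[i][c] != 0:     # clear the rest of the column
                f = R[i][c]
                R[i] = [a - f * b for a, b in zip(R[i], R[r])]
        pivots.append(c)
        r += 1
        if r == rows:
            break
    return R, pivots

# Made-up example: column 2 is twice column 1, so the columns are dependent.
A = [[1, 2, 3],
     [2, 4, 7],
     [1, 2, 4]]

_, pivots = rref(A)
print(pivots)  # pivots land in columns 0 and 2
# Basis for the column space = ORIGINAL columns at the pivot positions,
# i.e. (1,2,1) and (3,7,4), not the columns of the echelon matrix.
basis = [[A[i][j] for i in range(3)] for j in pivots]
print(basis)
```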
I just fell in love with this teacher; you gave me such a great understanding.
When you write your vectors as the columns of a matrix and then perform row operations, those operations mix the components within each vector: adding or subtracting rows combines entries from the same vector (an x with a y, say), so the original components are no longer kept separate.

When you write the vectors as rows instead, row operations never mix the components of a single vector. Each row is a whole vector, and the operations only add, subtract, scale, or exchange entire vectors, which is exactly taking linear combinations of the vectors themselves, and that is fine.

That is why you cannot use the final columns after eliminating with vectors as columns: in that setup you are not performing linear operations between different vectors, you are mixing each vector's entries with themselves.
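A tiny numeric illustration of this comment (a made-up 2x2 example, not from the video): one row operation leaves the row space alone but moves the column space.

```python
# Row operations preserve the row space but can change the column space.
A = [[1, 2],
     [2, 4]]
# One row operation, row2 -> row2 - 2*row1:
B = [[1, 2],
     [0, 0]]

def is_multiple(u, v):
    """True if u == c*v for some scalar c (v assumed nonzero)."""
    c = next((ui / vi for ui, vi in zip(u, v) if vi != 0), None)
    return c is not None and all(ui == c * vi for ui, vi in zip(u, v))

# Row space unchanged: the only nonzero row of B is still a row of A.
assert is_multiple(B[0], A[0])

# Column space changed: the first column moved from (1, 2) to (1, 0),
# and (1, 0) is not a multiple of (1, 2).
colA = [A[0][0], A[1][0]]
colB = [B[0][0], B[1][0]]
assert not is_multiple(colB, colA)
print("row space preserved; column space changed")
```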
@anonymousgawd..3047
6 months ago
Thanks 🎉 ...good work. It's a basic point, but yes, there should be clarity.
@then-go
A month ago
Thank you so much for your explanation!
Thank you Ana and MIT!
Thanks MIT for sharing such a great teacher and her teaching with the whole world. A learner from India.
I think this girl and the Asian one are the best TAs so far in any MIT OCW course.
Thanks for pointing out the use of the transpose matrix to solve the problem; that was exactly my question.
@zhaopeter6532
5 years ago
Hi, do you know why they have the same pivots (a matrix and its transpose)?
@samuelleung9930
4 years ago
@zhaopeter6532 The pivot positions just happen to coincide here; in general, when you do rref on A and on its transpose, you will always get the same number of pivots :)
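For what it's worth, the pivot positions of A and of Aᵀ need not agree; what always agrees is their count, since rank(A) = rank(Aᵀ). A quick sketch in plain Python with a made-up matrix (not the one from the video):

```python
from fractions import Fraction

def rref(M):
    """Reduced row echelon form of M; returns (R, pivot_column_indices)."""
    R = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(R), len(R[0])
    pivots, r = [], 0
    for c in range(cols):
        pr = next((i for i in range(r, rows) if R[i][c] != 0), None)
        if pr is None:
            continue
        R[r], R[pr] = R[pr], R[r]
        R[r] = [x / R[r][c] for x in R[r]]
        for i in range(rows):
            if i != r and R[i][c] != 0:
                f = R[i][c]
                R[i] = [a - f * b for a, b in zip(R[i], R[r])]
        pivots.append(c)
        r += 1
        if r == rows:
            break
    return R, pivots

A = [[1, 2, 3],
     [2, 4, 7],
     [1, 2, 4]]
At = [list(row) for row in zip(*A)]   # transpose

_, pA = rref(A)
_, pAt = rref(At)
print(pA, pAt)               # the POSITIONS differ here: [0, 2] vs [0, 1] ...
assert len(pA) == len(pAt)   # ... but the COUNT (the rank) always agrees
```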
thanks a lot for the clarification at the end
love this course
As there are 5 column vectors and each vector belongs to R^4 (we can have at most 4 linearly independent vectors in R^4), we don't even need to check whether they are dependent by doing Gaussian elimination.
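The counting argument in this comment can be checked mechanically. This sketch uses a hypothetical 4x5 matrix (the actual matrix from the video is not reproduced here): a 4x5 matrix has at most one pivot per row, hence at most 4 pivots, so its 5 columns must be dependent.

```python
from fractions import Fraction

def rref(M):
    """Reduced row echelon form of M; returns (R, pivot_column_indices)."""
    R = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(R), len(R[0])
    pivots, r = [], 0
    for c in range(cols):
        pr = next((i for i in range(r, rows) if R[i][c] != 0), None)
        if pr is None:
            continue
        R[r], R[pr] = R[pr], R[r]
        R[r] = [x / R[r][c] for x in R[r]]
        for i in range(rows):
            if i != r and R[i][c] != 0:
                f = R[i][c]
                R[i] = [a - f * b for a, b in zip(R[i], R[r])]
        pivots.append(c)
        r += 1
        if r == rows:
            break
    return R, pivots

# Hypothetical 4x5 matrix: 5 column vectors living in R^4.
A = [[1, 0, 2, 1, 3],
     [0, 1, 1, 2, 0],
     [1, 1, 3, 3, 3],
     [2, 0, 4, 2, 6]]

_, pivots = rref(A)
# At most one pivot per row, so at most 4 pivots for 5 columns:
assert len(pivots) <= 4 < 5
print(f"{len(pivots)} pivots for 5 columns: the columns are dependent")
```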
excellent way of teaching👏👏👏👏
Very clear. Thank you very much!
It would be good to know why you can use the rows of the echelon matrix when doing vectors-as-rows, but can't use the columns of the echelon matrix when doing vectors-as-columns. The fact is stated, and justification is given in terms of the example ("not enough numbers"). But the reason why the methods aren't symmetrical is not explained. I believe there ought to be a good geometric explanation for this, or at least something in terms of the definitions of the spaces.
@robertchu4092
5 years ago
This is because the columns of the echelon matrix are not formed by linear combinations of the original columns. Creating the echelon matrix is a series of row operations (new rows are linear combinations of original and modified rows), which preserves the linear independence of the pivot rows, not of the pivot columns. That's why she said you may even use the original rows that correspond to the pivot rows to form a basis for that space.
@krishnkantswarnkar4735
5 years ago
@@robertchu4092 Hey! Thanks. I had the same doubt; this was a good explanation.
@harshadzade3971
2 years ago
@@robertchu4092 This was helpful! Thanks!
Please confirm: is (1,1,-2,0,-1) a row vector or a column vector? While solving, the TA treated it as a row vector. Is that correct?
thank you so much
great recitation!
Thank you!
So if I wrote the vectors as rows and did the elimination, I can directly use the final 3 rows (with pivots)?
@heyheyheyy5008
3 years ago
Yep
Can I solve it by finding the rref of the given matrix?
Thank you very much
Don't we take columns as the basis?
Why is it that elimination on column vectors changes the column space (7:06) but elimination on row vectors doesn't change the row space (4:40)?
@ashutoshtiwari4398
5 years ago
I got the answer: Lecture 10, 24:00. A row transformation on a matrix A doesn't change its row space but changes its column space.
@DeepakSingh-xt5io
5 years ago
@@ashutoshtiwari4398 i was about to comment the same thing :)
@user-ks5wj6hz9x
4 years ago
@@ashutoshtiwari4398 thanks
@shaunwu3609
4 years ago
Because you are performing row operations on the column vectors, you inevitably change the column space. If you performed column operations on the column vectors, you would not change the column space. The column positions of the leading ones after transposing the matrix and performing row operations correspond to the row positions in the original matrix.
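A sketch of the transpose trick this thread describes (plain Python, with a made-up matrix rather than the one from the video): row-reduce Aᵀ, take its nonzero rows as a basis for the column space of A, and verify that every column of A lies in their span.

```python
from fractions import Fraction

def rref(M):
    """Reduced row echelon form of M; returns (R, pivot_column_indices)."""
    R = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(R), len(R[0])
    pivots, r = [], 0
    for c in range(cols):
        pr = next((i for i in range(r, rows) if R[i][c] != 0), None)
        if pr is None:
            continue
        R[r], R[pr] = R[pr], R[r]
        R[r] = [x / R[r][c] for x in R[r]]
        for i in range(rows):
            if i != r and R[i][c] != 0:
                f = R[i][c]
                R[i] = [a - f * b for a, b in zip(R[i], R[r])]
        pivots.append(c)
        r += 1
        if r == rows:
            break
    return R, pivots

A = [[1, 2, 3],
     [2, 4, 7],
     [1, 2, 4]]
At = [list(row) for row in zip(*A)]     # transpose: columns of A become rows

R, pivots = rref(At)
basis = [row for row in R if any(row)]  # nonzero rows span C(A)
print(basis)

# Sanity check: every column of A is a combination of the basis vectors.
# Because the basis is in rref form, the coefficient of basis vector k can
# be read off the column's entry at pivot position pivots[k].
for j in range(3):
    col = [A[i][j] for i in range(3)]
    coeffs = [col[p] for p in pivots]
    combo = [sum(c * b[i] for c, b in zip(coeffs, basis)) for i in range(3)]
    assert combo == col
print("all columns of A lie in the span of the basis")
```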
Thank you :)
Hey, I know this is a stupid question. What is the transpose of this universe?
@sohebsk2196
A month ago
"esrevinu " 😂
Wow....
Thank you!