datasplit,comment,video_id,video_name,playlist_name,annotator_H1_general,annotator_H1_setup,annotator_H1_pedagogy,annotator_H1_confusion,annotator_H1_gratitude,annotator_H1_personal_experience,annotator_H1_clarification,annotator_H1_bookmark,annotator_H1_non_english,annotator_H1_na,annotator_H2_general,annotator_H2_setup,annotator_H2_pedagogy,annotator_H2_confusion,annotator_H2_gratitude,annotator_H2_personal_experience,annotator_H2_clarification,annotator_H2_bookmark,annotator_H2_non_english,annotator_H2_na,comment_id,annotator_openaichat_v1_gratitude,annotator_openaichat_v0_setup,annotator_openaichat_v8_confusion,annotator_openaichat_v47_pedagogy,annotator_openaichat_v0_non_english,annotator_openaichat_v8_bookmark,annotator_openaichat_v1_clarification,annotator_openaichat_v39_general,annotator_openaichat_v45_personal_experience,annotator_openaichat_v3_na,annotator_openaichat_v46_personal_experience,annotator_openaichat_v47_personal_experience,annotator_openaichat_v48_personal_experience,annotator_openaichat_v49_personal_experience,annotator_openaichat_v50_personal_experience,annotator_openaichat_v40_general,annotator_openaichat_v48_pedagogy,annotator_openaichat_v49_pedagogy,annotator_openaichat_v50_pedagogy,annotator_openaichat_v51_pedagogy,annotator_openaichat_v52_pedagogy,annotator_openaichat_v53_pedagogy,annotator_openaichat_v54_pedagogy,annotator_openaichat_v55_pedagogy,annotator_openaichat_v56_pedagogy,annotator_openaichat_v57_pedagogy,annotator_openaichat_v58_pedagogy,annotator_openaichat_v59_pedagogy,annotator_openaichat_v60_pedagogy,annotator_openaichat_v61_pedagogy,annotator_openaichat_v62_pedagogy,annotator_openaichat_v63_pedagogy,annotator_openaichat_v64_pedagogy,annotator_openaichat_v65_pedagogy,annotator_openaichat_v66_pedagogy,annotator_openaichat_v67_pedagogy,annotator_openaichat_v68_pedagogy,annotator_openaichat_v69_pedagogy,annotator_openaichat_v70_pedagogy,annotator_openaichat_v71_pedagogy,annotator_openaichat_v72_pedagogy,annotator_openaichat_v73_pedagogy,68,annotator_openaichat_v51_personal_experience,annotator_openaichat_v74_pedagogy,annotator_openaichat_v52_personal_experience,annotator_openaichat_v73b_pedagogy,annotator_openaichat_v73c_pedagogy,annotator_openaichat_v73d_pedagogy,annotator_openaichat_v74_na,annotator_openaichat_v74_clarification,annotator_openaichat_v74_non_english,annotator_openaichat_v74_gratitude,annotator_openaichat_v74_general,annotator_openaichat_v74_confusion,annotator_openaichat_v74_setup,annotator_openaichat_v75_pedagogy,annotator_openaichat_v75_clarification,annotator_openaichat_v75_personal_experience,annotator_openaichat_v75_confusion,annotator_openaichat_v75_non_english,annotator_openaichat_v75_na,annotator_openaichat_v75_setup,annotator_openaichat_v75_general,annotator_openaichat_v75_gratitude |
|
validation,Determinant has been determined. Thank you MIT!,srxexLishgY,18. Properties of Determinants,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,54.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0 |
|
validation,19:40 if you turn the volume down everything he says changes its meaning,tzoYhe3H5dM,"Lec 31: Stokes' theorem | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,6.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 |
|
validation,"""Be assured it works just the same way if you have 10,000 variables""",PxCxlsl_YwY,"Lec 1: Dot product | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,"I suppose that’s why MIT is not for every student and why it is difficult and costly to get into. This course review requires a special level of mastery of the material, which assumes a great deal of understanding and problem practice throughout the course. When a matrix is not square and doesn’t have a unique solution (independent rows or columns) all kinds of scenarios could happen. Adding to that, whether the matrix is orthogonal, orthonormal, symmetric, positive or semi definite, as well as others involving Eigenvectors, values, pivots and determinants. I suppose that the best way to wrap all that information around the head is to start with some simple 2 by 2 or 3 by 3 matrix and experiment with all different scenarios using a good software package to spit out the output and see how the theory works. I am using scientific notebook.",RWvi4Vx4CDc,34. Final Course Review,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,86.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,1.0,0.0,0.0,1.0,1.0,1.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,1.0,0.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,0.0,0.0,1.0,1.0,1.0,1.0,,0.0,1.0,,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,"21:12 In fact, if we normalize x1, x2, x3 to q1, q2, q3, then A = 0*q1*q1^T + c*q2*q2^T + 2*q3*q3^T (since q's are orthonormal). Any c, which is not necessarily real, can make A symmetric. ---can anyone help me check this statement? I havent found any comment discussing this yet.",HgC1l_6ySkc,32. Quiz 3 Review,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,8.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
validation,form is pathetic,J7DzL2_Na80,1. The Geometry of Linear Equations,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,176.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,"34:43 why ""directional second derivative"" would not give us a clue of whether it is a min or max? I thought it is a promising way. hmmm. ",15HVevXRsBA,"Lec 13: Lagrange multipliers | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,12.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
validation,FCP,hwDRfkPSXng,"Lecture 32: ImageNet is a Convolutional Neural Network (CNN), The Convolution Rule","MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,58.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
validation,Thank you very much! Amazing lectures!,RWvi4Vx4CDc,34. Final Course Review,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,131.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
validation,@seisdoesmatter The chalk's really awesome. Looks thicker than ordinary chalk. It seems a bit like the chalk kids use to paint on the pavement!,srxexLishgY,18. Properties of Determinants,"MIT 18.06 Linear Algebra, Spring 2005",0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,27.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 |
|
validation,His teaching style seems casual and intuitive. I go to a small public college and the course is much more formal and proof driven. These lectures are a great addition to (as well as a nice break from) formal proofs. Thanks MIT!,yjBerM5jWsc,"9. Independence, Basis, and Dimension","MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,1.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,70.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,1.0,1.0,1.0,1.0,1.0,1.0,0.0,1.0,0.0,1.0,1.0,1.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,1.0,,0.0,1.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
validation,"really useful, thanks!!",15HVevXRsBA,"Lec 13: Lagrange multipliers | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,191.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
validation,Prerequisites please??,j9WZyLZCBzs,1. Probability Models and Axioms,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,108.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,Marvellous!!!,k3AiUhwHQ28,25. Stochastic Gradient Descent,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,103.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,feels like too many things a squished into one lecture,EObHWIEKGjA,7. Discrete Random Variables III,6.041 Probabilistic Systems Analysis and Applied Probability,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,173.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,1.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,Keep up the good work🙏,LY7YmuDbuW0,"Lecture 1: Sets, Set Operations and Mathematical Induction","MIT 18.100A Real Analysis, Fall 2020",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,99.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,for 2 by 2 systems the normal method at school is fine but it's too long and it's much quicker to write it in a matrix and the pivots and words like that are just terms to describe what he is doing,QVKj3LADCnA,2. Elimination with Matrices.,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,174.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,1.0,1.0,1.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,1.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,No.,MsIvs_6vC38,4. Factorization into A = LU,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,104.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,so substitute infinity minus one and chart the resultant here and now on the time line relax allow the unknown as infinite as well same graph pulse the fibronchi generator chart for rms the rms positive on sine only so return to chart and now condense the univariable into quantum zero,QVKj3LADCnA,2. Elimination with Matrices.,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,192.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 |
|
validation,can anyone explain how the 5th column of w8 came out to be?,Xa2jPbURTjQ,3. Orthonormal Columns in Q Give Q'Q = I,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,169.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
validation,Simply excellent!! The Internet can save the world,YP_B0AapU0c,"Lec 16: Double integrals | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,119.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,"Prof. Strang. Please dont wink at me. I'm too shy. : P |
|
|
|
Kidding! I've grown used to it from your linear algebra series 😅 |
|
|
|
Thanks for your work and all the enthusiasm.",Cx5Z-OslNWE,Course Introduction of 18.065 by Professor Strang,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,109.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,1.0 |
|
validation,are the online assignments available for this course?,YiqIkSHSmyc,Lecture 1: The Column Space of A Contains All Vectors Ax,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,166.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,"Too funny, people need to see this. |
|
|
|
acesofclubs |
|
1 year ago |
|
|
|
Mafia Mob moment at 11:36",5q_3FDOkVRQ,"Lec 5 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,151.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
validation,this lecturer is the best lecturer i've ever had. never encounterd such clear explanations! very recommende :-),j9WZyLZCBzs,1. Probability Models and Axioms,6.041 Probabilistic Systems Analysis and Applied Probability,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,199.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,This class is essentially week one of a Machine Learning class XD,tBUHRpFZy0s,24. Classical Inference II,6.041 Probabilistic Systems Analysis and Applied Probability,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,142.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0 |
|
validation,Why the hell does my video freezes when it reaches 00:27 secods it feels like my ethernet service provider is preventing me from lisnening to a great lec! anyways need some coffee...,7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,158.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,"the audio is so low even when i turn up all volume. prof needs to put his mike anywhere but his chest plz. specifically talking about parts around 33:18. mans whispering. and you can't even hear what the students are asking. but i guess since this is opencourseware, i can't complain much about free college lectures.",rLlZpnT02ZU,4. Parametric Inference (cont.) and Maximum Likelihood Estimation,"MIT 18.650 Statistics for Applications, Fall 2016",0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,196.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,1.0,0.0 |
|
validation,"Around 34:00 : what about the case there AC - B^2 > 0 but A = 0? I take it that is also a case where we have a local max, since -B^2 is always negative; i'm just sorta surprised no-one noticed the omission?",3_goGnJm5sA,"Lec 10: Second derivative test; boundaries & infinity | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,40.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
validation,Thank you so much for putting analysis online!,LY7YmuDbuW0,"Lecture 1: Sets, Set Operations and Mathematical Induction","MIT 18.100A Real Analysis, Fall 2020",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,127.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 |
|
validation,is this course applicable for lab analysis of data using statistical methods ? im a physics student wondering if this is worth the time to learn,VPZD_aij8H0,1. Introduction to Statistics,"MIT 18.650 Statistics for Applications, Fall 2016",0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,186.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,Awesome Teacher Gibert Strang,FX4C-JpTFgY,3. Multiplication and Inverse Matrices,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,42.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,Is a hyperbola in high dimensional spaces called a hyperhyperbola?,vF7eyJ2g3kU,27. Positive Definite Matrices and Minima,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,90.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
validation,"Professore: ""How many of you actually knew about vectors before that?"" |
|
Class: -______________- ... seriously?",PxCxlsl_YwY,"Lec 1: Dot product | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,112.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,This was great. I only need some Harvard friends I can impress.,P7a4bjE6Crk,12. Iterated Expectations,6.041 Probabilistic Systems Analysis and Applied Probability,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,150.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,41:21 it would help if it were in shot,NcPUI7aPFhA,Lecture 8: Norms of Vectors and Matrices,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,14.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 |
|
validation,"The only reason I can see that you'd be opposed to this is perhaps if you think that American workers and businesses cannot keep up? Or just threatened and xenophobic of non-Americans? Well, if rising standards serves as a motivation, then American businesses have to step up their games and become better. Which is better for you. |
|
|
|
It's not all about direct national interest. And as an international student who is not going to MIT, I greatly appreciate this gesture (which doesn't cost them much).",Pd2xP5zDsRw,"Lec 20 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,141.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,"Thank you, professor Strang and all guys insist on this perfect linear algebra lecture. |
|
I started this lecture on 23/12/2021 and completed it on 08/01/2022 in Hong Kong, just before my new semester. I will remember this 18.06 forever, and professor Strang will be the best linear algebra teacher in my heart!",RWvi4Vx4CDc,34. Final Course Review,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,133.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
validation,why do we need to make the eigen vector as small as possible ?,wrEcHhoJxjM,23. Accelerating Gradient Descent (Use Momentum),"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,206.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
validation,Thanks for MIT OCW and Prof.Auroux.,24v9onS9Kcg,"Lec 35: Final review (cont.) | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,137.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 |
|
validation,With all my endurance I listen to this saga until this lecture just to understand Linear Regression,Y_Ac6KiQ1t0,15. Projections onto Subspaces,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,161.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,1.0,1.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,i dont think this is what youtube was made for ._.,7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,181.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,"By the chain rule, (d/dx) f(g) when g is a function of x is g' * f '(g). |
|
In this case, the ""g"" is (kx) so you can get the answer by using the chain rule.",9v25gg2qJYE,"Lec 6 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,47.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,,0.0,0.0,,0.0,1.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0 |
|
validation,"I admire this person a lot, it shows his passion and dedication. At his advanced age, I hope I have the same love at work as he does.",7UJ4CFRGd-U,An Interview with Gilbert Strang on Teaching Linear Algebra,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,74.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,Engineering students must love this,BSAA0akmPEU,"Lec 9 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,55.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,"This man just truly loves linear algebra, and it's fantastic.",or6C4yBk_SY,Lecture 2: Multiplying and Factoring Matrices,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,148.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,He's such a great instructor,QVKj3LADCnA,2. Elimination with Matrices.,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,69.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,Thank you so much!!,RWvi4Vx4CDc,34. Final Course Review,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,130.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 |
|
validation,Why not Leipzig continuous and just bounded?,78vN4sO7FVU,Lecture 2: Bounded Linear Operators,"MIT 18.102 Introduction to Functional Analysis, Spring 2021",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,157.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
validation,Thanks Professor Gilbert Strang,7UJ4CFRGd-U,An Interview with Gilbert Strang on Teaching Linear Algebra,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,134.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 |
|
validation,"He's just showing applications of linear algebra, not teaching them. That's why it seems ""sloppy"". You just can't teach Fourier Transform in 30 mins.",M0Sa8fLOajA,26. Complex Matrices; Fast Fourier Transform,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,68.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,,1.0,,0.0,0.0,,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,"Wtf, i learnt this when I was 15. How is MIT one of the best uni in the world??",7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,163.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,"Question: the x-particular needs to have 0 corresponding to the free variables in A, and in order to find the free variables in A you need to row reduce it, but after you have done this, why can't you just arbitrarily select numbers for the pivot variables in x-particular? Why solve for them by setting some ""b"" when the vector is going to give some valid ""b"" anyways?",9Q1q7s1jTzU,8. Solving Ax = b: Row Reduced Form R,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,113.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
validation,Excellent Prof!!! ElGrecoProf++,Tx7zzD4aeiA,20. Central Limit Theorem,6.041 Probabilistic Systems Analysis and Applied Probability,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,57.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 |
|
validation,"for a good start into calculus, this might help you..... just take a look here : Big Picture of Calculus [Gilbert Strang] |
|
watch?v=UcWsDwg1XwM&list=SPBE9407EA64E2C318",7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,175.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
validation,"this is so elementary to be taught on 1st year UNIVERSITY , but it's well taught GREAT ",kCPVBl953eY,"Lec 3 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,198.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,Just a phenomenal professor and course. I looked at a couple lectures to help one of my kids in her calc course and ended up doing the whole class because it was so enjoyable. ENJOYABLE! Great job MIT. The effort you put into these OCW courses is really appreciated.,--lPz7VFnKI,"Lec 39 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,95.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,1.0,0.0,1.0,1.0,1.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
validation,"24:44 thanks to prof. strang that he didn't get away by just taking an easy example and explained other cases where it didn't work. |
|
. |
|
. |
|
. |
|
32:20 the way professor strang visualise the 9 dimensions is so impressive",J7DzL2_Na80,1. The Geometry of Linear Equations,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,10.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,1.0,1.0,1.0,,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,0.0,1.0,1.0,1.0,,1.0,,0.0,1.0,,0.0,1.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
validation,Great! But why there is no one audience to listen. Bad time they have missed the Vector Spaces introduction.,JibVXBElKL0,"5. Transposes, Permutations, Spaces R^n","MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,66.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
validation,15th jan 2022,MsIvs_6vC38,4. Factorization into A = LU,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,5.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
train,It's clear that he likes physics. That's good.,z5TPjZrsp2k,"Lec 21: Gradient fields and potential functions | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,93.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,This series is phenomenal. Every lecture a gem. Thank you Mr Strang!,YzZUIYRCE38,14. Orthogonal Vectors and Subspaces,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,149.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
train,I love this guy. He would deserve a 'Gilbert Strang fan club'.,QVKj3LADCnA,2. Elimination with Matrices.,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,84.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,"A derivative is the slope of a line. Like if the line is Y = 2X, the slope is 2, so Y=2 is the derivative of Y=2X. A limit is basically what the graph looks like it is going to do as it gets closer and closer to the limiting value.",ryLdyDrBfvI,"Lec 2 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,29.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,1.0,,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,,,,,,0.0,,,,,1.0,,0.0,,,,,0.0,1.0,1.0,,,,,,,,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0 |
|
train,Every time I see one of Professor Strang’s lectures I run and get my old Finite Math with Calculus book just to see how much I’ve forgotten 😉 He’s awesome to listen too🙏,9Q1q7s1jTzU,8. Solving Ax = b: Row Reduced Form R,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,56.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,1.0,,,,1.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,0.0,0.0,0.0,,,,,,,,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0 |
|
train,"Any subspace must contain vector (0,0,0), otherwise, if you do w*0 the answer would not belong to the subspace. |
|
|
|
If the plane doesn't go through the origin, it's not a subspace.",8o5Cmfpeo6g,6. Column Space and Nullspace,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,37.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,0.0,0.0,0.0,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
train,I just came to see the top comment XD ,7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,79.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,0.0,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,Anyone jee aspirant here?,Bv9kVDcj7yo,"Lec 27 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,39.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,1.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,0.0,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,Thanks Sir. Couldn't get well the big diff between posterior and prior distributions.,1jDBM9UM9xk,21. Bayesian Statistical Inference I,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,135.0,1.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,,0.0,1.0,,,,1.0,1.0,,,0.0,0.0,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,0.0,0.0,1.0,,,,,,,,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 |
|
train,"9:00: ""That's what to remember from this lecture..."" |
|
Me: ""Ight boys n gals. We can skip to the next lecture""",UCc9q_cAhho,25. Symmetric Matrices and Positive Definiteness,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,19.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,0.0,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,Great lecture. I am so happy that i found this.:),8o5Cmfpeo6g,6. Column Space and Nullspace,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,63.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,"Comman man's Justice League:- Batman, Superman, Wonderwoman, Aquaman |
|
|
|
My Justice League:- Gilbert Strang, Walter Lewin, Ben Polak, Patrick JMT, Ben Lambert",J7DzL2_Na80,1. The Geometry of Linear Equations,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,50.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,1.0,1.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
train,"38:56 Distribution of \hat{B} |
|
54:45 MSE (Quadratic risk) of \hat{B} is the trace of the covariance matrix.",JBIz7UadY5M,14. Regression (cont.),"MIT 18.650 Statistics for Applications, Fall 2016",0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,13.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,1.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"sweet, did this like a term and a half ago in higshcool. aced the test for it too :D |
|
gosh calculus is awesome!",7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,194.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,1.0,,,,1.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0 |
|
train,i wish the cameraman wouldn't move around so much!,FX4C-JpTFgY,3. Multiplication and Inverse Matrices,"MIT 18.06 Linear Algebra, Spring 2005",0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,183.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 |
|
train,"""The memory of where you started washes out.."". Restatement of mathematical equations like this makes these videos a gem. Thank you, MIT!",ZulMqrvP-Pk,17. Markov Chains II,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,2.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,1.0,,1.0,1.0,1.0,1.0,1.0,1.0,1.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
train,"DR. Strang thank you for counting and verifying the Parameters in SVD, LU, QR and Saddle Points in numerical linear algebra.",xaSL8yFgqig,"Lecture 18: Counting Parameters in SVD, LU, QR, Saddle Points","MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,52.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
train,"This guy is great. I studied engineering at a university less prestigious than MIT, and I remember professors refusing to explain their algebra steps. They were like ""you should know this already"".",7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,143.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,1.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,Great lecturer . Thank you Dr. Strang,J7DzL2_Na80,1. The Geometry of Linear Equations,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,64.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
train,"Its good they still use blackboards, significantly better and easier to follow than the bland powerpoints we seem to get at my uni..",kCPVBl953eY,"Lec 3 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,94.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,1.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,1.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0 |
|
train,THIS MAN IS A FUCKING GENIUS,6-wh6yvk6uc,"12. Graphs, Networks, Incidence Matrices","MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,125.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,With lectures this good I can watch this instead of Netflix. I have one professor who also hold phenomenal lectures and lectures this good bring me as much joy or even more than playing a good video game or watching a good show. It is interesting and entertaining and it blows my mind. Truly a fantastic job! Thank you professor Strang!,osh80YCg_GM,16. Projection Matrices and Least Squares,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,162.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,1.0,1.0,,,0.0,0.0,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
train,i think he admitted sine rules,24v9onS9Kcg,"Lec 35: Final review (cont.) | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,182.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,What a curiosity at this age!!!,sx00s7nYmRM,26. Structure of Neural Nets for Deep Learning,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,154.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,So that any slow writers have time to finish copying it down for their notes.,phk05iSMezA,"Lec 27: Vector fields in 3D; surface integrals & flux | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,122.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,Amazing lesson ♥♥,rYz83XPxiZo,6. Singular Value Decomposition (SVD),"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,32.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,"b turns out to be, right. This guy is brilliant.",8o5Cmfpeo6g,6. Column Space and Nullspace,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,168.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 |
|
train,lol at algebraic questions,7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,189.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"Thank you so much for refreshing my brain. I found you when I'm retired. Do we know-how this knowledge came from? I just start my channel not long ago, please check it out, I'll appreciate it, thanks.",J7DzL2_Na80,1. The Geometry of Linear Equations,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,128.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,1.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 |
|
train,"I wonder if these videos are fair. I go a cheap college that charges me almost nothing. My multivariable teacher isn't very good. So I come here to learn for free. However, those MIT students pay a lot of money for the same education. Is that fair?",60e4hdCi1D4,"Lec 17: Double integrals in polar coords; applications | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,88.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,1.0,1.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"From this lecture, I really understand Positive Definite Matrices and Minima thanks to Dr. Gilbert Strang. The examples really help me to fully comprehend this important subject.",vF7eyJ2g3kU,27. Positive Definite Matrices and Minima,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,61.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,1.0,1.0,,,1.0,1.0,,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,,,,,,1.0,,,,,1.0,,1.0,,,,,,1.0,1.0,,,,,,,,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
train,I love this guy's accent.,PxCxlsl_YwY,"Lec 1: Dot product | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,83.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,I like this guy.,9v25gg2qJYE,"Lec 6 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,81.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,Oh.. my god.. the board and chalk are phenomenal..!,YP_B0AapU0c,"Lec 16: Double integrals | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,106.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0 |
|
train,and because they're rich,YBajUR3EFSM,"Lec 4: Square systems; equations of planes | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,165.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"I am having a hard time making sense of the notation at 11:22. I believe the notation should be the conditional probability P(k|t) rather than P(k,t). I interpreted the latter to be the joint probability and if it is the case, the summation over all k of P(k,t) given a fixed t could not be equal to 1. Anyone, please help knock some sense to my head!",jsqSScywvMc,14. Poisson Process I,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,76.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,0.0,1.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,1.0,,,,,,,,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
train,"@zionen01 exactly right m8, im the same",Ts3o2I8_Mxc,30. Linear Transformations and Their Matrices,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,28.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
train,"https://www.youtube.com/watch?v=dD60vgirt8g |
|
How to Create a New Admin !User Account in Windows-10",ImHAGH_OEow,Lecture 20: Taylor's Theorem and the Definition of Riemann Sums,"MIT 18.100A Real Analysis, Fall 2020",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,179.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 |
|
train,"@jshowa6 Maybe by your 'logic'... on which you're comparing A student, with THE CONTENT OF A lecture?... you're comparing non-comparable things... an acceptable refutal would've been to compare lecture contents. You're judging ONE item of student... here is ONE item of american higher education... don't know were your student is from, the quality of his/her university... but this lecture is supposed to be from the TOP OF USA'S TECHNOLOGICAL UNIVERSITIES. That makes you wonder......",7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,24.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"22:29 it could be x2 = f(x^0) |
|
isn´t it?",8o5Cmfpeo6g,6. Column Space and Nullspace,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,9.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
train,When do we get to integration?,YN7k_bXXggY,"Lec 12 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,156.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
train,"Thanks for pointing this out. |
|
|
|
Also, when vectors (1, 1, 2), (2, 2, 5) and (3, 3, 8) are written as column vectors and placed in a 3x3 matrix A, A’s first two rows equal (1, 2, 3). |
|
|
|
Hence the rank of A is 2 and its nullity is 3 – 2 = 1 > 0. |
|
|
|
Thus A’s columns are linearly dependent. |
|
",yjBerM5jWsc,"9. Independence, Basis, and Dimension","MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,138.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 |
|
train,MIT is a Scam,7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,102.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"Just finished single variable course 18/05/19 thanks a lot MIT, someday I'm going to donate money for this incredible resource thay you give us.. carry on with this quality at education!",--lPz7VFnKI,"Lec 39 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,96.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,1.0,,,,1.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 |
|
train,Such cool stuff.,PxCxlsl_YwY,"Lec 1: Dot product | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,124.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,Brilliant./ Pura arivu external knowledge.Intelligence /Aha arivu ie internal knowledge.,57jzPlxf4fk,"Lec 5: Parametric equations for lines and curves | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,46.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,1.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 |
|
train,Satellite preferably above the earth.,ryLdyDrBfvI,"Lec 2 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,117.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"Formulas? Formulae, professor.",kCPVBl953eY,"Lec 3 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,60.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,Absolutely well done and definitely keep it up!!! 👍👍👍👍👍👍,AeRwohPuUHQ,22. Gradient Descent: Downhill to a Minimum,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,31.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,he is a fucking gift of god thx gilbert,23LLB9mNJvc,19. Determinant Formulas and Cofactors,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,178.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
train,sounds drunk on 0.5 speed,Xa2jPbURTjQ,3. Orthonormal Columns in Q Give Q'Q = I,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,193.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"at 36:20 i was yelling ""MOVE THE FUCKING CAMERA""",23xbkrpQuAo,"Lec 14: Non-independent variables | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,167.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 |
|
train,How did we arrive at Xpivot=-Fxfree,VqP2tREMvt0,"7. Solving Ax = 0: Pivot Variables, Special Solutions","MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,72.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
train,very nice ....Thanks,FX4C-JpTFgY,3. Multiplication and Inverse Matrices,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,202.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 |
|
train,I don't even understand English so what am I doing here lol,PxCxlsl_YwY,"Lec 1: Dot product | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,77.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"god bless gilbert strang |
|
and also thanks to MIT for putting them on the web |
|
what on earth did we do before the internet ?",JibVXBElKL0,"5. Transposes, Permutations, Spaces R^n","MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,177.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
train,Professor Strang - thank you for the amazing lectures and all the insights that you share with us!,p-bXJIa7QVI,"Lecture 30: Completing a Rank-One Matrix, Circulants!","MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,110.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
train,"5/32 is the probability of x-1/2 |
|
(5/24)/(4/3) are the areas |
|
|
|
Probability is so more interesting that the integration formulas (what I'm doing at school)",R9a_NHXrBcg,"Lec 23 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,15.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,1.0,,,,1.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,25:10 it is NOT a toin coss,VPZD_aij8H0,1. Introduction to Statistics,"MIT 18.650 Statistics for Applications, Fall 2016",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,11.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
train,@Stephane4JC What goes to @Arstim goes to @poyanator and everyone who's watching this lecture.There's plenty of tougher courses on the nternet if this doesnt suit.,5q_3FDOkVRQ,"Lec 5 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,22.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"@sacredsoma The lector doesn't explain it, but in that part he is working with one of the several definitions of the Poisson process, in which the process is defined as a Lévy process. Deep knowledge of mathematical analysis, specifically measure theory, is required to understand these infitesimal properties of Lévy processes, that's why he doesn't say much about those second order terms.",jsqSScywvMc,14. Poisson Process I,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,26.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,,0.0,0.0,,,,1.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0 |
|
train,"This is such a wonderful intellectual lecture i really enjoy listening your lectures. |
|
Thanks very much. Education is the only way in my humble opinion to reduce the ignorance that tend to dominate the world in with we living today. ",19Ql_Q3l0GA,3. Independence,6.041 Probabilistic Systems Analysis and Applied Probability,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,145.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
train,Thank you so much! My professor's lecture left me completely confused. I think I'd still be pretty lost without this lecture.,HgEqXhsIq_g,"Lec 29 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,129.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,1.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
train,"Lecture 1: Rate of Change |
|
|
|
Lecture 2: Limits |
|
Lecture 3: Derivatives |
|
Lecture 4: Chain Rule |
|
Lecture 5: Implicit Differentiation |
|
Lecture 6: Exponential and Log |
|
Lecture 7: Exam 1 Review |
|
Lecture 9: Linear and Quadratic Approximations |
|
Lecture 10: Curve Sketching |
|
Lecture 11: Max-min |
|
Lecture 12: Related Rates |
|
Lecture 13: Newton's Method |
|
Lecture 14: Mean Value Theorem |
|
Lecture 15: Antiderivative |
|
Lecture 16: Differential Equations |
|
Lecture 18: Definite Integrals |
|
Lecture 19: First Fundamental Theorem |
|
Lecture 20: Second Fundamental Theorem |
|
Lecture 21: Applications to Logarithms |
|
Lecture 22: Volumes |
|
Lecture 23: Work, Probability |
|
Lecture 24: Numerical Integration |
|
Lecture 25: Exam 3 Review |
|
Lecture 27: Trig Integrals |
|
Lecture 28: Inverse Substitution |
|
Lecture 29: Partial Fractions |
|
Lecture 30: Integration by Parts |
|
Lecture 31: Parametric Equations |
|
Lecture 32: Polar Coordinates |
|
Lecture 33: Exam 4 Review |
|
Lecture 35: Indeterminate Forms |
|
Lecture 36: Improper Integrals |
|
Lecture 37: Infinite Series |
|
Lecture 38: Taylor's Series |
|
Lecture 39: Final Review",7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,100.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,can somebody explain me the explanation in 25:49 I can understand what he says but I'm confused about the text.,4sTKcvYMNxk,"Lec 4 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,170.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
train,"Compelling lecture (as always), but I'm unsettled about one thing: much of it is based on the fact that the first singular vector of A is the maximizing x in the definition of ||A||2. However, this fact just seems to be mentioned without proof or argument, and accordingly it doesn't feel as though the proof that ||A||2 = sigma1 is complete. Thoughts?",NcPUI7aPFhA,Lecture 8: Norms of Vectors and Matrices,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,51.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,1.0,,0.0,0.0,1.0,1.0,1.0,1.0,1.0,1.0,,,,,,1.0,,,,,1.0,,1.0,1.0,,,,,1.0,1.0,,,,,,,,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 |
|
train,"Thank you, MIT for making this publicly available and for free. |
|
Thanks, Youtube for storing and serving the content.",RDO6Py97IDg,1. A bridge between graph theory and additive combinatorics,"MIT 18.217 Graph Theory and Additive Combinatorics, Fall 2019",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,132.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 |
|
train,Brilliant,QVKj3LADCnA,2. Elimination with Matrices.,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,45.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"wish you a very long and healthy life, Professor Strang.",7UJ4CFRGd-U,An Interview with Gilbert Strang on Teaching Linear Algebra,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,208.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,Traditional camera angles were WAY easier on the eye.,or6C4yBk_SY,Lecture 2: Multiplying and Factoring Matrices,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,152.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 |
|
train,"Best video I have watched so far, I was with him all the way and my concentration never dipped.",yjBerM5jWsc,"9. Independence, Basis, and Dimension","MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,43.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,1.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,This lecture plus 3blue1brown's videos are getting these concepts to stick for me. Thank you Prof. Strang!!!,J7DzL2_Na80,1. The Geometry of Linear Equations,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,147.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,1.0,,,,0.0,1.0,,,0.0,1.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,1.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
train,"""As independent as siamese twins""-priceless",19Ql_Q3l0GA,3. Independence,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"Rachei de rir: ""... anybody below four is in trouble....""",2IdtqGM6KWU,11. Matrix Spaces; Rank 1; Small World Graphs,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,114.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 |
|
train,Respectable professor Dr. Casey your each lecture is brain storming,W2pw1JWc9k4,"Lecture 12: Lebesgue Integrable Functions, the Lebesgue Integral and the Dominated Convergence...","MIT 18.102 Introduction to Functional Analysis, Spring 2021",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,116.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,6,19Ql_Q3l0GA,3. Independence,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,17.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
train,"8:55 Min why does he want to avoid A to be singular? don´t quite understand...the only reason i see (in hindsight), is that the step, he does at 12:30 Min (rewriting LU with a diagonal matrix in the middle) wouldn´t be unique. But i assume, there must be another reason.",MsIvs_6vC38,4. Factorization into A = LU,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,18.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
train,"Calculus is one of the most important inventions of human history. It has allowed us to explain the world around us. I hate d, I mean HATED math until I got to Calculus!! Then it all made sense. Sometimes i still get obsessed with Calculus.",7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,48.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,1.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,Farmer Brown had two bushels of apples and one had twenty five apples and the other thirty apples - how many apples did the Government allow him to sell?,J7DzL2_Na80,1. The Geometry of Linear Equations,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,59.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,what is book he is referring as readings ?,CadZXGNauY0,9. Multiple Continuous Random Variables,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,204.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
train,"Shouldn't the small amount of heat required to raise the temperature of water be something like dQ= mcdT, where m=density*volume and c is the specific heat capacity of water?? Why does he just write energy = temp*volume?",R9a_NHXrBcg,"Lec 23 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,118.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,1.0,,0.0,,,,,,0.0,0.0,,,,,,,,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
train,just wow,13r9QY6cmjc,22. Diagonalization and Powers of A,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,188.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,i confuse headache,J7DzL2_Na80,1. The Geometry of Linear Equations,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,180.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,1.0,1.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"Anyone here, who took this class live in the classroom @MIT in 2005?",J7DzL2_Na80,1. The Geometry of Linear Equations,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,38.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"I like how this free class even excludes the errors, making it ten times better than the paid for class above...those two mistakes with the x (should have been u) could greatly confuse one",4sTKcvYMNxk,"Lec 4 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,80.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 |
|
train,This is the best lecture for Linear Algebra.,QVKj3LADCnA,2. Elimination with Matrices.,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,146.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,Thank you for this lecture: https://www.youtube.com/richcoast,u4qQ1oIQcW8,Lecture 25: Power Series and the Weierstrass Approximation Theorem,"MIT 18.100A Real Analysis, Fall 2020",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,126.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 |
|
train,"So good for doing this, even it's ordinary. Very good self-study material.",7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,121.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,1.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,Is this high school level in USA?? professor is really good! ,kCPVBl953eY,"Lec 3 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,92.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,"what a don! i wish my lecturer was this guy, he makes it so simple",8o5Cmfpeo6g,6. Column Space and Nullspace,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,203.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,1.0,1.0,,,0.0,0.0,,0.0,0.0,1.0,1.0,1.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,Please record your introductry number theory lectures,VPZD_aij8H0,1. Introduction to Statistics,"MIT 18.650 Statistics for Applications, Fall 2016",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,107.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,thanks Prof and MIT,nHlE7EgJFds,10. The Four Fundamental Subspaces,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,195.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 |
|
train,"I take much more time to finish watching the lecture , its about double time of the lecture if not more",ryLdyDrBfvI,"Lec 2 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,87.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,1.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,"At 84 years of age, he continues to be a maverick in the world of math education. Inspiring, to say the least.",YiqIkSHSmyc,Lecture 1: The Column Space of A Contains All Vectors Ax,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,41.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,Is it relevant for JEE,7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,91.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,I love the 21st Century,J7DzL2_Na80,1. The Geometry of Linear Equations,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,82.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
train,2019.12.19 thank you Gilbert Strang!,rZS2LGiurKY,Lecture 36: Alan Edelman and Julia Language,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,7.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 |
|
train,v=vot + 1/2at^2,ShGBRUx2ub8,"Lec 22 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,201.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
train,So glad I came across this!! Thank you so much MIT!,3MOahpLxj6A,5. Discrete Random Variables I,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,120.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,,0.0,,0.0,,,,,,0.0,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 |
|
test,"""unapropriate"" LOL",seO7-TwXH_I,"Lec 30: Line integrals in space, curl, exactness... | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,3.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,"Thanks a lot, great understanding.",gMTiAeE0NCw,13. Bernoulli Process,6.041 Probabilistic Systems Analysis and Applied Probability,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,136.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
test,ok i am kinda lost in this lecture,57jzPlxf4fk,"Lec 5: Parametric equations for lines and curves | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,190.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,"Amazing! I like linear algebra a lot, I already had this class in college, I keep reading about it and ... I didn't even notice the passing of 40 minutes of the first class you! No wonder MIT is a world reference!",J7DzL2_Na80,1. The Geometry of Linear Equations,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,33.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,1.0,,,,1.0,1.0,,,0.0,1.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
test,"@rmhism Who Takes Calculus In Seventh Grade ? |
|
Better Yet , Who *Teaches* It ?",PxCxlsl_YwY,"Lec 1: Dot product | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,25.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,david jerison,-MI0b4h3rS0,"Lec 15 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,171.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,"Just the best.... MIT lov it |
|
|
|
",5q_3FDOkVRQ,"Lec 5 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,98.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,Im a bit confused when he talked that the gradient vector is perpendicular to the circle x^2 + y^2 = c ? did he mean that it's perpendicular to the tangent of the circle. And is that vector's direction parallel to the xy plane ?,2XraaWefBd8,"Lec 12: Gradient; directional derivative; tangent plane | MIT 18.02 Multivariable Calculus, Fall 07","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,89.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
test,Good Job Professor,CXKoCMVqM9s,"Lec 28 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,62.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
test,"OCW really helps me with my studies. I will donate once I have money again :,D",-qCEoqpwjf4,6. Discrete Random Variables II,6.041 Probabilistic Systems Analysis and Applied Probability,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,105.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,1.0,1.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,"What an elegant way of lecturing. Thank you, sir.",j9WZyLZCBzs,1. Probability Models and Axioms,6.041 Probabilistic Systems Analysis and Applied Probability,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,155.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,1.0,,1.0,1.0,1.0,1.0,1.0,1.0,0.0,0.0,,,,,,0.0,,,,1.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
test,@DiireStraiits Lecture 8 :),PxCxlsl_YwY,"Lec 1: Dot product | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,21.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
test,"00:00 - 14:20 |
|
equation of plane formed by 3 points in space |
|
|
|
14:21 - 19:13 |
|
matrices |
|
|
|
19:14 - 29:02 |
|
matrix multiplication |
|
|
|
29:03 - 37:29 |
|
identity matrix |
|
transformations |
|
|
|
37:30 |
|
matrix inverse",bHdzkFrgRcA,"Lec 3: Matrices; inverse matrices | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,4.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,1.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,1.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,Deepest thanks Prof. Strang!!!!,FX4C-JpTFgY,3. Multiplication and Inverse Matrices,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,53.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
test,"Hm, MIT got good chalks... =O",ryLdyDrBfvI,"Lec 2 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,71.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 |
|
test,I WOULD LIKE TO THANK HIM FOR HIS CALCULS ON THE WEB.,7UJ4CFRGd-U,An Interview with Gilbert Strang on Teaching Linear Algebra,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,73.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 |
|
test,"Sometimes I do prefer the more imaginative, romantic approach to the first several lectures in linear algebra, beginning with defining vector spaces and how operations are made, how these spaces are defined. I do however really enjoy this approach as well, it's actually a bit more applicable.",QVKj3LADCnA,2. Elimination with Matrices.,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,123.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,1.0,,,,1.0,1.0,,,1.0,1.0,,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,,,,,,0.0,,,,1.0,1.0,,1.0,,,,,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
test,"An Indian edition of this book is available only in India |
|
http://www.wellesleypublishers.com/buy.html",FX4C-JpTFgY,3. Multiplication and Inverse Matrices,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,35.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,Let's go with integrals,YP_B0AapU0c,"Lec 16: Double integrals | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,101.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,"just a small note 15:43) A_t*A is positive ""semi""-definite, in general. If A is non-singular(if all columns are independent), _then_ A_t*A is positive definite. |
|
26:02) It can be solved by 'calculating' like u2 = 1/sigma2*A*v2, but the essence is that [the sigmas don't have to be positive, it's just that WE CHOOSE them to be positive]. |
|
[u2 / sigma2 / v2] - ANY one of them can have different sign. It's just our choice, like -p=q*r and p=-q*r and p=q*-r are all equivalent. |
|
(of course the order of the columns and sigmas are our choice as well - that's how the sigmas are ordered like that)",TX_vooSnhm8,29. Singular Value Decomposition,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,187.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,1.0,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
test,"i'm confused, he says at 12min that the ""we know.. the function stays constant, but we can also know how function changes using the chain rule"" didn't he just say it was constant? so confused.... T.T",2XraaWefBd8,"Lec 12: Gradient; directional derivative; tangent plane | MIT 18.02 Multivariable Calculus, Fall 07","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,184.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,1.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 |
|
test,U guys learn this in mit !! in india this is taught to 15-16 yr old,ryLdyDrBfvI,"Lec 2 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,153.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,I am afraid because Sir Gilbert you are 83.,nHlE7EgJFds,10. The Four Fundamental Subspaces,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,75.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,thought I was playing sims for a moment there..,23LLB9mNJvc,19. Determinant Formulas and Cofactors,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,200.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,"Really is an excellent lecture, helped a lot with my understanding of Bayesian Stats. In the Non-informative priors section, is the unbounded case actually where an improper prior needed?",k2inA31Gups,18. Bayesian Statistics (cont.),"MIT 18.650 Statistics for Applications, Fall 2016",1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,115.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,1.0,1.0,,,0.0,0.0,,1.0,1.0,0.0,1.0,1.0,1.0,1.0,0.0,,,,,,0.0,,,,1.0,0.0,,0.0,,,,,0.0,1.0,1.0,0.0,1.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 |
|
test,"Just finished the whole Single Variable Calculus course. |
|
Thank you MIT OCW for making this possible. |
|
Chile, 2019. |
|
|
|
|
|
-One thing to rule them all |
|
One thing to find them |
|
One thing to bring them all |
|
And in a matrix bind them-",--lPz7VFnKI,"Lec 39 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,97.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,1.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,1.0 |
|
test,Any Indian is here,7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,36.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,Thanks for these lectures and for the effort in putting together these videos!,or6C4yBk_SY,Lecture 2: Multiplying and Factoring Matrices,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,139.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0 |
|
test,He is French isn't he?,phbw9r1iUDI,7. Parametric Hypothesis Testing,"MIT 18.650 Statistics for Applications, Fall 2016",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,67.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,53:24 I guess that's what everyone feels after talking so quick for a long time :D,VPZD_aij8H0,1. Introduction to Statistics,"MIT 18.650 Statistics for Applications, Fall 2016",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,16.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,Great teacher.,7UJ4CFRGd-U,An Interview with Gilbert Strang on Teaching Linear Algebra,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,65.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
test,@iDiViNeXx yeah he's french,PxCxlsl_YwY,"Lec 1: Dot product | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,23.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,even my grandma can undestand,QVKj3LADCnA,2. Elimination with Matrices.,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,172.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
test,I find it really helpful to pause and try to proof the theorems before they are worked out in lecture.,9_xG0AGRa-w,Lecture 2: Cantor's Theory of Cardinality (Size),"MIT 18.100A Real Analysis, Fall 2020",0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,78.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,1.0,1.0,,,,1.0,0.0,,,0.0,1.0,,1.0,1.0,0.0,1.0,1.0,1.0,1.0,0.0,,,,,,1.0,,,,1.0,1.0,,1.0,,,,,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0 |
|
test,"in 47:38 what's the point of limiting the boundaries of the integral to exclude zero regions? wouldn't integrating over those areas (over the entire x) just ""add"" zeroes, thus yielding the same result?",CadZXGNauY0,9. Multiple Continuous Random Variables,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,185.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
test,Yo.,CXKoCMVqM9s,"Lec 28 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,164.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,"The best joke of all occurs at 19:00. |
|
|
|
“Five minutes of 18.06 is enough to take care of 18.03.” |
|
|
|
And it’s not a joke, provided you have a thorough understanding of 18.06. |
|
",2IdtqGM6KWU,11. Matrix Spaces; Rank 1; Small World Graphs,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,140.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,1.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,what's the story with all the spam below,PnPIqh7Frlw,"Lec 24: Simply connected regions; review | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,205.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,the interation is wrong cuz the answer is 3pi/16,YP_B0AapU0c,"Lec 16: Double integrals | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,197.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
test,"Wish this guy taught me Math 293 and 294 at Cornell. My guy could barely speak English, let alone explain what we were trying to accomplish. I understood that if we wanted eigenvectors perpendicular to x we'd get lift relative to flow...but this guy would have made the math a bit simpler.",cdZnhQjJu4I,21. Eigenvalues and Eigenvectors,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,160.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,1.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
test,why is iteration in newtons done..i cant understand the logic behind this,sRIDVAcoG5A,"Lec 13 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,207.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,0.0,1.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
test,Why the number of operation is equal to n^2 ? Can anyone ?,MsIvs_6vC38,4. Factorization into A = LU,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,159.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
test,"This is an excellent, step-by-step proof that I will never be a mathematician.",9v25gg2qJYE,"Lec 6 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,144.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,,,,,,0.0,,,,1.0,1.0,,1.0,,,,,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
test,Professor Strang is the BEST! Makes me motivated to enter MIT,Y_Ac6KiQ1t0,15. Projections onto Subspaces,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,111.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
test,Can someone say u-not better than Gilbert Strang?,lGGDIGizcQ0,24. Markov Matrices; Fourier Series,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,49.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,Absolutely well done and definitely keep it up!!! 👍👍👍👍👍,C_W1adH-NVE,2. Introduction to Statistics (cont.),"MIT 18.650 Statistics for Applications, Fall 2016",1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,30.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,"An Indian edition of this book is available only in India |
|
http://www.wellesleypublishers.com/buy.html",Cx5Z-OslNWE,Course Introduction of 18.065 by Professor Strang,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,34.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,"@ 50:18 |
|
|
|
Now I see, guess I made the same mistake. It is -1.",aeXp1zC6Hls,"Lec 30 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,20.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,,0.0,0.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,I remember when the 4 color was a lived controversy.,G3mAXHuoDSw,Lecture 4: The Open Mapping Theorem and the Closed Graph Theorem,"MIT 18.102 Introduction to Functional Analysis, Spring 2021",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,85.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,1.0,1.0,,,,0.0,0.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
test,Big picture starts at 20:47,QVKj3LADCnA,2. Elimination with Matrices.,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,44.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,,0.0,0.0,,,,0.0,1.0,,,0.0,0.0,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,,,,,,0.0,,,,0.0,0.0,,0.0,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
train,Where can I find the online homework? I can't find it in OCW.,xsP-S7yKaRA,5. Positive Definite and Semidefinite Matrices,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,271.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,1.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,,,,,,,,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,46:27 Top 10 saddest anime betrayals,9FLItlbBUPY,"Lec 2: Determinants; cross product | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,228.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 |
|
expansion,06:58 Anybody knows that why the dimension of symmetric matrix and upper triangular matrix are 6?,2IdtqGM6KWU,11. Matrix Spaces; Rank 1; Small World Graphs,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,209.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,13:23 Those who want to audition.,wOHrNt9ScYs,"Lec 38 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,211.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,so there is really a mistake at 33:11... it has to be 3*2^(2/3)*V^(2/3),YN7k_bXXggY,"Lec 12 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,276.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,"This guy screwed up at 26:27 by by using d(e^w)/dw = e^w, which is the exact conclusion he needs to prove. He should've calculated derivative of ln(x) using definition of derivative. Then he can find derivative of e^x using inverse function derivative property. Can't believe this is MIT.",9v25gg2qJYE,"Lec 6 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,268.0,0.0,0.0,1.0,,0.0,,0.0,1.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,1.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 |
|
expansion,48:50 I had to look up why that is a right angle. It wasn't so obvious to me. But great lecture!,60e4hdCi1D4,"Lec 17: Double integrals in polar coords; applications | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,229.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,1.0,,,,1.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,"24:38 16*(first row) -9* (second row)= (third row) |
|
Welcome back Doc Strang!",YiqIkSHSmyc,Lecture 1: The Column Space of A Contains All Vectors Ax,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,218.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,1.0,0.0,0.0,1.0,0.0 |
|
expansion,"Wouldn't the derivative of the magnitude of the position vector 14:57 tell you the speed with respect to the origin? |
|
|
|
The magnitude of the position vector is the distance between the position and the origin. The derivative tells you if the magnitude is getting larger or smaller, but that's the same as saying the distance to the origin is getting larger or smaller.",0D4BbCa4gHo,"Lec 6: Velocity, acceleration; Kepler's second law | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,272.0,0.0,0.0,1.0,,0.0,,0.0,1.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,"I tried to reach same conclusion using the fact that p is perpendicular to (b-p) i.e. p(t) * (b-p) = 0 and then substitute p=Ax but failed to get similar result as @28:31. There is an x(t) hanging around. |
|
-> p(t) *(b-p) = 0 |
|
-> (Ax)(t) * (b-Ax)=0 |
|
-> x(t) * A(t) * (b-Ax)=0 |
|
-> x(t) * [ A(t) * b - A(t) * A * x ] = 0 |
|
|
|
--?? can I justify that the second term must equate to zero? |
|
|
|
any suggestions how to move forward? |
|
|
|
where "" ""(t) is "" "" transpose.",Y_Ac6KiQ1t0,15. Projections onto Subspaces,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,261.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,1.0,,,,1.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,"At 22:04, shouldn't it be 'E[XY]=theta0*E[X]+theta1E[X^2] -> E[XY]=(E[Y]-theta1*E[X])*E[X]+theta1E[X^2] -> E[XY]-E[X]E[Y]=theta1(E[X^2]-E[X]^2)'?",tBUHRpFZy0s,24. Classical Inference II,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,249.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,1.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,"When he is looking up at the board above @17:17, and I'm scrolling the mouse to see it...... 2005 problem became 2017 problem.... doh! I troll myself. :(",QVKj3LADCnA,2. Elimination with Matrices.,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,269.0,0.0,1.0,0.0,,0.0,,0.0,0.0,,1.0,1.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 |
|
expansion,"I disagree with the statement at 9:13 and 11:13. Should it not be combinations of columns of B or combinations of rows of A? Cj = A * Bj where Bj is the jth column of B. Similarly for rows, it's Ci.T = Ai.T * B where Ai.T is the ith row of A.",FX4C-JpTFgY,3. Multiplication and Inverse Matrices,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,258.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,Where I stopped. 0:00 Dec 26,QVKj3LADCnA,2. Elimination with Matrices.,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,270.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,0.0,1.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,20:33 standard 4-step procedure of calculating probability.,CadZXGNauY0,9. Multiple Continuous Random Variables,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,214.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,"He said ""toin coss"" twice at 25:10. :D",VPZD_aij8H0,1. Introduction to Statistics,"MIT 18.650 Statistics for Applications, Fall 2016",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,255.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,"42:16 For some cuts of s1, it cannot be pumped I think (there is even/odd fashion there, but it's proof by contradiction, so one example should be enough). Let's say p is odd, then we cut <vxy> where x=""1"" and (v, y) has same number of ""0""s in that case it can be pumped, but what if p is even ? then x=""1"" and we are left with (p-1) which clearly cuts in unequal parts for v and y, so the number of zeros in one side (whichever) will be smaller than other side. As I mentioned earlier, since we are proving by contradiction, one example should be sufficient to show the contradiction. Any thoughts, why this wouldn't work ? |
|
|
|
Edit: I got it wrong at first glance, the answer to my statement above is: You need to show there is NO way to cut the string such that it can be pumped.",IycOPFmEQk8,"5. CF Pumping Lemma, Turing Machines","MIT 18.404J Theory of Computation, Fall 2020",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,226.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,1.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,Can someone please explain the significance of the answer to his last question at 47:13 ?,6-wh6yvk6uc,"12. Graphs, Networks, Incidence Matrices","MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,250.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,43:20 ㅋㅋㅋㅋㅋㅋㅋㅋㅋㅋ,0D4BbCa4gHo,"Lec 6: Velocity, acceleration; Kepler's second law | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,227.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 |
|
expansion,At 1:11:30 why does he use the biased sample variance and not the unbiased one? Conceptually it doesn't matter?,4HRhg4eUiMo,8. Parametric Hypothesis Testing (cont.),"MIT 18.650 Statistics for Applications, Fall 2016",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,248.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,35:20 Being an orthogonal matrix does not mean that it is square.,QNpj-gOXW9M,"20. Cramer's Rule, Inverse Matrix, and Volume","MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,222.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,Its amazing how the previous recitation perfectly syncs with the lecture at 16:17,9Q1q7s1jTzU,8. Solving Ax = b: Row Reduced Form R,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,263.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,22:52 so yağğğğğhhh y'know,TpWQlKHPyJ4,"Lec 31 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,217.0,0.0,0.0,0.0,,1.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,37:28 lol,MsIvs_6vC38,4. Factorization into A = LU,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,224.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,"41:53 These are questions that should be asked in recitation, not in lecture.",BGE3wb7H2PA,"Lec 33 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,225.0,0.0,0.0,0.0,,0.0,,0.0,1.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
expansion,"I'm confused at around 17:25, detA*I shouldn't be detA^n*I? cause it is the matrix of detA,so each row can be divided over detA, so that is detA^n*I!",QNpj-gOXW9M,"20. Cramer's Rule, Inverse Matrix, and Volume","MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,262.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,I dont get the bit @ 15:50 about small interval probabilities being a limiting case. Why are there any second order terms?,jsqSScywvMc,14. Poisson Process I,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,259.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,"53:38 H is positive definite when we are testing for a minimum. But for convex, if we only need d2f/dx2>=0, then I only need 2 diagonal entries if H be positive. No requirement for fxx and fyy.",nvXRJIBOREc,Lecture 21: Minimizing a Function Step by Step,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,230.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,"@31:00""The proper word, of course, is parallelepiped. But for obvious reasons, uh..., I wrote box."" rofl!",QNpj-gOXW9M,"20. Cramer's Rule, Inverse Matrix, and Volume","MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,232.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,1:13:51 v1 ZULUL,WW3ZJHPwvyg,19. Principal Component Analysis,"MIT 18.650 Statistics for Applications, Fall 2016",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,213.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,"25:16 is a bit weird to me. If mu2 surpasses lambda1, both mu2 and mu1 would be larger than lambda1 then, the interlacing wouldn't hold anymore. Anyone understands what's going on?",AdTvkFsqcDc,16. Derivatives of Inverse and Singular Values,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,219.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,1.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,10:06 is not a function. Function is a relation in which there is a unique image of every element in the domain.,5q_3FDOkVRQ,"Lec 5 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,210.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,Do we also need to take absolute value of x at 48:22?,-MI0b4h3rS0,"Lec 15 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,252.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,36:09 😍😍 جميل,YP_B0AapU0c,"Lec 16: Double integrals | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,223.0,0.0,0.0,0.0,,1.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,"30:00, the way to remember it is that the work is a straightforward dot product of F with <dx, dy>, M goes with x and N goes with y and we add, and the flux is a dot product of F with the same vector rotated pi/2 so N goes with x and a minus sign with few choices left for M. Auroux missed a nice opportunity at the beginning to clarify the sign convention for flux by foreshadowing the result for closed curves with + being from the inside, out. I'm not faulting anyone, I couldn't give a lecture on this and keep possession of both my hands when erasing blackboards operated by hazardous machines. If he loses his hands, he'll never erase anything again. Be careful out there, Denis, we don't want to lose a great teacher.",_CdoRiNSrqI,"Lec 23: Flux; normal form of Green's theorem | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,1.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,221.0,0.0,0.0,0.0,,0.0,,0.0,1.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0 |
|
expansion,"22:36 : positive semidefinite, not positive definite",osh80YCg_GM,16. Projection Matrices and Least Squares,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,215.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,22:38 That's the camera man,mHfn_7ym6to,8. Continuous Random Variables,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,216.0,0.0,1.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 |
|
expansion,13:50 - it is a kind of reverse-engineering proof...,xaSL8yFgqig,"Lecture 18: Counting Parameters in SVD, LU, QR, Saddle Points","MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,212.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,"Could someone please illustrate how the 7 has the most high probability for the problem at 31:00 |
|
Chose a number between 2 and 12 |
|
Win 100$ if you chose the sum of the 2 dice. |
|
|
|
I didn't quite understand how its 7",VPZD_aij8H0,1. Introduction to Statistics,"MIT 18.650 Statistics for Applications, Fall 2016",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,251.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,28:20 polar coordinates,XRkgBWbWvg4,"Lec 32 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,220.0,0.0,0.0,1.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,"For the Expected number of fish ( ~ 19:00 ), shouldn't he include the possibility of catching N fish during the first two hours? Is the probability of 3 or more fish in 0 < t < 2 insignificant?",XsYXACeIklU,15. Poisson Process II,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,254.0,0.0,0.0,1.0,,0.0,,0.0,1.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,"@hypnoticpoisons Use Euler’s formula with x =6t: |
|
|
|
e^ix = cos x + i sin x |
|
|
|
|e^ix| = |cos x + i sin x| |
|
= sqrt((cos x + i sin x)(cos x – i sin x)) |
|
= sqrt(cos^2 x + sin^2 x) |
|
= sqrt(1) |
|
= 1",IZqwi0wJovM,23. Differential Equations and exp(At),"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,242.0,0.0,0.0,0.0,,0.0,,1.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,1.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,"@bjrcboy whats funny is the boards they have are real slate and cost way more then white boards |
|
and im with you white on back is easier on the eyes to read",7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,240.0,0.0,1.0,0.0,,0.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 |
|
expansion,"@sententia8 I thought for like 5 mins. on your sentence, and then eventually figured out that it was so non-mathematical! lolzz",ryLdyDrBfvI,"Lec 2 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,247.0,0.0,0.0,0.0,,0.0,,1.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,"@kaga13 yes ok, but I'm sorry. Didn't want to offend you, I just wondering..",7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,244.0,0.0,0.0,0.0,,0.0,,1.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,@MITOCW can you up the course Real Analysis 18.100B?,cqdUuREzGuo,Lecture 8: Lebesgue Measurable Subsets and Measure,"MIT 18.102 Introduction to Functional Analysis, Spring 2021",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,236.0,0.0,0.0,0.0,,0.0,,1.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,"Hi, anybody knows if there is a way to get for free the book of the course? please email me primaveramanual@gmail.com",YiqIkSHSmyc,Lecture 1: The Column Space of A Contains All Vectors Ax,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,256.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,@gerax89gunner1 My 10th grade math teacher took 2 months to explain what this guy explained in like 40 mins :/,PxCxlsl_YwY,"Lec 1: Dot product | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,241.0,0.0,0.0,0.0,,0.0,,1.0,1.0,,0.0,0.0,,,,1.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,"@HyperBorealOperator (First of all, I don´t speak english as a native language, so, sorry for my mistakes) I think that is kinda stupid to criticize these videos that are free, I mean, if Somebody were paying it would be ok to give ""not flattering"" comments... |
|
|
|
There is a expression in my country ""A caballo regalado, no le mires el diente"" (Never look a gift horse in the mouth) |
|
|
|
This is free men/woman...don´t worry, I know you are just making a observation and not criticizing",9v25gg2qJYE,"Lec 6 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,233.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,@NothingPersonalTA Sorry third row would then be -5/2,QVKj3LADCnA,2. Elimination with Matrices.,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,238.0,0.0,0.0,1.0,,0.0,,1.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,@michalchik it's the math dragon theorem,kCPVBl953eY,"Lec 3 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,246.0,0.0,0.0,0.0,,0.0,,1.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,"@TryppingWins Yeah you're right, basically first year at Uni is a lot easier than A levels. Second year is a rude awakening, you work your ass off. Of course all the genius' posting here were discovering dark energy in their freshman year (or you would think that was the case the way they act so patronising and smug about this lecture.) It hasn't dawned on them that MIT have some experience in educating some of the brightest minds and recapping first principles might just have a valid purpose",7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,239.0,0.0,0.0,0.0,,0.0,,1.0,1.0,,0.0,0.0,,,,1.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,"@23:28 nice trick for figuring out how to get particular entries in matrix multiplication! (I wish I had realized this sooner.) At my school, vectors geometry and linear algebra are taught as separate courses. This professor is amazingly patient and illustrative. I wish I could have gone to his classes.",bHdzkFrgRcA,"Lec 3: Matrices; inverse matrices | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,231.0,0.0,0.0,0.0,,0.0,,0.0,1.0,,0.0,0.0,,,,1.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 |
|
expansion,@MrDeathMental LOL,7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,237.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,"Hi, anybody knows if there is a way to get for free the book of the course? please email me primaveramanual@gmail.com",or6C4yBk_SY,Lecture 2: Multiplying and Factoring Matrices,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,257.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,"I indexed this video on MyVindex for quick video referencing between the geometric, analytical interpretations and examples. - http://www.myvindex.co/app.html#/home/iSKiXlzQ . Email me if you're interested in MyVindex at jmoseman01@gmail.com",7K1sB05pE0A,"Lec 1 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,,,,,,,,,,260.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,"@MIT OpenCourseWare |
|
Can you please tell me the difference between this course and the on in this link: https://ocw.mit.edu/resources/res-6-012-introduction-to-probability-spring-2018/index.htm |
|
|
|
They look similar to me in content. The one in the link looks more structured.",j9WZyLZCBzs,1. Probability Models and Axioms,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,235.0,0.0,0.0,0.0,,0.0,,1.0,1.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,"@KatherineRogers Actually, if a constant k=1/1m is used, then in the final formula for V you will end up with subtracting m^1 from m^2 which is apparently not correct.",ShGBRUx2ub8,"Lec 22 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,234.0,0.0,0.0,1.0,,0.0,,1.0,0.0,,0.0,0.0,,,,1.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0 |
|
expansion,@joeglimmix by contrast his brain probably has 8 pack abbs ,0D4BbCa4gHo,"Lec 6: Velocity, acceleration; Kepler's second law | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,243.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,"@lolololort |
|
1/2(1 + sqrt(5)) is also the golden ratio! Math is amazing =] I'm sure the professor knew the answer and didn't calculate it in his head on the spot.",13r9QY6cmjc,22. Diagonalization and Powers of A,"MIT 18.06 Linear Algebra, Spring 2005",1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,245.0,0.0,0.0,0.0,,0.0,,1.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,항상성과 매개변수가 같다면 오진법과 십진법에서 7로 잡아서 문제를 풀고 대비숫자를 기준으로 문제를 도출해서 문제를 풀면된다. 의사결정 지원 시스템에 문제를 기입하고 풀면 됩니다.,MsIvs_6vC38,4. Factorization into A = LU,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,288.0,0.0,0.0,0.0,,1.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,❤️,yP1S37BiEsQ,13. Regression,"MIT 18.650 Statistics for Applications, Fall 2016",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,280.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,怎么从空间变换角度理解满秩只有一解的意义?,9Q1q7s1jTzU,8. Solving Ax = b: Row Reduced Form R,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,283.0,0.0,0.0,1.0,,1.0,,0.0,1.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,Prof. Strang can get it. no homo,Cx5Z-OslNWE,Course Introduction of 18.065 by Professor Strang,"MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,266.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 |
|
expansion,第一次知道三位直线参数方程是怎么来的,57jzPlxf4fk,"Lec 5: Parametric equations for lines and curves | MIT 18.02 Multivariable Calculus, Fall 2007","MIT 18.02 Multivariable Calculus, Fall 2007",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,284.0,0.0,0.0,0.0,,1.0,,0.0,1.0,,0.0,1.0,,,,1.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 |
|
expansion,"Que se supone que está explicando..???? está demostrando saltándose pasos de álgebra como si por eso fuera a ser mejor demostración... que quiere demostrar... que sabe y que es chingón... a los estudiantes hay que poner cada paso y exponer con claridad, nadie puede asimilar eso en una clase... PÉSIMA EXPLICACIÓN",4sTKcvYMNxk,"Lec 4 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,267.0,0.0,0.0,0.0,,1.0,,0.0,1.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 |
|
expansion,"Es fácil, no se porque se complican tanto. Gracias por el material, gracias al MIT.",kCPVBl953eY,"Lec 3 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,253.0,1.0,0.0,0.0,,1.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,1.0 |
|
expansion,buen curso si señor,kCPVBl953eY,"Lec 3 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,273.0,0.0,0.0,0.0,,1.0,,0.0,1.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 |
|
expansion,대칭 행렬의 경우 피봇들의 부호와 고유값의 부호가 같다.,UCc9q_cAhho,25. Symmetric Matrices and Positive Definiteness,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,286.0,0.0,0.0,0.0,,1.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,Nasıl bu kadar iyi öğretebiliyorsunuz bilmiyorum ama bir saniye bile dikkatim dağılmıyor. Bütün dönem boyunca birleştiremediğim parçaları 2 videoda anlayıp 'nasıl yaniiiiiiiii!' diye bağırdım. Thank you Sir you are the best :)),7UJ4CFRGd-U,An Interview with Gilbert Strang on Teaching Linear Algebra,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,265.0,1.0,0.0,0.0,,1.0,,0.0,1.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 |
|
expansion,❤️ ❤️ ❤️,9syvZr-9xwk,"1. Introduction, Finite Automata, Regular Expressions","MIT 18.404J Theory of Computation, Fall 2020",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,281.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,Εύχομαι κάθε δάσκαλος να είναι τόσο καλός όσο ο καθηγητής Γιάννης Τσιτσικλής!,j9WZyLZCBzs,1. Probability Models and Axioms,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,278.0,0.0,0.0,0.0,,1.0,,0.0,1.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 |
|
expansion,"Ένα μεγάλο ευχαριστώ, από Έλληνες φοιτητές!",jsqSScywvMc,14. Poisson Process I,6.041 Probabilistic Systems Analysis and Applied Probability,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,277.0,1.0,0.0,0.0,,1.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 |
|
expansion,"Охеренная лекция! Чувак рулит. Лучше, чем препы с мехмата объясняет!",JibVXBElKL0,"5. Transposes, Permutations, Spaces R^n","MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,279.0,0.0,0.0,0.0,,1.0,,0.0,1.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 |
|
expansion,❤️❤️❤️❤️❤️❤️❤️🇧🇩🇧🇩🇧🇩,BSAA0akmPEU,"Lec 9 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,282.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,itülüler yeter diyordu davut hoca dinlemiyordu,yjBerM5jWsc,"9. Independence, Basis, and Dimension","MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,274.0,0.0,0.0,0.0,,1.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,美帥酷 正負零 神國人,8o5Cmfpeo6g,6. Column Space and Nullspace,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,285.0,0.0,0.0,0.0,,1.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,Jedna planéta od povedala napi sa vody aby si mohol lietať ďalej,cdZnhQjJu4I,21. Eigenvalues and Eigenvectors,"MIT 18.06 Linear Algebra, Spring 2005",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,264.0,0.0,0.0,0.0,,1.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,이게 계속 쓰지 말라던 로피탈이구나,PNTnmH6jsRI,"Lec 35 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,287.0,0.0,0.0,1.0,,1.0,,0.0,0.0,,0.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
expansion,"moises cañas, mira como en el mit si responden las preguntas. Y teniendo muchas mas razones para ser creidos que en tu universidad.",ryLdyDrBfvI,"Lec 2 | MIT 18.01 Single Variable Calculus, Fall 2007","MIT 18.01 Single Variable Calculus, Fall 2006",0.0,,,,,,,,,,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,275.0,0.0,0.0,0.0,,0.0,,0.0,0.0,,1.0,0.0,,,,0.0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 |
|
|