{"id":6915,"date":"2022-04-28T20:48:56","date_gmt":"2022-04-28T20:48:56","guid":{"rendered":"https:\/\/www.tun.com\/courses\/2019\/12\/23\/mathematics-for-machine-learning-pca\/"},"modified":"2022-04-28T20:48:57","modified_gmt":"2022-04-28T20:48:57","slug":"mathematics-for-machine-learning-pca","status":"publish","type":"post","link":"https:\/\/www.tun.com\/courses\/mathematics-for-machine-learning-pca\/imperial-college-london\/","title":{"rendered":"Mathematics for Machine Learning: PCA"},"content":{"rendered":"<div class=\"single_post\" style=\"margin-top:16px;\">\n<div class=\"post-single-content box mark-links entry-content\">\n<div class=\"thecontent\">\n<h2>Description<\/h2>\n<p>This intermediate-level course introduces the mathematical foundations to derive Principal Component Analysis (PCA), a fundamental dimensionality reduction technique. We&#8217;ll cover some basic statistics of data sets, such as mean values and variances; compute distances and angles between vectors using inner products; and derive orthogonal projections of data onto lower-dimensional subspaces. Using all these tools, we&#8217;ll then derive PCA as a method that minimizes the average squared reconstruction error between data points and their reconstructions.<\/p>\n<p>At the end of this course, you&#8217;ll be familiar with these important mathematical concepts and will be able to implement PCA yourself. If you\u2019re struggling, a set of Jupyter notebooks will let you explore the properties of the techniques and walk you through what you need to do to get back on track. If you are already an expert, this course may refresh some of your knowledge.<\/p>\n<p>The lectures, examples and exercises require:<br \/>\n1. Some ability in abstract thinking<br \/>\n2. A good background in linear algebra (e.g., matrix and vector algebra, linear independence, basis)<br \/>\n3. A basic background in multivariate calculus (e.g., partial derivatives, basic optimization)<br \/>\n4. 
Basic knowledge of Python programming and NumPy<\/p>\n<p>Disclaimer: This course is substantially more abstract and requires more programming than the other two courses of the specialization. However, this type of abstract thinking, algebraic manipulation and programming is necessary if you want to understand and develop machine learning algorithms.<\/p>\n<div style=\"height:45px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<h2 class=\"has-text-align-center\">Price: Enroll For Free!<\/h2>\n<div style=\"height:45px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<div class=\"wp-block-button aligncenter\"><a class=\"wp-block-button__link has-text-color has-very-light-gray-color has-background has-vivid-red-background-color\" href=\"https:\/\/www.coursera.org\/learn\/pca-machine-learning\">View Class<\/a><\/div>\n<div style=\"height:55px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<div class=\"wp-block-columns\">\n<div class=\"wp-block-column\">\n<p class=\"has-text-align-center\"><em><strong>Language:<\/strong> <\/em>English<\/p>\n<\/div>\n<div class=\"wp-block-column\">\n<p class=\"has-text-align-center\"><em><strong>Subtitles:<\/strong> <\/em>English<\/p>\n<\/div>\n<\/div>\n<p style=\"background-color:#496d89\" class=\"has-text-color has-background has-text-align-center has-very-light-gray-color\"><a href=\"https:\/\/www.coursera.org\/learn\/pca-machine-learning\">Mathematics for Machine Learning: PCA<strong> &#8211; Imperial College London<\/strong><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Description This intermediate-level course introduces the mathematical foundations to derive Principal Component Analysis (PCA), a fundamental dimensionality reduction technique. We&#8217;ll cover some basic statistics of data sets, such as mean values and variances; compute distances and angles between vectors using inner products; and derive orthogonal projections of data onto lower-dimensional subspaces. 
Using all these [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":19408,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"om_disable_all_campaigns":false,"_uag_custom_page_level_css":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"footnotes":""},"categories":[165],"tags":[],"class_list":["post-6915","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-imperial-college-london"],"aioseo_notices":[],"uagb_featured_image_src":{"full":["https:\/\/www.tun.com\/courses\/wp-content\/uploads\/2019\/12\/Imperial-College-Londononline-education.png",378,224,false],"thumbnail":["https:\/\/www.tun.com\/courses\/wp-content\/uploads\/2019\/12\/Imperial-College-Londononline-education-150x150.png",150,150,true],"medium":["https:\/\/www.tun.com\/courses\/wp-content\/uploads\/2019\/12\/Imperial-College-Londononline-education-300x178.png",300,178,true],"medium_large":["https:\/\/www.tun.com\/courses\/wp-content\/uploads\/2019\/12\/Imperial-College-Londononline-education.png",378,224,false],"large":["https:\/\/www.tun.com\/courses\/wp-content\/uploads\/2019\/12\/Imperial-College-Londononline-education.png",378,224,false],"1536x1536":["https:\/\/www.tun.com\/courses\/wp-content\/uploads\/2019\/12\/Imperial-College-Londononline-education.png",378,224,false],"2048x2048":["https:\/\/www.tun.com\/courses\/wp-content\/uploads\/2019\/12\/Imperial-College-Londononline-education.png",378,224,false]},"uagb_author_info":{"display_name":"Axiom Pegasus","author_link":"https:\/\/www.tun.com\/courses\/author\/magic\/"},"uagb_comment_info":0,"uagb_excerpt":"Description This intermediate-level course introduces the mathematical foundations to derive Principal Component Analysis (PCA), a fundamental dimensionality reduction 
technique. We&#8217;ll cover some basic statistics of data sets, such as mean values and variances; compute distances and angles between vectors using inner products; and derive orthogonal projections of data onto lower-dimensional subspaces. Using all these&hellip;","featured_media_src_url":"https:\/\/www.tun.com\/courses\/wp-content\/uploads\/2019\/12\/Imperial-College-Londononline-education.png","_links":{"self":[{"href":"https:\/\/www.tun.com\/courses\/wp-json\/wp\/v2\/posts\/6915","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.tun.com\/courses\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tun.com\/courses\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tun.com\/courses\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tun.com\/courses\/wp-json\/wp\/v2\/comments?post=6915"}],"version-history":[{"count":0,"href":"https:\/\/www.tun.com\/courses\/wp-json\/wp\/v2\/posts\/6915\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.tun.com\/courses\/wp-json\/wp\/v2\/media\/19408"}],"wp:attachment":[{"href":"https:\/\/www.tun.com\/courses\/wp-json\/wp\/v2\/media?parent=6915"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tun.com\/courses\/wp-json\/wp\/v2\/categories?post=6915"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tun.com\/courses\/wp-json\/wp\/v2\/tags?post=6915"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}