Difference between revisions of "Talk:What does clustering tell us"
== How to deal with highly correlated variables? ==
Latest revision as of 09:52, 13 June 2022
It is called collinearity, and there are many tutorials right in the first Google search. Though it's only a problem for simpler ML algorithms, you may want to dig deeper for EDA. - DG
- In a way, that's what dimensionality reduction tries to achieve: it takes all the variables and rearranges them by the correlations between them. That's why in this case I end up with the three main directions (three groups of variables that are highly correlated with each other; two of those groups are in turn negatively correlated with each other). - Gedankenstuecke (talk) 09:52, 13 June 2022 (UTC)
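A minimal sketch of the idea in the reply above, on made-up data (the factor structure, group sizes, and noise levels are all hypothetical, chosen only to mimic "three groups of correlated variables, two of them negatively correlated"): running a numpy-only PCA shows the six collinear variables collapsing onto a few main directions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Three latent factors; variables within a group share one factor.
# f2 is built to be negatively correlated with f1 (hypothetical setup).
f1 = rng.normal(size=n)
f2 = -0.8 * f1 + 0.6 * rng.normal(size=n)
f3 = rng.normal(size=n)  # independent third group

# Six observed variables, two per factor, plus a little noise:
X = np.column_stack([
    f1 + 0.1 * rng.normal(size=n),
    f1 + 0.1 * rng.normal(size=n),
    f2 + 0.1 * rng.normal(size=n),
    f2 + 0.1 * rng.normal(size=n),
    f3 + 0.1 * rng.normal(size=n),
    f3 + 0.1 * rng.normal(size=n),
])

# PCA via SVD on the centered data
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)

# Six correlated variables collapse onto a few principal directions;
# the first two components carry the bulk of the variance.
print(np.round(explained, 3))
```

The loadings in `Vt` show which variables group together on each component, which is the "rearranging by correlation" the reply describes.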