The scaling matrix in LDA is not normalized because its purpose is to define projection directions that maximize between-class variability relative to within-class variability, and that objective depends only on the directions, not on their magnitudes. The Fisher criterion is scale-invariant: multiplying a scaling vector by any nonzero scalar leaves the ratio of between-class to within-class scatter unchanged, so the scaling matrix is only determined up to scale and is not unique. Normalizing it would merely fix an arbitrary magnitude without changing the projections' discriminative power; it is therefore unnecessary for achieving the desired results in LDA.
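A minimal NumPy sketch (synthetic two-class data; the variable names and the two-dimensional setup are illustrative assumptions, not part of the question) showing why normalization is unnecessary: the Fisher criterion J(w) = (wᵀS_B w)/(wᵀS_W w) is invariant to rescaling w, so any scalar multiple of the LDA direction is an equally valid solution.

```python
import numpy as np

# Two Gaussian classes in 2D (synthetic data for illustration only).
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(50, 2))  # class 0
X1 = rng.normal(loc=[3.0, 2.0], scale=1.0, size=(50, 2))  # class 1

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
# Within-class scatter: sum of per-class scatter matrices.
Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
# Between-class scatter: outer product of the mean difference.
diff = (m1 - m0).reshape(-1, 1)
Sb = diff @ diff.T

def fisher(w):
    """Fisher criterion: between-class over within-class variance along w."""
    return float(w @ Sb @ w / (w @ Sw @ w))

# Unnormalized LDA direction: solves Sw w = (m1 - m0).
w = np.linalg.solve(Sw, m1 - m0)

# Rescaling w by any nonzero constant leaves the objective unchanged,
# so there is no single "correct" magnitude to normalize to.
for c in (1.0, 0.1, 42.0):
    assert np.isclose(fisher(c * w), fisher(w))
```

Because J(c·w) = J(w) for every c ≠ 0, any normalization convention (unit norm, unit within-class variance, etc.) is just a cosmetic choice among equally optimal solutions.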
Asked: 2023-06-17 02:13:12 +0000