Enhancing Performance of Quadratic Discriminant Analysis with Marginal Mahalanobis Distance Transformation
Aparna Shrivastava and Raghu Vamsi Potukuchi
2025, 21(9): 485-495. doi:10.23940/ijpe.25.09.p2.485495
Abstract
Quadratic Discriminant Analysis (QDA) is a widely used supervised classification technique that models data with Gaussian distributions and class-specific covariance matrices. However, many real-world datasets do not conform to the Gaussian assumption or pose challenges such as high-dimensional feature spaces. To address these issues, this paper introduces a novel method, MMD-QDA, which transforms data into class-specific marginal Mahalanobis distance (MMD) representations. The transformed data is then classified using QDA to leverage its ability to handle varying class covariances effectively. Furthermore, this approach reduces the feature dimension to the number of class labels in the dataset, making it particularly useful for datasets with a large number of features relative to the number of classes. The performance of the proposed MMD-QDA method is evaluated on 10 benchmark datasets from the UCI Machine Learning Repository using accuracy, precision, recall, F1-score, G-mean, and Cohen's Kappa. Simulation results show that MMD-QDA achieves significant improvements, with average performance gains of 9.68%, 10.91%, 9.68%, 10.80%, 10.36%, and 14.83% over LDA, and 11.99%, 7.07%, 11.99%, 12.35%, 9.44%, and 22.17% over QDA, across the respective metrics.
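The pipeline described in the abstract, mapping each sample to its distances from the class centroids and then classifying with QDA, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses the full per-class Mahalanobis distance as the transformed feature, whereas the paper's precise "marginal" construction may differ; the small ridge term added to each covariance matrix is an assumption for numerical stability.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import train_test_split

def fit_mmd_params(X, y, ridge=1e-6):
    """Estimate per-class mean and inverse covariance (ridge-regularized)."""
    params = []
    for c in np.unique(y):
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        cov = np.cov(Xc, rowvar=False) + ridge * np.eye(X.shape[1])
        params.append((mu, np.linalg.inv(cov)))
    return params

def mmd_transform(X, params):
    """Map each sample to its Mahalanobis distance from every class centroid,
    giving a feature vector whose dimension equals the number of classes."""
    feats = []
    for mu, inv_cov in params:
        d = X - mu
        feats.append(np.sqrt(np.einsum("ij,jk,ik->i", d, inv_cov, d)))
    return np.column_stack(feats)

# Example on a standard dataset: the 4-dimensional Iris features are
# reduced to 3 distance features (one per class) before QDA.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
params = fit_mmd_params(X_tr, y_tr)
clf = QuadraticDiscriminantAnalysis().fit(mmd_transform(X_tr, params), y_tr)
acc = clf.score(mmd_transform(X_te, params), y_te)
```

Note how the transform makes the downstream model's input dimension depend only on the number of classes, which is what makes the approach attractive when features greatly outnumber classes.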