ZHANG Jiahao, GU Hengyang. Linear discriminant analysis based on the L1 norm[J]. Journal of Neijiang Normal University, 2025, 40(2): 21-24, 35. DOI: 10.13603/j.cnki.51-1621/z.2025.02.004

    Linear discriminant analysis based on the L1 norm

    Abstract: Linear discriminant analysis (LDA) is a well-known classification and dimensionality-reduction tool, widely used in face recognition, text mining, and other fields. In high-dimensional settings, however, where the number of features far exceeds the number of samples, traditional LDA suffers from a singularity problem. To address this, LDA is reformulated as an equivalent least-squares problem, and a new LDA method based on block coordinate descent (BCD) is proposed. In addition, to make the LDA solution sparser and more robust, an L1-norm regularization term is introduced into the model, enabling LDA to handle high-dimensional data. Sparsification reduces the dimensionality of the data and the computational cost; it also improves robustness by suppressing the influence of noise and outliers, and enhances interpretability by selecting the most important features, making the results easier to analyze and understand.
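    The approach the abstract describes can be sketched roughly as follows: regress a centered class-indicator matrix on the data (the least-squares view of LDA) with an L1 penalty on the projection matrix, and solve by coordinate descent with soft-thresholding. This is a minimal illustration under stated assumptions, not the authors' implementation; the function name `l1_ls_lda`, the penalty weight `lam`, and the fixed iteration count are illustrative choices.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the L1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def l1_ls_lda(X, y, lam=0.1, n_iter=100):
    """Hypothetical sketch: least-squares LDA with an L1 penalty.

    Regresses a centered one-hot class-indicator matrix Y on the
    centered data Xc, solving
        min_W  0.5 * ||Y - Xc W||_F^2 + lam * ||W||_1
    by cycling coordinate descent over the rows of W (one feature
    per block), each update closed-form via soft-thresholding.
    """
    n, d = X.shape
    classes = np.unique(y)
    Y = np.zeros((n, len(classes)))
    for j, c in enumerate(classes):
        Y[y == c, j] = 1.0
    Y -= Y.mean(axis=0)                 # center the indicator targets
    Xc = X - X.mean(axis=0)             # center the data

    W = np.zeros((d, len(classes)))
    col_sq = (Xc ** 2).sum(axis=0)      # per-feature squared norms
    R = Y - Xc @ W                      # current residual
    for _ in range(n_iter):
        for j in range(d):
            if col_sq[j] == 0.0:
                continue
            # Add feature j's contribution back into the residual,
            # then re-solve for its coefficients alone.
            R += np.outer(Xc[:, j], W[j])
            rho = Xc[:, j] @ R
            W[j] = soft_threshold(rho, lam) / col_sq[j]
            R -= np.outer(Xc[:, j], W[j])
    return W
```

    On high-dimensional toy data (more features than samples) the L1 penalty drives most rows of W exactly to zero, so the learned projection both avoids the singularity of the scatter matrices and selects a small set of discriminative features.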

       
