EAAI Journal 2025, Journal Article
Three-level-fused attribute reductions based on three-view uncertainty measures of three-view weighted neighborhood rough sets
- Jiang Chen
- Xianyong Zhang
- Xiao Tang
- Xiaoling Yang
- Zhiying Lv
Attribute reductions facilitate classification learning, and they rely on uncertainty measures related to the knowledge granulations of various rough sets. Weighted neighborhood rough sets (WNRSs) introduce attribute weights to optimize granular structures and improve neighborhood models, but their current attribute reductions and corresponding algorithms utilize algebraic weights and dependency degrees from only a single algebraic perspective. To enrich WNRSs and their attribute reductions, three-view attribute weights and three-view granulation measures are constructed from algebraic, informational, and algebra-information-fused viewpoints, and thus 3 × 3 = 9 attribute reductions are systematically proposed to generate heuristic algorithms with three-level fusion. First, based on the existing algebraic weights, informational weights are supplemented via information entropy, while fused weights are constructed by algebra-information integration; thus, the three-view weights induce three-view weighted neighborhood granulations and three-view WNRSs, and the granulation monotonicity of the latter is established. Then, based on WNRSs and dependency degrees, decision entropies are supplemented via an information function on roughness, while fused measures are constructed by a multiplication operation; thus, three-view measures are formulated and shown to possess granulation monotonicity. Furthermore, the three-view weights and measures motivate attribute reductions of WNRSs, and 3 × 3 = 9 heuristic reduction algorithms are systematically designed, extending and improving the existing algorithm; in terms of fusion strength, these 9 algorithms exhibit a three-level 4 + 4 + 1 structure of non-fusion, single fusion, and double fusion. Finally, the relevant uncertainty measures and attribute reductions are validated through data experiments; all 9 reduction algorithms are comprehensively compared in classification learning, the new algorithms generally outperform the existing comparison algorithm, and the double-fused algorithm, based on fused weights and fused measures, achieves the best performance.
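
To make the abstract's pipeline concrete, here is a minimal Python sketch of the general scheme, not the paper's exact formulas. It assumes a weighted Euclidean distance with a delta threshold for neighborhoods, single-attribute dependency degrees as algebraic weights, binned attribute information entropy as informational weights, products for the fused weight and fused measure, and a standard forward greedy search; every function name (`neighborhood`, `dependency`, `decision_entropy`, `fused_measure`, `three_view_weights`, `heuristic_reduct`) and the parameter `delta` are illustrative choices of ours, not identifiers from the paper.

```python
# Illustrative sketch of weighted-neighborhood attribute reduction; the
# specific distance, entropy form, fusion operation, and stopping rule are
# assumptions, not the paper's definitions. Features are taken to lie in [0, 1].
import numpy as np

def neighborhood(X, w, delta=0.2):
    """Boolean n x n matrix: entry (i, j) says j lies in the weighted
    delta-neighborhood of i under attribute weights w."""
    diff = X[:, None, :] - X[None, :, :]           # pairwise differences
    dist = np.sqrt((w * diff ** 2).sum(axis=2))    # weighted Euclidean distance
    return dist <= delta

def dependency(X, y, w, delta=0.2):
    """Algebraic view: fraction of samples whose whole neighborhood shares
    their decision class (size of the neighborhood positive region over n)."""
    N = neighborhood(X, w, delta)
    same = y[:, None] == y[None, :]
    return np.all(~N | same, axis=1).mean()

def decision_entropy(X, y, w, delta=0.2):
    """Informational view: one common neighborhood decision-entropy form,
    -(1/n) * sum_i log2(|N(i) ∩ class(i)| / |N(i)|); smaller is better."""
    N = neighborhood(X, w, delta)
    same = y[:, None] == y[None, :]
    purity = (N & same).sum(axis=1) / N.sum(axis=1)   # purity > 0: i is in N(i)
    return -np.mean(np.log2(purity))

def fused_measure(X, y, w, delta=0.2):
    """Fused view via a multiplication operation (one plausible form):
    dependency times 2**(-entropy), the geometric-mean neighborhood purity."""
    return dependency(X, y, w, delta) * 2.0 ** (-decision_entropy(X, y, w, delta))

def three_view_weights(X, y, delta=0.2):
    """Illustrative three-view weights: algebraic from single-attribute
    dependency degrees, informational from binned attribute entropy,
    fused from their product; each vector is normalized to sum to 1."""
    n, m = X.shape
    alg = np.array([dependency(X[:, [a]], y, np.ones(1), delta) for a in range(m)])
    info = np.empty(m)
    for a in range(m):
        counts = np.histogram(X[:, a], bins=10)[0]
        p = counts[counts > 0] / n
        info[a] = -(p * np.log2(p)).sum()
    def norm(v):
        return v / v.sum() if v.sum() > 0 else np.full(m, 1.0 / m)
    return norm(alg), norm(info), norm(alg * info)

def heuristic_reduct(X, y, w, measure, delta=0.2):
    """Forward greedy reduction: repeatedly add the attribute that maximizes
    the chosen measure, stopping once the full-attribute value is matched."""
    m = X.shape[1]
    full = measure(X, y, w, delta)
    red, rest = [], list(range(m))
    while rest:
        gains = [measure(X[:, red + [a]], y, w[red + [a]], delta) for a in rest]
        best = int(np.argmax(gains))
        red.append(rest.pop(best))
        if gains[best] >= full - 1e-6:
            break
    return red

# Tiny synthetic demo of the double-fused variant (fused weights + fused
# measure); for the informational view, maximize the entropy's negation,
# e.g. measure=lambda X, y, w, d: -decision_entropy(X, y, w, d).
rng = np.random.default_rng(0)
X = rng.random((60, 5))
y = (X[:, 0] + X[:, 2] > 1).astype(int)
_, _, w_fused = three_view_weights(X, y)
print(heuristic_reduct(X, y, w_fused, fused_measure))
```

Pairing each of the three weight vectors with each of the three measures yields the 3 × 3 grid of reduction algorithms described in the abstract: the four pairs with no fused component form the non-fusion level, the four pairs with exactly one fused component form the single-fusion level, and the fused-weight/fused-measure pair is the single double-fusion algorithm, giving the 4 + 4 + 1 structure.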