Elsevier

Knowledge-Based Systems

Volume 172, 15 May 2019, Pages 130-140

Incremental approaches to updating reducts under dynamic covering granularity

https://doi.org/10.1016/j.knosys.2019.02.014

Abstract

In real-world situations, knowledge acquisition from dynamic covering decision information systems (DCDISs) under variations of object sets, covering sets and covering granularity is an important research topic in covering-based rough set theory. In this paper, we first introduce the concepts of refining and coarsening coverings that arise when attribute value sets are revised, and investigate the mechanisms for updating related families in DCDISs with dynamic covering granularity. We then discuss the relationship between reducts of the original covering decision information systems (OCDISs) and those of DCDISs, and provide incremental algorithms for updating reducts that make full use of the existing results from OCDISs. Finally, we perform experiments on eight data sets downloaded from the UCI Machine Learning Repository, which verify that the proposed algorithms achieve better performance in terms of stability and computational time.

Introduction

Covering rough set theory, introduced by Zakowski [1] in the 1980s, has been considered an important mathematical tool for knowledge acquisition from information systems with incomplete, inconsistent and insufficient information. Over more than 30 years of development, researchers [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13], [14], [15], [16] have investigated covering-based rough sets from both theoretical and application perspectives. In particular, it has been successfully applied in many fields such as pattern recognition, rule induction and feature selection.

Many scholars have discussed the relationship between information granulation and knowledge acquisition of information systems from two aspects. On one hand, incremental methods [17], [18], [19], [20], [21], [22], [23], [24], [25], [26], [27], [28], [29], [30], [31], [32], [33], [34], [35], [36], [37], [38], [39], [40], [41], [42], [43], [44], [45], [46], [47], [48], [49], [50], [51], [52], [53], [54], [55], [56], which make full use of the existing results from the original data sets, are effective in terms of running time for knowledge acquisition of dynamic information systems with varying attribute value sets. For example, Hu et al. [44] investigated the dynamic mechanisms for updating approximations in multigranulation rough sets while refining and coarsening attribute values. Jing et al. [29] developed a group incremental reduction algorithm with varying data values. Li et al. [23] proposed an incremental approach to maintaining approximations of the dominance-based rough set approach when attribute values vary over time. Luo et al. [31] presented updating properties for dynamic maintenance of approximations when the criteria values in a set-valued decision system evolve with time. Qian et al. [35] addressed the attribute reduction problem for sequential three-way decisions under dynamic granulation. Zhang et al. [55] provided a dynamic three-way decision model for dynamic information systems with updating attribute values. On the other hand, many investigations have focused on multi-scale information tables [44], [57], [58], [59], [60], [61], [62], [63], in which objects are depicted at different scales under the same attribute, from a finer to a coarser labelled value. For example, Hao et al. [57] developed a sequential three-way decision model to investigate the optimal scale selection problem in a dynamic multi-scale decision table. Huang et al. [44] proposed multi-granulation decision-theoretic rough set models and optimal scale selection methods for acquiring knowledge from multi-scale intuitionistic fuzzy information tables. Luo et al. [58] exploited the updating mechanisms of decision granules induced by the similarity relation under cut refinement and coarsening through attribute value taxonomies. She et al. [60] performed the selection of the optimal level of scale and attribute reduction in a pointwise manner instead of a global one in multi-scale decision tables. Wu et al. [61] introduced the concepts of lower and upper approximations of sets with reference to different levels of scale and discussed optimal scale selection under various requirements in incomplete multi-scale decision tables. Besides, there is much other research on dynamic attribute values, multi-scale information systems and multi-granulation rough sets, which aims to study knowledge acquisition of information systems and to support decision making with less risk in a dynamic environment. In particular, the relationships among them and the motivations of this study are given by answering the following three questions:

(1) What are the differences between this study and the existing results (in terms of the refining and coarsening idea in a multi-scale data environment)? We observe that they have different objectives on different subjects with different methods, as follows. On one hand, this study focuses on knowledge acquisition of DCDISs based on the relationship between the original coverings and the refining and coarsening coverings. Concretely, we study how to update the related families of DCDISs with dynamic covering granularity based on those of OCDISs. We also investigate how to update reducts of DCDISs with dynamic covering granularity based on those of OCDISs. On the other hand, we observe that the existing studies (in terms of the refining and coarsening idea in a multi-scale data environment) investigate how to perform optimal level selection in multi-scale information systems. In particular, we find that objects are described at different scales under the same attribute, from a finer to a coarser labelled value, and that all partitions derived with respect to the same attribute satisfy a partial order relation in multi-scale information systems.

(2) What is the relationship between refining and coarsening coverings and multi-granularity? We see that they have different objectives with respect to different backgrounds. On one hand, we observe that refining and coarsening coverings are caused by dynamic attribute value sets in real-time situations. In particular, the refining and coarsening coverings derive from the original coverings: the granularity of coarsening coverings is larger than that of the original coverings, and the granularity of the original coverings is larger than that of the refining coverings. We find that most existing studies discuss refining and coarsening coverings for updating reducts of dynamic information systems, in which a target set is described by the lower and upper approximations under a single granulation. On the other hand, we observe that multi-scale data sets need multiple granulations for constructing set approximations, which inspires researchers to put forward pessimistic and optimistic multi-granulation rough sets for multi-source information fusion. In particular, we notice that multi-granulation rough set theory employs at least two granulations to depict the lower and upper approximations of sets.

(3) What are the motivations of this study? Firstly, Yang et al. [64] provided the related families-based method for attribute reduction of CDISs with respect to the third lower and upper approximation operators and illustrated its effectiveness and feasibility with experimental results. Due to the characteristics of data collection, there are many DCDISs with dynamic object sets, covering sets and covering granularity, such as medical diagnosis systems and transportation systems, which make the related families-based non-incremental approaches extremely inefficient for knowledge acquisition from these DCDISs. Secondly, Lang et al. [20] developed related families-based approaches to updating reducts of DCDISs with dynamic attribute sets, which illustrates that incremental approaches are more effective than non-incremental approaches in terms of computational time. In a dynamic environment, there are many DCDISs with dynamic covering granularity, and it is time-consuming to construct reducts of these DCDISs with non-incremental approaches. In particular, we have not seen related families-based incremental methods for updating reducts of DCDISs under dynamic covering granularity, and we should make full use of the existing results from OCDISs and study how to perform knowledge acquisition of DCDISs when revising covering granularity.

This study provides effective approaches for attribute reduction of DCDISs with dynamic covering granularity, and it contains three innovations. First, we develop related families-based incremental learning methods for attribute reduction of DCDISs when refining coverings. Concretely, we introduce the concepts of the refining and coarsening coverings resulting from variations of attribute value sets and study the relationship between the related sets of OCDISs and those of DCDISs with refining coverings. We provide the mechanisms for updating reducts of DCDISs when refining coverings and employ examples to illustrate how to compute reducts of DCDISs with the incremental algorithm. Second, we propose related families-based incremental approaches for attribute reduction of DCDISs when coarsening coverings. Concretely, we study the relationship between the related sets of OCDISs and those of DCDISs and propose an incremental algorithm for attribute reduction of DCDISs with coarsening coverings. We take examples to demonstrate how to perform attribute reduction of DCDISs with the incremental algorithm. Finally, we conduct experiments on eight data sets downloaded from the UCI Machine Learning Repository [65] with the incremental heuristic algorithms, and the experimental results illustrate that the proposed algorithms are effective for updating reducts of DCDISs with dynamic covering granularity.

This paper is structured as follows: In Section 2, we briefly review some concepts of covering-based rough set theory. Section 3 provides the related families-based incremental methods for attribute reduction of DCDISs when refining coverings. In Section 4, we develop the related families-based incremental approaches for attribute reduction of DCDISs when coarsening coverings. Section 5 demonstrates the effectiveness of the incremental algorithms with the experimental results. All conclusions are drawn in Section 6.

Section snippets

Preliminaries

In this section, we briefly review some concepts of covering information systems.

Definition 2.1

[44]

Let S = (U, A, V, f) be an information system, where U = {x1, x2, …, xn} is a non-empty universe, A = {a1, a2, …, am} is a non-empty attribute set, V = ⋃{Va | a ∈ A} is the set of attribute values, where Va is the domain of attribute a, f: U × A → V is an information function such that f(x, a) ∈ Va for any a ∈ A and x ∈ U, [x]a = {y | f(x, a) = f(y, a)}, and f(x, a) ≠ f(z, a), where x, y, z ∈ U and a ∈ A.

(1) If we take f(y, a) = v ∈ Va for some y ∈ [x]a, then f(y, a) is
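The equivalence classes [x]a of Definition 2.1 can be computed by grouping objects on their attribute values. Below is a minimal sketch in Python using a small hypothetical information system; the table, object names and function name are ours, not the paper's:

```python
from collections import defaultdict

def equivalence_classes(U, f, a):
    """Group objects by attribute value: [x]_a = {y in U | f(y, a) == f(x, a)}."""
    classes = defaultdict(set)
    for x in U:
        classes[f[(x, a)]].add(x)
    return list(classes.values())

# Hypothetical information system with a single attribute a1.
U = ["x1", "x2", "x3", "x4"]
f = {("x1", "a1"): 0, ("x2", "a1"): 0, ("x3", "a1"): 1, ("x4", "a1"): 1}
partition = equivalence_classes(U, f, "a1")  # two classes: {x1, x2} and {x3, x4}
```

The resulting list of classes forms a partition of U, which is the special case of a covering that Definition 3.1 later relaxes.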

Updating reducts of DCDISs with refining coverings

In this section, we investigate how to update reducts of DCDISs while refining coverings.

Definition 3.1

Let U = {x1, x2, …, xn}, and let C1 = {C11, C12, …, C1m1} and C2 = {C21, C22, …, C2m2} be coverings of the universe U. For any C1j ∈ C1, if there exists a subset C of C2 such that ⋃C = C1j, then C2 is called a refining covering of C1. Otherwise, C1 is called a coarsening covering of C2.

We observe that the concepts of the refining and coarsening coverings given by Definition 3.1 are generalizations of those given by Definition 2.1.
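Definition 3.1 can be checked mechanically: C2 refines C1 exactly when every block of C1 equals the union of the C2-blocks contained in it. A small sketch (our own illustration, not the authors' code), with coverings represented as lists of Python sets:

```python
def is_refining(C1, C2):
    """Return True if C2 is a refining covering of C1 (Definition 3.1):
    every block of C1 is the union of the C2-blocks it contains."""
    for block in C1:
        covered = set()
        for b in C2:
            if b <= block:  # b is a subset of this block
                covered |= b
        if covered != block:
            return False
    return True

C1 = [{1, 2}, {2, 3, 4}]
C2 = [{1}, {2}, {3, 4}]
print(is_refining(C1, C2))  # True: C2 refines C1
print(is_refining(C2, C1))  # False: C1 is only a coarsening of C2
```

Restricting the search to blocks b ⊆ C1j is sound because any subset of C2 whose union equals C1j can only contain such blocks.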

Updating reducts of DCDISs with coarsening coverings

In this section, we study how to update reducts of DCDISs while coarsening coverings.

Definition 4.1

Let U = {x1, x2, …, xn}, Δ = {C1, C2, …, Cm−1, Cm}, Δ′ = {C′1, C′2, …, C′m−1, C′m}, and D = {D1, D2, …, Dk}. Then (U, Δ′, D) is called a DCDIS of (U, Δ, D).

We see that (U, Δ′, D) is an ICDIS if (U, Δ, D) is an ICDIS. For simplicity, we only investigate the case in which C′i = Ci (1 ≤ i ≤ m−1) and C′m ≠ Cm, and denote Δ′ = {C′1, C′2, …, C′m−1, C′m} as Δ′ = {C1, C2, …, Cm−1, C′m} in this section.
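Under this simplification, a DCDIS differs from the original system only in its last covering. A toy sketch (hypothetical data and naming, our own illustration) of building Δ′ from Δ by coarsening Cm, here by merging two of its blocks:

```python
# Hypothetical original family of coverings Delta = {C1, C2, Cm} over {x1..x4}.
Delta = [
    [{"x1"}, {"x2"}, {"x3", "x4"}],    # C1
    [{"x1", "x2"}, {"x3"}, {"x4"}],    # C2
    [{"x1"}, {"x2"}, {"x3"}, {"x4"}],  # Cm, the covering to be coarsened
]

# Coarsen Cm by merging its last two blocks; all other coverings are kept.
C_m_new = Delta[-1][:-2] + [Delta[-1][-2] | Delta[-1][-1]]
Delta_prime = Delta[:-1] + [C_m_new]  # covering family of the DCDIS
```

Only the last covering changed, which is exactly what the incremental algorithms exploit: the related sets computed from C1, …, Cm−1 carry over unchanged.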

Example 4.2

Continuation from Example 2.7

(1) Let U = {x1, x2, …, x8}, Δ = {C1, C2, C3, C4, C5}, D = {{x1, x2}, {x3, x4, x5

Experimental results

In this section, we demonstrate with experimental results that IHVR and IHVC are feasible for attribute reduction of DCDISs under dynamic covering granularity.

To test NIHV, IHVR and IHVC, we transform the eight data sets described in Table 1 into CDISs. Concretely, we normalize all attribute values into the interval [0, 1] and derive CDISs with the neighbourhood operator N(x) = {y | d(x, y) ≤ 0.05, y ∈ U}, where d(x, y) = [Σ{|c(x) − c(y)|² | c ∈ A}]^(1/2). Moreover, we perform all computations on a PC with an Intel(R)
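The preprocessing just described can be sketched as follows; this is our own minimal implementation with assumed function names, not the authors' code (`math.dist` requires Python 3.8+):

```python
import math

def normalize(rows):
    """Min-max normalize each attribute (column) into [0, 1]."""
    cols = list(zip(*rows))
    lo, hi = [min(c) for c in cols], [max(c) for c in cols]
    return [[(v - l) / (h - l) if h > l else 0.0
             for v, l, h in zip(row, lo, hi)] for row in rows]

def neighbourhood_covering(rows, eps=0.05):
    """N(x) = {y | d(x, y) <= eps}, d the Euclidean distance; returns object indices."""
    return [[j for j, y in enumerate(rows) if math.dist(x, y) <= eps]
            for x in rows]

# Toy data: the first two objects are close, the third is far away.
rows = normalize([[1.0, 10.0], [1.02, 10.2], [5.0, 50.0]])
cover = neighbourhood_covering(rows, eps=0.05)  # [[0, 1], [0, 1], [2]]
```

Each N(x) contains x itself, so the neighbourhoods always form a covering of the universe.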

Conclusions

Due to the characteristics of data collection, there are many DCDISs under dynamic covering granularity in real-time situations, and knowledge acquisition from them is an important topic in covering rough set theory. In this paper, firstly, we have provided the concepts of the refining and coarsening coverings caused by revising attribute value sets and illustrated the updating mechanisms of the related families in DCDISs when varying covering granularity. Then we have investigated the

Acknowledgements

We would like to thank the anonymous reviewers very much for their professional comments and valuable suggestions. This work is supported by the National Natural Science Foundation of China (Nos. 61603063, 61673301, 11771059, 61573255), the Hunan Provincial Natural Science Foundation of China (Nos. 2018JJ2027, 2018JJ3518), and the Scientific Research Fund of Hunan Provincial Key Laboratory of Mathematical Modeling and Analysis in Engineering, China (No. 2018MMAEZD10).

References (65)

  • Li, J.H., et al., Three-way cognitive concept learning via multi-granularity, Inform. Sci. (2017)
  • Wu, Z.J., et al., Semi-monolayer cover rough set: concept, property and granular algorithm, Inform. Sci. (2018)
  • Yao, Y.Y., Three-way decision and granular computing, Int. J. Approx. Reason. (2018)
  • Yao, Y.Y., et al., Rough set models in multigranulation spaces, Inform. Sci. (2016)
  • Zhu, W., Relationship among basic concepts in covering-based rough sets, Inform. Sci. (2009)
  • Lang, G.M., et al., Characteristic matrices-based knowledge reduction in dynamic covering decision information systems, Knowl.-Based Syst. (2015)
  • Lang, G.M., et al., Knowledge reduction of dynamic covering decision information systems when varying covering cardinalities, Inform. Sci. (2016)
  • Lang, G.M., et al., Related families-based attribute reduction of dynamic covering decision information systems, Knowl.-Based Syst. (2018)
  • Yang, Y.Y., et al., Fuzzy rough set based incremental attribute reduction from dynamic data with sample arriving, Fuzzy Sets Syst. (2017)
  • Li, S.Y., et al., Incremental update of approximations in dominance-based rough sets approach under the variation of attribute values, Inform. Sci. (2015)
  • Chen, D.G., et al., An incremental algorithm for attribute reduction with variable precision rough sets, Appl. Soft Comput. (2016)
  • Jing, Y.G., et al., An incremental attribute reduction approach based on knowledge granularity under the attribute generalization, Int. J. Approx. Reason. (2016)
  • Jing, Y.G., et al., An incremental attribute reduction approach based on knowledge granularity with a multi-granulation view, Inform. Sci. (2017)
  • Li, T.R., et al., A rough sets based characteristic relation approach for dynamic attribute generalization in data mining, Knowl.-Based Syst. (2007)
  • Luo, C., et al., Fast algorithms for computing rough approximations in set-valued decision systems while updating criteria values, Inform. Sci. (2015)
  • Pratama, M., et al., An incremental meta-cognitive-based scaffolding fuzzy neural network, Neurocomputing (2016)
  • Pratama, M., et al., Scaffolding type-2 classifier for incremental learning under concept drifts, Neurocomputing (2016)
  • Qian, J., et al., Attribute reduction for sequential three-way decisions under dynamic granulation, Int. J. Approx. Reason. (2017)
  • Yang, X.B., et al., Updating multigranulation rough approximations with increasing of granular structures, Knowl.-Based Syst. (2014)
  • Zhang, Y.Y., et al., Incremental updating of rough approximations in interval-valued information systems under attribute generalization, Inform. Sci. (2016)
  • Huang, Y.Y., et al., Matrix-based dynamic updating rough fuzzy approximations for data mining, Knowl.-Based Syst. (2017)
  • Huang, Y.Y., et al., Dynamic variable precision rough set approach for probabilistic set-valued information systems, Knowl.-Based Syst. (2017)