Arrow Research search

Author name cluster

Jiang Chen

Possible papers associated with this exact author name in Arrow. This page groups case-insensitive exact name matches and is not a full identity disambiguation profile.

5 papers
2 author rows

Possible papers

5

EAAI Journal 2025 Journal Article

Three-level-fused attribute reductions based on three-view uncertainty measures of three-view weighted neighborhood rough sets

  • Jiang Chen
  • Xianyong Zhang
  • Xiao Tang
  • Xiaoling Yang
  • Zhiying Lv

Attribute reductions facilitate classification learning, and they rely on uncertainty measures related to knowledge granulations of various rough sets. Weighted neighborhood rough sets (WNRSs) introduce attribute weights to optimize granular structures and improve neighborhood models, but their current attribute reductions and corresponding algorithms use algebraic weights and dependency degrees from only a single algebraic perspective. To enrich WNRSs and their attribute reductions, three-view attribute weights and three-view granulation measures are constructed from algebraic, informational, and algebra-information-fused viewpoints, and thus 3 × 3 = 9 attribute reductions are systematically proposed to generate heuristic algorithms with three-level fusion. First, based on the existing algebraic weights, informational weights are supplemented via information entropy, while fused weights are constructed by algebra-information integration; thus, three-view weights induce three-view weighted neighborhood granulations and three-view WNRSs, and granulation monotonicity of the latter is established. Then, based on WNRSs and dependency degrees, decision entropies are supplemented via an information function on roughness, while fused measures are constructed by a multiplication operation; thus, three-view measures are formulated and their granulation monotonicity is established. Furthermore, the three-view weights and measures motivate attribute reductions of WNRSs, and 3 × 3 = 9 heuristic reduction algorithms are systematically designed, extending and improving the existing algorithm; in terms of fusion strength, these 9 algorithms exhibit a three-level 4 + 4 + 1 structure of non-fusion, single-fusion, and double-fusion. Finally, through data experiments, the relevant uncertainty measures and attribute reductions are validated; all 9 reduction algorithms are comprehensively compared on classification learning, the new algorithms generally outperform the contrastive algorithm, and the double-fused algorithm on fused weights and measures achieves the best performance.
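The baseline measure this abstract builds on, the algebraic dependency degree of a weighted neighborhood rough set, can be sketched in a few lines. This is a minimal illustration of the standard formulation (weighted distance, neighborhood granule, positive region), not the paper's algorithms; the function name, the Euclidean weighting, and the radius `delta` are illustrative assumptions.

```python
import numpy as np

def dependency_degree(X, y, weights, delta=0.4):
    """Algebraic dependency degree of a weighted neighborhood rough set.

    X: (n, d) conditional attribute matrix, y: (n,) decision labels,
    weights: (d,) attribute weights, delta: neighborhood radius.
    """
    n = len(X)
    pos = 0
    for i in range(n):
        # weighted distance from sample i to every sample
        dist = np.sqrt(((X - X[i]) ** 2 * weights).sum(axis=1))
        nbr = dist <= delta                  # neighborhood granule of sample i
        # i belongs to the positive region if its whole neighborhood
        # lies in a single decision class
        if np.all(y[nbr] == y[i]):
            pos += 1
    return pos / n
```

A greedy heuristic reduction would then, at each step, add the attribute whose inclusion most increases this degree; the weighted distance shows where attribute weights reshape the granular structure.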

JMLR Journal 2009 Journal Article

Learning Acyclic Probabilistic Circuits Using Test Paths

  • Dana Angluin
  • James Aspnes
  • Jiang Chen
  • David Eisenstat
  • Lev Reyzin

We define a model of learning probabilistic acyclic circuits using value injection queries, in which fixed values are assigned to an arbitrary subset of the wires and the value on the single output wire is observed. We adapt the approach of using test paths from the Circuit Builder algorithm (Angluin et al., 2009) to show that there is a polynomial time algorithm that uses value injection queries to learn acyclic Boolean probabilistic circuits of constant fan-in and log depth. We establish upper and lower bounds on the attenuation factor of test paths versus general experiments for general and transitively reduced Boolean probabilistic circuits. We give computational evidence that a polynomial time learning algorithm using general value injection experiments may not do much better than one using test paths. For probabilistic circuits with alphabets of size three or greater, we show that the test path lemmas (Angluin et al., 2009, 2008b) fail utterly. To overcome this obstacle, we introduce function injection queries, in which the values on a wire may be mapped to other values rather than just to themselves or constants, and prove a generalized test path lemma for this case.
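The query model above can be made concrete with a toy simulator: inject fixed values on some wires, sample the rest, and observe the output wire. The `Circuit` class, the noisy-AND gate semantics, and the wire encoding below are illustrative assumptions, not the paper's circuit class; the point is only what a single value injection experiment observes.

```python
import random

class Circuit:
    """Toy acyclic probabilistic circuit over {0, 1}.

    wires maps a name to ("input", p_one) for an input wire, or to
    ("gate", [input_names], flip_p) for a gate that computes the AND of
    its inputs and flips the result with probability flip_p.
    """
    def __init__(self, wires):
        self.wires = wires  # assumed acyclic

    def sample_output(self, output, injection=None):
        """One value injection experiment: fix the wires in `injection`
        to constants, sample everything else, observe `output`."""
        injection = injection or {}
        values = {}

        def eval_wire(w):
            if w in values:
                return values[w]
            if w in injection:               # injected wires keep their fixed value
                values[w] = injection[w]
                return values[w]
            spec = self.wires[w]
            if spec[0] == "input":
                v = 1 if random.random() < spec[1] else 0
            else:
                _, ins, flip_p = spec
                v = int(all(eval_wire(i) for i in ins))
                if random.random() < flip_p:  # probabilistic gate noise
                    v ^= 1
            values[w] = v
            return v

        return eval_wire(output)
```

A learner would estimate Pr[output = 1 | injection] by repeating such experiments; with all relevant wires injected and zero noise the outcome is deterministic.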

TCS Journal 2008 Journal Article

A dynamic data structure for top-k queries on uncertain data

  • Jiang Chen
  • Ke Yi

In an uncertain data set S = (S, p, f), where S is the ground set consisting of n elements, p: S → [0, 1] a probability function, and f: S → ℝ a score function, each element i ∈ S with score f(i) appears independently with probability p(i). The top-k query on S asks for the set of k elements that has the maximum probability of appearing to be the k elements with the highest scores in a random instance of S. Computing the top-k answer on a fixed S is known to be easy. In this paper, we consider the dynamic problem, that is, how to maintain the top-k query answer when S changes, including element insertions and deletions in the ground set S, changes in the probability function p and in the score function f. We present a fully dynamic data structure that handles an update in O(k log n) time, and answers a top-j query in O(log n + j) time for any j ≤ k. The structure has O(n) size and can be constructed in O(n log k) time. As a building block of our dynamic structure, we present an algorithm for the all-top-k problem, that is, computing the top-j answers for all j = 1, …, k, which may be of independent interest.
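The static problem the abstract calls easy can be stated as a brute-force sketch: enumerate all 2^n instances and accumulate, for each k-set, the probability that it is the k highest-scoring elements of the instance. This is an exponential-time illustration of the semantics only, not the paper's data structure; the handling of instances with fewer than k present elements (they count for no k-set) is an assumption of this sketch.

```python
from collections import defaultdict
from itertools import product

def topk_answer(elems, p, f, k):
    """Brute-force top-k on an uncertain data set.

    elems: list of element ids; p, f: dicts id -> probability / score.
    Returns the most probable top-k set and the full distribution.
    """
    prob = defaultdict(float)
    for bits in product([0, 1], repeat=len(elems)):
        pr = 1.0
        present = []
        for e, b in zip(elems, bits):
            pr *= p[e] if b else (1 - p[e])
            if b:
                present.append(e)
        if len(present) >= k:
            # the k highest-scoring elements of this instance
            top = frozenset(sorted(present, key=lambda e: -f[e])[:k])
            prob[top] += pr
    return max(prob, key=prob.get), prob
```

For example, with two elements where the lower-probability one has the higher score, the answer can still be that element, since it tops every instance in which it appears.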

JMLR Journal 2006 Journal Article

Learning a Hidden Hypergraph

  • Dana Angluin
  • Jiang Chen

We consider the problem of learning a hypergraph using edge-detecting queries. In this model, the learner may query whether a set of vertices induces an edge of the hidden hypergraph or not. We show that an r-uniform hypergraph with m edges and n vertices is learnable with O(2^(4r) · m · poly(r, log n)) queries with high probability. The queries can be made in O(min(2^r (log m + r)^2, (log m + r)^3)) rounds. We also give an algorithm that learns an almost uniform hypergraph of dimension r using O(2^(O((1+Δ/2)r)) · m^(1+Δ/2) · poly(log n)) queries with high probability, where Δ is the difference between the maximum and the minimum edge sizes. This upper bound matches our lower bound of Ω((m/(1+Δ/2))^(1+Δ/2)) for this class of hypergraphs in terms of dependence on m. The queries can also be made in O((1+Δ) · min(2^r (log m + r)^2, (log m + r)^3)) rounds.
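The edge-detecting query model can be sketched directly: a query asks whether a vertex set contains some hidden edge. The oracle below matches that model; the pair-by-pair learner is only a naive O(n^2)-query baseline for the r = 2 case (ordinary graphs), added to make the model concrete, and is nothing like the paper's algorithms or bounds.

```python
from itertools import combinations

def edge_query(hidden_edges, vertex_set):
    """Edge-detecting query: does `vertex_set` contain a hidden edge?"""
    s = set(vertex_set)
    return any(set(e) <= s for e in hidden_edges)

def learn_graph_naive(vertices, oracle):
    """Naive baseline for r = 2: query every pair of vertices.

    Uses O(n^2) queries, far above the abstract's bounds, but each
    query is exactly one edge-detecting query on a 2-vertex set.
    """
    return {frozenset(pair) for pair in combinations(vertices, 2)
            if oracle(pair)}
```

Note that a query on a larger set only reveals that *some* edge is inside it, which is what makes the general problem nontrivial: the abstract's algorithms must disentangle which edges a positive answer came from.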