Arrow Research search

Author name cluster

Stuart J. Russell

Possible papers associated with this exact author name in Arrow. This page groups case-insensitive exact name matches and is not a full identity disambiguation profile.

9 papers
1 author row

Possible papers

9

AAAI Conference 2021 Conference Paper

PAC Learning of Causal Trees with Latent Variables

  • Prasad Tadepalli
  • Stuart J. Russell

Learning causal probabilistic models with latent variables from observational and experimental data is an important problem. In this paper we present a polynomial-time algorithm that PAC-learns the structure and parameters of a rooted, tree-structured causal network of bounded degree where the internal nodes of the tree cannot be observed or manipulated. Our algorithm is the first of its kind to provably learn the structure and parameters of tree-structured causal models with latent internal variables from random examples and active experiments.

IJCAI Conference 1993 Conference Paper

Provably bounded optimal agents

  • Stuart J. Russell
  • Devika Subramanian
  • Ronald Parr

A program is bounded optimal for a given computational device in a given environment if the expected utility of the program running on the device in the environment is at least as high as that of all other programs for the device. Bounded optimality differs from the decision-theoretic notion of rationality in that it explicitly allows for the finite computational resources of real agents. It is thus a central issue in the foundations of artificial intelligence. In this paper we consider a restricted class of agent architectures, in which a program consists of a sequence of decision procedures generated by a learning program or given a priori. For this class of agents, we give an efficient construction algorithm that generates a bounded optimal program for any episodic environment, given a set of training examples. The algorithm includes solutions to a new class of optimization problems, namely scheduling computational processes for real-time environments. This class appears to contain significant practical applications.
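The bounded-optimality condition in the abstract can be written schematically as follows. The notation here is a paraphrase for illustration, not a quotation from the paper: a program l* for machine M, environment E, and utility function U is bounded optimal when

```latex
l^{*} \;=\; \operatorname*{arg\,max}_{l \,\in\, \mathcal{L}_M} \; V\big(\mathrm{Agent}(l, M),\, E,\, U\big)
```

where \(\mathcal{L}_M\) is the set of programs runnable on M, \(\mathrm{Agent}(l, M)\) is the agent function computed by program l on that machine, and V is its expected utility in E. The maximization is over feasible programs rather than over agent functions, which is what distinguishes bounded optimality from classical rationality.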

IJCAI Conference 1989 Conference Paper

Execution Architectures and Compilation

  • Stuart J. Russell

This paper introduces a partition of the possible forms of knowledge according to their relationship to the basic objective of an intelligent agent, namely to act successfully in response to its environment. The resulting classes of knowledge range from fully declarative to fully compiled. From these classes, it is possible to generate 1) a set of execution architectures, each of which combines some of the classes to produce decisions; and 2) a set of compilation methods that transform knowledge into more efficient but (approximately) behaviourally equivalent, compiled forms. Existing compilation methods can be understood within this framework, and new compilation methods and execution architectures are indicated. It is proposed that systems with the ability to learn, use and transform between all the types of knowledge may be able to achieve simultaneously higher levels of competence, efficiency and flexibility.

AAAI Conference 1987 Conference Paper

A Declarative Approach to Bias in Concept Learning

  • Stuart J. Russell

We give a declarative formulation of the biases used in inductive concept learning, particularly the Version-Space approach. We then show how the process of learning a concept from examples can be implemented as a first-order deduction from the bias and the facts describing the instances. This has the following advantages: 1) multiple sources and forms of knowledge can be incorporated into the learning process; 2) the learning system can be more fully integrated with the rest of the beliefs and reasoning of a complete intelligent agent. Without a semantics for the bias, we cannot generally and practically build machines that generate inductive biases automatically and hence are able to learn independently. With this in mind, we show how one part of the bias for Meta-DENDRAL, its instance description language, can be represented using first-order axioms called determinations, and can be derived from basic background knowledge about chemistry. The second part of the paper shows how bias can be represented as defaults, allowing shift of bias to be accommodated in a nonmonotonic framework.

IJCAI Conference 1987 Conference Paper

A Logical Approach to Reasoning by Analogy

  • Todd R. Davies
  • Stuart J. Russell

We analyze the logical form of the domain knowledge that grounds analogical inferences and generalizations from a single instance. The form of the assumptions which justify analogies is given schematically as the "determination rule", so called because it expresses the relation of one set of variables determining the values of another set. The determination relation is a logical generalization of the different types of dependency relations defined in database theory. Specifically, we define determination as a relation between schemata of first order logic that have two kinds of free variables: (1) object variables and (2) what we call "polar" variables, which hold the place of truth values. Determination rules facilitate sound rule inference and valid conclusions projected by analogy from single instances, without implying what the conclusion should be prior to an inspection of the instance. They also provide a way to specify what information is sufficiently relevant to decide a question, prior to knowledge of the answer to the question.
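The determination schema described in the abstract is commonly rendered as follows. This is a sketch following the usual presentation of determinations, paraphrased rather than quoted: the rule "P(x, y) determines Q(x, z)" can be written

```latex
P(x, y) \succ Q(x, z) \;\equiv\;
\forall y\, \forall z\, \big[\, \exists x\, \big(P(x, y) \wedge Q(x, z)\big)
\;\rightarrow\; \forall w\, \big(P(w, y) \rightarrow Q(w, z)\big) \,\big]
```

Read this way, observing a single instance x for which P(x, y) and Q(x, z) hold licenses projecting Q(w, z) to any other object w satisfying P(w, y), which is exactly the single-instance analogical inference the paper sets out to justify.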

AAAI Conference 1986 Conference Paper

A Quantitative Analysis of Analogy by Similarity

  • Stuart J. Russell

In the absence of specific relevance information, the traditional assumption in the study of analogy has been that the most similar analogue is most likely to provide the correct solutions; a justification for this assumption has been lacking, as has any relation between the similarity measure used and the probability of correctness of the analogy. We show how a statistical analysis can be performed to give the probability that a given source will provide a successful analogy, using only the assumption that there are some relevant features somewhere in the source and target descriptions. The predicted variation of the probability with source-target similarity corresponds closely to empirical analogy data obtained by Shepard for human and animal subjects for a wide variety of domains. The utility of analogy by similarity seems to rest on some very fundamental assumptions about the nature of our representations.