
    Wissensverarbeitung – Knowledge Representation & Reasoning I

    Alexander Felfernig and Gerald Steinbauer
    Institute for Software Technology
    Inffeldgasse 16b/2
    A-8010 Graz
    Austria


    Motivation


    Artificial Intelligence and Knowledge Representation & Reasoning

    • Artificial Intelligence (AI) can be described as:
      • the study of intelligent behavior achieved through computational means

    • Knowledge Representation & Reasoning can be viewed as:
      • the study of how to reason (compute) with knowledge in order to decide what to do

    [Pipeline: Knowledge Acquisition → Knowledge Representation → Reasoning → Decision on Actions]


    Clear Need for proper KR & R


    Agents

    • perceive the environment through sensors (→ percepts)

    • act upon the environment through actuators (→ actions)

    • examples: humans and animals, robots and software agents (softbots), temperature control, ABS, …


    Wumpus World

    • performance measure
      • gold +1000
      • death -1000
      • step -1
      • using the arrow -10

    • environment
      • squares adjacent to the wumpus are smelly
      • squares adjacent to a pit are breezy
      • glitter iff gold is in the same square
      • shooting kills the wumpus if the agent is facing it
      • grabbing picks up the gold if it is in the same square
      • releasing drops the gold in the same square

    • actuators
      • turn left, turn right, grab, release, shoot

    • sensing
      • breeze, glitter, smell, bump, hear


    Rational Agent

    • … does the “right thing”!

    • rational behavior is dependent on
      • performance measures (goals)
      • percept sequences
      • knowledge of the environment
      • possible actions

    • for each possible percept sequence, a rational agent should select an action that is expected to maximize its performance measure, given the evidence provided by the percept sequence and whatever built-in knowledge the agent has.


    Environment of Rational Agents I

    • accessible vs. inaccessible (fully observable vs. partially observable)
      • are the relevant aspects of the environment accessible to the sensors?

    • deterministic vs. stochastic
      • is the next state of the environment completely determined by the current state and the selected action? if only the actions of other agents are nondeterministic, the environment is called strategic.

    • episodic vs. sequential
      • can the quality of an action be evaluated within an episode (perception + action), or are future developments decisive for the evaluation of quality?


    Environment of Rational Agents II

    • static vs. dynamic
      • can the environment change while the agent is deliberating? if the environment does not change but the agent’s performance score changes as time passes, the environment is called semi-dynamic.

    • discrete vs. continuous
      • is the environment discrete (chess) or continuous (a robot moving in a room)?

    • single-agent vs. multi-agent
      • which entities have to be regarded as agents? there are competitive and cooperative scenarios.


    Examples of Environments

    task                      | observable | deterministic | episodic   | static  | discrete   | agents
    crossword puzzle          | fully      | deterministic | sequential | static  | discrete   | single
    chess with clock          | fully      | strategic     | sequential | semi    | discrete   | multi
    poker                     | partially  | stochastic    | sequential | static  | discrete   | multi
    backgammon                | fully      | stochastic    | sequential | static  | discrete   | multi
    taxi driving              | partially  | stochastic    | sequential | dynamic | continuous | multi
    medical diagnosis         | partially  | stochastic    | sequential | dynamic | continuous | single
    image analysis            | fully      | deterministic | episodic   | semi    | continuous | single
    part-picking robot        | partially  | stochastic    | episodic   | dynamic | continuous | single
    refinery controller       | partially  | stochastic    | sequential | dynamic | continuous | single
    interactive English tutor | partially  | stochastic    | sequential | dynamic | discrete   | multi


    What about the Wumpus World?


    Simple Reflex Agent

    • direct use of perceptions is often not possible due to the large space required to store them (e.g., video images)

    • input is therefore often interpreted before decisions are made


    Interpretive Reflex Agent

    • since the storage space required for raw perceptions is too large, perceptions are interpreted directly (INTERPRET-INPUT) before condition–action rules are matched

    function SIMPLE-REFLEX-AGENT(percept) returns an action
      persistent: rules, a set of condition–action rules

      state ← INTERPRET-INPUT(percept)
      rule ← RULE-MATCH(state, rules)
      action ← rule.ACTION
      return action
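
    To make the control loop concrete, here is a minimal Python sketch of a simple reflex agent for the Wumpus world; the interpretation function, the rule table and the action names are invented for illustration and are not part of the lecture material.

    # Minimal sketch of a simple reflex agent (rules and names are illustrative only).

    def interpret_input(percept):
        """Map a raw Wumpus-world percept tuple to a symbolic state."""
        stench, breeze, glitter, bump, scream = percept
        if glitter:
            return "at_gold"
        if breeze:
            return "feel_breeze"
        return "nothing_special"

    # condition-action rules: interpreted state -> action
    RULES = {
        "at_gold": "grab",
        "feel_breeze": "turn_left",
        "nothing_special": "forward",
    }

    def simple_reflex_agent(percept):
        state = interpret_input(percept)
        return RULES[state]

    print(simple_reflex_agent((True, False, True, False, False)))  # -> grab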


    Model-based Reflex Agent

    • in case the agent’s history, in addition to the actual percept, is required to decide on the next action, it must be represented in a suitable form (model)


    Model-based Reflex Agent

    function MODEL-BASED-REFLEX-AGENT(percept) returns an action
      persistent: state, the agent’s current conception of the world state
                  model, a description of how the next state depends on the current state and action
                  rules, a set of condition–action rules
                  action, the most recent action, initially none

      state ← UPDATE-STATE(state, action, percept, model)
      rule ← RULE-MATCH(state, rules)
      action ← rule.ACTION
      return action
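
    For contrast with the simple reflex agent above, a hedged Python sketch of the same loop with an internal state that is updated from the last action and percept; the state representation, the toy “model” and the rules are invented for illustration.

    # Sketch of a model-based reflex agent (state, model and rules are illustrative).

    class ModelBasedReflexAgent:
        def __init__(self, rules):
            self.state = {"has_gold": False}   # agent's current conception of the world
            self.rules = rules                 # condition-action rules
            self.last_action = None

        def update_state(self, percept):
            # toy "model": grabbing while glitter is perceived means we now hold the gold
            _, _, glitter, _, _ = percept
            if self.last_action == "grab" and glitter:
                self.state["has_gold"] = True
            self.state["glitter"] = glitter

        def __call__(self, percept):
            self.update_state(percept)
            for condition, action in self.rules:
                if condition(self.state):
                    self.last_action = action
                    return action

    rules = [
        (lambda s: s["has_gold"], "climb_out"),
        (lambda s: s.get("glitter", False), "grab"),
        (lambda s: True, "forward"),
    ]
    agent = ModelBasedReflexAgent(rules)
    print(agent((False, False, True, False, False)))  # -> grab
    print(agent((False, False, True, False, False)))  # -> climb_out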


    Goal-based Agent

    • often, percepts alone are insufficient to decide what to do

    • model-based, goal-based agents use an explicit representation of goals and consider them for the choice of actions


    Utility-based Agents

    • usually, there are several possible actions that can be taken in a given situation

    • in such cases, the utility of the next achieved state can be taken into consideration to arrive at a decision


    Knowledge Representation and Reasoning

    • often, our agents need knowledge before they can start to act intelligently

    • they then also need some reasoning component to exploit the knowledge they have

    • examples:
      • knowledge about the important concepts in a domain
      • knowledge about actions one can perform in a domain
      • knowledge about temporal relationships between events
      • knowledge about the world and how properties are related to actions


    Knowledge Bases

    • knowledge base (KB) = set of sentences in a formal language

    • declarative approach to building an agent
      • TELL it what it needs to know
      • then it can ASK itself what to do – answers should follow from the KB

    • agents can be viewed at the knowledge level, i.e. what they know, regardless of the implementation
    • or at the implementation level, i.e. data structures in the KB and algorithms to manipulate them

    • inference engine: domain-independent algorithms
    • knowledge base: domain-specific content


    Knowledge-Based Agents

    function KB-AGENT(percept) returns an action
      persistent: KB, a knowledge base
                  t, a counter, initially 0, indicating time

      TELL(KB, MAKE-PERCEPT-SENTENCE(percept, t))
      action ← ASK(KB, MAKE-ACTION-QUERY(t))
      TELL(KB, MAKE-ACTION-SENTENCE(action, t))
      t ← t + 1
      return action
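
    A possible Python rendering of this loop, assuming a toy knowledge base with tell/ask methods; the KB below is a stand-in, not a real inference engine, and the sentence encodings are invented for the example.

    # Sketch of the knowledge-based agent loop; the KB is a toy stand-in.

    class ToyKB:
        def __init__(self):
            self.sentences = []

        def tell(self, sentence):
            self.sentences.append(sentence)

        def ask(self, query):
            # placeholder for real inference: grab when the latest percept contains glitter
            percepts = [s for s in self.sentences if s[0] == "percept"]
            if percepts and "glitter" in percepts[-1][1]:
                return "grab"
            return "forward"

    class KBAgent:
        def __init__(self, kb):
            self.kb = kb
            self.t = 0

        def __call__(self, percept):
            self.kb.tell(("percept", percept, self.t))        # TELL the percept sentence
            action = self.kb.ask(("which-action?", self.t))   # ASK for an action
            self.kb.tell(("action", action, self.t))          # TELL the chosen action
            self.t += 1
            return action

    agent = KBAgent(ToyKB())
    print(agent(("stench", "glitter")))  # -> grab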


    What about first-order logic?


    Wumpus Models


    Logic-Based Reflex Agent

    • … only reacts to percepts

    • example of a percept statement (at time 5):
      percept(stench, breeze, glitter, none, none, 5)

    1. interpretation of the percept:
       ∀b,g,u,c,t. percept(stench, b, g, u, c, t) ⇒ Stench(t)
       ∀s,g,u,c,t. percept(s, breeze, g, u, c, t) ⇒ Breeze(t)
       ∀s,b,u,c,t. percept(s, b, glitter, u, c, t) ⇒ AtGold(t)

    2. choice of action:
       ∀t. [AtGold(t) ⇒ Action(grab, t)]

    • query: ∃x. Action(x, t)
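
    To make the two inference steps tangible, a small Python sketch that mimics rule set 1 (percept → fluents) and rule 2 (fluent → action) for a single time step; in the lecture these sentences would instead be handed to a theorem prover, so this is purely illustrative.

    # Illustrative two-step derivation: percept sentence -> fluents -> action.

    def interpret(percept, t):
        """Rule set 1: derive Stench(t), Breeze(t), AtGold(t) from the percept."""
        stench, breeze, glitter, bump, scream = percept
        fluents = set()
        if stench == "stench":
            fluents.add(("Stench", t))
        if breeze == "breeze":
            fluents.add(("Breeze", t))
        if glitter == "glitter":
            fluents.add(("AtGold", t))
        return fluents

    def choose_action(fluents, t):
        """Rule 2: AtGold(t) implies Action(grab, t)."""
        if ("AtGold", t) in fluents:
            return "grab"
        return None

    percept = ("stench", "breeze", "glitter", "none", "none")
    print(choose_action(interpret(percept, 5), 5))  # -> grab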


    Encyclopedic Knowledge

    • knowledge about the structure and properties of a domain

    • categories: allow us to organize objects

    • taxonomy: description using subclass relations

    • inheritance: objects inherit properties along these relations (see the sketch below)

    • physical composition: description of larger structures

    • instance relationship: assigns categories to an object

    • common sense: general knowledge shared by a large group of individuals
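
    A compact Python sketch of a taxonomy with inheritance; the mini-hierarchy (canary ⊑ bird ⊑ animal) and the properties are invented, and property lookup simply walks up the subclass relation until a value is found.

    # Toy taxonomy with inheritance along subclass relations (hierarchy is invented).

    SUBCLASS_OF = {            # taxonomy: category -> superclass
        "canary": "bird",
        "bird": "animal",
    }
    PROPERTIES = {             # properties attached directly to a category
        "animal": {"breathes": True},
        "bird": {"can_fly": True},
        "canary": {"color": "yellow"},
    }
    INSTANCE_OF = {"tweety": "canary"}   # instance relationship

    def lookup(obj, prop):
        """Inherit a property by walking up the taxonomy from the object's category."""
        category = INSTANCE_OF[obj]
        while category is not None:
            if prop in PROPERTIES.get(category, {}):
                return PROPERTIES[category][prop]
            category = SUBCLASS_OF.get(category)
        return None

    print(lookup("tweety", "can_fly"))   # True, inherited from bird
    print(lookup("tweety", "breathes"))  # True, inherited from animal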


    A General Category Hierarchy


    Knowledge about Dynamics

    • agents have an internal model
      • of all basic aspects of their environment,
      • of the executability and effects of their actions,
      • of further basic laws of the world, and
      • of their own goals

    • important aspect: how does the world change?


    Situation Calculus

    • a way to describe dynamic worlds within FOL

    • situations (states) are represented by terms

    • the world is in situation s and can only be altered through the execution of an action: do(a, s) is the situation that results if a is executed in s

    • actions have preconditions and are described by their effects

    • relations whose truth value changes over time are called fluents; they are represented by predicates that take a situation term as an additional argument. for example, at(x, y, s) means that in situation s the agent is at position (x, y); holding(y, s) means that in situation s the agent holds object y.

    • atemporal (eternal) predicates, e.g., portable(gold)


    Example Wumpus World

    let s0 be the initial situation and

    s1 = do(forward, s0)
    s2 = do(turn(right), s1)
    s3 = do(forward, s2)
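
    Situations are just nested terms; a quick Python illustration, assuming do(a, s) is encoded as a tuple, shows how s3 unfolds into the full action history.

    # Situations as nested terms: do(a, s) is encoded as the tuple ("do", a, s).

    s0 = "s0"

    def do(action, situation):
        return ("do", action, situation)

    s1 = do("forward", s0)
    s2 = do(("turn", "right"), s1)
    s3 = do("forward", s2)

    print(s3)
    # ('do', 'forward', ('do', ('turn', 'right'), ('do', 'forward', 's0')))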


    Description of Actions

    • preconditions: in order to pick something up, it must be both present and portable:
      ∀x,s [Poss(grab(x), s) ⇔ present(x, s) ∧ portable(x)]

    • in the wumpus world:
      portable(gold),  ∀s [atgold(s) ⇒ present(gold, s)]

    • positive effect axioms:
      ∀x,s [Poss(grab(x), s) ⇒ holding(x, do(grab(x), s))]

    • negative effect axioms:
      ∀x,s [Poss(release(x), s) ⇒ ¬holding(x, do(release(x), s))]


    Frame Problem

    • we had: holding(gold, s0)
      in the following situation: ¬holding(gold, do(release(gold), s0))?

    • we had: ¬holding(gold, s0)
      in the following situation: ¬holding(gold, do(turn(right), s0))?

    • we must also specify which fluents remain unchanged!

    • the frame problem: specification of the properties that do not change as a result of an action

    • successor state axioms must also be specified


    Successor State Axioms

    • an elegant way to solve the frame problem is to fully describe the successor situation:
      true after the action ⇔ [ the action made it true, or it was already true and the action did not falsify it ]

    • example for grab:
      ∀a,x,s [holding(x, do(a, s)) ⇔ {(a = grab(x) ∧ Poss(a, s)) ∨ (holding(x, s) ∧ a ≠ release(x))}]

    • successor state axioms can also be compiled automatically from the effect axioms alone (by applying explanation closure); here we assume that only the stated effects can occur
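
    One way to see the axiom at work is a small recursive evaluator over situation terms, reusing the tuple encoding of do(a, s) from the earlier sketch; the precondition check is simplified and the whole fragment is only an illustration of the axiom, not part of the lecture material.

    # Evaluate holding(x, s) by recursing over the successor state axiom for grab/release.

    def do(action, situation):
        return ("do", action, situation)

    def poss(action, situation):
        # simplified precondition: grabbing is always possible in this toy world
        return action[0] == "grab"

    def holding(x, s):
        if s == "s0":
            return False                       # nothing is held in the initial situation
        _, a, prev = s
        made_true = (a == ("grab", x)) and poss(a, prev)
        persisted = holding(x, prev) and a != ("release", x)
        return made_true or persisted

    s = do(("turn", "right"), do(("grab", "gold"), "s0"))
    print(holding("gold", s))                            # True: turning does not drop the gold
    print(holding("gold", do(("release", "gold"), s)))   # False: releasing falsifies it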


    Knowledge and Belief

    • so far the agent has no notion of knowledge

    • knowledge is related to the agent’s mental state

    • belief: the agent believes that a proposition holds until proven otherwise; also related to defaults
      believes(a, p)

    • knowledge: justified true belief, there are no doubts
      knows(a, p)
      knows_whether(a, p) ⇔ knows(a, p) ∨ knows(a, ¬p)

    • incomplete knowledge: there is not enough information to answer a query; closely related to sensing


    Tradeoffs

    • we want to build agents that react in a timely manner (tractability)

    • all concepts seen so far can be mapped to FOL

    • FOL is quite handy for knowledge representation …

    • … but it is a problem for reasoning

    • there is always a tradeoff between expressiveness and tractability

    • solution: limit the expressive power, e.g. Horn clauses

    • use different representations for different aspects (hybrid reasoning)


    Our Goals

    • the goals we will tackle in the course:
      • get to know example knowledge representations related to intelligent agents
      • use these representations for a concrete problem
      • be aware of the modelling issues of different representations
      • be able to reason automatically with different representations (manually and using tools)
      • understand the performance issues of different representations
      • be able to manage specific aspects of KR & R


    Content

    • next units
      • foundations of different representations
      • representing knowledge with different representations
      • automated reasoning with different representations
      • special issues with different representations

    • we focus on
      • First Order Logic (FOL)
      • Answer Set Programming (ASP)
      • Description Logic (DL)
      • Planning


    Tool Support

    • we will use state-of-the-art tools

    • please install them for the lecture and the practical work

    • FOL
      • Automated Theorem Prover – Prover9 (GUI)
      • https://www.cs.unm.edu/~mccune/mace4/

    • ASP
      • Potsdam Answer Set Solving Collection – clingo (command line)
      • http://potassco.sourceforge.net/

    • DL
      • ontology editor – Protégé (GUI)
      • http://protege.stanford.edu/



    Practical Work

    • support Gargamel in setting the table


    Logic-based Knowledge Representation

    • logic: properties of the world (domain) are represented in the form of propositions and sentences

    • syntax: which expressions (formulae) are admissible/syntactically correct sentences

    • semantics:
      – meaning of the sentences
      – true sentences are the basis for logical consequences; for example, from φ1, φ2, …, φn we can derive ψ
      – inference systems (calculi): allow „calculations“
      – natural representation of facts and rules, for example:
        from_Crete(Epimenides).
        ∀x (from_Crete(x) → lies(x)).
        logical conclusion: lies(Epimenides).


    Logic-based Knowledge Representation

    • reference language – first order (predicate) logic
      – the most important formalism of logic
      – expressive: all computable functions are specifiable (Church’s thesis)
      – simple, natural syntax and intuitive semantics

    • disadvantages
      – undecidable, but there are decidable fragments
      – high computational complexity → further restrictions of expressivity needed


    Logical KR

    • use of FOL

    • domain theory
      – a general description of the problem domain
      – e.g. groups in mathematics, geometry

    • axioms – generally valid sentences

    • facts – specific knowledge
      – literals, e.g., child(Bart, Homer), ¬likes(Moe, Bart)


    Logical KR

    • general rules
      – universally quantified sentences
      – ∀x1 ∀x2 … ∀xn [(φ1 ∧ … ∧ φn) → ψ]
      – φi: condition, premise
      – ψ: conclusion
      – similar to an if-then statement

    • universally quantified Horn clauses (see the sketch below)
      – all φi and ψ are atoms
      – a subset of FOL with reduced expressiveness
      – e.g., ∀x (human(x) → mortal(x))
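
    To show how such rules can be evaluated mechanically, a tiny forward-chaining sketch in Python over Horn rules with a single variable X; the rules are the human/mortal and from_Crete/lies examples from these slides, the fact human(socrates) is added only as an example, and the matching is deliberately naive.

    # Naive forward chaining for Horn rules with one variable X (illustrative only).

    facts = {("human", "socrates"), ("from_Crete", "epimenides")}

    # each rule: (premise patterns, conclusion pattern); "X" is the only variable
    rules = [
        ([("human", "X")], ("mortal", "X")),
        ([("from_Crete", "X")], ("lies", "X")),
    ]

    def substitute(pattern, binding):
        pred, arg = pattern
        return (pred, binding if arg == "X" else arg)

    def forward_chain(facts, rules):
        derived = set(facts)
        changed = True
        while changed:
            changed = False
            individuals = {arg for (_, arg) in derived}
            for premises, conclusion in rules:
                for ind in individuals:
                    if all(substitute(p, ind) in derived for p in premises):
                        new_fact = substitute(conclusion, ind)
                        if new_fact not in derived:
                            derived.add(new_fact)
                            changed = True
        return derived

    print(("mortal", "socrates") in forward_chain(facts, rules))   # True
    print(("lies", "epimenides") in forward_chain(facts, rules))   # True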


    Logical KR

    • universe of discourse
      – in FOL, in general, no particular domain is fixed
      – in KR, a particular domain D and interpretation are chosen
      – e.g., arithmetic: {0,1,2,3} (constants), {+,*} (functions)


    Methodology of Modeling

    • what is a good way to model in logic?

    • there is no formal approach

    • clarity of names and concepts
      – names of constants, predicates etc. should be meaningful, e.g. parent(x,y)

    • disclose relations
      – ∀x (senior(x) → discount(x))
      – fact: senior(Peter)
      – problem: gender? why a discount?
      – ∀x ∀y [(y = gender(x) ∧ age(x) > age_limit(y)) → discount(x)]


    Methodology of Modeling

    • validation
      – why does a sentence become valid?

    • generality
      – can a sentence be expressed more generally?

    • requirement of predicates
      – are new predicates necessary?
      – relation to other predicates
      – description of super-/subclasses


    Approach

    1. conceptualization
       – decide what to represent
       – abstract concept

    2. choice of vocabulary
       – translation of the abstract concept to FOL
       – the resulting vocabulary is an ontology of the problem domain

    3. coding of the domain theory
       – specify all relations and rules

    4. coding of the specific knowledge


    Example – Alpine Club

    • Tony, Mike and John belong to the Alpine Club. Every member of the Alpine Club who is not a skier is a mountain climber. Mountain climbers do not like rain, and anyone who does not like snow is not a skier. Mike dislikes whatever Tony likes, and likes whatever Tony dislikes. Tony likes rain and snow.

    • Prove that the given sentences logically entail that there is a member of the Alpine Club who is a mountain climber but not a skier.

    • Suppose we had been told that Mike likes whatever Tony dislikes, but we had not been told that Mike dislikes whatever Tony likes. Prove that the resulting set of sentences no longer logically entails that there is a member of the Alpine Club who is a mountain climber but not a skier.


    Thank You!