aamas08trust.ppt
![Page 1: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/1.jpg)
1
AAMAS’08 Tutorial 2: Computational Trust and
Reputation Models
Dr. Guillaume Muller Dr. Laurent Vercouter
7th International Conference on Autonomous Agents & Multi-Agent Systems
![Page 2: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/2.jpg)
2
MAIA – Intelligent Autonomous Machine
INRIA – LORIA – Laboratory of IT Research and its Applications
Dr. Guillaume Muller
Dr. Laurent Vercouter
G2I – Division for Industrial Engineering and Computer Sciences
EMSE – École des Mines de Saint-Étienne
![Page 3: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/3.jpg)
3
Presentation outline
• Motivation
• Approaches to control the interaction
• Some definitions
• The computational perspective
• Computational trust and reputation models
– OpenPGP
– Marsh
– eBay/OnSale
– Sporas & Histos
– TrustNet
– Fuzzy Models
– LIAR
– ReGret
• ART
– The testbed
– An example
![Page 4: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/4.jpg)
4
Motivation
![Page 5: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/5.jpg)
5
What we are talking about...
Mr. Yellow
![Page 6: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/6.jpg)
6
What we are talking about...
Mr. Yellow
Trust based on... direct experiences
Two years ago...
![Page 7: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/7.jpg)
7
What we are talking about...
Mr. Yellow
Trust based on... third party information
![Page 8: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/8.jpg)
8
What we are talking about...
Mr. Yellow
Trust based on... third party information
![Page 9: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/9.jpg)
9
What we are talking about...
Mr. Yellow
Trust based on... reputation
![Page 10: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/10.jpg)
10
What we are talking about...
Mr. Yellow
![Page 11: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/11.jpg)
11
What we are talking about...
?
![Page 12: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/12.jpg)
12
Advantages of trust and reputation mechanisms
• Each agent is a norm enforcer and is also under surveillance by the others. No central authority is needed.
• Their nature allows them to reach where laws and central authorities cannot.
• Punishment is usually based on ostracism.
![Page 13: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/13.jpg)
13
Problems of trust and reputation mechanisms
• Bootstrap problem.
• Exclusion must be a punishment for the outsider.
• Not all kinds of environments are suitable for these mechanisms.
![Page 14: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/14.jpg)
14
Approaches to control the interaction
![Page 15: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/15.jpg)
15
Different approaches to control the interaction
Security approach
![Page 16: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/16.jpg)
16
• Security approach
Different approaches to control the interaction
Agent identity validation. Integrity, authenticity of messages...
I’m Alice
![Page 17: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/17.jpg)
17
Different approaches to control the interaction
Security approach
Institutional approach
![Page 18: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/18.jpg)
18
• Institutional approach
Different approaches to control the interaction
![Page 19: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/19.jpg)
19
Different approaches to control the interaction
Security approach
Institutional approach
Social approach
![Page 20: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/20.jpg)
20
Example: P2P systems
![Page 21: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/21.jpg)
21
Example: P2P systems
![Page 22: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/22.jpg)
22
Example: P2P systems
![Page 23: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/23.jpg)
23
Different approaches to control the interaction
Security approach
Institutional approach
Social approach
Trust and reputation mechanisms are at this level.
They are complementary and cover different aspects of interaction.
![Page 24: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/24.jpg)
24
Definitions
![Page 25: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/25.jpg)
25
Trust
Some statements we like:
“Trust begins where knowledge ends: trust provides a basis for dealing with uncertain, complex, and threatening images of the future.” [Luhmann, 1979]
“Trust is the outcome of observations leading to the belief that the actions of another may be relied upon, without explicit guarantee, to achieve a goal in a risky situation.” [Elofson, 2001]
“There are no obvious units in which trust can be measured.” [Dasgupta, 2000]
![Page 26: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/26.jpg)
26
Trust
There are many ways of considering Trust.
• Trust as Encapsulated Interest [Russell Hardin, 2002]
“I trust you because I think it is in your interest to take my interests in the relevant matter seriously. And this is because you value the continuation of our relationship.
You encapsulate my interests in your own interests.”
![Page 27: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/27.jpg)
27
Trust
There are many ways of considering Trust.
• Instant trust
“Trust is only a matter of the characteristics of the trusted, characteristics that are not grounded in the relationship between the truster and the trusted.”
Example:
Rug merchant in a bazaar
![Page 28: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/28.jpg)
28
Trust
There are many ways of considering Trust.
• Trust as Moral
Trust is expected, and distrust or lack of trust is seen as a moral fault.
“One might argue that to act as though I do trust someone who is not evidently (or not yet) trustworthy is to acknowledge the person’s humanity and possibilities or to encourage the person’s trustworthiness.” [Russell Hardin, 2002]
![Page 29: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/29.jpg)
29
Trust
There are many ways of considering Trust.
• Trust as Noncognitive
Trust based on affects, emotions...
“To say that we trust another in a noncognitive way is to say that we are disposed to be trustful of them independently of our beliefs or expectations about their trustworthiness.” [Becker, 1996]
• Trust as Ungrounded Faith
Example:
• an infant towards her parents
• a follower towards his leader
Notice here there is a power relation between the truster and the trusted.
![Page 30: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/30.jpg)
30
Trust
There are many ways of considering Trust. And therefore, many definitions of Trust.
“Conceptual morass” [Barber, 83]
“Confusing pot-pourri” [Shapiro, 87]
Just leave this to philosophers, psychologists and sociologists...
...but let’s have an eye on it.
![Page 31: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/31.jpg)
31
Reputation
Some definitions:
• “The estimation of the consistency over time of an attribute or entity” [Herbig et al.]
• “Information that individuals receive about the behaviour of their partners from third parties and that they use to decide how to behave themselves” [Buskens, Coleman...]
• “The expectation of future opportunities arising from cooperation” [Axelrod, Parkhe]
• “The opinion others have of us”
![Page 32: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/32.jpg)
32
Computational perspective
![Page 33: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/33.jpg)
33
Computational trust
Castelfranchi & Falcone make a clear distinction between:
– Trust as an evaluative belief
• A truster agent believes that the trustee is trustworthy
e.g.: I believe that my doctor is a good surgeon
– Trust as a mental attitude
• A truster agent relies on a trustee for a given behaviour
e.g.: I accept that my doctor performs surgery on me
![Page 34: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/34.jpg)
34
Trust as a belief
“A truster i trusts a trustee j to do an action α in order to achieve a goal φ” [Castelfranchi & Falcone]
– Agent i has the goal φ
– Internal attribution of trust
• i believes that j intends to do α
– External attribution of trust
• i believes that j is capable of doing α
• i believes that j has the power to achieve φ by doing α
The goal component can be generalized to consider norm-obedience. [Demolombe & Lorini]
![Page 35: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/35.jpg)
35
Occurrent trust
Occurrent trust happens when a truster believes that the trustee is going to act here and now [Herzig et al, 08].
OccTrust(i, j, α, φ) ≡ Goal(i, φ) ∧ Believes(i, OccCap(j, α)) ∧ Believes(i, OccPower(j, α, φ)) ∧ Believes(i, OccIntends(j, α))
![Page 36: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/36.jpg)
36
Dispositional trust
Dispositional trust happens when a truster believes that the trustee is
going to act whenever some conditions are satisfied [Herzig et al, 08].
DispTrust(i, j, α, φ) ≡ PotGoal(i, φ) ∧ Believes(i, CondCap(j, α)) ∧ Believes(i, CondPower(j, α, φ)) ∧ Believes(i, CondIntends(j, α))
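Both formulas share one structure: trust is a conjunction of a goal and three beliefs about the trustee. A minimal sketch in Python, assuming a simple tuple-based belief store (`BeliefBase`, `occ_trust` and the tuple encoding are our own illustration, not part of the cited models):

```python
from dataclasses import dataclass, field

@dataclass
class BeliefBase:
    """A truster's mental state, stored as bare tuples (illustrative)."""
    beliefs: set = field(default_factory=set)
    goals: set = field(default_factory=set)

def occ_trust(bb, j, action, goal):
    # OccTrust: the truster's belief base bb must contain the goal and the
    # three beliefs about j's capacity, power and intention, as above.
    return (goal in bb.goals
            and ("OccCap", j, action) in bb.beliefs
            and ("OccPower", j, action, goal) in bb.beliefs
            and ("OccIntends", j, action) in bb.beliefs)
```

Dropping any one conjunct (for instance the intention belief) makes `occ_trust` false, mirroring the logical definition.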
![Page 37: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/37.jpg)
37
Trust and delegation
Trust (as a belief) can lead to delegation, when the truster i
relies on the trustee j.
• Weak delegation
– j is not aware of the fact that i is exploiting his action
• Strong delegation
– i elicits or induces j’s expected behaviour in order to exploit it
There can be trust without delegation (insufficient trust, prohibitions).
There can be delegation without trust (no information, obligations).
![Page 38: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/38.jpg)
38
Computational reputation
Reputation adds a collective dimension to the truster.
“Reputation is an objective social property that emerges
from a propagating cognitive representation” [Conte &
Paolucci]. This definition includes both:
– a process of transmitting a target’s image
– a cognitive representation resulting from this propagation
![Page 39: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/39.jpg)
39
The Functional Ontology of Reputation [Casare & Sichman, 05]
• The Functional Ontology of Reputation (FORe) aims at
defining standard concepts related to reputation
• FORe includes:
– Reputation processes
– Reputation types and natures
– Agent roles
– Common knowledge (information sources, entities, time)
• Facilitate the interoperability of heterogeneous
reputation models
![Page 40: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/40.jpg)
40
Reputation processes
• Reputation transmission / reception
– An agent sends/receives reputation information to/from another one
• Reputation evaluation– Production of a reputation measurement that can contain several
valued attributes (content evaluation) or an unexplained estimation (esteem level). Values can be quantitative or qualitative.
• Reputation maintenance– The reputation alterations over time that can take into account the
incremental impact of agents’ behavior (aggregation) or the history of behaviors (historical process)
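The two maintenance styles can be sketched side by side (the fold weight and the recency weighting below are illustrative choices of ours, not part of FORe):

```python
def aggregate_update(rep, rating, weight=0.1):
    # Aggregation: each new rating has an incremental impact on the
    # current value (an exponentially-weighted fold).
    return (1 - weight) * rep + weight * rating

def historical_value(history):
    # Historical process: recompute from the stored history of ratings,
    # here with a simple recency-weighted mean (weights 1, 2, ..., n).
    weights = range(1, len(history) + 1)
    return sum(w * r for w, r in zip(weights, history)) / sum(weights)
```

Aggregation keeps only one number per target; the historical process keeps the full behaviour history and can reweight it at will.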
![Page 41: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/41.jpg)
41
Agent roles [Conte & Paolucci, 02]
(Diagram: the target and the participants are watched by an observer, whose observations feed an evaluator; the resulting evaluations reach beneficiaries, and reputations are spread by propagators.)
![Page 42: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/42.jpg)
42
Reputation types [Mui, 02]
• Primary reputation
– Direct reputation
– Observed reputation
• Secondary reputation
– Collective reputation
– Propagated reputation
– Stereotyped reputation
![Page 43: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/43.jpg)
43
What is a good trust model?
A good trust model should be [Fullam et al, 05]:
• Accurate: provides good predictions
• Adaptive: evolves according to the behaviour of others
• Quickly converging: quickly computes accurate values
• Multi-dimensional: considers different agent characteristics
• Efficient: computes in reasonable time and at reasonable cost
![Page 44: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/44.jpg)
44
Why use a trust model in a MAS?
• Trust models allow:
– Identifying and isolating untrustworthy agents
Bob
I don’t trust Bob
I don’t trust Bob
I don’t trust Bob
![Page 45: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/45.jpg)
45
Why use a trust model in a MAS?
• Trust models allow:
– Identifying and isolating untrustworthy agents
– Evaluating an interaction’s utility
Bob: “I can sell you the information you require”
“I don’t trust Bob”
“No, thank you!”
![Page 46: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/46.jpg)
46
Why use a trust model in a MAS?
• Trust models allow:
– Identifying and isolating untrustworthy agents
– Evaluating an interaction’s utility
– Deciding whether and with whom to interact
Bob: “I can sell you the information you require”
Charles: “I can sell you the information you require”
“I trust Bob more than Charles”
“Ok, Bob. It’s a deal”
“No, thank you!”
![Page 47: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/47.jpg)
47
Presentation outline
• Motivation
• Approaches to control the interaction
• Some definitions
• The computational perspective
• Computational trust and reputation models
– OpenPGP
– Marsh
– eBay/OnSale
– Sporas & Histos
– TrustNet
– Fuzzy Models
– LIAR
– ReGret
• ART
– The testbed
– Example
![Page 48: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/48.jpg)
48
Computational trust and reputation models
• OpenPGP
• Marsh
• eBay/OnSale
• Sporas & Histos
• TrustNet
• Fuzzy Models
• LIAR
• ReGret
![Page 49: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/49.jpg)
49
OpenPGP model [Abdul-Rahman, 97]
• Context: replace the centralized Trusted Authorities in Public Key management
(Diagram: a central Authority signs a certification binding Bob’s ID to Bob’s public key; Alice trusts the Authority, so she can check that the key signing a message really is Bob’s.)
![Page 50: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/50.jpg)
50
OpenPGP model [Abdul-Rahman, 97]
• Context: replace the centralized Trusted Authorities in Public Key management
(Diagram: in the ‘Web of Trust’, users certify each other’s keys and messages directly, replacing the central Authority.)
![Page 51: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/51.jpg)
51
OpenPGP model [Abdul-Rahman, 97]
• 2 kinds of trust:
• Tc, trust in the certificate: {undefined, marginal, complete}
• Ti, trust as an introducer: {untrustworthy, marginal, full, don’t know}
• OpenPGP computes reputations based on transitivity along all existing paths
• ≥ X complete OR ≥ Y marginal signatures ⇒ complete (c)
• > 0 marginal signatures ⇒ marginal (m)
• Humans set all Ti & some Tc and take the decisions
• Parameters are:
• X: min. number of complete signatures
• Y: min. number of marginal signatures
• the length of the trust paths
(Diagram: a ‘Web of Trust’ in which the certification links are labelled with Tc and Ti values.)
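The validity rule above can be sketched as follows (function and variable names are ours; real OpenPGP implementations also bound the length of the trust paths, which this sketch ignores):

```python
def key_validity(signers, introducer_trust, X=1, Y=2):
    """Validity of a key from the trust levels of its signers.
    signers: ids whose signatures appear on the key.
    introducer_trust: dict id -> Ti level set by the human user."""
    complete = sum(1 for s in signers if introducer_trust.get(s) == "full")
    marginal = sum(1 for s in signers if introducer_trust.get(s) == "marginal")
    if complete >= X or marginal >= Y:
        return "complete"     # >= X complete OR >= Y marginal signatures
    if marginal > 0:
        return "marginal"     # > 0 marginal signatures
    return "undefined"
```

With the default parameters, one fully-trusted introducer or two marginally-trusted ones suffice to make a key valid.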
![Page 52: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/52.jpg)
52
Computational trust and reputation models
• OpenPGP
• Marsh
• eBay/OnSale
• Sporas & Histos
• TrustNet
• Fuzzy Models
• LIAR
• ReGret
![Page 53: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/53.jpg)
53
Marsh’s model [Marsh, 94]
• Context: collaborative work
• Addresses only direct interactions; does not consider gossip
• Two kinds of trust:
• General Trust: Tx(y), trust of x in y in general
• Situational Trust: Tx(y, α), trust contextualized by a situation α
• Trust is modelled as a probability, in fact a value in [0, 1)
• Computation:
• Tx(y) = average of Tx(y, α) over all possible contexts α
• Tx(y, α) = Ux(α) × Ix(α) × Ťx(y)
• Decision to trust:
• Tx(y, α) > CooperationThresholdx(α) ⇒ WillCooperate(x, y, α)
• CooperationThreshold depends on the risks, perceived competence, importance
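A minimal sketch of the situational-trust computation and the cooperation decision (function names are ours; in Marsh's model the threshold itself is derived from risk, perceived competence and importance):

```python
def situational_trust(utility, importance, general_trust):
    # Tx(y, a) = Ux(a) * Ix(a) * T^x(y): the utility and importance of the
    # situation a scale a general trust estimate in partner y.
    return utility * importance * general_trust

def will_cooperate(sit_trust, cooperation_threshold):
    # Cooperate only when situational trust exceeds the threshold.
    return sit_trust > cooperation_threshold
```

For instance, a situation of utility 0.8 and importance 0.9 with general trust 0.5 gives situational trust 0.36, enough to cooperate against a threshold of 0.3.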
![Page 54: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/54.jpg)
54
Computational trust and reputation models
• OpenPGP
• Marsh
• eBay/OnSale
• Sporas & Histos
• TrustNet
• Fuzzy Models
• LIAR
• ReGret
![Page 55: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/55.jpg)
55
eBay model
Context: e-commerce
• Model oriented to support trust between buyer and seller
• Buyer has no physical access to the product of interest
• Seller or buyer may decide not to commit the transaction
• Centralized: all information remains on eBay Servers
![Page 56: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/56.jpg)
56
eBay model
• Buyers and sellers evaluate each other after transactions
• The evaluation is not mandatory and will never be removed
• evaluation = a comment + a rating
• comment = a line of text
• rating = numeric evaluation in {-1,0,1}
• Each eBay member has a “reputation” (feedback score) that is the summation of the numerical evaluations.
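The feedback score is a plain sum; eBay also displays a percentage of positive feedback, sketched here under the assumption that it is taken over the non-neutral ratings only:

```python
def feedback_score(ratings):
    # The feedback score: the sum of ratings, each in {-1, 0, +1}.
    if any(r not in (-1, 0, 1) for r in ratings):
        raise ValueError("ratings must be -1, 0 or +1")
    return sum(ratings)

def positive_percentage(ratings):
    # Share of positives among non-neutral ratings (assumed formula).
    pos = sum(1 for r in ratings if r == 1)
    neg = sum(1 for r in ratings if r == -1)
    return 100.0 * pos / (pos + neg) if pos + neg else 100.0
```

Note how the single score conflates very different histories: three positives and one negative give the same score as two positives.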
![Page 57: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/57.jpg)
57
eBay model
![Page 58: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/58.jpg)
58
eBay model
![Page 59: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/59.jpg)
59
• Specifically oriented to scenarios with the following characteristics:
• A lot of users (we are talking about millions)
• Few chances of repeating interaction with the same partner
• Human oriented
• Considers reputation as a global property and uses a single value that is not dependent on the context.
• A great number of opinions that “dilute” false or biased information is the only way to increase the reliability of the reputation value.
eBay model
![Page 60: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/60.jpg)
60
+ Advantages:
+ Used everyday
+ In a real life application
+ Very simple
– Limits: [Dellarocas, 00 & 01] [Steiner, 03]
– Fear of reciprocity
– What is the semantics of a high reputation?
– Problem of electronic commerce: change of identity
– Much of the effectiveness comes from the textual comments
– Few public papers; the system evolves frequently
eBay model
![Page 61: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/61.jpg)
61
• OnSale specialized in computer-related goods
• Newcomers:
• OnSale: no reputation
• eBay: zero feedback points (lowest reputation)
• Bidders:
• OnSale: not rated at all, register with credit card
• eBay: are rated, used internally, bought PayPal
OnSale model
![Page 62: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/62.jpg)
62
Computational trust and reputation models
• OpenPGP
• Marsh
• eBay/OnSale
• Sporas & Histos
• TrustNet
• Fuzzy Models
• LIAR
• ReGret
![Page 63: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/63.jpg)
63
SPORAS & HISTOS [Zacharia et al., 99]
• Context: e-commerce, similar to eBay
• Reputations are faceted: an individual may enjoy a very high reputation in one domain, while she has a low reputation in another.
• Two models are proposed:
• Sporas: works even with few ratings
• Histos: assumes an abundance of ratings
• Deterrent for agents to change their IDs:
• Reputations can decrease, but will never fall below a newcomer’s value
• A low-reputed agent can improve its status at the same rate as a beginner
• Ratings given by users with a high reputation are weighted more
• Measure against end-of-game strategies:
• Reputation values are not allowed to increase ad infinitum
![Page 64: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/64.jpg)
64
SPORAS
1. Reputations are in [0, 3000]. Newcomers start at 0. Ratings are in [0.1, 1]
2. Reputations never fall below 0, even in the case of very bad behaviour
3. After each rating, the reputation is updated
4. Two users may rate each other only once; with more than one interaction, only the most recent rating is considered
5. Higher reputations are updated more moderately
(The update formula combines the current rating, the memory of the system, the reputation of the rater, the normalized previous reputation, and a damping factor.)
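Those ingredients combine into the SPORAS update step. The sketch below is one rendering of it, with illustrative parameter values (`THETA`, `SIGMA`) and a sigmoid damping function; it is a sketch of the idea, not the exact published formula:

```python
import math

D = 3000.0     # maximum reputation value (range is [0, 3000])
THETA = 10.0   # memory of the system: effective number of ratings (illustrative)
SIGMA = 800.0  # spread of the damping function (illustrative)

def damping(rep):
    # Damping factor: tends to 0 as rep approaches D, so higher
    # reputations are updated more moderately (property 5 above).
    return 1.0 - 1.0 / (1.0 + math.exp(-(rep - D) / SIGMA))

def sporas_update(rep, rater_rep, rating):
    # The change is driven by the gap between the received rating
    # (in [0.1, 1]) and the normalized previous reputation, scaled by
    # the rater's normalized reputation and the damping factor.
    expected = rep / D                      # normalized previous reputation
    step = (1.0 / THETA) * damping(rep) * (rater_rep / D) * (rating - expected)
    return max(rep + D * step, 0.0)         # never drops below 0 (property 2)
```

A newcomer rated 1.0 by a mid-reputation rater jumps well above 0, while an agent near the 3000 ceiling moves only slightly for the same rating.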
![Page 65: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/65.jpg)
65
Histos
• Aim: compute a global ‘personalized reputation’ (PRp) value for each member
• ‘Personalized reputation’ is computed by transitivity:
1. Find all directed paths from A to AL with length N
2. Keep only the most recent ones
3. Start a backward recursion:
1. If path length = 1, PRp = rating
2. If path length > 1, PRp = f(raters’ PRp, rating)
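For a single path, the backward recursion can be sketched as follows. Histos aggregates over all retained paths; here the combination function f is a simple product that discounts each hop's rating by the rater's own PRp, which is our own illustrative choice:

```python
def personalized_reputation(rating, path):
    """Backward recursion along one rating path (sketch).
    rating[(a, b)]: a's most recent rating of b, in [0, 1].
    path: [A, ..., AL], the chain of raters ending at the target."""
    if len(path) == 2:                   # path length 1: PRp is the rating
        a, b = path
        return rating[(a, b)]
    # longer path: weight the last rating by the rater's own PRp
    rater_prp = personalized_reputation(rating, path[:-1])
    return rater_prp * rating[(path[-2], path[-1])]
```

So if A rates B at 0.9 and B rates C at 0.5, A's personalized view of C along that path is 0.45.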
![Page 66: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/66.jpg)
66
Computational trust and reputation models
• OpenPGP
• Marsh
• eBay/OnSale
• Sporas & Histos
• TrustNet
• Fuzzy Models
• LIAR
• ReGret
![Page 67: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/67.jpg)
67
Trust Net [Schillo & Funk, 99]
Model designed to evaluate the agents’ honesty
• Completely decentralized
• Applied in a game-theory context: the Iterated Prisoner’s Dilemma (IPD)

Payoffs (row player, column player):

|        | Coop. | Defect |
|--------|-------|--------|
| Coop.  | 3, 3  | 0, 5   |
| Defect | 5, 0  | 1, 1   |

• Each agent announces its strategy and chooses an opponent according to its announced strategy
• If an agent does not follow the strategy it announced, its opponent decreases its reputation
• The trust value of agent A towards agent B is T(A,B) = number of honest rounds / number of total rounds
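The trust value is a simple frequency; only the treatment of a never-observed agent is left open (the neutral prior below is our own choice):

```python
def trust_value(honest_rounds, total_rounds):
    # T(A, B) = number of honest rounds / number of total rounds
    if total_rounds == 0:
        return 0.5   # neutral prior for a never-observed agent (our choice)
    return honest_rounds / total_rounds
```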
![Page 68: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/68.jpg)
68
• Agents can communicate their trust values to speed up the convergence of trust models
• An agent can build a TrustNet of trust values transmitted by witnesses
• The final trust value of an agent towards another aggregates direct experiences and testimonies with a probabilistic function of the lying behaviour of witnesses, which reduces the correlated evidence problem.
(Diagram: a TrustNet whose edges carry example trust values; evaluations are binary and behaviours are announced.)
![Page 69: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/69.jpg)
69
Computational trust and reputation models
• OpenPGP
• Marsh
• eBay/OnSale
• Sporas & Histos
• TrustNet
• Fuzzy Models
• LIAR
• ReGret
![Page 70: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/70.jpg)
70
Fuzzy models [Rehák, 05]
• Trust modelled as a type-2 fuzzy set
• Iterative building of the fuzzy set:
• Estimate the subjective utility of the cooperation
• Compute the rating of an agent based on this utility:
• Flat distribution
• Proportional distribution: (trust of A × utility) / (trust of the average agent)
• Fuzzy set = membership function on sets of ratings
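The proportional distribution can be sketched directly from the formula above (the function name is ours):

```python
def proportional_rating(trust_in_agent, utility, avg_trust):
    # Proportional distribution: (trust of A x utility) / (trust of the
    # average agent), so trusted agents are credited a larger share of
    # the estimated cooperation utility.
    return trust_in_agent * utility / avg_trust
```

An agent trusted twice as much as the average receives twice the utility as its rating.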
![Page 71: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/71.jpg)
71
Fuzzy models [Rehák, 05]
Trust Decision:
![Page 72: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/72.jpg)
72
Computational trust and reputation models
• OpenPGP
• Marsh
• eBay/OnSale
• Sporas & Histos
• TrustNet
• Fuzzy Models
• LIAR
• ReGret
![Page 73: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/73.jpg)
73
The LIAR model [Muller & Vercouter, 08]
Model designed for the control of communications in a P2P network
• Completely decentralized
• Applied to a peer-to-peer protocol for query routing
• The global functioning of a P2P network relies on the expected behaviour of its nodes (or agents)
• Agents’ behaviour must be regulated by a social control [Castelfranchi, 00]
![Page 74: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/74.jpg)
74
The LIAR model [Muller & Vercouter, 07]
![Page 75: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/75.jpg)
75
The LIAR model [Muller & Vercouter, 07]
![Page 76: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/76.jpg)
76
The LIAR model [Muller & Vercouter, 07]
![Page 77: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/77.jpg)
77
The LIAR model [Muller & Vercouter, 07]
![Page 78: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/78.jpg)
78
The LIAR model [Muller & Vercouter, 07]
![Page 79: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/79.jpg)
79
The LIAR model [Muller & Vercouter, 07]
![Page 80: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/80.jpg)
80
The LIAR model [Muller & Vercouter, 07]
![Page 81: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/81.jpg)
81
LIAR: Social control of agent communications
(Diagram: the social control loop. Interactions are represented as social commitments; a definition of acceptability (social norms) is applied to them; sanctions update reputations, which in turn produce trust intentions.)
![Page 82: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/82.jpg)
82
LIAR: Social commitments and norms
Social Commitment example:
(Diagram: a social commitment relates a debtor (the sender) to a creditor (the receiver) and carries a content, a state, an utterance time, and the observer that recorded it.)
![Page 83: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/83.jpg)
83
LIAR: Social commitments and norms
Social Norm example:
(Diagram: a social norm, here a prohibition, specifies its targets, evaluators and punishers, a content, and a condition.)
![Page 84: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/84.jpg)
84
The LIAR agent architecture
(Diagram: the LIAR agent architecture. Observation of interactions produces social commitments; evaluation against social norms generates social policies; punishment updates reputations; received recommendations pass through recommendation filtering; reasoning over reputations, context and mental states yields a trust intention, then a decision and possibly a sanction; an initialisation step bootstraps the model.)
![Page 85: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/85.jpg)
85
LIAR: partial observation
(Diagram: agent A sends inform(p) to agent B; agents C and D observe the exchange.)
![Page 86: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/86.jpg)
86
LIAR: partial observation
sc(A, B, 8pm, [8pm-9pm], active, p)
(Each of A, B, C and D records this social commitment in its own image of CSAB.)
![Page 87: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/87.jpg)
87
LIAR: partial observation
cancel(p)
(Each agent updates the commitment’s state in its image of CSAB accordingly.)
![Page 88: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/88.jpg)
88
Detection of violations
Evaluator / Propagator interaction:
– from observations (ob), the evaluator builds the social commitment
– it generates and updates the corresponding social policy
– it evaluates the policy
– a Justification Protocol iterates with the propagator until a proof is received
![Page 89: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/89.jpg)
89
Reputation types in LIAR
Rptarget(facet, dimension, time) ∈ [-1, +1] ∪ {unknown}
7 different roles:
target, participant, observer, evaluator, punisher, beneficiary, propagator
5 reputation types, based on:
direct interaction, indirect interaction, recommendation about observation, recommendation about evaluation, recommendation about reputation
![Page 90: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/90.jpg)
90
Reputation computation
Direct Interaction based Reputation
– separate the social policies according to their state
– associate a penalty to each set
– reputation = weighted average of the penalties
Reputation Recommendation based Reputation
– based on trusted recommendations
– reputation = weighted average of the received values, weighted by the reputation of the punisher
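Both computations reduce to weighted averages. A sketch under assumed state names and penalty values (all identifiers are ours; `None` stands for the ‘unknown’ reputation value):

```python
def direct_interaction_reputation(policies_by_state, penalty):
    # Group social policies by state, give each state a penalty/reward,
    # and take the average weighted by how many policies are in each set.
    total = sum(len(ps) for ps in policies_by_state.values())
    if total == 0:
        return None   # no observed policies: reputation is 'unknown'
    return sum(penalty[s] * len(ps)
               for s, ps in policies_by_state.items()) / total

def recommendation_based_reputation(recommendations):
    # recommendations: (value, reputation_of_sender) pairs; average the
    # received values, weighted by the sender's reputation.
    weight = sum(rep for _, rep in recommendations)
    if weight == 0:
        return None
    return sum(val * rep for val, rep in recommendations) / weight
```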
![Page 91: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/91.jpg)
91
LIAR decision process
(Decision diagram: the reputation types are checked in order: DIbRp, IIbRp, ObsRcbRp, EvRcbRp, RpRcbRp. A value above its trust threshold θ_trust yields Trust_int = trust; a value below its distrust threshold θ_distrust yields Trust_int = distrust. A type marked (*), i.e. unknown, not relevant or not discriminant, is skipped; if every type is skipped, the General Disposition to Trust (GDtT) applies.)
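A minimal sketch of this decision process, assuming the five types are checked in the order shown and using None to stand for "unknown / not relevant / not discriminant":

```python
ORDER = ("DIbRp", "IIbRp", "ObsRcbRp", "EvRcbRp", "RpRcbRp")

def liar_trust_decision(reps, trust_th, distrust_th, gdtt=True):
    """Walk the reputation types in priority order; the first discriminant
    value decides. Non-discriminant types (None here) are skipped, and the
    General Disposition to Trust (gdtt) applies if nothing decides."""
    for t in ORDER:
        value = reps.get(t)
        if value is None:
            continue  # unknown / not relevant / not discriminant
        if value > trust_th[t]:
            return True   # Trust_int = trust
        if value < distrust_th[t]:
            return False  # Trust_int = distrust
    return gdtt

trust_th = dict.fromkeys(ORDER, 0.5)
distrust_th = dict.fromkeys(ORDER, -0.5)
decision = liar_trust_decision({"IIbRp": -0.8}, trust_th, distrust_th)  # distrust
```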
![Page 92: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/92.jpg)
92
LIAR: conclusion
• LIAR is adapted to P2P infrastructures
– partial observations / incomplete information
– scalable
– applied in a Gnutella-like network: malicious nodes are excluded
• LIAR is fine-grained
– different types of reputation maintained separately
– multi-facet and multi-dimension
• LIAR covers the whole loop of social control
– from the evaluation of a single behaviour to the decision to act in trust
![Page 93: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/93.jpg)
93
Computational trust and reputation models
• OpenPGP
• Marsh
• eBay/OnSale
• Sporas & Histos
• TrustNet
• Fuzzy Models
• LIAR
• ReGreT
![Page 94: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/94.jpg)
94
ReGreT
What is the ReGreT system?
It is a modular trust and reputation system oriented to complex e-commerce environments where social relations among individuals play an important role.
![Page 95: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/95.jpg)
95
Reputationmodel
Witnessreputation
Systemreputation
Neigh-bourhoodreputation
ODB
DirectTrust
Credibility
IDB SDB
Trust
The ReGreTsystem
![Page 96: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/96.jpg)
96
Reputationmodel
Witnessreputation
Systemreputation
Neigh-bourhoodreputation
ODB
DirectTrust
Credibility
IDB SDB
Trust
The ReGreTsystem
![Page 97: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/97.jpg)
97
Outcomes and Impressions

Outcome:
The initial contract
– to take a particular course of actions
– to establish the terms and conditions of a transaction
AND
the actual result of the contract.

Example:
Contract:    Price =_c 2000, Quality =_c A, Quantity =_c 300
Fulfillment: Price =_f 2000, Quality =_f C, Quantity =_f 295
![Page 98: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/98.jpg)
98
Outcomes and Impressions

Outcome:
Contract:    Price =_c 2000, Quality =_c A, Quantity =_c 300
Fulfillment: Price =_f 2000, Quality =_f C, Quantity =_f 295

The price attributes ground an impression about offers_good_prices; the quantity attributes ground one about maintains_agreed_quantities.
![Page 99: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/99.jpg)
99
Outcomes and Impressions

Impression:
The subjective evaluation of an outcome from a specific point of view.

Contract:    Price =_c 2000, Quality =_c A, Quantity =_c 300
Fulfillment: Price =_f 2000, Quality =_f C, Quantity =_f 295

From one outcome o, several impressions are built: Imp(o, φ1), Imp(o, φ2), Imp(o, φ3).
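As a sketch, an impression function maps one aspect of an outcome to a rating in [-1, +1]; the rating function below (proportional shortfall on quantities) is an invented example, not ReGreT's definition:

```python
# One outcome: (contracted, fulfilled) values, from the slide's example
outcome = {"price": (2000, 2000), "quality": ("A", "C"), "quantity": (300, 295)}

def impression(outcome, aspect, rate):
    """Imp(o, phi): subjective evaluation in [-1, +1] of one aspect of an
    outcome, comparing the contracted and fulfilled values."""
    contracted, fulfilled = outcome[aspect]
    return rate(contracted, fulfilled)

# "maintains_agreed_quantities": small shortfalls rated close to +1
imp_qty = impression(outcome, "quantity",
                     lambda c, f: max(-1.0, 1.0 - 2.0 * abs(c - f) / c))
```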
![Page 100: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/100.jpg)
100
Reputationmodel
Witnessreputation
Systemreputation
Neigh-bourhoodreputation
ODB
DirectTrust
Credibility
IDB SDB
Trust
The ReGreTsystem
Reliability of the value based on:
• Number of outcomes
• Deviation: the greater the variability in the rating values, the more volatile the other agent is in fulfilling its agreements.
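A sketch of such a reliability measure; the linear ramp and the intimacy level of 10 outcomes are assumed parameters, not ReGreT's actual functions:

```python
def reliability(ratings, intimate=10):
    """Reliability in [0, 1] of a direct-trust value: grows with the number
    of outcomes (saturating at 'intimate' outcomes) and shrinks with the
    mean absolute deviation of the ratings (ratings lie in [-1, +1])."""
    if not ratings:
        return 0.0
    quantity = min(1.0, len(ratings) / intimate)
    mean = sum(ratings) / len(ratings)
    deviation = sum(abs(r - mean) for r in ratings) / len(ratings)
    return quantity * (1.0 - deviation)

# Few but consistent outcomes: moderate reliability
rel = reliability([0.8, 0.9, 0.7, 0.8, 0.8])
```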
![Page 101: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/101.jpg)
101
Reputationmodel
Witnessreputation
Systemreputation
Neigh-bourhoodreputation
ODB
DirectTrust
Credibility
IDB SDB
Trust
The ReGreTsystem
![Page 102: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/102.jpg)
102
Witness reputation
Reputation that an agent builds on another agent based on the beliefs gathered from society members (witnesses).

Problems of witness information:
– it can be false
– it can be incomplete
– the “correlated evidence” problem [Pearl, 88]

Functioning:
1. Find witnesses
• direct relation with the target
• use of sociograms (cut-points and central points)
2. Weight each recommendation with the witness’s credibility

Advantages:
+ minimizes the correlated evidence problem
+ reduces the number of queries
![Page 103: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/103.jpg)
103
Credibility model
Two methods are used to evaluate the credibility of witnesses (witnessCr):
• social relations (socialCr)
• past history (infoCr)
![Page 104: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/104.jpg)
104
Reputationmodel
Witnessreputation
Systemreputation
Neigh-bourhoodreputation
ODB
DirectTrust
Credibility
IDB SDB
Trust
The ReGreTsystem
![Page 105: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/105.jpg)
105
Neighbourhood reputation

The trust in the agents that are in the “neighbourhood” of the target agent, and their relation with it, are the elements used to calculate what we call the neighbourhood reputation.

ReGreT uses fuzzy rules to model this reputation, e.g.:

IF DT_{a→ni}(offers_good_quality) is X AND coop(b, ni) is low
THEN R_{a→b}(offers_good_quality) is X

IF DTRL_{a→ni}(offers_good_quality) is X’ AND coop(b, ni) is Y’
THEN RL_{a→b}(offers_good_quality) is T(X’, Y’)

(the first rule propagates a direct-trust value from a neighbour ni to the target b; the second computes the reliability RL of that value)
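A crisp, self-contained sketch in the spirit of these rules, with invented membership functions and a firing-strength-weighted defuzzification (none of these choices are ReGreT's actual definitions):

```python
def mu_low(coop):   # membership of the cooperation degree (in [0, 1]) in "low"
    return max(0.0, 1.0 - 2.0 * coop)

def mu_high(coop):  # membership in "high"
    return max(0.0, 2.0 * coop - 1.0)

def neighbourhood_reputation(dt_neighbour, coop_with_target):
    """Two illustrative rules:
      IF coop(b, ni) is high THEN R(a,b) takes the neighbour's direct-trust value
      IF coop(b, ni) is low  THEN R(a,b) is neutral (0)
    The result is the firing-strength-weighted average of the rule outputs."""
    rules = [(mu_high(coop_with_target), dt_neighbour),
             (mu_low(coop_with_target), 0.0)]
    total = sum(strength for strength, _ in rules)
    return sum(s * out for s, out in rules) / total if total else 0.0

# A distrusted neighbour that cooperates closely with the target
# drags the target's neighbourhood reputation down:
nr = neighbourhood_reputation(dt_neighbour=-0.8, coop_with_target=0.9)
```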
![Page 106: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/106.jpg)
106
Reputationmodel
Witnessreputation
Systemreputation
Neigh-bourhoodreputation
ODB
DirectTrust
Credibility
IDB SDB
Trust
The ReGreTsystem
![Page 107: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/107.jpg)
107
The idea behind the System reputation is to use the common knowledge about social groups and the role that the agent is playing in the society as a mechanism to assign reputation values to other agents.
The knowledge necessary to calculate a system reputation is usually inherited from the group or groups to which the agent belongs.
System reputation
![Page 108: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/108.jpg)
108
Trust decision
Reputationmodel
Witnessreputation
Systemreputation
Neigh-bourhoodreputation
DirectTrust
Trust
If the agent has a reliable direct trust value, it will use that as a measure of trust. If that value is not so reliable then it will use reputation.
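The decision rule above can be sketched in one line; the reliability threshold is an assumed parameter:

```python
def trust_measure(direct_trust, dt_reliability, reputation, threshold=0.7):
    """Use the direct-trust value when it is reliable enough; otherwise
    fall back on the reputation value (the threshold is an assumption)."""
    return direct_trust if dt_reliability >= threshold else reputation

# Too little direct experience (reliability 0.4): reputation is used instead.
t = trust_measure(direct_trust=0.9, dt_reliability=0.4, reputation=-0.2)  # -0.2
```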
![Page 109: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/109.jpg)
109
Conclusions
• Computational trust and reputation models are an essential part of autonomous social agents. It is not possible to talk about social agents without considering trust and reputation.
• Current trust and reputation models are still far from covering the necessities of an autonomous social agent.
• We have to change the way the trust and reputation system is considered in the agent architecture.
![Page 110: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/110.jpg)
110
Conclusions
• Tight integration with the rest of the agent’s modules, together with proactivity, is necessary to turn the trust and reputation system into a useful tool able to handle the kinds of situations a real social agent will face in virtual societies.
• To achieve that, more collaboration with other artificial intelligence areas is needed.
![Page 111: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/111.jpg)
111
Presentation outline
• Motivation
• Approaches to control the interaction
• Some definitions
• The computational perspective
• Computational trust and reputation models
– OpenPGP
– Marsh
– eBay/OnSale
– SPORAS & HISTOS
– TrustNet
– Fuzzy Models
– LIAR
– ReGreT
• ART
– The testbed
– Example
![Page 112: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/112.jpg)
112
The Agent Reputation and Trust Testbed
![Page 113: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/113.jpg)
113
Motivation
• Trust in MAS is a young field of research, experiencing breadth-wise growth
– many trust-modeling technologies
– many metrics for empirical validation
• Lack of unified research direction
– no unified objective for trust technologies
– no unified performance metrics and benchmarks
![Page 114: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/114.jpg)
114
An Experimental and Competition Testbed…
• Presents a common challenge to the research community
– facilitates solving of prominent research problems
• Provides a versatile, universal site for experimentation
– employs well-defined metrics
– identifies successful technologies
• Matures the field of trust research
– utilizes an exciting domain to attract the attention of other researchers and the public
![Page 115: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/115.jpg)
115
The ART Testbed
• A tool for
– Experimentation: researchers can perform easily-repeatable experiments in a common environment against accepted benchmarks
– Competitions: trust technologies compete against each other; the most promising technologies are identified
![Page 116: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/116.jpg)
116
Testbed Game Rules

(Diagram: several appraiser agents serve clients and share opinions and reputations with one another.)

Agents function as art appraisers with varying expertise in different artistic eras. For a fixed price, clients ask appraisers to provide appraisals of paintings from various eras. If an appraiser is not very knowledgeable about a painting, it can purchase "opinions" from other appraisers. Appraisers can also buy and sell reputation information about other appraisers. Appraisers whose appraisals are more accurate receive larger shares of the client base in the future. Appraisers compete to achieve the highest earnings by the end of the game.
![Page 117: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/117.jpg)
117
Step 1: Client and Expertise Assignments
• Appraisers receive clients who pay a fixed price to request appraisals
• Client paintings are randomly distributed across eras
• As game progresses, more accurate appraisers receive more clients (thus more profit)
![Page 118: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/118.jpg)
118
Step 2: Reputation Transactions
• Appraisers know their own level of expertise for each era
• Appraisers are not informed (by the simulation) of the expertise levels of other appraisers
• Appraisers may purchase reputations, for a fixed fee, from other appraisers
• Reputations are values between zero and one
– might not correspond to the appraiser’s internal trust model
– serve as a standardized format for inter-agent communication
![Page 119: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/119.jpg)
119
Step 2: Reputation Transactions
Requester → Provider message sequence:
1. The requester sends a request message to a potential reputation provider, identifying the appraiser whose reputation is requested.
2. The potential reputation provider sends an "accept" message.
3. The requester sends the fixed payment to the provider.
4. The provider sends the reputation information, which may not be truthful.
![Page 120: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/120.jpg)
120
Step 3: Certainty & Opinion Transactions
• For a single painting, an appraiser may request opinions (each at a fixed price) from as many other appraisers as desired
• The simulation “generates” opinions about paintings for opinion-providing appraisers
• Accuracy of opinion is proportional to opinion provider’s expertise for the era and cost it is willing to pay to generate opinion
• Appraisers are not required to truthfully reveal opinions to requesting appraisers
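The accuracy rule above can be sketched as follows; the Gaussian noise model and its scaling are assumptions for illustration, not the testbed's actual opinion generator:

```python
import random

def generate_opinion(true_value, expertise, cost_paid, rng=random):
    """Produce an opinion whose error shrinks as the provider's expertise
    for the era and the cost it pays to generate the opinion grow."""
    noise_sd = true_value * (1.0 - expertise) / max(cost_paid, 1e-9)
    return rng.gauss(true_value, noise_sd)

# A fully expert provider returns the exact value regardless of randomness:
op = generate_opinion(true_value=1000.0, expertise=1.0, cost_paid=2.0)  # 1000.0
```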
![Page 121: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/121.jpg)
121
Step 3: Certainty & Opinion Transactions
Requester → Provider message sequence:
1. The requester sends a certainty request message to potential providers, identifying an era.
2. Each potential provider sends a certainty assessment about the opinion it can provide for this era: a real number between 0 and 1; the provider is not required to truthfully report this assessment.
3. The requester sends opinion request messages to potential providers, identifying a painting.
4. The provider sends its opinion, which may not be truthful, and receives a fixed payment.
![Page 122: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/122.jpg)
122
Step 4: Appraisal Calculation
• Upon paying providers and before receiving opinions, the requesting appraiser submits to the simulation a weight (self-assessed reputation) for each other appraiser
• The simulation collects the opinions sent to the appraiser (appraisers may not alter weights or received opinions)
• The simulation calculates the “final appraisal” as the weighted average of the received opinions
• The true value of the painting and the calculated final appraisal are revealed to the appraiser
• The appraiser may use the revealed information to revise its trust models of other appraisers
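The final-appraisal rule is a plain weighted average; a sketch, assuming weights and opinions are indexed by provider name:

```python
def final_appraisal(opinions, weights):
    """Weighted average of the received opinions, using the weights
    (self-assessed reputations) submitted before the opinions arrived."""
    total = sum(weights[a] for a in opinions)
    return sum(weights[a] * v for a, v in opinions.items()) / total

appraisal = final_appraisal({"agentA": 1200.0, "agentB": 900.0},
                            {"agentA": 0.8, "agentB": 0.2})  # 1140.0
```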
![Page 123: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/123.jpg)
123
Analysis Metrics
• Agent-Based Metrics
– money in bank
– average appraisal accuracy
– consistency of appraisal accuracy
– number of each type of message passed
• System-Based Metrics
– system aggregate bank totals
– distribution of money among appraisers
– number of messages passed, by type
– number of transactions conducted
– evenness of transaction distribution across appraisers
![Page 124: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/124.jpg)
124
Conclusions
• The ART Testbed provides a tool for both experimentation and competition
– promotes solutions to prominent trust research problems
– features desirable characteristics that facilitate experimentation
![Page 125: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/125.jpg)
125
An example of using ART
1. Building an agent
– creating a new agent class
– strategic methods
2. Running a game
– designing a game
– running the game
3. Viewing the game
– running a game monitor interface
![Page 126: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/126.jpg)
126
Building an agent for ART
An agent is described by 2 files:
• a Java class (MyAgent.java)
– must be in the testbed.participant package
– must extend the testbed.agent.Agent class
• an XML file (MyAgent.xml)
– only specifying the agent Java class in the following way:
<agentConfig>
<classFile>
c:\ARTAgent\testbed\participants\MyAgent.class
</classFile>
</agentConfig>
![Page 127: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/127.jpg)
127
Strategic methods of the Agent class (1)
• For the beginning of the game
– initializeAgent(): to prepare the agent for a game
• For reputation transactions
– prepareReputationRequests(): to request reputation information (gossip) from other agents
– prepareReputationAcceptsAndDeclines(): to accept or refuse requests
– prepareReputationReplies(): to reply to confirmed requests
![Page 128: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/128.jpg)
128
Strategic methods of the Agent class (2)
• For certainty transactions
– prepareCertaintyRequests(): to request certainty about eras from other agents
– prepareCertaintyReplies(): to announce its own certainty about eras to requesters
• For opinion transactions
– prepareOpinionRequests(): to request opinions from other agents
– prepareOpinionCreationOrders(): to produce evaluations of paintings
– prepareOpinionReplies(): to reply to confirmed requests
– prepareOpinionProviderWeights(): to weight the opinions of other agents
![Page 129: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/129.jpg)
129
The strategy of this example of agent
• We will implement an agent with a very simple reputation model:
• it associates a reputation value with each other agent (initialised at 1.0)
• it only sends opinion requests to agents with reputation > 0.5
• no reputation requests are sent
• if an appraisal from another agent differs from the real value by less than 50%, its reputation is increased by 0.03
• otherwise it is decreased by 0.03
• if our agent receives an opinion request from an agent whose reputation is below 0.5, it provides a bad appraisal (cheaper to produce)
• otherwise its appraisal is honest
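The strategy can be sketched in Python (the real agent is Java code against the testbed.agent.Agent API; the clamping of reputation values to [0, 1] is an added assumption):

```python
class SimpleReputationModel:
    """The slide's toy model: start everyone at 1.0, adjust by 0.03 per
    observed appraisal, and use 0.5 as the trust threshold."""

    def __init__(self, others):
        self.rep = {a: 1.0 for a in others}

    def opinion_request_targets(self):
        # only send opinion requests to agents with reputation above 0.5
        return [a for a, r in self.rep.items() if r > 0.5]

    def update(self, agent, appraisal, real_value):
        # within 50% of the real value: +0.03, otherwise -0.03
        if abs(appraisal - real_value) < 0.5 * real_value:
            self.rep[agent] = min(1.0, self.rep[agent] + 0.03)
        else:
            self.rep[agent] = max(0.0, self.rep[agent] - 0.03)

    def answer_honestly(self, requester):
        # cheap, bad appraisals are produced for requesters we do not trust
        return self.rep.get(requester, 1.0) >= 0.5

model = SimpleReputationModel(["HonestAgent", "CheaterAgent"])
model.update("CheaterAgent", appraisal=10.0, real_value=100.0)  # off by 90%
```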
![Page 130: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/130.jpg)
130
Initialization
Reputation values are assigned to every agent
The agent class is extended
![Page 131: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/131.jpg)
131
Opinion requests
Opinion requests are only sent to agents with a reputation over 0.5
![Page 132: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/132.jpg)
132
Opinion Creation Order
If a requester has a bad reputation value, a cheap and bad opinion is created for it; otherwise an expensive and accurate one is produced.
![Page 133: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/133.jpg)
133
Updating reputations
According to the difference between opinions and real painting values, reputations are increased or decreased.
![Page 134: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/134.jpg)
134
Running a game with MyAgent
Parameters of the game:
• 3 agents: MyAgent, HonestAgent, CheaterAgent
• 50 time steps
• 4 painting eras
• average client share: 5 per agent
![Page 135: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/135.jpg)
135
How did my agent behave?
![Page 136: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/136.jpg)
136
References
[AbdulRahman, 97] A. Abdul-Rahman. The PGP trust model. EDI-Forum: the Journal of Electronic Commerce, 10(3):27-31, 1997.
[Barber, 83] B. Barber, The Logic and Limits of Trust, The meanings of trust: Technical competence and fiduciary responsibility, Rutgers University Press, Rutgers, NJ, United States of America, 1983, p. 7-25.
[Carbo et al., 03] J. Carbo, J. M. Molina and J. Dávila Muro, Trust Management Through Fuzzy Reputation, International Journal of Cooperative Information Systems, 12(1):135-155, 2003.
[Casare & Sichman, 05] S. J. Casare and J. S. Sichman, Towards a functional ontology of reputation, Proceedings of AAMAS’05, 2005.
[Castelfranchi, 00] C. Castelfranchi, Engineering Social Order, Proceedings of ESAW’00, 2000.
[Castelfranchi & Falcone, 98] C. Castelfranchi and R. Falcone, Principles of trust for MAS: Cognitive anatomy, social importance and quantification. Proc of ICMAS’98, pages 72-79, 1998.
[Conte & Paolucci, 02] R. Conte and M. Paolucci, Reputation in Artificial Societies. Social Beliefs for Social Order, Kluwer Academic Publishers, G. Weiss (eds), Dordrecht, The Netherlands, 2002.
[Dellarocas, 00] C. Dellarocas, Immunizing online reputation reporting systems against unfair ratings and discriminatory behavior, p. 150-157, Proceedings of the ACM Conference on "Electronic Commerce" (EC'00), October, ACM Press, New York, NY, United States of America, 2000.
[Dellarocas, 01] C. Dellarocas, Analyzing the economic efficiency of eBay-like online reputation reporting mechanisms, p. 171-179, Proceedings of the ACM Conference on "Electronic Commerce" (EC'01), October, ACM Press, New York, NY, United States of America, 2001.
[Demolombe & Lorini, 08] R. Demolombe and E. Lorini, Trust and norms in the context of computer security: a logical formalization. Proc of DEON’08, LNAI, 2008.
![Page 137: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/137.jpg)
137
References
[Fullam et al, 05] K. Fullam, T. Klos, G. Muller, J. Sabater-Mir, A. Schlosser, Z. Topol, S. Barber, J. Rosenschein, L. Vercouter and M. Voss, A Specification of the Agent Reputation and Trust (ART) Testbed: Experimentation and Competition for Trust in Agent Societies, Proceedings of AAMAS’05, 2005.
[Herzig et al, 08] A. Herzig, E. Lorini, J. F. Hubner, J. Ben-Naim, C. Castelfranchi, R. Demolombe, D. Longin and L. Vercouter. Prolegomena for a logic of trust and reputation, submitted to NorMAS’08.
[Luhmann, 79] N. Luhmann, Trust and Power, John Wiley & Sons, 1979.
[McKnight & Chervany, 02] D. H. McKnight and N. L. Chervany, What trust means in e-commerce customer relationship: an interdisciplinary conceptual typology, International Journal of Electronic Commerce, 2002.
[Mui et al., 02] L. Mui and M. Mohtashemi and A. Halberstadt, Notions of Reputation in Multi-agent Systems: A Review, Proceedings of Autonomous Agents and Multi-Agent Systems (AAMAS'02), p. 280-287, 2002, C. Castelfranchi and W.L. Johnson (eds), Bologna, Italy, July, ACM Press, New York, NY, United States of America.
[Muller & Vercouter, 05] G. Muller and L. Vercouter, Decentralized Monitoring of Agent Communication with a Reputation Model, Trusting Agents for trusting Electronic Societies, LNCS 3577, 2005.
[Pearl, 88] Pearl, J. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference, Morgan Kaufmann, San Francisco, 1988.
[Rehák et al., 05] M. Rehák and M. Pěchouček and P. Benda and L. Foltỷn, Trust in Coalition Environment: Fuzzy Number Approach, Proceedings of the Workshop on "Trust in Agent Societies" at Autonomous Agents and Multi-Agent Systems (AAMAS'05), p. 132-144, 2005, C. Castelfranchi and S. Barber and J. Sabater and M. P. Singh (eds) Utrecht, The Netherlands, July.
[Sabater, 04] J. Sabater, Evaluating the ReGreT system, Applied Artificial Intelligence, 18(9-10):797-813, 2004.
[Sabater & Sierra, 05] J. Sabater and C. Sierra, Review on computational trust and reputation models, Artificial Intelligence Review, 24(1):33-60, 2005.
![Page 138: AAMAS08Trust.ppt](https://reader031.vdocuments.us/reader031/viewer/2022012922/54be4d024a79592d518b465e/html5/thumbnails/138.jpg)
138
References
[Sabater-Mir & Paolucci, 06] Repage: REPutation and imAGE among limited autonomous partners, JASSS - Journal of Artificial Societies and Social Simulation, 9(2), 2006.
[Schillo & Funk, 99] M. Schillo and P. Funk, Learning from and about other agents in terms of social metaphors, Agents Learning About From and With Other Agents, 1999.
[Sen & Sajja, 02] S. Sen and N. Sajja, Robustness of reputation-based trust: Boolean case, Proceedings of Autonomous Agents and Multi-Agent Systems (AAMAS'02), p. 288-293, 2002, Bologna, Italy, M. Gini and T. Ishida and C. Castelfranchi and W. L. Johnson (eds), ACM Press, New York, NY, United States of America, vol.1.
[Shapiro, 87] S. P. Shapiro, The social control of impersonal trust, American Journal of Sociology, 1987, vol. 93, p. 623-658.
[Steiner, 03] D. Steiner, Survey: How do Users Feel About eBay's Feedback System? January, 2003, http://www.auctionbytes.com/cab/abu/y203/m01/abu0087/s02 .
[Zacharia et al., 99] G. Zacharia and A. Moukas and P. Maes, Collaborative Reputation Mechanisms in Electronic Marketplaces, Proceedings of the Hawaii International Conference on System Sciences (HICSS-32), vol. 08, 1999, p. 8026, IEEE Computer Society, Washington, DC, United States of America.