Title: Computational Trust and Reputation Models. Dr. Jordi Sabater-Mir, Dr. Laurent Vercouter
1 Computational Trust and Reputation Models
Dr. Jordi Sabater-Mir, Dr. Laurent Vercouter
9th European Agent Systems Summer School
2 Dr. Laurent Vercouter
G2I: Division for Industrial Engineering and Computer Sciences, EMSE, École des Mines de Saint-Étienne
Dr. Jordi Sabater-Mir
IIIA: Artificial Intelligence Research Institute, CSIC, Spanish National Research Council
3 Presentation index
- Motivation
- Approaches to control the interaction
- Some definitions
- The computational perspective
- Computational trust and reputation models
- eBay
- TrustNet
- LIAR
- ReGret
- Repage
- Prospective view
4Motivation
5 "A complete absence of trust would prevent one even getting up in the morning." (Luhmann, 1979)
6What we are talking about...
Mr. Yellow
7What we are talking about...
Two years ago...
Trust based on...
Direct experiences
Mr. Yellow
8What we are talking about...
Trust based on...
Third party information
Mr. Yellow
9What we are talking about...
Trust based on...
Third party information
Mr. Yellow
10What we are talking about...
Trust based on...
Reputation
Mr. Yellow
11What we are talking about...
Mr. Yellow
12What we are talking about...
13 Advantages of trust and reputation mechanisms
- Each agent is a norm enforcer and is also under surveillance by the others. No central authority is needed.
- Their nature allows them to reach where laws and central authorities cannot.
- Punishment is usually based on ostracism.
14 Problems of trust and reputation mechanisms
- Bootstrap problem.
- Exclusion must be a punishment for the outsider.
- Not all kinds of environments are suitable for applying these mechanisms.
15Approaches to control the interaction
16Different approaches to control the interaction
Security approach
17Different approaches to control the interaction
Agent identity validation. Integrity,
authenticity of messages. ...
I'm Alice
18Different approaches to control the interaction
Institutional approach
Security approach
19Different approaches to control the interaction
20Different approaches to control the interaction
Social approach
Institutional approach
Security approach
21Example P2P systems
22Example P2P systems
23Example P2P systems
24Different approaches to control the interaction
Social approach
Trust and reputation mechanisms are at this level.
Institutional approach
Security approach
They are complementary and cover different
aspects of interaction.
25Definitions
26 Trust
A couple of definitions that I like:
"Trust begins where knowledge ends: trust provides a basis for dealing with uncertain, complex, and threatening images of the future." (Luhmann, 1979)
"Trust is the outcome of observations leading to the belief that the actions of another may be relied upon, without explicit guarantee, to achieve a goal in a risky situation." (Elofson, 2001)
27Trust
- There are many ways of considering Trust.
- Trust as Encapsulated Interest
- Russell Hardin, 2002
I trust you because I think it is in your
interest to take my interests in the relevant
matter seriously. And this is because you value
the continuation of our relationship. You
encapsulate my interests in your own interests.
28Trust
- There are many ways of considering Trust.
- Instant trust
Trust is only a matter of the characteristics of
the trusted, characteristics that are not
grounded in the relationship between the truster
and the trusted. Example
Rug merchant in a bazaar
29Trust
- There are many ways of considering Trust.
- Trust as Moral
Trust is expected, and distrust or lack of trust is seen as a moral fault. One might argue that to act as though I do trust someone who is not evidently (or not yet) trustworthy is to acknowledge the person's humanity and possibilities, or to encourage the person's trustworthiness. (Russell Hardin, 2002)
30Trust
- There are many ways of considering Trust.
- Trust as Noncognitive
Trust based on affects, emotions... To say that we trust another in a non-cognitive way is to say that we are disposed to be trustful of them independently of our beliefs or expectations about their trustworthiness. (Becker, 1996)
- Trust as Ungrounded Faith
Notice here there is a power relation between the
truster and the trusted.
- Example
- infant towards her parents
- follower towards his leader
31Trust
There are many ways of considering Trust. Just leave this to philosophers, psychologists and sociologists... ...but let's keep an eye on it.
32Reputation
- Some definitions
- "The estimation of the consistency over time of an attribute or entity" (Herbig et al.)
- "Information that individuals receive about the behaviour of their partners from third parties and that they use to decide how to behave themselves" (Buskens, Coleman...)
- "The expectation of future opportunities arising from cooperation" (Axelrod, Parkhe)
- "The opinion others have of us"
33Computational perspective
34 Dimensions of trust (McKnight & Chervany, 02)
Disposition to trust
Trusting intention
Trust related behaviour
Trusting beliefs
Institution- based trust
35 The Functional Ontology of Reputation (Casare & Sichman, 05)
- The Functional Ontology of Reputation (FORe) aims at defining standard concepts related to reputation
- FORe includes
- Reputation processes
- Reputation types and natures
- Agent roles
- Common knowledge (information sources, entities, time)
- FORe aims to facilitate the interoperability of heterogeneous reputation models
36Processes needed for trust computation
- Initialisation
- first default value
- Evaluation
- judgement of a behaviour
- Punishment/Sanction
- calculation of reputation values
- Reasoning
- inference of trust intentions
- Decision
- decision to trust
- Propagation
- communication about reputation/trust information
37Agent roles
38 Reputation types (Casare & Sichman, 05)
- Primary reputation
- Direct reputation
- Observed reputation
- Secondary reputation
- Collective reputation
- Propagated reputation
- Stereotyped reputation
39 What is a good trust model?
- A good trust model should be (Fullam et al., 05)
- Accurate
- provide good predictions
- Adaptive
- evolve according to behaviour of others
- Quickly converging
- quickly compute accurate values
- Multi-dimensional
- Consider different agent characteristics
- Efficient
- Compute in reasonable time and cost
40 Why use a trust model in a MAS?
Bob
- Trust models allow
- Identifying and isolating untrustworthy agents
41 Why use a trust model in a MAS?
- Trust models allow
- Identifying and isolating untrustworthy agents
- Evaluating an interaction's utility
I can sell you the information you require
Bob
42 Why use a trust model in a MAS?
- Trust models allow
- Identifying and isolating untrustworthy agents
- Evaluating an interaction's utility
- Deciding whether and with whom to interact
I can sell you the information you require
Charles
I can sell you the information you require
Bob
43Presentation index
- Motivation
- Approaches to control the interaction
- Some definitions
- The computational perspective
- Computational trust and reputation models
- eBay
- TrustNet
- LIAR
- ReGret
- Repage
- Prospective view
44Computational trust and reputation models
- eBay
- TrustNet
- LIAR
- ReGret
- Repage
45 eBay model
- Model oriented to support trust between buyers and sellers.
- Completely centralized.
- Buyers and sellers may leave comments about each other after transactions.
- A comment consists of a line of text plus a numeric evaluation (-1, 0, +1).
- Each eBay member has a Feedback Score that is the summation of the numerical evaluations.
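The Feedback Score described above reduces to a plain sum. A minimal Python sketch (the function name is illustrative, not eBay's API):

```python
def feedback_score(evaluations):
    """eBay-style Feedback Score: the summation of the numeric
    evaluations (-1, 0, +1) left by partners after transactions."""
    if any(e not in (-1, 0, 1) for e in evaluations):
        raise ValueError("evaluations must be -1, 0 or +1")
    return sum(evaluations)

# Eight positives, one neutral, two negatives -> score of 6
print(feedback_score([1, 1, 1, 1, 1, 1, 1, 1, 0, -1, -1]))
```

Note that the result is a single global, context-independent value: false or biased feedback is only diluted by sheer volume, as the slides point out.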
46eBay model
47eBay model
48eBay model
- Specifically oriented to scenarios with the following characteristics:
- A lot of users (we are talking about millions)
- Few chances of repeating interactions with the same partner
- Easy to change identity
- Human oriented
- Considers reputation as a global property and uses a single value that is not dependent on the context.
- A great number of opinions that dilutes false or biased information is the only way to increase the reliability of the reputation value.
49Computational trust and reputation models
- eBay
- TrustNet
- LIAR
- ReGret
- Repage
50 TrustNet (Schillo & Funk, 99)
- Model designed to evaluate agents' honesty
- Completely decentralized
- Applied in a game theory context: the Iterated Prisoner's Dilemma (IPD)
- Each agent announces its strategy and chooses an opponent according to its announced strategy
- If an agent does not follow the strategy it announced, its opponent decreases its reputation
- The trust value of agent A towards agent B is:
T(A,B) = number of honest rounds / number of total rounds
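The honesty measure above can be sketched directly (a Python sketch; the class and method names are ours, not from the original paper):

```python
class TrustRecord:
    """Direct trust of one agent towards an IPD opponent:
    T(A,B) = number of honest rounds / number of total rounds."""

    def __init__(self):
        self.honest_rounds = 0
        self.total_rounds = 0

    def observe_round(self, followed_announced_strategy):
        # After each round, check whether the opponent's move
        # matched the strategy it had announced.
        self.total_rounds += 1
        if followed_announced_strategy:
            self.honest_rounds += 1

    def trust(self):
        if self.total_rounds == 0:
            return None  # no direct experience yet
        return self.honest_rounds / self.total_rounds
```

After three honest rounds out of four, for instance, the trust value is 0.75.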
51 TrustNet (Schillo & Funk, 99)
- Agents can communicate their trust values to speed up the convergence of trust models
- An agent can build a TrustNet of trust values transmitted by witnesses
- The final trust value of an agent towards another aggregates direct experiences and testimonies with a probabilistic function on the lying behaviour of witnesses
[Figure: a TrustNet, with trust values (1.0, 0.25, 0.8, 0.7, 0.2) labelling its edges]
52Computational trust and reputation models
- eBay
- TrustNet
- LIAR
- ReGret
- Repage
53 The LIAR model (Muller & Vercouter, 07)
- Model designed for the control of communications in a P2P network
- Completely decentralized
- Applied to a peer-to-peer protocol for query routing
- The global functioning of a P2P network relies on an expected behaviour of several nodes (or agents)
- Agents' behaviour must be regulated by a social control (Castelfranchi, 00)
54LIAR Social control of agent communications
Social Control
Definition of acceptability
Trust intentions
Reputations
Social commitments
Sanction
Representation
Interactions
55The LIAR agent architecture
Reputations
Interactions
56 Detection of violations
[Diagram: Evaluator and Propagator modules, processing observations (ob) via social commitment update, social policy generation, social policy evaluation, and proof reception/iteration within a Justification Protocol]
57Reputation types in LIAR
- Rp_target→beneficiary(facet, dimension, time) ∈ [-1, 1] ∪ {unknown}
- 7 different roles
- target
- participant
- observer
- evaluator
- punisher
- beneficiary
- propagator
- 5 reputation types based on
- direct interaction
- indirect interaction
- recommendation about observation
- recommendation about evaluation
- recommendation about reputation
58 Reputation computation
- Direct Interaction based Reputation
- Separate the social policies according to their state
- Associate a penalty to each set
- Reputation = weighted average of the penalties
- Reputation Recommendation based Reputation
- Based on trusted recommendations
- Reputation = weighted average of the received values, weighted by the reputation of the punisher
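Both computations above are weighted averages. A hedged Python sketch (the grouping of social policies and the weighting are simplified here; LIAR's actual formulas carry more parameters):

```python
def direct_interaction_reputation(policy_sets):
    """policy_sets: (penalty, weight) pairs, one per set of social
    policies grouped by state (e.g. fulfilled, violated).
    Reputation = weighted average of the penalties."""
    total = sum(w for _, w in policy_sets)
    if total == 0:
        return None  # no direct interactions yet
    return sum(p * w for p, w in policy_sets) / total

def recommendation_based_reputation(recommendations):
    """recommendations: (value, punisher_reputation) pairs; each
    received value is weighted by the reputation of the punisher
    that sent the recommendation."""
    total = sum(w for _, w in recommendations)
    if total == 0:
        return None
    return sum(v * w for v, w in recommendations) / total
```

Weighting recommendations by the sender's own reputation is what lets LIAR discount gossip from agents it already distrusts.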
59 LIAR decision process
[Diagram: the reputation types (DIbRp, ObsRcbRp, EvRcbRp, RpRcbRp) and GDtT are combined to derive a trust intention (Trust_int = trust or distrust); '()' marks values that are unknown, not relevant or not discriminant]
60Computational trust and reputation models
- eBay
- TrustNet
- LIAR
- ReGret
- Repage
61ReGreT
What is the ReGreT system? It is a modular trust
and reputation system oriented to complex
e-commerce environments where social relations
among individuals play an important role.
62The ReGreT system
ODB
IDB
SDB
Credibility
Witness reputation
Neighbourhood reputation
Reputation model
Direct Trust
System reputation
Trust
63The ReGreT system
ODB
IDB
SDB
Credibility
Witness reputation
Neighbourhood reputation
Reputation model
Direct Trust
System reputation
Trust
64 Outcomes and Impressions
- Outcome
- The initial contract
- to take a particular course of actions
- to establish the terms and conditions of a transaction
- AND the actual result of the contract
Example:
Contract: Price_c = 2000, Quality_c = A, Quantity_c = 300
Fulfillment: Price_f = 2000, Quality_f = C, Quantity_f = 295
65Outcomes and Impressions
66 Outcomes and Impressions
- Impression
- The subjective evaluation of an outcome from a specific point of view.
Outcome:
Contract: Price_c = 2000, Quality_c = A, Quantity_c = 300
Fulfillment: Price_f = 2000, Quality_f = C, Quantity_f = 295
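The contract/fulfillment pair and its subjective evaluation can be sketched as follows (a Python sketch; the ±1 rating rule is our illustrative choice, not ReGreT's actual impression function):

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    contract: dict      # terms agreed beforehand
    fulfillment: dict   # what actually happened

def impression(outcome, attribute):
    """Subjective evaluation of one facet of an outcome, in [-1, 1]:
    +1 if the term was fulfilled as contracted, -1 otherwise."""
    met = outcome.fulfillment[attribute] == outcome.contract[attribute]
    return 1.0 if met else -1.0

o = Outcome(contract={"price": 2000, "quality": "A", "quantity": 300},
            fulfillment={"price": 2000, "quality": "C", "quantity": 295})
print(impression(o, "price"), impression(o, "quality"))  # 1.0 -1.0
```

The point of the separation is that the same outcome can yield different impressions depending on which facet (price, quality, delivery) the evaluator cares about.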
67The ReGreT system
ODB
IDB
SDB
Credibility
Witness reputation
Neighbourhood reputation
Reputation model
Direct Trust
System reputation
Trust
68The ReGreT system
ODB
IDB
SDB
Credibility
Witness reputation
Neighbourhood reputation
Reputation model
Direct Trust
System reputation
Trust
69Witness reputation
- Reputation that an agent builds on another agent
based on the beliefs gathered from society
members (witnesses).
- Problems of witness information
- Can be false.
- Can be incomplete.
- It may suffer from the correlated evidence
problem.
70 [Figure: a social network of agents u1-u9 around the target agent D]
71 [Figure: the same network of agents u1-u9 around the target D]
Big exchange of sincere information and some kind of predisposition to help if it is possible.
72 [Figure: the same network of agents u1-u9 around the target D]
Agents tend to use all the available mechanisms to take some advantage from their competitors.
73 Witness reputation
Step 1: Identifying the witnesses
74 Witness reputation
Step 1: Identifying the witnesses
75 Witness reputation
Heuristic to identify groups and the best agents to represent them:
1. Identify the components of the graph.
2. For each component, find the set of cut-points.
3. For each component that does not have any cut-point, select a central point (the node with the largest degree).
[Figure: a graph of agents u2-u8 with cut-point b2, linked by cooperation relations]
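The three-step heuristic can be sketched with plain graph routines (a Python sketch; cut-points are found by brute force here, whereas graph libraries offer linear-time articulation-point algorithms):

```python
def components(graph):
    """Connected components of an undirected graph {node: set(neighbours)}."""
    seen, comps = set(), []
    for start in graph:
        if start in seen:
            continue
        comp, stack = set(), [start]
        while stack:
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(graph[n] - comp)
        seen |= comp
        comps.append(comp)
    return comps

def cut_points(graph, comp):
    """Nodes of a component whose removal disconnects it (brute force)."""
    cuts = set()
    for node in comp:
        rest = {n: (graph[n] & comp) - {node} for n in comp if n != node}
        if len(components(rest)) > 1:
            cuts.add(node)
    return cuts

def representatives(graph):
    """Step 2/3: cut-points of each component, or its highest-degree
    node when the component has no cut-point."""
    reps = []
    for comp in components(graph):
        cuts = cut_points(graph, comp) if len(comp) > 1 else set()
        reps.append(cuts or {max(comp, key=lambda n: len(graph[n]))})
    return reps

# Hypothetical graph: u3 is the cut-point separating u2 from u5,
# so it represents the whole component:
print(representatives({"u2": {"u3"}, "u3": {"u2", "u5"}, "u5": {"u3"}}))
```

Asking only the representatives keeps the number of witness queries small while still covering every group of the social graph.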
76 Witness reputation
Step 1: Identifying the witnesses
- Grouping and selecting the most representative witnesses
77 Witness reputation
Step 1: Identifying the witnesses
- Grouping and selecting the most representative witnesses
[Figure: witnesses u2, u3 and u5 selected from the trade graph]
78 Witness reputation
Step 1: Identifying the witnesses (witnesses u2, u3, u5)
Step 2: Who can I trust?
79The ReGreT system
ODB
IDB
SDB
Credibility
Witness reputation
Neighbourhood reputation
Reputation model
Direct Trust
System reputation
Trust
80 Credibility model
- Two methods are used to evaluate the credibility of witnesses
Credibility (witnessCr)
81The ReGreT system
ODB
IDB
SDB
Credibility
Witness reputation
Neighbourhood reputation
Reputation model
Direct Trust
System reputation
Trust
82 Neighbourhood reputation
- The trust on the agents that are in the neighbourhood of the target agent, and their relation with it, are the elements used to calculate what we call the Neighbourhood reputation.
ReGreT uses fuzzy rules to model this reputation:
IF ... is X AND coop(b, ...) is low THEN ... is X
IF ... is X AND coop(b, ...) is Y THEN ... is T(X,Y)
83The ReGreT system
ODB
IDB
SDB
Credibility
Witness reputation
Neighbourhood reputation
Reputation model
Direct Trust
System reputation
Trust
84 System reputation
- The idea behind the System reputation is to use the common knowledge about social groups, and the role that the agent is playing in the society, as a mechanism to assign reputation values to other agents.
- The knowledge necessary to calculate a system reputation is usually inherited from the group or groups to which the agent belongs.
85 Trust
- If the agent has a reliable direct trust value, it will use that as a measure of trust. If that value is not reliable enough, then it will use reputation.
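That fallback rule can be sketched in a few lines (a Python sketch; the 0.5 threshold is our illustrative choice, and ReGreT actually weighs the sources by explicit reliability measures rather than a hard cut-off):

```python
def trust(direct_trust, direct_reliability, reputation, threshold=0.5):
    """Combine ReGreT's sources (sketch): use the direct trust value
    when its reliability is high enough, otherwise fall back on the
    aggregated reputation value."""
    if direct_trust is not None and direct_reliability >= threshold:
        return direct_trust
    return reputation

print(trust(0.9, 0.8, 0.2))  # reliable direct experience wins: 0.9
print(trust(0.9, 0.1, 0.2))  # unreliable direct value -> reputation: 0.2
```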
86Computational trust and reputation models
- eBay
- TrustNet
- LIAR
- ReGret
- Repage
87RepAge
What is the RepAge model? It is a reputation
model evolved from a cognitive theory by Conte
and Paolucci. The model is designed with an
special attention to the internal representation
of the elements used to build images and
reputations as well as the inter-relations of
these elements.
88 Image vs Reputation
- Both are social evaluations.
- They concern other agents' (targets') attitudes toward socially desirable behaviour, but...
- whereas image consists of a set of evaluative beliefs about the characteristics of a target, reputation concerns the voice that is circulating about the same target.
- Reputation is a belief about the existence of a communicated evaluation. It is a meta-belief.
- This has important consequences:
- To accept a meta-belief does not imply accepting the contained belief.
89The Repage system
91 RepAge memory
[Figure: the RepAge memory, a network of predicates (P) organised in levels]
96 [Figure: predicates in the RepAge memory: Comm(w, image(a,seller)), ValComm(w, image(a,seller)), CandImage(a,seller), Image(a,seller), and Image(w,informant)]
99 The analyzer
- The interplay between image and reputation might be a cause of uncertainty and inconsistency.
- Inconsistencies do not necessarily lead to a state of cognitive dissonance, nor do they always urge the system to find a solution.
- For example, an inconsistency between one's own image of a given target and its reputation creates no problem for the system.
- However, a contradiction between one's own evaluations is sometimes possible:
- my direct experience may be confirmed in further interaction, but at the same time it may be challenged by the image I believe others, whom I trust a lot, have formed about the same target
- What will I do in such a condition? Will I go ahead and sign a contract, maybe a low-cost one, just to acquire a new piece of direct evidence, or will I check the reliability of my informants?
- The picture is rather complex, and the number of possibilities is bound to increase at every step, making the application of rule-based reasoning computationally heavy.
100 The Analyzer
- The main task of the analyzer is to propose actions that
- can improve the accuracy of the predicates in the Repage memory, and
- can solve cognitive dissonances, trying to produce a situation of certainty.
- For each possible action
- two copies of the current memory will be instantiated,
- to which the hypothetical new information (good/bad) will be added.
- the effects will be evaluated in terms of the change in the image of the target in the specified role
- the wrapper agent architecture will try to combine informational value with other costs and benefits in a situated ordering.
101 Current work with the RepAge architecture
- Agents that are able to justify the values of images and reputations (the LRep language).
- Formalization that allows an agent to reason about the elements that make up an image and/or a reputation.
- Dynamic ontology mapping.
102A prospective view
Is that enough?
103The lieutenant and the merchant Trifonov
(From The Brothers Karamazov Dostoyevsky)
Lieutenant
Trifonov
- It was a secret exchange. No contract.
104The lieutenant and the merchant Trifonov
(From The Brothers Karamazov Dostoyevsky)
Replacement notification
I've never received any money from you... and couldn't possibly have received any.
Lieutenant
Trifonov
105 The lieutenant and the merchant Trifonov
- If instead of the lieutenant it were a virtual agent using a current trust and reputation model, it would also be in trouble.
- Previous interactions were perfect, so Trifonov's trustworthiness would be very high.
- All current models would suggest going ahead with the interaction.
- To be successful in this situation implies asking about the reasons for thinking the relevant party to be trustworthy.
106A prospective view
Current models
Planner
Trust Reputation system
Decision mechanism
Inputs
Comm
Black box
Agent
Reactive
107A prospective view
Current models
Planner
Trust Reputation system
Value
Decision mechanism
Inputs
Comm
Black box
Agent
Reactive
108A prospective view
The next generation?
Planner
Trust Reputation system
Decision mechanism
Inputs
Comm
Agent
109A prospective view
The next generation?
Planner
Decision mechanism
Inputs
Comm
Agent
110 Conclusions
- Computational trust and reputation models are an essential part of autonomous social agents. It is not possible to talk about social agents without considering trust and reputation.
- Current trust and reputation models are still far from covering the needs of an autonomous social agent.
- We have to change the way the trust and reputation system is considered in the agent architecture.
111 Conclusions
- Tight integration with the rest of the modules of the agent, and proactivity, are necessary to transform the trust and reputation system into a useful tool able to solve the kind of situations a real social agent will face in virtual societies.
- To achieve that, more collaboration with other artificial intelligence areas is needed.
112Comparison among models
- Slide 11, Guillaume's thesis
- Some slides to show that there are a lot of
models and they are quite different.
Jordi
113Presentation index
- Motivation
- Approaches to control the interaction
- Some definitions
- The computational perspective
- Computational trust and reputation models
- eBay
- TrustNet
- LIAR
- ReGret
- Repage
- Prospective view
114The Agent Reputation and Trust Testbed
115Motivation
- Trust in MAS is a young field of research,
experiencing breadth-wise growth - Many trust-modeling technologies
- Many metrics for empirical validation
- Lack of unified research direction
- No unified objective for trust technologies
- No unified performance metrics and benchmarks
116An Experimental and Competition Testbed
- Presents a common challenge to the research
community - Facilitates solving of prominent research
problems - Provides a versatile, universal site for
experimentation - Employs well-defined metrics
- Identifies successful technologies
- Matures the field of trust research
- Utilizes an exciting domain to attract attention
of other researchers and the public
117The ART Testbed
- A tool for
- Experimentation Researchers can perform
easily-repeatable experiments in a common
environment against accepted benchmarks - Competitions Trust technologies compete against
each other the most promising technologies are
identified
118Testbed Game Rules
If an appraiser is not very knowledgeable about a
painting, it can purchase "opinions" from other
appraisers.
For a fixed price, clients ask appraisers to
provide appraisals of paintings from various eras.
Agents function as art appraisers with varying
expertise in different artistic eras.
Opinions and Reputations
Client Share
Appraisers whose appraisals are more accurate
receive larger shares of the client base in the
future.
Appraisers can also buy and sell reputation
information about other appraisers.
Appraisers compete to achieve the highest
earnings by the end of the game.
119Step 1 Client and Expertise Assignments
- Appraisers receive clients who pay a fixed price
to request appraisals - Client paintings are randomly distributed across
eras - As game progresses, more accurate appraisers
receive more clients (thus more profit)
120 Step 2: Reputation Transactions
- Appraisers know their own level of expertise for each era
- Appraisers are not informed (by the simulation) of the expertise levels of other appraisers
- Appraisers may purchase reputations, for a fixed fee, from other appraisers
- Reputations are values between zero and one
- Might not correspond to the appraiser's internal trust model
- Serves as a standardized format for inter-agent communication
121 Step 2: Reputation Transactions
1. Requester sends a request message to a potential reputation provider, identifying the appraiser whose reputation is requested
2. Potential reputation provider sends an accept message
3. Requester sends the fixed payment to the provider
4. Provider sends the reputation information, which may not be truthful
122 Step 3: Opinion Transactions
- For a single painting, an appraiser may request opinions (each at a fixed price) from as many other appraisers as desired
- The simulation generates opinions about paintings for opinion-providing appraisers
- The accuracy of an opinion is proportional to the opinion provider's expertise for the era and the cost it is willing to pay to generate the opinion
- Appraisers are not required to truthfully reveal opinions to requesting appraisers
123 Step 3: Opinion Transactions
1. Requester sends a request message to a potential opinion provider, identifying the painting
2. Potential provider sends a certainty assessment about the opinion it can provide
- A real number in (0, 1)
- Not required to truthfully report the certainty assessment
3. Requester sends the fixed payment to the provider
4. Provider sends the opinion, which may not be truthful
124 Step 4: Appraisal Calculation
- Upon paying providers and before receiving opinions, the requesting appraiser submits to the simulation a weight (self-assessed reputation) for each other appraiser
- The simulation collects the opinions sent to the appraiser (appraisers may not alter weights or received opinions)
- The simulation calculates the final appraisal as the weighted average of the received opinions
- The true value of the painting and the calculated final appraisal are revealed to the appraiser
- The appraiser may use the revealed information to revise its trust models of other appraisers
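The simulation's final-appraisal step reduces to a weighted mean of the received opinions. A sketch (we assume a plain weighted average normalised over the submitted weights; the testbed's exact formula is not shown in the slides):

```python
def final_appraisal(opinions, weights):
    """opinions: provider -> opinion value; weights: provider -> the
    requester's self-assessed reputation weight for that provider.
    Final appraisal = weighted average of the received opinions."""
    used = [p for p in opinions if weights.get(p, 0) > 0]
    total = sum(weights[p] for p in used)
    if total == 0:
        raise ValueError("no weighted opinions received")
    return sum(weights[p] * opinions[p] for p in used) / total

# Two opinions, the first provider trusted three times more:
print(final_appraisal({"p1": 1000, "p2": 2000},
                      {"p1": 3.0, "p2": 1.0}))  # 1250.0
```

Because the weights are locked in before the opinions arrive, an appraiser's earnings depend directly on how well its trust model predicted each provider's accuracy.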
125Analysis Metrics
- Agent-Based Metrics
- Money in bank
- Average appraisal accuracy
- Consistency of appraisal accuracy
- Number of each type of message passed
- System-Based Metrics
- System aggregate bank totals
- Distribution of money among appraisers
- Number of messages passed, by type
- Number of transactions conducted
- Evenness of transaction distribution across
appraisers
126Conclusions
- The ART Testbed provides a tool for both
experimentation and competition - Promotes solutions to prominent trust research
problems - Features desirable characteristics that
facilitate experimentation
127An example of using ART
- Building an agent
- creating a new agent class
- strategic methods
- Running a game
- designing a game
- running the game
- Viewing the game
- Running a game monitor interface
128 Building an agent for ART
- An agent is described by 2 files
- a Java class (MyAgent.java)
- must be in the testbed.participant package
- must extend the testbed.agent.Agent class
- an XML file (MyAgent.xml)
- only specifying the agent Java class in the following way:
<agentConfig>
  <classFile>
    c:\ARTAgent\testbed\participants\MyAgent.class
  </classFile>
</agentConfig>
129 Strategic methods of the Agent class (1)
- For the beginning of the game
- initializeAgent()
- To prepare the agent for a game
- For reputation transactions
- prepareReputationRequests()
- To ask other agents for reputation information (gossip)
- prepareReputationAcceptsAndDeclines()
- To accept or refuse requests
- prepareReputationReplies()
- To reply to confirmed requests
130Strategic methods of the Agent class (2)
- For opinion transactions
- prepareOpinionRequests()
- To ask other agents for opinions
- prepareOpinionCertainties()
- To announce its own expertise to a requester
- prepareOpinionRequestConfirmations()
- To confirm/cancel requests to providers
- prepareOpinionCreationOrders()
- To produce evaluations of paintings
- prepareOpinionProviderWeights()
- To weight the opinion of other agents
- prepareOpinionReplies()
- To reply to confirmed requests
131 The strategy of this example agent
- We will implement an agent with a very simple reputation model
- It associates a reputation value with each other agent (initialized at 1.0)
- It only sends opinion requests to agents with reputation > 0.5
- No reputation requests are sent
- If an appraisal from another agent differs from the real value by less than 50%, that agent's reputation is increased by 0.03
- Otherwise it is decreased by 0.03
- If our agent receives an opinion request from another agent whose reputation is less than 0.5, it provides a bad appraisal (cheaper)
- Otherwise its appraisal is honest
132 Initialization
The agent class is extended. Reputation values are assigned to every agent.
133 Opinion requests
Opinion requests are only sent to agents with a reputation over 0.5.
134 Opinion Creation Order
If a requester has a bad reputation value, a cheap and bad opinion is created for it. Otherwise, an expensive and accurate one is created.
135 Updating reputations
According to the difference between opinions and real painting values, reputations are increased or decreased.
136Running a game with MyAgent
- Parameters of the game
- 3 agents: MyAgent, HonestAgent, CheaterAgent
- 50 time steps
- 4 painting eras
- average client share: 5 per agent
137 How did my agent behave?
Wow! I should think about participating in the next competition!
138 References
- [Casare & Sichman, 05] S. J. Casare and J. S. Sichman, "Towards a functional ontology of reputation", Proceedings of AAMAS'05, 2005
- [Castelfranchi, 00] C. Castelfranchi, "Engineering Social Order", Proceedings of ESAW'00, 2000
- [Fullam et al., 05] K. Fullam, T. Klos, G. Muller, J. Sabater-Mir, A. Schlosser, Z. Topol, S. Barber, J. Rosenschein, L. Vercouter and M. Voss, "A Specification of the Agent Reputation and Trust (ART) Testbed: Experimentation and Competition for Trust in Agent Societies", Proceedings of AAMAS'05, 2005
- [McKnight & Chervany, 02] D. H. McKnight and N. L. Chervany, "What trust means in e-commerce customer relationships: an interdisciplinary conceptual typology", International Journal of Electronic Commerce, 2002
- [Muller & Vercouter, 05] G. Muller and L. Vercouter, "Decentralized Monitoring of Agent Communication with a Reputation Model", Trusting Agents for Trusting Electronic Societies, LNCS 3577, 2005
- [Sabater, 04] "Evaluating the ReGreT system", Applied Artificial Intelligence, 18 (9-10): 797-813, 2004
- [Sabater & Sierra, 05] "Review on computational trust and reputation models", Artificial Intelligence Review, 24 (1): 33-60, 2005
- [Sabater-Mir & Paolucci, 06] "Repage: REPutation and imAGE among limited autonomous partners", JASSS - Journal of Artificial Societies and Social Simulation, 9 (2), 2006
- [Schillo & Funk, 99] M. Schillo and P. Funk, "Learning from and about other agents in terms of social metaphors", Agents Learning About, From and With Other Agents, 1999