System to establish trust between policy systems and users

Information security – Access control or authentication – Stand-alone

Reexamination Certificate


Details

Type: Reexamination Certificate
Status: active
Patent number: 07958552
U.S. classifications: C726S001000, C713S164000, C713S165000, C713S166000

ABSTRACT:
A system and method are provided to establish trust between a user and a policy system that generates recommended actions in accordance with specified policies. Trust is introduced into the policy-based system by assigning a value, called the instantaneous trust index, to each execution of each policy. The instantaneous trust indices, for each policy, for each execution of a given policy, or for both, are combined into an overall trust index for a given policy or for the policy-based system as a whole. The recommended actions are processed in accordance with the level of trust associated with a given policy, as expressed by the trust indices. Manual user input is provided to monitor or change the recommended actions. In addition, reinforcement learning algorithms are used to further enhance the level of trust between the user and the policy-based system.
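The abstract's core mechanism — folding per-execution instantaneous trust indices into an overall trust index, then gating recommended actions on that trust level — can be sketched as follows. This is an illustrative sketch only: the class name, the exponential-moving-average combination rule, and the approval threshold are assumptions, not the formulas claimed in the patent.

```python
class PolicyTrust:
    """Per-policy trust tracker (illustrative sketch, not the patented method).

    Each policy execution receives an instantaneous trust index in [0, 1];
    these are combined into an overall trust index. An exponential moving
    average is one plausible combination rule.
    """

    def __init__(self, alpha=0.2, threshold=0.8):
        self.alpha = alpha          # weight given to the newest execution
        self.threshold = threshold  # trust level required for autonomous action
        self.overall = 0.0          # overall trust index for this policy

    def record_execution(self, instantaneous_index):
        """Fold one execution's instantaneous trust index into the overall index."""
        self.overall = (1 - self.alpha) * self.overall + self.alpha * instantaneous_index
        return self.overall

    def disposition(self):
        """Gate a recommended action on current trust: act autonomously once
        trust is high enough, otherwise route the action to the user."""
        return "auto-execute" if self.overall >= self.threshold else "require-approval"


# Usage: nine successful executions and one slightly degraded one
trust = PolicyTrust()
for outcome in [1.0, 1.0, 0.9, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]:
    trust.record_execution(outcome)
print(trust.disposition())  # trust has crossed the threshold: auto-execute
```

The manual-input and reinforcement-learning aspects of the claims would layer on top of such a tracker, e.g. by letting user overrides feed back into the instantaneous indices.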

REFERENCES:
patent: 6052723 (2000-04-01), Ginn
patent: 6088801 (2000-07-01), Grecsek
patent: 6654732 (2003-11-01), Naito et al.
patent: 6785728 (2004-08-01), Schneider et al.
patent: 6854016 (2005-02-01), Kraenzel et al.
patent: 7086085 (2006-08-01), Brown et al.
patent: 7233935 (2007-06-01), Chandler
patent: 2002/0026576 (2002-02-01), Das-Purkayastha et al.
patent: 2005/0044209 (2005-02-01), Doyle et al.
patent: 2005/0210448 (2005-09-01), Kipman et al.
patent: 2006/0168022 (2006-07-01), Levin et al.
S. Marsh, “Trust and Reliance in Multi-Agent Systems: a preliminary report”, (MAAMAW'92), Rome, Italy, 1992.
S. Marsh, Formalising Trust as a Computational Concept, PhD Thesis, Dept. of Mathematics and Computer Sciences, U. of Stirling, UK, 1994.
P. Horn, “Autonomic Computing: IBM's Perspective on the State of Information Technology”, IBM Corporation, http://www.research.ibm.com/autonomic/manifesto, Oct. 2001.
J.O. Kephart and D.M. Chess, “The Vision of Autonomic Computing”, IEEE Computer Magazine, Jan. 2003.
J.O. Kephart and W.E. Walsh, “An Artificial Intelligence Perspective on Autonomic Computing Policies”, Policies for Distributed Systems and Networks, 2004.
L.P. Kaelbling, M. Littman, A. Moore, “Reinforcement Learning: A Survey”, Journal of Artificial Intelligence Research, vol. 4, 1996.
T. Yu and M. Winslett, “A Unified Scheme for Resource Protection in Automated Trust Negotiation”, Proceedings of the 2003 IEEE Symposium on Security and Privacy.
R. Anderson, “Cryptography and Competition Policy—Issues with ‘Trusted Computing’”, PODC'03, Jul. 13-16, 2003.
T. Tang, P. Winoto and X. Niu, “Who Can I Trust? Investigating Trust Between Users and Agents in a Multi-Agent Portfolio Management System”, The University of Saskatchewan.
M. Lewis, “Designing for Human-Agent Interaction”, School of Information Sciences, University of Pittsburgh.
R. Parasuraman, “Designing Automation for Human Use: Empirical Studies and Quantitative Models”, The Catholic University of America (2000).
M. Itoh, G. Abe and K. Tanaka, “Trust in and Use of Automation: Their Dependence on Occurrence Patterns of Malfunctions”, IEEE (1999).
P. Madhaven and D. Wiegman, “A New Look at the Dynamics of Human-Automation Trust: Is Trust in Humans Comparable to Trust in Machines?”, Human Factors & Ergonomics Society (2004).
C. Jonker and J. Treur, “Formal Analysis of Models for the Dynamics of Trust Based on Experiences”, Vrije Universiteit Amsterdam, The Netherlands.
M. Schillo, P. Funk and M. Rovatsos, “Using Trust for Detecting Deceitful Agents in Artificial Societies”, Applied Artificial Intelligence Journal (2000).
G. Elofson, “Developing Trust with Intelligent Agents: an Exploratory Study”, Proceedings of 1st International Workshops on Trust, 1998.
E. Kandogan, P. Maglio, “Why Don't You Trust Me Anymore? Or the Role of Trust in Troubleshooting Activity of System Administrators”, CHI 2003 Conference.
A.K. Bandara, “A Goal-based Approach to Policy Refinement”, Proceedings 5th IEEE Workshop on Policies for Distributed Systems & Networks (Policy 2004).
S. Marsh, “Trust in Distributed Artificial Intelligence”, Artificial Social Systems, LNAI 830, Springer-Verlag, 94-112.
R. Barrett, “People and Policies: Transforming the Human-Computer Partnership”, Proceedings of the Fifth IEEE International Workshop on Policies for Distributed Systems and Networks (Policy 2004).
