Formalizing Evidence and Trust for User Authorization

Project Award Number: IIS-0209059

Principal Investigator and Collaborator

Bharat Bhargava

Department of Computer Sciences

Purdue University

250 N. University Street

West Lafayette, IN 47907

Phone: 765-494-6013

Email: bb@cs.purdue.edu

http://www.cs.purdue.edu/people/bb

Leszek Lilien

Department of Computer Sciences

Purdue University

250 N. University Street

West Lafayette, IN 47907

Phone: 765-496-2718

Email: llilien@cs.purdue.edu

http://www.cs.purdue.edu/people/llilien

Keywords

Access control, Authorization, Evidence, Trust, Security

Project Summary

Trust characterizes the probability that a user will not harm the operations of an information system. User or site trustworthiness is needed in transaction processing, distributed database processing (consistency, integrity), peer-to-peer systems, web-based e-commerce systems, and route construction in ad hoc networks. We argue that credentials alone are not sufficient to certify that a user is trustworthy.

This research is developing formal models of trust, authorization, fraud, and privacy. It incorporates aspects of trust drawn both from social life and from computer science applications. Taking the associated contexts into account, it automates the evaluation of trust under uncertain evidence and dynamic interactions. Trust is being integrated with authorization and authentication mechanisms for use in an open computing environment so that applications can use these models.

This research presents an authorization framework based on uncertain evidence and dynamic trust. A prototype called TERA (Trust-Enhanced Role Assignment) has been built for experimental studies. The TERA prototype evaluates the trust of a user from her behaviors. It decides whether a user is authorized for an operation based on the policies, the evidence, and the degree of trust. The reliability of a piece of evidence is based on the trust placed in its provider. A user's trust value is dynamically updated as additional behavioral data become available. The trust information is managed by a reputation server.
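
As an illustration only, the sketch below (in Python) shows how such a decision might combine a policy's trust threshold, evidence discounted by the trust placed in its provider, and the user's current trust value. The function names, weights, and threshold are assumptions made for this sketch; they are not taken from the TERA prototype.

    # Hypothetical sketch of an evidence- and trust-based authorization check.
    # Weights and the policy threshold are illustrative, not TERA's actual values.
    def evidence_reliability(evidence, provider_trust):
        """Discount each piece of evidence by the trust placed in its provider."""
        return [strength * provider_trust[provider] for provider, strength in evidence]

    def authorize(user_trust, evidence, provider_trust, policy_threshold=0.7):
        """Grant the operation only if combined trust and evidence meet the policy."""
        if not evidence:
            combined = user_trust
        else:
            reliability = evidence_reliability(evidence, provider_trust)
            combined = 0.5 * user_trust + 0.5 * sum(reliability) / len(reliability)
        return combined >= policy_threshold

    # A user trusted at 0.8 presents one credential vouched for by a provider trusted at 0.9.
    print(authorize(0.8, [("provider_A", 1.0)], {"provider_A": 0.9}))   # True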

Four user behavior patterns have been identified. These patterns have been integrated into the TERA prototype to simulate users with different levels of trust, and they serve as benchmarks for evaluating trust and reputation systems. Two algorithms have been developed to determine a user's trust value from a sequence of her interactions and past behaviors. A classification algorithm has been designed to build user-role profiles. Experiments have been conducted on discovering the intention behind a sequence of behaviors. The research on fraud formalization is being combined with the study of system vulnerabilities to devise anomaly detectors, state transition analysis, and risk analysis.
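
As a concrete illustration of dynamically updating a trust value from a sequence of interactions, the sketch below uses a generic exponentially weighted scheme; it is not one of the two algorithms developed in the project, and the learning rate and initial value are assumptions.

    # Illustrative trust update over a sequence of interaction outcomes.
    # This generic exponentially weighted scheme is not one of the project's algorithms.
    def update_trust(trust, outcome, learning_rate=0.2):
        """Move the trust value toward the latest outcome (1.0 = good, 0.0 = harmful)."""
        return (1 - learning_rate) * trust + learning_rate * outcome

    trust = 0.5                        # neutral initial trust (assumed)
    for outcome in [1, 1, 1, 0, 1]:    # a mostly well-behaved user
        trust = update_trust(trust, outcome)
    print(round(trust, 3))             # above 0.5, but lowered by the bad interaction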

Publications and Products

  • B. Bhargava and Y. Zhong, "Authorization Based on Evidence and Trust", in Proceedings of International Conference on Data Warehousing and Knowledge Discovery (DaWaK), Sept. 2002.
  • E. Terzi, Y. Zhong, B. Bhargava, Pankaj, and S. Madria, "An Algorithm for Building User-Role Profiles in a Trust Environment", in Proceedings of International Conference on Data Warehousing and Knowledge Discovery (DaWaK), Sept. 2002.
  • B. Bhargava, Y. Zhong, and Y. Lu, "Fraud Formalization and Detection", in Proceedings of International Conference on Data Warehousing and Knowledge Discovery (DaWaK), Sept. 2003.
  • B. Bhargava, "Vulnerabilities and Fraud in Computer Systems", in Proceedings of the International Conference on Advances in Internet, Processing, Systems, and Interdisciplinary Research (IPSI), Oct. 2003.
  • L. Lilien and A. Bhargava, "From Vulnerabilities to Trust: A Road to Trusted Computing", in Proceedings of the International Conference on Advances in Internet, Processing, Systems, and Interdisciplinary Research (IPSI), Oct. 2003.
  • L. Lilien, "Developing Pervasive Trust Paradigm for Authentication and Authorization", in Proceedings of Third Cracow Grid Workshop (CGW), Oct. 2003.
  • P. Ruth, D. Xu, B. Bhargava, and F. Regnier, "E-notebook Middleware for Accountability and Reputation Based Trust in Distributed Data Sharing Communities", in Proceedings of the 2nd International Conference on Trust Management, Mar. 2004.
  • M. Jenamani, L. Lilien, and B. Bhargava, "Anonymizing Web Services Through a Club Mechanism with Economic Incentives", in Proceedings of International Conference on Web Services (ICWS), Jul. 2004.
  • Y. Zhong and B. Bhargava, "Using Entropy to Tradeoff Privacy and Trust", in Proceedings of the 1st NSF/NSA/AFRL workshop on Secure Knowledge Management (SKM), Sept. 2004.
  • Y. Lu, W. Wang, D. Xu, and B. Bhargava, "Trust-based Privacy Preservation for Peer-to-peer Data Sharing", in Proceedings of the 1st NSF/NSA/AFRL workshop on Secure Knowledge Management (SKM), Sept. 2004.
  • B. Bhargava, M. Jenamani, and Y. Zhong, "Impact of Cheating on Online Auctions", under submission.
  • Y. Zhong, Y. Lu, and B. Bhargava, "TERA: An Authorization Framework Based on Uncertain Evidence and Dynamic Trust", under submission.
  • Y. Zhong, B. Bhargava, L. Lilien, and Y. Lu, "Fraud Formalization, Prevention, and Detection", under submission.

Project Impact

Research on authorization, security, and privacy in database systems provides new measures, tradeoffs, and practical schemes. The prototypes under development are providing tools, measurements, and guidelines that can be used in applications such as e-commerce and transportation system security. Studies of user behavior on e-commerce Web sites and of fraud in e-auction systems can benefit from this research.

The misbehavior detection mechanisms have been adopted to improve security and privacy in distributed systems. An intruder identification mechanism has been developed to detect entities that give wrong or false information. A quorum-based method has been adopted to detect coordinated attacks. The trust relations among the members of a peer-to-peer system are applied to protect the privacy of peers. The trustworthiness of a peer is assessed based on its behaviors and other peers' recommendations.
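
The sketch below illustrates, with assumed weights and values, how a peer's trustworthiness might be computed from its directly observed behavior combined with recommendations discounted by the trust placed in each recommender. It is a simplified illustration, not the mechanism implemented in the project.

    # Illustrative combination of direct observations and peer recommendations.
    # The 0.6/0.4 weighting is an assumption for this sketch.
    def peer_trust(direct_score, recommendations, direct_weight=0.6):
        """recommendations: list of (recommender_trust, recommended_score) pairs."""
        if not recommendations:
            return direct_score
        weighted = sum(t * s for t, s in recommendations)   # discount by recommender trust
        total = sum(t for t, _ in recommendations)
        rec_score = weighted / total if total > 0 else direct_score
        return direct_weight * direct_score + (1 - direct_weight) * rec_score

    # A peer observed to behave well (0.9) with two recommendations of differing reliability.
    print(round(peer_trust(0.9, [(0.8, 0.7), (0.5, 0.4)]), 3))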

This research will also enhance the efficient use of machine learning and of incentive-based distribution of work.

This research can also be applied to improving healthcare delivery. The objective is achieved by enhancing trust in timely data exchange among patients, physicians, and nurses. A framework that enables distributed and pervasive data access in healthcare is proposed, with a focus on trust, integration, privacy, and usability.

The Ph.D. students are learning to formalize difficult concepts such as trust, evidence, and fraud. The design of experiments to quantify these concepts and evaluate them in terms of malicious behavior and interactions is unique in computer science. The research has taught students to apply formal methods from philosophy, statistics, and machine reasoning to practical problems in database processing. Two minority Ph.D. students have been trained in database security research practice and experiments. This research has also helped upgrade course material in database classes, moving beyond reliability, integrity, and security toward the notion of trust as a measure.

Goals, Objectives and Targeted Activities

The development of a full-scope prototype system called PRETTY (PRivatE and TrusTed sYstem) is underway. PRETTY extends TERA and incorporates issues of fraud. It will serve as a test bed for experimental studies of trust, fraud, and privacy, and it provides a platform to simulate privacy violators and users with different levels of trust.

The formalization and detection of fraud have been studied based on sequences of transactions. Several types of fraud have been identified, and the costs associated with them are being studied through experiments. Three deceiving intentions have been identified from the behavior patterns, and a deceiving intention predictor has been developed to detect unauthorized access and fraud.
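
The project's three deceiving intentions and its predictor are described in the publications listed above; the sketch below only illustrates the general idea with a hypothetical heuristic that flags a sequence of good interactions (used to build trust) followed by harmful ones.

    # Hypothetical heuristic for flagging a suspicious behavior sequence.
    # This is an illustration only, not the project's deceiving intention predictor.
    def looks_like_trust_building_fraud(outcomes, window=5):
        """outcomes: chronological list of 1 (good) / 0 (harmful) interactions."""
        if len(outcomes) < 2 * window:
            return False
        early = sum(outcomes[:window]) / window      # behavior while building trust
        recent = sum(outcomes[-window:]) / window    # behavior after trust is gained
        return early >= 0.8 and recent <= 0.4

    print(looks_like_trust_building_fraud([1, 1, 1, 1, 1, 0, 0, 1, 0, 0]))   # True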

Experiments are being conducted to study the detection capability of the deceiving intention predictor when multiple collaborators misbehave intentionally and in a coordinated manner. In the current implementation, the authenticity and integrity of the interaction histories and evidence are protected using cryptographic techniques.
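
As a generic example of such protection (not necessarily the techniques used in the prototype), an interaction record can be accompanied by an HMAC so that tampering is detected when the record is verified; the key and record format below are assumptions made for the sketch.

    # Generic example of protecting an interaction record with an HMAC.
    # The key, its management, and the record format are assumptions for this sketch.
    import hmac, hashlib, json

    SECRET_KEY = b"shared-secret-for-illustration-only"

    def sign_record(record: dict) -> str:
        payload = json.dumps(record, sort_keys=True).encode()
        return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

    def verify_record(record: dict, tag: str) -> bool:
        return hmac.compare_digest(sign_record(record), tag)

    record = {"user": "alice", "operation": "update", "outcome": "good"}
    tag = sign_record(record)
    print(verify_record(record, tag))                        # True
    print(verify_record({**record, "outcome": "bad"}, tag))  # False: tampering detected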

The tradeoff between privacy and trust has been investigated. The objective is to build a certain level of trust with the least loss of privacy. The research involves estimating the loss of privacy and the gain in trust that result from disclosing a set of evidence containing private information. In the probability method, privacy loss is measured as the difference between entropies before and after disclosure. Bayesian networks and kernel density estimation are being adopted to compute the conditional probabilities needed for the entropy evaluation. In the lattice method, privacy loss is measured as the least upper bound of the privacy levels of the candidate evidence.
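
For the probability method, the sketch below shows how a privacy loss could be computed as the drop in entropy of an adversary's belief about a private attribute after evidence is disclosed; the distributions are made up for the example and are not data from the project.

    # Privacy loss as a difference between entropies (illustrative distributions).
    from math import log2

    def entropy(dist):
        return -sum(p * log2(p) for p in dist if p > 0)

    prior = [0.25, 0.25, 0.25, 0.25]        # belief about a private attribute before disclosure
    posterior = [0.70, 0.20, 0.05, 0.05]    # belief after the evidence is disclosed

    privacy_loss = entropy(prior) - entropy(posterior)
    print(round(privacy_loss, 3))           # bits of privacy given up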

Area Background

Current research efforts grant privileges to a user based on her properties, which are demonstrated by digital credentials (evidence). Holding credentials, however, does not certify that the user will not carry out harmful actions. Authorization based on evidence as well as trust makes access control adaptable to users' misbehaviors. Existing computational trust management models can be broadly categorized into authorization-based and reputation-based trust management; our research effort integrates them into one framework. Evidence testifies to certain properties of an entity, or subject. Computational evidence theories, such as Bayesian networks, Dempster-Shafer theory, and subjective logic, deal with the evaluation and combination of evidence. In our research, Dempster-Shafer theory is used to integrate reputation, and subjective logic is adopted to evaluate recommendations.
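
For readers unfamiliar with Dempster-Shafer theory, the sketch below shows Dempster's rule of combination on a two-element frame {trustworthy, untrustworthy}; the mass values are made up, and the fragment only illustrates how two pieces of evidence about the same user can be fused.

    # Dempster's rule of combination on the frame {trustworthy, untrustworthy}.
    # Mass assignments are illustrative; they are not data from the project.
    def combine(m1, m2):
        """Each argument is (mass_trustworthy, mass_untrustworthy, mass_uncertain), summing to 1."""
        t1, u1, o1 = m1
        t2, u2, o2 = m2
        conflict = t1 * u2 + u1 * t2              # mass assigned to the empty set
        k = 1 - conflict                          # normalization factor
        t = (t1 * t2 + t1 * o2 + o1 * t2) / k
        u = (u1 * u2 + u1 * o2 + o1 * u2) / k
        o = (o1 * o2) / k
        return (t, u, o)

    # Two reputation reports about the same user, each with some uncertainty.
    print(combine((0.6, 0.1, 0.3), (0.7, 0.2, 0.1)))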

Area References

  • L. Mui, "Computational Models of Trust and Reputation: Agents, Evolutionary Games, and Social Networks", Ph.D. Thesis, EECS, MIT, 2002.
  • J. Park and R. Sandhu, "Role-based Access Control on the Web", ACM Transactions on Information and System Security, Vol. 4, No. 1, Feb. 2001.
  • G. Shafer, "A Mathematical Theory of Evidence", Princeton University Press, 1976.
  • A. Jøsang, "A Logic for Uncertain Probabilities", International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, Vol. 9, No. 3, June 2001.

Potential Related Projects

This project is related to "Vulnerability Analysis and Threat Assessment/Avoidance" funded by NSF.

Project Websites

http://www.cs.purdue.edu/homes/bb/NSFtrust

The website contains information about the project participants, including collaborators and graduate students. Links to three news websites at which this research effort has been publicized are provided. The research papers, presentations, and proposals can be downloaded. The TERA prototype software and its demonstration are available for public access.

Online Software

http://raidlab.cs.purdue.edu/zhong/NSFtrust/TERA.zip

The TERA prototype software consists of several TERM (Trust-Enhanced Role Mapping) servers and a reputation server. The TERM server cooperates with the RBAC-enhanced application server to monitor users' interactions and evaluate their trust values. It exchanges information with the reputation server, evaluates the reliability of evidence, and assigns roles to users. A policy declaration language and the corresponding policy interpreter have been developed to specify the requirements for role assignment. The reputation server stores the trust values of users reported by the TERM servers and provides them as a special type of evidence. This prototype realizes the distributed trust model. Different TERM servers may have different views of trust towards the same user. Four heuristic and subjective trust production algorithms have been designed to dynamically evaluate the trust value of a user based on interactions. These algorithms have been implemented in the TERA prototype.
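
For illustration only, a policy of this kind might be interpreted as a table mapping roles to trust thresholds and required evidence; the syntax, role names, and thresholds below are invented for this sketch and differ from the prototype's policy declaration language.

    # Hypothetical policy table and role assignment step (not the prototype's language).
    ROLE_POLICIES = [
        # (role, minimum trust value, required evidence types)
        ("administrator", 0.9, {"employee_credential", "security_clearance"}),
        ("editor",        0.7, {"employee_credential"}),
        ("reader",        0.3, set()),
    ]

    def assign_roles(trust_value, evidence_types):
        """Return every role whose trust threshold and evidence requirements are met."""
        return [role for role, threshold, required in ROLE_POLICIES
                if trust_value >= threshold and required <= evidence_types]

    print(assign_roles(0.75, {"employee_credential"}))   # ['editor', 'reader']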

Illustrations

http://raidlab.cs.purdue.edu/zhong/NSFtrust/Demo/index.html

The demonstration of the TERA prototype consists of four video clips, including an overview of the software, an introduction to the components, and two authorization examples. The demonstration illustrates (a) how authorization based on evidence and trust cooperates with role-based access control, (b) how a user's behaviors affect her trust value, (c) how the four trust production algorithms realize different heuristics and degrees of subjectivity, and (d) how trust values are propagated among TERM servers.