Probabilistic privacy protection

Data processing: measuring, calibrating, or testing – Measurement system – Performance or efficiency evaluation

Reexamination Certificate


Details

Classification: C702S188000

Type: Reexamination Certificate

Status: active

Patent number: 06470299

ABSTRACT:

BACKGROUND
The present invention concerns gathering usage information and pertains particularly to providing privacy protection using probabilistic accumulation of data.
Effective marketing often requires an understanding of the history and habits of individuals or groups to which a product or other object is being marketed. However, many consumers value their privacy and are reluctant to provide information themselves or to have their activities monitored.
One reason consumers are hesitant to provide information or to be monitored is that they fear gathered information may be used inappropriately. Even though those who gather information may promise to hold obtained information in confidence, consumers often do not completely trust that information collected about them will not be misused.
It is desirable therefore to provide some mechanism that allows collection of information that is useful to marketers but protects the privacy of individuals.
SUMMARY OF THE INVENTION
In accordance with the preferred embodiments of the present invention, information about activities is gathered. Performed activities are monitored and recorded. Each recording of a performed activity has an accuracy level. The accuracy level indicates a probability level that any particular recorded activity accurately records a performed activity. Random selection of activities from a list of activities is used to inaccurately record some performed activities.
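The mechanism described above can be sketched as follows. This is an illustrative sketch only, assuming a simple model in which a recording is true with probability equal to the accuracy level and is otherwise replaced by an activity drawn uniformly from the activity list; the function name and signature are not from the patent text.

```python
import random

def record_activity(performed, activity_list, accuracy):
    """Record a performed activity at a given accuracy level.

    With probability `accuracy`, the activity actually performed is
    recorded. Otherwise, an activity selected at random from
    `activity_list` is recorded in its place, so any individual
    record may be untrue.
    """
    if random.random() < accuracy:
        return performed
    return random.choice(activity_list)
```

With an accuracy level of 1.0 every record is true; with an accuracy level of 0.0 the record is simply a random draw from the list and reveals nothing about the individual.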
For example, the monitored activities include accessing sites through a network or using a network to purchase products. In the preferred embodiment, a user is allowed to select the accuracy level. Recordings of the performed activities are aggregated with recordings of other performed activities to obtain aggregated information.
The present invention allows gathering of information while retaining a degree of privacy. Because the information is not always true, individual pieces of information are not worth much. However, when aggregated with other information, the aggregated information can be useful. The accuracy of the aggregated information depends on the probability of individual answers being true. If the individual answers have a higher probability of truth, the aggregated answers provide results closer to the true probability.
Because individual pieces of information may be untrue, the information is valuable only in aggregation. Hence, data collectors are required to aggregate information to generate useful information.
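One way an aggregator could recover useful statistics from such noisy records is sketched below. It assumes the uniform-substitution model noted earlier (each record is true with probability `accuracy`, otherwise uniform over the list of k activities); this particular estimator is not stated in the patent text. Under that model the expected observed count for activity a is accuracy × true_a + (1 − accuracy) × total / k, which can be inverted to estimate the true counts.

```python
def estimate_true_counts(observed, total, accuracy):
    """De-bias aggregated activity counts.

    `observed` is a list of recorded counts, one per activity in the
    activity list; `total` is the total number of records. Since
    E[observed_a] = accuracy * true_a + (1 - accuracy) * total / k,
    each true count is estimated by subtracting the expected noise
    contribution and dividing by the accuracy level.
    """
    k = len(observed)
    noise_per_activity = (1 - accuracy) * total / k
    return [(obs - noise_per_activity) / accuracy for obs in observed]
```

For example, with 4 activities, 100 records, and an accuracy level of 0.8, observed counts of [37, 29, 21, 13] de-bias to approximately [40, 30, 20, 10]; the lower the accuracy level, the larger the sample needed for a reliable estimate, which is why the data is valuable only in aggregation.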


REFERENCES:
patent: 6321263 (2001-11-01), Luzzi et al.
patent: 6356859 (2002-03-01), Talbot et al.
Goals of the NymIP Effort [online] Available: http://nymip.velvet.com/cvs/general/goals.html [Mar. 14, 2001].
The NymIP Effort [online] Available: http://nymip.velvet.com/ [Mar. 14, 2001].
Principles For Standardization and Interoperability in Web-based Digital Rights Management [online] Available: http://www.w3.org/2000/12/drm-ws/pp/hp-erickson.html [Mar. 13, 2001].
Untraceable Electronic Mail, Return Addresses, and Digital Pseudonyms [online] Available: http://world.std.com/~franl/crypto/chaum-acr-1981.html [Mar. 13, 2001].
Intel Nixes Chip-Tracking ID [online] Available: http://www.wired.com/news/politics/0,1283,35950,00.html [Mar. 13, 2001].
SafeNet 2000: Security and Privacy Leaders Gather at Microsoft Campus to Seek Solutions to Challenges Facing Internet Users [online] Available: http://www.microsoft.com/PressPass/features/2000/dec00/12-07safenet.asp [Mar. 13, 2001].
RealNetworks in Real Trouble [online] Available: http://www.wirednews.com/news/politics/0,1283,32459,00.html [Mar. 13, 2001].
David Chaum, “Achieving Electronic Privacy”, Scientific American, Aug. 1992, pp. 96-101.
Kenneth C. Laudon, “Markets and Privacy”, Communications of the ACM, Sep. 1996/vol. 39, No. 9, pp. 92-104.
Julie E. Cohen, “A Right To Read Anonymously: A Closer Look At ‘Copyright Management’ In Cyberspace”, originally published in 28 Conn. L. Rev. 981 (1996).
Nabil R. Adam and John C. Wortmann, “Security-Control Methods For Statistical Databases: A Comparative Study”, ACM Computing Surveys, vol. 21, No. 4, Dec. 1989, pp. 515-556.
Jan Schlorer, “Security of Statistical Databases: Multidimensional Transformation”, ACM Transactions on Database Systems, vol. 6, No. 1, Mar. 1981, pp. 95-112.
George T. Duncan and Sumitra Mukherjee, “Optimal Disclosure Limitation Strategy In Statistical Databases: Deterring Tracker Attacks Through Additive Noise”, Jun. 16, 1998, [online] Available: http://duncan.heinz.cmu.edu/GeorgeWeb.
