Smart Personal Assistants (SPAs), such as Amazon Alexa, Google Assistant and Apple Siri, leverage different AI techniques to provide convenient help and assistance to users. However, inappropriate information-sharing decisions can lead SPAs to disclose user information to undesired parties, or to mistakenly block desired parties' reasonable access in specific scenarios. Reports of privacy violations in SPAs, and the user concerns they raise, are well known and understood in the related literature. It is difficult for SPAs to decide automatically how data should be shared in accordance with users' privacy preferences. We argue that norms, regarded as shared standards of acceptable behaviour for groups and/or individuals, can be used to govern and reason about the best course of action for SPAs with regard to information sharing, and our work is the first to propose a practical model that addresses these issues and governs SPAs based on normative systems and the contextual integrity theory of privacy. We evaluated the performance of the model using a real dataset of user privacy preferences for SPAs; the results showed a marked and significant improvement in understanding user preferences and in making the right data-sharing decisions.
| Name | AIES 2022 - Proceedings of the 2022 AAAI/ACM Conference on AI, Ethics, and Society |
| Conference | 5th AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society, AIES 2022 |
| Period | 1/08/2022 → 3/08/2022 |
- personal data
- smart personal assistants
- voice assistants