You don't read privacy policies. And of course, that's because they're not actually written for you, or any of the other billions of people who click to agree to their inscrutable legalese. Instead, like bad poetry and teenagers' diaries, those millions upon millions of words are produced for the benefit of their authors, not readers—the lawyers who wrote those get-out clauses to protect their Silicon Valley employers.
"What if we visualize what’s in the policy for the user?" asks Hamza Harkous, an EPFL researcher who led the work, describing the thoughts that led the group to their work on Polisis and Pribot. "Not to give every piece of the policy, but just the interesting stuff… What if we turned privacy policies into a conversation?"
"The information is there, it defines how companies can use your data, but no one reads it," says Florian Schaub, a University of Michigan researcher who worked on the project. "So we want to foreground it."
But the researchers see their AI engine in part as groundwork for future tools. They suggest that future apps could use their trained AI to automatically flag data practices a user asks to be warned about, or to automate comparisons between different services' policies, ranking how aggressively each one siphons up and shares your sensitive data.
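The kind of flagging tool they envision could, in spirit, look something like the sketch below. This is a toy keyword matcher, not the researchers' trained model: the category names and keyword lists are invented for illustration, standing in for the machine-learning classifier that Polisis actually uses to label policy segments.

```python
# Toy stand-in for a trained privacy-policy classifier (illustrative only).
# A real system like Polisis labels policy segments with a neural model;
# here, invented keyword lists approximate the same interface.

WATCHED = {
    "third_party_sharing": ["third party", "third parties", "partners"],
    "location_tracking": ["location", "gps", "geolocation"],
}

def flag_practices(policy_text, watched=WATCHED):
    """Return the watched categories whose keywords appear in the policy."""
    text = policy_text.lower()
    return sorted(
        category
        for category, keywords in watched.items()
        if any(kw in text for kw in keywords)
    )

policy = (
    "We may share your information with third parties, "
    "including advertising partners, and collect precise GPS location."
)
print(flag_practices(policy))
```

A real app built on the researchers' engine would swap the keyword lookup for the trained classifier's predictions, but the surrounding logic, matching a policy's labeled practices against a user's watch list, would look much the same.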
"Caring about your privacy shouldn't mean you have to read paragraphs and paragraphs of text," says Michigan's Schaub. But with more eyes on companies' privacy practices—even automated ones—perhaps those information stewards will think twice before trying to bury their data collection bad habits under a mountain of legal minutiae.