
In Search of Permission and Wealth-Sharing Around Personal Data
Efforts of startups to enable users to retain control over personal data are fledgling; machine learning data harvesting seen as compromising privacy
By John P. Desmond, Editor, AI in Business

Computer users who want control over their personal data, including the ability to grant permission for when and how it is used, and perhaps to share in the wealth generated when they do grant that permission, may find the options are few.
Some companies are thinking along those lines, however, and see it as an essential privacy issue for AI developers that individuals retain control over their personal data. “A core concept should be the ability to give the user insight into the data that they are sharing and the ability to stop sharing it or the ability to control where it is being shared,” stated Will LaSala, the field CTO at cybersecurity company OneSpan, in a recent account in Information Week.
OneSpan is known for its multifactor authentication and electronic signature software.
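The control LaSala describes, insight into what data is being shared plus the ability to stop or restrict that sharing, can be sketched as a simple per-category consent record. This is an illustrative sketch only; the names and structure here are hypothetical and not drawn from OneSpan's products.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ConsentRecord:
    """Hypothetical per-category consent record: what a user shares, and with whom."""
    category: str                         # e.g. "location" or "browsing history"
    shared_with: set = field(default_factory=set)
    active: bool = True                   # user can stop sharing entirely

    def grant(self, party: str) -> None:
        """Share this data category with one named party."""
        self.shared_with.add(party)

    def revoke(self, party: Optional[str] = None) -> None:
        """Stop sharing with one party, or (with no argument) with everyone."""
        if party is None:
            self.active = False
            self.shared_with.clear()
        else:
            self.shared_with.discard(party)

    def summary(self) -> str:
        """The 'insight' piece: show the user what is currently shared."""
        status = "active" if self.active else "stopped"
        return f"{self.category}: {status}, shared with {sorted(self.shared_with)}"

rec = ConsentRecord("location")
rec.grant("advertiser-a")
rec.grant("advertiser-b")
rec.revoke("advertiser-a")   # user restricts where the data goes
print(rec.summary())
rec.revoke()                 # user stops sharing altogether
```

The point of the sketch is that each of LaSala's three abilities (insight, stopping, and controlling where data goes) maps to a distinct, user-invokable operation.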
The high-permission, data-sharing approach has been tried by at least one other company: Glimpse Protocol, founded in 2019 in London with some $1.8 million in seed capital. According to an account on ExplodingTopics, “Glimpse Protocol aims to create a more transparent and equitable advertising ecosystem by empowering consumers to control their personal data and receive compensation for sharing it with advertisers.”
A visit to the company’s website yields a message saying business is currently “paused,” with this explanation: “The current system of programmatic advertising relies on the illegal and unnecessary harvesting of consumer data, which is then shared with hundreds of data brokers and adtech companies without adequate consent from the consumer, or fair value exchange.”
The message concludes, “Glimpse’s commercial activities are currently paused until the market is ready to shift to a more private model.”
Asked to clarify, cofounder and CEO Mark Stoter responded via LinkedIn to explain the pause. “We were too early,” he said, mentioning the impact of COVID and Google’s announced plan to phase out the use of cookies, which the advertising world relies on for targeting. He added, “Our model allows advertisers to reach the right audiences and measure effect, but without ever harvesting consumer data.” He said the company has a patent and is waiting for the right time to resurface.
Cavai.com Seeks to Collaborate with Customers
Approaches vary on the intersection of advertising and privacy. Ad tech startup Cavai.com seeks collaboration with customers. Writing recently on the company's blog, Cavai COO and cofounder Nikolai Pietilainen stated, “Creativity is the way brands get to engage with their audience … we can uplift the whole value chain. This all starts with wanting to understand users and listening to them.”
The company describes its offering as a global conversational advertising cloud that works with agencies, publishers, advertisers and other tech platforms. “We can invite users to actually tell us what they want,” he stated.
Users can also try to benefit directly, by getting paid, from the sites they choose to interact with. Market research firm Forrester reports that about a third of business-to-consumer marketers now go directly to consumers with an offer to exchange personal data for a deal, according to a recent account in The Washington Post.
For example, Tapestri, a startup based in Chicago, offers users cash in exchange for close to constant access to their location. “We say: Let Tapestri pay for your Netflix bill, let us buy your next pumpkin spice latte,” CEO Walter Harrison stated. “It adds up over time, and you’re allowing us to do that by allowing us to be in your back pocket.” Tapestri pays users from $8 to $25 a month for their data, he stated.
Forrester has coined the term “zero-party data sharing” to describe this arrangement, in which the customer intentionally and proactively shares data with a brand. This comes as brands are encountering more difficulty in tapping traditional sources of customer data, as a result of moves including Apple’s offer to iOS users to opt out of some tracking, Google’s announced plans to disable cookies for some users starting in 2024, as well as privacy legislation out of Europe and some US states, notably California.
“Consumers are becoming more privacy savvy,” stated Stephanie Liu, marketing analyst with Forrester, while describing zero-party data as a consensual transaction, where users voluntarily share knowing what they will get in return.
However, privacy experts including Nicole Ozer, technology and civil liberties director at the ACLU of Northern California, warn that the data exchange is more visible in this zero-party model, but has not changed the balance of power. The user still has little say over what the company does with their data or how long it gets stored.
Economist Glen Weyl, a researcher at Microsoft Research New England, theorized in 2018 about pay-for-data frameworks in which people would receive “data dividends,” returns on money made from personal data. The key would be for consumers to collectively bargain, the way a union would, for a fair price. Otherwise, the data aggregators would pay individual consumers “a pittance,” Weyl has stated.
Setting up a way for consumers to bargain in this way is a challenge. Weyl has stated that businesses claiming to pay for data tend to “ping pong between being exploitative and not very functional,” according to the Post report.
IEEE Standards Association Weighs In on Privacy
Grounding tech in reality is the purview of the IEEE, whose Standards Association set out the importance of data governance and privacy in a recent paper. “The widespread adoption of AI and machine learning (ML) brings increased focus on responsibility, including data governance and privacy of software-based systems,” stated the author, Srikanth Chandrasekaran.
Chandrasekaran, a senior director, has been the Foundational Technology Practice Lead at the IEEE Standards Association for the past nine years. He heads IEEE SA activities for the Asia Pacific region and is involved in developing an IEEE eLearning platform focused on skills bridging for students and lateral skilling of industry professionals.
With so many children today having access to the internet, IEEE SA has a focus on designing trustworthy digital experiences for children.
Standards addressing children’s data governance include IEEE 2089™ Standard for Age-Appropriate Digital Services Framework; IEEE P2089.1™ Standard for Online Age Verification; and IEEE 3527.1™ Standard for Digital Intelligence (DQ).
The effort aims to get such considerations into the engineering practices of developers of products incorporating AI intended for use by children. The author states, “Artificial Intelligence Systems (AIS) enabling many products and services are driven by algorithms invisible to users that still deeply affect their data, identity, and values. Despite the best intentions of a manufacturer, without having a methodology to analyze and test how an end user interprets a product, service, or system, a design process will prioritize the values of its creators.”
The values of AI product creators, however, may not align with those of users. “Responsible innovation in the algorithmic era requires a values-oriented methodology that complements traditional systems engineering,” Chandrasekaran stated. One example is the use of federated machine learning in data harvesting as a way to protect privacy.
Read the source articles and information in Information Week, the Glimpse Protocol website, the blog of Cavai.com, The Washington Post and the IEEE Standards Association.
(Write to the editor here; tell him what you want to read about in AI in Business.)