The White House released an 85-page report, Big Data: Seizing Opportunities, Preserving Values. According to the report, big data, for all its very real benefits to the economy, health, and security, raises serious privacy concerns, and the sooner the boundaries around its use are defined, the better.
The report gets right to the point: The technical capabilities of big data have become so sophisticated that the opportunities it affords the users of data must be balanced against the value it provides to the providers of data. Social and ethical considerations must be weighed against the power of the technology.
A significant finding of this report is that big data analytics have the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace. Americans' relationship with data should expand, not diminish, their opportunities and potential.
"The report acknowledges data is going to be collected and made available by consumers. Now the real question becomes, how will it be used?" says John Lucker, principal and global advanced analytics & modeling market leader for Deloitte Analytics.
Private Sector Management of Data
The report does not gloss over big data's impact on business. The very products and services made available to consumers are increasingly based on insights from customer activity, even if those customers are unaware their data is being collected and analyzed.
The Obama Administration has supported America's leadership position in using big data to spark innovation, productivity, and value in the private sector. However, the near-continuous collection, transfer, and re-purposing of information in a big data world also raises important questions about individual control over personal data and the risks of its use to exploit vulnerable populations. While big data will be a powerful engine for economic growth and innovation, there remains the potential for a disquieting asymmetry between consumers and the companies that control information about them.
To keep this digital ecosystem in check, there has to be balanced value for both consumers and users of data. In many cases, businesses offer applications or tools in exchange for access to consumer identities. Consumers want the benefits of the apps and services for free or at low cost, but in the back of their minds they recognize they are giving companies access to their information.
Realistically, as long as people perceive they are getting something of value, they don't have much concern. An example is using a loyalty card at a grocery store: a weekly discount is well worth all the information customers are exchanging. Lucker explains it is rare for somebody to opt out of a loyalty card because they don't want corporations knowing what they buy. The value equation holds true.
The concern emerges when the customer is in some way disadvantaged, when they are not getting back value equal to the value they are providing the company.
How, then, does a customer recognize the value of their data? If they clearly knew the extent of the information gathered, would their privacy concerns become more significant? Perhaps, but to get there, customers first have to ask the question about their data privacy. And let's face it, few really care.
Education: Does It Even Matter?
The White House document emphasizes the need for more R&D on privacy-protection technology and for increased consumer education. "It's an acknowledgement that everyday consumers are providing significant information through voluntary use of apps and technologies, but there seems to be a gap in how consumers understand what they are providing and how to control it," explains Lucker.
For those interested in lying low, deleting apps, turning off a phone's geospatial tracking, using anonymous browsers, disabling cookies, and enabling "do not track" browser settings all help reduce digital exhaust. The average consumer, however, does not know how to control these things, or simply doesn't care, something like letting the clock on the VCR blink midnight.
"The government talks about the need to educate, and somehow these privacy issues are a concern. However, a lot of the ability to control identity is already there, but people choose not to pay attention, or don't know how to pay attention to it," adds Lucker. "They face a bit of a conundrum there."
[For more on the White House Report, read: White House Big Data Report: 5 Privacy Takeaways ]
Of course, people's willingness to learn might change if they started to see behaviors from commerce that have more of a "creep out" factor, like coupons that are a bit too targeted or revealing. That could cause consumers to reconsider the exchange and look into adjusting their level of exposure. As a good business practice, it would be wise for corporations to stay well clear of that line.
"Inside every company with analytics there need to be people who ask themselves, 'Just because we can, should we?'" Lucker adds. "If we created a creepiness scale, what's the rank? That's a form of self-regulation around good business practice, and less about government regulation."
The truth of the matter is, much of commerce is pretty good about being transparent about what is done with data, and how it will and will not be used. Every site has terms-of-use agreements, licenses, and privacy policies. All those things are ubiquitous. The problem is they are not always in plain English, and even when they are, expecting consumers to read and understand them is not very practical. Even so, Lucker suggests that if they did take a look at the terms, the average consumer would be surprised at how transparent they already are.
"Frankly, they don't pay attention to how broadly they've given their rights away to collectors. It's hard to fault commerce for having broad rights when consumers aren't paying attention anyway."