Collecting huge amounts of information about all of us and then using supercomputers to sift through, analyze and study it — this is a reality of modern life, and it can be a tremendously powerful thing.
Researchers can use techniques like those to identify genetic markers linked to breast cancer, better understand climate change or figure out how to combat hospital infections.
The collection of personal information has become so ubiquitous that even staunch privacy advocates now say it's impossible to build a protective wall around all your personal data. Rather, they say, expectations for privacy have to be redefined.
Danny Weitzner, who organized a privacy workshop at MIT with White House officials and researchers Monday, says he sees lots of promise in analyzing information — but he's not blind to how big business and the government could abuse it.
"Some marketers have discovered that a very good proxy for a high credit score is found in people who buy furniture coasters," says Weitzner, who used to advise the White House on technology and privacy and helped write the administration's proposed privacy bill of rights.
Yes, that's right, furniture coasters — those little felt pads you put underneath chair legs. Turns out that if you use them, you are probably a better credit risk. And credit scores can be used for all sorts of things, like employment decisions and getting an apartment.
There are rules for how those scores can be used; if someone checks your credit, generally they have to get your permission to do it. But that's not the case if they use a proxy for your credit score — like furniture coasters.
"No one's ever going to tell you when they turn you down for a job that it's because they pulled up your data broker report," says Julia Angwin, author of the new book Dragnet Nation. "Right now, the problem is you can't tell when your data is being used against you, so there is this kind of feeling of fear."
Angwin covered privacy and technology at the Wall Street Journal for years. In her book, she chronicles her attempt to erase her own digital trail and prevent it from ever being used against her. She spent thousands of dollars on gadgets and software to keep her identity secret, and hundreds of hours tracking down stashes of data about her — then begging companies to erase it.
In the end, she failed.
"After spending a year doing this, I felt this is not something any normal person would do or should do," Angwin says. "It's too much work, and I didn't actually achieve many of my goals."
Weitzner says we can never turn back the clock. None of us will ever be able to disappear. But he insists that privacy isn't dead — at least, not his definition of privacy.
"There is a tendency to think that privacy is synonymous with secrecy," he says. "That if you can keep your personal information secret then you have privacy. But if you don't, if it's not secret anymore — if third parties hold your personal information, then somehow you have lost all your privacy.
"I personally reject that notion of privacy," Weitzner adds. He says protecting privacy in the digital age means creating rules that require governments and businesses to be transparent about how they use our information.
Weitzner envisions a world where big databases — from the NSA's to those of financial firms — are audited to prevent abuse, and where encryption technology ensures that research involving troves of sensitive personal information doesn't open windows into individuals' private lives.
Surprisingly, Chris Calabrese, legislative counsel for privacy-related issues at the ACLU, agrees.
"This can't be a discussion about how one side wants to stop global warming by doing a better analysis of huge information on our energy use and the other side cares about privacy," he says. "That can't be the way the debate is framed."
Calabrese says he can't kill off the big data business; it's too late. In fact, he says, we probably don't want to anyway. But with the right rules and technologies, maybe we can get it under control.