Hurry and protect your data from Trump
Law professor Daniel Solove on how to safeguard your private details before an authoritarian administration takes over
Americans tend to assume that they’ve got nothing to hide. At the same time, they’ve long taken for granted that the government and the companies that run the apps they use already have a ton of their information (and that’s true), so most respond to privacy challenges with apathy. It’s no big deal. “Why would anyone care where I go,” they might ask, “or who I’m friends with, or what I’m buying?”
With the Trump administration about to take power, it might be time to reexamine all of those assumptions.
The apps you use for social media, search, or email are important parts of your daily life. And it’s true you can’t entirely avoid prying eyes — governmental or corporate — even if you’re using encrypted messaging apps like Signal. But it’s time to start caring about your privacy and the future of privacy for all Americans, as Trump gets ready to retake office. To get a better understanding of why — and what Americans need to demand from their representatives to protect their privacy and personal information in the years ahead — we spoke with Daniel Solove, a professor of intellectual property and technology law at the George Washington University Law School.
Many people assume they don’t have much privacy these days, and regardless, they don’t think they’ve done anything wrong. Why is it important for people to start caring about their privacy?
That’s kind of a “nothing to hide” argument, right? “I’ve got nothing to hide, so it’s fine.” There are many arguments against that. There could come a day when something you do is something the government doesn’t like, or something that could put you in some kind of peril. We’re already seeing signs that the government is going to target certain people.
A lot of people are going to have targets on their backs — where before it was much less so. There’s also the problem of inference. People think if they’re not revealing anything important, then it’s okay, and they’re safe. Who cares about what I buy at the supermarket or a store? It’s just stuff. What we see is that sophisticated AI algorithms can analyze the data people give off through pretty mundane purchases and determine very sensitive things about them.
How so?
There’s the very famous Target story. It’s actually more than a decade old. Target was able to analyze people’s purchases and find out if they were pregnant. It wasn’t like they went and bought baby products. In fact, the algorithm was trying to figure out if they were pregnant based on non-obvious purchases. What gave people away were correlations the algorithm found, like buying unscented products and cotton balls, and various other things people would never have suspected would reveal that they were pregnant.
When data is gathered, and it’s gathered more and more and more, and then combined and analyzed by these sophisticated algorithms, those algorithms can find patterns in this data and make inferences about people that reveal a lot more than they might expect. It’s very hard for a person to determine, “Here’s what I want to reveal and here’s what I want to conceal.”
This data can reveal quite a lot about them that they might not want to reveal. It can be bad in a lot of different ways. It’s bad if the inferences are accurate — if they get it right — because now there are a lot of details about someone that they didn’t want to reveal.
If you bought these products, and this is where you live, and here’s when you bought them… from that you can perhaps figure out someone’s religion. They’re not shopping on certain days, they’re not buying certain types of foods, they live in this zip code — that can pretty much give it away with a very high degree of accuracy.
So if the government wants to target a certain group, they can start using existing algorithms to determine where those people live and what they’re doing.