The world of privacy is a constant battlefield. It’s not a one-time decision where, once you’ve taken a single step, you’re good until the end of time. Instead, you have to stay abreast of the research, studying the ways privacy is constantly being eroded so that you can take the appropriate steps to respond.
If you’ve ever read through a privacy policy for an app, website, or contract, you’ve likely noticed language stating that your data may be sold to third parties. Exactly who those third parties are, you never know, nor what your information is being used for in the first place.
But sometimes the privacy policy adds a feel-good clause, saying something to the effect of “our data about you is completely anonymous.”
Not anymore, it isn’t.
Researchers have created an artificial intelligence that can use sets of anonymous data and the trends within that data to correctly pick out a targeted individual more than 50% of the time. (Admittedly, this took place in early 2022, but it’s something few know about.)
Specifically, they’ve done this with phone numbers.
The AI was fed a database of more than 40,000 phone numbers, along with background information on who each of those numbers contacted. When it was then tasked with picking out an individual from an anonymous dataset, it was able to find the target by analyzing all of the numbers that the anonymous number regularly contacted. People are creatures of habit, and because AI excels at finding patterns in harvested data, the system was able to effectively say, “This phone number likes to contact these four phone numbers quite a bit. Based on the data I already have on those four contact numbers, the anonymous data point that is contacting these four people is likely John Brown.”
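To make that idea concrete, here’s a minimal, hypothetical sketch of that kind of contact-pattern matching. Every number, name, and the simple overlap score below are invented for illustration; the actual research system was far more sophisticated than this.

```python
# Hypothetical illustration only: re-identifying an "anonymous" phone record
# by comparing who it contacts against known contact profiles. All data here
# is made up, and the simple overlap score stands in for the far more
# sophisticated statistical model used in the actual research.

# Background data already tied to real identities: person -> numbers they call often.
known_profiles = {
    "John Brown":  {"555-0101", "555-0102", "555-0103", "555-0104"},
    "Jane Smith":  {"555-0201", "555-0202", "555-0102", "555-0203"},
    "Alex Garcia": {"555-0301", "555-0302", "555-0303", "555-0104"},
}

# An "anonymized" record: no name attached, just which numbers it contacted and how often.
anonymous_call_counts = {
    "555-0101": 37,
    "555-0102": 21,
    "555-0103": 14,
    "555-0104": 9,
    "555-0999": 1,
}

def reidentify(anon_counts, profiles, min_calls=5):
    """Score each known person by how closely their known contacts
    overlap with the anonymous record's frequent contacts."""
    frequent = {num for num, count in anon_counts.items() if count >= min_calls}
    scores = {}
    for person, contacts in profiles.items():
        scores[person] = len(frequent & contacts) / len(frequent | contacts)
    return max(scores, key=scores.get), scores

best_guess, scores = reidentify(anonymous_call_counts, known_profiles)
print(best_guess)  # 'John Brown' -- his known contacts match the anonymous record's habits
```

The point isn’t the code itself; it’s that a stable habit, such as who you call and how often, is itself an identifier.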
You can be identified by your habits.
Of the research, University of Minnesota computer scientist Jaideep Srivastava said, “It’s no surprise that people tend to remain within established social structures and that these regular interactions form a stable pattern over time.”
I would add that it’s not just your social structures that are habitual. So are your buying patterns, your location patterns, your driving patterns, your entertainment patterns, your exercise patterns, your sleep patterns, and just about everything else.
Ultimately, this means that even when data is “anonymized,” it isn’t truly anonymous. Transhumanist Ray Kurzweil was spot on when he pointed to the drastic leaps in AI technology we’re going to see over the next few decades, and this type of technology is only going to grow more prevalent. Trends can be analyzed and used to determine who you really are.
What are some of the dangers of this?
Let’s say you regularly read the exact same five websites every day at the exact same time. This type of technology could potentially be used to determine who you are, even if you’re using a VPN. (A rough sketch of how that kind of matching might work follows these examples.)
Let’s say that you have a credit card addiction and are trying to dig your way out of deep debt. You’re trying hard, but third parties could now hit you with highly effective ads enticing you to spend money on things you don’t need, even though you’ve tried to stay as anonymous as possible.
Let’s say you use an alarm clock app that sells anonymous data to third parties, and you set your alarm for 5:47 every morning. Somebody could use this type of AI to figure out not only when you are sleeping but also when you are likely in your deepest stage of sleep.
Let’s say you’re in Russia and have posted something critical of Vladimir Putin online. Even if you thought you were remaining anonymous, this type of technology could easily be used against you to determine exactly who you are.
Let’s say you suffer from a rather embarrassing medical disorder that you would like to keep private. The entire marketing world could soon know that you have this condition, and that information could be purchased by anybody who wants to know more about you.
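As a rough illustration of the website-reading scenario above, here’s a hypothetical sketch of how a stable pattern of sites and times of day could be matched against previously observed browsing profiles, even when no name or IP address is attached. Every site, profile name, and visit count is invented for the example.

```python
# Hypothetical illustration only: matching an "anonymous" browsing session to a
# known profile purely by habit -- which sites are visited at which hour of day.
# Every site, profile, and visit count below is invented.

from collections import Counter

def habit_profile(visits):
    """Turn a list of (site, hour_of_day) visits into a normalized frequency profile."""
    counts = Counter(visits)
    total = sum(counts.values())
    return {slot: n / total for slot, n in counts.items()}

def overlap(profile_a, profile_b):
    """Shared probability mass across (site, hour) slots: 1.0 means identical habits."""
    slots = set(profile_a) | set(profile_b)
    return sum(min(profile_a.get(s, 0.0), profile_b.get(s, 0.0)) for s in slots)

# Profiles built earlier from browsing data that *was* tied to an identity.
known = {
    "reader_42": habit_profile([("news.example", 6)] * 20 +
                               [("finance.example", 7)] * 18 +
                               [("hobby.example", 21)] * 15),
    "reader_77": habit_profile([("sports.example", 12)] * 25 +
                               [("video.example", 22)] * 30),
}

# A new "anonymous" session behind a VPN: same few sites, same times of day.
anonymous = habit_profile([("news.example", 6)] * 5 +
                          [("finance.example", 7)] * 4 +
                          [("hobby.example", 21)] * 4)

scores = {who: overlap(anonymous, profile) for who, profile in known.items()}
print(max(scores, key=scores.get))  # 'reader_42' -- the habits give the session away
```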
Some of these results couldn’t be accomplished with this AI alone, but combine it with the many other AI methods out there, and you could very easily end up with exactly these outcomes.
Privacy sure isn’t what it used to be.
So be careful with what apps you download or use. Be careful with the websites you browse. When combined with a Chinese-style social credit score system (such as an ESG score), this type of power could easily be used for nefarious purposes.
And honestly, I’m not even sure what the best course of action is to avoid this type of AI. But you have to know what the problem is before you can start searching for solutions. Hopefully, this will help get that information out there to somebody who has a better fix for it than I do.
What are your thoughts? Were you aware of this kind of attack against personal privacy? What do you think is the best way to counter this? Let us know what you’re thinking in the comment section below.
About Aden
Aden Tate is a regular contributor to TheOrganicPrepper.com and TheFrugalite.com. Aden runs a micro-farm where he raises dairy goats, a pig, honeybees, meat chickens, laying chickens, tomatoes, mushrooms, and greens. Aden has five published books: What School Should Have Taught You, The Faithful Prepper, An Arm and a Leg, The Prepper’s Guide to Post-Disaster Communications, and Zombie Choices. You can find his podcast The Last American on Preppers’ Broadcasting Network.