This company proposes to store highly sensitive information on the blockchain, a technology that offers no way to delete data and works by copying your data across thousands of computers.
Find anyone on the popular Russian social network VKontakte simply by taking a picture of their face. The app has already led to stalking and vigilantism. The city of Moscow wants to integrate it into all of its security cameras.
A lot of new security companies focus on 'insider threats'. RedOwl takes data from employees, including their emails and social media feeds, and generates a list of the ones you should keep an eye on, including those who might leak to the press.
Their algorithms look at what you've posted online and deduce your psychological profile.
A credit scorer and data broker that has not only created intimate profiles of people, but had such poor security that its databases were hacked in 2017. Sensitive data on millions of Americans is now out in the open.
This CIA-backed, multi-billion-dollar company offers military-grade data analysis, surveillance and prediction to governments, police forces, cities and companies. They are considered to be as powerful as Google, but the creepy thing is: you've likely never heard of them.
This company works for landlords. Renters are "...required to grant it full access to your Facebook, LinkedIn, Twitter and/or Instagram profiles. From there, Tenant Assured scrapes your site activity, including entire conversation threads and private messages; runs it through natural language processing and other analytic software; and finally, spits out a report that catalogs everything from your personality to your 'financial stress level'."
"If you don't pay for it, you are the product". Facebook creates detailed profiles on both its users and on people who don't have an account. It's a dream come true for any spy agency: the world's biggest people database, where people report on themselves and on each other. Their algorithmic newsfeed has a huge influence on how its users see the world, allowing companies and governments to manipulate their perception, and even their mood.
Did you know Samsung also makes automated sentry guns? These guns are deployed at the Korean border and can kill people autonomously. Photo by MarkBlackUltor
One of a growing number of companies that offer algorithmic screening to companies looking to hire new employees. Will companies create their own filter bubbles for people?
Remember the classic board game "Guess Who?"? This is a Facebook version in which they propose that you answer disturbing questions about mutual friends, like "are they in a healthy relationship?" or "are they gay?".
This company claims to do DNA Phenotyping - recreating what someone's face looks like from just their DNA. The question: could you convict someone of a crime based on this technology?
This company creates detailed psychological profiles from your data, and then uses them to influence elections. They know which messages you will be most susceptible to. They worked on the Trump campaign.
This company aggregates your reputation from various online sources to create one single score. Just like the Chinese government is doing. These reputation systems amplify social pressure, leading to an increase in censorship and risk avoidance.
This face recognition company's software is used in public spaces like stadiums, and matches 20 million faces per second. It can track people as they move from store to store, and alert security personnel.
Can you boil someone down to a single score? Upstairs.me wants us all to rate each other. On top of that, they want to store it all in a blockchain, making them less accountable and making it impossible to ever really delete any data. This invites abuse and creates enormous social pressure, which will lead to a culture of self-censorship. This violently ignorant startup is probably illegal in the EU.
By placing highly sensitive microphones and cameras in your home, you allow Amazon to create more detailed profiles about your life. Having them in your house also has security implications. If you buy this device, you are also making a choice about the privacy of those around you - any voice Alexa picks up will be fingerprinted, allowing Amazon to track people as they visit other places where Alexa is installed.
The list of Uber's unethical practices is long: showing fake data to regulators, requesting and then cancelling rides from their competitor Lyft, rampant sexual harassment, repeatedly ignoring local laws, claiming their workers are contractors, and generally contributing to the rise of the 'precariat': people who make little money and have very little job security (a situation that breeds extremist thought).
This company claims their algorithms only need a picture of your face to detect whether you're an academic, a poker player or even a terrorist.
You get a better deal on insurance as long as you and your friends don't make any claims. This increases the social pressure not to make claims, which could lead to ugly situations. And a core idea behind insurance is solidarity with those less fortunate: segmenting the market could mean higher premiums for the poor.
This is a ranking of the creepiest companies and startups that collect your data or use algorithms in dubious ways.