"Minority Report" Meets Homeland Security
In a disturbing parallel to George Orwell’s Thought Police, the Department of Homeland Security has started to test its latest privacy invasion tool: Future Attribute Screening Technology (FAST), an airport screener that “senses” if you are about to commit a crime, with an estimated 70% accuracy, reminiscent of the movie Minority Report. With a machine that reads your pulse, pupil dilation, facial features, and more, we have reached the point where national security no longer justifies the privacy breaches made in its name. It is time for a new tack. The government should mimic an unlikely crop of privacy heroes — tech companies like Facebook and Google — and give the public a choice to limit Big Brother’s all-seeing eye.
Tech companies offer us what the government does not: options. Both Facebook and Google have been willing to compromise on past privacy concerns, because without users’ confidence in their products, they have nothing to sell. To maintain that trust, they allow users to opt out of some of their most egregious privacy violations, whereas the government never considers a similar route. Business interests evaluate, react, and proceed with caution (sometimes pulling the product); the government simply decrees.
Google and Facebook have modified their new features after receiving poor press and protest from users. The 2007 experiment with Beacon — where Facebook shared information about users’ purchases on third-party websites with friends — was a complete failure and later dropped. While the company will not remove the heavily criticized Tag Suggestions feature, which automatically recognizes friends in pictures, it has apologized and allows users to opt out.
Facebook’s gaffes in privacy protection only emphasize the importance of flexibility in business. While it has continuously pushed the limits of traditional privacy notions, the key takeaway is Facebook’s willingness to step back when the public balks.
Meanwhile, Google decided to shelve its facial recognition project entirely. Google has drawn scrutiny from privacy advocates in the past, which is probably why it chose not to release the latest feature. Eric Schmidt, the executive chairman, rightfully feared that the combination of face recognition and mobile tracking could be used “in a very bad way.” It was a good choice. As a technology available on the global market, everyone from nefarious dictators to sexual predators would have had access to a dangerous tool.
For all of Facebook and Google's faults, at least they recognize the importance (and legal issues) of having users’ consent to share information. Google actually proves to be the greater hero, showing foresight, honesty, and appropriate caution about the product.
Unfortunately, public opinion carries less weight in matters of national security. Since Homeland Security is testing FAST in secret — without people’s knowledge or consent — we have little choice but to accept the monitoring. Companies may respond sensitively to public outcry, but the TSA’s and the government’s usual answer amounts to “deal with it or don’t fly.”
That is not a fair choice. FAST too closely resembles a “medical exam” that is “substantially more invasive than screening in airports,” according to John Verdi of the Electronic Privacy Information Center. Furthermore, FAST could reveal physical conditions and high stress levels: information our government has no right to know.
FAST was created to be used virtually anywhere; are we supposed to choose between privacy and entertainment when it begins to appear at concert and sports venues? This kind of surveillance technology is a slippery slope, and it can easily become a danger to the very people it supposedly protects. The private sector recognizes this; when it does not, it at least responds to public outcry. But under the guise of public interest, we are expected to take the government’s word that it has our best interests at heart. Comparing the public and business sectors, I fear the government has a long way to go in preserving our privacy.
Photo Credit: Wikimedia Commons