Big Brother

Schneier on Bush’s illegal wiretaps

From Bruce Schneier’s Crypto-Gram, in a recent post comparing Bush’s recent (and continuing!) wiretapping to Project Shamrock in the 1960s:

Bush’s eavesdropping program was explicitly anticipated in 1978, and made illegal by FISA. There might not have been fax machines, or e-mail, or the Internet, but the NSA did the exact same thing with telegrams.

We can decide as a society that we need to revisit FISA. We can debate the relative merits of police-state surveillance tactics and counterterrorism. We can discuss the prohibitions against spying on American citizens without a warrant, crossing over that abyss that Church warned us about twenty years ago. But the president can’t simply decide that the law doesn’t apply to him.

This issue is not about terrorism. It’s not about intelligence gathering. It’s about the executive branch of the United States ignoring a law, passed by the legislative branch and signed by President Jimmy Carter: a law that directs the judicial branch to monitor eavesdropping on Americans in national security investigations.

It’s not the spying, it’s the illegality.

Personally, I think it’s the illegality and the spying, but in the name of keeping the debate clear I’m happy to keep the two arguments separate.


Big Brother Down Under

From the Sydney Morning Herald:

Jane, from Coogee, was surprised to find three police on her bus asking to inspect mobile phones. Each took a phone at random and scrolled through messages for five or ten minutes. Everyone obeyed. “The people were perfectly friendly about it,” she said. “I thought it was a bit weird and a breach of privacy. But I didn’t say anything. Nobody did.”

No, it’s not about terrorism, it’s about potential racial violence, but it’s still that nasty abuse-of-rights-in-the-name-of-safety-from-unknown-boogeymen vibe. Of course, such flagrant violations of our rights without a court order could never happen in the US. In the US, we’d never even know they’d read our text messages without a court order until we read about it in the New York Times.

(Thanks to Omri for the link.)


Forget-Me-Not Panties


Wondering why your daughter, wife or girlfriend stays out so late? Wonder no more with new forget-me-not panties, the underwear that gives her comfort and you peace of mind:

These panties will monitor the location of your daughter, wife or girlfriend 24 hours a day, and can even monitor their heart rate and body temperature…

These “panties” can trace the exact location of your woman and send the information, via satellite, to your cell phone, PDA, and PC simultaneously! Use our patented mapping system, pantyMap®, to find the exact location of your loved one 24 hours a day.

Brought to you by The Contagious Media Project, the brilliant minds that also created the Black People Love Us site and the Fundrace Neighbor Search.

(Thanks to Dan on the wearables list for the link.)


…somewhere in Washington enshrined in some little folder…

Did you fly in June 2004? If so (or if your name is similar to that of somebody who did), then the Transportation Security Administration may have secretly collected information about you from airline reservation systems and credit bureaus. Wouldn’t it be nice to find out what they know?

Luckily, we still live in a free country — you can just ask! EFF is making it easy to do just that, and you can help them reverse-engineer exactly what the TSA has been up to at the same time.


Sign of the times

I took this photo last week at the entrance to MARTA, Atlanta’s subway system.

It wasn’t so long ago that I would have taken a picture like this one in a foreign country as a reminder of how different life would be without our Bill of Rights. It’s amazing how quickly we’re letting it slip away from us…


Thoughts on privacy and David Brin

For years science-fiction author David Brin has been preaching that privacy as we know it is essentially dead, and that rather than mourn our loss of shadow we should embrace the light — and make sure it shines in the bedrooms of power as much as it shines in our own. “The Cameras Are Coming!” has been his battle cry.

I remember hearing Brin speak at the Media Lab sometime in the late 90s and thinking he was completely off the mark if he thought ubiquitous lack of privacy was anything but trouble — I saw it as giving an expert marksman (powerful individuals, companies and governments) and someone who has never held a gun before (us peons) the same high-end rifle and saying “there you go, now you’re both equal.”

I’ve not gone completely over to Brin’s position, but events in the intervening years have brought me a little closer. First, I’ve seen no sign of privacy erosion even slowing down, and every sign that “information wants to be free and unfettered” is becoming a new physical law for the 21st century. (In the spirit of Free as in beer and Free as in freedom, this would be the Free as in virus point of view.) The same forces that erode top-down power and barriers to free expression are the forces that erode our privacy — I can’t think one is inevitable without accepting the other as well. Second, things like the Abu Ghraib scandal give me at least a little hope that light will occasionally leak into even the more protected dens, and that we peons are slowly learning how to shoot. I’m not totally convinced by any stretch (Abu Ghraib, I’ll point out, has so far only led to punishment of low-level participants), but it’s something.

I came in halfway through Brin’s talk in the opening debate at CFP, but I did note one quote I especially liked (slightly paraphrased here):

Give the watchdog better glasses and more freedom, then yank the choke chain to make sure it remembers that it’s a dog and not a wolf.

The fundamental question for every free society is how to ensure we keep hold of that choke chain. Shining light in the bedrooms of power is one part of the answer, I think, but it’s not enough.

I have some thoughts on what else is needed, but they involve questions about free will — and anyone who’s heard me rant in person on the topic knows I’d never get to sleep if I started down that path tonight…

(For some related reading take a look at Stafanos’ response to my post about privacy that got me thinking about Brin again.)


Threats from lack of privacy

Tonight’s keynote by Daniel Solove (author of The Digital Person) fell on the privacy-as-a-means-to-an-end side of the debate, though he focused almost entirely on one danger: identity theft.

Personally I think identity theft is one of the biggest boons to privacy advocates in the past decade, because it finally answers the question “why should I care about privacy if I don’t have anything to hide?” There are several other classes of threat, though, that I think are equally important:

  • Direct threat: using what you know to directly cause me harm. Identity theft is one example, but so is using my contact information to spam or telemarket to me, using my location to know when to rob my house, and using personal information to create false trust when selling me something.
  • Profiling: punishing or restricting people with a set of features that are benign in their own right, but that are perceived as correlating with features that are undesirable. Racial profiling is the obvious example: there’s nothing wrong with being black or Hispanic, but because these races are perceived as being “more likely to commit crime,” people of those races are singled out for extra hassle and restrictions. Age, religion and gender discrimination are other examples. Data mining brings profiling to a whole new level: now you can be harmed not only because of obvious traits like your race, gender or age, but also because of subtle things like your purchase habits, where you travel, who you know, what you read and what your politics are. This is unfair to the individuals singled out, and it harms society by dissuading activity we would rather allow, simply because that activity is sometimes correlated with activity we don’t want.
  • Cherry picking: breaking society’s risk-pooling safety systems (i.e. insurance) by giving insurance companies enough data to cherry-pick only the safest people. For the individual, insurance is a way to pool risk so that a catastrophic illness or event doesn’t wipe you out. For an insurance company, insurance is like being a casino owner: they profit by setting the premium a little higher than the expected payout. If the insurance company has enough information to completely predict who will get sick and who won’t, that’s like playing poker against a psychic — they always fold before you get to bet on a good hand, and take your money when you have a bad one. Of course there’s never enough data to completely predict who will get sick and who won’t, but every extra bit of predictive power takes us further from the ability to effectively pool our risk. (The toy sketch below illustrates the effect.)
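
To make the cherry-picking point concrete, here is a toy Python sketch. All the numbers and the noise model are invented for illustration, not drawn from any real insurer; “better prediction” is modeled simply as the insurer seeing each person’s true risk plus less and less noise and refusing to cover anyone who looks riskier than average. The better the prediction, the cheaper the insurer’s hand-picked pool becomes, and the more the rejected group would have to pay to pool risk on their own.

# Toy simulation of how predictive power undermines risk pooling.
# All numbers are hypothetical and chosen only for illustration.
import random

random.seed(0)

N = 100_000
CLAIM = 100_000                                   # cost of a catastrophic event
true_risk = [random.uniform(0.0, 0.02) for _ in range(N)]   # annual probability

def premium(risks, margin=1.05):
    """Break-even premium (plus a small margin) if this group pools together."""
    return margin * sum(r * CLAIM for r in risks) / len(risks)

def cherry_pick(risks, noise):
    """Keep only the people whose estimated risk looks below average."""
    avg = sum(risks) / len(risks)
    kept, rejected = [], []
    for r in risks:
        (kept if r + random.gauss(0, noise) < avg else rejected).append(r)
    return kept, rejected

print(f"everyone pooled together: ${premium(true_risk):,.0f}/year")

for noise in (0.02, 0.005, 0.0):                  # smaller noise = better prediction
    kept, rejected = cherry_pick(true_risk, noise)
    print(f"noise={noise}: insurer covers {len(kept)} people at "
          f"${premium(kept):,.0f}; the {len(rejected)} people it rejects "
          f"would need ${premium(rejected):,.0f} to pool on their own")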


Computers, Freedom, and PRIVACY

One disappointment I have about CFP is how privacy (step two of the privacy chain I talked about last post) is overshadowing discussion about freedom. I think privacy is important and worth fighting to protect, but I mostly see privacy as a way to keep others from gaining power over me (and thus becoming able to harm me) rather than as an end in itself. Sure I’d rather not have people posting nude pictures of me on the net, but I’m a lot more concerned that information collected about me isn’t used to steal my identity or deny me a loan, employment or insurance. The debate between privacy-as-means-to-an-end folk like me and privacy-as-intrinsically-valuable folk has played itself out several times over the past few days.


RFID Passports: why contactless?

Just had a panel on Privacy Risks of New Passport Technologies, discussing among other things the new RFID tag the US is rolling out for passports in the coming months. The tags will contain a digitally signed copy of your photo plus all the information on your data page except the signature, and will be readable at a distance. The readers are designed to read chips from about ten centimeters away, but the danger is that it’s possible to design devices that read the tag from longer distances. The exact distances possible aren’t clear to me, but a speaker from the ACLU demonstrated reading a passport with the type of RFID being used from three to four feet away. The State Department is now promising the passport cover will include a Faraday cage to prevent reading when the passport is closed, but that won’t help when the passport is open.

The dangers really boil down to someone snooping or stealing one’s identity at a distance without one’s knowledge or consent:

  1. Skimming: a terrorist, spy or criminal can lurk near a hotel or airport check-in desk and read the identities of people checking in. They can use this information to pick out victims, or to learn who gathers at a particular meeting or site.
  2. Cloning: reading people’s passport info at a distance and using that information to create a copy. To be effective, you’d need to clone the passport of someone who looks like the person who will eventually use the forgery, since the picture can’t be changed without invalidating the digital signature. (See the sketch after this list.)
  3. Tracking: if an ID chip isn’t contained in a Faraday cage, it could be used to track people as they walk past readers distributed throughout a shopping center, neighborhood or city. This wouldn’t be possible with passports (they say), but there has been talk among policymakers of extending the RFID chip to driver’s licenses and other forms of ID.
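
To illustrate the cloning point, here is a minimal Python sketch. It is not the real passport protocol: real e-passports carry a public-key signature from the issuing state, and the HMAC, key and data fields below are stand-ins chosen only so the example runs with the standard library. The point it shows is that a signature over the chip contents stops photo-swapping but does nothing to stop a verbatim copy.

# Minimal sketch (NOT the real passport protocol) of why a signature over the
# chip data blocks photo-swapping but not wholesale cloning. The HMAC stands
# in for the issuing state's public-key signature; key and fields are made up.
import hashlib
import hmac

ISSUER_KEY = b"issuing-state-signing-key"          # hypothetical stand-in key

def sign(chip_data: bytes) -> bytes:
    return hmac.new(ISSUER_KEY, chip_data, hashlib.sha256).digest()

def verify(chip_data: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(chip_data), signature)

# Original chip contents: data-page fields plus the holder's photo.
chip_data = b"name=Alice Example|dob=1970-01-01|photo=<jpeg bytes>"
signature = sign(chip_data)

# 1. Swapping in a new photo invalidates the signature.
tampered = chip_data.replace(b"photo=<jpeg bytes>", b"photo=<attacker jpeg>")
print("photo-swapped passport verifies:", verify(tampered, signature))      # False

# 2. A bit-for-bit copy of data plus signature still verifies, which is why a
#    clone is only useful to someone who already resembles the real holder.
cloned_data, cloned_signature = chip_data, signature
print("cloned passport verifies:", verify(cloned_data, cloned_signature))   # True

Contactless reading just makes that verbatim copy easier to obtain without the holder ever noticing.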

These sound like pretty big flaws in something that is, in theory, designed to make us safer, and all of them would be solved by simply requiring the data to be read through physical contact. The lone proponent on the panel was Deputy Assistant Secretary of State for Passport Services Frank Moss. I was rather unimpressed with his answers — many parts sounded like a song and dance surrounded by apologies for not really understanding the technology (and thus not being able to explain any details). However, he did answer the one main question I had: why the heck did the US push so hard for passports that could be read at a distance? His answer seems to boil down to this: it was cheaper and a little more flexible. Specifically:

  1. Passport manufacturers said it would be cheaper to change their processes to include RFID chips than contact-requiring chips.
  2. Different countries want different designs, and rather than specify a single location for a contact point, it was easier to just embed a contactless RFID chip.

I’m sympathetic to the difficulties in standardizing over a hundred national documents, but that’s a piss-poor excuse given the potential security holes it opens up. The follow-up argument of “we were stupid when we pushed for it, but it’s too late now so tough” is equally unacceptable in my mind.

Update 4/14/05: Ed Felten at Freedom to Tinker was at the same panel and has posted his own summary. His conclusion about the reason we’re getting stuck with a contactless system is in line with my own: “In short, this looks like another flawed technology procurement program.”


The privacy chain

After a couple of days soaking in privacy issues I’m starting to break everything into a three-part chain: identification, information and actions. (Appropriately enough for this conference, these are fairly well associated with computers, privacy and freedom respectively.)

  1. Identification: ability to identify an individual person or class of person. Includes face recognition, mandatory ID cards, DNA, iris scanners, retinal scanners, thumbprint, spyware, phone-home DRM, RFID chips in your clothing and other “Things That Fink,” etc., as well as obvious things like racial profiling and having someone sign their name.
  2. Information/Databases: access to information about those people or class of people. Medical, criminal, financial, your race/culture/religion, consumer preference data, where you’ve been, who you know, who you talk to, what you say…
  3. Actions: what people with access to this information do. Some actions are good for the identified person or for society (completing financial transactions, stopping crime and terrorism, etc.). Many are bad, including police harassment of a particular race or religion, suppression of political dissent and of political activists’ travel, identity theft, scam games, red-lining, employment and insurance discrimination, price differentiation, loss of social reputation, and coercive advertising.

Many people have a visceral negative reaction to someone simply knowing too much about them, but the consequences are mostly in step 3 — that’s where you get stung. That said, sometimes the best way to stop something bad from happening in step 3 is to stop steps 1 or 2 from happening, and often you never even find out that you didn’t get a loan or a job due to a privacy violation.
