Part of the problem with hugging is that it has become a social convention, rather than what it once was, which was an expression of genuine emotion.
There are some times when a hug is appropriate. Those times are when there’s a marriage proposal in the air or a body in the ground.
Hugging is for celebration, or comforting someone who’s had a setback. Hugging is not for noting that two people have both managed to meet at Chili’s after work. Being at Chili’s is not a cause for celebration, nor is it quite dire enough to require comforting.
An even more important rule is Men don’t hug. The only time men should hug is when male family members are observing a major life milestone, such as a major promotion, the safe return from overseas deployment, or noting a witty observation in the commentary audio track of Die Hard.
The only exception to these guidelines is if a man tells another man, “Boy, I could sure use a hug.” But he won’t say that, because he’s a man, so just stop with the male-on-male hugging.
To be serious, if I could: There are rules of physical distance, and there are meanings to breaches of those rules.
People of course do occasionally touch each other. But those touches have important communicative purposes precisely because of the general rule that we don’t touch each other.
There’s something a little child-like about hugging, too. It’s an innocent gesture — it’s intended to be so.
But it sort of ignores the adult-world meaning of intimate touching.
So I wonder if it’s somehow connected to a growing preference for Child World rules, and an increasing rejection of Adult World rules.
Ace, “Arms Are Not Made For Hugging”, Ace of Spades H.Q., 2014-10-10.
October 25, 2014
September 24, 2014
People who were charged with a crime in England used to be told by the police that they did not have to say anything, but that anything they did say might be taken down and used as evidence against them. I think we should all be given this warning whenever we use a mobile telephone.
Theodore Dalrymple, “Nowhere to Hide”, Taki’s Magazine, 2014-02-23
September 20, 2014
David Akin posted a list of questions posed by John Gilmore, challenging the Apple iOS8 cryptography promises:
Gilmore considered what Apple said and considered how Apple creates its software — a closed, secret, proprietary method — and what coders like him know about the code that Apple says protects our privacy — pretty much nothing — and then wrote the following for distribution on Dave Farber‘s Interesting People listserv. I’m pretty sure neither Farber nor Gilmore will begrudge me reproducing it.
And why do we believe [Apple]?
- Because we can read the source code and the protocol descriptions ourselves, and determine just how secure they are?
- Because they’re a big company and big companies never lie?
- Because they’ve implemented it in proprietary binary software, and proprietary crypto is always stronger than the company claims it to be?
- Because they can’t covertly send your device updated software that would change all these promises, for a targeted individual, or on a mass basis?
- Because you will never agree to upgrade the software on your device, ever, no matter how often they send you updates?
- Because this first release of their encryption software has no security bugs, so you will never need to upgrade it to retain your privacy?
- Because if a future update INSERTS privacy or security bugs, we will surely be able to distinguish these updates from future updates that FIX privacy or security bugs?
- Because if they change their mind and decide to lessen our privacy for their convenience, or by secret government edict, they will be sure to let us know?
- Because they have worked hard for years to prevent you from upgrading the software that runs on their devices so that YOU can choose it and control it instead of them?
- Because the US export control bureaucracy would never try to stop Apple from selling secure mass market proprietary encryption products across the border?
- Because the countries that wouldn’t let Blackberry sell phones that communicate securely with your own corporate servers, will of course let Apple sell whatever high security non-tappable devices it wants to?
- Because we’re Apple fanboys and the company can do no wrong?
- Because they want to help the terrorists win?
- Because NSA made them mad once, therefore they are on the side of the public against NSA?
- Because it’s always better to wiretap people after you convince them that they are perfectly secure, so they’ll spill all their best secrets?
There must be some other reason, I’m just having trouble thinking of it.
September 19, 2014
In the Guardian, Cory Doctorow says that we need privacy-enhancing technical tools that can be easily used by everyone, not just the highly technical (or highly paranoid) among us:
You don’t need to be a technical expert to understand privacy risks anymore. From the Snowden revelations to the daily parade of internet security horrors around the world – like Syrian and Egyptian checkpoints where your Facebook logins are required in order to weigh your political allegiances (sometimes with fatal consequences) or celebrities having their most intimate photos splashed all over the web.
The time has come to create privacy tools for normal people – people with a normal level of technical competence. That is, all of us, no matter what our level of technical expertise, need privacy. Some privacy measures do require extraordinary technical competence; if you’re Edward Snowden, with the entire NSA bearing down on your communications, you will need to be a real expert to keep your information secure. But the kind of privacy that makes you immune to mass surveillance and attacks-of-opportunity from voyeurs, identity thieves and other bad guys is attainable by anyone.
I’m a volunteer on the advisory board for a nonprofit that’s aiming to do just that: Simply Secure (which launches Thursday at simplysecure.org) collects together some very bright usability and cryptography experts with the aim of revamping the user interface of the internet’s favorite privacy tools, starting with OTR, the extremely secure chat system whose best-known feature is “perfect forward secrecy” which gives each conversation its own unique keys, so a breach of one conversation’s keys can’t be used to snoop on others.
More importantly, Simply Secure’s process for attaining, testing and refining usability is the main product of its work. This process will be documented and published as a set of best practices for other organisations, whether they are for-profits or non-profits, creating a framework that anyone can use to make secure products easier for everyone.
August 5, 2014
MIT, Adobe and Microsoft have developed a technique that allows conversations to be reconstructed based on the almost invisible vibrations of surfaces in the same room:
Researchers at MIT, Microsoft, and Adobe have developed an algorithm that can reconstruct an audio signal by analyzing minute vibrations of objects depicted in video. In one set of experiments, they were able to recover intelligible speech from the vibrations of a potato-chip bag photographed from 15 feet away through soundproof glass.
In other experiments, they extracted useful audio signals from videos of aluminum foil, the surface of a glass of water, and even the leaves of a potted plant. The researchers will present their findings in a paper at this year’s Siggraph, the premier computer graphics conference.
“When sound hits an object, it causes the object to vibrate,” says Abe Davis, a graduate student in electrical engineering and computer science at MIT and first author on the new paper. “The motion of this vibration creates a very subtle visual signal that’s usually invisible to the naked eye. People didn’t realize that this information was there.”
Reconstructing audio from video requires that the frequency of the video samples — the number of frames of video captured per second — be higher than the frequency of the audio signal. In some of their experiments, the researchers used a high-speed camera that captured 2,000 to 6,000 frames per second. That’s much faster than the 60 frames per second possible with some smartphones, but well below the frame rates of the best commercial high-speed cameras, which can top 100,000 frames per second.
I was aware that you could “bug” a room by monitoring the vibrations of a non-soundproofed window, at least under certain circumstances, but this is rather more subtle. I wonder how long this development has been known to the guys at the NSA…
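The sampling constraint the MIT researchers describe is the familiar Nyquist rule: to recover a tone of frequency f, the camera must capture more than 2f frames per second, or the tone "aliases" down to a false lower frequency. A minimal sketch (my own illustration, not the researchers' code) makes the 60 fps vs. high-speed-camera gap concrete:

```python
# Illustration of the Nyquist constraint quoted above: a sampled sinusoid
# at f_signal is indistinguishable from one at |f_signal - k * f_sample|
# for the integer k that minimizes that difference.

def aliased_frequency(f_signal, f_sample):
    """Apparent frequency of a pure tone after sampling at f_sample Hz."""
    k = round(f_signal / f_sample)
    return abs(f_signal - k * f_sample)

# A 1 kHz speech component captured at 6,000 fps survives intact...
print(aliased_frequency(1000, 6000))   # → 1000
# ...but at a smartphone's 60 fps it folds down to a 20 Hz rumble,
# which is why intelligible speech recovery needs a high-speed camera.
print(aliased_frequency(1000, 60))     # → 20
```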
July 19, 2014
It’s been my constant experience that laws that are purported to “protect” my privacy always seem to restrict me from being given information that doesn’t seem to merit extra protection (for example, my son’s university administration goes way out of its way to protect his privacy … to the point they barely acknowledge that I might possibly have any interest in knowing anything about him). The effect of most “privacy” laws is to allow bureaucrats to prevent outsiders from being given any information at all. Anything they don’t want to share now seems to be protected by nebulous “privacy concerns” (whether real or imaginary). It’s not just my paranoia, however, as Stewart Baker points out:
It’s time once again to point out that privacy laws, with their vague standards and selective enforcement, are more likely to serve privilege than to protect privacy. The latest to learn that lesson are patients mistreated by the Veterans Administration and the whistleblowers who sought to help them.
Misuse of privacy law is now so common that I’ve begun issuing annual awards for the worst offenders — the Privies. The Veterans Administration has officially earned a nomination for a 2015 Privy under the category “We All Got To Serve Someone: Worst Use of Privacy Law to Serve Power and Privilege.”
July 15, 2014
Tim Cushing wonders why we don’t seem to sympathize with the plight of poor, overworked law enforcement officials who find the crushing burden of getting a warrant for accessing your cell phone data to be too hard:
You’d think approved warrants must be like albino unicorns for all the arguing the government does to avoid having to run one by a judge. It continually acts as though there aren’t statistics out there that show obtaining a warrant is about as difficult as obeying the laws of thermodynamics. Wiretap warrants have been approved 99.969% of the time over the last decade. And that’s for something far more intrusive than cell site location data.
But still, the government continues to argue that location data, while possibly intrusive, is simply Just Another Business Record — records it is entitled to have thanks to the Third Party Doctrine. Any legal decision that suggests even the slightest expectation of privacy might have arisen over the past several years as the public’s relationship with cell phones has shifted from “luxury item/business tool” to “even grandma has a smartphone” is greeted with reams of paper from the government, all of it metaphorically pounding on the table and shouting “BUSINESS RECORDS!”
When that fails, it pushes for the lower bar of the Stored Communications Act [PDF] to be applied to its request, dropping it from “probable cause” to “specific and articulable facts.” The Stored Communications Act is the lowest bar, seeing as it allows government agencies and law enforcement to access electronic communications older than 180 days without a warrant. It’s interesting that the government would invoke this to defend the warrantless access to location metadata, seeing as the term “communications” is part of the law’s title. This would seem to imply what’s being sought is actual content — something that normally requires a higher bar to obtain.
Update: Ken White at Popehat says warrants are not particularly strong devices to protect your liberty and lists a few distressing cases where warrants have been issued recently.
We’re faced all the time with the ridiculous warrants judges will sign if they’re asked. Judges will sign a warrant to give a teenager an injection to induce an erection so that the police can photograph it to fight sexting. Judges will, based on flimsy evidence, sign a warrant allowing doctors to medicate and anally penetrate a man because he might have a small amount of drugs concealed in his rectum. Judges will sign a warrant to dig up a yard based on a tip from a psychic. Judges will kowtow to an oversensitive politician by signing a warrant to search the home of the author of a patently satirical Twitter account. Judges will give police a warrant to search your home based on a criminal libel statute if your satirical newspaper offended a delicate professor. And you’d better believe judges will oblige cops by giving them a search warrant when someone makes satirical cartoons about them.
I’m not saying that warrants are completely useless. Warrants create a written record of the government’s asserted basis for an action, limiting cops’ ability to make up post-hoc justifications. Occasionally some prosecutors turn down weak warrant applications. The mere process of seeking a warrant may regulate law enforcement behavior somewhat.
Rather, I’m saying that requiring the government to get a warrant isn’t the victory you might hope. The numbers — and the experience of criminal justice practitioners — suggest that judges in the United States provide only marginal oversight over what is requested of them. Calling it a rubber stamp is unfair; sometimes actual rubber stamps run out of ink. The problem is deeper than court decisions that excuse the government from seeking warrants because of the War on Drugs or OMG 9/11 or the like. The problem is one of the culture of the criminal justice system and the judiciary, a culture steeped in the notion that “law and order” and “tough on crime” are principled legal positions rather than political ones. The problem is that even if we’d like to see the warrant requirement as interposing neutral judges between our rights and law enforcement, there’s no indication that the judges see it that way.
July 6, 2014
The argument that you’ve got nothing to worry about because you’re not doing anything wrong has long since passed its best-before date. As Nick Gillespie points out, you don’t need to be a member of Al Qaeda, a black-hat hacker, or a registered Republican to be of interest to the NSA’s information gathering team:
If You’re Reading Reason.com, The NSA is Probably Already Following You
Two things to contemplate on early Sunday morning, before church or political talk shows get underway:
Remember all those times we were told that the government, especially the National Security Agency (NSA), only tracks folks who are either guilty of something or involved in suspicious-seeming activity? Well, we’re going to have to amend that a bit. Using documents from Edward Snowden, the Washington Post‘s Barton Gellman, Julie Tate, and Ashkan Soltani report:
Ordinary Internet users, American and non-American alike, far outnumber legally targeted foreigners in the communications intercepted by the National Security Agency from U.S. digital networks, according to a four-month investigation by The Washington Post.
Nine of 10 account holders found in a large cache of intercepted conversations, which former NSA contractor Edward Snowden provided in full to The Post, were not the intended surveillance targets but were caught in a net the agency had cast for somebody else.
Many of them were Americans. Nearly half of the surveillance files, a strikingly high proportion, contained names, e-mail addresses or other details that the NSA marked as belonging to U.S. citizens or residents. NSA analysts masked, or “minimized,” more than 65,000 such references to protect Americans’ privacy, but The Post found nearly 900 additional e-mail addresses, unmasked in the files, that could be strongly linked to U.S. citizens or U.S. residents.
The cache of documents in question dates from 2009 through 2012 and comprises 160,000 documents collected under the PRISM and Upstream programs, which collect data from different sources. “Most of the people caught up in those programs are not the targets and would not lawfully qualify as such,” write Gellman, Julie Tate, and Ashkan Soltani, who also underscore that NSA surveillance has produced some very meaningful and good intelligence. The real question is whether the government can do that in a way that doesn’t result in massive dragnet programs that create far more problems ultimately than they solve (remember the Church Committee?).
Read the whole thing. And before anyone raises the old “if you’re innocent, you’ve got nothing to hide” shtick, read Scott Shackford’s “3 Reasons the ‘Nothing to Hide’ Crowd Should be worried about Government Surveillance.”
June 17, 2014
Michael Geist talks about another court attempting to push local rules into other jurisdictions online — in this case it’s not the European “right to be forgotten” nonsense, it’s unfortunately a Canadian court pulling the stunt:
In the aftermath of the European Court of Justice “right to be forgotten” decision, many asked whether a similar ruling could arise in Canada. While a privacy-related ruling has yet to hit Canada, last week the Supreme Court of British Columbia relied in part on the decision in issuing an unprecedented order requiring Google to remove websites from its global index. The ruling in Equustek Solutions Inc. v. Jack is unusual since its reach extends far beyond Canada. Rather than ordering the company to remove certain links from the search results available through Google.ca, the order intentionally targets the entire database, requiring the company to ensure that no one, anywhere in the world, can see the search results. Note that this differs from the European right to be forgotten ruling, which is limited to Europe.
The implications are enormous since if a Canadian court has the power to limit access to information for the globe, presumably other courts would as well. While the court does not grapple with this possibility, what happens if a Russian court orders Google to remove gay and lesbian sites from its database? Or if Iran orders it to remove Israeli sites from the database? The possibilities are endless since local rules of freedom of expression often differ from country to country. Yet the B.C. court adopts the view that it can issue an order with global effect. Its reasoning is very weak, concluding that:
the injunction would compel Google to take steps in California or the state in which its search engine is controlled, and would not therefore direct that steps be taken around the world. That the effect of the injunction could reach beyond one state is a separate issue.
Unfortunately, it does not engage effectively with this “separate issue.”
June 13, 2014
Some great news on the privacy front, this time a decision handed down by the Supreme Court of Canada, as reported by Michael Geist:
This morning another voice entered the discussion and completely changed the debate. The Supreme Court of Canada issued its long-awaited R. v. Spencer decision, which examined the legality of voluntary warrantless disclosure of basic subscriber information to law enforcement. In a unanimous decision written by (Harper appointee) Justice Thomas Cromwell, the court issued a strong endorsement of Internet privacy, emphasizing the privacy importance of subscriber information, the right to anonymity, and the need for police to obtain a warrant for subscriber information except in exigent circumstances or under a reasonable law.
I discuss the implications below, but first some of the key findings. First, the Court recognizes that there is a privacy interest in subscriber information. While the government has consistently sought to downplay that interest, the court finds that the information is much more than a simple name and address, particularly in the context of the Internet. As the court states:
the Internet has exponentially increased both the quality and quantity of information that is stored about Internet users. Browsing logs, for example, may provide detailed information about users’ interests. Search engines may gather records of users’ search terms. Advertisers may track their users across networks of websites, gathering an overview of their interests and concerns. “Cookies” may be used to track consumer habits and may provide information about the options selected within a website, which web pages were visited before and after the visit to the host website and any other personal information provided. The user cannot fully control or even necessarily be aware of who may observe a pattern of online activity, but by remaining anonymous – by guarding the link between the information and the identity of the person to whom it relates – the user can in large measure be assured that the activity remains private.
Given all of this information, the privacy interest is about much more than just name and address.
Second, the court expands our understanding of informational privacy, concluding that there are three conceptually distinct issues: privacy as secrecy, privacy as control, and privacy as anonymity. It is anonymity that is particularly notable as the court recognizes its importance within the context of Internet usage. Given the importance of the information and the ability to link anonymous Internet activities with an identifiable person, a high level of informational privacy is at stake.
in the totality of the circumstances of this case, there is a reasonable expectation of privacy in the subscriber information. The disclosure of this information will often amount to the identification of a user with intimate or sensitive activities being carried out online, usually on the understanding that these activities would be anonymous. A request by a police officer that an ISP voluntarily disclose such information amounts to a search.
Fourth, having concluded that obtaining subscriber information constituted a search engaging a reasonable expectation of privacy, the court found that the information was unconstitutionally obtained, making the search unlawful. Addressing the impact of the PIPEDA voluntary disclosure clause, the court notes:
Since in the circumstances of this case the police do not have the power to conduct a search for subscriber information in the absence of exigent circumstances or a reasonable law, I do not see how they could gain a new search power through the combination of a declaratory provision and a provision enacted to promote the protection of personal information.
Update, 7 July: A few weeks later, the US Supreme Court also made a strong pro-privacy ruling, this one mandating a warrant for police to search the contents of a cellphone.
Politico‘s Josh Gerstein has more on the ruling in Riley v. California:
The Supreme Court’s blunt and unequivocal decision Wednesday giving Americans strong protection against arrest-related searches of their cell phones could also give a boost to lawsuits challenging the National Security Agency’s vast collection of phone call data.
Chief Justice John Roberts’s 28-page paean to digital privacy was like music to the ears of critics of the NSA’s metadata program, which sweeps up details on billions of calls and searches them for possible links to terrorist plots.
“This is a remarkably strong affirmation of privacy rights in a digital age,” said Marc Rotenberg of the Electronic Privacy Information Center. “The court found that digital data is different and that has constitutional significance, particularly in the realm of [the] Fourth Amendment…I think it also signals the end of the NSA program.”
Roberts’s opinion is replete with rhetoric warning about the privacy implications of access to data in individuals’ smart phones, including call logs, Web search records and location information. Many of the arguments parallel, or are virtually identical to, the ones privacy advocates have made about the dangers inherent in the NSA’s call metadata program.
May 27, 2014
Cory Doctorow sympathizes with young people who have literally grown up with the internet:
The problem with being a “digital native” is that it transforms all of your screw-ups into revealed deep truths about how humans are supposed to use the Internet. So if you make mistakes with your Internet privacy, not only do the companies who set the stage for those mistakes (and profited from them) get off scot-free, but everyone else who raises privacy concerns is dismissed out of hand. After all, if the “digital natives” supposedly don’t care about their privacy, then anyone who does is a laughable, dinosauric idiot, who isn’t Down With the Kids.
“Privacy” doesn’t mean that no one in the world knows about your business. It means that you get to choose who knows about your business.
It’s difficult to explain to people just how open their online “secrets” really are … and that’s not even covering the folks who are specifically targets of active surveillance … just being on Facebook or other social media sites hands over a lot of your personal details without your direct knowledge or (informed) consent. But you can start to take back some of your own privacy online:
If you start using computers when you’re a little kid, you’ll have a certain fluency with them that older people have to work harder to attain. As Douglas Adams wrote:
- Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
- Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
- Anything invented after you’re thirty-five is against the natural order of things.
If I was a kid today, I’d be all about the opsec — the operational security. I’d learn how to use tools that kept my business between me and the people I explicitly shared it with. I’d make it my habit, and get my friends into the habit too (after all, it doesn’t matter if all your email is encrypted if you send it to some dorkface who keeps it all on Google’s servers in unscrambled form where the NSA can snaffle it up).
Here’s some opsec links to get you started:
- First of all, get a copy of Tails, AKA “The Amnesic Incognito Live System.” This is an operating system that you can use to boot up your computer so that you don’t have to trust the OS it came with to be free from viruses and keyloggers and spyware. It comes with a ton of secure communications tools, as well as everything you need to make the media you want to send out into the world.
- Next, get a copy of The Tor Browser Bundle, a special version of Firefox that automatically sends your traffic through something called TOR (The Onion Router, not to be confused with Tor Books, who publish my novels). This lets you browse the Web with a much greater degree of privacy and anonymity than you would otherwise get.
- Learn to use GPG, which is a great way to encrypt (scramble) your emails. There’s a Chrome plugin for using GPG with Gmail, and another version for Firefox.
- If you like chatting, get OTR, AKA “Off the Record,” a very secure private chat tool that has exciting features like “perfect forward secrecy” (this being a cool way of saying, even if someone breaks this tomorrow, they won’t be able to read the chats they captured today).
Once you’ve mastered that stuff, start to think about your phone. Android phones are much, much easier to secure than Apple’s iPhones (Apple tries to lock their phones so you can’t install software except through their store, and because of a 1998 law called the DMCA, it’s illegal to make a tool to unlock them). There are lots of alternative operating systems for Android, of varying degrees of security. The best place to start is Cyanogenmod, which makes it much easier to use privacy tools with your mobile device.
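Doctorow’s parenthetical definition of “perfect forward secrecy” — every conversation gets its own keys, so breaking one session reveals nothing about the others — can be sketched with an ephemeral Diffie-Hellman exchange. This is a toy illustration of the idea, NOT real cryptography (the prime is far too small for actual use, and real protocols like OTR add authentication on top):

```python
# Toy sketch of forward secrecy: each session runs a fresh ephemeral
# Diffie-Hellman exchange, then discards the secret exponents, so a key
# captured today says nothing about yesterday's or tomorrow's traffic.
import hashlib
import secrets

P = 2**127 - 1   # a Mersenne prime; real protocols use much larger groups
G = 3            # generator

def new_session_key():
    """Run one ephemeral DH exchange and return the derived session key."""
    a = secrets.randbelow(P - 2) + 1   # Alice's one-time secret exponent
    b = secrets.randbelow(P - 2) + 1   # Bob's one-time secret exponent
    A = pow(G, a, P)                   # public values sent over the wire
    B = pow(G, b, P)
    shared_alice = pow(B, a, P)        # both sides derive the same secret
    shared_bob = pow(A, b, P)
    assert shared_alice == shared_bob
    # a and b go out of scope here; only the hashed session key survives.
    return hashlib.sha256(str(shared_alice).encode()).hexdigest()

key_today = new_session_key()
key_tomorrow = new_session_key()
# Fresh randomness every session: compromising one key doesn't unlock the rest.
print(key_today != key_tomorrow)   # → True
```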
May 26, 2014
May 24, 2014
Published on 22 May 2014
One of the most talked about technology tradeoffs today is the question of how much privacy we give up to live in a world of convenience, speed and intelligence. We’re now less anonymous than many people are aware of or comfortable with, and headline-grabbing stories like the Heartbleed Bug don’t provide much reassurance for those of us seeking comfort around data privacy. How can we balance our need for anonymity with the incredible benefits of our connected world? World-class Internet privacy expert Dr. Michael Geist helps us understand which current surveillance and privacy issues should be on your mind.
Dr. Michael Geist is a law professor at the University of Ottawa where he holds the Canada Research Chair in Internet and E-commerce Law. He has obtained a Bachelor of Laws (LL.B.) degree from Osgoode Hall Law School in Toronto, Master of Laws (LL.M.) degrees from Cambridge University in the UK and Columbia Law School in New York, and a Doctorate in Law (J.S.D.) from Columbia Law School. Dr. Geist is an internationally syndicated columnist on technology law issues with his regular column appearing in the Toronto Star and the Ottawa Citizen. Dr. Geist is the editor of From “Radical Extremism” to “Balanced Copyright”: Canadian Copyright and the Digital Agenda (2010) and In the Public Interest: The Future of Canadian Copyright Law (2005), both published by Irwin Law, the editor of several monthly technology law publications, and the author of a popular blog on Internet and intellectual property law issues.
Dr. Geist serves on many boards, including the CANARIE Board of Directors, the Canadian Legal Information Institute Board of Directors, the Privacy Commissioner of Canada’s Expert Advisory Board, the Electronic Frontier Foundation Advisory Board, and on the Information Program Sub-Board of the Open Society Institute. He has received numerous awards for his work including the Kroeger Award for Policy Leadership and the Public Knowledge IP3 Award in 2010, the Les Fowlie Award for Intellectual Freedom from the Ontario Library Association in 2009, the Electronic Frontier Foundation’s Pioneer Award in 2008, and CANARIE’s IWAY Public Leadership Award for his contribution to the development of the Internet in Canada, and he was named one of Canada’s Top 40 Under 40 in 2003. In 2010, Managing Intellectual Property named him one of the 50 most influential people in intellectual property in the world.
If you care about your privacy, you’re equally worried about the intrusive surveillance state and the unconstrained snooping of corporations. You may now need to worry about your snoopy neighbours also getting in on the act, as Declan McCullagh explains on Google+. This is a response to someone on a private mailing list for Silicon Valley folks, who said that he had no issue with automated collection of license plate data:
Tomorrow one of your PV [Portola Valley] neighbors will set up a computer-connected camera on private property and aimed at the street. It records all those “plates exposed” going by and, by doing optical character recognition with free software such as ANPR MX (C# code, BSD-licensed), it records every time a car goes by. The DMV will happily provide drivers’ names based on the license plate*; there’s even a process for “bulk quantities” of data.** That information doesn’t include a home address, but that’s easy to come by through other searches.
Then the neighbor launches PVPeopleTracker.com. It updates in real time showing whenever someone is at home, and marks their house in bright red if they’re gone on an extended trip. If there are odd patterns of movement compared to a baseline — perhaps suspicious late-night outings — those can be flagged as well. Any visitor to PVPeopleTracker.com can sign up for handy free email alerts reporting at what time their targeted house becomes vacant each weekday morning. Other network-linked cameras in PV can supplement the PVPeopleTracker.com database, so that everyone driving in town will have their movements monitored, archived, and publicly visible at all times.
With more than one network-linked camera separated by a known distance by roads with known speed limits, it would be simple to calculate speeding violations and send automated alerts, with MP4 videos attached as evidence, to the sheriff and CHP. PVPeopleTracker.com can also be cross-referenced against databases showing, say, marijuana convictions; if your movement profile matches a known drug trafficker, law enforcement can be alerted. (Sorry about those false positives!)
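The speeding-violation piece of McCullagh’s scenario is just arithmetic: two cameras a known distance apart each timestamp the same plate, and average speed over the gap falls out directly. A hypothetical sketch (the function name and numbers are my own illustration, not from any real ANPR product):

```python
# Hypothetical sketch of the two-camera speed check described above:
# the same plate is timestamped at two cameras a known distance apart,
# and average speed over the gap is distance divided by elapsed time.

def average_speed_mph(distance_miles, t_first, t_second):
    """Average speed between two camera sightings (timestamps in seconds)."""
    hours = (t_second - t_first) / 3600.0
    return distance_miles / hours

# Plate seen at camera A, then 60 seconds later at camera B one mile away:
speed = average_speed_mph(1.0, 0.0, 60.0)
print(speed)        # → 60.0 mph
print(speed > 35)   # → True: over a 35 mph limit, so the automated alert fires
```

Note that this only yields *average* speed between the cameras, which is exactly why paired-camera enforcement is harder to beat than a single radar point.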
May 23, 2014
“Mammals don’t respond well to surveillance. We consider it a threat. It makes us paranoid, and aggressive and vengeful”
Angelique Carson reports on a recent IAPP Canada Privacy Symposium presentation:
If marine biologist-turned-best-selling author Peter Watts is an expert on anything, it’s mammals. Speaking to 400 or so privacy pros and regulators gathered last week at the IAPP Canada Privacy Symposium to talk privacy and data protection, he used that experience to send a rather jarring — and anything but conventional — message:
Mammals don’t respond well to surveillance. We consider it a threat. It makes us paranoid, and aggressive and vengeful. But we’ll never win against the giant corporations and governments that watch us, Watts argued, so all we can develop is a surefire defense.
Think “scorched earth.” If we can’t protect the data, Watts posited, maybe we should burn it to the ground.
Hear him out: Mammals will always respond to the surveillance threat as they would any threat — with aggression, in the same way the natural selection process has shaped every other life form on this planet.
“Anybody who thinks their own behavior isn’t at least partly informed by those legacy circuits has not been paying attention,” he said.
Watts pointed to author David Brin’s assertion during his keynote recently at the IAPP’s Global Privacy Summit that while our instinct is to pass a law aimed at telling governments and corporations to “stop looking” at us, we should instead turn our gaze to them in the name of reciprocity.
“It’s not telling them do not look,” Brin said during his speech. “It’s looking back.”
But Edward Snowden is currently living in Russia after he tried to “look back.” And as someone who’s worked a lot in the past with mammals, Watts knows that, biologically, looking back is a bad idea: “To get into a staring contest with a large, aggressive, territorial mammal primed to think of eye contact as a threat display … I can’t recommend it.”
“Natural selection favors the paranoid,” Watts said.
H/T to Bruce Schneier for the link.