Quotulatiousness

November 6, 2016

Canadian intelligence agencies and domestic overreach

Filed under: Cancon, Law, Liberty — Nicholas @ 02:00

Michael Geist on the drumbeat of revelations — but less outrage than you’d expect — about the extent of surveillance being conducted within Canada by CSIS and law enforcement organizations:

In the aftermath of the Snowden revelations in which the public has become largely numb to new surveillance disclosures, the Canadian reports over the past week will still leave many shocked and appalled. It started with the Ontario Provincial Police mass text messaging thousands of people based on cellphone usage from nearly a year earlier (which is not government surveillance per se but highlights massive geo-location data collection by telecom carriers and extraordinary data retention periods), continued with the deeply disturbing reports of surveillance of journalists in Quebec (which few believe is limited to just Quebec) and culminated in yesterday’s federal court decision that disclosed that CSIS no longer needs warrants for tax records (due to Bill C-51) and took the service to task for misleading the court and violating the law for years on its metadata collection and retention program.

The ruling reveals a level of deception that should eliminate any doubts that the current oversight framework is wholly inadequate, and it raises questions about Canadian authorities’ commitment to operating within the law. The court found a breach of a “duty of candour” (which most people would typically call deception or lying) and raised the possibility of a future contempt of court proceeding. While CSIS attempted to downplay the concern by noting that the data collection in question – metadata involving a wide range of information used in a massive data analysis program – was collected under a court order, simply put, the court found that the retention of the data was illegal. Further, the amount of data collection continues to grow (the court states the “scope and volume of incidentally gathered information has been tremendously enlarged”), leading to the retention of metadata that is not part of an active investigation but rather involves non-threat, third-party information. In other words, it is precisely the massive, big-data metadata analysis program feared by many Canadians.

The court ruling comes after the Security Intelligence Review Committee raised concerns about CSIS bulk data collection in its latest report and recommended that CSIS inform the federal court about the activities. CSIS rejected the recommendation. In fact, the court only became aware of the metadata retention due to the SIRC report and was astonished by the CSIS response, stating that it “shows a worrisome lack of understanding of, or respect for, the responsibilities of a party [SIRC] benefiting from the opportunity to appear ex parte.”

July 27, 2016

Security concerns with the “Internet of things”

Filed under: Technology — Nicholas @ 03:00

When it comes to computer security, you should always listen to what Bruce Schneier has to say, especially on the subject of the “Internet of things”:

Classic information security is a triad: confidentiality, integrity, and availability. You’ll see it called “CIA,” which admittedly is confusing in the context of national security. But basically, the three things I can do with your data are steal it (confidentiality), modify it (integrity), or prevent you from getting it (availability).

So far, internet threats have largely been about confidentiality. These can be expensive; one survey estimated that data breaches cost an average of $3.8 million each. They can be embarrassing, as in the theft of celebrity photos from Apple’s iCloud in 2014 or the Ashley Madison breach in 2015. They can be damaging, as when the government of North Korea stole tens of thousands of internal documents from Sony or when hackers stole data about 83 million customer accounts from JPMorgan Chase, both in 2014. They can even affect national security, as in the case of the Office of Personnel Management data breach by — presumptively — China in 2015.

On the Internet of Things, integrity and availability threats are much worse than confidentiality threats. It’s one thing if your smart door lock can be eavesdropped upon to know who is home. It’s another thing entirely if it can be hacked to allow a burglar to open the door — or prevent you from opening your door. A hacker who can deny you control of your car, or take over control, is much more dangerous than one who can eavesdrop on your conversations or track your car’s location.

With the advent of the Internet of Things and cyber-physical systems in general, we’ve given the internet hands and feet: the ability to directly affect the physical world. What used to be attacks against data and information have become attacks against flesh, steel, and concrete.

Today’s threats include hackers crashing airplanes by hacking into computer networks, and remotely disabling cars, either when they’re turned off and parked or while they’re speeding down the highway. We’re worried about manipulated counts from electronic voting machines, frozen water pipes through hacked thermostats, and remote murder through hacked medical devices. The possibilities are pretty literally endless. The Internet of Things will allow for attacks we can’t even imagine.

The increased risks come from three things: software control of systems, interconnections between systems, and automatic or autonomous systems. Let’s look at them in turn …

I’m usually a pretty tech-positive person, but I actively avoid anything that bills itself as being IoT-enabled … call me paranoid, but I don’t want to hand over local control of my environment, my heating or cooling system, or pretty much anything else on my property to an outside agency (whether government or corporate).

March 29, 2016

Why did Apple suddenly grow a pair over consumer privacy and (some) civil rights?

Filed under: Business, Technology, USA — Nicholas @ 03:00

Charles Stross has a theory:

A lot of people are watching the spectacle of Apple vs. the FBI and the Homeland Security Theatre and rubbing their eyes, wondering why Apple (in the person of CEO Tim Cook) is suddenly the knight in shining armour on the side of consumer privacy and civil rights. Apple, after all, is a goliath-sized corporate behemoth with the second largest market cap in US stock market history — what’s in it for them?

As is always the case, to understand why Apple has become so fanatical about customer privacy over the past five years that they’re taking on the US government, you need to follow the money.

[…]

Apple see their long term future as including a global secure payments infrastructure that takes over the role of Visa and Mastercard’s networks — and ultimately of spawning a retail banking subsidiary to provide financial services directly, backed by some of their cash stockpile.

The FBI thought they were asking for a way to unlock a mobile phone, because the FBI is myopically focussed on past criminal investigations, not the future of the technology industry, and the FBI did not understand that they were actually asking for a way to tracelessly unlock and mess with every ATM and credit card on the planet circa 2030 (if not via Apple, then via the other phone OSs, once the festering security fleapit that is Android wakes up and smells the money).

If the FBI get what they want, then the back door will be installed and the next-generation payments infrastructure will be just as prone to fraud as the last-generation card infrastructure, with its card skimmers and identity theft.

And this is why Tim Cook is willing to go to the mattresses with the US department of justice over iOS security: if nobody trusts their iPhone, nobody will be willing to trust the next-generation Apple Bank, and Apple is going to lose their best option for securing their cash pile as it climbs towards the stratosphere.

March 20, 2016

Apple software engineers threaten to quit rather than crack encryption for the government

Filed under: Business, Government, Liberty, Technology, USA — Nicholas @ 02:00

It’s only a rumour rather than a definite stand, but it is a hopeful one for civil liberties:

The spirit of anarchy and anti-establishment still runs strong at Apple. Rather than comply with the government’s requests to develop a so-called “GovtOS” to unlock the iPhone 5c of San Bernardino shooter Syed Rizwan Farook, The New York Times‘ half-dozen sources say that some software engineers may quit instead. “It’s an independent culture and a rebellious one,” former Apple engineering manager Jean-Louis Gassée tells NYT. “If the government tries to compel testimony or action from these engineers, good luck with that.”

Former senior product manager for Apple’s security and privacy division Window Snyder agrees. “If someone attempts to force them to work on something that’s outside their personal values, they can expect to find a position that’s a better fit somewhere else.”

In another instance of Apple’s company culture clashing with what the federal government demands, the development teams are apparently relatively siloed off from one another. It isn’t until a product gets closer to release that disparate teams like hardware and software engineers come together for finalizing a given gizmo. NYT notes that the team of six to 10 engineers needed to develop the back door doesn’t currently exist and that forcing any sort of collaboration would be incredibly difficult, again, due to how Apple works internally.

January 31, 2016

“To be honest, the spooks love PGP”

Filed under: Liberty, Technology — Nicholas @ 03:00

If nothing else, it makes you a findable needle in their acres of data haystacks. Using any kind of encryption may keep CSIS and their foreign friends from reading your communications, but it alerts them that you think you’ve got something to say that they shouldn’t read:

Although the cops and Feds won’t stop banging on and on about encryption, the spies have a different take on the use of crypto.

To be brutally blunt, they love it. Why? Because using detectable encryption technology like PGP, Tor, VPNs and so on, lights you up on the intelligence agencies’ dashboards. Agents and analysts don’t even have to see the contents of the communications – the metadata is enough for g-men to start making your life difficult.

“To be honest, the spooks love PGP,” Nicholas Weaver, a researcher at the International Computer Science Institute, told the Usenix Enigma conference in San Francisco on Wednesday. “It’s really chatty and it gives them a lot of metadata and communication records. PGP is the NSA’s friend.”

Weaver, who has spent much of the last decade investigating NSA techniques, said that all PGP traffic, including who sent it and to whom, is automatically stored and backed up onto tape. This can then be searched as needed when matched with other surveillance data.

Given that the NSA has taps on almost all of the internet’s major trunk routes, the PGP records can be incredibly useful. It’s a simple matter to build a script that can identify one PGP user and then track all their contacts to build a journal of their activities.

Even better is the Mujahedeen Secrets encryption system, which was released by the Global Islamic Media Front to allow Al Qaeda supporters to communicate in private. Weaver said that not only was it even harder to use than PGP, but it was a boon for metadata – since almost anyone using it identified themselves as a potential terrorist.

“It’s brilliant!” enthused Weaver. “Whoever it was at the NSA or GCHQ who invented it give them a big Christmas bonus.”
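
Weaver’s point about how little work that “simple script” would take is easy to illustrate. Below is a rough sketch, using only the Python standard library’s mailbox module, of how a passive observer who already holds a mail capture could log who is talking to whom over PGP without decrypting a byte, simply by matching the armour headers that every PGP message announces in plaintext. The capture file name and mbox format are assumptions for illustration.

import mailbox

PGP_MARKERS = (
    "-----BEGIN PGP MESSAGE-----",
    "-----BEGIN PGP SIGNED MESSAGE-----",
)

def pgp_contact_log(mbox_path):
    """Yield (date, sender, recipient) for every PGP-bearing message in an mbox file."""
    for msg in mailbox.mbox(mbox_path):
        payload = msg.get_payload(decode=False)
        text = payload if isinstance(payload, str) else str(msg)
        if any(marker in text for marker in PGP_MARKERS):
            yield msg.get("Date"), msg.get("From"), msg.get("To")

if __name__ == "__main__":
    # "captured_mail.mbox" is a hypothetical capture file: the message bodies
    # stay opaque, but the envelope metadata does not.
    for date, sender, recipient in pgp_contact_log("captured_mail.mbox"):
        print(f"{date}: {sender} -> {recipient}")

The message contents stay opaque; the contact graph does not, and the contact graph is exactly the metadata Weaver is talking about.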

January 7, 2016

QotD: The right to record police officers

Filed under: Law, Liberty, Quotations, USA — Nicholas @ 01:00

Some advice for the beleaguered and backward states of Illinois, Massachusetts, et al.: If police are not obliged to ask our permission before recording their public encounters with us, then we should not be obliged to ask their permission before recording our public encounters with them. That states generally dominated by so-called progressives should be so insistent upon asymmetric police powers and special privileges for government’s armed agents is surprising only to those who do not understand the basic but seldom-spoken truth about progressivism: The welfare state is the police state.

Why Illinois Republicans are on board is another matter, bringing up the eternal question that conservatives can expect to be revisiting frequently after January: What, exactly, is the point of the Republican party?

Illinois is attempting to resurrect what the state’s politicians pretend is a privacy-protecting anti-surveillance law; in reality, it is the nearly identical reincarnation of the state’s earlier anti-recording law, the main purpose of which was to charge people who record police encounters with a felony, an obvious and heavy-handed means of discouraging such recording. Illinois’s state supreme court threw the law out on the grounds that police do not have a reasonable expectation of privacy when carrying out their duties, though police and politicians argued the contrary — apparently, some part of the meaning of the phrase “public servants” eludes them. The new/old law is, by design, maddeningly vague, and will leave Illinois residents unsure of which encounters may be legally recorded and which may not.

Here is the solution: Pass a law explicitly recognizing the right of citizens to record police officers. It is important to note that such a law would recognize a right rather than create one: Government has no legitimate power to forbid free people from using cameras, audio-recording devices, or telephones in public to document the business of government employees. The statute would only clarify that Americans — even in Illinois — already are entitled to that right.

Kevin D. Williamson, “Prairie State Police State”, National Review, 2014-12-10.

December 4, 2015

QotD: “Dance like nobody’s watching”

Filed under: Quotations, Randomness — Nicholas @ 01:00

I always laugh when I hear the phrase, “Dance like nobody’s watching.” It’s 2015. Everybody’s watching.

Jim Treacher, “Mac & Cheese Dude Is Sorry For Being An [Incredibly Unpleasant Person]”, The Daily Caller, 2015-10-13.

November 23, 2015

Do you have a smartphone? Do you watch TV? You might want to reconsider that combination

Filed under: India, Technology — Nicholas @ 02:00

At The Register, Iain Thomson explains a new sneaky way for unscrupulous companies to snag your personal data without your knowledge or consent:

Earlier this week the Center for Democracy and Technology (CDT) warned that an Indian firm called SilverPush has technology that allows adverts to ping inaudible commands to smartphones and tablets.

Now someone has reverse-engineered the code and published it for everyone to check.

SilverPush’s software kit can be baked into apps, and is designed to pick up near-ultrasonic sounds embedded in, say, a TV, radio or web browser advert. These signals, in the range of 18kHz to 19.95kHz, are too high pitched for most humans to hear, but can be decoded by software.

An application that uses SilverPush’s code can pick up these messages from the phone or tablet’s built-in microphone, and be directed to send information such as the handheld’s IMEI number, location, operating system version, and potentially the identity of the owner, to the application’s backend servers.

Imagine sitting in front of the telly with your smartphone nearby. An advert comes on during the show you’re watching, and it has a SilverPush ultrasonic message embedded in it. This is picked up by an app on your mobile, which pings a media network with information about you, and could even display followup ads and links on your handheld.

How it works … the transfer of sound-encoded information from a TV to a phone to a backend server

“This kind of technology is fundamentally surreptitious in that it doesn’t require consent; if it did require it then the number of users would drop,” Joe Hall, chief technologist at CDT told The Register on Thursday. “It lacks the ability to have consumers say that they don’t want this and not be associated by the software.”

Hall pointed out that very few of the applications that include the SilverPush SDK tell users about it, so there was no informed consent. This makes such software technically illegal in Europe and possibly in the US.
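
The mechanism The Register describes is less exotic than it sounds. Here is a minimal sketch of the general technique, not SilverPush’s actual protocol, using nothing but the Python standard library; the frequencies, bit rate, and framing are all chosen for illustration. A short identifier is carried as tone bursts just above the range most adults can hear, and recovered by measuring signal energy at the two candidate frequencies.

# A toy sketch of data-over-ultrasound (not SilverPush's actual protocol):
# each bit is a 0.1-second tone burst, with 18.5 kHz standing for 0 and
# 19.5 kHz standing for 1, above what most adults can hear but well within
# what a phone microphone and a cheap TV speaker can reproduce. The receiver
# measures the signal energy at both frequencies and keeps the stronger one.
import math

RATE = 44100                 # audio sample rate (Hz)
BIT_LEN = 0.1                # seconds per bit
F0, F1 = 18500.0, 19500.0    # tone frequencies for bit 0 / bit 1 (Hz)
N = int(RATE * BIT_LEN)      # samples per bit

def encode(bits):
    """Return a list of samples carrying the given bit string as tone bursts."""
    samples = []
    for b in bits:
        f = F1 if b == "1" else F0
        samples.extend(math.sin(2 * math.pi * f * n / RATE) for n in range(N))
    return samples

def tone_energy(chunk, freq):
    """Signal energy of `chunk` at `freq` (a single-bin discrete Fourier transform)."""
    re = sum(s * math.cos(2 * math.pi * freq * n / RATE) for n, s in enumerate(chunk))
    im = sum(s * math.sin(2 * math.pi * freq * n / RATE) for n, s in enumerate(chunk))
    return re * re + im * im

def decode(samples):
    """Recover the bit string from a waveform produced by encode()."""
    bits = []
    for i in range(0, len(samples) - N + 1, N):
        chunk = samples[i:i + N]
        bits.append("1" if tone_energy(chunk, F1) > tone_energy(chunk, F0) else "0")
    return "".join(bits)

if __name__ == "__main__":
    beacon = "1011001110001101"          # e.g. an ad-campaign identifier
    signal = encode(beacon)
    print("sent:     ", beacon)
    print("recovered:", decode(signal))

In a real deployment the decoder would run continuously inside any app that has microphone permission, which is exactly why the lack of informed consent matters.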

October 17, 2015

Ken White of Popehat.com Talks Blogging, Anonymous Speech

Filed under: Law, Liberty, Politics, USA — Nicholas @ 03:00

Published on 13 Oct 2015

Ken White, founder of the influential group blog Popehat, tells FIRE how he got interested in the First Amendment and discusses anonymous speech on the Internet.

White, who writes for Popehat on a variety of issues, including the First Amendment, criminal justice, and the legal system, said a college project at Stanford University “during … one of the upsurges of controversy on campus about speech codes and speech issues,” opened his eyes to the nuances of the First Amendment.

“I wound up doing my senior honors thesis in college with a law school professor on the subject of legal restrictions on hate speech,” White said. “I thought it was very much emblematic of a very American problem, and that is: How do we express our disapproval — our moral disapproval — for bad things like bigotry, while not restricting liberties?”

Popehat seems to be a space created to do exactly that. The forum has evolved into a blog the contributors describe as a “group complaint” about “whatever its authors want.”

That freedom hasn’t always come so easily for White, who blogged anonymously for more than five years due to concerns that his honest blogging might harm his career. He still thinks anonymous speech comes with both benefits and drawbacks.

“I think the right to anonymous speech is very central in the First Amendment and in American life,” said White. “Throughout American history, people have said unpopular things, incendiary things, politically dangerous things behind the shield of anonymity. A lot of bad things come with that. There’s some really terrible, immoral, anonymous behavior on the Internet.”

White said there’s also a risk to writing anonymously, and that even though he benefitted from posting behind the security of an online persona, he supports the right of others to try to discover his true identity. Eventually, White said, he gave up the pseudonym and started blogging under his own name.

For more from White, including why free speech “catchphrases” harm First Amendment discourse, watch the above video.

September 30, 2015

Russia’s “bounty” on TOR

Filed under: Liberty, Technology — Nicholas @ 05:00

Strategy Page on the less-than-perfect result of Russia’s attempt to get hackers to crack The Onion Router for a medium-sized monetary prize:

Back in mid-2014 Russia offered a prize of $111,000 for whoever could deliver, by August 20th 2014, software that would allow Russian security services to identify people on the Internet using Tor (The Onion Router), a system that enables users to access the Internet anonymously. On August 22nd Russia announced that an unnamed Russian contractor, with a top security clearance, had received the $111,000 prize. No other details were provided at the time. A year later it was revealed that the winner of the Tor prize is now spending even more on lawyers to try to get out of the contract to crack Tor’s security. It seems the winner found that their theoretical solution was too difficult to implement effectively. In part this was because the worldwide community of programmers and software engineers that developed Tor is constantly upgrading it. Cracking Tor security means firing at a moving target, one that constantly changes shape and is quite resistant to damage. Tor is not perfect but it has proved very resistant to attack. A lot of people are trying to crack Tor, which is also used by criminals and Islamic terrorists as well as by people trying to avoid government surveillance. This is a matter of life and death in many countries, including Russia.

Tor is similar to anonymizer software, but even more untraceable. Unlike anonymizer software, Tor relies on thousands of people running the Tor software and acting as nodes through which email (and attachments) can be relayed — through so many Tor nodes that it was believed virtually impossible to track down the identity of the sender. Tor was developed as part of an American government program to create software that people living in dictatorships could use to avoid arrest for saying things on the Internet that their government did not like. Tor also enabled Internet users in dictatorships to communicate safely with the outside world. Tor first appeared in 2002 and has since then defied most attempts to defeat it. The Tor developers were also quick to modify their software when a vulnerability was detected.

But by 2014 it was believed that the NSA had cracked Tor, and that others may have done so as well but were keeping quiet about it so that the Tor support community would not fix whatever aspect of the software made it vulnerable. At the same time there were alternatives to Tor, as well as supplemental software, which apparently remained uncracked by anyone.
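
For anyone wondering what routing traffic through “so many Tor nodes” actually buys you, the core idea of onion routing is simple to sketch. The toy example below is not the real Tor protocol; it assumes the third-party cryptography package is installed, and the relay names and keys are invented for illustration. The sender wraps the message in one encryption layer per relay, so each relay can strip only its own layer and learns nothing beyond the next hop.

# A toy illustration of onion routing's core idea (not the real Tor protocol).
from cryptography.fernet import Fernet

relays = ["entry", "middle", "exit"]                     # made-up relay names
keys = {name: Fernet.generate_key() for name in relays}  # one key per relay

def build_onion(message: bytes, path):
    """Wrap `message` in layers; the innermost layer belongs to the last relay."""
    onion = message
    for name in reversed(path):      # exit layer first, entry layer last
        onion = Fernet(keys[name]).encrypt(onion)
    return onion

def relay_peel(name, onion):
    """What relay `name` does: strip exactly one layer and pass the rest along."""
    return Fernet(keys[name]).decrypt(onion)

onion = build_onion(b"hello from an anonymous sender", relays)
for name in relays:                  # the traffic crosses each relay in turn
    onion = relay_peel(name, onion)
print(onion)                         # only after the exit relay is the plaintext visible

Real Tor adds circuit negotiation, fixed-size cells, and directory authorities on top of this basic idea, which is part of why it has proved such a hard target.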

September 10, 2015

Making it easy for governments to monitor texts, emails, and other messages

Filed under: Law, Liberty, Technology, USA — Nicholas @ 03:00

Megan McArdle explains that while it’s quite understandable why governments want to maintain their technological ability to read private, personal communications, that’s not sufficient justification to just give in and allow them the full access they claim that they “need”:

Imagine, if you will, a law that said all doors had to be left unlocked so that the police could get in whenever they needed to. Or at the very least, a law mandating that the government have a master key.

That’s essentially what some in the government want for your technology. As companies like Apple and Google have embraced stronger encryption, they’re making it harder for the government to do the kind of easy instant collection that companies were forced into as the government chased terrorists after 9/11.

And how could you oppose that government access? After all, the government keeps us safe from criminals. Do you really want to make it easier for criminals to evade the law?

The analogy with your home doors suggests the flaw in this thinking: The U.S. government is not the only entity capable of using a master key. Criminals can use them too. If you create an easy way to bypass security, criminals — or other governments — are going to start looking for ways to reproduce the keys.

[…]

Law enforcement is going to pursue strategies that maximize the ability to catch criminals or terrorists. These are noble goals. But we have to take care that in the pursuit of these goals, the population they’re trying to protect is not forgotten. Every time we open more doors for our own government, we’re inviting other unwelcome guests to join them inside.

I don’t really blame law enforcement for pushing as hard as possible; rare is the organization in history that has said, “You know, the world would be a better place if I had less power to do my job.” But that makes it more imperative that the rest of us keep an eye on what they’re doing, and force the law to account for tradeoffs, rather than the single-minded pursuit of one goal.
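
McArdle’s master-key analogy maps directly onto how a mandated backdoor tends to be engineered: key escrow. The toy sketch below shows why a single escrow key is a single point of failure; the scheme, names, and keys are invented for illustration, and it assumes the third-party cryptography package is installed. Whoever obtains the escrow key, lawfully or otherwise, can read every conversation ever recorded.

# A toy sketch of mandated key escrow (invented scheme, for illustration only).
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()        # the mandated "master key"

def send(plaintext: bytes):
    """Encrypt one message; also wrap its session key for the escrow holder."""
    session_key = Fernet.generate_key()   # fresh key per message
    ciphertext = Fernet(session_key).encrypt(plaintext)
    wrapped = Fernet(escrow_key).encrypt(session_key)   # the mandated escrow copy
    return ciphertext, wrapped

# Years of recorded traffic, stored by anyone who can tap the wire.
archive = [send(b"message %d" % i) for i in range(3)]

# Anyone who obtains the escrow key (lawfully or otherwise) reads all of it.
stolen_key = escrow_key
for ciphertext, wrapped in archive:
    session_key = Fernet(stolen_key).decrypt(wrapped)
    print(Fernet(session_key).decrypt(ciphertext))

The flaw is not that the escrow holder is malicious; it is that the key exists at all, and every copy of it is now worth stealing.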

August 28, 2015

Google and the (bullshit) European “right to be forgotten”

Filed under: Europe, Law, Liberty, Media, USA — Nicholas @ 03:00

Techdirt‘s Mike Masnick points and laughs at a self-described consumerist organization’s attempt to force Google to apply EU law to the rest of the world, by way of an FTC complaint:

If you want an understanding of my general philosophy on business and economics, it’s that companies should focus on serving their customers better. That’s it. It’s a very customer-centric view of capitalism. I think companies that screw over their customers and users will have it come back to bite them, and thus it’s a better strategy for everyone if companies focus on providing good products and services to consumers, without screwing them over. And, I’m super supportive of organizations that focus on holding companies’ feet to the fire when they fail to live up to that promise. Consumerist (owned by Consumer Reports) is really fantastic at this kind of thing, for example. Consumer Watchdog, on the other hand, despite its name, appears to have very little to do with actually protecting consumers’ interests. Instead, it seems like some crazy people who absolutely hate Google, and pretend that they’re “protecting” consumers from Google by attacking the company at every opportunity. If Consumer Watchdog actually had relevant points, that might be useful, but nearly every attack on Google is so ridiculous that all it does is make Consumer Watchdog look like a complete joke and undermine whatever credibility the organization might have.

In the past, we’ve covered an anti-Google video the company put out that contained so many factual errors that it was a complete joke (and was later revealed as nothing more than a stunt to sell some books). Then there was the attempt to argue that Gmail was an illegal wiretap. It’s hard to take the organization seriously when it does that kind of thing.

Its latest, however, takes the crazy to new levels. John Simpson, Consumer Watchdog’s resident “old man yells at cloud” impersonator, recently filed a complaint with the FTC against Google. In it, he not only argues that Google should offer the “Right to be Forgotten” in the US, but says that the failure to do that is an “unfair and deceptive practice.” Really.

As you know by now, since an EU court ruling last year, Google has been forced to enable a right to be forgotten in the EU, in which it will “delink” certain results from the searches on certain names, if the people argue that the links are no longer “relevant.” Some in the EU have been pressing Google to make that “right to be forgotten” global — which Google refuses to do, noting that it would violate the First Amendment in the US and would allow the most restrictive, anti-free speech regime in the world to censor the global internet.

But, apparently John Simpson likes censorship and supporting free speech-destroying regimes. Because he argues Google must allow such censorship in the US. How could Google’s refusal to implement “right to be forgotten” possibly be “deceptive”? Well, in Simpson’s world, it’s because Google presents itself as “being deeply committed to privacy” but then doesn’t abide by a global right to be forgotten. Really.

August 2, 2015

Thinking about realistic security in the “internet of things”

Filed under: Technology — Nicholas @ 02:00

The Economist looks at the apparently unstoppable rush to internet-connect everything and why we should worry about security now:

Unfortunately, computer security is about to get trickier. Computers have already spread from people’s desktops into their pockets. Now they are embedding themselves in all sorts of gadgets, from cars and televisions to children’s toys, refrigerators and industrial kit. Cisco, a maker of networking equipment, reckons that there are 15 billion connected devices out there today. By 2020, it thinks, that number could climb to 50 billion. Boosters promise that a world of networked computers and sensors will be a place of unparalleled convenience and efficiency. They call it the “internet of things”.

Computer-security people call it a disaster in the making. They worry that, in their rush to bring cyber-widgets to market, the companies that produce them have not learned the lessons of the early years of the internet. The big computing firms of the 1980s and 1990s treated security as an afterthought. Only once the threats—in the form of viruses, hacking attacks and so on—became apparent did Microsoft, Apple and the rest start trying to fix things. But bolting on security after the fact is much harder than building it in from the start.

Of course, governments are desperate to prevent us from hiding our activities from them by way of cryptography or even moderately secure connections, so there’s the risk that any pre-rolled security option offered by a major corporation has already been riddled with convenient holes for government spooks … which makes it even more likely that others can also find and exploit those security holes.

… companies in all industries must heed the lessons that computing firms learned long ago. Writing completely secure code is almost impossible. As a consequence, a culture of openness is the best defence, because it helps spread fixes. When academic researchers contacted a chipmaker working for Volkswagen to tell it that they had found a vulnerability in a remote-car-key system, Volkswagen’s response included a court injunction. Shooting the messenger does not work. Indeed, firms such as Google now offer monetary rewards, or “bug bounties”, to hackers who contact them with details of flaws they have unearthed.

Thirty years ago, computer-makers that failed to take security seriously could claim ignorance as a defence. No longer. The internet of things will bring many benefits. The time to plan for its inevitable flaws is now.

August 1, 2015

The Streisand Effect – Trying to Hide Things On The Internet | INTO CONTEXT

Filed under: Media — Nicholas @ 02:00

Published on 6 Jan 2015

One of the biggest news stories this Christmas was the (un-)cancelled release of Sony Pictures’ movie The Interview. In the movie, Seth Rogen and James Franco try to assassinate North Korean dictator Kim Jong-Un. After terror threats against movie theatres showing the film, Sony cancelled the release of the movie. This ultimately increased the attention the movie received and made the later online release the most successful one this year. Actually, there is a name for this kind of phenomenon: the Streisand Effect. In this episode of INTO CONTEXT, Indy explains why it’s not always smart to try to hide things on the internet.

July 17, 2015

The case for encryption – “Encryption should be enabled for everything by default”

Filed under: Liberty, Technology — Nicholas @ 03:00

Bruce Schneier explains why you should care (a lot) about having your data encrypted:

Encryption protects our data. It protects our data when it’s sitting on our computers and in data centers, and it protects it when it’s being transmitted around the Internet. It protects our conversations, whether video, voice, or text. It protects our privacy. It protects our anonymity. And sometimes, it protects our lives.

This protection is important for everyone. It’s easy to see how encryption protects journalists, human rights defenders, and political activists in authoritarian countries. But encryption protects the rest of us as well. It protects our data from criminals. It protects it from competitors, neighbors, and family members. It protects it from malicious attackers, and it protects it from accidents.

Encryption works best if it’s ubiquitous and automatic. The two forms of encryption you use most often — https URLs on your browser, and the handset-to-tower link for your cell phone calls — work so well because you don’t even know they’re there.

Encryption should be enabled for everything by default, not a feature you turn on only if you’re doing something you consider worth protecting.

This is important. If we only use encryption when we’re working with important data, then encryption signals that data’s importance. If only dissidents use encryption in a country, that country’s authorities have an easy way of identifying them. But if everyone uses it all of the time, encryption ceases to be a signal. No one can distinguish simple chatting from deeply private conversation. The government can’t tell the dissidents from the rest of the population. Every time you use encryption, you’re protecting someone who needs to use it to stay alive.
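
Schneier’s “enabled for everything by default” is as much an engineering habit as a policy position. A minimal sketch of what that habit looks like in practice, using only the Python standard library (the URL below is just a placeholder and the helper is hypothetical), is a fetch function that silently upgrades any plain http:// request to https:// and verifies the server certificate, so the caller never has to decide whether a particular request is “worth protecting”.

# A minimal "encrypt by default" sketch: callers ask for a URL, and the helper
# always uses TLS with certificate and hostname verification.
import ssl
import urllib.request
from urllib.parse import urlsplit, urlunsplit

def fetch(url: str, timeout: float = 10.0) -> bytes:
    """Fetch a URL, always over TLS: plain http:// requests are upgraded to https://."""
    parts = urlsplit(url)
    if parts.scheme == "http":            # upgrade plaintext requests by default
        parts = parts._replace(scheme="https")
    ctx = ssl.create_default_context()    # verifies certificates and hostnames
    with urllib.request.urlopen(urlunsplit(parts), timeout=timeout, context=ctx) as resp:
        return resp.read()

if __name__ == "__main__":
    page = fetch("http://example.com/")   # transparently becomes an https request
    print(len(page), "bytes fetched over TLS")

Nothing about the call site changes; the protection is simply there by default.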
