When I was a teenager in the 1970s, there was not yet anything you could call “geek culture”. Sure, there were bright kids fascinated by computers or math or science, kids who were often “poorly socialized” in the jargon of the day and hung together as a defensive measure; I was one of them. But we didn’t see ourselves as having a social identity or affiliation the way the jocks or surfers or hippies did. We weren’t a subculture, nor even a community; we didn’t even have a label for ourselves.
Slowly, slowly that began to change. One key event was the eruption of science fiction into pop culture that began with the first Star Wars movie in 1977. This was our stuff and we knew it, even though most of us never joined the subculture of SF fandom proper. Personal computers made another big difference after 1980; suddenly, technology was cool and sexy in a way it hadn’t been for decades, and people who were into it started to get respect rather than (or in addition to) faint or not-so-faint scorn.
You could see the trend in movies. WarGames in 1983; Revenge of the Nerds in 1984; Real Genius in 1985. To kids today Revenge of the Nerds doesn’t seem remarkable, because geek culture is more secure and confident today than a lot of older tribes like bikers or hippies. But at the time, the idea that you could have an entire fraternity of geeks — an autonomous social group with reason to be proud of itself and a recognized place in the social ecology — was funny; all by itself it was a comedy premise.
The heroes of Revenge of the Nerds were people who created a fraternity of their own, who bootstrapped a niche for themselves in Grant McCracken’s culture of plenitude. The movie was an extended joke, but it described and perhaps helped create a real phenomenon.
The term ‘geek’ didn’t emerge as a common label, displacing the older and much more sporadically-used ‘nerd’, until around the time of the Internet explosion of 1993-1994. I noticed this development because I didn’t like it; I still prefer to tell people I hang out with hackers (all hackers are geeks, but not all geeks are hackers). Another index of the success of the emerging geek culture is that around that time it stopped being an almost exclusively male phenomenon.
Yes, you catch my implication. When I was growing up we didn’t have geekgirls. Even if the label ‘geek’ had been in use at the time, the idea that women could be so into computers or games or math that they would identify with and hang out with geek guys would have struck us as sheerest fantasy. Even the small minority of geek guys who were good with women (and thus had much less reason to consider them an alien species) would have found the implications of the term ‘geekgirl’ unbelievable before 1995 or so.
(There are people who cannot read an account like the above without assuming that the author is simply projecting his own social and sexual isolation onto others. For the benefit of those people, I will report here that I had good relations with women long before this was anything but rare in my peer group. This only made the isolation of my peers easier to notice.)
What changed? Several things. One is that geek guys are, on the whole, better adjusted and healthier and more presentable today than they were when I was a teenager. Kids today have trouble believing the amount of negative social pressure on intelligent people to pass as normal and boring that was typical before 1980, the situation Revenge of the Nerds satirized and inverted. It meant that the nascent geek culture of the time attracted only the most extreme geniuses and misfits — freaks, borderline autists, obsessives, and other people in reaction against the mainstream. Women generally looked at this and went “ugh!”
But over time, geeky interests became more respectable, even high-status (thanks at least in part to the public spectacle of übergeeks making millions). The whole notion of opposition to the mainstream started to seem dated as ‘mainstream’ culture gradually effloresced into dozens of tribes freakier than geeks (two words: “body piercings”). Thus we started to attract people who were more normal, in psychology if not in talent. Women noticed this. I believe it was in 1992, at a transhumanist party in California, that I first heard a woman matter-of-factly describe the Internet hacker culture as “a source of good boyfriends”. A few years after that we started to get a noticeable intake of women who wanted to become geeks themselves, as opposed to just sleeping with or living with geeks.
The loner/obsessive/perfectionist tendencies of your archetypal geek are rare in women, who are culturally encouraged (and perhaps instinct-wired) to value social support and conformity more. Thus, women entering the geek subculture was a strong sign that it had joined the set of social identities that people think of as ‘normal’. This is still a very recent development; I can’t recall the term ‘geekgirl’ being used at all before about 1998, and I don’t think it became commonly self-applied until 2000 or so.
Eric S. Raymond, “The Revenge of the Nerds is Living Well”, Armed and Dangerous, 2004-12-20.
January 29, 2017
ESR performs a useful service in pulling together a document on what all hackers used to need to know, regardless of the particular technical interest they followed. I was never technical enough to be a hacker, but I worked with many of them, so I had to know (or know where to find) much of this information, too.
One fine day in January 2017 I was reminded of something I had half-noticed a few times over the previous decade. That is, younger hackers don’t know the bit structure of ASCII and the meaning of the odder control characters in it.
This is knowledge every fledgling hacker used to absorb through their pores. It’s nobody’s fault this changed; the obsolescence of hardware terminals and the RS-232 protocol is what did it. Tools generate culture; sometimes, when a tool becomes obsolete, a bit of cultural commonality quietly evaporates. It can be difficult to notice that this has happened.
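The bit structure in question is easy to show concretely. This sketch (mine, not from the original document) illustrates the fact older hackers absorbed from hardware terminals: the control characters occupy 0x00–0x1F, and each one is simply the corresponding letter or punctuation mark with bits 6 and 7 stripped off, which is exactly what the Ctrl key did.

```python
def ctrl(ch: str) -> int:
    """Code an ASCII terminal sent for Ctrl-<ch>: clear bits 6 and 7."""
    return ord(ch.upper()) & 0x1F

# The "odder control characters" are just shifted letters:
assert ctrl('A') == 0x01  # SOH
assert ctrl('G') == 0x07  # BEL -- rang the terminal's bell
assert ctrl('I') == 0x09  # HT  -- the tab character
assert ctrl('M') == 0x0D  # CR  -- carriage return
assert ctrl('[') == 0x1B  # ESC
```

This is also why Ctrl-I and Tab are indistinguishable on a terminal, and why Ctrl-G once made the machine beep.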
This document is a collection of facts about ASCII and related technologies, notably hardware terminals and RS-232 and modems. This is lore that was at one time near-universal and is no longer. It’s not likely to be directly useful today – until you trip over some piece of still-functioning technology where it’s relevant (like a GPS puck), or it makes sense of some old-fart war story. Even so, it’s worth knowing, for cultural-literacy reasons.
One thing this collection has that tends to be indefinite in the minds of older hackers is calendar dates. Those of us who lived through all this tend to have remembered order and dependencies but not exact timing; here, I did the research to pin a lot of that down. I’ve noticed that people have a tendency to retrospectively back-date the technologies that interest them, so even if you did live through the era it describes you might get a few surprises from reading.
There are references to Unix in here because I am mainly attempting to educate younger open-source hackers working on Unix-derived systems such as Linux and the BSDs. If those terms mean nothing to you, the rest of this document probably won’t either.
January 23, 2017
Interestingly, the dot.com bust does not seem to have slowed down or discredited the geek subculture at all. Websites like http://geekculture.com and http://thinkgeek.com do a flourishing business, successfully betting investment capital on the theory that there is in fact a common subculture or community embracing computer hackers, SF fans, strategy gamers, aficionados of logic puzzles, radio hams, and technology hobbyists of all sorts. Just the fact that a website can advertise “The World’s Coolest Propeller Beanies!” is indication of how far we’ve come.
I’ve previously observed about one large and important geek subtribe, the Internet hackers, that when people join it they tend to retrospectively re-interpret their past and after a while find it difficult to remember that they weren’t always part of this tribe. I think something similar is true of geeks in general; even those of us who lived through the emergence of geek culture have to struggle a bit to remember what it was like back when we were genuinely atomized outcasts in a culture that was dismissive and hostile.
There are even beginning to be geek families with evidence of generational transmission. I know three generations of one, starting when two computer scientists married in the late 1960s and had four kids in the 1970s; the kids have since produced a first grandchild who at age five shows every sign of becoming just as avid a gamer/hacker/SF-fan as his parents and grandparents.
Little Isaac, bless him, will grow up in a culture that, in its plenitude, offers lots of artifacts and events designed by and for people like him. He will take the World Wide Web and the Sci-Fi Channel and Yu-Gi-Oh! and the Lord of the Rings movies and personal computers for granted. He’ll probably never be spat on by a jock, and if he can’t find a girlfriend it will be because the geekgirls and geek groupies are dating other guys like him, rather than being nonexistent.
For Isaac, Revenge of the Nerds will be a quaint period piece with very little more relevance to the social circumstances of his life than a Regency romance. And that is how we know that the nerds indeed got their revenge.
Eric S. Raymond, “The Revenge of the Nerds is Living Well”, Armed and Dangerous, 2004-12-20.
January 3, 2017
Yes, I’m just getting caught up on articles that got published between Christmas and New Year’s, which is why I’m linking to another Megan McArdle article. This one is on the Democratic Party’s “festival of wrongness” delusions about hacking the nomination process to replace Antonin Scalia on the US Supreme Court:
You may be a bit confused. Republicans hold the majority in this Senate. They will also control the next Senate. How are Democrats supposed to bring the thing to the floor for a vote, much less get enough votes to actually confirm him?
That’s a very good question! The answer some progressives have come up with is that there will be a nanosecond gap between when the outgoing senators leave office, and the new ones are sworn in. During that gap, there will be more Democrats left than Republicans. So the idea is to call that smaller body into session, vote on the nomination, and voila! — a new Supreme Court justice. Alternatively, President Obama could use that gap to make a recess appointment.
The first idea started on Daily Kos, where I initially saw it. I didn’t pay it overmuch attention, as my second law of politics is that “At any given time, someone is suggesting something completely insane.” Usually these ideas go nowhere. This one, however, has gotten a bit of traction; the idea of a nanosecond nomination vote has shown up at the Princeton Election Consortium blog, and endorsements of a recess appointment have appeared in the New Republic and New York magazine.
It’s hard to know where to start with this festival of wrongness. The idea behind the nanosecond nomination seems to be that there are two discrete Senates, the old and the new, with a definite gap between them; yet that somehow, though neither the old nor the new Senate exists, there are senators, who can hold a vote on something — a sort of quantum Senate that pops into and out of existence depending on the needs of the Democratic Party.
The legal grounds for a recess appointment are even weaker, because in 2014 the Supreme Court ruled that recess appointments require at least a three-day gap — not three femtoseconds — between sessions to be valid. Even if that were not the case, Jonathan Adler argues that the new Republican Senate could adjourn sine die, ending the recess appointment a few weeks after it was made. Since Garland would have to vacate his appellate court seat, all Democrats would succeed in doing is opening up another judicial appointment for Trump.
But this is almost quibbling compared with the deeper problem: Even if these moves could work, they wouldn’t work. The people proposing these ideas seem to imagine that they are making a movie about politics, rather than actually doing politics. The hero’s quest is to get a liberal supreme court, but they are stymied until — third act miracle! A daring procedural caper! The gavel slams down on Merrick Garland’s “Aye” vote … cut to him taking his Supreme Court seat … fade to black as the audience cheers. In the real world, of course, there’s a sequel, called “Tomorrow.” And what do the Republicans do then? The answer, alas, is not “stand around shaking their fists at fate, while the moderates among them offer a handshake across the aisle and a rueful ‘You got us this time, guys.’”
December 27, 2016
Robert Graham has a handy tip for understanding newspaper stories, the New York Times in particular:
Here’s a trick when reading New York Times articles: when they switch to passive voice, they are covering up a lie. An example is this paragraph from the above story [*]:
The Russians were also quicker to turn their attacks to political purposes. A 2007 cyberattack on Estonia, a former Soviet republic that had joined NATO, sent a message that Russia could paralyze the country without invading it. The next year cyberattacks were used during Russia’s war with Georgia.
Normally, editors would switch this to the active voice, or:
The next year, Russia used cyberattacks in their war against Georgia.
But that would be factually wrong. Yes, cyberattacks happened during the conflicts with Estonia and Georgia, but the evidence in both cases points to targets and tools going viral on social media and web forums. It was the people who conducted the attacks, not the government. Whether it was the government who encouraged the people is the big question — to which we have no answer. Since the NYTimes has no evidence pointing to the Russian government, they switch to the passive voice, hoping you’ll assume they meant the government was to blame.
It’s a clear demonstration that the NYTimes is pushing a narrative, rather than just reporting the facts and allowing you to decide for yourself.
August 22, 2016
I learned something this weekend about the high cost of the subtle delusion that creative technical problem-solving is the preserve of a priesthood of experts, using powers and perceptions beyond the ken of ordinary human beings.
Terry Pratchett is the author of the Discworld series of satirical fantasies. He is — and I don’t say this lightly, or without having given the matter thought and study — quite probably the most consistently excellent writer of intelligent humor in the last century in English. One has to go back as far as P.G. Wodehouse or Mark Twain to find an obvious equal in consistent quality, volume, and sly wisdom.
I’ve been a fan of Terry’s since before his first Discworld novel; I’m one of the few people who remembers Strata, his 1981 first experiment with the disc-world concept. The man has been something like a long-term acquaintance of mine for ten years — one of those people you’d like to call a friend, and who you think would like to call you a friend, if the two of you ever arranged enough concentrated hang time to get that close. But we’re both damn busy people, and live five thousand miles apart.
This weekend, Terry and I were both guests of honor at a hybrid SF convention and Linux conference called Penguicon held in Warren, Michigan. We finally got our hang time. Among other things, I taught Terry how to shoot pistols. He loves shooter games, but as a British resident his opportunities to play with real firearms are strictly limited. (I can report that Terry handled my .45 semi with remarkable competence and steadiness for a first-timer. I can also report that this surprised me not at all.)
During Terry’s Guest-of-Honor speech, he revealed his past as (he thought) a failed hacker. It turns out that back in the 1970s Terry used to wire up elaborate computerized gadgets from Timex Sinclair computers. One of his projects used a primitive memory chip that had light-sensitive gates to build a sort of perceptron that could actually see the difference between a circle and a cross. His magnum opus was a weather station that would log readings of temperature and barometric pressure overnight and deliver weather reports through a voice synthesizer.
But the most astonishing part of the speech was the followup in which Terry told us that despite his keen interest and elaborate homebrewing, he didn’t become a programmer or a hardware tech because he thought techies had to know mathematics, for which he believed he had no talent. He then revealed that he thought of his projects as a sort of bad imitation of programming, because his hardware and software designs were total lash-ups and he never really knew what he was doing.
I couldn’t stand it. “And you think it was any different for us?” I called out. The audience laughed and Terry passed off the remark with a quip. But I was just boggled. Because I know that almost all really bright techies start out that way, as compulsive tinkerers who blundered around learning by experience before they acquired systematic knowledge. “Oh ye gods and little fishes”, I thought to myself, “Terry is a hacker!”
Yes, I thought ‘is’ — even if Terry hasn’t actually tinkered any computer software or hardware in a quarter-century. Being a hacker is expressed through skills and projects, but it’s really a kind of attitude or mental stance that, once acquired, is never really lost. It’s a kind of intense, omnivorous playfulness that tends to color everything a person does.
So it burst upon me that Terry Pratchett has the hacker nature. Which, actually, explains something that has mildly puzzled me for years. Terry has a huge following in the hacker community — knowing his books is something close to basic cultural literacy for Internet geeks. One is actually hard-put to think of any other writer for whom this is as true. The question this has always raised for me is: why Terry, rather than some hard-SF writer whose work explicitly celebrates the technologies we play with?
Eric S. Raymond, “The Delusion of Expertise”, Armed and Dangerous, 2003-05-05.
June 29, 2016
Scratch the surface of “Silicon Valley culture” and you’ll find dozens of subcultures beneath. One means of production unites many tribes, but that’s about all that unites them. At a company the size of Google or even GitHub, you can expect to find as many varieties of cliques as you would in an equivalently sized high school, along with a “corporate culture” that’s as loudly promoted and roughly as genuine as the “school spirit” on display at every pep rally you were ever forced to sit through. One of those groups will invariably be the weirdoes.
Humans are social animals, and part of what makes a social species social is that its members place a high priority on signaling their commitment to other members of their species. Weirdoes’ priorities are different; our primary commitment is to an idea or a project or a field of inquiry. Species-membership commitment doesn’t just take a back seat, it’s in the trunk with a bag over its head.
Not only that, our primary commitments are so consuming that they leak over into everything we think, say, and do. This makes us stick out like the proverbial sore thumb: We’re unable to hide that our deepest loyalties aren’t necessarily to the people immediately around us, even if they’re around us every day. We have a name for people whose loyalties adhere to the field of technology — and to the society of our fellow weirdoes who we meet and befriend in technology-mediated spaces — rather than to the hairless apes nearby. I prefer this term to “weird nerds,” and so I’ll use it here: hackers.
You might not consider hackers to be a tribe apart, but I guarantee you that many — if not most — hackers themselves do. Eric S. Raymond’s “A Brief History of Hackerdom,” whose first draft dates to 1992, contains a litany of descriptions that speak to this:
They wore white socks and polyester shirts and ties and thick glasses and coded in machine language and assembler and FORTRAN and half a dozen ancient languages now forgotten …
The mainstream of hackerdom, (dis)organized around the Internet and by now largely identified with the Unix technical culture, didn’t care about the commercial services. These hackers wanted better tools and more Internet ….
[I]nstead of remaining in isolated small groups each developing their own ephemeral local cultures, they discovered (or re-invented) themselves as a networked tribe.
Meredith Patterson, “When Nerds Collide: My intersectionality will have weirdoes or it will be bullshit”, Medium.com, 2014-04-23.
June 22, 2016
Of all the sound, fury, and quiet voices of reason in the storm of controversy about tech culture and what is to become of it, quiet voice of reason Zeynep Tufekci’s “No, Nate, brogrammers may not be macho, but that’s not all there is to it” moves the discussion farther forward than any other contribution I’ve seen to date. Sadly, though, it still falls short of truly bridging the conceptual gap between nerds and “weird nerds.” Speaking as a lifelong member of the weird-nerd contingent, it’s truly surreal that this distinction exists at all. I’m slightly older than Nate Silver and about a decade younger than Paul Graham, so it wouldn’t surprise me if either or both find it just as puzzling. There was no cultural concept of cool nerds, or even not-cool-but-not-that-weird nerds, when we were growing up, or even when we were entering the workforce.
That’s no longer true. My younger colleague @puellavulnerata observes that for a long time, there were only weird nerds, but when our traditional pursuits (programming, electrical engineering, computer games, &c) became a route to career stability, nerdiness and its surface-level signifiers got culturally co-opted by trend-chasers who jumped on the style but never picked up on the underlying substance that differentiates weird nerds from the culture that still shuns them. That doesn’t make them “fake geeks,” boy, girl, or otherwise — you can adopt geek interests without taking on the entire weird-nerd package — but it’s still an important distinction. Indeed, the notion of “cool nerds” serves to erase the very existence of weird nerds, to the extent that many people who aren’t weird nerds themselves only seem to remember we exist when we commit some faux pas by their standards.
Even so, science, technology, and mathematics continue to attract the same awkward, isolated, and lonely personalities they have always attracted. Weird nerds are made, not born, and our society turns them out at a young age. Tufekci argues that “life’s not just high school,” but the process of unlearning lessons ingrained from childhood takes a lot more than a cap and gown or even a $10 million VC check, especially when life continues to reinforce those lessons well into adulthood. When weird nerds watch the cool kids jockeying for social position on Twitter, we see no difference between these status games and the ones we opted out of in high school. No one’s offered evidence to the contrary, so what incentive do we have to play that game? Telling us to grow up, get over it, and play a game we’re certain to lose is a demand that we deny the evidence of our senses and an infantilising insult rolled into one.
This phenomenon explains much of the backlash from weird nerds against “brogrammers” and “geek feminists” alike. (If you thought the conflict was only between those two groups, or that someone who criticises one group must necessarily be a member of the other, then you haven’t been paying close enough attention.) Both groups are latecomers barging in on a cultural space that was once a respite for us, and we don’t appreciate either group bringing its cultural conflicts into our space in a way that demands we choose one side or the other. That’s a false dichotomy, and false dichotomies make us want to tear our hair out.
Meredith Patterson, “When Nerds Collide: My intersectionality will have weirdoes or it will be bullshit”, Medium.com, 2014-04-23.
March 20, 2016
It’s only a rumour rather than a definite stand, but it is a hopeful one for civil liberties:
The spirit of anarchy and anti-establishment still runs strong at Apple. Rather than comply with the government’s requests to develop a so-called “GovtOS” to unlock the iPhone 5c of San Bernardino shooter Syed Rizwan Farook, The New York Times’ half-dozen sources say that some software engineers may quit instead. “It’s an independent culture and a rebellious one,” former Apple engineering manager Jean-Louis Gassée tells NYT. “If the government tries to compel testimony or action from these engineers, good luck with that.”
Former senior product manager for Apple’s security and privacy division Window Snyder agrees. “If someone attempts to force them to work on something that’s outside their personal values, they can expect to find a position that’s a better fit somewhere else.”
In another instance of Apple’s company culture clashing with what the federal government demands, the development teams are apparently relatively siloed off from one another. It isn’t until a product gets closer to release that disparate teams like hardware and software engineers come together for finalizing a given gizmo. NYT notes that the team of six to 10 engineers needed to develop the back door doesn’t currently exist and that forcing any sort of collaboration would be incredibly difficult, again, due to how Apple works internally.
September 30, 2015
Strategy Page on the less-than-perfect result of Russia’s attempt to get hackers to crack The Onion Router for a medium-sized monetary prize:
Back in mid-2014 Russia offered a prize of $111,000 for whoever could deliver, by August 20th 2014, software that would allow Russian security services to identify people on the Internet using Tor (The Onion Router), a system that enables users to access the Internet anonymously. On August 22nd Russia announced that an unnamed Russian contractor, with a top security clearance, had received the $111,000 prize. No other details were provided at the time. A year later it was revealed that the winner of the Tor prize was now spending even more on lawyers to try to get out of the contract to crack Tor’s security. It seems the winner found that their theoretical solution was too difficult to implement effectively. In part this was because the worldwide community of programmers and software engineers that developed Tor is constantly upgrading it. Cracking Tor security is firing at a moving target, one that constantly changes shape and is quite resistant to damage. Tor is not perfect, but it has proved very resistant to attack. A lot of people are trying to crack Tor, which is also used by criminals and Islamic terrorists as well as people trying to avoid government surveillance. This is a matter of life and death in many countries, including Russia.
Tor worked along the same lines as anonymizer software, but was even more untraceable. Unlike anonymizer software, Tor relies on thousands of people running the Tor software and acting as nodes, so that email (and attachments) is sent through so many Tor nodes that it was believed virtually impossible to track down the identity of the sender. Tor was developed as part of an American government program to create software that people living in dictatorships could use to avoid arrest for saying things on the Internet that their government did not like. Tor also enabled Internet users in dictatorships to communicate safely with the outside world. Tor first appeared in 2002 and has since then defied most attempts to defeat it. The Tor developers were also quick to modify their software when a vulnerability was detected.
But by 2014 it was believed that the NSA had cracked Tor, and others may have done so as well but were keeping quiet about it so that the Tor support community would not fix whatever aspect of the software made it vulnerable. At the same time there were alternatives to Tor, as well as supplemental software, that were apparently uncracked by anyone.
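The layered-relay design the quote describes can be sketched in a few lines. This is a toy illustration only (repeating-key XOR stands in for the per-hop symmetric keys that real Tor negotiates when building a circuit), but the structure is the point: the sender wraps one layer per relay, and each relay peels off exactly one layer, learning only the next hop.

```python
def xor_layer(message: bytes, key: bytes) -> bytes:
    """Apply one repeating-key XOR 'encryption' layer (its own inverse)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(message))

def wrap(message: bytes, keys: list[bytes]) -> bytes:
    """Build the onion: encrypt for the exit relay first, then wrap
    outward so the entry relay's layer is on the outside."""
    for key in reversed(keys):
        message = xor_layer(message, key)
    return message

# One key per relay on the circuit: entry -> middle -> exit.
keys = [b"entry-key", b"middle-key", b"exit-key"]
onion = wrap(b"hello", keys)

# Each relay in turn removes only its own layer.
for k in keys:
    onion = xor_layer(onion, k)
assert onion == b"hello"
```

Because XOR layers commute, this toy doesn’t capture that real onion layers must be peeled in circuit order; an actual implementation uses authenticated per-hop encryption, so a relay out of position simply gets garbage. That, plus the constant upgrades mentioned above, is what makes the “moving target” so hard to hit.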
August 7, 2015
At The Register, John Leyden talks about the recent revelation that the Tesla Model S has known hacking vulnerabilities:
Security researchers have uncovered six fresh vulnerabilities with the Tesla S.
Kevin Mahaffey, CTO of mobile security firm Lookout, and Cloudflare’s principal security researcher Marc Rogers, discovered the flaws after physically examining a vehicle before working with Elon Musk’s firm to resolve security bugs in the electric automobile.
The vulnerabilities allowed the researchers to gain root (administrator) access to the Model S infotainment systems.
With access to these systems, they were able to remotely lock and unlock the car, control the radio and screens, display any content on the screens (changing map displays and the speedometer), open and close the trunk/boot, and turn off the car systems.
When turning off the car systems, Mahaffey and Rogers discovered that, if the car was below five miles per hour (8 km/h) or idling, they were able to apply the emergency hand brake, a minor issue in practice.
If the car was going at any speed the technique could be used to cut power to the car while still allowing the driver to safely brake and steer. Consumers’ safety was still preserved even in cases, like the hand-brake issue, where the system ran foul of bugs.
Despite uncovering half a dozen security bugs, the two researchers nonetheless came away impressed by Tesla’s infosec policies and procedures as well as its fail-safe engineering approach.
“Tesla takes a software-first approach to its cars, so it’s no surprise that it has key security features in place that minimised and contained the risk of the discovered vulnerabilities,” the researchers explain.
August 2, 2015
The Economist looks at the apparently unstoppable rush to internet-connect everything and why we should worry about security now:
Unfortunately, computer security is about to get trickier. Computers have already spread from people’s desktops into their pockets. Now they are embedding themselves in all sorts of gadgets, from cars and televisions to children’s toys, refrigerators and industrial kit. Cisco, a maker of networking equipment, reckons that there are 15 billion connected devices out there today. By 2020, it thinks, that number could climb to 50 billion. Boosters promise that a world of networked computers and sensors will be a place of unparalleled convenience and efficiency. They call it the “internet of things”.
Computer-security people call it a disaster in the making. They worry that, in their rush to bring cyber-widgets to market, the companies that produce them have not learned the lessons of the early years of the internet. The big computing firms of the 1980s and 1990s treated security as an afterthought. Only once the threats—in the forms of viruses, hacking attacks and so on—became apparent, did Microsoft, Apple and the rest start trying to fix things. But bolting on security after the fact is much harder than building it in from the start.
Of course, governments are desperate to prevent us from hiding our activities from them by way of cryptography or even moderately secure connections, so there’s the risk that any pre-rolled security option offered by a major corporation has already been riddled with convenient holes for government spooks … which makes it even more likely that others can also find and exploit those security holes.
… companies in all industries must heed the lessons that computing firms learned long ago. Writing completely secure code is almost impossible. As a consequence, a culture of openness is the best defence, because it helps spread fixes. When academic researchers contacted a chipmaker working for Volkswagen to tell it that they had found a vulnerability in a remote-car-key system, Volkswagen’s response included a court injunction. Shooting the messenger does not work. Indeed, firms such as Google now offer monetary rewards, or “bug bounties”, to hackers who contact them with details of flaws they have unearthed.
Thirty years ago, computer-makers that failed to take security seriously could claim ignorance as a defence. No longer. The internet of things will bring many benefits. The time to plan for its inevitable flaws is now.
August 1, 2015
Published on 6 Jan 2015
One of the biggest news stories this Christmas was the (un-)cancelled release of Sony Pictures’ movie The Interview. In the movie, Seth Rogen and James Franco try to assassinate North Korean dictator Kim Jong-un. After terror threats against movie theatres showing the film, Sony cancelled the release of the movie. This ultimately increased the movie’s attention and made the later online release the most successful one this year. Actually, there is a name for this kind of phenomenon: the Streisand Effect. In this episode of INTO CONTEXT, Indy explains why it’s not always smart to try to hide things on the internet.