Quotulatiousness

November 12, 2024

“Nice business ya got there, Patreon. Wouldn’t want anything to happen to it …”

Filed under: Business, Media, USA — Nicholas @ 04:00

Above the paywall, Ted Gioia discusses Apple’s latest attempt to cut itself a nice big middleman’s slice of the indie creator market by putting the thumbscrews to Patreon:

Can Apple really charge a 30% tax on indie creators?

What Apple is now doing to indie creators is pure evil — but this story has received very little coverage. Journalists should pay attention, because they are under threat themselves.

Apple is now putting the squeeze on Patreon, a platform that supports more than a quarter of a million creators — artists, writers, musicians, podcasters, videographers, etc.

These freelancers rely on the support of more than 8 million patrons through Patreon, which charges a small 8-12% fee. Many of these supporters pay via Patreon’s iPhone app.

Earlier this year, Apple insisted that Patreon must pay them a 30% commission on all new subscriptions made with the app. In other words, Apple wants to take away close to a third of the income for indie creators — almost quadrupling their transaction fees.

This is the new business model from Cupertino, and it feels like a Mafia shakedown. Apple will make more from Patreon than Patreon does itself.

The only way for indies to avoid this surcharge is by convincing supporters to pay in some other way, and not use an iPhone or Apple tablet.

This is what happens when Apple decides to treat a transaction as an “in app payment” — as if an artist’s entire vocation is no different than a make-believe token in a fantasy video game.

But you can easily imagine how almost anything you do with your phone could be subject to similar demands.

I’ve been very critical of Apple in recent months. But this is the most shameful thing they have ever done to the creative community. A company that once bragged how it supported artistry now actively works to punish it.

March 24, 2024

“[A] term was coined in Britain for playing music on your phone in public: ‘sodcasting’ – after ‘sod’ for ‘sodomite’, i.e. something that only a total ASSHOLE would do”

Filed under: Media, Technology, USA — Nicholas @ 03:00

We’ve all been there at some point, especially in waiting rooms or on public transit: someone is either accidentally or deliberately subjecting everyone else in the vicinity to their personal soundtrack:

The modern world is noisy, I get that. I’m fine dealing with busy, urban places. But that surely makes those other places where you can escape the noise all the more vital in the constant struggle for sanity in this century. This is perhaps the one issue on which uber-leftist Elie Mystal and I agree. He found himself this week in a waiting room, full of peeps “listening to content on their devices with no headphones … LOUDLY. What the SHIT is this?? Is this normal?” His peroration: “I’M DEAD. I CAN FOR REAL FEEL THE VEINS IN MY HEAD THROBBING. THIS IS HOW I DIED.” #MeToo, my old lefty comrade.

The degradation of public space in America isn’t entirely new, of course. As soon as transistor radios became portable, people would carry them around — for music or sports scores on construction sites or wherever. But the smartphone era — thanks once again, Steve Jobs, you were so awesome! — gave us an exponential jump in the number of people with highly portable sound-broadcasting machines in every public space imaginable. In other words: Hell on toast.

At the beginning of this phone surge, a term was even coined in Britain for playing music on your phone in public: “sodcasting” — after “sod” for “sodomite”, i.e. something that only a total ASSHOLE would do. Sodcasting was just an amuse bouche, though, compared with our current Bluetooth era, where amplifiers the size of golf-balls have dialed it all up to 11, and the age of full-spectrum public cacophony — including that thump-thump-thump of the bass that carries much farther than the sodcasting treble — has truly begun.

National parks? They are now often intermittent raves, where younger peeps play loud, amplified dance music as they walk their trails. On trains? There is now a single “quiet car” where once they all were quiet, because we were a civilized culture. Walk down a street and you’ll catch a cyclist with a speaker attached to the handlebars, broadcasting at incredible volume for 50 feet ahead and behind him, obliterating every stranger’s conversation in his path.

On a bus? Expect the person sitting right behind you with her mouth four inches from your ears to have a very loud phone conversation, with the speaker turned up, and the phone held in front of her like a waiter holding a platter. The things she’ll tell you! Go to a beach and have your neighbors play volleyball — but with a loud speaker playing Kylie Minogue remixes to generate “atmosphere”.

When did we decide we didn’t give a fuck about anyone else in public anymore?

It’s not as if there isn’t an obvious win-win solution for both those who want to listen to music and those who don’t. Let me explain something that seems completely unimaginable to the Bluetoothers: If you can afford an iPhone, you can afford AirPods, or a headset, or the like. Put them in your ears, and you will hear music of far, far higher quality than from a distant Bluetooth, and no one else will be forced to hear anything at all! What’s not to like? It follows, it seems to me, that those who continue to refuse to do so, who insist that they are still going to make you listen as well, just because fuck-you they can, are waging a meretricious assault on their fellow humans.

What could be the defense? The Guardian — who else? — had a go at it:

    the ghetto blaster reminds us that defiantly and ostentatiously broadcasting one’s music in public is part of a history of sonically contesting spaces and drawing the lines of community, especially through what gets coded as “noise” … it represents a liberation of music from the private sphere in the west, as well as an egalitarian spreading of music in the developing world.

The first point is not, it seems to me, exculpatory. It’s describing an act of territorial aggression through sound. The second point may have some truth to it — but it hardly explains the super-privileged NYC homos on the beach or the white twenty-something NGO employees in the park. But would I enjoy living in Santo Domingo where not an inch — so far as I could see and hear when I was there — was uncontaminated by overhead fluorescent lights and loud, bad club music? Nope.

Whenever I’ve asked the sonic sadists whether they actually understand that they are hurting others, they blink a few times, their mouths begin to form sentences, and then they look away. Or they’ll tell me to go fuck myself, or say I’m the only one who has complained, which is probably true because most people don’t want public confrontation, and have simply given up and moved on. Then there is often the implication that I’m the one being the asshole. On no occasion has anyone ever turned their music off after being asked to. Too damaging to their pride.

One Reddit forum member had this excuse: “It’s because earbuds hurt my ears and headphones don’t stay on.” Another got closer: “A lot of people that play their music out loud think that others won’t mind it.” Self-absorption. One other factor is simply showing off: at Herring Cove, rich douchefags bring their expensive boats a little off-shore so they can broadcast with their massive sound systems. It strengthens my support for the Second Amendment every summer.

November 27, 2023

The slackening pace of technological innovation

Filed under: Business, Economics, Media, Technology, USA — Nicholas @ 04:00

Freddie deBoer thinks we’re living off the diminishing fumes of a much more innovative and dynamic era:

I gave a talk to a class at Northeastern University earlier this month, concerning technology, journalism, and the cultural professions. The students were bright and inquisitive, though they also reflected the current dynamic in higher ed overall – three quarters of the students who showed up were women, and the men who were there almost all sat moodily in the back and didn’t engage at all while their female peers took notes and asked questions. I know there’s a lot of criticism of the “crisis for boys” narrative, but it’s often hard not to believe in it.

At one point, I was giving my little spiel about how we’re actually living in a period of serious technological stagnation – that despite our vague assumption that we’re entitled to constant remarkable scientific progress, humanity has been living with real and valuable but decidedly small-scale technological growth for the past 50 or 60 or 70 years, after a hundred or so years of incredible growth from 1860ish to 1960ish, give or take a decade or two on either side. You’ve heard this from me before, and as before I will recommend Robert J. Gordon’s The Rise & Fall of American Growth for an exhaustive academic (and primarily economic) argument to this effect. Gordon persuasively demonstrates that from the mid-19th to mid-20th century, humanity leveraged several unique advancements that had remarkably outsized consequences for how we live and changed our basic existence in a way that never happened before and hasn’t since. Principal among these advances were the process of refining fossil fuels and using them to power all manner of devices and vehicles, the ability to harness electricity and use it to safely provide energy to homes (which practically speaking required the first development), and a revolution in medicine that came from the confluence of long-overdue acceptance of germ theory and basic hygienic principles, the discovery and refinement of antibiotics, and the modernization of vaccines.

Of course definitional issues are paramount here, and we can always debate what constitutes major or revolutionary change. Certainly the improvements in medical care in the past half-century feel very important to me as someone living now, and one saved life has immense emotional and practical importance for many people. What’s more, advances in communication sciences and computer technology genuinely have been revolutionary; going from the Apple II to the iPhone in 30 years is remarkable. The complication that Gordon and other internet-skeptical researchers like Ha-Joon Chang have introduced is to question just how meaningful those digital technologies have been for a) economic growth and b) the daily experience of human life. It can be hard for people who stare at their phones all day to consider the possibility that digital technology just isn’t that important. But ask yourself: if you were forced to live either without your iPhone or without indoor plumbing, could you really choose the latter? I think a few weeks of pooping in the backyard and having no running water to wash your hands or take a shower would probably change your tune. And as impressive as some new developments in medicine have been, there’s no question that in simple terms of reducing preventable deaths, the advances seen from 1900 to 1950 dwarf those seen since. To a remarkable extent, continued improvements in worldwide mortality in the past 75 years have been a matter of spreading existing treatments and practices to the developing world, rather than the result of new science.

ANYWAY. You’re probably bored of this line from me by now. But I was talking about this to these college kids, none of whom were alive in a world without widespread internet usage. We were talking about how companies market the future, particularly to people of their age group. I was making fun of the new iPhone and Apple’s marketing fixation on the fact that it’s TITANIUM. A few of the students pushed back; their old iPhones kept developing cracks in their casings, which TITANIUM would presumably fix. And, you know, if it works, that’s progress. (Only time and wear and tear will tell; the number of top-of-the-line phones I’ve gone through with fragile power ports leaves me rather cynical about such things.) Still, I tried to get the students to put that in context with the sense of promise and excitement of the recent past. I’m of a generation that was able to access the primitive internet in childhood but otherwise experienced the transition from the pre-internet world to now. I suspect this is all rather underwhelming for us. When you got your first smartphone, and you thought about what the future would hold, were your first thoughts about more durable casing? I doubt it. I know mine weren’t.

Why is Apple going so hard on TITANIUM? Well, where else does smartphone development have to go? In the early days there was this boundless optimism about what these things might someday do. The cameras, obviously, were a big point of emphasis, and they have developed to a remarkable degree, with even midrange phones now featuring very high-resolution sensors, often with multiple lenses. The addition of the ability to take video that was anything like high-quality, which became widespread a couple years into the smartphone era, was a big advantage. (There’s also all manner of “smart” filtering and adjustments now, which are of more subjective value.) The question is, who in 2023 ever says to themselves “smartphone cameras just aren’t good enough”? I’m sure the cameras will continue to get refined, forever. And maybe that marginal value will mean something, anything at all, in five or ten or twenty years. Maybe it won’t. But no one even pretends that it’s going to be a really big deal. Screens are going to get even more high-resolution, I guess, but again – is there a single person in the world who buys the latest flagship Samsung or iPhone and says, “Christ, I need a higher resolution screen”? They’ll get a little brighter. They’ll get a little more vivid. But so what? So what. Phones have gotten smaller and they’ve gotten bigger. Some gimmicks like built-in projectors were attempted and failed. Some advances like wireless charging have become mainstays. And the value of some things, like foldable screens, remains to be seen. But even the biggest partisans for that technology won’t try to convince you that it’s life-altering.

March 9, 2023

Want to feel more depressed? Spend more time with your smartphone

Filed under: Health, Media, Technology, USA — Nicholas @ 03:00

Freddie deBoer is convinced that much of the reason for widespread depression among teenagers can be traced directly to their obsessive devotion to the online world through their smartphones:

Are smartphones to blame for the mental health crisis among teens? The debate has picked up steam lately, in part because of the steady accumulation of evidence that they are indeed, at least partially, to blame. (As you know, I’m a believer.) Jonathan Haidt has done considerable work marshaling this evidence. But there’s an attendant question of how phones make kids miserable, if indeed they do. In this post I offer some plausible answers. This is mostly just speculation and I don’t know if the proffered explanations can be tested empirically.

I want to start by establishing a sort of meta-layer on which a lot of these problems rest. We might be inclined to say that these problems are inherently problems of the internet/online life/digital culture, rather than smartphones as such; you can be hurt by what I’m going to describe from a laptop as well as from a smartphone. And I think that’s right, except for one key difference: ubiquity. No matter how portable and light it is, you’re not reflexively checking your laptop on the subway platform or in the bathroom. The iPhone took all of the various pathologies of the internet, made it possible for them to be experienced repetitively and at zero cost morning and night, and dramatically scaled up the financial incentives for companies to exploit those pathologies for gain. You can certainly have an unhealthy relationship with the internet when it’s confined to your desktop. But phones make relentless conditioning and reflexive engagement a mass phenomenon.

The other overriding factor here is the fact that adolescents are still developing mentally, and thus are likely more susceptible to these problems.

Constant exposure to unachievable conditions. Back in my youth, you might watch an MTV show about how rich people lived, or leaf through a magazine like US Weekly, and be exposed to opulence and material excess. Or you might go on vacation and see how the other half lives if you took a tour of the Hollywood hills or whatever. You were perfectly well aware that rich people and their privileged lives existed. But then you turned off the show or you put down the magazine or your vacation ended, and unless you were born rich, you lived in an environment that of necessity was modest and real. Your friends might have lived in nice houses, but you didn’t see riches everywhere you looked, and your definition of what a hot girl looked like was mostly derived from the girls you went to school with. Your environment conditioned the scope of your desires.

Now, exposure to lifestyles that are completely unachievable is constant. Instagram is a machine for making you feel like whatever you’ve got isn’t enough. (That’s how it functions financially, through advertising idealized lives.) There are young people out there who have arranged their various feeds such that they’re always a few seconds away from seeing concerts they can’t attend, cars they can’t drive, houses they can’t live in, clothes they can’t wear, women they can’t fuck or whose bodies they can’t have, places they can’t travel to, food they can’t eat, and lives they can’t live. When I was young, if I wanted to see a picture of a Ferrari, I had to seek out a picture of a Ferrari. It was hard to see suggestive photos of intimidatingly hot women, which is why the Sports Illustrated swimsuit edition was a big deal. Mostly, the world around you was quotidian and its pleasures attainable. What can it be doing to these generations of young people, having completely unrealistic visions of what life is like being shoved into their brains all the time? How could their actual lives ever compare?

(Incidentally, I am thoroughly convinced that a majority of self-described incels are men who could find meaningful and fulfilling sexual and romantic success, both short-term and long, but who have developed such a wildly unrealistic idea about what actual human women look like that their standards are laughably high. And it’s easy to make fun of that, but I also think that the conditioning inherent to constantly looking at filtered and photoshopped pictures is powerful.)

November 10, 2020

QotD: The Smartphone, the Eater-of-Gadgets

Filed under: Economics, Quotations, Technology — Nicholas @ 01:00

I’ve been thinking for some time now that the smartphone has achieved a kind of singularity, becoming a black hole that sucks all portable electronics into itself. PDAs – absorbed. Music players – consumed. Handset GPSes – eaten. Travel-alarm clocks, not to mention ordinary watches – subsumed. Calculators – history. E-readers under serious pressure, and surviving only because e-paper displays have lower battery drain and are a bit larger. Compasses – munched. Pocket flashlights – crunched. Fobs for keyless locks – being scarfed down as we speak, though not gone yet.

[…]

But in an entertaining inversion, one device of the future actually works on smartphones now. Because I thought it would be funny, I searched for “tricorder” in the Android market. For those of you who have been living in a hole since 1965, a tricorder is a fictional gadget from the Star Trek universe, an all-purpose sensor package carried by planetary survey parties. I expected a geek joke, a fancy mock-up with mildly impressive visuals and no actual function. I was utterly gobsmacked to discover instead that I had an arguably real tricorder in my hand.

Consider. My Nexus One includes a GPS, an accelerometer, a microphone, and a magnetometer. That is, sensors for location, magnetic field, gravitational fields, and acoustic energy. Hook a bit of visualization and spectral analysis to these sensors, and bugger me with a chainsaw if you don’t have a tricorder. A quad- or quintcorder, actually.
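
[A minimal sketch of what “hook a bit of visualization and spectral analysis to these sensors” could look like, in Python. The microphone samples are synthesized here, standing in for a real sensor feed, so every number below is purely illustrative:]

    import numpy as np

    # Synthesize one second of "microphone" samples: a 440 Hz tone
    # buried in noise, standing in for a real acoustic sensor feed.
    SAMPLE_RATE = 8000  # Hz (assumed; real phone mics often run at 44100)
    t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
    samples = np.sin(2 * np.pi * 440 * t) + 0.5 * np.random.randn(SAMPLE_RATE)

    # Spectral analysis: magnitude of the real FFT with matching
    # frequency bins, then report the dominant frequency the way a
    # toy tricorder app might label an acoustic reading.
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(samples.size, d=1.0 / SAMPLE_RATE)
    peak = freqs[np.argmax(spectrum)]
    print(f"Dominant acoustic frequency: {peak:.0f} Hz")  # ~440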

And these sensors are already completely stock on smartphones because sensor electronics is like any other kind; amortized over a large enough production run, their incremental cost approaches epsilon because most of their content is actually design information (cue the shade of Bucky Fuller talking about ephemeralization). Which in turn points at the fundamental reason the smartphone is Eater-of-Gadgets; because, as the tricorder app deftly illustrates, the sum of a computer and a bunch of sensors costing epsilon is so synergistically powerful that it can emulate not just real single-purpose gadgets but gadgets that previously existed only as science fiction!

[…]

I specified “personal” radios because radios have something in common with personal computers; their main design constraints are actually constraints on a peripheral stage. For a computer you’ll be using for hours at a time you really want a full-sized hard keyboard and a display bigger than a smartphone’s; for a really good radio, the kind you supply sound for a party with, you need speakers with resonant cavities that won’t fit in a smartphone enclosure.

Digital cameras are another diagnostic case. The low-end camera with small lenses is already looking like a goner; the survivors will be DSLRs and more generally those with precision optics too large and too expensive to fit in a phone case.

These two examples suggest Raymond’s Rule of Smartphone Subsumption: if neither the physics nor the ergonomics of a gadget’s function require peripherals larger than will fit in a smartphone case, the smartphone will eat it!

Eric S. Raymond, “Smartphone, the Eater-of-Gadgets”, Armed and Dangerous, 2010-07-16.

September 21, 2019

QotD: Smartphones

Filed under: Quotations, Technology — Nicholas @ 01:00

My smartphone, an out-of-date and memory-challenged iPhone, is so far and away the most incredible thing I’ve ever owned that I wouldn’t know how to pick a runner-up. I might never own a better camera, except in a replacement. It tunes my guitar vastly better than my guitar tuner. It monitors me from space and guides me to my destination, adjusting the route for traffic congestion. I speak English into it and it translates back in any language I want. It streams more entertainment than I could ever consume. If 21-year-old me could see 41-year-old me today, he would wonder (a) how I could possibly afford the thing, and (b) how I ever found time to stop gawking at it and go to work.

Chris Selley, “Silly to blame the smartphone for everything we’ve done wrong with it”, National Post, 2017-08-08.

June 6, 2019

iTunes is dead – “There will be no funeral, because it had no friends”

Filed under: Business, Technology — Nicholas @ 03:00

I use iTunes because I have to, not because I particularly want to. Apparently that’s not uncommon among iPhone users:

iTunes, Apple’s Frankenstein’s monster of an MP3-player-cum-record store-cum-video-store-cum-iPhone-updater-cum-random-task-performer, a piece of software which opens on your computer whenever it wants and which seems to require you to download an updated version every eight hours, was pronounced dead on Monday. It was 19 years old. There will be no funeral, because it had no friends.

Apple CEO Tim Cook announced that in its future operating systems, iTunes will be replaced by three separate programs: One for music (Apple Music), one for podcasts (Apple Podcasts) and one for video (Apple TV). Updating your phone — which never had anything to do with music, podcasts or video — will now be a function of the operating system. This sounds promising. It sounds normal.

But the mystery remains how Apple, of all companies, found itself sullying its machines for so long with iTunes’ wretched presence. By the end iTunes wasn’t just bad, it was fascinatingly bad — a “toxic hellstew,” as programmer Marco Arment put it in 2015. It was a master class in bad user experience from a company whose brand is excellent user experience: Put your trust in Apple’s machines and its native apps and everything will just work. There are no viruses, no blue screens of death, no pre-installed junkware popping up all over your brand-new desktop. Things just show up where they’re supposed to be. Mac’s user interface is so vastly superior to Windows’ that it seems ridiculous even to compare them. They’re both operating systems in the sense that the stick-shift on a Yugo and the flappy paddles on a Ferrari are both transmissions. Yet by 2015 one of Apple’s essential apps wasn’t just horrid to look at and baffling to use — it couldn’t even store and play people’s MP3s properly.

I never experienced the horror stories myself; [lucky bastard!] the idea of buying music from Apple and, because of its aggressive digital rights management, not even getting an MP3 file with which I could do what I liked always struck me as daft. But the Internet is full of tales of woe from people who entrusted their music collections to Apple and got royally screwed. iTunes would make curatorial decisions all by itself: If you bought Neil Young’s 1977 compilation album Decade, but already had On the Beach in your library, it might just decide not to include Walk On and Tired Eyes on your version of Decade. Or it might delete them from On the Beach, depending on its mood.

This was presumptuous and annoying, but at least somewhat explicable: iTunes consumers were far more singles-focused than album-focused. (Indeed the app is widely credited with ending the “age of the album.”) Less explicable were reports of Apple Music replacing people’s legacy music collections — songs they had ripped from CDs and entrusted to iTunes — with new downloads. People spoke of entire collections being corrupted or lost overnight. People reported that their libraries looked nothing alike on their various Apple devices. At one point, apparently under the impression that not many people loathe U2, Apple famously went ahead and beamed one of the band’s new snorefests onto everyone’s iTunes without asking.

My experiences with iTunes have been mostly of the minor irritant variety: disappearing songs, paid-for tracks that refused to play on certain devices, and songs showing up in playlists that they don’t belong to, for example. But at least — most of the time — the non-Apple songs were not randomly deleted from my library. Not too often, anyway.

March 3, 2019

QotD: Four ways to corporate monopoly

1. Proprietary technology. This one is straightforward. If you invent the best technology, and then you patent it, nobody else can compete with you. Thiel provocatively says that your technology must be 10x better than anyone else’s to have a chance of working. If you’re only twice as good, you’re still competing. You may have a slight competitive advantage, but you’re still competing and your life will be nasty and brutish and so on just like every other company’s. Nobody has any memory of whether Lycos’ search engine was a little better than AltaVista’s or vice versa; everybody remembers that Google’s search engine was orders of magnitude above either. Lycos and AltaVista competed; Google took over the space and became a monopoly.

2. Network effects. Immortalized by Facebook. It doesn’t matter if someone invents a social network with more features than Facebook. Facebook will be better than theirs just by having all your friends on it. Network effects are hard because no business will have them when it first starts. Thiel answers that businesses should aim to be monopolies from the very beginning – they should start by monopolizing a tiny market, then moving up. Facebook started by monopolizing the pool of Harvard students. Then it scaled up to the pool of all college students. Now it’s scaled up to the whole world, and everyone suspects Zuckerberg has somebody working on ansible technology so he can monopolize the Virgo Supercluster. Similarly, Amazon started out as a bookstore, gained a near-monopoly on books, and used all of the money and infrastructure and distribution it won from that effort to feed its effort to monopolize everything else. Thiel describes how his own company PayPal identified eBay power sellers as its first market, became indispensable in that tiny pool, and spread from there.

3. Economies of scale. Also pretty straightforward, and especially obvious for software companies. Since the marginal cost of a unit of software is near-zero, your cost per unit is the cost of building the software divided by the number of customers. If you have twice as many customers as your nearest competitor, you can charge half as much money (or make twice as much profit), and so keep gathering more customers in a virtuous cycle. [The arithmetic is sketched below, after the quote.]

4. Branding. Apple is famous enough that it can charge more for its phones than Amalgamated Cell Phones Inc, even for comparable products. Partly this is because non-experts don’t know how to compare cell phones, and might not trust Consumer Reports style evaluations; Apple’s reputation is an unfakeable sign that their products are pretty good. And partly it’s just people paying extra for the right to say “I have an iPhone, so I’m cooler than you”. Another company that wants Apple’s reputation would need years of successful advertising and immense good luck, so Apple’s brand separates it from the competition and from the economic state of nature.

Scott Alexander, “Book Review: Zero to One”, Slate Star Codex, 2019-01-31.
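
The arithmetic behind point 3’s virtuous cycle is easy to make concrete. Here is a minimal sketch in Python; the build cost and marginal cost are made-up numbers for illustration, not anyone’s real figures:

    # Hypothetical figures: a large fixed cost to build the software,
    # near-zero marginal cost to serve each additional copy.
    BUILD_COST = 10_000_000  # dollars (assumed)
    MARGINAL_COST = 0.05     # dollars per copy served (assumed)

    def cost_per_unit(customers: int) -> float:
        # Per-unit cost: the fixed build cost spread over the customer
        # base, plus the (tiny) marginal cost of one more copy.
        return BUILD_COST / customers + MARGINAL_COST

    for n in (100_000, 200_000, 400_000):
        print(f"{n:>7} customers -> ${cost_per_unit(n):.2f} per unit")
    # Each doubling of the customer base roughly halves the per-unit
    # cost -- the virtuous cycle Thiel describes.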

February 5, 2018

The Apple iPhone … productivity killer

Filed under: Economics, Humour — Nicholas @ 03:00

Tim Harford explains why the first world’s productivity gains have stalled and even gone into reverse since the Apple iPhone was introduced:

A few weeks before Christmas, an impish chart appeared on the Bank of England’s unofficial blog. It compared plunging productivity with the soaring shipments of smartphones. Typical productivity growth in advanced economies had hovered steadily around 1 per cent a year for several decades, but has on average been negative since 2007. That was the year the iPhone started to ship.

Nobody really believes that the iPhone caused the productivity slowdown — a more obvious culprit would be the global financial crisis — but it is hard to find people who think that their phones are an unalloyed blessing. If in 1968 an economist or computer scientist had been told that 50 years later we would all be carrying wirelessly networked supercomputers in our pockets, he or she would have been staggered at the potential. I doubt they would have realised quite how much time we would spend liking Instagram posts, playing Pokémon Go and sending each other digital interruptions.

The costs of this distraction are starting to become apparent. I wrote recently about the research of Gloria Mark of the University of California, Irvine. Prof Mark argues that reorientating yourself after an interruption tends to take between 20 and 25 minutes. We all know how a moment’s inattention can turn into a clickhole of distractions. She also points out that once we get used to being interrupted by others, we start interrupting ourselves, twitchily checking email or social media in the hope something interesting might turn up.

December 17, 2017

Thomas Train Stunts

Filed under: Randomness — Nicholas @ 02:00

5MadMovieMakers
Published on 4 Dec 2017

Thomas the Tank Engine goes pro skater and pulls off some sick jumps with his train friends. Filmed with an iPhone SE at 120 frames per second.

September 26, 2017

Steve Chapman – “The Unabomber had a point”

Filed under: History, Technology — Nicholas @ 03:00

In his Chicago Tribune column, Steve Chapman does his very best “grumpy old man yelling at a cloud” imitation, while the headline writer goes one step further:

The iPhone X proves the Unabomber was right

The introduction of the new iPhone X — which features wireless charging, facial recognition and a price tag of $999 — appears to be a minor event in the advance of technology. But it’s an excellent illustration of something that has long gone unrecognized: The Unabomber had a point.

Not about blowing people up in an effort to advance his social goals. Ted Kaczynski’s campaign to kill and maim chosen victims with explosives was horrific in the extreme and beyond forgiveness. But his 35,000-word manifesto, published in 1995, provided a glimpse of the future we inhabit, and his foresight is a bit unsettling.

“The Industrial Revolution and its consequences have been a disaster for the human race,” it begins. Among the ills he attributes to advances in technology are that they promise to improve our lives but end up imposing burdens we would not have chosen.

He cites the automobile, which offered every person the freedom to travel farther and faster than before. But as cars became more numerous, they became a necessity, requiring great expense, bigger roads and more regulations. Cities were designed for the convenience of drivers, not pedestrians. For most people, driving is no longer optional.

Smartphones have followed the same pattern. When cellphones first appeared, they gave people one more means of communication, which they could accept or reject. But before long, most of us began to feel naked and panicky anytime we left home without one.

He also comes up with a non-Unabomber book that kinda-sorta supports the point he’s trying to make, I think:

The problem is hardly a new one. In his book Sapiens: A Brief History of Humankind, Yuval Noah Harari argues that the agricultural revolution that took place 10,000 years ago was “history’s biggest fraud.”

In the preceding 2.5 million years, when our ancestors lived as hunter-gatherers, they worked less, “spent their time in more stimulating and varied ways, and were less in danger of starvation and disease” than afterward.

Farming boosted the population but chained humans to the land and demanded ceaseless drudgery to plant, tend, harvest and process food — while making us more vulnerable to famine, disease and war. People who had evolved over eons for one mode of life were pushed into a different mode at odds with many of their natural instincts.

Our distant pre-agricultural ancestors may have worked less than their post-agricultural descendants, but they could hardly be said to have lived lives of leisure and plenty. They lived in very small family groups because, without advanced technology, they were limited to what could be hunted or gathered in a small region, and they had very few portable possessions because they generally had to move frequently as the availability of food dictated. Once a group switched from a nomadic to a fixed lifestyle, “work” became how most of its members would spend the vast majority of their lives. Hunter-gatherers’ lives were not stunted by the toil that farmers had to put in, and because they had no fixed territory to defend, they had no need of a warrior class or caste to protect farmland, and no king or chief or overlord to “protect” them from other groups’ kings or chiefs or overlords. The advantage of the farmers over the nomads was that farmers could build up a surplus of food to tide them over when food was scarce – nomads would have to move on to find new hunting grounds.

August 5, 2017

“… theirs is a generation shaped by the smartphone and by the concomitant rise of social media. I call them iGen”

Filed under: Technology, USA — Nicholas @ 04:00

In The Atlantic, “Gen Xer” Jean Twenge is worried about the ways the new generation differ from their Millennial predecessors:


I’ve been researching generational differences for 25 years, starting when I was a 22-year-old doctoral student in psychology. Typically, the characteristics that come to define a generation appear gradually, and along a continuum. Beliefs and behaviors that were already rising simply continue to do so. Millennials, for instance, are a highly individualistic generation, but individualism had been increasing since the Baby Boomers turned on, tuned in, and dropped out. I had grown accustomed to line graphs of trends that looked like modest hills and valleys. Then I began studying Athena’s generation.

Around 2012, I noticed abrupt shifts in teen behaviors and emotional states. The gentle slopes of the line graphs became steep mountains and sheer cliffs, and many of the distinctive characteristics of the Millennial generation began to disappear. In all my analyses of generational data — some reaching back to the 1930s — I had never seen anything like it.

At first I presumed these might be blips, but the trends persisted, across several years and a series of national surveys. The changes weren’t just in degree, but in kind. The biggest difference between the Millennials and their predecessors was in how they viewed the world; teens today differ from the Millennials not just in their views but in how they spend their time. The experiences they have every day are radically different from those of the generation that came of age just a few years before them.

What happened in 2012 to cause such dramatic shifts in behavior? It was after the Great Recession, which officially lasted from 2007 to 2009 and had a starker effect on Millennials trying to find a place in a sputtering economy. But it was exactly the moment when the proportion of Americans who owned a smartphone surpassed 50 percent.

The more I pored over yearly surveys of teen attitudes and behaviors, and the more I talked with young people like Athena, the clearer it became that theirs is a generation shaped by the smartphone and by the concomitant rise of social media. I call them iGen. Born between 1995 and 2012, members of this generation are growing up with smartphones, have an Instagram account before they start high school, and do not remember a time before the internet. The Millennials grew up with the web as well, but it wasn’t ever-present in their lives, at hand at all times, day and night. iGen’s oldest members were early adolescents when the iPhone was introduced, in 2007, and high-school students when the iPad entered the scene, in 2010. A 2017 survey of more than 5,000 American teens found that three out of four owned an iPhone.

The advent of the smartphone and its cousin the tablet was followed quickly by hand-wringing about the deleterious effects of “screen time.” But the impact of these devices has not been fully appreciated, and goes far beyond the usual concerns about curtailed attention spans. The arrival of the smartphone has radically changed every aspect of teenagers’ lives, from the nature of their social interactions to their mental health. These changes have affected young people in every corner of the nation and in every type of household. The trends appear among teens poor and rich; of every ethnic background; in cities, suburbs, and small towns. Where there are cell towers, there are teens living their lives on their smartphone.

It could be that the widespread use of available technology to “virtualize” adolescence is at least to a degree a reaction to helicopter parenting.

H/T to Kate at Small Dead Animals for the link.

September 9, 2016

Apple may learn the lesson of “option value” with the latest iPhone release

Filed under: Business, Economics, Technology — Nicholas @ 03:00

Megan McArdle explains why some Apple fans are not overjoyed at the latest iPhones:

You’ve probably been thinking to yourself, “Gee, I wish I couldn’t charge my phone while also listening to music.” Or perhaps, “Gosh, if only my headphones were more expensive, easier to lose and required frequent charging.” If so, you’re in luck. Apple’s newest iPhone, unveiled on Wednesday, lacks the familiar 3.5-millimeter headphone jack. You can listen to music through the same lightning jack that you charge the phone with, or you can shell out for wireless headphones. The internets have been … unpleased with this news.

To be fair, there are design reasons for doing this. As David Pogue writes, the old-fashioned jack is an ancient piece of technology. It’s been around for more than 50 years. “As a result,” says Pogue, “it’s bulky — and in a phone, bulk = death.”

Getting rid of this ancient titan will make for a thinner phone or leave room for a bigger battery. Taking a hole out of the phone also makes it easier to waterproof. And getting rid of the jack removes a possible point of failure, since friction isn’t good for parts.

For people who place a high value on a thin phone, this is probably a good move; they’ll switch to wireless earbuds or use the lightning jack. But there are those of us who have never dropped our phones in the sink. We replace our iPhones when the battery dies, an event that tends to occur long before the headphone jack breaks. There are people in the world who take their phones on long trips, requiring them to charge them while making work calls, and they won’t want to fumble around for splitters or adapters. Some of us do not care whether our phone is merely fashionably slender or outright anorexic. For these groups, Apple’s move represents a trivial gain for a large loss: the vital commodity that economists call option value.

Option value is basically what it sounds like. The option to do something is worth having, even if you never actually do it. That’s because it increases the range of possibility, and some of those possibilities may be better than your current alternatives. My favorite example of option value is the famous economist who told me that he had tried to argue his wife into always ordering an extra entree, one they hadn’t tried before, when they got Chinese takeout. Sure, that extra entree cost them money. And sure, they might not like it. But that entree had option value embedded in it: they might discover that they like the new entree even better than the things they usually ordered, and thereby move the whole family up to a higher valued use of their Chinese food dollars.
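
The economist’s extra-entree argument reduces to a simple expected-value calculation. A back-of-the-envelope sketch in Python, where the price, the probability, and the payoff are all made-up numbers for illustration:

    # Made-up numbers illustrating the extra-entree argument. The new
    # dish costs $12. With probability p you like it better than your
    # usual order, and that discovery is worth some ongoing value V
    # (a better default for every future order); otherwise the $12 is lost.
    ENTREE_COST = 12.0        # assumed price of the experimental dish
    P_BETTER = 0.2            # assumed chance it beats your usual order
    DISCOVERY_VALUE = 100.0   # assumed present value of a better default

    expected_gain = P_BETTER * DISCOVERY_VALUE - ENTREE_COST
    print(f"Expected value of the experiment: ${expected_gain:.2f}")
    # Positive here (+$8.00): the option embedded in the extra entree
    # is worth more than its price, even though most experiments "fail".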

April 6, 2016

The differences between the Boomers, Gen-Xers and Millennials

Filed under: Humour — Nicholas @ 02:00

Differences between the generations

Seen at http://ratak-monodosico.tumblr.com/post/142242804250

March 29, 2016

Why did Apple suddenly grow a pair over consumer privacy and (some) civil rights?

Filed under: Business, Technology, USA — Nicholas @ 03:00

Charles Stross has a theory:

A lot of people are watching the spectacle of Apple vs. the FBI and the Homeland Security Theatre and rubbing their eyes, wondering why Apple (in the person of CEO Tim Cook) is suddenly the knight in shining armour on the side of consumer privacy and civil rights. Apple, after all, is a goliath-sized corporate behemoth with the second largest market cap in US stock market history — what’s in it for them?

As is always the case, to understand why Apple has become so fanatical about customer privacy over the past five years that they’re taking on the US government, you need to follow the money.

[…]

Apple see their long term future as including a global secure payments infrastructure that takes over the role of Visa and Mastercard’s networks — and ultimately of spawning a retail banking subsidiary to provide financial services directly, backed by some of their cash stockpile.

The FBI thought they were asking for a way to unlock a mobile phone, because the FBI is myopically focussed on past criminal investigations, not the future of the technology industry, and the FBI did not understand that they were actually asking for a way to tracelessly unlock and mess with every ATM and credit card on the planet circa 2030 (if not via Apple, then via the other phone OSs, once the festering security fleapit that is Android wakes up and smells the money).

If the FBI get what they want, then the back door will be installed and the next-generation payments infrastructure will be just as prone to fraud as the last-generation card infrastructure, with its card skimmers and identity theft.

And this is why Tim Cook is willing to go to the mattresses with the US department of justice over iOS security: if nobody trusts their iPhone, nobody will be willing to trust the next-generation Apple Bank, and Apple is going to lose their best option for securing their cash pile as it climbs towards the stratosphere.

