September 27, 2018
Mind Your Business #4: Free the Unikrn
Foundation for Economic Education
Published on 25 Sep 2018
Forget about slot machines: the future of gaming is virtual reality! In this episode of Mind Your Business, Andrew Heaton teams up with entrepreneur Rahul Sood to learn all about esports, safe and legal online betting, and the global community that is surging behind organized competitive video gaming.
September 24, 2018
Verity Stob on early GUI experiences
“Verity Stob” began writing about technology issues three decades back. She reminisces about some things that have changed and others that are still irritatingly the same:
It’s 30 years since .EXE Magazine carried the first Stob column; this is its pearl (make that Perl) anniversary. Rereading article #1, a spoof self-tester in the Cosmo style, I was struck by how distant the world it invoked seemed. For example:
Your program requires a disk to have been put in the floppy drive, but it hasn’t. What happens next?
The original answers, such as:
e) the program crashes out into DOS, leaving dozens of files open
would now need to be supplemented by
f) what’s ‘the floppy drive’ already, Grandma? And while you’re at it, what is DOS? Part of some sort of DOS and DON’TS list?
I say: sufficient excuse to present some Then and Now comparisons with those primordial days of programming, to show how much things have changed – or not.
1988: Drag-and-drop was a showy-offy but not-quite-satisfactory technology.
My first DnD encounter was by proxy. In about 1985 my then boss, a wise and cynical old Brummie engineer, attended a promotional demo, free wine and nibbles, of an exciting new WIMP technology called GEM. Part of the demo involved using the on-screen trash icon to delete files.
According to Graham’s gleeful report, he stuck up his hand at this point. “What happens if you drag the clock into the wastepaper basket?”
The answer turned out to be: the machine crashed hard on its arse, and it needed about 10 minutes’ embarrassed fiddling to coax it back onto its feet. At which point Graham’s arm went up again. “What happens if you drop the wastepaper basket into the clock?”
Drag-ons ‘n’ drag-offs
GEM may have been primitive, but it was at least consistent.
The point became moot a few months later, when Apple won a look-and-feel lawsuit and banned the GEM trashcan outright.
2018: Not that much has changed. Windows Explorer users: how often has your mouse finger proved insufficiently strong to grasp the file, so that you accidentally dropped the document you wanted into a deep thicket of nested server directories?
Or how about touch interface DnD on a phone, where your skimming pinkie exactly masks from view the dragged thing?
Well then.
However, I do confess admiration for this JavaScript library that aims to make dragging and dropping accessible to the blind. Can’t fault its ambition.
September 13, 2018
Mind Your Business Ep. 2: Aceable in the Hole
Foundation for Economic Education
Published on 11 Sep 2018
Believe it or not, parallel parking is not an impossible task. Meet Blake Garrett, the entrepreneur who is using VR to teach people how to drive, without actually getting behind the wheel.
____________
Produced & Directed by Michael Angelo Zervos
Executive Produced by Sean W. Malone
Hosted by Andrew Heaton
Original Music by Ben B. Goss
Featuring Blake Garrett
August 16, 2018
In praise of Donald Knuth
David Warren sings the praises of the inventor of “TeX”:
Among my heroes in that trade is a man now octogenarian, a certain Donald Knuth, author of the multi-volumed Art of Computer Programming, and of the great mass of algorithms behind the “TeX” composing system. A life-long opponent of patenting for software, and still not on email, he is one of the finer products of the Whole Earth Catalogue mindset of that era, though as a devoted Christian, he had it from older sources. (The mindset of: forget politics and do-it-yerself.)
Perfesser Knuth’s life journey was somewhat altered when a publisher presented him with the galley proofs for a reissue of one of his earlier volumes. They were, compared to the pages of the original hot-metal edition, a dog’s breakfast. In particular, even when technically correct, the mathematical formulae appeared to have been set by monkeys. He resolved to “make the world a better place” by doing something about this.
Knowing (pronounce the “k” as we do in this author’s surname) that computers can do many things that humans can’t — or can’t within one lifetime — he set about designing the computer processes to calculate beautiful letter and word spacings, line-breaks, line spacings, marginal proportions and such. He understood that civilization depends on literacy, literacy on legibility, and legibility on elegance. Ruthlessly, he recognized that things like “widowed” and “orphaned” lines of text are moral evils, and discovered algorithms that could exterminate them by complex anticipation. Too, he contributed to the counter-revolution by which the letters themselves could be drawn not pixelated.
I will quickly lose my few remaining readers if I go into the details. But here was a man (and still is) who discerned that nature herself is built on aesthetic principles, which men can investigate and apply. It is when something is ugly that we can know that it is wrong. Mathematicians, like poets and other artists, can embody the Faith at the root of this.
To my mind, or I would rather say K-nowledge, there is nothing wrong with technology, per se. We can often do things better with new tools. But we must be guided by the uncompromising demands of Beauty. Everything must be made as beautiful as we can make it: there must be no wavering, no surrender. All that is ugly must be consigned to Hell.
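The “complex anticipation” Warren alludes to is the idea behind what became the Knuth-Plass paragraph breaker: rather than filling each line greedily and hoping for the best, TeX considers every feasible set of break points for the whole paragraph and picks the set with the least total “badness”, by dynamic programming. A minimal sketch of that idea in Python, assuming a fixed-width font and scoring a line only by its squared leftover space (the real algorithm also weighs hyphenation, stretchable glue, and penalties for widows and orphans):

def break_paragraph(words, width):
    """Choose line breaks minimizing total squared leftover space."""
    n = len(words)
    best = [0.0] + [float("inf")] * n   # best[j] = minimal badness of words[:j]
    prev = [0] * (n + 1)                # back-pointers to recover the breaks

    for j in range(1, n + 1):
        line_len = -1                   # length of words[i:j] joined by spaces
        for i in range(j - 1, -1, -1):
            line_len += len(words[i]) + 1
            if line_len > width:
                break                   # line overfull; stop widening it
            # the final line of a paragraph is traditionally left ragged
            badness = 0.0 if j == n else float((width - line_len) ** 2)
            if best[i] + badness < best[j]:
                best[j] = best[i] + badness
                prev[j] = i

    lines, j = [], n                    # walk the back-pointers to the start
    while j > 0:
        lines.append(" ".join(words[prev[j]:j]))
        j = prev[j]
    return lines[::-1]

text = "It is when something is ugly that we can know that it is wrong"
print("\n".join(break_paragraph(text.split(), 20)))

The point of the dynamic program is that it will accept a slightly loose line early in the paragraph to avoid a terrible one later, which is exactly how widowed and orphaned lines get exterminated “by complex anticipation”.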
June 5, 2018
The Internet-of-Things as “Moore’s Revenge”
El Reg‘s Mark Pesce on the end of Moore’s Law and the start of Moore’s Revenge:
… the cost of making a device “smart” – whether that means, aware, intelligent, connected, or something else altogether – is now trivial. We’re therefore quickly transitioning from the Death of Moore’s Law into the era of Moore’s Revenge – where pretty much every manufactured object has a chip in it.
This is going to change the whole world, and it’s going to begin with a fundamental reorientation of IT, away from the “pinnacle” desktops and servers, toward the “smart dust” everywhere in the world: collecting data, providing services – and offering up a near infinity of attack surfaces. Dumb is often harder to hack than smart, but – as we saw last month in the Z-Wave attack that impacted hundreds of millions of devices – once you’ve got a way in, enormous damage can result.
The focus on security will produce new costs for businesses – and it will be on IT to ensure those costs don’t exceed the benefits of this massively chipped-and-connected world. It’ll be a close-run thing.
It’s also likely to be a world where nothing works precisely as planned. With so much autonomy embedded in our environment, the likelihood of unintended consequences amplifying into something unexpected becomes nearly guaranteed.
We may think the world is weird today, but once hundreds of billions of marginally intelligent and minimally autonomous systems start to have a go, that weirdness will begin to arc upwards exponentially.
May 12, 2018
QotD: Women in I.T.
… any woman who wants to be in a STEM field should be able to get as far as talent, hard work, and desire to succeed will take her, without facing artificial barriers erected by prejudice or other factors. If there are women who dream of being in STEM but have felt themselves driven off that path, the system is failing them. And the system is failing itself, too; talent is not so common that we can afford to waste it.
Now I’m going to refocus on computing, because that’s what I know best and I think it exhibits the problems that keep women out of STEM fields in an extreme form. There’s a lot of political talk that the tiny and decreasing number of women in computing is a result of sexism and prejudice that has to be remedied with measures ranging from sensitivity training up through admission and hiring quotas. This talk is lazy, stupid, wrong, and prevents correct diagnosis of much more serious problems.
I don’t mean to deny that there is still prejudice against women lurking in dark corners of the field. But I’ve known dozens of women in computing who wouldn’t have been shy about telling me if they were running into it, and not one has ever reported it to me as a primary problem. The problems they did report were much worse. They centered on one thing: women, in general, are not willing to eat the kind of shit that men will swallow to work in this field.
Now let’s talk about death marches, mandatory uncompensated overtime, the beeper on the belt, and having no life. Men accept these conditions because they’re easily hooked into a monomaniacal, warrior-ethic way of thinking in which achievement of the mission is everything. Women, not so much. Much sooner than a man would, a woman will ask: “Why, exactly, am I putting up with this?”
Correspondingly, young women in computing-related majors show a tendency to bail out that rises directly with their comprehension of what their working life is actually going to be like. Biology is directly implicated here. Women have short fertile periods, and even if they don’t consciously intend to have children their instincts tell them they don’t have the option young men do to piss away years hunting mammoths that aren’t there.
There are other issues, too, like female unwillingness to put up with working environments full of the shadow-autist types that gravitate to programming. But I think those are minor by comparison, too. If we really want to fix the problem of too few women in computing, we need to ask some much harder questions about how the field treats everyone in it.
Eric S. Raymond, “Women in computing: first, get the problem right”, Armed and Dangerous, 2010-07-15.
March 24, 2018
James May doesn’t trust Sat Navs | Q&A extras | HeadSqueeze
BBC Earth Lab
Published on 27 Sep 2013
Don’t trust the Sat Nav! Speaking from experience, James thinks we shouldn’t blindly trust a machine. Get a map!
February 27, 2018
The notion of “uploading” your consciousness
Skeptic author Michael Shermer pours cold water on the dreams and hopes of Transhumanists, Cryonicists, Extropians, and Technological Singularity proponents everywhere:
It’s a myth that people live twice as long today as in centuries past. People lived into their 80s and 90s historically, just not very many of them. What modern science, technology, medicine, and public health have done is enable more of us to reach the upper ceiling of our maximum lifespan, but no one will live beyond ~120 years unless there are major breakthroughs.
We are nowhere near the genetic and cellular breakthroughs needed to break through the upper ceiling, although it is noteworthy that companies like Google’s Calico and individuals like Aubrey de Grey are working on the problem of ageing, which they treat as an engineering problem. Good. But instead of aiming for 200, 500, or 1000 years, try to solve very specific problems like cancer, Alzheimer’s and other debilitating diseases.
Transhumanists, Cryonicists, Extropianists, and Singularity proponents are pro-science and technology and I support their efforts, but extending life through technologies like mind-uploading not only cannot be accomplished anytime soon (centuries at the earliest), it can’t even do what its proponents claim: a copy of your connectome (the analogue to your genome that represents all of your memories) is just that—a copy. It is not you. This leads me to a discussion of…
The nature of the self or soul. The connectome (the scientific version of the soul) consists of all of your statically-stored memories. First, there is no fixed set of memories that represents “me” or the self, as those memories are always changing. If I were copied today, at age 63, my memories of when I was, say, 30, are not the same as they were when I was 50 or 40 or even 30 as those memories were fresh in my mind. And, you are not just your memories (your MEMself). You are also your point-of-view self (POVself), the you looking out through your eyes at the world. There is a continuity from one day to the next despite consciousness being interrupted by sleep (or general anaesthesia), but if we were to copy your connectome now through a sophisticated fMRI machine and upload it into a computer and turn it on, your POVself would not suddenly jump from your brain into the computer. It would just be a copy of you. Religions have the same problem. Your MEMself and POVself would still be dead and so a “soul” in heaven would only be a copy, not you.
Whether or not there is an afterlife, we live in this life. Therefore what we do here and now matters whether or not there is a hereafter. How can we live a meaningful and purposeful life? That’s my final chapter, ending with a perspective that our influence continues on indefinitely into the future no matter how long we live, and our species is immortal in the sense that our genes continue indefinitely into the future, making it all the more likely our species will not go extinct once we colonize the moon and Mars so that we become a multi-planetary species.
February 21, 2018
Transistors – The Invention That Changed The World
Real Engineering
Published on 12 Sep 2016
February 15, 2018
QotD: Computer models
How can one be certain about outcomes in a complex system that we’re not really all that good at modeling? Anyone who’s familiar with the history of macroeconomic modeling in the 1960s and 1970s will be tempted to answer “Umm, we can’t.” Economists thought that the explosion of data and increasingly sophisticated theory was going to allow them to produce reasonably precise forecasts of what would happen in the economy. Enormous mental effort and not a few careers were invested in building out these models. And then the whole effort was basically abandoned, because the models failed to outperform mindless trend extrapolation — or as Kevin Hassett once put it, “a ruler and a pencil.”
Computers are better now, but the problem was not really the computers; it was that the variables were too many, and the underlying processes not understood nearly as well as economists had hoped. Economists can’t run experiments in which they change one variable at a time. Indeed, they don’t even know what all the variables are.
This meant that they were stuck guessing from observational data of a system that was constantly changing. They could make some pretty good guesses from that data, but when you built a model based on those guesses, it didn’t work. So economists tweaked the models, and they still didn’t work. More tweaking, more not working.
Eventually it became clear that there was no way to make them work given the current state of knowledge. In some sense the “data” being modeled was not pure economic data, but rather the opinions of the tweaking economists about what was going to happen in the future. It was more efficient just to ask them what they thought was going to happen. People still use models, of course, but only the unflappable true believers place great weight on their predictive ability.
Megan McArdle, “Global-Warming Alarmists, You’re Doing It Wrong”, Bloomberg View, 2016-06-01.
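Hassett’s “ruler and a pencil” is nothing fancier than a straight-line fit extended forward, which is what made the comparison so damning. A toy version in Python, with growth figures invented purely for illustration:

import numpy as np

# "Mindless trend extrapolation": least-squares fit a line to the recent
# past and extend it one step. np.polyfit returns [slope, intercept].
growth = np.array([3.1, 2.8, 3.3, 2.9, 3.0, 3.2])  # invented annual figures, %
years = np.arange(len(growth))

slope, intercept = np.polyfit(years, growth, 1)    # degree-1 fit: the ruler
print(f"next year: {slope * len(growth) + intercept:.2f}%")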
December 12, 2017
Why Hold Music Sounds Worse Now
Tom Scott
Published on 27 Nov 2017
It’s not your imagination; hold music on phones really did sound better in the old days. Here’s why, as we talk about old telephone exchanges and audio compression.
Thanks to the Milton Keynes Museum, and their Connected Earth gallery: http://www.mkmuseum.org.uk/ – they’re also on Twitter as @mkmuseum, and on Facebook: https://www.facebook.com/mkmuseum/
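For the compression half of the story, the relevant fact (assuming the classic G.711 telephony setup of the kind the video discusses) is that a landline call is sampled at 8 kHz, so nothing above 4 kHz survives, and each sample is companded into 8 bits with μ-law, a scheme tuned for speech rather than music. A rough Python sketch of the μ-law round trip, showing the quantization error music has to live with:

import math

MU = 255.0  # mu-law constant used in North American and Japanese telephony

def mu_law_encode(x):
    """Compand a sample in [-1, 1] into one of 256 levels."""
    y = math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)
    return round((y + 1) / 2 * 255)

def mu_law_decode(code):
    """Expand an 8-bit code back to [-1, 1]; the error is the point."""
    y = code / 255 * 2 - 1
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

x = 0.123456
print(x, "->", round(mu_law_decode(mu_law_encode(x)), 6))  # close, not equal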
September 14, 2017
The art of leadership and other secrets
In Taki’s Magazine, Steve Sailer remembers the late Jerry Pournelle, including a helpful leadership tip he once shared:
I didn’t meet Jerry until 1999, but I’d known his son Alex in high school. The Pournelle family asked me to go with them to Kansas City in August 1976 to the science-fiction convention at which Heinlein, the central American sci-fi writer of the 20th century, received his lifetime achievement award. (But I had to be at college that week.)
But Jerry, one of the great Southern California Cold Warriors, had a remarkable number of careers, starting as a teenage artillery officer during the Korean War, which deafened him in one ear. (At the lunch table, he’d choose his seat carefully to position his one remaining good ear next to his guest.)
He once recalled a question from the Army Officer Candidate School test:
Q. You are in charge of a detail of 11 men and a sergeant. There is a 25-foot flagpole lying on the sandy, brush-covered ground. You are to erect the pole. What is your first order?
The right answer is:
A. “Sergeant, erect that flagpole.”
In other words, if the sergeant knows how to do it, then there’s no need for you to risk your dignity as an officer and a gentleman by issuing some potentially ludicrous order about how to erect the flagpole. And if the sergeant doesn’t know either, well, he’ll probably order a corporal to do it, and so forth down the chain of command. But by the time the problem comes back up to you, it will be well established that nobody else has any more idea than you do.
He also quotes Dave Barry’s breakdown of Pournelle’s monthly columns for Byte magazine:
In 1977 Jerry paid $12,000 to have a state-of-the-art personal computer assembled for him, supposedly to boost his productivity. By 1980 that led to his long-running “Chaos Manor” column in Byte magazine in which he would document his troubles on the bleeding edge of PC technology. As fellow word-processing aficionado Dave Barry explained jealously, Jerry got paid to mess around with his computers when he should be writing:
Every month, his column has basically the same plot, which is:
1. Jerry tries to make some seemingly simple change to one of his computers, such as connect it to a new printer.
2. Everything goes hideously wrong…. Sometimes there are massive power outages all over the West Coast. Poor Jerry spends days trying to get everything straightened out.
3. Finally…Jerry gets his computer working again approximately the way it used to, and he writes several thousand words about it for ‘Byte.’
I swear it’s virtually the same plot, month after month, and yet it’s a popular column in a magazine that appeals primarily to knowledgeable computer people.
I like to imagine Steve Jobs circulating “Chaos Manor” columns to his executives with scribbled annotations suggesting that some people would pay good money to not have to go through all this.
August 5, 2017
What are binary numbers? – James May’s Q&A (Ep 11100) – Head Squeeze
Published on 5 Jul 2013
James May asks “What are binary numbers, and why does my computer need them?”
Watch James getting confused here: http://youtu.be/8Kjf5x-1-_s
Binary: http://mathworld.wolfram.com/Binary.html
Counting in base 10: http://mae.ucdavis.edu/dsouza/Classes/ECS15-W13/counting.pdf
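(The episode number is the joke: 11100 is binary for 28. Each digit is worth the next power of two, so 1×16 + 1×8 + 1×4 + 0×2 + 0×1 = 28; or, in Python, int("11100", 2) gives 28.)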
March 15, 2017
QotD: How may I interrupt you next?
Q: What do Google, Facebook, Twitter, Apple, and Samsung all have in common?
A: Their business models involve interrupting you all day long.
Individually, each company’s interruptions are trivial. You can easily ignore them. But cumulatively, the interruptions from these and other companies can be crippling.
In the economy of the past, companies made money by being useful to customers. Now the biggest tech companies make their money by distracting you with ads and apps and notifications and whatnot. I don’t mean to sound like an alarmist, but I think this is the reason 80% of the adults I know are medicating. People are literally being driven crazy by a combination of complexity (too many choices) and the Interruption Economy.
There are days when my brain is flying in so many directions that I have to literally chant aloud what I need to do next in order to focus.
[…]
I’m wondering if you have as many distractions in your life. And if you do, can the chanting help you too? The next time you have a boring task that you know will be subject to lots of interruptions, try the chanting technique and let me know how it goes. It probably won’t cure your ADHD but it might help you ignore the tech industry’s distractions until you get your tasks done.
Bonus question: The economy has evolved from “How can I help you?” to “How can I distract you?” Can that trend lead anywhere but mass mental illness?
My hypothesis, based on observation alone, is that the business model of the tech industry, with its complexity, glut of options, and continuous interruptions, is literally driving people to mental illness.
Scott Adams, “The Interruption Economy”, Scott Adams Blog, 2015-07-07.