Lost Girl: A Guilty Pleasure You Shouldn’t Feel Guilty About



Images courtesy of Showcase Lost Girl site

We all have guilty pleasures. Maybe you’re a hardcore leftist intellectual whose partner publishes dense books on comparative religion and you read People Magazine on the sly; maybe you’re a professor of Music specializing in Medieval religious music and you follow Miley Cyrus on Twitter and have every one of her albums; maybe you’re an avowed fan and proponent of the detective novel as major literature, a member of the Baker Street Irregulars, and a regular speaker on the influence of Doyle on modern detective fiction, but you have every episode of Scooby Doo on your TiVo. I dunno what yours is; I just know that people have them.

For me it’s usually some TV show or other. I can rationalize it; for example, I can make a good case that my love of Kim Possible shows my feminist leanings, my support of girl empowerment, and come up with plenty of other pseudo-intellectual nonsense, but the truth is I watch it because it’s funny and Kim kicks ass.

But I want to mention one guilty pleasure that is in some ways truly remarkable: Lost Girl.

At first blush, this is your classic guilty pleasure. Vampires! Werewolves! Succubi! Conspiracy theories and lost civilizations and lots of fight scenes! Lots of hot women in tight leather outfits! Gratuitous low-neckline cleavage shots!  Girl-on-girl make-out sessions!

Bo and Lauren
See?  Told ya.

And let’s just stop there and back up a minute. Because here’s the thing:

From a perspective of how women are treated and how GLB (no trans characters that I can remember) relationships are treated, it’s one of the most level-headed shows I’ve ever seen.

The most obvious thing is who this show is about:  A woman.  And her female live-in, non-sexual best friend.  And the main character’s girlfriend.  And her main antagonists:  The leader of the “dark” folks (that’s what they call themselves)–also a woman–and her long-lost mother (yes, a woman).   (And oh, yeah; her sort-of boyfriend the werewolf.)

Seeing a pattern here?

The three main characters; what’s unusual for TV lead characters about this picture?

I haven’t even mentioned the many, many characters who are on for longer or shorter periods, like Linda Hamilton in a multi-show guest-starring role, or Rachel Skarsten as a real-live valkyrie, or . . . well, you get the point.  LOTS of women, and front and center.  This show passes the Bechdel test with ease (although I’m sure there must be an episode somewhere in its five-year run that doesn’t).

And as a middle-aged guy who has always been aggravated by the way women’s roles in film and TV seem divided into two classes (ingenue, and mom), I’m absolutely thrilled that the powerful, strong, independent, sexy (it has to be said; she’s playing a succubus, for Pete’s sake!), tough, absolutely kick-ass actress who plays the lead is over 40 and (in real life) a mom.  A middle-aged woman playing an independent person who isn’t mooning after some guy and isn’t just a mom?  Wow; who’d’a thunk?  And despite the “common wisdom” among Hollywood movie and TV types, it’s run for five seasons.  So put that in your sexist pipes and smoke it, you jerks!

Lead character, Bo, preparing to kick ass

And finally, I’m incredibly pleased by how unremarkable this show makes its treatment of gays, lesbians, and bisexuals.  The lead character is a bisexual woman who has had men, women, and sometimes both as partners and lovers.  Various other characters are straight, gay, or lesbian, and no one makes a big point of it; it’s just part of their character.  We don’t have situations like “Will and Grace” or “Ellen” or many other shows and movies where a big deal is made of the fact that this or that character is gay or lesbian or bi and oh my god shouldn’t we get a lot of credit for being so brave?  Nope; it’s just a natural part of how the characters are portrayed.  And in my opinion, that’s what we’re driving towards, right?  Where being GLBT is so normalized and unremarkable that we don’t, well, remark on it.  (And a lesbian actress plays a lesbian character; heaven forfend!)

Now yes, this show definitely falls into the “guilty pleasure” category in many ways.  Being Canadian, it can show more nudity than US programs, and it takes this as far as it can–lots of beautiful women and men in very revealing clothing.  (Oh yes; men too.  You should see the scene where Bo, the main character, visits her mother’s house and is served–and offered “services” by–her mother’s shirtless, tight-leather-pants-wearing, hunky Chippendales male “thralls”.)  Lots of cleavage and tight leather pants and sex scenes.  Not to mention plenty of fighting with swords and knives and fists, claws and crossbows, you name it.  Our Heroine has a trunk filled with weapons.

Beefcake on the hoof

So yes, “Lost Girl” is a guilty pleasure on one level, but on another, it’s quite a remarkable show.  If you at all like science fiction, fantasy, or strong, powerful, interesting lead characters, gender equality, and positively-presented (without a lot of self-congratulation) GLB characters and relationships, you might enjoy it, too.

Meetings, Smartphones, and You



Image courtesy of Mobility Digest

Recently I read an interesting post/piece of advice on LinkedIn about smartphone use during meetings in the corporate environment. The author, Travis Bradberry, provides a number of observations (mostly negative) and recommendations (ditto) regarding the use–or non-use, I really should say–of smartphones in meetings.  It’s a thoughtful article, but it misses a few points and, because I’m a blowhard, I thought I’d share.

I’ve been in high tech for a long time, and as you might expect from a bunch of nerds, we like to have the latest gear.  (I remember vividly a meeting in the late 90s, when the first PDAs came out; the meeting concluded, and then every nerd in the room gathered around the guy who had a new PDA, peppering him with questions, wanting to play with it.  We love our tech, we nerds.)  The point being that smartphones probably filtered into the meetings I attend in advance of, say, Wall St. banker meetings or Madison Avenue ad team meetings or whatnot.  By 2008, every nerd in high tech probably had an iPhone or a not-unreasonable facsimile.  So I’ve had plenty of real-world experience on how tech use and meetings collide.

Meetings are hard.  Not “hard” in the sense that working 14 hours in the heat of a tobacco field is hard, or down in a coal mine, or even driving a truck.  But the purpose of a meeting is to get agreement on the items this particular group of people has to decide on, or relay some critical information to a group.  And the hard part is getting those done without petty bickering; boring the majority of the people (all at once or in turn as topics come up that only one or two people care about); pedantic descent into arcane details (engineers do this a lot); not getting agreement; losing control of the meeting so the key information isn’t relayed; and on and on.

By far the biggest risk for a meeting attendee–particularly when you’re attending a meeting run by someone several levels above you in the hierarchy–is massive, profound, unbelievable boredom.  This isn’t anyone’s fault; if executives didn’t meet with the “individual contributors” (as we working stiffs are called), they would (rightly) be seen as “out of touch”, so they need to “address the troops” on some kind of regular basis.  The problem is that your typical executive sees the world from such a rarefied level–where everything is corporate profit and loss, meetings with other executives at other companies, trips to give talks at various industry events, meetings with high-level politicians, etc.–that they end up speaking of stuff that has very little to do with an IC’s day-to-day (or even year-to-year) life.  Sure, it’s important that they’re out there doing that stuff, getting government contracts, and so on, but when you’re writing code/error checking code/writing documentation/creating marketing collateral/selling to other companies/doing IT work/etc., that stuff, in a very real way, simply doesn’t matter to you.

Even when an exec is meeting with a small group, it’s important to remember that he (it’s almost always a “he”) has very little idea of what the people in the room with him do day-to-day.  In my field, I’ve met with many executives who had no idea what a tech writer even was, let alone what I did every day.  So as you might imagine, there’s a pretty big disconnect between the executive and the ICs in that room.  The executive wants to make contact, but the people are bored.  And what to do is always a challenge.  And your typical IC is constantly aware that every minute he or she spends in that room is one minute less spent fixing code/writing content/doing IT work/etc.  What to do?

Back in the day, people took notes in notebooks, on memo pads, on graph paper, etc.  Some physical method of keeping track of things.  And in those boring meetings, you could simply doodle, or work on your novel, or write sarcastic notes to yourself, or maybe polish off that thank-you note to granny.

Laptops, tablets, and smartphones are an absolute boon to the boring meeting issue.  If you can get away with bringing a full laptop–and this has become more acceptable over the years–and the meeting is such that your participation is unneeded other than your physical presence in the room, you can get work done, check your email, and even discreetly web surf (if you have the nerve).  That meeting time is much less wasted.  Yes, there was a big push to get people to leave their laptops behind for meetings, but over time people have recognized that a) It didn’t do much good, and b) Plenty of people take their notes on their laptops.  (In my case, I used an elective at age 12 in order to take typing; writing by hand was such a laborious chore for me.)

(The “Agile stand-up”, by the way, is one attempt to battle this from two directions. On the one hand, these meetings are limited to 15 minutes, guaranteeing to the participants that any boredom will be short-lived.  And since you’re literally supposed to be standing up, using a laptop is pretty much impossible.)

But smartphones (and BlackBerrys back in the day) allow you to do Internet stuff anywhere, with a tiny device.  And as we’ve reached the saturation point with smartphones in the population (and you can guess how saturated the high tech industry is!), people have come to use their smartphones instead.  And this is really honking off some people, as Mr. Bradberry points out.  Unfortunately, some of the suggestions he makes, and the assumptions behind them, bear a bit more examination.

For example, Bradberry points out “The more money people make the less they approve of smartphone use.” Alas, the more money people make, the higher up they usually are in the corporation, and those folks tend to use their smartphones more during meetings than anyone.  (Some of them seem to be using them as another way in which execs show their importance to the peons–“Your puny meeting is not nearly as important as my daughter’s Instagram pic that she just texted me, but please do carry on.”)  There are a couple of issues here, the most obvious of which is the blatant double-standard.

But to be blunt, one issue is that meetings are too frequent, too long, too boring, and include people who don’t need to be there. Executives and directors live by meetings–it’s a major part of their job–but individual contributors don’t, and forcing them to attend a ton of meetings is not an efficient use of their time. Certainly some amount of attendance is necessary to coordinate work, but in my experience the number and length of meetings is excessive. People break out their laptops, tablets, and smartphones in self-defense.  If you want to continue to see “productivity increases”, Mr. or Ms. Executive, you shouldn’t squawk when your employees are trying to squeeze in work during boring meetings.

Should people be playing Tetris or Minecraft or checking their Twitter feed while the VP is lecturing?  No; it’s rude.  But on the other hand, if the room falls asleep because the exec is speaking so far above their heads they can’t even see his tail-lights, that’s even more rude.  If you see a lot of smartphones out, you might want to reality-check your agenda, or engage with your folks more directly.

So the second part of this is: Executives need to recognize that individual contributors are not thrilled to be taking time out of their day to watch PowerPoint presentations and listen to (as Peter put it in “Office Space”) “eight different bosses drone on about mission statements”. Keep your meetings to the point, concise, and as short as absolutely possible. If you can end a scheduled 1-hour meeting in 20 minutes, your people will love you, and smartphone, tablet, and laptop use will plummet.

Bradberry cites some stats that I think are important to keep in mind:

  • 86% think it’s inappropriate to answer phone calls during meetings
  • 84% think it’s inappropriate to write texts or emails during meetings
  • 66% think it’s inappropriate to write texts or emails even during lunches offsite

I have to agree that people answering calls during meetings seems rude.  But you know: I’ve done it.  Because my boy had injured himself and I needed to respond right away, or because my wife was in a dire situation because her car had broken down on the freeway.  I would like to see some stats, but I don’t get the sense that people answer their smartphones for any reason other than critical ones during meetings.  And (again in my experience) they leave the room so as to provide minimum disturbance.  In high tech, this doesn’t seem to bug people very much.  And honestly I think that’s because tech folks are more used to tech, and they have started to create etiquette to deal with the new smartphone reality.

Smartphone etiquette is still evolving. It was once verboten for folks to bring laptops to meetings; then we went through a period where it seemed that everyone was bringing their laptops but no one was paying attention; then a period where laptop use in many companies was expressly forbidden during meetings (which was hell on me, as I noted above). But now some people bring them for note taking, presenting information, etc., and some don’t, and those that do seem to better recognize that they need to practice active listening even when the lid on their device is open. Soon, it won’t be an issue. Smartphone use in meetings will evolve similarly, I predict. Smartphones are really only 7 years old; it will take a little time.

So in short, yes, ICs need to be aware that it honks people off to be seen taking out your smartphone, even if you’re using it for note-taking. But managers and execs need to also recognize that meetings are seen by ICs as (at best) a necessary evil, and do their part to keep them short, to the point, and infrequent.

That’s my worm’s-eye view, anyway.  (And I’m not the only one who feels this way.)

Beauty, Physical Shapes, and Our Twisted Societal “Requirements”

Image courtesy of Box Six

Lots and lots and lots of people have written posts, articles, scholarly dissertations, and entire books about the standards of beauty that are pushed on our society by media from movies to newspapers to billboards to you name it.  It’s a bit presumptuous of me–to say the least–to think that I can add to that.  But recently a college/internet friend was lamenting the insanity of body image: just by losing a few pounds (not on purpose), she was suddenly getting lots of comments about how “skinny” she looked.

I’m glad that people are complimenting her, but the subtext here is really unfortunate. “Skinny” as the correct goal, as the only proper body type for a woman to have, as if those stick-like runway models you see posing in Calvin Klein or whatever are the epitome of feminine perfection.

It’s just as twisted in Hollywood, of course.  I’ve done a bit of research and found that the vast majority of Hollywood actresses are tall (few under 5’8″), slender, and small-busted (B or A cup sized).  When a woman gets up to a C cup, suddenly she’s “curvy”; you get over that and they start calling you “full-figured” or “voluptuous”, which in Hollywood-speak–but not in the real world!–means “overweight”.  In a culture that calls Jessica Alba (5’7″, 125 pounds, B-cup) “curvy”, you know something is out of whack.  And women that are Belle Epoque-style like Christina Hendricks or Kat Dennings stand out so much that you can hardly read a profile of them without their size being mentioned.  It’s insane.

(The thing that struck me is that in a country where the average height of a woman is 5’4″-5’5″, anyone under 5’7″ is considered “short”.  I keep thinking Rachel McAdams is short, but she’s 5’5″, for pete’s sake!  And when you watch “Firefly”, Christina Hendricks looks like she’s quite short, but in reality she’s 5’8″–she’s just surrounded by a bunch of men who are all over 6′.  Which is a whole different conversation.)

I once opined to a friend of mine that I really appreciated seeing actresses that were built like women, with busts and hips and whatnot.  She almost hit me.  “Plenty of women are naturally thin!” she said.  “There’s no one build!”  And she was absolutely right and I had been stupid, and that’s basically the point of this whole rant:  We come in all sizes, and it doesn’t matter a damn bit what your size is or even how you look.  I have dated women varying from 6′ tall and very slender to under 5′ and very curvy, and who cares, you know?

When you love someone, or even just like hanging out with them, you will stop focusing on the inevitable imperfections–my chipped tooth; my thinning, graying hair; my over-abundance of stomach fat; my lack of much of a chest or a six-pack (which I’ve never had); etc.–and start seeing mainly, or even only, those things that you like.  Maybe, like an ex-gf, you are fascinated by the lion-like yellowish tinge at the center of my irises, or maybe you find my chip-toothed smile endearing, or maybe my perpetually rumpled look makes you feel comfortable.  Whatever.  It doesn’t matter, ultimately, because if you like the person, you’ll like things about their physical appearance.  You just will.  Or you’ll simply cease to notice it.

Now I would be an idiot if I said that looks and build and whatnot didn’t matter at all.  Of course they do; perforce, they’re the first thing you notice when you meet someone.  And of course you have preferences; everyone does.  But if you limit yourself by those, you’re ruling out a pretty wide swath of people that you might quite like, or even want to partner up with.  Just because I tend to prefer curvy brunettes and redheads doesn’t mean I haven’t dated slender blondes (or slender women in general); just because I like being significantly taller than a partner doesn’t mean I only dated short women and indeed, I married someone who in our wedding pics looks taller than me.  Etc.  It’s the person, you doofs.  They say beauty is only skin deep, and I have to say I agree with Rosie O’Donnell in “Beautiful Girls”:

Implants, collagen, plastic, capped teeth, the fat sucked out, the hair extended, the nose fixed, the bush shaved… These are not real women, all right? They’re beauty freaks. And they make all us normal women with our wrinkles, our puckered boobs (hi, Bob), and our cellulite feel somehow inadequate. Well I don’t buy it, all right? But you fucking mooks, if you think that if there’s a chance in hell that you’ll end up with one of these women, you don’t give us real women anything approaching a commitment. It’s pathetic. . . If you had an ounce of self-esteem, of self-worth, of self-confidence, you would realize that as trite as it may sound, beauty is truly skin-deep. And you know what, if you ever did hook one of those girls, I guarantee you’d be sick of her.

And it’s true, kids; it really is.  If the person you’re with is not someone you like talking to, believe me:  The morning after in bed is going to be damn awkward.

I don’t know how to change the warped perceptions of “Beauty” (with a capital “B”) coming out of media, but we can all do it one person at a time, right?  Stop judging those women by their slenderness or big boobs; stop judging that guy by his nice ass and awesome six-pack.  Listen to Rosie; those things fade.  Let’s all get a grip.

A Little (Well, Quite a Bit of) Windows Hate Venting



Photo courtesy of ZDNet

So let’s get the caveats out of the way right up front, shall we?  First off, I am not a Mac acolyte (or MacAholic, or Mac Booster, or AppleHead, or whatever you call folks who think the sun rises and sets on Apple products).  Yes, we have a bunch of Apple products at our house, but that’s primarily because once I got an iPhone and Sami got her work MacBook, it just made life simpler to have everyone on the same OS.

At heart, I’m a UNIX nerd, and have been since the Reagan administration.  I’m perfectly comfortable with GUIs that run on top of UNIX (or UNIX-ish) systems, like the Linux GUI, or the Mac desktop, or (and especially) the late, lamented Silicon Graphics desktop.  But at heart, I’m a UNIX nerd who has vi keyboard shortcuts coded into my lizard brain.  It’s just how I am.

But second, I don’t like Microsoft, the company.  I don’t like how they used their OS market dominance to force products on people and kill (often technologically superior) competitors; I don’t like how they steal ideas from anyone and everyone, change them slightly, and try to pass them off as their own.  And I most especially don’t like their huge, bloated desktop programs, which they seem to change every couple of years for no reason other than to update the color scheme or move menu items around.  (Or to add my least favorite GUI innovation ever:  “Ribbons”.  An opinion in which I’m hardly alone.)

To get to the point here:  For the last several years, I’ve been working on a Mac (for business reasons–I tend to work on what they give me, and haven’t owned my “own” system in a long time, unless you want to count my iPhone and iPad mini).  There are any number of things about the Mac interface that annoy me–and seriously, don’t get me started on that horrific piece of design known as iTunes–but in general I find the Mac user experience pretty solid.  In fact, it reminds me very much of my beloved SGI Indy box.  But recently I changed jobs and am back on a PC, and my Windows hate has surged once again to the fore.

Rather than get into all the ways that Windows drives me nutty–how hard it is to take screen caps compared to on a Mac, or the difference in complexity in deleting a program (on a Mac, you just drag the thing to the durn trash can!)–I can sum it up pretty quickly:  Defaults.

You know all those settings you can change in Windows?  The default font size of your emails; whether calendar alarms chime and with what sound; whether the top and sides of the screens perform that new “docking” maneuver; etc.  All those options have settings put in by Microsoft out of the box.  Those are the defaults.  And for me, the difference between the (SGI) Irix GUI and Mac desktop, and the Microsoft desktop, is that Microsoft sets all those defaults wrong.

No, I don’t want all those noises, chimes, and alarms to sound for all those applications.  No, I don’t want click noises when I move Windows around.  (I find Windows so noisy I keep the desktop muted all the time.)  No, I don’t want Internet Explorer to be my default browser.  No, I don’t want “ribbons” fully opened in all my apps by default.  No, I don’t want all my past emails “grouped by date”, I just want a flat list.  (And you can’t even set that to be the default; you have to change it for every single mailbox by hand!)  No, I don’t want the email list to be a stack with the newest item on top; I want the most recent item at the bottom of the list (and when I reset it, please open it that way the next time I fire up Outlook).  Etc.

And then there’s that little “Windows” button that everyone has on their keyboards.  OK, fine, but it’s right near the shift and control keys, which means half the time I try to use keyboard shortcuts, I get the Start menu instead.  It got so bad at a previous job, I actually pried the damn key off my keyboard.

The keyboard in question

Of course, there are other little annoyances as well that drive me nuts.  That you have to click in a window before you can scroll in it; a Magic Mouse on a Mac will scroll any window, no matter where you clicked last.  Or how hard it is to kill an application in Windows vs. Mac–in Windows, you have to open the task manager, whereas on a Mac you can just right-click that sucker.  But in the main, it’s those damn defaults.

Yeah, sure; some of those settings are wrong on the Mac, too.  But the majority of them, they’re fine.  On Windows?  I spend the first day or two resetting a whole bunch of defaults so that I don’t get annoyed every time I try to perform the simplest actions.

Some of you more savvy tech folks out there are saying, “What’s the big deal, Doug?  So they set up the defaults in a way you hate; you can always change them!”  Yup, that’s true.  I could argue that the fact that I have to spend the first couple of days on a new system changing all the defaults strongly implies they’ve screwed up on their choices pretty badly, but I won’t.  The real problem is:  Figuring out how to change these defaults is practically impossible.

There are a couple of reasons for this.  First, there is a veritable ocean of default settings, and when you’re looking for one drop of water in an ocean–or even in a bathtub–it’s going to take you a long time to find it.  And it turns out that Microsoft’s inability to correctly guess what their users are going to need on a default basis extends to where they put their various system settings.  For example, you know that thing the desktop does when you drag a window to the top of the screen and it plunks it into full-screen mode?  That’s a feature set by default, called a “hot spot”.  It allows for auto-sizing and docking, and some people really like it.

Me, it drives me crazy, so I wanted to turn it off.  It’s similar to the fact that when you launch new programs they always pop up on top of whatever windows are on your desktop.  Hey, if I launch a program that takes forever to come up, and then go into another window to do something else, I don’t want that damn program to grab control of my screen!  I tiled a different window to the top on purpose!  Similarly, if I want my window to go to full screen mode, I’ll do it myself, thanks.  Most of the time, I just want it to go to the durn top of the screen!

So okay, I want to turn it off.  Where is that?  Why, in the most obvious place imaginable!  You go to the Ease of Access Center from the Start menu (which is no longer labeled “Start”, I might add), select “Make the mouse easier to use”, and click a radio button labeled “Prevent windows from being automatically arranged when moved to the edge of the screen”.  Yeah, that’s right; under a mouse settings location.  Intuitive, no?


As a rule of thumb, let me just say this to you UI designers out there:  If your users have to go to Google to find out how to modify settings in your app, a) your app design is a failure, and b) your online help system sucks.  (And seriously:  You don’t want me to start on the lameness that is the MS help system.  When a user doesn’t know if she’s going to get pop-up windows, a separate help window, or get sent to the browser and the MS web site, you’ve got a screwed-up system.)  Just sayin’.

And for me that pretty much extends to the entire OS; it is a rare day when something I’m looking for is where I first look.  Or the second location.  Or even the third.  The vast majority of the time I have to pull up the help window.  How is this intuitive interface design?  Does Microsoft even have a user experience team?

Bet it saves them a ton of money, though.

I’m not an engineer.  I trained as one, sure, but I’m not one.  But I’ve had a ton of experience in fiddling with new user interfaces, and on this, Windows really tanks.  And I know I’m not alone in thinking this.

[puff puff]  Okay; I’ve gotten that out of my system.  For now, anyway.


High Tech Sexism



Image courtesy of Business Insider

The high tech world is kind of a funny place.  (Aside from the fact that it’s populated by geeks, I mean.)  On the one hand, nerds don’t really care much about you so long as you’re a nerd.  Black, white, Asian, Indian; male, female, trans; gay, straight, questioning; monogamous or polyamorous; kinky or vanilla; sci fi or fantasy or romance or “lit-ruh-chure” or detective novels; it doesn’t really matter to a computer nerd.  If you can code and fit in with the nerd coding culture, you’re fine, you’re golden.

On the other hand, that culture was created largely by young, straight, white, middle- and upper-class men, even boys (maturity-wise).  So the only way to fit in is to model the behavior of that group.  Which for me–as a straight, white, middle-class male who got a degree in computer science from UC–wasn’t that hard.  But for a lot of others?  Well, the problem here is obvious.

For my entire career it’s been clear that there’s rampant sexism (among other inherent bigotries) in the high tech culture.  As this culture has developed heavily from nerd programming culture, when you think about it, it’s not very surprising.  And lately, there has been a lot more focus on this.  Articles on how women are treated in high tech; articles on the low number of women taking engineering degrees; articles on how few women stay in the engineering track in the high tech industry; articles on the dearth of female high-tech CEOs; articles on women being intimidated and pressured when they speak out about the obvious sexism.

None of this comes as a surprise to me.  What does surprise me, frankly, is how so many folks are trying to either defend this aspect of the culture, or wave away the accusations.  I think the nature of these defenses can be summed up by this person’s (a white male, naturally; probably straight, Christian, and middle-class too) comment on an article in HuffPo about how 40% of female engineers leave the field:

Most male engineers have similar complaints, and leave the profession too. As the old saying goes; “If you can’t stand the heat, stay out of the kitchen”.

Look:  I’ve been in high tech my whole career, over 27 years now.  I love it.  I love the people, and the cool new tech I get to see all the time; I love being on the “bleeding edge” of tech development; I love that the world has come to recognize the value of what my nerdly brothers and sisters are doing, and to even give us acclaim.  I wouldn’t want to do anything else.  But I have to say to folks like the gentleman above:  STFU.  I have seen unbelievable, rampant sexism in this culture.  At all levels, from executives down to junior hackers, in ways both blatant and subtle, from giant multi-nationals down to tiny startups, it’s an anti-female culture, and it’s disgusting.  And to wave off that fact is doubly disgusting.

Before you make flip comments, create false equivalences about how it’s “just as bad for male engineers”, pretend that there’s really no sexism, how about you do this:  Spend several years having the first thing out of people’s mouths be a comment on your clothes, hair, or appearance (When was the last time you heard a guy at work tell another guy, “Hey, nice blouse!” or “I like what you did with your hair”, or “Are you wearing makeup”?).  Go to dozens of meetings over the course of a decade or two where every time you tried to speak or bring up a topic, solution, or idea some guy spoke right over you or the moderator basically just ignored you.  Endure years of snarky, snide, or derogatory comments about how often you’ve had to come in late, leave early, work at home, or take vacation days to deal with family issues (while your male co-workers simultaneously get praised as being “good dads!” for doing exactly the same things).

How about you spend years or even decades receiving 10-20% less salary than male co-workers doing literally exactly the same job?  How about you suffer through a few months of dealing with the hostility and anger of your co-workers because you need to take time off after giving birth (not to mention ignorant comments about how you should “just deal with it” while suffering postpartum depression)?  Or maybe enjoy the delightful experience of watching men with less experience and fewer qualifications get promoted over you multiple times.

I have seen all this, consistently, everywhere.   At meetings large and small, in companies huge and tiny, all the time.  It’s consistent.  Yes, you can squawk that this is “anecdotal evidence”, and it is.  Of course, it’s completely validated by every single woman in high tech I’ve ever spoken to on the topic, from low-level folks toiling away on front-line phone support to high-powered VPs.  Often when I make these observations they snort or roll their eyes; it’s so obvious to them, it’s like they can’t believe it’s news to the likes of me.  That’s how prevalent it is, how entrenched.

I’m proud–incredibly proud–of what my industry has contributed and continues to contribute to the country and the world.  I love the attitude so many of us have that every problem can be solved, if we just apply enough brainpower and tech to it (as misguided as that sometimes can be).  I love working in this industry.  But that doesn’t blind me to the rampant, horrific sexism (and often racism, homophobia, and other bigotry) that it contains.  So instead of trying to wave it away, or pretend it’s not so bad, or arrogantly and condescendingly telling women to “suck it up,” how about we do something about it?

And folks, the first step on “doing something” is to admit we have a problem.  Until we do that, we ain’t getting nowhere.  So let’s admit the problem, and get going, shall we?

Some Thoughts on Ferguson and Police States



I can’t believe it either

Many of the folks in my Facebook friends list/feed don’t share my political leanings and views, and I think that’s great, actually. But for those of you that don’t and have been good enough to read this, on the topic of what’s happening in Ferguson, please hear me out.

For years, we long-haired, unwashed, wild-eyed hippy types have been screaming about the military-industrial complex, the erosion of our civil rights, the incredibly lopsided (and frankly racist) way laws and policing are applied, and what many of us believe is a scary march towards a police state. For these warnings we’ve been marginalized, called crazy, and said to be over-reacting. I understand this POV completely; sometimes, we do sound crazy and are over-reacting. I could argue that no progress is made without people on the “fringes” pushing hard on the center (and I truly believe that’s the case), or that when you are polite and respectful you’re easy to dismiss, or that after years and years of trying to effect change you get durn frustrated, but the point is that fringe people are on the fringe, and the vast majority of us librul hippy types have been steadily gathering data, generating surveys, keeping records, and now have a whole lot of information on which to draw.

And the result? The military-industrial complex is growing–the Pentagon budget has increased wildly since 2001. Vast amounts of military-grade equipment are being sold second-hand to police departments all over the country–equipment that was developed for the Marines to fight in Fallujah, deployed in East Podunk, New Hampshire. The vast majority of stop-and-frisk stops in NYC are of black people. And everywhere, with every race, color, and creed, if you don’t “comply” with everything a police officer says–even when the officer has no authority to make you do it (such as forcing you to stop recording him or her)–compliance is forced with violence, mace, Tasers, and often detention (even if you’re never charged with a crime). The number of these incidents, the number of armed PD deployments, the number of violent arrests and incarcerations, has increased massively in the last generation.

These are simple facts. There is no disputing them; they’re facts. And as a wild-eyed, long-haired, left-wing hippy type I ask you: What kind of state is it that forces you to comply with “the authorities” no matter what? And where “the authorities” are heavily armed and armored? That’s the definition of a police state, folks.

I’ve buried the lede here, but this brings us to the recent events in Ferguson.  Ferguson is a textbook demonstration of this situation.  African Americans are stopped and checked for contraband at a significantly higher rate than whites, while at the same time whites are actually carrying contraband at a higher rate than blacks!  The police force has moved in to quell “riots”–riots caused by their own behavior!–with tanks, tear gas, armor, military-grade weapons, and so much other force that I was scarily reminded of the brave Chinese man standing in front of the armored column in Tiananmen Square in 1989.  The police have been making flagrantly improper demands of the people of the town–illegal, incorrect, and often unconstitutional demands–and then punishing them with tear gas, rubber bullets, and incarceration if they “fail to comply”.  Blacks are being disproportionately affected by this.

And I submit to you that this is a crystal-clear view of exactly the things we wild-eyed, hairy, smelly hippies have been screaming about for 35 years or more.  And it’s happening in town after town, city after city, state after state.  Thousands of police departments are asking for this type of equipment, and few of them receive the training needed to use it.  (The military has weeks and weeks of boot camp to deal with it.  Hell, it was two weeks’ worth of fencing lessons before they even let us hold the foils!  If that’s the case for fake swords, how long do you reckon it takes to become proficient with, say, a flamethrower or a sniper rifle?)  This, folks, is a Constitution-violating, military-industrial complex-supplied police state.  We were right.  And frankly, I don’t see how anyone can look at the events in Ferguson and disagree.

This is coming to your town, and it may be your head that’s cracked with a baton next, your family breathing tear gas, your kids in the line of fire of minimally-trained, armored officers in military vehicles.  Are you really going to wave it all away, or are you going to stand up and do something?  Call your state and national congressmen; call your senators; go to city council meetings; write letters.  Because if you don’t, they’ll assume it’s all fine by you, and the next bone broken by an overzealous cop may be your own.

Equal-responsibility Dadhood



What to expect
Joey
Image courtesy of Ruddy Bits

Lately I’ve been thinking a lot about motherhood, dadhood, co-parenting, and the work/home/family ratio that we all struggle with.  (And why is “motherhood” a word, but “dadhood” isn’t?  Seriously?)  I just changed positions here at work, and in my new slot I have to go into the office, well, pretty much all the time.  Now, in this regard I’m no different from the huge majority of the rest of the planet, but I had been blessed over the last 13-14 years or so to be able to spend a lot of my work time in my home office, so this is a huge change for me and my family.  So I’ve been thinking about it.

Then, at the urging of Rebecca Traister–after I sent her a few thoughts on her column in the New Republic on this very topic–I thought I might share some of my observations from the point of view of a man who has been, largely, a stay-at-home or work-at-home dad for most of my kids’ lives.  (My daughter is now 19; my son 16.)

A key point that Rebecca touched on, and that my experience validates, is that even for your new-aged, fully-evolved, committed-to-co-parenting, sensitive, post-Feminist-era guy, our society is so overtly geared toward motherhood rather than dadhood or (much preferably) parenting that a guy practically has to be rapped in the teeth before he “gets it”, before he understands at a visceral level (that many women seem to understand without any coaching on the delivery table, if not sooner) the huge commitment involved in parenting.  For Rebecca, it happened right away:

A very similar thing happened to my husband and me. After a C-section, and in the midst of the rigors of breastfeeding, we made an unspoken agreement: My job was producing milk. His job was everything else: diapers, clothing, bathing, figuring out the naps and soothing and pacifier and bottles for the pumped milk. When I emerged from my post-partum cave a few weeks after the birth of our daughter, my husband, a criminal defense attorney, had to teach me how to change a diaper; he had to show me how the little flaps on the sleeves of the onesies kept our daughter from scratching herself. He was the expert; I was the novice. But because every social and cultural script pushed me, swiftly, toward equal expertise in these matters, we wound up co-parents. Had it worked in reverse, the chances that he would have felt pressure, guilt, or incentive to dive into the nitty-gritty of wipes and burping would have been extremely low.

For me, it took longer.  I was determined to be a “co-parent”, and am pretty damn stubborn.  I was very much brought up in the Ms. Magazine, “women are equal”, “No means no!”, “Our bodies, our selves”, “Free to be you and me” 70s liberated mom environment, and I was not going to be one of those typical dads.  (Quite aside from the fact that, while I can set up a home network, configure a router, keep all the house gadgetry working, etc., I’m incompetent when it comes to, say, fixing a leaky faucet.)

But that being said, it still was very difficult for me to get myself into the mindset of being a full participant.  My job urged me to come back to work immediately–half-time for six months–rather than take three months of unpaid family leave.  And because I did, while I was definitely a full participant for the time I was at home–changing diapers, dealing with the diaper service, sterilizing milk bottles, feeding the new baby, splitting the midnight-to-six shift as much as possible–the fact was I wasn’t a full participant.  And I certainly didn’t get it at a visceral level.

But then we adopted my son, and because my partner made more money than me, and had a better stock option plan, we decided that I would quit my job and stay home with our new son.  I was the primary care-giver for him and my daughter—driving them to and from preschool and kindergarten, doctor’s appointments, Gymboree, etc.; shopping and making dinner; doing the laundry; dealing with the home upkeep; and everything else so that Sami could simply work and not have to worry about anything.  After that year, I worked at home for the next 8 years.  As far as Joseph was concerned, Dad never went to the office until he was 11.  At which point, some health issues on my partner’s part forced her to stop working and I had to take whatever job I could to keep us afloat, forcing me to actually commute to California from Austin on a regular basis.

Now I’m pretty sure my partner would agree I (and this is how she puts it, not me) do “more than my fair share”.  Laundry, dishes, grocery shopping, bill paying, kid shuttling, etc.  This is not to brag, but just to say that I am a very full participant.

And that’s the problem, isn’t it?  Any time a guy says, “Hey, I’m a full participant!” he’s either not believed, or treated as a braggart.  But the truth is in my job in high tech, it’s damn hard to juggle the work responsibilities against the family ones.  And it’s even harder, given that our two kids have special needs.

And unfortunately, work is not structured to encourage and support parents who want to work at home, even in jobs where it is doable.  (I am a technical writer, so working at home–as I’ve demonstrated off and on for nearly 15 years now–is absolutely a workable option.)  Editing, for example.  Coding.  Many phone support positions.  There are lots of them.  But the business world, and management, simply are not comfortable with this.  (Look at Marissa Mayer of Yahoo–a high tech company that deals in virtual products!–who decided she wanted everyone on site, for example.)

The other part of the problem is society and social norms.  The unspoken (and in some cases, like mine after my daughter was born, overt) pressure for the man to leave parenting to the mom–particularly in the very early stages–is huge.  There’s pressure on women, too, no question, not to mention discrimination both subtle and overt: a reluctance to hire women of child-bearing age because you might “lose them” to motherhood after training them; pressure on new moms to be back at work as quickly as possible and not take the full legally guaranteed family leave; the unspoken criticism by co-workers when a woman disappears for three months because she had a child (companies usually try to “absorb” the extra work using the existing team rather than, say, hire temporary contractors to cover the absence–saves money, you see); and on and on.

Our society wants you to work at the expense of the family, but the guy in the relationship is expected to not be as interested, not as involved, not as engaged–and believe me, you feel it.  And even if you’re determined not to let it affect you, as I was, too often you have to be rapped in the teeth with a hard fact before you change your perspective.

I’m writing about this because, like Rebecca says, the more guys who speak out, the more chance we have to change the situation.  I can’t change reality so that only guys get pregnant–wouldn’t that cause a rapid change to family-related work issues!–but I can speak out.  So I am.  Now it’s your turn, other guys.


Some Thoughts on Suicide



Image courtesy of the Guardian Liberty Voice

As I write this, seemingly the whole world–but certainly much of the country–is mourning the death of the incredibly talented and comedically brilliant Robin Williams, possibly from suicide, according to the Tiburon sheriff.

With everyone and his brother–including me on Facebook–eulogizing Williams, I’m not going to waste time on that.  Instead, I wanted to talk about the manner of his death, and a tiny little bit about the nature of his disease.

Now, I am not and never have been particularly suicidal.  I’m too arrogant and self-interested, and obnoxiously believe the world is generally a better place with me in it than without me.  But there was a time when I did, quite seriously, consider killing myself, and I’ll never forget it.

I suffer from chronic neck pain, a condition I’ve written about once or twice in various blogs here and there.  In my mid-30s, I was out skeet-shooting with my father-in-law and exacerbated a design flaw in my neck–my spinal column is very narrow up in the cervical area–causing a disk to bulge into my spinal cord, crushing some nerves and causing me immense pain.  And when I say “immense”, this is not typical Doug hyperbole; this is the kind of pain so intense that 12 Vicodin a day not only did not make me sleepy, but only controlled the agony sufficiently enough for me to minimally function.  I would wake at 3am in pain in advance of my 4am dose; I drove my car one-handed, the other propped painfully on the arm rest.  Etc.  It was unbelievable.  “Worse than labor pains, I’m told!” my orthopedic surgeon cheerfully told me.

I had surgery, relieving me of the worst of the pain, but since then, for the last 15 or so years, I’ve had associated pain around that area, at the base of my skull.  I get regular shots in the back of my head to control the pain; I go to the chiropractor regularly; I see a pain management doctor every 4 weeks; I take an almost-absurd cocktail of drugs.  By and large, the pain is controlled and “managed”, though I’m never quite free of it, even on the best days.

By and large.

But I do have occasional “flare-ups”, where the pain approaches and sometimes reaches the same levels of agony that I sustained back before the surgery.  And one day, sitting on the floor of the shower, head in hands, water pouring down on me, desperately waiting and praying that the additional morphine, Excedrin, Advil, and tequila I had ingested would do something, anything, to alleviate my agony, I reached the Dark Place.

If you’ve thought about suicide, seriously thought about it, thought about actually doing it, you know what I’m talking about.  The Dark Place is where you–literally–feel you can’t go on, you can’t take any more, the only way to end your suffering is to end your life.

“Cowardly”; “a waste”; “selfish”; I’ve heard all these and more with regard to suicide, and felt that way myself.  But in that Dark Place?  You’re in massive, unbelievable emotional (or in my case, physical) pain.  You can’t imagine it ever getting better, or going away.  You think of the days, months, and years of pain stretching ahead of you–decades of suffering, suffering, suffering–and you think, “What’s the fucking point?”

Think of me, there in that shower.  Naked (best not contemplate that image too closely!), cross-legged on the tiles, head hanging down, the water pounding down on the back of my neck, the pain like someone who weighs 300 pounds pressing a dull knife into the back of my neck just below my skull over and Over and OVER and OVER again, endlessly, never to stop.  15 years of pain and suffering behind me.  My grandmother lived to be 90–40 more years of suffering and pain, pain, pain, endless pain stretching ahead of me.  Pain and bills and pain and guilt and pain and worry and pain and workworkwork and pain and . . .

And you think, ya know, I have plenty of morphine there in the bottle.  More than enough.  I’ll fall asleep and that’ll be it–no 40 years of constant, non-stop, unendurable pain.  Haven’t I given enough?  Haven’t I tried enough?  How long do I have to keep on before I get a friggin’ break?

Now obviously, I left the Dark Place.  No, that’s not entirely accurate; I thought of Sami and my two kids and the other folks who–God only knows why–love me and care about me, and I held onto that thought tight and hauled myself out of that Dark Place by desperate strength, holding on to the thin reed of hope that the pain would abate, would get better, and I wouldn’t be facing 40 more years of it, ever and ever amen.  And when I was done, I turned off the shower, dried off, and went and lay in bed for several hours, feeling like, well . . .

Do you remember the scene in Return of the King, when Frodo loses the ring, it’s destroyed, and he’s dangling over a river of lava, not convinced he should bother helping Sam haul him back up?  But he does; he climbs out of his own Dark Place–years of longing for the ring, and suffering the hurt of losing it, the pain of the spider’s sting, the pain from the knife wound in his shoulder, the PTSD of carrying that damn thing for so long–and lets Sam lead him out.  And then he passes out, waking up in a soft bed in Ithilien, Gandalf leaning over him.  Remember the look on Elijah Wood’s face?  He’s “saved”, yeah; he’s still alive, but he’s wounded, and exhausted, and clearly not entirely sure he really wants to go on.

Yeah, that.  That’s where I was that day, laying on that bed, trying to leave that Dark Place behind.

It sucks at you, the Dark Place, like an effin’ black hole.  It pulls at you with the gravity of a promise of an end, an end, dammit, to the suffering.  And after years, decades of suffering, why the hell would you not want an end?  Why wouldn’t you deserve an end?  Haven’t you done enough, suffered enough, tried enough to get “better”, to end the pain, to leave that Dark Place behind?  How much longer do you have to try before you’ve earned your rest?  Earned an end to all that?  And if that end is only The End, so what?  How much more do you expect a guy to take?

Now look:  I’m fine.  I can still see that Dark Place, still feel its gravity, but it’s no more effective on me than the gravity of Neptune is on planet Earth; it may perturb my orbit a tiny, essentially immeasurable amount, but that’s it, really.  I’ve seen that, for my own pain, my physical pain, there are other options, things can improve, and so my thin reed of hope is now more like a strong metal ladder, bolted to the concrete and wood framework of my life.  I’m in a safe place, and I’m not worried.  And if I get close to the Dark Place again, there’s this good, solid ladder.

But what about psychological pain?  Pain that is unquantifiable, literally “all in your head”?  And what if you’ve been suffering for 40 or more years?  And have made multiple trips to that Dark Place?  And are staring another 30 years of pain and suffering in the face, having tried multiple times to leave it behind, build your own ladder and bolt it to your foundation?  And what if your foundation is termite-riddled bare wood on dirt instead of a good ol’ solid concrete slab?  What then?

Yeah, metaphor-heavy.  I’m sorry.  But you see the point, don’t you?  You see how a person’s genius, their ability to make other people happy, to make other people laugh, doesn’t do jack when you’re trapped in that Dark Place, and not only can’t find a way out, but can’t even imagine a way out.  And even when you can, when you can bring up the image of escape, all you can think is, “And jesus yeah, I may get out of here, but what then?  30 more years of this?  No!”

Robin Williams is gone, maybe from suicide.  But you won’t hear from me about “what a waste”, or that it was “selfish”, or that he should have “battled harder”.  Unless you’ve been in that Dark Place yourself and climbed out–and like Williams, climbed out multiple times–you really should keep your opinions to yourself.  You don’t know.  Even I don’t know.  But from where I sit, feeling even the tiny tug of my own Neptune-distant Dark Place, I know enough not to judge.

We are without Robin Williams now, and the world is poorer for it.  But I understand why he decided to leave.  And maybe now you understand, just a tiny bit better.

Avoiding Subsidizing Overpaid Ass-Clowns in an A La Carte Consumption World



Image courtesy of zenshaman.com

For about half my life, roughly, the media that I consumed was essentially collectivized.  That is to say that everything I saw or read or listened to was from a very limited set of corporate producers.  The individual content was from a huge mass of folks of course, but they were collected under the heading of “The TV Networks” or “The Big Publishers” or “The National Newsweeklies” or “The Big Record Companies” or what have you.

Over time, the collectors changed somewhat–cable and satellite TV became big business; the phone company was broken up, acquired media properties, and started consolidating; big players in one industry (Warner, e.g.) bought big players in other areas (Time, record companies, etc.).  But from the consumer perspective this all had the appearance of deck chairs being shuffled around on the Titanic; we were all still sailing on the USS Media Collective, where a very limited number of companies controlled a huge percentage of what we these days call “content”.  And it was in the interests of these big media companies to become even bigger, to acquire even more properties, leading us to a place like the current proposed Comcast/Time-Warner merger.

I was thinking of all this recently because of a big push by the New York Times to try to get people to subscribe to just their opinion section.  The New York Times opinion section is immensely popular, so much so that about 10 years back, they tried to put a paywall in the way of people who wanted to read just that content.  And like the vast majority of paywalls, it was a monumental failure and they gave it up.  Now they’re trying again.  But the thing is, I don’t want to pay some monthly subscription fee and get stuck with their idiot columnists like (shudder) David Brooks or Maureen Dowd or cliche-thrower & metaphor mixer Tom Friedman or right-wing anti-sex moron Ross Douthat; I just want to read Paul Krugman whenever I like. So I’m not going to subsidize people I consider overpaid ass-clowns just for that. And I doubt very much I’m alone in that regard.

And this got me thinking about how much the Web era has changed our expectations, how we consume (and want to consume) content, and the effect that’s having on these big–but terrified–media companies.

Big media companies want to continue to force you to purchase things collectively.  You know how it works:  If you want HBO, you have to get a cable or satellite subscription, and you have to pay for some kind of “premium” package, forcing you to buy dozens (or even hundreds) of channels of programming you don’t give a rip about just so you can watch “Game of Thrones” for three months out of the year.  Or you have to get a ruinously-expensive “add-on” package to the premium package if you want to watch, I dunno, hockey or football or whatever beyond what the networks offer “for free”.

It’s the same with newspapers; you may just want the sports and comics (or in the case of my bff the rocket scientist, the comics and the technology section), but you also have to pay for the ads, the obits, the opinion section, the business section, and whatever else they put in there.  Or in the case of the New York Times and their brilliant new Opinion Subscription strategy, they want me to subsidize people I consider overpaid ass-clowns just to get the one or two people I think are worth actually shelling out dough for.  And every newspaper has that issue to some degree.

Hell, it’s even the same with music; they want you to pay $10-20 for a whole album, not just buy the one song from that album that you’re interested in.  Do you really want the entire “Despicable Me 2” soundtrack, or do you just want “Happy”?  And media companies want you to spend $15 just to get your personal dose of Pharrell.

Now, there are reasonable arguments to be made for forcing people to pay way more than they want for packages of stuff they’re not interested in so as to subsidize quality “minority”-level content.  But about 20 years ago, something funny happened that started us moving toward an a la carte world:  Mosaic, the first legitimate Web browser, was introduced.  And that, combined with the Net Neutrality-induced low bar of entry to publishing content, opened the content floodgates.  Mix in things like Amazon, iTunes, portable media players, iPads, and whatnot, and you have a world where not only are people familiar with buying only what they want, when they want, to consume at their own leisure, they expect it.  People get irked when their favorite podcast is late, or when they can’t download this week’s episode of “Mad Men” the day after it is broadcast.  (Not to mention the insanity of companies like HBO trying to force you to wait nearly a year to watch programming like “Game of Thrones”–a topic I go into in boring detail in another post.)

Now there is an entire generation–a generation as familiar with YouTube and NetFlix and Amazon Instant Video and iTunes as I was with the flavors of Slurpees available at my local neighborhood 7-Eleven–that has grown up with that.  (And don’t even get me started on social media!)  My son doesn’t care that the episodes of MythBusters he’s watching were filmed 7 years ago; my daughter doesn’t give a rip that the anime she is enjoying was broadcast originally in Japan in 2003; they are products of the Internet age, and don’t care.  And for me, a long-time nerd, that the episode of “Top Gear” I’m watching was made in 2004 matters to me not a whit; it’s still fun to watch.  These are the times we live in, and the big media companies simply don’t get it.

Used to be, when I moved someplace new–and when I lived in Santa Cruz, I did it on an almost-yearly basis–I did three things immediately:  Unpack and set up my stereo, get out all my books and put them on the shelves, and subscribe to my newspaper of choice.  Getting everything else set up took a back seat–even the phone.  But music, books, and news were critical.

Now?  Now, I take out my iPhone, iPad, and Mac, and I have all three immediately.  I haven’t subscribed to a newspaper in nearly a decade.  My books are all in boxes.  I don’t even have a stereo.  My entire music and book collection I carry with me all the time, and the news I can access whenever I like, wherever I like.  For the media companies, this is of course a monumental disaster.  For the consumer, it’s unbelievably convenient and wonderful.  Talk about overcoming the PITA principle!

Until media companies like HBO and Time Warner and Comcast get on board with the fact that we live in an a la carte world, and that denying people a la carte access is counter-productive to both their business model and their bottom line, we’ll continue to get pushed to sign up for things like the New York Times’ new Opinion-section-only Subscription App.  And I’ll say it again:  I doubt I’m the only person in the world who doesn’t want to subsidize overpaid ass-clowns just to get the content I want.

It’s an a la carte world, media companies; time to get over it and move along.  Or you’ll get run over.


The PITA Principle



Image courtesy of Wiggins Marketing

OK, yes: I should be flogged for such a bad pun.  I beg forgiveness.

I’m in high tech, and in high tech we love our acronyms.  We love them so much, we even have an acronym for them:  TLAs, or “three-letter acronyms”.  Sometimes you get longer ones, but often they’re three letters.  But in this case we have a four-letter acronym: PITA.  “Pain in the ass”.  And I want to share a theory with y’all about why certain things get adopted by the public and world at large, and other things don’t.  I call this The PITA Principle.

The PITA Principle is simple:  The more of a pain in the ass something is to do, the less likely people will do it.  This seems obvious, right?  But the thing is, when you look at a lot of things that seem confusing from a rational perspective–why don’t people buy electric cars more often?–it’s because of The PITA Principle.  Having an electric car is more of a PITA than a gasoline car.  The world infrastructure is designed around gas cars that can be refueled in a few minutes, every 300 miles or so.  Gas stations are distributed accordingly.  People plan their trips based on this.  Their subconscious expectations are all geared towards it.  So why would you switch from something that goes 350 miles on a single refueling, said refueling taking less than 5 minutes, to something that goes less than 100 miles on a single charge, and recharging takes hours?  Even if doing so is cheaper, and more environmentally sound?  The PITA Principle, baby; it’s easier.  I think it’s that simple.

This explains the adoption of a ton of things that might–especially to curmudgeons–seem weird.  Why email rather than physical mail?  It’s easier!  The PITA Principle!  You can email in seconds, from your laptop, wherever you are; to mail something physical requires stamps and envelopes and licking and walking to the mailbox and paying money.  It’s not much of a PITA, but it’s more of one than sending email.

Which also explains why teens text so durn much; it’s even less of a PITA than email.  And furthermore, it’s less of a PITA (for a teen) than talking on the very same phone!  “Why?” you might reasonably ask.  Because when you talk on the phone, you have to be in a location with a reasonable amount of privacy, as does your calling partner; you have to deal with the emotional content of their voice, and correspondingly control your own vocal dynamics; you have to hang up or put the person on hold if interrupted, and so do they; and on and on.  It’s more of a PITA.  Texting is easier.  Teens text.

Or move on over into the political realm.  Despite the fact that the Republicans’ platform is out of step with more than 2/3 of the country (seriously; look it up), they continue to be competitive: they’re in charge of the House of Representatives and numerous states, they may grab the Senate, and they remain competitive in Presidential elections.  How is that possible?  Democrats far outnumber Republicans; Democratic positions (raise the minimum wage; increase Medicare and Medicaid coverage; improve Social Security; get government out of doctor/patient decisions; etc.) are wildly popular compared to Republican positions.  How do they keep winning?  Yes, incumbency; yes, gerrymandering; yes, cheating.  But I also believe the PITA Principle plays a big role.  What’s easier?  Voting for the guy (or woman) you’re familiar with, whose name you know, who you are used to.  “The Devil you know.”  The PITA point is lower.  Incumbents win because voting for them is easier.  The PITA Principle.

This is reflected in a lot of high tech success stories.  Not all, but some.  Why did Apple sell a b’zillion iPods, when there were so many other MP3 players out there?  Because by browbeating record companies and artists and publishers and making iTunes pricing very consistent, and making the downloading process easier and simpler than the competitors, Jobs lowered the PITA factor to a point where it was significantly better than his competitors, and thus won the market.  Why do people still buy more iPhones than Android phones?  Lower PITA point.  (Though Android is now very, very close, and in some ways better.)  Why do iPads continue to outsell other tablets?  The PITA point, which not only takes in the tablets themselves, but how they interact with iTunes, your computer (particularly if you’re using a Mac desktop or laptop), and the other iPads, iPhones, and Macs in your home.  Apple’s products, in the main, have extremely low PITA points, and they charge accordingly.

You can also see this, very much, in a business environment.  For example, at a previous job at [formerly awesome company that no longer exists], one team was performing software source control using a very sophisticated, graphical interface tool, while another team used a very rough-and-ready, command-line tool for their source control.  The graphical tool was more powerful, more technically sophisticated, did a better job of ensuring source security, was superior at preventing source collisions and workflow errors . . . and people hated it, and we all eventually moved back to the command-line tool, kludgy though it was.  Why?  The graphical tool was way more of a PITA to use and maintain, and the command-line tool was simpler and easier to use (and easier to spoof when something went wrong, too).  The lower PITA point won out, even though the company was actually selling the graphical tool!  A lower PITA point buys you a lot.

I’m sure someone smarter than me, with better math, economic, sociological, and business knowledge, would be able to put together charts, graphs, figures, and PowerPoint slides to make this into a true scientific study.  I’m sure there’s some kind of way to enumerate PITA values for particular products or processes, and correlate PITA points to prices and profit margins, but I’m not that guy.  John Nash could probably do it and win another Nobel Prize.  But I’m just a humble writer.  A humble writer who sincerely hopes someone smarter does take up this gauntlet, and sees where it goes.
