Postal strikes, crankiness, and the Internet

Early in my career, what seemed to be an annual ritual was the social hysteria around a possible shutdown of the Canadian postal service.  It was probably every other year, or maybe every third, but it felt like an annual event.  When strikes actually happened, businesses and individuals endured enormous disruptions.  The Post Office and the union would eventually agree to terms, often under government-legislated binding arbitration, which would result in increased rates for stamps, some erosion of service, and a general crankiness between all sides.  These events and outcomes were predictable and feared.  Not so anymore.

It may have slipped your awareness, but Canada is on the cusp of another postal strike this week.  My purpose with this post is not to take sides.  I care very little about this because, frankly, the service means very little to me.  I use the Internet for every banking transaction and for a large proportion of my magazine reading.  This is not a unique reaction; millions of people behave this way.  A few items still arrive in the mail, but gone are the days of feeling extremely inconvenienced if inside postal workers and/or letter carriers were to strike.  If you need proof that old “technologies” like the mail service have been permanently sidelined, you can register with McKinsey & Company to read this article about the Internet’s growth dividend.  There are all kinds of examples of business disruption caused by the overwhelming human embrace of the Internet, and on balance it has been an equally overwhelming plus for our world.

Privacy should be baked in

The individual is most often the last consideration of software development.  The first consideration is achieving the goals of the business.  Oh sure, we in the industry talk endlessly, and sincerely, about the end-user experience, but it’s usually in the context of how their experience affects the overall business.  Despite the simple and straightforward phrase, I’m not convinced we know what end-user experience really means.  It means, in part, an understanding of the ease with which a user interacts with an application to get the results they need in the time they deem acceptable.  I argue it should also mean that the development cycle treats each end-user as deserving the respect of an application designed, from the beginning, with their privacy as a top priority.  Ann Cavoukian, the Ontario Information and Privacy Commissioner, describes it best in the short videos found on her organization’s website, Privacy by Design.

So many of the data breaches we see so frequently these days could be avoided if system and application design took the basic privacy rights of the end-user seriously from the very beginning.  Separating business data from the personal IDs we tag to users and consumers would go a long way towards minimizing the personal exposure so many fear in today’s online business.  But it takes careful thought and deliberate intelligence from the outset.  Are businesses willing to invest in that?  Or would they rather put their brand on the line later on?
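To make the idea concrete, here is a minimal sketch of what separating business data from personal IDs could look like: business systems carry only an opaque token, and the mapping back to a real person lives in a separate, tightly guarded store.  Everything below (names, fields, the vault itself) is hypothetical, for illustration only.

    # A minimal sketch of separating personal identity from business data.
    # All names and fields here are hypothetical.
    import secrets

    class PseudonymVault:
        """Maps personal IDs to opaque tokens; kept apart from business data."""
        def __init__(self):
            self._id_to_token = {}
            self._token_to_id = {}  # lives in a separate, tightly guarded system

        def tokenize(self, personal_id: str) -> str:
            if personal_id not in self._id_to_token:
                token = secrets.token_hex(16)
                self._id_to_token[personal_id] = token
                self._token_to_id[token] = personal_id
            return self._id_to_token[personal_id]

    # Business systems see only the token, never the person.
    vault = PseudonymVault()
    order = {
        "customer": vault.tokenize("jane.doe@example.com"),  # opaque token
        "item": "subscription-renewal",
        "amount": 49.95,
    }
    print(order)  # a breach of the order store exposes no personal identity

The point of the design is that a breach of the business database alone reveals nothing about who the customers are.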

Rocks of a couple types

I recently had a conversation with a former mentor, someone I still admire deeply.  We chatted about many things but one in particular stands out.  He said that one of the things he’s learned over his career is to try, to strive, to associate only with people who themselves strive for personal excellence.  It may sound like Type A behaviour but it’s not.  He means that no matter the physical, economic, or circumstantial place a person is in, if they are trying to be their best, then that’s what he finds compelling and that’s the energy he wants to be around.  In my opinion, there aren’t many nobler pursuits in life.

That sort of energy reminds me of what I think is the most profound scene in the movie 127 Hours.  It’s not when Aron Ralston (James Franco) snips away his dead arm, although that is both a grotesque and fascinating scene.  No, it’s the next one, where he stumbles across the desert seeking help.  The entire movie is good, great actually, but that final segment, in which he must find assistance or the entire struggle will have been pointless… that made it for me.  The guy could’ve given up after 96 hours, maybe even 72.  Heck, he didn’t even give up after 127 hours, when he had nothing but the brilliant beauty of multi-hued rock for miles around.

Orwell can teach us

George Orwell was more than a dark prophet whose novel, 1984, made many of us fear for our future.  He was also a man who felt deep passion for clarity in the English language.  Here are a few simple rules he championed:

(i) Never use a metaphor, simile, or other figure of speech which you are used to seeing in print. 

(ii) Never use a long word where a short one will do. 

(iii) If it is possible to cut a word out, always cut it out. 

(iv) Never use the passive where you can use the active. 

(v) Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent. 

(vi) Break any of these rules sooner than say anything outright barbarous.

Following those rules will not in itself make anyone a great writer (ideas are where things start), but it will certainly make what you produce easier, and more interesting, to read.  Finally, please note the last rule.  There is something to be said for civility.  I am a member of the Globe and Mail Catalysts group, and one of our most heated topics was how online newspapers should deal with the uncivilized, vicious, and almost always anonymous commentary left in online forums and in reaction to published articles.

In consideration of that final rule, I never thought I would have the reason or opportunity to say this, but for this specific purpose it makes sense… I wish the world could be more Orwellian.

Seven pieces of information

The apparent deficiency of our short-term memory is a popular topic of conversation in our circle of friends.  We’ve all had moments of blankness when it comes to recalling what we did last night, or what we had for lunch today.  We blame it on aging and the general hectic pace of our world, but what if the answer simply lies in the physiology we are all born with?  What if our ability to store short-term memory has a surprisingly small limit?  A study by George A. Miller concluded just that.  Called “The magical number seven, plus or minus two: Some limits on our capacity for processing information” (published in the Psychological Review in March 1956), it found that the human brain can hold only about seven pieces of information in short-term memory.  Humans are fascinating creatures, but unlike computers, whose capacity expands exponentially with every passing year, ours stays fixed.

Seven pieces.  No wonder Twitter is so successful.  Such small bits to store and recall.

Big data redux

If you’re a veteran of IT, you’ve possibly been thinking that data, on its own, is completely unimpressive.  We collect and store so much of it (235 terabytes collected by the Library of Congress by April 2011, according to McKinsey & Co.) that we’ve reached a point where we (IT folks) are only impressed when someone says they can make sense out of all that information.  Counter-intuitive, right?  Well, only for people who don’t have to manage data every day.  Once you get past the amazing physics of the hardware and the network, and the advanced intelligence of the software programs, the data itself is simply an inert set of bits and bytes.  I’m exaggerating, but just a bit.

Making sense of impossibly complex warehouses of information is what the current Big Data rage is all about.  I’m a believer that companies, and people too, who figure out how to precisely read the tea leaves of massive amounts of data will be the true beneficiaries of the computer age.  The secrets of success are not so easy that they can simply be collected and stored on rows and rows of disks.  Like many things in life, making sense of data and finding enduring success requires effort: the effort expended to correlate seemingly random factoids and to find larger meaning in the combination.  My bet is that the effort will be very rewarding.
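As a toy illustration of what correlating seemingly random factoids can mean in practice, here is a sketch that measures how strongly two invented data series move together.  The numbers are made up, and the real effort, deciding why they move together, remains a human one.

    # A toy illustration of correlating "random factoids": measure how
    # strongly two series of daily observations move together.
    # The data below is invented purely for illustration.
    from statistics import mean, stdev

    def pearson(xs, ys):
        """Pearson correlation coefficient of two equal-length series."""
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
        return cov / (stdev(xs) * stdev(ys))

    web_traffic   = [120, 135, 160, 150, 180, 210, 205]  # daily site visits
    support_calls = [8, 9, 12, 11, 14, 17, 16]           # daily call volume

    print(f"correlation: {pearson(web_traffic, support_calls):.2f}")
    # A value near 1.0 hints the two factoids are connected and worth
    # investigating together; the larger meaning still takes human work.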

Presenting: no shortcuts allowed

I just delivered a presentation over the web to a few colleagues across the continent on the company’s cloud computing strategy, and the experience reinforced what I’ve believed for an awfully long time.  A person will really struggle to sound credible and authentic if they:

  1. Don’t believe in the pitch themselves 
  2. Have not done a decent amount of research into the topic
  3. Can’t summon up personal stories to illustrate their key points
  4. Have not rehearsed the flow

As usual with my presentations, I deviated quite a bit from the script, but the theme stayed true.  I deviated because it felt right in the moment, and I think that flexibility and nimbleness are important too.  It seemed to go well.

If only they knew

It’s been said we live in cranky times, where people are apt to curse at you if you create an obstacle by pausing to bend and tie a shoe.  Naturally, we collectively blame the Internet, and software in general, for shortening both our attention spans and our fuses.  It’s much more complicated than that, but let’s start there, with the (silly) notion that the world is spinning out of control and software is making things worse.

I was watching CBC Newsworld the other day and grabbed the remote with irritation to switch to the BBC because, for (who’s counting?) the 8th time in an hour, they showed the same inane commercial about life insurance for the really old.  It brought to mind an idea I’ve had for years.  Wouldn’t it be absolutely terrific if, as we can with computer networks, the television networks could monitor how many people were connected at any given moment, and how many left to watch another network?  Wouldn’t that drive some very significant changes in programming?  You bet.  Too bad that technology doesn’t exist; instead, they rely on sampling and statistical extrapolation.  But something close to that level of granular monitoring does exist for an adjacent form of media.

In this whitepaper from Ontario’s Information and Privacy Commissioner and her Privacy by Design initiative, a technology is described that detects when someone looks at a sign.  It counts viewers and, by pattern recognition (size, shape, movement, clothing), estimates their gender and approximate age.  The best part is that it doesn’t store the information.  It’s altering the way marketers target their message, and it’s this type of capability, I think, that would drive huge changes in the television industry.  If they knew that 35,000 people switched the channel every time a Cold-FX commercial came on, don’t you think they would yank that commercial?  That would be one small step towards making it safer to stop and tie our shoe, or to slow down on a busy sidewalk and look in a shop window.
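A rough sketch of that privacy-by-design pattern, as I read it: keep only aggregate tallies, and let each raw detection go out of scope immediately.  The detection fields below are hypothetical.

    # Sketch: derive audience statistics from each detection, then discard
    # the detection itself. Detection fields are hypothetical.
    from collections import Counter

    audience = Counter()  # only anonymous aggregate tallies are ever kept

    def on_viewer_detected(estimated_gender: str, estimated_age_band: str):
        """Increment anonymous counters; no image or identity is stored."""
        audience[(estimated_gender, estimated_age_band)] += 1
        # the raw detection (frame, features) goes out of scope here

    # Simulated detections from a sign-mounted camera
    for g, a in [("female", "25-34"), ("male", "35-44"), ("female", "25-34")]:
        on_viewer_detected(g, a)

    for (gender, age_band), count in audience.items():
        print(f"{gender} {age_band}: {count} viewers")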

Google and other wonders, from 100 years ago

Google executive chairman Eric Schmidt has some cogent things to say in this McKinsey Quarterly interview about business culture, technology, and social issues.  His voice is just one in seven billion, of course, but the fact that he has so successfully grown the world’s premier Internet company over the last decade means we might want to listen a little more closely to what he has to say.  Some of that is:

  • Most businesses don’t disappear; they transform, morph, or age.  The question is how they will change to adopt real-time telemetry, the automatic transmission and measurement of data (see the sketch after this list).  Technologies, however, do disappear.  Think of pagers and (arguably) watches.
  • Businesses should make sure their management never hires their friends.  Interestingly and counter-intuitively, at least to me, he learned this from the two young Google founders.
  • Consensus without discord is likely not a true consensus.  If you have discord only, then you have a university.  For business, you need discord plus a deadline in order to arrive at what is likely a true consensus.
  • The acceleration of cloud computing is seeing the top technical people building the most powerful applications on mobile platforms first.  It’s a big shift that is rapidly destroying some business models.  Media is an example of one that has already been disrupted by this.
  • There is so much data available now (due to telemetry) that we have the basis for such things as instant language translation (speak into a phone in one language and it comes out in another language on the other side), and uniquely built designer drugs (drugs manufactured just for me based on my physiology and specific condition as opposed to drugs for me based on a profile that I happen to fit along with millions of other people).
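On that first point, here is a minimal sketch of what real-time telemetry amounts to in practice: a device periodically measures something and transmits it for central analysis.  The collector endpoint and payload shape are my own invention, not anyone’s actual system.

    # A minimal telemetry sketch: measure, then transmit.
    # The endpoint and payload shape are hypothetical.
    import json, time, urllib.request

    TELEMETRY_URL = "https://example.com/telemetry"  # hypothetical collector

    def read_sensor() -> float:
        """Stand-in for a real measurement (temperature, position, usage)."""
        return 21.5

    def emit(reading: float):
        payload = json.dumps({"ts": time.time(), "value": reading}).encode()
        req = urllib.request.Request(TELEMETRY_URL, data=payload,
                                     headers={"Content-Type": "application/json"})
        try:
            urllib.request.urlopen(req, timeout=5)  # fire-and-forget send
        except OSError:
            pass  # a real client would buffer and retry

    for _ in range(3):  # a real device would loop indefinitely
        emit(read_sensor())
        time.sleep(1)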

My regular readers understand by now that I like to describe what I’ve seen, read, or heard and add my own thoughts or interpretations.  I won’t dispute anything I heard in that short Schmidt interview.  In fact, it all sounds completely plausible.  What I will say is that when I hear people like Eric Schmidt talk, my mind goes to my late father and how he would have marveled at the world today.  He was a guy born in the time of horse and buggy, Teddy Roosevelt, and Wilfrid Laurier.  Henry Ford had yet to develop the concept of the assembly line.  A depression, two world wars, many regional conflicts, and personal hardship failed to dampen his interest and enthusiasm for life.  Next year would have been his one hundredth birthday.  If he were alive, he would have declined the drugs, as he always did, but he certainly would’ve been eager to call someone overseas for a conversation, thrilled to know that each party could speak comfortably and fully in their own language.

High-tech misunderstandings

Here’s a little story that may boost your faith in the ability of strangers to behave with compassion and humor under the most mundane and anonymous of circumstances.

I once received a text message that caused me to look over my shoulder and all around, expecting to see someone filming my reaction.  Having never experienced the sort of vivid language that I imagine can only be had within the intimacy of a deranged relationship, I thought it had to be a joke that someone was assaulting me over my phone with accusations of abuse, cold-heartedness, and abandonment.  I decided it had to be a she, since the words contained the depth of drama and hurt only a young woman can seem to muster.  I reread the text and, as I did, another one came in from the same person.  The second text was over-the-top apologetic, as she acknowledged that she had sent the first text to my phone number in error.  The words of the second text were as full of regret as the first was full of pain.  Maybe most people wouldn’t have, but at that point I decided to reply.  I told her it was okay that she had accidentally sent me the text, that everyone makes mistakes.  I wished her luck and said I hoped things worked out for her.  She immediately replied: “LOL!  Thanks for understanding.  I wasn’t expecting a nice wish.  Thanks man.”  That was that.  I smiled, and the story made for a good little snippet of conversation at a dinner party we had that evening.

High tech has brought the world tremendous benefits.  I don’t have to list them here.  But there are also ways that high tech has been less than tremendous.  And, guess what, I don’t have to list those here either.  I’ll let this Wall Street Journal article, High-Tech Messages Can Get Lost in Translation, do that for me.  I encourage you to read it.  It’ll make your day and may even liven up your next dinner party.

Data ownership: keeping individuals in the Internet game

A fascinating discussion took place last Thursday evening at the University of Toronto’s Rotman School of Management.  It was one of a series of moderated talks at the school dealing with a variety of topics.  What hooked me into registering and attending were the speakers, Jaron Lanier and Tim Wu (who, BTW, grew up around the U of Toronto campus).  If you’re familiar with their histories, you’ll know they are original and controversial.  Each is famous: Lanier for being widely acclaimed as the inventor of virtual reality and a leading critic of Web 2.0, and Wu not only for coining the term Net Neutrality but also for serving in the Obama administration as an advisor to the Federal Trade Commission.

The talk was wide-ranging, but what I want to highlight here is the topic of data ownership: the notion that data should belong to whoever produced it; that an article, a blog, an email, even a tweet may contain original thought, and if so, should be owned by its creator.  This idea stems from the concern that the Internet as we know it is increasingly at risk of becoming a closed environment, controlled by a very few massive corporations and governments.  These entities are blurring the boundaries between content providers and carriers; in some cases, they have combined to become both.  This development may seem benign on the surface, but it will likely end up stifling the openness and expansiveness of the Internet.

What may forestall that is if, among other things, original thought in the form of data is owned by individuals, who could, if they chose, sell it.  The belief is that individual ownership of data would thwart a worrying trend we see today, where data is increasingly produced and carried by corporations, crowding out individuals.  As well, the data produced by one person is often rehashed and reused by others, with no credit or acknowledgment given to the original author.  Owning and selling data sounds simple, right?  The technical functionality to make that scenario workable is already in place today.  By utilizing search algorithms and massive server farms, along with correlation and tagging (in other words, what Google does every second of every day), associating unique writings with specific individuals is possible.  As Lanier said, “It’s a computer.  Complexity is a no-brainer.”  What’s not simple is figuring out how to fairly monetize the data.  And that means not only figuring out how to spin your data in ways that convince others it is worth paying for, but also determining how much they should pay.  That’s a topic I won’t broach here, other than to say it may end up looking not much different from the way musicians today are compensated, in theory, each time one of their songs is played on commercial or satellite radio.
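For the curious, here is a rough sketch of the kind of attribution machinery that scenario implies: fingerprint a piece of writing with hashed word shingles, then match new text against a registry keyed by author.  The names, threshold, and scale are my own invention; production systems would be vastly more sophisticated.

    # Sketch: attribute text to its creator via hashed word shingles.
    # Registry, threshold, and names are hypothetical.
    import hashlib

    def fingerprint(text: str, n: int = 5) -> set:
        """Hash every n-word shingle of the text into a set of digests."""
        words = text.lower().split()
        grams = (" ".join(words[i:i + n]) for i in range(len(words) - n + 1))
        return {hashlib.sha1(g.encode()).hexdigest() for g in grams}

    registry = {}  # author -> fingerprint set (a stand-in for web-scale tagging)

    def register(author: str, text: str):
        registry.setdefault(author, set()).update(fingerprint(text))

    def likely_source(text: str) -> str:
        fp = fingerprint(text)
        scores = {a: len(fp & known) / max(len(fp), 1)
                  for a, known in registry.items()}
        author, score = max(scores.items(), key=lambda kv: kv[1])
        return author if score > 0.5 else "unknown"

    register("alice", "data should belong to whoever produced it in the first place")
    print(likely_source("data should belong to whoever produced it in the first place"))
    # prints: alice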