The melding of social and business data

It had to happen eventually.  When Mark Zuckerberg said, “People have really gotten comfortable not only sharing more information and different kinds, but more openly and with more people. That social norm is just something that has evolved over time”, it told me that he, like a lot of other people, recognizes that society has accepted the tremendous benefits of Internet connectivity and judged that those benefits outweigh the downside risks.  The list of risks, beyond the obvious cyber-crime, malware, and viruses, should include one with the potential for greater impact: that our personal information may be used for profit by parties to whom we have not granted explicit permission.

Today’s edition of the Wall Street Journal contains an interesting article about insurers mining data to profile clients.  We should all know that the insurance industry has forever relied on massive amounts of data to set premiums properly and to guarantee profit.  That is just smart business, and there is nothing unusual about exploiting data for those purposes.  The difference this time is that insurers are no longer mining only the data they collected themselves.  Instead of relying on blood and urine tests to form the basis of their decisions, they are attempting to answer that thorny question of whom they should and should not insure from many external data sources: online shopping histories, catalog purchases, magazine subscriptions, and social-networking sites.  That sounds like a rich store of information upon which to begin building a profile.  Apparently, the testing is at the earliest stages, and processes are being built to allow for review of borderline cases.  That’s good, because based only on what I read in the article, I think the “technology” has a long way to go before it can be trusted.  While I applaud the effort, there is still too much room for error: too much chance of someone being declined for a policy, or having to pay a much higher premium, because of behavior they exhibit within their communities on Facebook, for example.
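As a thought experiment, and not anything the article spells out, the kind of process described above might gate decisions roughly like this minimal sketch.  Every source name, weight, and threshold here is invented for illustration:

```python
# Hypothetical sketch: combine per-source risk signals (each in [0, 1])
# into one score, and route borderline scores to a human reviewer.
WEIGHTS = {
    "online_shopping": 0.2,       # invented weights; a real model
    "catalog_purchases": 0.1,     # would be far more complex
    "magazine_subscriptions": 0.2,
    "social_networking": 0.5,
}

def risk_score(signals: dict) -> float:
    """Weighted sum of whatever signals are present; missing ones count as 0."""
    return sum(WEIGHTS[k] * signals.get(k, 0.0) for k in WEIGHTS)

def decide(signals: dict, low: float = 0.35, high: float = 0.65) -> str:
    """Auto-approve or auto-decline clear cases; send the rest to review."""
    score = risk_score(signals)
    if score < low:
        return "approve"
    if score > high:
        return "decline"
    return "human review"  # the borderline-case review the article mentions

applicant = {"online_shopping": 0.3, "social_networking": 0.9}
print(decide(applicant))  # score 0.51 lands in the review band
```

Even in this toy form, the design choice is visible: the thresholds decide how much of the error the author worries about gets caught by a human before it becomes a declined policy.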

I’m a big fan of computing, of its power and potential for improving our lives and the way we conduct our social and commercial affairs.  I’m also a big fan of circumspection, and human insight.  Let’s hope the two can always work well together.

Ten tech-enabled business trends

The August 2010 issue of McKinsey Quarterly included an article titled “Clouds, big data, and smart assets,” with the subtitle “Ten tech-enabled business trends to watch.” I hesitate to say, but say it I must, that even though the article was published way back in August, three long months ago, it is still highly relevant. Of course that’s a joke, but you get my point: this industry we work in changes rapidly, or maybe it’s just its attention span that changes rapidly. Anyway, the article is very interesting (you’ll have to sign up for a free registration to read the entire thing), and it includes such thought-provoking items as:

  • companies using the massive amounts of available data to conduct constant experimentation
  • imagining anything as a service
  • innovating from the bottom of the pyramid

Intelligence is having a sense of perspective

I think back on my father’s life, when technology seemed to be barely a word, one spoken very sparingly outside of institutes of advanced research.  Perhaps he heard it while listening to the CBC when the Russians announced they were sending a chimp into outer space, but aside from that I cannot imagine another time before 1980 when the word would’ve meant anything more to him than television.  Radio, telephones, and cars would not have qualified, because they had been around almost his entire life and, frankly, there wasn’t a lot to them that would qualify them in today’s terms as technology.  It’s different now, of course.  As I look around at the desk in front of me, I imagine that if my father were still alive I would need to spend days with him explaining the gadgets and how they could even remotely be seen as tools of legitimate work.

I have two BlackBerrys (long story involving IT policies limiting the user’s freedom), a phone that looks nothing like the Bell models he was familiar with (where are the wires? and what the heck is that thing with the loop that hangs on an ear?), a Bose headset (for listening to webcasts), a laptop, a monitor, which of course he would see as a TV (“And this is work?” he’d ask), an iPod Touch (for fun and inspiration), and an Italian lamp whose LED bulbs, I’ve been told, will burn bright for many decades, maybe centuries, after I join my father wherever he is.  All this stuff… well, most of it… is critical to my work, and I would be almost useless in my job without it.

It’s easy to be seduced by words and symbols.  Words can convince us that something we suspect to be true, that we want to be true, is, in fact, true.  Devices that make us perform functions faster are often symbols of sophistication and advancement, but does their existence and utility, and my owning them, mean I am any more intelligent than my father?  I doubt it.  Multi-tasking is certainly, almost indisputably, something my generation excels at compared to my father’s.  But in comparison to mine, the same can be said of the generation presently in their 20s and 30s.  Does their ability to juggle more things simultaneously translate into greater intelligence?  Maybe… but I wouldn’t bet on it.  This generation-to-generation comparison of intelligence is much discussed, and it makes me wonder: does it matter?  Learning positive lessons from previous generations, whether consciously or unconsciously, lessons that improve life for many, is called evolution.  Clearly that has happened for, well, forever.  Shouldn’t the incremental progress be the focus, and not whether this generation is smarter or dumber than the one before or after?

I’ve been in IT for my entire career, and while I have seen, and been part of, the massive change of the last 30 years, I don’t feel any more intelligent than my father, the lover of good conversation, a variety of high-brow fiction, and gorgeous Italian arias.  Sure, beyond the devices I mentioned above, I know, can explain, and enthusiastically support why a company would want to exploit virtualization and the Cloud, or how to build and track a web application.  But my father knew how to set time aside after a long day of work, sit on the front porch at dusk, and, as the heat and humidity lifted, listen to Ernie Harwell call balls and strikes as the Tigers marched towards the World Series.  Who’s smarter?

Tribute to Software Quality

The world lost a fellow named Watts Humphrey in October.  Mr. Humphrey was known as the father of software quality, a topic that for most people is unlikely to stir strong emotion, and I suppose that could mean he excelled in his mission.  If the phrase “software quality” conjured up for most people feelings of anger or disregard, then perhaps good old Mr. Humphrey should not have felt proud of his nickname.  Indeed, and especially in the last 10 years or so, software quality has been so important to world commerce, human health and safety, and human leisure that one can seriously ask the question, “Without optimal software quality, what would we have?”  He helped us arrive at a point where we take for granted that things just work, and we should hit pause and ponder for a second how remarkable that is.  When we start our cars, or turn on our iPods, or order up a movie on cable, we don’t close our eyes, cross our fingers, and hope for the best.  We may do that when we watch a shuttle launch, but I think that anxiety has more to do with the explosives set to be ignited than with the quality of the software controlling them.  But for the everyday computerized tasks and machines of our lives, we don’t think about the behind-the-scenes process at all.  We take for granted that these things will produce the result we expect.  Yes, I know about software bugs, but I also know computing and computing processes are more sophisticated than they have ever been, and increasingly glitches get fixed before we even know anything happened.  For all of that, we can thank people like Watts Humphrey, who toiled away in unglamorous occupations, thinking of logical ways to make machines and people work better together.

Banks and Cloud Computing

I was doing some research for my monthly customer newsletter and stumbled across a pretty informative website called Bank Systems & Technology. Admittedly, it’s not the jazziest name around, and you can safely assume that the riskiest phrase to be found there is Disaster Recovery. Anyway, I found an article that discusses the progress banks are making in embracing Cloud Computing. We know that the financial services industry has special considerations (whether to move to the cloud mission-critical applications that are running smoothly on legacy platforms, security concerns that can dwarf those of other industries, etc.), and the author does a good job of touching on these while providing a synopsis of a recent survey.