Global Cities

A.T. Kearney, the management consulting company, published a report last year ranking the top cities in the world against various measurements. The report is titled The Urban Elite and can be viewed here. Easy to read and make sense of, it does a thorough job of explaining why it ranks the cities the way it does. Starting with a list of the top 65, it proceeds to break out 5 categories and the top 10 cities in each. For example, the list of the top 65 rates how well each city did across all 5 categories combined (Toronto at #14 and Montreal at #31 are the only Canadian cities on the list). The 5 categories that are then broken out are: Business Activity, Human Capital, Information Exchange, Cultural Experience, and Political Engagement. The usual suspects show up in the top 10 for each category (New York, Paris, Tokyo, Hong Kong, London, etc.) but so do others like Singapore, Chicago, Boston, and Toronto (which shows up at #9 for Human Capital). To give the Human Capital category a bit of context: it measures a city's investment in brain power, how well educated and diverse its population is, whether it hosts internationally known universities, and so on.

I found the report extremely interesting but not particularly surprising. The world is becoming increasingly mega-urban, and competition is now truly global, not inter-provincial or inter-state like it was just 20 years ago. As a resident of Toronto, I am happy to see my city on the list, especially in the top 10 for Human Capital, but it also causes a bit of anxiety: cities all over the world are doing the right things too, and just because a city is ranked well one year does not mean it will stay there for long.

Big Data, Big Job

Sticking with the Google theme I started in my last post, I read an interesting report called The Promise and Peril of Big Data on the Aspen Institute website. The author, David Bollier, cites a lot of large numbers to characterize the exponential growth of data, including something called, and I quote, "a yottabyte, which is a trillion terabytes or, as one website describes it, 'everything that there is.'"

And in ten years we'll probably be able to fit all that on a thumb drive. (Joking, at least for now, I think.)
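For scale, the arithmetic behind that quote does check out. Here is a quick sanity check of my own (using decimal SI units; this is not from the report):

```python
# Sanity-check the claim that a yottabyte is a trillion terabytes,
# using decimal (SI) units.

TERABYTE = 10 ** 12   # bytes
YOTTABYTE = 10 ** 24  # bytes

print(YOTTABYTE // TERABYTE)  # 1000000000000, i.e., one trillion
```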

The report mentions Google often, and with good reason. That company exists because it figured out how to efficiently collect massive amounts of information and, through correlation, monetize the exercise. What I find interesting about the report is the debate around whether that's all we need as a society: whether collection and correlation cancel out the need for scientific theorizing and the modeling and testing of theories. At this point I side with the science folks, and this quote from Michael Chui of McKinsey & Company sums it up nicely: "Theory is about predicting what you haven't observed yet. Google's headlights only go as far as the data it has seen." Pretty clear-cut, no? It can be argued that Google has done, and continues to do, a tremendous amount of good through the way it collects and transforms data to become, in a sense, more than the sum of its parts. But it still hits that wall of limitations. It can't know what it can't see. At least for now. Who knows what sort of sophisticated programs might be developed to mimic the thought processes of top scientists?

All this leads to a link that caught my eye today. It came to me through a subscription email and was touted as "industry reports" about how the top jobs in technology are related to data mining and machine learning, business intelligence, and analytical statistics. I don't disagree with this, but I was disappointed to discover, when I clicked on the link, that the "industry reports" turned out to be a blog post by a person from Microsoft. That would have been fine, except the post did not contain any links or references to any published studies or reports. The list of top jobs was one person's opinion. My point in including this anecdote in this particular post is that, yes, there is a tremendous amount of data being generated, and not all of it is useful or even honest. Google has its work cut out for it.

Seeking infallibility?

 It seems they’re in the news everyday and I’m okay with that.  I love Google, the tool.  Although I like to see good companies succeed, I don’t own the stock so I have no strong feelings one way or the other about them as a company.  But, oh their search engine….. that is certainly an incredibly useful utility.  And they seem to be very serious around innovation and I’m a fan of that.  But Google’s existence has caused many people of a certain age to rhetorically ask, “What did we do before Google?”  Meaning, how did we find things out about anything before Google gave us a simple way to do it?  For you young people, the answer of course is that we had libraries, encyclopedias, tribal knowledge, lies, make-believe, and imagination, usually in that order.  Are we better off now that Google contains all human knowledge (tongue-in-cheek)?  I think so but those other methods are probably getting a little dusty and rusty by now. 

They really are a fascinating story, though. From a seemingly boring little utility that promised to make searching easier and faster, they've grown to be seen as a threat to personal privacy, which caused their (now former) CEO, Eric Schmidt, to utter these immortal words: "If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place." It's a good line, and a good point, I suppose. (See WSJ: Eric Schmidt for more.) But before we resign ourselves to the thought that Google can do no wrong and that it's inevitable we will cede our entire lives to a massive database and some analytical software, you should read Me and My Algorithm, a hilarious article by Seth Freeman that was published a few days ago in the New York Times. I think I'll start paying attention to that sidebar on my Gmail.

Connectivity and the push for total uptime

Two recent articles may point, I think, to the convergence of two apparently unrelated topics. The first was written by Google's Eric Schmidt and Jared Cohen and published in Foreign Affairs in the November/December 2010 issue. Titled Digital Disruption: Connectivity and the Diffusion of Power, it posits that, due to the ubiquity of connection technology, this century will be full of challenges and surprises for governments, media, and corporations around the globe. To paraphrase: the genie of information is out of the bottle, and containing it will prove almost impossible. This presents both tremendous opportunities for human advancement and significant challenges for how we govern ourselves. They wrote the article just before WikiLeaks hit the front pages and, in part, proved their point.

The second article, by Randall Stross, was published last week, on January 8, in the New York Times and discusses why the fabled five 9s of enterprise computing (99.999% availability, which translates to only 5.26 minutes of downtime per year) will probably never come true. For those of us who've worked in IT for the last 20 years or so, the five 9s was a cliché, like shooting for the moon. We knew we could never afford to build, nor were we sophisticated enough in our processes to support, the redundancy necessary to enable such extremely available systems. Now it seems even Google agrees: they state that four 9s (52.56 minutes of downtime per year) is their goal.
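If you want to see where those downtime figures come from, here is a minimal sketch of the arithmetic (my own illustration, not from the Stross article):

```python
# Convert an availability target, expressed as a number of "nines",
# into the maximum downtime allowed per year.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def downtime_minutes_per_year(nines: int) -> float:
    """Yearly downtime permitted by an availability with `nines` nines (e.g. 5 -> 99.999%)."""
    unavailability = 10 ** -nines
    return MINUTES_PER_YEAR * unavailability

for n in (4, 5):
    print(f"{n} nines -> {downtime_minutes_per_year(n):.2f} minutes per year")
# 4 nines -> 52.56 minutes per year (Google's stated goal)
# 5 nines -> 5.26 minutes per year (the fabled target)
```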

So where is the convergence, you might be asking. If Nicholas Carr is right when he equates the future of computing delivery with electricity, then shouldn't five 9s be what we expect and demand as consumers? Electricity in most developed areas of the world rarely goes down at all over the course of an entire year. Enough redundancy is built into the supply infrastructure to ensure that when we plug an appliance into an outlet or flip on a light, we get what we expect: power. If universal connectivity is truly upon us, and if more communication and social media applications like Facebook and Twitter continue to crop up, then it seems logical to anticipate that computing suppliers will figure out how to deliver their services virtually all the time (five 9s). It's clear to me that one is driving the other.
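That redundancy point is where the numbers get interesting. As a rough sketch (my own illustration, with made-up figures), independent backup components multiply away the probability of failure, which is how a utility grid can turn modestly reliable parts into a nearly always-on service:

```python
# Rough sketch: availability of n independent components in parallel.
# The system is down only when every component is down at the same time.

def parallel_availability(component_availability: float, n: int) -> float:
    failure_probability = 1 - component_availability
    return 1 - failure_probability ** n

# Two independent "three nines" (99.9%) components in parallel:
print(f"{parallel_availability(0.999, 2):.6%}")  # 99.999900%, nearly six nines
```

The catch, of course, is that the components have to fail independently, and engineering that independence is exactly the expense the five-9s skeptics point at.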

Public speaking with the world watching

I've been in a role that has required public speaking for the last 13 years or so, and while I've been in front of some large audiences (one in particular in São Paulo, with translators, was very interesting and sort of funny), my experiences pale in comparison to what high public officials have to subject themselves to. We should know that they all have speech writers, speaking coaches, and all manner of support personnel to ensure their events go as smoothly as possible. But despite all the preparation and rehearsal, it still comes down to that one person having to deliver that speech with clarity, passion, and authenticity. Which leads me to this.

The New York Times provided video examples today of how U.S. presidents over the last 50 years have dealt with events of national mourning caused, with the exception of the Challenger explosion, by individuals committing violent acts. The piece, called Executive Consolation, is an excellent primer for anyone interested in understanding the power and peril of leading through words, and their delivery, at that level. I was only a young boy, but my parents were keen political observers, and I can remember seeing Lyndon Johnson give his speeches after each of the Kennedy assassinations. Perhaps because he was not a great public speaker (his strength was in privately securing legislators' votes; twisting arms), he gave me a good baseline for my lifelong interest in and appreciation for the craft of public speaking. No matter your political leanings, the videos do a good job of recapturing those terrible, emotional, and risky periods, when words made a difference, even if they were mostly read from notes and teleprompters. But you can tell that even in those situations, sometimes the most impact comes from the unscripted. Watch George Bush give a masterful off-the-cuff performance at Ground Zero; the cowboy knew instinctively how to raise up the people at that moment. Or watch Bill Clinton at the end of his 9-minute speech: you can tell he's not looking at his notes anymore as he delivers what are probably his most powerful lines. Words matter, and they matter even more when they can be delivered with honesty and conviction.

Blogging the easy (and wrong) way

I think it's time someone hit RESET on the whole blogging thing and everyone became educated about one simple ground rule. Blogging is not just about passing on found information that you feel might interest others. It is about many things, but one thing I would like to see agreement on is this: if bloggers cannot be original and feel they must pass on the work of others, then they should take the core message of that found information, agree or disagree with it, and contribute some of their own thoughts and opinions, thereby adding value to the conversation that may have been launched by the writer of the original piece. It seems to me too many bloggers are using blogs as elaborate Twitter pages. Twitter I understand as a conduit for information found elsewhere, although I still believe you should, in the 140 characters or fewer, tell us why you feel the information is of interest or matters to the recipient(s).

I'll tell you why I'm riled up about this. I read a book review on the New York Times website, and the reviewer quoted a line from the book, a very witty line that I thought portrayed the author of the book as a clever person. I Googled that line and was surprised (yes, I might be very naive) to see that a blogger had cut and pasted the entire book review (including the reviewer's name, but not the publication) into their blog. No commentary was added; no added value was evident to me. Our collective knowledge had not been advanced. The irony is that the book being reviewed posed a question about whether the Internet is changing the way we think. Perhaps it's making us think less.

Easing into the new year

I had a professor in my final university year who declared that, because of the good spirit that prevails over the holidays, people need gentle coaxing to get back into the grind of regular life. And so he felt that the first class taught after the start of the new year should be brief and full of levity. It's probably only a slight exaggeration to say that class was my favorite in all my years of learning. In a nod to his philosophy, I will keep this post light.

Simple question of the month: What do the following items have in common?

Facebook lawsuit – If you've seen the terrific movie The Social Network, you may be curious to learn that just because a lawsuit reaches a settlement, it doesn't necessarily mean the matter is closed.

video games are changing the economy – It's probably a good thing that the economy is now driven more by kids' thirst for fun than by the military's thirst for weaponry.

polar bear cam – Ingenuity meets nature in minus 30 C temperatures at the North Pole.

If you answered that these items have nothing in common, besides being interesting and being only marginally technical, you’d be right. But I did get you to smile, didn’t I?