Monthly Archives: November 2012

We Think Therefore We Are: The Study of Collective Intelligence

Countless studies have analyzed the nature of human intelligence in individuals, but now, with the growth of the Internet and social media (Facebook, LinkedIn, Twitter, etc.), new models and approaches are being called upon as the new field of 'Collective Intelligence' comes to the fore.

Collective intelligence is simply what its name implies: studying the nature of intelligence as a series of groups or collectives. And the man who’s leading the development of this new field is Thomas Malone of MIT (http://io9.com/5962914/the-emerging-science-of-collective-intelligence–and-the-rise-of-the-global-brain). As Dr. Malone put it:

I’d define collective intelligence as groups of individuals acting collectively in ways that seem intelligent. By that definition, of course, collective intelligence has been around for a very long time. Families, companies, countries, and armies: those are all examples of groups of people working together in ways that at least sometimes seem intelligent.

Well, at this point, one has to point out that groups of people don’t always act intelligently – at least not in ways that we may regard as being smart – as Dr. Malone also pointed out:

It’s also possible for groups of people to work together in ways that seem pretty stupid, and I think collective stupidity is just as possible as collective intelligence. Part of what I want to understand and part of what the people I’m working with want to understand is what are the conditions that lead to collective intelligence rather than collective stupidity. But in whatever form, either intelligence or stupidity, this collective behavior has existed for a long time.

So what does any of this have to do with the price of beans in China, you may ask? After all, it all seems kind of obvious when you think about it (as the old saying goes, ‘a person is smart; a group of people aren’t’).

Think again:

What’s new, though, is a new kind of collective intelligence enabled by the Internet. Think of Google, for instance, where millions of people all over the world create web pages, and link those web pages to each other. Then all that knowledge is harvested by the Google technology so that when you type a question in the Google search bar the answers you get often seem amazingly intelligent, at least by some definition of the word “intelligence.”

Or think of Wikipedia, where thousands of people all over the world have collectively created a very large and amazingly high quality intellectual product with almost no centralized control. And by the way, without even being paid. I think these examples of things like Google and Wikipedia are not the end of the story. I think they’re just barely the beginning of the story. We’re likely to see lots more examples of Internet-enabled collective intelligence—and other kinds of collective intelligence as well—over the coming decades.

If we want to predict what’s going to happen, especially if we want to be able to take advantage of what’s going to happen, we need to understand those possibilities at a much deeper level than we do so far. That’s really our goal in the MIT Center for Collective Intelligence, which I direct. In fact, one way we frame our core research question there is: How can people and computers be connected so that—collectively—they act more intelligently than any person, group or computer has ever done before? If you take that question seriously, the answers you get are often very different from the kinds of organizations and groups we know today.

Collective intelligence analysis, then, is a field which studies how people think on a collective basis. It brings to mind Isaac Asimov's famous "Foundation" series, in which the eventual collapse and fall of a futuristic galactic Empire is predicted by a so-called "psycho-historian." His magnum opus is a detailed plan of the galaxy's future: it foretells the fall of the Empire, then lays out a detailed plan for the establishment of a replacement Empire 1,000 years after the fall – all by predicting how people will behave in groups, which is presumably collective intelligence analysis at work.

Does it all sound far-fetched? Perhaps. But consider the many stock market programs and financial computer systems put in place following the 1987 stock market crash: safeguards designed to prevent 'panics' and other group / market scenes by anticipating how the herd acts (as the old saying goes, 'it's fear and greed which largely motivate Wall Street'). Understanding how the herd thinks (depending, of course, on which herd you're looking at) is now more important than ever before.

We’re talking about big money here – and much, much more than just about money.

We live in a crowded world with diminishing resources: the ice caps are melting and population groups are becoming restless (witness the recent study correlating the so-called Arab Spring riots directly with rising food prices – http://www.mindfulmoney.co.uk/13103/sector-watch-/the-economic-consequences-of-rising-commodity-prices.html). With all of this, it's now – more than ever before – a matter of our survival to know and understand how people think, especially in groups, as there are few things as dangerous as when groups of people come together (anything can happen: an impromptu football game, a sudden Shriners parade – or worse!).

It’s also worth noting that understanding how groups of people work is also fundamental to the success of any political / electoral undertaking.

So what; old news. We’ve all come to expect and realize how people behave badly (or otherwise) in large groupings.

Not really.

All of this also underscores a subtle – but very significant development – which Dr. Malone points out: where do we draw the line between collective intelligence and cognitive intelligence inherent within our computer networks? It’s becoming more and more like the ‘push / pull’ scenario: who or what is doing which? Or can we ever really tell?

This distinction is one that’s becoming more and more blurred – and only promises to continue doing so as we become more and more intermeshed with our social media / computer networks. Who we are is increasingly being defined as part of a greater electronic vision of ourselves: at what point does the mirror reflect upon itself and its electronic image – and not just directly on us?

No matter how you look at it, there is tremendous potential within this field of study. Perhaps, with a little luck, we can not only understand how and why groups of people behave as they do, but eventually figure out how to keep them from doing stupid things.

Secession (Again) in the United States,…!?

The 2012 election was a watershed moment in our nation's history. For the first time (so we're being told), a working majority of the electorate (at least of those who bothered to participate) consisted of individuals who are considered 'minorities' – i.e., those other than white males.

Or is that really the case? Well, maybe not quite this last election, but as one editorial pointed out:

Those “demographics” — non-white voters — represented just 28 percent of the electorate in 2012, according to analysis from my Chicago-based policy shop. That 28 percent has the potential to grow exponentially in 2016, as Census data tell us that today’s age 10-19 population — many of whom will vote for the first time in the next election — is 45 percent non-white. While these youth numbers don’t correlate directly with the realities of the 2016 electorate (aside from the obvious fact that 14-year-olds can’t vote, there are compounding factors that will keep many of these youth from casting a ballot on November 8, 2016), the new demographic reality is evident. (Blogger’s note: this is my highlight). Further proof is offered by a recent Pew Hispanic Center report demonstrating how the Latino electorate will likely double in 2016, accounting for a jaw-dropping 40 percent of the growth in the eligible electorate over the next four years.

Call them voter ID requirements or voter suppression measures, these efforts backfired — horribly — for the GOP. An estimated 11 million Latino voters cast a ballot this year, up 13 percent from 2008’s record-breaking figure of 9.75 million. And some speculate that voter suppression was precisely what drove “angered and shocked” African-American voters to the polls — to overwhelmingly throw their support behind Obama.

Slate put it best — or at least most bluntly: “Only white people voted for Mitt Romney.” (http://www.huffingtonpost.com/sylvia-puente/latinos-election-2012_b_2146203.html).

(And to think that the man has Mexican family roots; too bad he didn’t think of bringing that up when it might’ve helped,…).

So if 2012 didn’t really have a “non-white” majority, then we can expect this to change – sooner than expected. Welcome to Tomorrow,…

Personally, the true, overlooked irony in all of this (and perhaps I'm showing my own age) is the fact that we actually had a Presidential election choice between an African-American / Black candidate and a Mormon candidate; if you had tried telling that to people some forty years ago, you probably would've been laughed out of the room.

Given the historical trials and tribulations the Mormons underwent, this is a rather remarkable event. Similarly, given the pain and suffering that African-Americans / Blacks have undergone, it is truly telling that a nation would not only allow but encourage people to rise to their best – and not only become part of the fabric, but actually attain high, powerful leadership roles.

So even though the 'last stand(ees)' decry the changing nature of our country – i.e., the 'there goes the neighborhood' folk – for all practical purposes the change has already happened: they threw their weight behind somebody who, but a couple of decades ago, would also have been considered a total outsider: a Mormon.

Talk about radical change; but then again, this is ultimately what the United States is about.

Usually the biggest changes are the ones that creep up on you.

Some people had foreseen this event taking place: witness the Reverend Martin Luther King, Jr., who, during a BBC interview back in 1966, predicted that “in forty years from now, we shall have a black president.” (http://www.bbc.co.uk/search/martin_luther_king).

But with every revolution there is a counter-revolution; some folks don't like things to change – and this is to be expected. Witness the rise of the "Secession Petitions" – i.e., petitions submitted by people requesting to place a motion on their respective ballots to have their states secede from the United States of America.

Fort Sumter, anyone?

So the petitions keep on coming in: more and more people (though still few and scattered) openly speak of having their respective states separate from the United States of America.

Interestingly enough, the majority of these come from Texas, but given Texas’s prior history as (for a brief time) an independent nation, it should come as no surprise (http://www.examiner.com/article/texas-succession-request-gathers-over-113-000-signatures-historic-move).

But here's some food for thought: look back at the election of 1860, when Abraham Lincoln won the Presidency, and note where Lincoln's electoral base came from:

Now compare this map with the 2012 presidential election:

Look familiar?

Granted, Nevada and Washington, along with Colorado and New Mexico, were not yet states at the time of the 1860 election, but you get the picture.

What these maps show is that sadly, some things haven’t changed.

Maybe it’s no coincidence that President Obama regards Abraham Lincoln as one of the greatest presidents of our nation (as was witnessed for the past several months by his pre-election day statements). And depressingly, given the attitude of some folk, maybe it shouldn’t come as any surprise: like Lincoln, people who represent change aren’t always welcomed.

It's important to remember we're all part of something bigger; you learn to live with that reality – and above all, with one another. Maybe that's one important message that Thanksgiving teaches us: we all bring something to the table – and when we do, there's more food to eat and nobody goes hungry.

Secession is not a valid option; divisiveness must stop. Recognition of economic rights for all must be attained; otherwise it's not going to get any easier to do business in this country. If you don't believe this, just watch China in the coming decade as it starts down a path the United States already traveled – beginning around 1880, with the labor unrest and riots that erupted in the years following a bloody Civil War.

We can’t afford that.

The Union: now and forever.

Old and Cold Versus Young and Warm

…and no, I don't mean that in terms of what you say when you're at your local bar trolling around for a date on a Saturday night. Rather, it is the result of an extensive review of weather data which, summarized, states the following:

If you're 27 or younger, you've never experienced a colder than average month.

Wow; that’s heavy.

Think about it: this says that temperatures have been gradually increasing for some time now – nearly thirty years – to the point where nearly two generations of humans have never experienced a single colder-than-average month, globally. Every land surface in the world saw warmer-than-average temperatures except Alaska and the eastern tip of Russia. The continental United States has been blanketed with record warmth – and the seas just off the East Coast have been much warmer than average (which hurricanes like Sandy love to feed off of).

The report, issued by NOAA (that's the National Oceanic and Atmospheric Administration – the folks who bring you the official weather report), goes on to state:

The average temperature across land and ocean surfaces during October was 14.63°C (58.33°F). This is 0.63°C (1.13°F) above the 20th century average and ties with 2008 as the fifth warmest October on record.

* The record warmest October occurred in 2003 and the record coldest October occurred in 1912.

* This is the 332nd consecutive month with an above-average temperature.
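Those figures are easy to double-check. An absolute Celsius temperature converts to Fahrenheit via F = C × 9/5 + 32, while a temperature difference (the anomaly) converts by the factor 9/5 alone – a distinction the NOAA numbers rely on. A quick back-of-the-envelope sketch in Python:

```python
# Sanity check on the quoted NOAA figures.
# An absolute temperature converts as F = C * 9/5 + 32;
# a temperature *difference* (anomaly) converts as dF = dC * 9/5.

def c_to_f(temp_c):
    """Convert an absolute temperature from Celsius to Fahrenheit."""
    return temp_c * 9 / 5 + 32

def c_anomaly_to_f(delta_c):
    """Convert a temperature difference (anomaly) from Celsius to Fahrenheit."""
    return delta_c * 9 / 5

print(round(c_to_f(14.63), 2))         # October 2012 global average: 58.33 °F
print(round(c_anomaly_to_f(0.63), 2))  # anomaly above the 20th-century average: 1.13 °F
```

The anomaly works out to the quoted 1.13°F, so the Celsius and Fahrenheit figures are self-consistent.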

Bad weather costs money – big money. Bad weather is bad for business. Period. And while Hurricane Sandy's damages of around $50 billion (give or take an odd $100 million or so) were certainly big and expensive, they will likely be overshadowed by the huge costs of the great drought of 2012.

Drought of 2012?

Yes, indeed: experts have been busy comparing elements of the 2012 drought to the Great Dust Bowl of the 1930s, and while it will be several months before the costs of one of America's worst droughts are tallied, the 2012 drought is expected to cut America's GDP (Gross Domestic Product) by 0.5 – 1 percentage points, analysts from Deutsche Bank Securities announced this week.

Now that's a big cut – and talk about bad timing: just as things are starting to turn around economically, we get hit with this.

Nor is this just about money. Hurricane Sandy was involved in the deaths of over 113 souls, but the death tolls associated with droughts and heat waves are even higher: according to NOAA, the heat waves associated with the U.S. droughts of 1980 and 1988 killed 10,000 and 7,500 people respectively, and the heat wave associated with the $12 billion (in damages) 2011 Texas drought killed 95 Americans.

As for those who denounce the numbers, suggesting that global warming is merely some sort of liberal-minded anti-business conspiracy – well, that's your choice (and frankly, try telling that to the insurance carriers!). But just like baseball and politics, the numbers don't lie. Something serious is going on, and for those folks who don't want to acknowledge it, that's their choice. It's worth noting, however, that a number of the folk who denounce all of this as some sort of weird conspiracy are among the very same folk who refused to acknowledge the polling numbers and data from the last election – and in the aftermath of that election, a lot of those folk are out of a job or no longer collecting on their contracts.

In Las Vegas (or any other casino town), the numbers are just that: numbers. But unless you know the numbers, odds are you’re the one who’s going to be standing outside looking in, wandering about, seeking a place to crash on the Strip.

In the case of understanding what's happening to our climate, however, we're placing ourselves in a very unforgiving position: Earth is the only planet we have – and without it, we're dead.

Period.

There are just some things I won’t bet against the house on – and where I live is one of them.

Maybe it’s time we all started thinking this way.

For more, check this out: http://grist.org/news/if-youre-27-or-younger-youve-never-experienced-a-colder-than-average-month/

The Revolution Will Be Printed

(Apologies to Gil Scott-Heron).

Although it's a little early, the signs are there: 3-D printing is here to stay – and it's only just beginning.

Just what is 3-D printing? Simple: get a CAD (Computer Aided Design) program and design something – for example, the statues that you see in this blog post's picture. The one on the left is the original; the one on the right was cast from it via 3D printing.

In effect, you enter your design onto a computer program and plug it into what can loosely be called a ‘printer’ and voila! Instant object. To be more specific, here’s how it all works (via Wikipedia):

Additive manufacturing or 3D printing is a process of making three-dimensional solid objects from a digital model. 3D printing is achieved using additive processes, where an object is created by laying down successive layers of material. 3D printing is considered distinct from traditional machining techniques (subtractive processes) which mostly rely on the removal of material by methods such as cutting and drilling.

To perform a print the machine reads in the design and lays down successive layers of liquid, powder, or sheet material, and in this way builds up the model from a series of cross sections. These layers, which correspond to the virtual cross-section from the CAD model, are joined together or fused automatically to create the final shape. The primary advantage of additive fabrication is its ability to create almost any shape or geometric feature.
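The layer-by-layer process described above can be sketched in a few lines of code. The toy "slicer" below is a hypothetical illustration (not any real slicer's API): it cuts a simple solid – a sphere – into horizontal cross-sections, which is essentially the first step a 3D printer's software performs before laying down material.

```python
import math

def slice_sphere(radius, layer_height):
    """Cut a sphere of the given radius into horizontal layers,
    returning (height, cross-section radius) for each layer.
    Loosely mimics what slicing software does before a print:
    the printer then deposits material one cross-section at a time."""
    layers = []
    z = -radius
    while z <= radius:
        # The cross-section of a sphere at height z is a circle
        # of radius sqrt(r^2 - z^2).
        cross_radius = math.sqrt(max(radius * radius - z * z, 0.0))
        layers.append((round(z, 3), round(cross_radius, 3)))
        z += layer_height
    return layers

# A 10 mm sphere sliced at a 2 mm layer height yields 11 cross-sections,
# growing from a point at the bottom to the full radius at the equator:
for z, r in slice_sphere(10.0, 2.0):
    print(f"layer at z={z:+.1f} mm: circle of radius {r:.3f} mm")
```

Real slicers work on arbitrary triangle meshes rather than analytic spheres, and also generate the toolpaths within each layer, but the principle – successive cross-sections built up into a solid – is the same.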

The applications are rather limitless. At the present time, 3D printing is used in jewelry, footwear, industrial design, architecture, engineering and construction (AEC), automotive, aerospace, the dental and medical industries, education, geographic information systems, civil engineering, and many other fields.

It's a hot, growing field – so much so that even Chris Anderson, former Editor in Chief of Wired magazine, abruptly left Wired to get involved in the 3-D industry, stating that "3D printing will be bigger than the web." http://www.zdnet.com/chris-anderson-why-i-left-wired-3d-printing-will-be-bigger-than-the-web-7000007535/

Maybe so, but I wouldn't quit the day job just yet.

It’s going to take some time before 3D settles into a marketplace scenario that is comfortable and accepted by all. Why, you ask?

Simple: because it scares too many people.

Right now, the idea that an average small business or individual can buy their own 3D printer and make specialty products is going to get some corporations pissed off – and whenever corporations get pissed off, they do their typical thing: they buy off (er, donate campaign funding to) some legislators and pass laws limiting the technology until such time as they feel they have a handle on the application.

Think about it: depending on how things would go, it wouldn’t be too much of a stretch for folks to get together, form a cooperative (as but one example) and make their own toys, rather than go out and buy – say, Legos – from Toys R Us.

It's not too far a stretch for folks to start doing their own communal / cooperative thing and save money. This is why 3D printing is going to be something bearing very close watching: it drills down directly into the heart of free market competition and strikes at a primary foundation of true capitalism by eliminating the middleman.

It’s a revolutionary idea: we’re talking about going beyond the workers controlling the machines of industry like the old socialist / communist ideal; rather, it will ultimately be about the consumers who will determine their own needs and demands – and make what they want when they want it, as opposed to going out and diving into debt buying it.

Imagine: instead of communal block parties where people come together and exchange Thanksgiving pies or Christmas cookies, they could come together, buy into a machine or a series of machines, and make their annual Christmas toy list, the materials / items they need for day-to-day living, or even items they can trade with others for things they couldn't easily make on their own.

And 3D printing has another potential plus: by printing only what one individual or group needs, it lessens the negative impact on the environment. You print only what you need, rather than creating thousands of widgets along with the manufacturing process and environmental waste associated with standard subtractive production.

To be certain, 3D printing is not free: time, cost and skills are required, but as mentioned, the notion of creating a series of cooperatives is not something totally out of line. But don’t hold your breath just yet: this Workers / Community visionary thing is a long way off from developing because it’s facing a tremendous challenge: the very Captains of Industry and Champions of the Free Market.

It never fails: we hear the usual yahoos who carry on about how it's all about free enterprise and open market capitalism – let the market decide how things work out, and screw regulatory oversight – but when somebody actually goes forth and takes them up on that very notion, these very same yahoos run to their congressman and buy them off – er, raise substantial campaign funding – to enact legislation to keep this very thing from happening.

And conflict has already arisen (http://animalnewyork.com/2012/3d-printed-guns/):

Through his organization Defense Distributed, Cody Wilson has been raising money to design the “Wikiweapon,” a gun that could be created completely from 3D-printed parts. 3D printers like the Reprap and Makerbot, which use computer-controlled nozzles to perfectly replicate digital models in materials like metal and plastic, have recently become more affordable and accessible to mainstream consumers at less than the cost of a new Apple computer. The machines make it possible for individuals to print out custom-designed objects without any of the mess, fuss, or regulation of a factory line — whether that’s a one-of-a-kind necklace or something deadlier.

That's right: the technology has already advanced to where one can literally make one's own guns at home – in fact, this technology is already being used by the major gun manufacturing giants, as the costs and returns on it are something they appreciate and recognize.

Hence the conundrum: do we now pass laws defining what one can or can’t print / create / make in the comfort of their own home?

I can hear the arguments already: "but it's just like making distilled alcoholic products at home; that's not permitted!" Granted, 'moonshining' is a dangerous business, if for the simple reason that most moonshiners wind up getting hurt or killed when their stills blow up. But that's a false argument. As any Scotch connoisseur (to take but one example) will tell you, making moonshine is not the same as making 15 year old Macallan: there's a reason it's called 15 year old Macallan – it's aged in specific oak barrels under specific conditions, with specific water sources, using specific recipes. It ain't the same, and therefore one is going to pay the premium for the 15 Year Macallan (assuming they're into that kind of thing).

Moonshine is not 15 year old Macallan: I’ll go to the store and get my Macallan, thank you very much.

And to argue that one needs a license to bear firearms is rather facetious, as gun regulations vary widely from state to state. In fact, there's an entire underground industry based upon purchasing weapons in one state and selling them in other, more regulated states (funny how there's really no major national license required for the purchase and regulation of firearms in this country,…).

But you do have the right to bear arms in this country – right?

So why not make our own guns?

It’s a frightening argument. And for the record I am a believer in the right to bear arms as frankly, I don’t entirely trust “my government”, whatever that means nowadays. But by the same token it is my honest and fervent belief that there are many people out there who should not be allowed to bear arms (much less be in possession of a valid driver’s license!) simply because they’re just too damn stupid or scary.

So now enter another point of consideration: rather than go out and buy that replacement part – say, a drain plug, a piece of plastic trim from your car, a button from your computer, etc. – you could now, with a little effort, come up with your own replacement part that'll be nearly as good – if not better – provided you or anyone you know understands and can use this kind of technology.

And don't think for a moment that the Captains of Industry are unaware of this. I'm certain some of them are (and if they're smart, they'd better be!) having late-night thoughts about this potential development.

Kiss your residuals goodbye,…

And now we have technology that’ll allow folks to go out and make their own automatic weapons?

Please pass me my antacid bottle; between the 15 year old Macallan scotch and the antacid, I’m going to need all the help I can get to help me sleep tonight,…

Privacy is Dead; Get Used To It

From the Overlooked News Department,…. At a recently held (November 9th) congressional hearing regarding privacy, nine (9) major data mining companies testified and gave a number of rather startling and revealing answers (as reported in http://www.propublica.org/article/yes-companies-are-harvesting-and-selling-your-social-media-profiles) – among them:

Their responses, released Thursday, show that some companies record — and then resell — your screen names, web site addresses, interests, hometown and professional history, and how many friends or followers you have.

* Some companies also collect and analyze information about users’ “tweets, posts, comments, likes, shares, and recommendations,” according to Epsilon, a consumer data company. 

* Acxiom, one of the nation’s largest consumer data companies, said in its letter to lawmakers that it collects information about which social media sites individual people use, and “whether they are a heavy or a light user.” The letter also says Acxiom tracks whether individuals “engage in social media activities such as signing onto fan pages or posting or viewing YouTube videos.”

* Epsilon, a consumer data company that works with catalog and retail companies, said that it may use information about social media users’ “names, ages, genders, hometown locations, languages, and a numbers of social connections (e.g., friends or followers).” It also works with information about “user interactions,” like what people tweet, post, share, recommend, or “like.”

* Data companies of course, do not stop with the information on Twitter, Facebook, and LinkedIn. Intelius, which offers everything from a reverse phone number look up to an employee screening service, said it also collects information from Blogspot, WordPress, MySpace, and YouTube. This information includes individual email addresses and screen names, web site addresses, interests, and professional history, Intelius said. It offers a “Social Network Search” on its website that allows you to enter someone’s name and see a record of social media URLs for that person.

In the words of Captain Renault (from the movie 'Casablanca'): "I'm shocked – shocked – to find that gambling is going on in here!"

Everyone knows this is taking place – and so what?

Actually, it is getting to be a rather big deal. One of the key factors that led to the re-election of President Obama (http://swampland.time.com/2012/11/07/inside-the-secret-world-of-quants-and-data-crunchers-who-helped-obama-win/print/) was the use of this very data, which led to extremely well-targeted lists and action items:

…campaign manager Jim Messina had promised a totally different, metric-driven kind of campaign in which politics was the goal but political instincts might not be the means. “We are going to measure every single thing in this campaign,” he said after taking the job. He hired an analytics department five times as large as that of the 2008 operation, with an official “chief scientist” for the Chicago headquarters named Rayid Ghani, who in a previous life crunched huge data sets to, among other things, maximize the efficiency of supermarket sales promotions.

None of this should come as any surprise. As far back as 1997, my colleagues and I had written a number of articles and reports on this trend. As noted in an article for ASIS (American Society for Information Science) presented during the 1997 Washington, DC conference (http://www.asis.org/Bulletin/Feb-97/lutz.html), I noted that:

We are witness to the demise of our notions of privacy; this trend is congruent with rapid technological development. Luddites could argue that as technology grows, privacy dissipates; thus, technology must be curbed (so the argument goes). The genie is, however, well out of the bottle. Modern conveniences and economic advantages far outweigh any notions of denying the benefits and comforts which we amply enjoy. … In the coming century, our identities will be how we appear on innumerable databases; our visage reflected in the hidden cameras and how we stand within society’s walls defined in the roll calls of databases. The time is right, therefore, to educate both the public and legislators about the relationship between ourselves and the tools which gather information about us and our fellows.

And this was back in 1997.

We’ve well surpassed the point of no return: as Lutz’s Law of Privacy states, there is an inverse relationship between privacy and convenience: the more of one, the less of the other. Add into the mix wireless / handheld communications devices and now, more than ever before, you are who and how you appear within the electronic realm. Arguably, you and how you appear electronically is more important than how you appear in person as job recruiters, credit agencies, services or strangers who wish to meet and greet you will judge you more by how you exist online than how you are in person.

Competition is everywhere, whether it be for those seeking elected office or businesses seeking an edge and an expanded customer base. Now, more than ever, how and as whom you present yourself matters. Increasingly, you will find others – employers, potential clients, contacts – deciding whether or not to work with you or hire you on the basis of what you post or whom you associate with – and as the evidence suggests, this goes well beyond the photos of that 'lost weekend' with your fraternity buddies posted on your Facebook page.

Wondering why you didn't get that job or obtain that contract? Think about it.

But before you let your paranoia get the best of you, just remember: it can work both ways. Given folks' increasing reliance on online services, who's to say you couldn't beef up your profile and gain the edge you need?

So the next time you consider LinkedIn, consider MyBrand as well, and revisit your Facebook page. Add more appropriate pictures and keep your personal commentary to more secure channels. Be careful whom you associate with and whom you link up to.

You never know who’s watching – or who might be interested in tapping you for opportunities you didn’t know existed…

The Business of Prediction

A king shall fall and be put to death by the English parliament shall be. Fire and plague comes to London in the year of 6 and 23. An emperor of France shall rise who will be born near Italy. His rule cost his empire dear – Pay-nay-loron his name shall be.

from the Quatrains of Nostradamus

Let’s face it: we want to know the future – and why not? Wouldn’t it be cool, and save us a whole lot of trouble, if we knew what tomorrow will bring? It’s remarkable to note that during times of great uncertainty and dislocation, we increasingly turn to prognosticators and seek out answers; this trend is evident in several recent events:

* Predicting the weather. Knowing when and where Hurricane Sandy was going to land did not stop things, but it made for a far more effective response and coordinated effort. Compare Sandy to Katrina and you can well appreciate how far we’ve come in terms of emergency management and practical planning.

* Election results. Nowhere is this more true than among the numerous pundits who sought out the future, turning to a variety of models, forecasts and other such approaches.

* Business / economic trends and developments. Increasingly, Wall Street awaits word from Washington – the Bureau of Labor Statistics and the Department of Labor – to learn of the latest trends and developments, seeking forecasts of employment, investment, trade, commodities and other market developments.

Now enter Big Data and AI.

As sites such as the fivethirtyeight blog demonstrate (our kudos to Nate Silver!), it’s no longer so much about what the pundits are saying: they’re only in it for the ratings, so naturally they’ll always have a slant (or, as my grandfather used to say, ‘beware a person who believes in their own bullshit‘). Cold, hard facts and level-headed statistics – like those utilized by fivethirtyeight – demonstrate how far we’ve advanced in terms of prediction in just the past ten years.
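The statistics-over-punditry approach that fivethirtyeight popularized can be illustrated with a toy poll aggregator: weight each poll by its sample size and discount it by age, so that larger, fresher polls count for more. This is only a sketch of the general idea, not Silver’s actual model; the polls, sample sizes and decay rate below are invented for illustration.

```python
# Toy poll aggregator: weight each poll by sample size and recency.
# All poll figures here are invented for illustration.

def aggregate_polls(polls, decay=0.9):
    """Weighted average of candidate support.

    polls: list of (days_old, sample_size, pct_support) tuples.
    decay: per-day weight multiplier; older polls count less.
    """
    weighted_sum = 0.0
    weight_total = 0.0
    for days_old, sample_size, pct in polls:
        weight = sample_size * (decay ** days_old)
        weighted_sum += weight * pct
        weight_total += weight
    return weighted_sum / weight_total

polls = [
    (0, 1200, 51.0),   # today, large sample: full weight
    (3, 800, 49.0),    # three days old: mildly discounted
    (10, 1500, 47.0),  # ten days old: heavily discounted
]

print(round(aggregate_polls(polls), 1))
```

The pundit-proofing comes from the arithmetic: no single outlier poll can dominate, because its influence is bounded by its weight relative to the whole field.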

I have to note my own role in the business of prediction. Some fifteen years ago, I developed a means of predicting when and where crime would likely occur, offering a tool for local police to utilize (this was known as – surprise! – Predictive Crime Analysis). This was achieved through data analysis coupled with GIS (Geographic Information Systems) and did not require a large computer; rather, it required dedicated staff submitting accurate information, conducting close data review, applying the proper statistical tests of relevance to any given data set and then mapping the results. The result? In one locale, we were able to reduce crime by over 40% in the first year.
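The core of a grid-based approach like this can be sketched in a few lines: bin historical incident locations into grid cells and rank the cells by frequency to find the hotspots worth patrolling. This is only a toy illustration of the general technique, not the original system; the coordinates below are invented.

```python
# Toy hotspot analysis: bin incident coordinates into a grid and rank
# cells by frequency -- the core idea behind grid-based predictive
# crime mapping. All incident locations here are invented.
import math
from collections import Counter

def hotspots(incidents, cell_size=0.01, top_n=3):
    """Return the top_n grid cells with the most incidents.

    incidents: list of (lat, lon) pairs in decimal degrees.
    cell_size: grid resolution in degrees.
    """
    counts = Counter(
        (math.floor(lat / cell_size), math.floor(lon / cell_size))
        for lat, lon in incidents
    )
    return counts.most_common(top_n)

incidents = [
    (40.731, -74.062), (40.732, -74.061), (40.731, -74.063),  # cluster A
    (40.719, -74.043), (40.718, -74.044),                     # cluster B
    (40.701, -74.011),                                        # isolated
]
for cell, count in hotspots(incidents):
    print(cell, count)
```

The hard part in practice was never the computation – it was, as noted above, the dedicated staff feeding in accurate, reviewed data day after day.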

Fantastic, right?

Wrong.

After a while, nobody wanted it: it cut back on police overtime and, in some instances, forced the criminals to cross over into neighboring towns to conduct their activities – whereupon those neighboring towns grouped together and applied political pressure to stop the effort. In the end, it was removed, retired and is now forgotten (although I do still have various articles and papers discussing it; please feel free to contact me if you want to learn more).

Author’s Note: Fast forward fifteen years, and the irony is that those very same towns are being asked to “share” their police personnel to help deter the rising crime wave in the neighboring town where crime was once down 40%…! Arguably, despite their best efforts to deter the future, the future came forth and changed them…!

The point is that knowing the future is not always a good thing, for by knowing the future, we (sometimes) change the future (like that famous Twilight Zone episode, where the individual seeking to learn his future finds out that he will die in twenty-four hours; it is suggested that by learning his fate, he only increased the chances of making it happen). It is a conundrum quantum physics captures in the observer effect, popularly illustrated by the (famous) thought experiment known as Schroedinger’s Cat: the very act of looking into the box changes the outcome of what it is you’re seeking to understand.

Now, this is not to suggest that by predicting the future direction of a hurricane (or other large-scale natural events) we can change things, but it is not too much of a stretch to suggest that by knowing the trends of business, commodities, trade, employment, voter perceptions, etc., we can also change the nature of what it is we seek to understand – or control.

It is an axiom that in conducting any type of precognition, you need to set aside your beliefs – both conscious and subconscious – if you’re going to do a good job. This is not easy, for sometimes we just don’t like what the future is telling us.

But the opportunity! We are in an age of Big Data and information review unlike anything seen before in the history of mankind. We now have the tools and processing power. We can download and obtain data on a multitude of subjects and developments, convert it and feed it into systems that we can readily program and design for any variety of applications. Now, more than ever before, we can predict our futures in ways never before realized. The trick is doing it right – and accurately.

Now then, that being said – allow me to enter a prediction of my own.

We shall soon see an AI arising from the bulk of Big Data – sooner than we realize. And quite possibly, it may even already be operating amongst us (as noted in my prior posts),…

Perhaps Nostradamus wouldn’t be such a bad name for such a computer.

Hurricane Sandy Aftermath: The Overlooked Revolution

“There is no security; there is only opportunity.” – General Douglas MacArthur

In the aftermath of Hurricane Sandy, fond memories are being resurrected; one of them: waiting in long lines for gasoline. I recall similar times when I was much younger – riding in my grandfather’s car while we waited in line at the gas station during the 1970s fuel shortages. On such days, he would buy a stack of newspapers and catch up on his reading, turning what would be a nerve-wracking time for most folks into something he actually looked forward to.

But it’s more than just long gas lines we’re talking about: the (temporary) collapse of public transport networks is a development that will take months to correct fully. Understand: it’s not just that the railroad ties and tracks were washed away; entire foundations and track beds were removed, as if the railroads never existed! Add into this delicious mix the difficulties of living without electricity and the damage inflicted upon food distribution networks, and you’ve got life (as we know it) at present in portions of the United States Northeast Corridor.

No, it’s not Road Warrior / Book of Eli time, but it certainly is not all that peachy keen either. At least we haven’t resorted to cannibalism (nothing beyond the usually accepted daily limits; after all, only the strong survive in New Jersey).

Given that over 400,000 people commute to and from the New York metropolitan center on a daily (work) basis, the storm is forcing new notions of work and management. Humans are, if anything (at least the more successful ones), adaptable (as my grandfather, a retired Marine, would tell me, ‘adapt and overcome; never let it get to you’). And there are few things as ‘encouraging’ as a natural disaster in forcing us to deal with matters. So here are some rather notable developments which are going to become more and more mainstream:

1) Who needs an office? With telecommuting – Skype, Join.me, or other web services – there’s really no need to have conference rooms or regular face-to-face meetings, save for the rare office get-togethers or perhaps to entertain special guests. Fact is, we do far more on the jump than we ever have before and frankly, this is the future: entities save money by having smaller office space (cutting back on utility costs, rent, etc.) while encouraging greater productivity through telecommuting. With the advent of ubiquitous laptops (remember how prohibitively expensive they were? Now they’re the primary computers of consumer choice), smartphones and the network infrastructure to support it all – telecommuting is the shizzle:

As reported in Live Science:

For one thing, hardware has changed. “Five years ago, we all had desktop computers. Now we all have laptops.” That means an employee has easy access to files and can easily move from office to home to coffee shop with minimal interruption. With Wi-Fi now available nearly everywhere, an employee can theoretically work as well at the office as at a Dunkin Donuts or a neighbor’s house. (Comcast, for example, has offered up its normally password-protected network of hotspots for everyone in Sandy’s path.) (http://www.livescience.com/24512-telecommuting-to-work-post-sandy.html?cid=dlvr.it).

Some would point out (and rightfully so) that networks fail, but you’d be surprised at the relative ease with which emergency cell towers can be established and activated, in combination with Wi-Fi stations. Network failure may well be inevitable, but unlike landline telephones, wireless networks are now far easier to reactivate (just so long as you’re not a certain major telecommunications giant in midtown Manhattan that had the brilliant idea of storing its major truck / emergency cell network equipment in a sub-basement five levels below street level during a major hurricane, all to save money, since it costs more to store trucks on higher garage levels…!).

2) Online GIS mapping services. Although this is nothing new (MapInfo’s Discovery application has been around since the late 1990s), the explosion in open source GIS solutions (who needs to invest a fat wad in ESRI products when all you need is a basic and effective solution?) combined with online distribution applications creates tremendous potential for both private and general public access. Let’s face it: right about now, wouldn’t it be cool to have an app that links to an online GIS solution telling you which gas stations have gas, or where there’s a functioning Wi-Fi service, or perhaps an app that can warn you about where the latest zombie outbreak is taking place and offer suggested byways to avoid it? (If anyone is interested, contact me, as I have a whole bunch of other ideas and professional contacts ready to make things happen…!)

And online GIS goes beyond just offering temporary solutions: the marketing potential alone opens up new arenas and services. As Edward Tufte so aptly pointed out in his groundbreaking works (http://www.amazon.com/s/ref=nb_sb_noss_1?url=search-alias%3Daps&field-keywords=edward+tufte), it’s not just about getting the information, but distributing it – and making what you have stand out from the rest.

There are few tools like online GIS which can do exactly that. To learn more, check out this online GIS application and you’ll see what we mean: http://www.mangomaps.com
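At its simplest, the gas-station app imagined above is just an attribute filter over point data published in a web-friendly format like GeoJSON: keep the stations reporting fuel, drop the rest, and hand the result to any online map. Here’s a minimal sketch; the station data and the `has_gas` property are hypothetical.

```python
# Toy version of a "which gas stations have fuel?" layer: filter a
# GeoJSON-style FeatureCollection by attribute. The station data and
# the "has_gas" property are hypothetical.
import json

stations = {
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature",
         "geometry": {"type": "Point", "coordinates": [-74.06, 40.73]},
         "properties": {"name": "Station A", "has_gas": True}},
        {"type": "Feature",
         "geometry": {"type": "Point", "coordinates": [-74.04, 40.72]},
         "properties": {"name": "Station B", "has_gas": False}},
    ],
}

def stations_with_gas(collection):
    """Return a FeatureCollection containing only stations with fuel."""
    return {
        "type": "FeatureCollection",
        "features": [f for f in collection["features"]
                     if f["properties"].get("has_gas")],
    }

# The filtered layer is itself valid GeoJSON, ready to serve to a web map.
print(json.dumps(stations_with_gas(stations), indent=2))
```

Because GeoJSON is plain text, the same filtered layer could be dropped into most online mapping services without further conversion.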

3) The Cloud is everywhere. More and more entities (and we’re not just talking about private businesses, but also non-profits and government) are moving their services to cloud-based systems to better insure against losses (side note: insurance companies are now giving greater premium reductions to businesses that institute a qualified and comprehensive records management plan), while enabling greater employee data access and collaboration – not to mention allowing employees to work from a variety of locales: wherever you are, so too is your office.

In some ways, it can be disturbing: work and home life are now, more than ever before, merged closer together. But perhaps it’s time for folks to come to the realization that the benefits outweigh any fears or concerns. Fact is, mothers can more easily stay at home; entrepreneurs can dramatically lower their costs and (potentially) employ more workers (although it’s important to note that the traditional notion of an employee is changing; more on this in a future post…), while continuity is better ensured – a vital point for any entity, regardless of whether it is for-profit, non-profit or government.

Hurricane Sandy’s impact is going to be around for a long time, not the least of which is a growing evolution in how we do work.

Improvise, adapt and overcome: use the tools presented before you.

Oo-Ra!