VIDEO: The Future of Work from the 4th Industrial Revolution Series

April 7, 2019 - 11:06 pm

The future of work from Foundations of The Fourth Industrial Revolution (Industry 4.0) by Jonathan Reichental


VIDEO: Government Services in the Fourth Industrial Revolution

March 28, 2019 - 9:27 pm

TRAILER: Foundations of the Fourth Industrial Revolution

February 8, 2019 - 1:52 am

You can watch the full video series here.


PODCAST: Privacy and the Fourth Industrial Revolution

- 1:49 am

Listen to the podcast here.


New Course Available: Securing Cryptocurrencies

January 23, 2019 - 6:57 pm


Course: Foundations of the Fourth Industrial Revolution

January 3, 2019 - 5:29 pm

Upcoming changes with the fourth industrial revolution from Foundations of The Fourth Industrial Revolution (Industry 4.0) by Jonathan Reichental


S3 Ep3: Indexing the Brain

December 26, 2018 - 3:52 pm

Jonathan and Tom discuss a variety of topics related to the ethical aspects of emerging technologies. It is a rambling discussion recorded live at the CTO Summit in San Francisco in November 2018. Music credit: Kevin MacLeod.


VIDEO: My TED-Style Talk at Flux About Cities and Blockchain

October 30, 2018 - 10:57 am


PODCAST: My Interview on the IT Visionaries Show

October 26, 2018 - 6:23 pm

Click here to enjoy.


FREE REPORT: The Future Belongs to Cities by Dr. Jonathan Reichental

September 19, 2018 - 4:04 pm

Click Here to Download The Future Belongs to Cities Report


VIDEO: What Problems Might Blockchain Solve for Government?

September 9, 2018 - 11:02 pm


S3 Ep2: The Improv of Life

September 7, 2018 - 3:08 pm

Jonathan and Tom welcome their guest, Dr. Tia Kansara. It’s a wide ranging discussion that includes thoughts on self-driving cars, blockchain, and sustainability. Recorded at a restaurant in the Ferry Building in San Francisco. Plenty of laughs and food for thought. Music credit: Kevin MacLeod.


VIDEO: Interview on Future of Cities with Silicon Valley Innovation Center

August 30, 2018 - 6:43 pm


What an Honor! Addressing the United Nations in NYC on July 9, 2018

August 10, 2018 - 5:14 pm


Every disruption demands dialogue

August 4, 2018 - 4:26 pm


A Day Without Teaching or Learning is Like a Day Without Water

July 23, 2018 - 12:48 pm


S3 Ep1: Truth Decay – The Live Show

July 20, 2018 - 7:59 pm

Special live audience edition of Drinking Wine Talking Tech recorded in Sacramento, California on May 17, 2018. It’s raw and unedited. While the recording begins somewhat distorted, it improves later. Jonathan and Tom discuss a wide range of tech issues.


PODCAST: Digital Transformation in the Workplace

July 9, 2018 - 9:59 am


My Blockchain and Crypto Online Courses Rated Best in 2018

- 9:50 am

More info here.


Palo Alto’s CTO: The Smart Future of Cities and Society

March 31, 2018 - 11:11 am


Drinking Wine Talking Tech Season 3 – Now In Pre-Production

March 12, 2018 - 9:59 am


VIDEO: Interview on CNET

- 9:56 am


I’ve Joined the Faculty at the Ignite Institute, California

January 14, 2018 - 2:55 pm


S2 Ep7: [Bonus Show] Is Any of This Real?

December 23, 2017 - 1:29 pm

Jonathan and Tom discuss a wide variety of topics with the leading tech podcaster in New Zealand, Paul Spain. As usual we ask the tough questions and get very few answers. This time we even ask when we will get to experience the matrix. There are a lot of bad jokes.


S2 Ep6: The World of Smart Things

December 19, 2017 - 10:27 pm

Jonathan and Tom welcome guest, Alex Hawkinson, CEO of SmartThings. They talk about the future of the connected home and what it is like to work for Samsung. It is a fascinating journey into the mind of somebody who believes everything that can be connected will be connected. There is lots of wine too.


S2 Ep5: Everything Will Be Connected

December 16, 2017 - 1:30 am

Jonathan and Tom discuss the state of device connectivity with expert, Bill Pugh. While nobody is following the script, the conversation covers some fascinating insights on where we have been and where we are headed with connecting everything. Extensive laughter ensues.


S2 Ep4: How Do You Define Innovation?

December 9, 2017 - 4:55 pm

Jonathan and Tom discuss the complex and poorly understood word innovation with guest Saker Ghani. Saker is a global innovation leader who has worked on the iPod, iTunes, and the Yahoo home page. Currently with PwC, he lives and breathes the topic of innovation every day.


S2 Ep2: Can We Save the Planet?

November 23, 2017 - 2:07 am

Jonathan and Tom finally welcome their first guest. They chat with one of the founders of the sustainability movement, Gil Friend. They get real answers to our most serious human challenge. And yes, there is wine and a few jokes.


S2 Ep1: Preview of Season 2 – Guests and Drones

November 16, 2017 - 3:00 am

Jonathan and Tom are back for Season 2. Surprised by thousands of listeners to season one, this time they are adding amazing guests and tackling tough technology topics. Oh, and there is some wine too. In this episode they preview the topics and guests. They also chat briefly about a future of drones.


Begun the Drone War Has

November 5, 2017 - 7:01 pm

15-year-old Luke Bannister had a proud and exuberant smile on his face. He had just flown the flight of his life, hurtling at stunning speeds, taking steep corners, avoiding obstacles, and out-thinking all his opponents. In doing so, he walked away with the $250,000 first prize at the inaugural World Drone Prix in Dubai. His success was witnessed by 2,000 spectators and envied by his 250 competitors.

Luke and his teammates are part of a global movement of drone racers, not quite mainstream yet, but quickly emerging as an exciting new sport. All around the world, competitions are being held with competitors of all ages and with drones of all types, big and small. Drone racing requires exceptional skills including strategy, reflexes, and nerves of steel. Leagues are sponsored by brands such as AIG, DHL, and Mountain Dew. The Drone Racing League has a TV rights deal with ESPN. It’s starting to be big business.

A couple of years ago, I attended a smart city event in Yinchuan, China. One evening at an outdoor dinner, the post-meal entertainment included what initially appeared to be a fireworks display. Instead, it was an illusion. In place of explosions, a fantastic, highly choreographed aerial dance was performed by what appeared to be tens, if not hundreds, of small synchronized drones. In addition to colorful sequences set to music, the drones spelled words and completed gravity-defying stunts. It was spectacular.

[Watch Intel’s 500 Light Drone Show Here]

U.S. Air Force 1st Lt. James Klein arrives for his piloting shift in Las Vegas. First comes a briefing; then he takes his position inside a featureless building for his 10-hour workday. He’ll spend the day flying a Predator XP drone somewhere over the Middle East looking for persons of interest. He describes his job as 99% boredom and 1% adrenaline rush. Today he’s piloting alone, but he knows that in a few months he’ll be joined by a co-pilot. Not a human, but artificial intelligence (AI). The military is betting that a human and an AI together can be more effective than either alone.

On this day, if James does launch a missile at a target he’ll have to head to his suburban home that evening and keep it to himself as he shares dinner with his family. Such is the nature of drone warfare in the 21st century.

Hurricane Matthew made U.S. landfall on Oct. 8, 2016. While only a category 1, the storm caused significant damage, and the worst came days later in the form of flooding. Paramedic Andrew Miller of Horry County Fire Rescue was sent to help evacuate stranded residents. Throughout the operation he used a drone to help with operations. The drone provided a 360-degree, real-time overview of the flooded area. He was able to gain a more complete understanding of the incident, seeing where the floods were, determining the best way to deliver services to those in need, and providing critical information to rescue crews: not just verbal information but visual information about the situation they were being deployed to. For emergency workers of all types, drones are transforming emergency response. In industrial and many other organizational contexts, drones have the potential to do the same.

In early 2017, ground drones, otherwise known as unmanned ground vehicles (UGVs), started to appear on the streets of Redwood City, California. These drones were delivering take-out to residents. According to their maker, Starship Technologies, the ground drones are about 15 inches high, can carry three bags of groceries, and weigh about 50 pounds when full. Their maximum speed is 4 mph, and they have nine cameras and proprietary mapping software that’s accurate down to an inch. Among the anticipated advantages, the experimental deployment is expected to reduce traffic and delivery costs.

It’s All Fun and Games Until…

Unbeknownst to hundreds of workers busily making their way to work on a typical morning, a small drone, its noise drowned out by the hustle and bustle, was flying overhead and collecting private information from the smartphones in people’s pockets. It was doing this using software called Snoopy, exploiting any phone with WiFi turned on. The drone captured an abundance of content, including the websites people visited, credit card information entered or saved on different sites, location data, usernames, and passwords.

In the years ahead we will deploy billions of devices that will connect to the Internet, mostly wirelessly. In our increasingly connected smart homes we already do it with items such as our thermostats, entertainment systems, and home security systems. Our cities are becoming smarter with the deployment of millions (and soon billions) of sensors for everything from traffic management to air quality measurement. This Internet of Things (IoT) has remarkable advantages and will transform our homes, organizations, cities, and even ourselves. But these devices will be ripe for exploitation, potentially by drones. Research teams from Israel and Canada demonstrated how a drone could exploit a software vulnerability in wirelessly connected light bulbs and turn them on and off at a distance. Imagine for a moment a drone attack that turns off city lights in a neighborhood or citywide, or worse, flickers them to elicit mass neurological responses.

For every advantage that a new innovation has, the bad guys will exploit it in their favor too. That’s life.

The U.S. Air Force, along with its counterparts in many other countries, is working on using drones in swarms. Rather than a handful of drones attacking a target, hundreds, perhaps thousands, can be used to overwhelm air defenses or to crash into targets. Technology now enables one pilot to commandeer a vast swarm of drones, which completely changes air capabilities. But what happens when the bad guys do this too? Not limited to the battlefield, hackers could take over consumer and commercial drones in vast numbers and use them to disrupt public events with tragic consequences. They don’t even need to weaponize them; the drone is the weapon.

And Here We Are

Almost unknown in the consumer space until the end of the 1990s, drones have emerged as a serious industry in the first two decades of the new century. Fortune magazine estimates that from 2015 to 2025, the drone industry will have an economic impact of more than $82 billion and will create over 100,000 high-paying jobs. What’s responsible for the explosion in drone purchasing and use? For one, the technology has dropped in cost and comes in many shapes and sizes; options range from just a few dollars to thousands, depending on use and interest. Finally, drones have an incredible diversity of uses, including those I’ve explored in this article and many more: capturing live events, delivering goods, hobby flying, making personal videos, and professional movie-making and journalism. Soon, drones will carry people.

Regulation is finally catching up and in many countries drone rules are on the books. Some communities are fighting against drones as a noise nuisance, as a danger to humans, and as an expansion of privacy intrusion. Sure, there’s plenty of contention between enthusiasts and legislators and I’d bet that will continue for a long time. Make sure you know the rules and if drones interest you, consider becoming engaged in the debate.

Of course, as you’d expect, in response to the threats posed by drones used for nefarious purposes, a whole new defense industry is emerging that includes hardware and software solutions as well as risk management services. Dedrone is a good example of these emergent enterprises. New innovation will be required as the threats continue to speed ahead of the solutions.

For many of us, drones have been a sideshow: something interesting but not necessarily high on our radar. Those days are over. Drones must be considered in your organizational strategy, whether as a tool to optimize or improve a function of your business, or as a risk to understand and mitigate in the years ahead.

I used to look at drones from a distance. Not anymore.


VIDEO: Smart City Interview on Belgium’s “Steven Talks” Series

October 26, 2017 - 9:15 pm


VIDEO: Keynote at Cybersecurity Symposium, San Jose, CA

September 29, 2017 - 12:59 pm


45-Min Video Interview on CXOTalk

July 22, 2017 - 9:27 pm


My 3-minute Pitch on the Smart City Imperative

July 12, 2017 - 2:05 pm


My Smart Cities Short Documentary

July 7, 2017 - 4:59 pm

Smart Cities: Solving Urban Problems Using Technology by Jonathan Reichental


Trailer for my New Short Documentary on Smart Cities

July 2, 2017 - 11:28 am


S1 Ep6: Season Finale – The Data Show

June 17, 2017 - 6:35 pm

Jonathan and Tom leave the best to last. This is the show you have been waiting for. It is the data show. But they take a different angle on this popular topic. Per usual you will learn nothing but you might smile. Music: Kevin MacLeod. Give us feedback via Twitter: @Reichental or on our official Facebook page:
Season 1 Episode 6 of 6.


S1 Ep5: The End of Ownership

May 13, 2017 - 3:29 pm

Jonathan and Tom take on the complex and speculative topic of the end of ownership. Why own anything if you can rent what you need when you need it, always have the latest, and not pay for something when not in use? You may not get the answer in this podcast but you will likely have a little laugh. Music: Kevin MacLeod. Give us feedback via Twitter: @Reichental or on our official Facebook page:
Season 1 Episode 5 of 6.


S1 Ep4: What Makes Silicon Valley Tick?

May 7, 2017 - 10:50 pm

Jonathan and Tom do it again. A meandering dialogue of nonsense with occasional moments of brilliance. Today they talk about innovation in Silicon Valley. If you are desperate for something to listen to on the treadmill, you might try this. Music: Kevin MacLeod. Give us feedback via Twitter: @Reichental or on our official Facebook page:
Season 1 Episode 4 of 6.


S1 Ep3: How Is Your Cyber Hygiene?

May 6, 2017 - 7:19 pm

Jonathan and Tom get together for another chat. Today it is about cyber hygiene or put another way, tips to avoid being hacked. It is another pointless 25-minute conversation. But you might learn something. Who knows? Music: Kevin MacLeod. Give us feedback via Twitter: @Reichental or on our official Facebook page:
Season 1 Episode 3 of 6.


S1 Ep1: Are Self-Driving Cars for Real? [Pilot Episode]

March 11, 2017 - 12:53 am

Jonathan and Tom have a little fun discussing self-driving cars. They share personal experiences and speculate where things are headed. Music: Kevin MacLeod. Connect with us on Twitter: @Reichental or on our official Facebook page:
Season 1 Episode 1 of 6.


My Chapter on Smart Cities Now Available in New Textbook

January 1, 2017 - 1:00 pm

Click Here to Preview Contents


PODCAST: My Interview on Cities as the Digital Hubs of the Future

December 24, 2016 - 3:15 pm


Still Struggling to Understand the Blockchain? Start Here.

November 27, 2016 - 1:17 pm

I’ve been monitoring, evaluating, and deploying new technologies for over a quarter of a century. It’s something I love to do. Monitoring emerging technologies gives us an opportunity to imagine the possibilities of the future. Even better, more often than not, deploying emerging technologies gives us the opportunity to positively change millions of lives. As processing power has skyrocketed, technology costs have dropped, and half the world has connected to the Internet (yup – still 50% of the world’s population has no access to the Internet), the rate of new technologies introduced has accelerated. We quickly forget that Facebook didn’t exist 13 years ago or that the iPhone will only be 10 in 2017. It’s worth noting however, that for all the new ideas that are introduced each year, most don’t succeed. That’s sobering if you’re an entrepreneur and a challenge if some of your success depends on trying to figure out and invest in what’s “next.”

I’ve seen a lot of new ideas. I mean a lot. I’ve been right a few times about what would succeed and been way off quite often. Sometimes my predictions were simply too early. Those who know me may recall that back in the mid-2000s I was bullish on virtual reality (VR). I spoke and wrote about it widely. Turns out I was at least 10 years too soon with my prediction. Same with voice recognition. I did predict the emergence of social media (then called social computing), but didn’t anticipate fake news and all the other ugliness. I was an early user of Twitter and remain a fan, but it pains me to watch it struggle.

Now as I continue to monitor and evaluate a whole swath of emerging technologies I am particularly struck by this thing called the blockchain. This is a technology with a funny name but with the possibility of significant consequence. I’ve decided to write about the blockchain, something I’ve wanted to do for a while, because so few people know what it is, and what has been written about it is remarkably difficult to understand. As an educator, I want to make difficult concepts accessible to as many people as possible.

Selfishly, explaining things makes me understand them better too.

I don’t intend to get deep into blockchain here. I’ll simply discuss its basic concept and provide some examples of how it might be applied. If you’re not a blockchain beginner, you can probably stop right here.

What problem does the blockchain try to solve?

Let’s begin answering this question by looking at an example. We’ll use an online directory. This directory is simply a listing of people’s names, phone numbers, and email addresses. It is provided by a commercial company. The data in the directory is in a database and it lives on a physical server somewhere in the United States. When a person needs to access this directory and find some personal details, they use a web browser over the Internet. Simple enough.

This basic design has generally worked well for a few decades. However, those of us with experience quickly admit to its significant limitations. For starters, if data is changed in the database, how do we know the change is correct? What happens if that single server is successfully accessed by a nefarious individual or organization? Over time we’ve addressed these issues as best we can. For example, validation of identity can be achieved by producing a social security number. But authorities can create bottlenecks, they can be costly, and they can even be biased. Server security is implemented by any number of innovative commercial solutions, yet every day we hear about major breaches that result in credit card and identity theft. These problems and solutions all exist because we’ve embraced a database design that is inherently limited. Is there a better design, one that not only eliminates these limitations but also vastly improves how systems interact and how data is managed?

Rethinking the centralized database

Now let’s have a little fun. Instead of the database being on one computer, let’s place it on lots of computers: possibly thousands, and eventually even millions. If a change needs to happen in the database, all of the versions of the database on every one of those computers need to change, and here’s the secret sauce: all those computers have to agree to the change!

Instead of a single database on a single computer, let’s distribute the database onto thousands, even millions of computers.

Let’s explore this a little further. In this new design, there is no central computer or single master version of the database. By definition, we are using a distributed database. A transaction, such as new data or a change to existing data entered in the copy of the database on my computer, will, if accepted by all the other computers, be made in their databases too, so that the distributed database is identical in all instances. Making updates, which are encrypted for additional security, across this network of computers is analogous to recording transactions in an accounting ledger, so we call it a distributed ledger. This distributed ledger adds a new immutable block (one that can’t be changed or deleted) of data every time there is a change. Imagining a long chain of blocks gives rise to the notion of a blockchain!
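The hash-linking idea behind those immutable blocks can be sketched in a few lines of Python. This is a toy illustration only (the transaction data and names are invented for the example), not the implementation of any real blockchain:

```python
import hashlib
import json

def block_hash(data, prev_hash):
    """The hash covers the block's data AND the previous block's hash, linking the chain."""
    payload = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def make_block(data, prev_hash):
    return {"data": data, "prev_hash": prev_hash, "hash": block_hash(data, prev_hash)}

def chain_is_valid(chain):
    """Recompute every hash; editing any earlier block breaks all the links after it."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block["data"], block["prev_hash"]):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Build a tiny ledger, then try to rewrite history.
chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("Alice pays Bob 5", chain[-1]["hash"]))
chain.append(make_block("Bob pays Carol 2", chain[-1]["hash"]))

print(chain_is_valid(chain))             # True
chain[1]["data"] = "Alice pays Bob 500"  # tamper with an old block
print(chain_is_valid(chain))             # False: the tampering is detectable
```

Because each block’s hash is folded into the next block, changing any historical entry invalidates every block after it, which is exactly what makes the ledger effectively immutable.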

This new design is in equal measure beautiful, simple, and powerful. That’s much of why it is so compelling.

So how does it solve the issues of the central database we discussed earlier? First, the distributed ledger requires that all the participant computers agree to a change. This consensus mechanism almost eliminates the possibility of security issues, because an attacker would need to breach a large share of the participating computers rather than a single server. Additionally, since all the computers must agree or disagree with a change, the network becomes the authority. Effectively, this new design doesn’t require a central authority. Since the integrity of the distributed ledger requires broad participation, risk is radically reduced and trust is dramatically increased. As Don Tapscott says in his book Blockchain Revolution, we don’t need to trust each other in the traditional sense, because trust is built into the system itself. This is why the blockchain is sometimes called the trust protocol.
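The consensus idea can be sketched as a toy majority vote, where each node checks a proposed update against its own copy of the ledger. Everything here is invented for illustration (real networks use far more sophisticated protocols, such as proof of work), but it shows why a lone forged update gets rejected:

```python
from collections import Counter

class Node:
    """One participant holding its own copy of the ledger (a list of blocks)."""
    def __init__(self, ledger):
        self.ledger = list(ledger)

    def validate(self, update):
        # A node accepts only updates that extend its current chain tip.
        return update["prev_hash"] == self.ledger[-1]["hash"]

def network_accepts(nodes, update):
    """The update goes through only if a simple majority of nodes agree."""
    votes = Counter(node.validate(update) for node in nodes)
    return votes[True] > len(nodes) / 2

# Five nodes, all starting from the same shared ledger.
shared = [{"data": "genesis", "hash": "abc", "prev_hash": None}]
nodes = [Node(shared) for _ in range(5)]

honest = {"data": "Alice pays Bob 5", "prev_hash": "abc"}
forged = {"data": "Alice pays Bob 500", "prev_hash": "xyz"}  # doesn't match anyone's chain

print(network_accepts(nodes, honest))  # True
print(network_accepts(nodes, forged))  # False
```

An attacker would have to control most of the nodes to push the forged update through, which is why distributing the ledger makes the network itself the authority.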

The Blockchain eliminates the middleman

Ok, so that’s the basics. We see how the distributed ledger helps with integrity and security, but what does it enable that wasn’t easily possible before?

Suddenly a completely new set of possibilities emerges. Let’s take something as important as online identity. While some creative solutions to this problem exist, blockchain technologies may be the ideal answer. Since the distributed ledger is the authority, it becomes near impossible to duplicate or impersonate an online identity, since this authority ensures a single identity for you. It will reject attempts to foil the system. Done right, we could see, for example, online voting on the horizon in the United States. Online identity would also enable better protection from copyright infringement. For example, if you write a song and use the blockchain to record your ownership of that song, we effectively eliminate any question of originality in the future.

With robust authentication and validation in all manner of transactions, the blockchain can be an authority over an enormous volume of activities that today require expensive third-party participants. Given the origin of the blockchain in an online currency called Bitcoin, the distributed ledger can eliminate many of the brokers required to move money from one account to another. Imagine sending money instantaneously to a friend in another country with limited bank engagement and cost (and possibly none at all!).

Helping devices negotiate with each other

If the blockchain takes off, the future bodes ill for intermediaries like brokers, notaries, lawyers, and anyone who makes a commission from transactions.

Now let’s kick it up a notch. Rather than simple transactions happening over this network, let’s imagine entire, complex contracts between two entities being recorded and validated by the network. These smart contracts could have triggers that the network would honor, eliminating much of today’s human interaction. All manner of intermediaries could be eliminated. Watch out, notaries and lawyers! And welcome, blockchain-driven machine actions. What might that look like? Imagine devices on the Internet (the Internet of Things) that need to negotiate with each other, say, facial recognition that results in the opening of a door. A smart contract would exist on the blockchain to manage the interaction without the need for a central database or authority. It’s not out of the realm of possibility that future devices using artificial intelligence could create their own smart contracts without any involvement from humans. Jeepers.
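The door-opening scenario could be sketched as a simple trigger rule. Everything here is hypothetical (the event shape, the face identifiers, the rule itself); a real smart contract would live on the chain and be executed by the network, but the trigger logic is the same idea:

```python
# Illustrative identifiers only; in practice these would be cryptographic credentials.
AUTHORIZED_FACES = {"alice-face-id", "bob-face-id"}

def door_contract(event):
    """A toy smart contract: fires automatically when a matching event is recorded."""
    if event.get("type") == "face_recognized" and event.get("face_id") in AUTHORIZED_FACES:
        return {"action": "open_door"}
    return {"action": "deny"}

print(door_contract({"type": "face_recognized", "face_id": "alice-face-id"}))
# {'action': 'open_door'}
print(door_contract({"type": "face_recognized", "face_id": "mallory-face-id"}))
# {'action': 'deny'}
```

The point is that the rule, once recorded, executes without any human or central server in the loop: the network enforces the contract.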

It’s quickly possible to see that the blockchain has considerable consequences.

So what might hold the blockchain back?

As with most emerging technologies, a lot of things have to happen to enable success. We’ll need to see the emergence and adoption of standards. We’ll need to ensure integrity and performance while scaling to billions of transactions (maybe more) every second at a global level. There are energy and legacy-technology challenges to contend with there. There will be resistance from impacted industries such as banking and finance. The blockchain’s core quality, the absence of a central authority, will be both challenged and challenging in a world designed around the premise of regulation and control.

This all being said, momentum is on the side of blockchain. Radically redesigning technology and the way we think about many of the fundamental processes in our society requires serious understanding and possible plans for action.

Getting a grasp of the blockchain is a first step. But we’ll need to study it closely and try to anticipate implications that may not be obvious right now. Will we get ahead of its fake news flaw?

If I’ve helped you understand the basics of blockchain I recommend getting deeper into the topic. Fortunately, there is a lot of industry-specific content emerging and that may be a great place to go next.

I don’t know if the blockchain is the next big thing, but I’m not taking any chances. You?


My Green Screen Career Started This Week

July 26, 2016 - 11:32 pm



PODCAST: What Will The City Of The Future Look Like?

May 17, 2016 - 2:09 am



The Smart City Opportunity

April 6, 2016 - 11:40 pm

Assuming that current trends continue, our future belongs to cities. Already half of all humans on the planet live in cities, and by 2050 a full 70 percent of civilization will live, work, and play in an urban environment.

To get a sense of the scale and speed of this global transformation, the United Nations says that 3 million more people move into cities per week. It’s creating staggering challenges. Cities already burdened with intractable problems experience a constant stream of new arrivals all requiring housing, jobs, energy, transportation, and so much more. Urbanization has enabled so many of us to rise to a higher quality of life, but it has also created unprecedented problems that are quickly undermining the benefits it once created.

Anyone who lives in a city sees and experiences these issues first-hand. This isn’t an abstract topic. Together we must commit to bold new solutions. To create a future urban environment that is healthy and sustainable will require a new operating system for our cities.

What do I mean by a city operating system?

Every day, in thousands of cities around the world, a cycle repeats itself. For billions there is a march to work, school and other activities. Machines spin up. Transportation systems engage. Energy is produced and consumed. Products are made. Data flows. Buildings heat and cool. Services respond. By evening, we’re home, ready to rest and begin again tomorrow. There’s a complex system of interdependencies that enables this all to function. It’s a city operating system. For most of us, experiencing these mechanics every day is both amazing and exasperating — sometimes simultaneously.

We owe it to ourselves and to future generations to create smarter cities.

Today, under duress, this operating system is largely inefficient and its fragility is glaringly exposed. If you sit for hours in traffic bumper to bumper or press up against one another in a train or circle endlessly searching for a parking space, you know our transportation systems are broken. If you get frustrated attempting to find city information or eliciting a city service or trying to have a voice to influence change in your community, you know municipal government can do so much more to meet expectations. Around every corner, time after time, we all experience the product of an urban environment in desperate need of reinvention. We owe it to ourselves and to future generations to create smarter cities.

What will it take to create this new operating system?

First we need the motivation. Are we sufficiently convinced that radical innovation is needed? The evidence suggests that we’re moving in the right direction. From India’s 100 smart cities initiative, to impressive projects in Amsterdam, Singapore, and Palo Alto, California, to new cities like Songdo in South Korea, Yachay in Ecuador, and Masdar in the United Arab Emirates, there’s a remarkable new momentum for change. But these cities and their respective leaders still represent only an elite group that are both talking and doing. It’s a good start.

Here are three areas to begin:

1. Culture

What these first movers also have in common are the unique qualities of leadership, vision, and an appetite and permission for some risk-taking. In other words, they have the cultural attributes necessary for creating a new operating system for their cities. It’s not impossible, but it will be harder for cities without these cultural qualities to embrace the essential reinvention ahead. It will come, but it will take time. It’s an opportunity for new leaders, and a push from the private sector can enable some of that new culture to emerge.

2. Civic engagement

Next, a new operating system will only emerge through partnerships and civic engagement. Given the huge costs and an almost infinite number of other competing priorities, local city governments will be unable to assume the responsibility necessary for building these smarter cities alone.

There’s ample opportunity for a myriad of partnerships with the private sector, with academia, federal agencies and the local community. Local governments are good at and well-positioned for convening organizations and individuals. Many cities already host challenges, hackathons, and meet-ups; they create incentives and platforms for participation; and they invite problem solvers to co-create. These are great starting points. Ensuring that local government is accessible and open will help get results. Cities with open data portals help to unleash valuable insight that is used to build solutions and drive decision-making.

3. Civic innovation

Finally, we’re going to need a whole new generation of technology innovation. We already see the advantages of mobile apps that help people find parking spaces, monitor and manage their energy use, interact efficiently with city services, make informed environmental choices, cycle more safely, elicit medical assistance, and so much more. We see how traffic signals will soon work in concert with connected cars to help with traffic flow. We’ve seen cities from São Paulo, Brazil to Los Angeles, California reduce crime by using data and intelligent software.

Forward-thinking cities are experimenting with the Internet of Things to connect all manner of infrastructure to data networks. This connected infrastructure is providing completely new capabilities, such as water leaks that are more easily detected and reported; empty parking spaces that announce themselves; and monitors that can initiate the timely dispatch of public safety personnel in an emergency. But this is just the beginning. A new operating system for cities is going to require big thinking, new ideas, and all the attendant new technologies, skills and people that emerge from this innovation.

History demonstrates that the capacity for human ingenuity, particularly in the face of overwhelming adversity, is a powerful force.

The choice we face

If you’re an optimist like me, you’ll see the creation of a new operating system for cities as an incredible opportunity. There’s a lot to be concerned about as we look out into the future of our cities. We’re facing unprecedented challenges. History demonstrates that the capacity for human ingenuity, particularly in the face of overwhelming adversity, is a powerful force. Today we stand at a crossroads: one path leads to the reinvention of our cities; the other to the continued rapid decay and decline of our urban spaces. Together we can choose the right path and create a new operating system for our cities.


VIDEO: Meet 7 CIOs That Are Creating Smart Cities in Silicon Valley

February 26, 2016 - 11:16 pm


VIDEO: Silicon Valley Forum Interview on Smart Cities and Open Data

February 12, 2016 - 9:00 pm


VIDEO: Brave New Connected World – My Talk at Silicon Valley Forum

February 1, 2016 - 9:46 pm


VIDEO: Creating Data-Driven Cities – ODSC WEST 2015

January 30, 2016 - 11:34 am


VIDEO: What Does a Government CIO Do?

January 27, 2016 - 3:05 am

Video also discusses the role of the Internet of Things in a government and city context.


Embracing Diversity for the Future of our Cities

January 15, 2016 - 9:17 pm

“In the game of life, less diversity means fewer options for change. Wild or domesticated, panda or pea, adaptation is the requirement for survival.” – Cary Fowler

As a technology leader in Silicon Valley, I have the opportunity to attend and speak at many technology conferences. While some notable progress has been made, I can’t help but observe that the vast majority of participants are men. It’s even worse when the audience is managers and above. Then it’s predominantly middle-aged white men, a statistic supported by recent research showing women represent only 19% of board positions and 25% of management roles.

Unfortunately, the technology industry continues to be a bad reflection of the diversity in American society. Together we have to fix this.

The Tech Industry Diversity Problem

For many of us there’s no news here. We know the hard facts and in the last year alone the media has called out many technology firms in Silicon Valley that have made slow progress in addressing a lack of gender and racial diversity. The ensuing embarrassment has created some momentum and here’s hoping we see real progress in the months ahead.

The absence of diversity in many companies isn’t just about fairness and equality. Several studies demonstrate that it can impair economic growth; damage brand; and lower productivity. As one startling example showed: executive boards with greater diversity had a 53% better return on equity (Source: McKinsey).

Any one or combination of these factors should be enough to compel organizations to be proactive in addressing this inequality. But in the technology industry and increasingly for most business domains today, a lack of diversity may be hindering the very thing we need in much greater volume: innovation.

Put bluntly, the current state of diversity in the technology industry is materially impacting the marketplace of ideas. We’re limiting our potential to be more creative.

A Societal Opportunity Cost

“We must tell girls their voices are important.” – Malala Yousafzai

Ideas and the refined mechanics to execute on them are at the heart of Silicon Valley’s success. In a healthy marketplace, ideas live and die on their merit as determined by a population. Assuming an idea emerges and it has the support to get a decent run at the market, it has a chance to succeed.

But what happens when a voice with an idea is not given a chance to be heard? We exclude that voice from the marketplace of ideas. Since we know from research that diversity contributes to creativity, it should logically follow that less diversity equates to less creativity. We’re artificially inducing silence where there should be noise.

What is the cost to society in never realizing these ideas and the positive innovation they could generate?

Civic Innovation Needs Many Voices

With the 21st century well underway, the world continues to experience rapid urbanization. By mid-century, two out of three people on the planet will live in a city. These urban centers around the world are struggling to meet the significant needs ahead. Much of the infrastructure is in desperate need of repair and modernization; city services lag behind community expectations; budgets are stretched thin; and major transitions related to areas such as transportation, energy, and sustainability have all yet to play out.

In this context, civic innovation, particularly that powered by technology, becomes a central support and driver for necessary positive change. To make our cities smarter, healthier, and more responsive will require new talent, new perspectives, and new ideas.

Simply put: If we’re going to meet the city challenges of the 21st century head-on, we’re going to need a lot of new voices at the table.

For the past four years, I’ve had the privilege to lead a technology team that is pushing the boundaries of civic innovation at the City of Palo Alto. Being able to reimagine the possible and experiment with new ideas in the heart of Silicon Valley was among the core reasons I chose to take this public service journey. With eyes wide open, I anticipated that introducing some risk-taking into projects and elevating the strategic role of technology in city operations would not be easy. With City Council and leadership support, that’s what we’ve done. Mostly we’ve had good results and we’ve learned from our mistakes.

While strong leadership support and Silicon Valley as our working backdrop have been essential components of helping our team succeed, I believe that more than anything, the top contributor to our ability to approach issues differently and to take greater risks has everything to do with our appetite and encouragement to listen to all perspectives. The diversity of our technology team, our City staff, and our community has directly contributed to a wider range of ideas; thoughtful perspectives; and a healthy amount of debate.

In other words, we’re better because as best we could, we embraced diversity.

Let’s Get to Work

“Strength lies in differences, not in similarities” – Stephen R. Covey

Of course we also recognize we can do better. After all, our potential new hires and workforce can only be as diverse as the conditions we are all helping to create. A local government organization is a microcosm of society. If we don’t encourage, enable, and facilitate more women and members of minority groups, for example, to pursue careers in technology, we’ll continue to perpetuate our limited options.

It’s clear then that if we’re going to address the complex challenges of local government and design and deploy more civic innovation, we’re going to need a more diverse set of voices at the table.

Diversity in all its forms has become essential to the success of all of us.

Sometime, hopefully in the near future, we will all be able to go to a technology conference and be warmed by the richness of differences in the audience. We’ll find the kaleidoscope of gender, race, age, ethnicity and more to be unsurprising. We’ll be struck by the range of ideas and perspectives in discussions.

As I reflect on the diversity challenge ahead and the compelling urgency to see change, I’m left with two questions: What will it take to get us there and how fast can we make it happen?

Together, let’s begin to answer those questions now.


This blog post first appeared in January 2016 on the League of Women in Government website.


VIDEO: BBC Featurette on Connected Cities

January 9, 2016 - 8:13 pm


The Impact of Social Media on Traditional Knowledge Management

December 16, 2015 - 1:12 pm

Successfully implementing knowledge management, which is broadly defined as the identification, retention, effective use, and retirement of institutional insight, has been an elusive goal for most organizations. Some of the smartest people I have worked with have been frustrated by their efforts, not through lack of trying or ability but by the inherent challenges it presents. Now the emergence and impact of social media and the way it democratizes the creation and use of knowledge in the enterprise is forcing us to rethink our assumptions.

To understand and discuss the challenges of the traditional approaches to knowledge management, I’ve categorized them into two simple buckets: behavioral and technical.

1. Behavioral

In order for a Knowledge Management System (KMS) to have value, employees must enter new insight on a regular basis and they must keep it current. Out-of-date information has limited use beyond being of historic value. Seldom are either of these behaviors adequately incentivized. In fact, by being asked to share their tacit knowledge, many employees believe they are reducing their own value to the organization. In addition, updating the information requires real effort, which is rarely a priority against the core responsibilities of the employee. Even organizations that have dedicated resources for managing knowledge struggle to keep it current and to enforce adherence to their single source of truth.

2. Technical

If you want to find out something about your organization, say, the revenue of the business, it’s often easier to use a popular search engine than to use your own internal knowledge system. Try this yourself.

It’s remarkably difficult to organize information in the right manner, make it searchable, and then present it so the most relevant responses are at the top of the search results. Organizational information is hardly an example of pristine structure. Public search engines rely on algorithms such as counting the number of web pages that link to a given page (a good measure of popularity); internal systems have no such equivalent. Unstructured content is the king of the public web, whereas it is the bane of the enterprise.
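To make the link-counting idea concrete, here is a minimal sketch of a PageRank-style calculation over a tiny, invented intranet link graph. The page names and the `pagerank` helper are hypothetical illustrations, not a description of any particular product; real search engines use far more sophisticated signals.

```python
# Simplified PageRank: a page is important if important pages link to it.
# Illustrative only; the pages and link structure below are invented.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                # Each page shares its rank equally among pages it links to.
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
            else:
                # Dangling page: distribute its rank evenly to everyone.
                for target in pages:
                    new_rank[target] += damping * rank[page] / len(pages)
        rank = new_rank
    return rank

links = {
    "intranet-home": ["hr-policy", "revenue-report"],
    "hr-policy": ["intranet-home"],
    "revenue-report": ["intranet-home", "hr-policy"],
}
ranks = pagerank(links)
```

Because two pages link to `intranet-home`, it ends up with the highest rank; an enterprise document store rarely has link structure this explicit, which is exactly the problem the paragraph above describes.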

The situation is compounded when employees are disillusioned by the effectiveness and effort to use the KMS and resort to old habits, like asking colleagues, improvising, or relying on non-official sources. The system often fails to be widely adopted—at best it is used by a small proportion of the organization—and no amount of effort is enough to see success scale.

Enter Social Media: The Changemaker

It may be time for you to rethink knowledge management in your organization. Social media, a disruptive phenomenon particularly in the enterprise, has the potential to completely upend traditional knowledge management systems.

In the old world order, knowledge was typically created and stored as a point in time. In the future, organizational policy or insight is less likely to be formed by an individual creating a document that goes through an approval process and is ultimately published. No, it will more likely begin with an online conversation and it will be forever evolving as more people contribute and circumstances change.

Social media takes knowledge and makes it highly iterative. It creates content as a social object. That is, content is no longer a point in time, but something that is part of a social interaction, such as a discussion. We’ve all seen how content in a micro-blogging service can shift meaning as a discussion unfolds.

The shift to the adoption of enterprise social computing, greatly influenced by consumerization, points to an important emergent observation: the future of knowledge management is about managing unstructured content.

Let’s consider the magnitude of this for a moment. Years of effort, best practices, and technologies for supporting organizational content in the form of curated, structured insight may be over. The redo is an enormous challenge, but it may in fact be the best thing that has ever happened to knowledge management.

A Silver Lining

In the long run, social media in the enterprise will likely be a boon for knowledge management. It should mean that many of the benefits we experience in the consumer web space—effective searching, grouping of associated unstructured data sources, and ranking of relevance—will become basic features of enterprise solutions.

It’s also likely we’ll see the increasing overlap between public and private data to enhance the value of the private data. For example: want to know more about a staff member? Internal corporate information will include role, start date, department etc., but we’ll now get additional information pulled in from social networks, such as hobbies, photos (yikes!) or previous employment. Pull up client data and you’ll get the information keyed in by other employees, but you might also get the history and values of the company, competitors, and a list of executives, gleaned from the broader repository of the public web. I’ll leave the conversation about privacy for another day.

It’s likely that social media-driven knowledge management will require much less of the “management” component. Historically we’ve spent far too much time cleaning up the data, validating, and categorizing it. In the future, more of our time and our systems will be used to analyze all the new knowledge that is being created through our social interactions. The crowd will decide what is current and useful.

Of course, formality will not entirely fade away. There will still be a role for rigor. Laws, regulations, policies, training documentation, and other highly formal content will require it. But it will live alongside and be highly influenced by social computing.

No doubt knowledge management is an enormously complex space and the impact of social media magnifies the challenges. However, the time is right to evaluate your knowledge management strategy. It may be time to begin anew.


VIDEO: How smart can a city be? – Jonathan Reichental Discusses with Tia Kansara

December 13, 2015 - 12:16 am


VIDEO: Interview on Apps, Smart Cities & More at WebSummit 2015

December 12, 2015 - 1:49 pm


VIDEO: Everything will be Connected (El Pais – English & Spanish Versions)

October 20, 2015 - 10:24 pm


Why Every Organization Needs to Be in the Cybersecurity Business

September 22, 2015 - 10:45 am

Many leaders believe their organization is either one that has already experienced a cyber-attack or one that will be the target of an attack in the future. A more accurate conclusion by those who study this field suggests reality is somewhat different. Their assessment is that every organization falls into one of two categories: it has either been attacked and knows about it, or it has been attacked and doesn’t know about it. Bad guys don’t always leave a calling card. Even more alarming, with many cyber-attacks being orchestrated over long periods rather than sudden smash-and-grab raids, an ongoing effort may be underway right now without organizational knowledge.

It’s a grim assessment but sadly, a very real one. I’d be highly redundant if I listed just a few of the major, high-profile breaches that have taken place in recent years. It’s enough to report from the Ponemon Institute that 43% of all enterprises were the victims of a known cybersecurity event in 2014.

It’s not all bad news. A recent PwC survey noted that 76% of business executives acknowledge the serious risk to their organizations from cyber-crime. This is a positive sign. But let’s dig a little deeper. Recognizing a risk is an important first step, but it amounts to nothing if little action is taken.

Cyber-attacks, while clearly disruptive and often highly expensive, are now existential threats to organizations. It’s more than just the impacts of, say, brand risk and legal costs. A concerted and far-reaching IT security event can effectively destroy a business. Throwing a few dollars and some talent at the challenge is little more than rearranging the deckchairs on the Titanic.

Organizations need a complete wake-up call on cybersecurity.

From the highest level of the organization, cybersecurity must be made a priority with significant investment and executive and staff-level talent acquisition.

Bottom line? Organizations need to be in the cybersecurity business.

What exactly does being in the cybersecurity business mean?

Sadly, for most organizations, the investment and effort in security is the equivalent of insurance: it doesn’t contribute directly to the bottom line, but it’s an essential cost for every organization. Cybersecurity, if it’s successful in your enterprise, will largely be invisible.

Let me be clear. Being in the cybersecurity business isn’t defined by employing the basics such as having anti-virus software and a firewall in your infrastructure. By that definition we’d be done already.

Being in the cybersecurity business means leadership of the organization has identified information technology (IT) security as an enterprise risk and is taking substantive and on-going action across all aspects of the organization to prevent future attacks.

Three things that organizations must do right now

1. Establish an enforceable cybersecurity policy

After you’re done reading this post, assuming you’re not sure, ask your team if a cybersecurity or IT security policy exists and whether it is current. Sure, you might have one, but does it reflect the realities of 2015?

A quality IT security policy will clearly outline the context and rules in which your organization operates and protects its digital assets. It will speak to dimensions that impact employees, customers, the public, and the wide range of stakeholders that interface with the organization.

It will be a document that has been endorsed by all leaders across the enterprise and it will be regularly updated as conditions dictate. There’s a large body of available knowledge on IT security policies, so a starting point is easy. If you recognize that your organization is now in the cybersecurity business, a meaningful IT security policy is a baseline artifact. Make it happen or improve upon what you have.

2. Train all employees in the basics of cybersecurity

Conventional wisdom suggests that the weakest link in cybersecurity in most organizations is its employees. But it’s more than that. Employees can be your best enforcers of a high-quality cybersecurity posture. Let’s take each of those ideas separately.

You know that your employees want to do the right thing. They deserve the insight on how best to protect your organization. It begins with the obvious such as guidance and enforcement of strict password rules. It should include what to look out for when evaluating whether to open an email attachment or enter security details in an online form. But it needs to go further to help employees know how to handle credit cards and social security numbers. It’s a leadership responsibility to ensure employees have the skills to do the job being asked and that includes protecting the enterprise.

On the second point: your employees can be some of your best enforcers too. Make it safe for them to report suspicious activity or for them to make independent judgement calls such as prohibiting tail-gating into restricted areas. An empowered workforce is a cybersecurity army that’s ready to be unleashed.

Finally, good behavior in cybersecurity must be modeled by leadership. They must demonstrate support for cybersecurity actions and be role models in all aspects of your organization’s IT security policy.

3. Complete an independent risk assessment of the enterprise

Let’s acknowledge that your organization has likely done many of the basics in IT security. Well done. To be in the cybersecurity business means doing a lot more. At this moment, do you have a confident understanding of where vulnerabilities are in your organization? The evidence suggests that many leaders simply don’t. Recent research by PwC suggests that as many as 50% of leaders see cybersecurity simply as an IT risk, not an enterprise risk. This nugget alone helps to explain why cybersecurity isn’t being made the enterprise priority it needs to be.

If your organization hasn’t done this recently, it’s time to get an independent assessment performed. In addition to providing you with sobering insight into your enterprise cybersecurity risks, you’ll have the evidence to create a case for action.

Congratulations, you’re now in the cybersecurity business

If you make cybersecurity an enterprise priority with strategic and tactical investments; hire the right talent; empower and train employees; and have an enforceable policy that reflects current risks, you will have a more resilient organization. In a world where we’re likely to never fully protect ourselves from cyber-attacks, we can take the necessary and urgent steps to be better able to anticipate, defend, and recover from attacks. To make this happen, like just about everything else in our organizations, it’s going to require bold and informed leadership.

It’s a new day for enterprise risk and a new day needs new thinking. You probably didn’t realize it before, but assuming you do the right things right now, you’ll soon be in the cybersecurity business.


Four Ways to Make Metrics Really Count

August 21, 2015 - 9:18 pm

Sometimes it’s worth reminding ourselves that the most important skills and tools for organizational success aren’t the ones that cost a lot or are difficult to execute. Often they are the forgotten fundamentals; the mundane but effective techniques that can really help. Developing and using metrics, even in the smallest team deep-down in the org chart, is one of those areas. Get started or re-embrace your commitment to metrics and you’ll be pleasantly surprised at the results.

Rediscovering what gets measured gets managed

Are you able to view or run a report at a moment’s notice and determine how well your organization is doing in a specific area of, say, sales or service delivery? If you can then I congratulate you. You have timely data to help you make informed decisions for leading your organization to success.

Sadly, for many leaders this simply isn’t the case. Why might that be? Reasons can range from simply not making performance metrics a priority to not having the skills or systems to provide the desired data. Whatever the reason, a leader and a team without access to timely data is at a considerable disadvantage. Make metrics a priority and you can change the game in your favor entirely. Imagine being able to spin up a dashboard of data about things that really matter to your team and its objectives.

A discussion that incorporates relevant metrics lends itself to rational, data-driven decision-making. It reduces emotions and anecdotes and helps everyone focus on action. It’s one hallmark of high-performing teams.

If you’ve reached this far I gather you’re curious to learn more. Thank you. I now offer four ideas to make metrics really count for you, your team, and your organization.

1. Aim High, But Start Simply

It’s natural to begin the thought process for a metrics dashboard by believing you need to start with skilled talent; a data warehouse; and some form of data visualization tool. These things become more important over time, but let’s not start there. What matters most at the beginning is to get everyone convinced that the effort is worthwhile. Develop your best arguments for why data-driven decision making is good for everyone. Focus on the eventual positive outcomes.

Next, create an inventory of data that is currently collected and available right now. Don’t let the conversation drift to the desire to start collecting what you don’t already have. That will come later. It may be ugly, but every team has some form of existing data collection.

How about tools? Without any purchasing investment, there are amazing things that teams can do by simply using their existing productivity software or free online spreadsheets. In one of my teams, we did that for an extended period until we were ready to make tool investments. It doesn’t have to be pretty, it just has to be useful.

With some basic data sets and a modest tool, you’ve got yourself a metrics dashboard.
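As a hypothetical sketch of that starting point, a few lines of Python over a CSV export a team already produces can stand in for a dashboard. The help-desk data, field names, and metrics below are invented for illustration:

```python
import csv
import io
from statistics import mean

# Invented export from an existing help-desk tool; in practice this would
# be a CSV file your team is already generating today.
raw = """ticket_id,category,days_to_close
101,network,2
102,accounts,5
103,network,1
104,accounts,3
105,printing,4
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Group resolution times by category.
by_category = {}
for row in rows:
    by_category.setdefault(row["category"], []).append(int(row["days_to_close"]))

# Two simple, actionable metrics: ticket volume and average days to close.
dashboard = {
    category: {"tickets": len(days), "avg_days_to_close": mean(days)}
    for category, days in by_category.items()
}
```

Nothing here requires a data warehouse or a visualization tool; the point is that even this rough cut gives a team something concrete to review together in a metrics meeting.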

2. Have Regular Metric Review Meetings

For many teams, collecting data and preparing it for review can seem like a worthless chore. There’s a common belief that few look at the reports and they offer no value to the decisions that matter. This is unfortunately validated as truth in many instances.

If you’re going to collect and make metrics matter, they need to be reviewed together as a team. These review meetings can be some of the most educational and valuable times that teams spend together. Have the team members who are responsible for the data interpret and present the results.

When team members observe that decisions are being made based on the data, the result is often improved trust. Who doesn’t want that?

3. Only Collect and Review Metrics for Decision-making

Sure, this tip seems obvious, but it is seldom adhered to. Teams that already collect and present data must choose a time to run every metric through the filter of whether it is an actionable item. If the metric is no longer something to base decisions on, eliminate or archive it. Over the months and years, metrics-creep has likely resulted in a large and diverse set of reports. Periodic purging is necessary.

For teams new to performance dashboards, be discriminating. Carefully review the value of data that you’re proposing to collect and present. You and your team will be surprised at the dialogue this creates. A better understanding of what the team does will surface. Collecting and presenting actionable data effectively describes the purpose of the team.

4. Share Metric Insights

At first, your metrics and reviews will likely have a small audience. Begin with a small team that includes managers and supervisors. Meet regularly, document decisions, refine the process, and add new metrics. At some point it will become clear that others will benefit from visibility to a subset of your metrics. Who might they be? Think about senior management; your customers; your suppliers; your peers and other teams.

I’m a big believer in telling your team story on a regular basis. Sure, it can take many forms, but I’ll argue that metrics need to be a part of it. Team metrics quickly communicate the value of what your team does. It will greatly assist with decision-making; for example, when trying to make the case for additional funding.

I’ve found that bringing metrics to the table changes the conversation. It helps to reduce emotional elements, personal attacks, and anecdotes. Assuming the data is high quality, a superior dialogue should result.

And Finally…

A discussion on metrics isn’t the most glamorous topic for most of us. What most of us do desire however, are simple leadership techniques that bring better outcomes. If you’re not making performance metrics count in your organization or team, you’re missing an easy instrument to get better results.

The four ideas in this article are intended to motivate you to action or reinvigorate an existing, but neglected area of your team. Those that are new to metrics can follow the ideas and, in an iterative process, improve on each one as time passes. It won’t be too long before you’ll want better data analytical tools and perhaps want to hire specific talent.

In my career I often remind myself to revisit fundamentals. It’s these reminders that can help sharpen the saw and get us reinvigorated in our careers.

Remind yourself to make metrics really count. You’ll be glad you did.


VIDEO: Creating a New Operating System for Cities – Jalisco Campus Party

July 31, 2015 - 1:11 pm


PODCAST: The Future Belongs to Cities – But We’re Not Ready Yet

July 2, 2015 - 9:09 pm


VIDEO: Promo for My Talk at Campus Party Mexico, July 2015

June 6, 2015 - 11:21 pm


Promo Pic for Digital Infrastructure Summit

April 10, 2015 - 12:26 pm



VIDEO: Marin County Council of Mayors & Councilmembers Dinner Talk on Data

March 28, 2015 - 1:48 am


Open Government Podcast Part 1 & 2

March 18, 2015 - 1:21 am

The Open Government podcast, hosted by a Canadian duo, interviewed me about the work the City of Palo Alto has been doing around government innovation and more. The interview is split across two episodes. Total time is 30 minutes.


CIO: Chief Inspiration Officer?

February 27, 2015 - 7:31 pm

“If you want to build a ship, don’t drum up people together to collect wood and don’t assign them tasks and work, but rather teach them to long for the endless immensity of the sea.” – Antoine de Saint-Exupéry

Some time ago, as a member of a panel speaking before technology leaders from various industries, I was asked what single, most important piece of advice I would give a new chief information officer (CIO).

Just prior to the question, I was struck by how our discussion had been rather sobering in nature. We were dwelling on some of the more challenging issues facing our profession: excessive and increasing demand to deliver more solutions; overworked and underappreciated staff; and a technology playing field that changes the rules far too frequently.

Deep inside the discourse on the state of any profession, it’s understandable that the pain points often get all the attention. While careful discussion of current issues is vital, it’s also incumbent on leaders to balance debate. To focus entirely on challenges in this forum risked the potential to miss the complete story: a CIO has the ability to lead important and meaningful business change; to create enormous value; and to impact staff and customers in ways that delight.

The most essential role of the CIO?

When the panel facilitator turned to me to address the advice I would give a new CIO, I wanted to directly speak to what had been on my mind. I responded, “The CIO should not just think of him or herself as simply the chief information officer, but rather as the chief inspiration officer.” I went on to explain that in an environment where it is easy to be dragged down and feel beaten by some of the realities of the job, it is essential to remind staff of the enormous value of technology and the magic it can create in people’s lives and in the function of organizations. My point was that the skills to create an environment that inspires must complement a CIO’s arsenal of genuine leadership abilities.

Inspiring staff by creating a compelling vision and strategy for technology is one of the lowest costs, yet most effective activities a CIO can do.

As a technology leader, there are a lot of pressing priorities and demand for attention is high. Team members feel the burden of delivering increasingly more complex solutions with less available capacity and in faster time. Inspiring staff by creating a compelling vision and strategy for technology is one of the lowest costs, yet most effective activities a CIO can do. A vision that produces positive, tangible results reminds everyone why we all do this work in the first place.

So how can a CIO inspire?

Inspiration is hard to teach, but if you find a great way to express your passion and have it connect with others, you’re usually heading in the right direction. To inspire requires relentless positivity. It requires brilliant storytelling. Ask bold challenges of your team members. Participate in the action. Most of all, a leader must believe in his or her words; that belief will shine brightly in their face and energy and manifest in supporting behavior.

It’s also important to recognize that driving inspiration is not limited to the CIO. Regardless of your role, inspiring others has considerable value and it feels great.

Often we each need a reminder of the core behaviors that can make each of us respected and appreciated colleagues and leaders. A long time ago I took my own advice and made inspiration a personal job requirement.

A version of this piece first appeared in O’Reilly Radar and was also the basis for a keynote talk at a technology conference.


Palo Alto Weekly Cover Story: Big Data in Little City

February 21, 2015 - 2:19 pm

The Palo Alto Weekly features a cover story of my team’s work at the City of Palo Alto. Click the image to read.


Disruption at a Moment’s Notice

February 14, 2015 - 5:19 pm

In the world of tech, we recognize that introducing a new product or service is often highly disruptive to an existing market and its competitors. What is relatively new is the speed and scale at which that disruption can take place. The torrent of punditry that precedes these introductions is notable in itself. It has created an unusually unsettled technology marketplace.

The costs of sudden impact

In a hyper-connected world, the immediate reach and impact of a new provider can produce disproportionate results from just incremental innovation–whether or not the solution even succeeds. It is the innovator’s dilemma in overdrive. This disruption at a moment’s notice largely eliminates the notion of a first-mover advantage. Flickr and Friendster can both vouch for that.

It would be easy to conclude that this disruption is a destructive force. Sure, there is something to be said for the uncertainty it can sow, and honestly it is impossible to know quite where it will take us. There is no doubt that organizations are being challenged in unprecedented ways and many consumers are riled by the constant volatility. I also have to believe that at some point every one of us has a capped quotient for fickleness.

At its core, the effect of disruption at a moment’s notice is an economic phenomenon. Clearly there is an important technical component, but introducing a new product or service that can have rapid and far reaching impact will shift existing market behavior — even if temporary in nature. In some instances, for publicly listed companies, the business introducing the technology may experience a bump in stock value and its competitors may see theirs experience downward pressure. Organizations too may wait for stability before making investments.

Let’s not be too hasty

When a new online service or app is introduced, pundits are quick to claim the imminent demise of its main competition. These existing organizations have worked hard over several years to earn, for example, each subscriber, friend, and follower. These analysts are often far too hasty and optimistic in their predictions. A sudden injection of viable competition is a great catalyst for innovation. It is one thing for customers to complain about existing functionality in the market leader’s service; it’s quite another for that leader to respond to the potential existential threat from a new, disruptive competitor. Google makes Microsoft a better software company. Tesla makes Ford a better car company.

Understanding the implications

As an example, observing the speed of innovation currently in the cloud computing space reminds us that intense competition and the risk of sudden disruption is bringing innovative, low cost capability to buyers quickly. Take a look at online storage. It’s a moving target, but intense competition and innovation is forcing the cost down towards zero. Disruption is a compelling forcing function.

Currently we see dynamic and healthy competition in the domain of smartphones. But it is also a fragile battle. Now largely dominated by a small set of participants — solutions created by organizations with healthy balance sheets — innovation is alive and kicking. But should one stumble, a dominant player could emerge and we could see innovation atrophy. We don’t want that. Consumers are advocating for disruption; albeit, managed disruption.

Conversely, disruption is causing mainstream businesses to die overnight, eliminating any notion of predictability and dislodging people and downstream processes along the way. Robots and artificial intelligence are making us nervous about how their advantages may shrink human participation in the labor market. And a nexus of emergent technologies and behavior is demanding that we think about privacy in completely new ways.

Advocating for disruptive innovation

As an IT leader I encourage rigorous, disruptive innovation and competition, as it helps to keep product and service costs low and can accelerate the introduction of desired functions and surprising new solutions. I also want this innovation to restrict the ability of large, domineering players to create a closed Web or to reduce the very freedoms that make it so empowering.

But with this level of innovation, I’m also concerned by the potential long-term costs and with user fatigue. Organizations and their staff are increasingly experiencing the chaos and downsides of frequent change management.

Disruption at a moment’s notice has the capacity to elicit considerable change in the way many organizations operate and compete. Nobody wants to be Kodak or Blockbuster, both of whom had time to change, but underestimated the disruption to their industries. Everyone wants to be Netflix, who moved in record time from mailing DVDs to streaming online to become a market leader.

Being ready and able to respond to market surprises should be a focus for every organization. Do you know where the next disruption may come from that impacts your industry? Who’s thinking about that in your organization?

What all this means for how organizations function and how consumers respond over the long-term has mostly yet to be determined. Fortunately, whether we act or not, we can rely on the marketplace to largely help sort out what happens next.


Huffington Post: A Top 100 Most Social CIO on Twitter in 2015

January 16, 2015 - 11:06 pm


Civic Innovation: Ready, Set, Go!

- 10:59 pm

One of the reasons I moved from a successful 20-year career innovating in the private sector to spending time serving the public sector wasn’t because it was a premier place to innovate. In fact, I was attracted to the public sector exactly because it was the opposite.

Let me explain.

During the 2000s, looking across industry domains, it was becoming obvious that highly disruptive business transformations were beginning. From newspapers to telecommunications; from entertainment to banking; from healthcare to retail and so on, there was hardly an industry that was–or would be–untouched by the transition from analog to digital; from manual to automated; and from inefficient to optimized.

But among all the transitional chaos, one big section of our economy was largely sitting idly, functioning poorly, and generally being bypassed by the energy seen elsewhere. This was the nation’s largest employer: the government.

An opportunity to innovate emerges

For me, and as it turns out for a whole lot of other motivated people, this seemed like an opportunity to make a real difference. The lack of innovation was the incentive. Pushing hard so that the spotlight, even if only partially, could shine on this neglected sector and spur action seemed a compelling mission. It was clear that a whole new mindset was required to transform government. For simplicity we categorized this work as civic innovation.

Civic innovation means many things. It can capture the emergence of online services that help citizens avoid standing in line at a government office. But it can also mean the work that’s needed for us to build cities that serve the demands of our communities in the 21st century. It can mean rethinking how we power our homes; how we deliver and use transportation; and how we provide quality, affordable healthcare. It’s a broad area and I’ll keep it that way for the purpose of this short piece.

Civic innovation arrives in small town America

It’s important to acknowledge that civic innovation per se is not a new concept. For sure, in many big, modern cities some amazing things have been taking place for some time. But civic innovation occurring in a handful of large metropolitan areas is not a movement that will touch the many millions of Americans who live mainly in small cities.

That said, there are some clear indications that make a lot of us very optimistic. A new generation of public servants, from governors to mayors to a vast hierarchy of civil servants, is bringing to bear a new vision for the future. And most promising of all, the first real venture capital funds are being committed to start-ups that want to compete in this under-served market. Money usually follows where opportunity lies.

A call to action

Most of this is not news to those of us who choose to serve in this way. We’re seeing the formation of national and international groups focused on civic innovation work such as open government and smart cities. Conferences and meet-ups abound.

I write and talk about this area not to convert the already converted but to bring awareness to the vast majority who have little awareness of the emergence of civic innovation. It’s to provoke and inspire others to join in serving this opportunity. If the motivation isn’t altruistic for some, there’s an economic one that’s totally fair in an open market system. According to some estimates, the innovation needs of our cities may amount to a trillion dollar annual market by 2020. While so many of our economic sectors are continuing to shrink, government needs—particularly city infrastructure requirements—will balloon (I’m not referring to bureaucratic bloat, that’s a topic for another day).

What are we waiting for?

Communities are demanding more efficient governments. They also have high expectations for the way cities function. Many areas are already broken and much is on the brink. Solutions will come through public-private collaborations. Partnerships will need to emerge that bring together disparate stakeholders. What might that look like? Google, a company made famous for creating the world’s leading search engine, will need to seek partners to build cars for the software they’ve created to make those cars drive themselves. The future promises a surprising gang of heroes.

Finally, why is civic innovation a race at all? I call it a race because we need to go fast and we need healthy competition to drive civic innovation forward. While certainly clichéd, this is a race where we need everyone to win. If we want to see our cities get smarter and our governments become more efficient through digitization, we’re going to need the innovators, the investors, the skills, and the vision to sprint towards a better future.

Are you ready? On your marks…

This posting first appeared in


Why is Nobody Talking About Small Data?

January 6, 2015 - 7:57 pm

It seems everyone is focused on big data. And why not? Today the world is producing an extraordinary volume of data. Our prolific machines and interactions are now venting a massive scale of data exhaust unprecedented in our short digital history. Big data is spinning up stunning visuals that are providing completely new understandings. Suddenly we’re embedded in the zettabyte era.

But doesn’t all data matter? Might our obsessive focus on big data come at the cost of data at the edges? That’s the small data. It’s the frequently used but largely unglamorous data that exists in every organization. It’s the monochromatic and fundamental storytelling that remains largely untapped. It seems big data has become the main act and all other data has been relegated to playing support.

Big data really is a big deal

No doubt, capturing and analyzing massive scale data is changing the world. That’s not hype. Public safety teams can anticipate crime through predictive policing; public sentiment can be derived with uncanny accuracy on almost any topic; marketers can promote products to potential buyers with pinpoint accuracy; and we’re tantalizingly close to medical breakthroughs only imagined just a few years ago. These are just the tip of the iceberg of the places big data will take us.

An impressive new industry is emerging in support of big data. Remarkable new software tools and an army of newly minted data scientists have come to bear on a flourishing industry. The results are impressive. Everyone recognizes this and there is much work ahead of us.

With big data, the datasets involved have records in the millions, if not billions and likely much more. Our new capabilities mean we can store, organize, analyze, and attempt to make sense of it all. We acknowledge that a large number of organizations have datasets of this scale and they desperately desire the potential competitive advantage of having data science applied against it.

But isn’t it possible that an equal number or more of organizations have datasets that fall well below this threshold and thus may be failing to realize the value in their small data? Will the brigades of consultants and the investments in new big data software pass by the magnitude of little opportunities that exist in every organization and every team?

Small data still runs organizations

Each of us who has, for example, worked in a business, served in government, or held a leadership position in a club has used or been exposed to small data. It’s the spreadsheet, the contact list, the survey results. Small data is sometimes the by-product of big data, reduced to tiny chunks that humans can understand. It’s the experiment results, the financial information, the queries, and the reports that are so meaningful and so frequently requested. A legion of office workers toils over this data daily, often hindered by incomplete skills and poorly understood productivity software. This is the data whose value is so often only superficially gleaned. But it’s equally essential data that informs decisions every moment of the day. Sure, it doesn’t make the big headlines like big data, but it’s no less important.

Is the value of big data realized at the cost of small data?

Yet, we’re all enamored by big data. We’re simply not talking right now about small data the same way. Of course we have to continue the pursuit of unleashing the full potential of the former. There’s no doubt it’s changing the world and in the months and years ahead it will be decisive in helping to deal with issues of climate change; for our understanding of the Universe; for enabling astounding medical breakthroughs; and to help power our future Smart Cities. But let’s not redirect all our energy such that we lose sight of the innovation necessary in managing and understanding data in the nooks and crannies of every organization.

Let’s also recognize that some of our big questions will be answered in small data. Critical signals in the noise are not limited to the big datasets. There are beautiful patterns in little queries if we know how to look for them.

The bottom line? A pat on the back to us. We’ve recognized the enormous and increasing value of data, inherently acknowledging that all data is important. We can’t ignore this fact in our pursuit of the new shiny thing. We have to continue our relentless innovation and skill building around data and we’ve got to make sure we’re including small data as we do that.


PHOTO: Cities Need a New Operating System for the 21st Century

September 20, 2014 - 2:15 pm



PHOTO: Promotion for Keynote at Campus Party Quito, Sept 2014

August 15, 2014 - 4:15 pm



PHOTO: Facilitating a Session on Lean Methods at IdeasCamp, Berlin

- 4:11 pm

Credit: Jasper Juinen


VIDEO: Palo Alto Apps Challenge TV Finale (1-hour show)

June 13, 2014 - 5:17 am


Reinventing Government One App at a Time

April 4, 2014 - 9:43 pm

On one hot day last June, along with civic hacking events in 83 cities participating in the first-ever National Day of Civic Hacking, the City of Palo Alto, California, held an outdoor festival of civic innovation. Approximately 5,000 people showed up to discover and be inspired by a wide range of technology-related talks and solutions for delivering government in completely new ways. While some software hacking took place, the focus was on beginning both the education and the conversation on defining civic innovation and why it is so important to all our communities. The festival was a success and was highly praised by the community and at a special event at the White House later in the summer.

This year, as a follow-up and to coincide with the 2nd National Day of Civic Hacking, the City of Palo Alto decided it was appropriate and timely to move from facilitating the discussion about community-driven civic ideas to helping to provide a platform to build solutions. And from this the Palo Alto Apps Challenge was born.

The motivation to take action

Palo Alto is a small, but notable city just south of San Francisco. It’s the birthplace and heart of Silicon Valley. The city continues to be a place where great ideas emerge and come to fruition. Ideas here change the world. As the local public agency, Palo Alto has both a responsibility to leverage this environment and experiment with delivering services that take advantage of both state-of-the-art technology and local talent. It’s now clear that many other agencies watch us in order to learn what works and what doesn’t; the good and the bad. All of these characteristics form the motivation for the technology-related projects we work on and the partnerships we create.

The Palo Alto Apps Challenge helps to fulfill this responsibility. Through this multi-month, American Idol-style competition, entrants primarily from Palo Alto—but also from surrounding communities–submit app ideas that they hope to build based on the theme of civic engagement. The challenge is managed out of the City Office of the Chief Information Officer (CIO). Other departments assist as appropriate since an initiative such as this requires a wide variety of resources. Funding is largely through sponsors with the City only contributing a small amount for some project management assistance. The challenge is enthusiastically endorsed and supported by both the City Council and City Manager.

How does the challenge work?

The challenge works as follows. Ten finalists are chosen by a panel of judges—all Palo Alto residents with a technical or public policy background—and then the finalists must set about building a working prototype of their idea. Next there is a showcase event where the community is invited to learn about the finalist apps and to provide meaningful feedback. The last event is the grand finale, a televised and Web-streamed show that will highlight the apps and entrants and then invite the audience and community to vote for their choice of winner using their phones and computers. There will be 1st, 2nd, and 3rd prizes of $3,500, $1,000, and $500, respectively. Winners will also be offered free incorporation services should they decide to form a business for their app. Winners retain all rights to their idea and app.

On one level the Palo Alto Apps Challenge is a practical and engaging event that should result in one or more solutions that provide value to the community.  But we believe that this challenge is also about a deeper message.

It’s much more than an apps challenge

The future success of the US has much uncertainty. In the last few decades we’ve seen many of the traditional industries in America either disappear or be entirely reinvented. Anyone in the printed press industry knows exactly how this feels. The change is coming about as a result of globalization and massive automation. While many are pessimistic, within this shift is great opportunity. Automation is resulting in a world that is both software and data driven. Because of this, new skills and talent must emerge. As a nation we have to refocus on science, technology, engineering, and math (STEM)—and by the way, art too! Design is no longer an afterthought. The Palo Alto Apps Challenge is one way that one community can help to inspire a new generation of engineers and innovators. We never lose sight of this critical message and motivation.

What happens next?

At the time of writing, the judges had just announced the top 10 finalists. In total, 74 ideas were submitted and 30% of the entrants were under the age of 18.  The ideas chosen represented an eclectic variety of solutions. There are ideas for making it easier to find parking spaces; an app for giving a voice to youth on City Council items; an idea that adds gamification to the process of learning about the city; and a solution for crowdsourcing places that have good and poor support for physically challenged individuals. Next up, the 10 finalists will showcase their ideas at a community event at the Palo Alto Art Center.

At the City of Palo Alto we’ve decided it can no longer be business as usual. We recognize it’s not just about apps. It’s so much more. We are pushing the envelope on new thinking across our departments while also ensuring the important and routine work of government gets done.

It’s a whole new day in local government and we are firmly engaged.

For more information on the Palo Alto Apps Challenge, go to
A version of this article first appeared in the April 2014 edition of Transformations, the Alliance for Innovation newsletter.


Technoethics: A New Frontier for Leadership

December 18, 2013 - 1:03 pm

Leaders: You understand the value of technology to your organizations. You know technology can be the difference between competitive advantage and irrelevancy. So I’d like to share with you why I am optimistic about the future of technology and why as leaders we need to be concerned too.

Depending on how you slice it, we’re about 40 years into the information age. And what a ride it’s been. Beginning with the miniaturization of processing power, we’ve seen innovation after innovation. Transformation after transformation. Every aspect of our lives has been impacted.

From the late sixties until today, with regular cadence, the number of transistors on a microchip has doubled roughly every 24 months. As we all know, when you double something repeatedly, the quantity rises exponentially. If you double 4 four times, you don’t get 16, you get 64. Over the past 40 years, processors have become very fast indeed. And they’ll get faster still.
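The arithmetic of repeated doubling is easy to check in a few lines of Python (a minimal illustration of the point above, not anything from the talk itself):

```python
def double(value, times):
    """Double a starting value the given number of times."""
    for _ in range(times):
        value *= 2
    return value

# Doubling 4 four times gives 64, not 16:
print(double(4, 4))  # 64

# A doubling every 24 months over 40 years is roughly 20 doublings,
# which is about a million-fold increase:
print(2 ** 20)  # 1048576
```

Equivalently, n doublings multiply the starting value by 2**n, which is why the growth is exponential rather than linear.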

We have gone from calculators to super-computers, from standalone to networks, from pagers to iPhones. Today we are solving all sorts of difficult problems from building smarter cities to curing diseases. All of these are products of exponential technology innovation.

As we look out to the next 10 years, we will enter uncharted territory. Computing power and capability will be at a point where if we can think it, we may be able to build it. And that’s why I’m optimistic. We’re going to solve the big problems of our day. As leaders, you’ll be a part of that.

But here’s the rub. This is the concern I have. The potential bad news and the challenge we have as leaders. If we can use technology to achieve all manner of outcomes, the question is no longer:

“Can we do it?” But rather, “Should we do it?”

With great power comes great responsibility. I believe as leaders we’ll soon all enter a period where the ethics of technology will be the dominant technology consideration. We will enter the age of technoethics.

Great leaders will continue to lead in the age of technoethics.

But I’m an optimist. This isn’t really bad news. Technoethics will be a new skill set. An important skill set that we can all learn.

So when the question is asked, “Should we do it?” you’ll know how to help answer that question.

It will be the most important one.


VIDEO: How Data Can Help Create Better Communities

September 24, 2013 - 5:17 pm

In an era of government deficits, it’s comforting to note that there is an abundant surplus of data. But until recently, leveraging value from data beyond its initial creation and use has been difficult. Today, this picture is changing. A combination of new technologies and a more enlightened emerging leadership is finding innovative ways to put data to work. Beyond much-desired transparency and accountability, making government data more easily accessible is creating a wave of valuable community applications. In this video, I discuss this topic, explore best practices, and share my thoughts on civic innovation.

[NOTE: Video is 1 hour 20 mins]

(Backup link if video above does not work:


VIDEO: CityCamp Palo Alto & HP

July 3, 2013 - 11:26 pm

On June 1, 2013, nearly 100 cities throughout the US brought together public and private sectors to use software, technology and ideas to build better communities as part of a National Day of Civic Hacking. In Palo Alto, I was the founder and creator of CityCamp Palo Alto, our event on June 1. Here is a video, produced by HP, that focuses on their important contribution and how it ties into their own strategy.


Does social computing have a role in government?

April 5, 2013 - 2:04 am

As of early 2013, there are over a billion active monthly users of Facebook and almost 700 million daily users. People from across the world use this social network to share and exchange stories, pictures, ideas, and more. These numbers suggest a compelling platform that is engaging humanity in a manner without precedent. Facebook and its competitors have convincingly demonstrated that people will share and collaborate with each other, and with strangers, in an inclusive manner not just for fun, but to make things happen. And yet, when most of the working population of those users goes to their places of employment, they use technologies that reinforce barriers to collaboration. Email—albeit an important business technology—primarily facilitates sequential and non-inclusive collaboration. Until recently, the merits of social networking have had the hardest time penetrating the enterprise.

The tide is turning. Today, an increasing number of organizations are exploring, experimenting, and deploying social collaboration tools. They are becoming social enterprises. Why the change? It may be because of better solutions or more leadership support or greater recognition of its potential value. Perhaps it is a mix of all these things and more. But without a doubt, workers rising through the ranks today are more apt to try social networking in the enterprise. They have already accepted and blended the use of technology in their work and home lives.

With enterprise collaboration in the private sector just entering the early majority phase of the technology adoption curve, where does that leave the public sector? As one would imagine, there are only a few innovative agencies that have taken the leap to build a social enterprise. Most notably the City of Boston and the State of Colorado have been pushing the envelope. With the recent deployment of a social collaboration platform citywide, we can now count the City of Palo Alto as one of those innovators.

City Manager Creates the Business Case
It all began in March 2012, when City Manager Jim Keene approached me and asked if there was a way for him to more easily communicate and engage with all City staff. There’s nothing I like better than a tough challenge to solve, and this one met that criterion. We sought to keep costs as low as possible and at the same time to find a solution that was truly cutting edge. I also wanted to go beyond the original request and find a way not just for the City Manager to engage with staff, but for all staff to be able to connect more easily with each other, to share ideas and documents, and to solve problems together. If we could do that and more, we could create an agency with improved access to timely information and a platform to solve problems more quickly. This, we surmised, would have a very positive impact on the services we provided to the community.

At the City of Palo Alto we have the responsibility and privilege to be a role model in how a public agency should use innovative technology to serve a community. As an example, last year we deployed an award-winning open data service. We didn’t just repeat the work of others; we applied new ideas and innovation to our solution. We know other agencies look closely at what we do as guidance for them. So rather than going down the well-trodden path, we often want to chart new territory. Of course, this strategy has implicit risks, but we don’t apply it to everything. It’s a deliberate and balanced approach for the right, qualifying projects.

The City Begins an Experiment
In April 2012, after extensive research, and the application of some of my own personal work experience in this area, my team and I decided to deploy a small, short-term experiment with Salesforce Chatter. While several impressive solutions met the basic requirements, Chatter was compelling because it was closely modeled after Facebook. It was also exceptionally low cost, and being a software-as-a-Service (SaaS) solution, it was easy to deploy.

My expectation was that we would run the experiment for about three months and we would have around 50 early adopters. To my surprise many more people wanted to try it out, so the number of users grew very quickly. I then learned something important: with only a few months to try it, many users were deterred, since it didn’t seem worth the effort for the short time the tool would be available. So I quickly decided to extend the experiment through the fall. In no time at all we had over a hundred users. Most were just viewers, with only a handful of staff brave enough to post items. Staff posted pictures and created groups. One group was used to encourage members to eat healthier and exercise more. Another was for sharing tips and tricks for smartphones and tablet computers. These were not earth-shattering collaborations, but they showed the promise of collaborating in a manner that previously did not exist.

I personally committed to posting and commenting—common features of a social network—and I encouraged the IT team to follow my lead. By the time we were reaching the end of the experimental period, it just seemed too premature to end it. More people were joining and there were good discussions about social networking happening at the organization. In late fall of 2012, we had over 200 people signed up. Again I decided to extend the experiment until early in the New Year. At that point, we committed to making a decision whether to deploy it citywide or shut it down.

The Decision to Deploy Citywide
When my team and I reviewed the metrics in early 2013, the data was not overwhelmingly conclusive, but it was sufficiently persuasive to make a decision. In conjunction with our City leadership team and the City Manager, we agreed to deploy Chatter citywide for a period of up to 18 months.

On March 1, 2013, almost a year after we started to think about social collaboration at the City, we invited all staff to participate. So far, so good. Lots of curiosity and great questions. It’s far too early to know if we are on a course to change the nature of work at the City. We’ll gather that evidence over the medium term. But we’re doing things differently and opening our minds to a whole new world. We don’t want to play catch-up; we want to lead. It’s beginning to be clear who gets it and who is still trying to figure it out. Of course, we have plenty of people who don’t get it at all and are not shy to share their view that it doesn’t seem to offer them any value. But isn’t that one of the greatest challenges of innovation? Those of us tasked with anticipating a possible future, even when we have little idea what that future will bring, must push forward with our ideas despite enormous pressure from the naysayers and antagonists. If there is success, everyone wins. If there is failure, we learn something and then apply those lessons as we move forward with other innovative experimentation.

In 2004, hardly anyone had heard of Facebook. Nobody knew they would want it or even what value it could have in their lives. Less than 10 years later, Facebook, a social network, is one of the most remarkable phenomena of our time, and a billion people have discovered a new, fun, and productive way to interact.

Can we do the same at City Hall?


VIDEO: Palo Alto Continuing Open Data Push for City Government

February 23, 2013 - 3:29 pm

California Forward first reported on the city of Palo Alto’s Open Data Platform in August. The city is using technology to create a more inclusive form of local government. Months after its launch, we wanted to find out if citizens are answering the call to become more engaged.


California Forward is a nonpartisan, nonprofit organization working to bring government closer to the people and move the state in the right direction – forward. They believe empowered local communities are best equipped to solve their own problems, and there is a critical link between many of the problems that threaten our future and our state government, which has become ineffective, unresponsive, and unable to fix itself.


VIDEO: Robots, Space Travel, Open Data, and Other Thoughts

- 3:19 pm

Antonio Savarese, a journalist for the Italian magazine Data Manager, joined me at City Hall during a recent trip to Silicon Valley to discuss a wide range of topics. His published interview with me is available here. In addition, he recorded a video interview, which can be found here. His questions allowed me to elaborate on some of the work my team and I are doing at the City of Palo Alto and to share my thoughts on the future of technology. It is a short, 14-minute video.



VIDEO: Streams, Gardens, and Clouds – My Lecture on Open Data at UC Berkeley

February 9, 2013 - 7:50 pm

Data Innovation Day was held on Thursday, January 24, 2013. The purpose of Data Innovation Day is to raise awareness about the benefits and opportunities that come from increased use of information by individuals and the public and private sector. Events were held across the U.S. The following is my lecture at UC Berkeley on that day.


@Reichental is #3 on List of 50 Most Social CIOs on Twitter Worldwide

January 17, 2013 - 2:40 am

Posted in Huffington Post on January 16, 2013.

Click Here To Read



VIDEO: Early Lessons in Using Lean Methods in Palo Alto

January 2, 2013 - 6:43 pm

The city of Palo Alto, Calif., is stealing an idea from the commercial technology industry to improve services for its residents. In this video, city CIO Jonathan Reichental offers lessons learned from Palo Alto’s use of Lean Startup principles during several recent technology projects. The Lean Startup approach — which lets users test unfinished versions of new apps and websites — is routine in the commercial space. Now it’s catching on in government.


Why Every Public Agency Needs a Data Strategy

November 2, 2012 - 2:08 pm

In his second guest column for EfficientGov, Palo Alto CIO Jonathan Reichental looks at the Open Data movement, and the criticality of “open government” in the 21st century.

Despite the fact that we face an increasing scarcity of valuable resources, one area of growing abundance is data. From information-producing activities such as the global supply chain, to our own personal behaviors, the digital world is producing data on a mind-boggling scale. At a recent conference, Eric Schmidt, the chairman of Google—an organization that knows a thing or two about data—stated that every two days we are creating the same amount of information that we did from the dawn of civilization to 2003.

On this scale, we no longer refer to it simply as data; we call it big data.

As devices and behaviors produce increasing volumes of data, a new visibility is emerging. For example, look at the insight we get from Google searches that result in a better understanding of the spread of seasonal flu. We are able to see formerly hidden patterns and make more-informed decisions. Organizations know more and more about you. Privacy is quickly becoming irrevocably passé. Mass production turns into mass personalization. Data at our fingertips is changing the way we live.

We’re moving from a post-industrial economy to a data economy.

Understanding Big Data

Regardless of whether your agency has 10 or 10,000 people, it’s a safe bet that you’re producing and storing data; it’s the one area where there is no deficit and no future likelihood of one. If we consider data a valuable resource—which we should—then we’re all in a surplus position. That’s happy news for a sector so beset by negativity. Sadly, though, that surplus has not yet been broadly converted into value for the communities those agencies serve.

In short, governments are mostly sitting on an abundant resource, neglecting opportunities that could—if leveraged correctly—produce enormous benefits for their communities.

What is this government data that I’m talking about? On the federal government’s open data site, there are almost 400,000 datasets. These cover every type of subject one could imagine. For example, there is the visitor log for the White House; the register of all federal government contractors; and unemployment statistics. There’s data on energy, health, manufacturing, and education. And these are only the datasets that have been posted for easy consumption; there are many more that still need to be posted.

And this phenomenon is not restricted to the federal level. On the city and county data website for San Francisco, for example, there are local crime statistics, and the location of every movie made there since 1924. My own city, Palo Alto, posts a variety of data that includes details on all our trees—a most revered Palo Alto resource—and demographics. In addition, we recently posted five years of financial information, which is data that taxpayers care deeply about.

Realizing the Value

But what’s so novel about posting government data? Many will point out that we’ve been doing that since the first public websites arrived back in the 1990s.

There is truth in that statement; however, the current trend has a distinctive advantage to it. This data is being posted in a form that can be more easily used by Web and mobile applications. That means it’s more accessible, and this is no small point. It’s called Open Data. If the data is available for software engineers, data scientists, and other interested stakeholders, then all manner of new solutions can be built.
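
To make that point concrete, here is a minimal sketch of what machine-readable publication buys a developer. The records below are invented stand-ins for the JSON a city open-data portal might return (the field names are hypothetical); the point is that once data arrives in a structured form, a few lines of code can start answering questions about it.

```python
import json
from collections import Counter

# A stand-in for the JSON response an open-data portal might return for a
# city tree inventory. In practice this would come from an HTTP request;
# the records and field names here are invented for illustration.
api_response = json.loads("""
[
  {"species": "Coast Live Oak", "street": "Bryant St"},
  {"species": "London Plane",   "street": "Hamilton Ave"},
  {"species": "Coast Live Oak", "street": "Forest Ave"}
]
""")

def most_common_species(records):
    """Count species across records -- the kind of question an app can
    answer instantly once the data is published in a usable form."""
    counts = Counter(r["species"] for r in records)
    return counts.most_common(1)[0]

species, count = most_common_species(api_response)
print(species, count)  # -> Coast Live Oak 2
```

The same pattern scales from three records to three hundred thousand, which is exactly why posting data in a consumable form matters more than posting it at all.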

These solutions won’t get built by cash-strapped public agencies; rather, they will be created by the private sector, activists, residents, and other interested stakeholders. Already, citizens from across the nation are applying their skills to build useful applications such as apps for smartphones that have exceptional utility for communities (EfficientGov recently highlighted some of those apps and local efforts, including ours in Palo Alto, here). It’s a win-win: public agencies incur little or no cost, and the community receives the benefits.

Many communities host “hackathons” to promote their Open Data initiatives. These are events at which software developers focus on spinning up new applications—sometimes in a matter of hours—using a variety of datasets made available by the city. In Palo Alto earlier this year, we shut down a city block and 2,000 people turned up to build applications, create art, and network with one another.

We’re only at the very start of realizing the value of Open Data. One could easily imagine a time in the not-too-distant future when data is available to citizens at the moment of its creation. For example, an agency makes a payment for a product, and that transaction is immediately published and available to interested parties. Not only does real-time publishing create unprecedented transparency and accountability, it also makes the consuming applications vastly more useful.

I believe Open Data is foundational to building and enabling a digital city. Open Data drives the development of useful applications; it is a convener of public-private partnerships; and it is a prerequisite to open government. And even if your goal is simply to deliver your public agency’s services at lower cost and greater efficiency, Open Data is still foundational.

Making It Happen

I’m often asked if Open Data is purely a product of Silicon Valley and its technically proficient community: “Isn’t Open Data only within the reach of tech-savvy communities like Palo Alto?”

Absolutely not.

Believing that Open Data requires significant technical expertise could not be further from reality. The biggest hurdle to enabling Open Data is recognizing it as an important part of your agency’s future, and then acting on it. The focus should then be on the value of the data, not the volume of datasets.

There are many vendors ready to help any size agency, and the costs can be low enough for most to afford. In fact, with a little technical help—either from within your organization or by a willing volunteer in your community—there are Open Source solutions that can be deployed at negligible cost. Open Source is not the solution for everyone, but it’s certainly an option.

I’ll concede that this is a complex space, and any discussion here can only be superficial. While the dialogue is underway in some niche circles, I think it’s time for a broader national movement. We have to get the data topic on the table and start talking about how we can make it work for our citizens.

That’s my goal here: raising awareness to provoke you to learn more.

Let there be no doubt: Managing data and its value represent a core competency for both private enterprises and public agencies, from now and into the foreseeable future. Those that recognize this and assign priority to a data strategy will soon see benefits.

Are you ready to make data a priority?


Are Cities Prepared To Get Serious About Technology?

October 3, 2012 - 12:55 pm

This post first appeared on September 13, 2012 on EfficientGov as part of a guest column called Reichental’s Digital City.

Although attribution remains contested, Einstein apparently said that insanity is, “doing the same thing over and over again and expecting different results.” Regardless of who said it (and I’d never miss an opportunity to reference Einstein in my writings), the provisioning of technology in a city context today often has the hallmarks of insanity. With a painful consistency of missed project deadlines, cost overruns, and disappointing results, why is city technology still approached and delivered time and again in a manner that almost assures failure?

It’s a tough and sobering question to ask and a difficult one to solve too. It means looking deeply at leadership, at our tolerance for risk, and most of all—in my opinion—at a government’s capacity for innovation and internal, cultural reinvention.

As a traditional laggard in our economy and yet an obvious mechanism for helping to solve many of the most intractable issues that cities face, the delivery of municipal technology is postured perfectly for transformational opportunity. It’s time for city leaders to face this head-on.

It’s clear that many leaders are beginning to see the opportunity. There are now many openings for city chief information officers (CIOs), a great sign that there is recognition of the strategic value of technology in delivering government. I’d also like to see more of the larger cities begin to think about hiring chief technology officers (CTOs) too. Why? With a greater focus on community-facing automation, cities are increasingly in the business of delivering technology-driven products and services.

Assuming that the right candidates are being hired and positioned for success, these cities are taking one of the right first steps. But it’s only a start.

Regardless of whether new technology leadership is installed or existing management is given a new mandate, the sobering reality is that the entire value chain for technology enablement must be redefined. This is because, for decades, the model hasn’t changed. It’s stuck in an old world of poor incentives and traditional approaches to technology that have been superseded by new, innovative models: some already embraced in the private sector, others unique to the distinctive needs of the public sector.

Why Change Now?

Simply put: because it has to! Leaders in the private sector have embraced technology for moving their organizations forward. No longer relegated to a supporting role, the CIO is at the leadership table helping to develop and grow businesses by enabling new products and services. This is because marketplace forces require a new approach. Stand still and you face being trampled by the competition.

Similarly, cities face enormous market changes—for example, flat or declining revenue sources and increasing expenses. Stand still and a city will face compounding and devastating consequences. Thus, if city leaders still believe that technology is simply for providing email and a website (exaggerated, of course, to make my point), they are missing the clear opportunity for technology to reinvent the very manner in which city government is delivered.

Let me be fair though. Across our country there are great examples of cities doing amazing things with technology. They are pushing the boundaries of what is possible. It’s impressive and refreshing. Sadly, these cities still represent a minority. My call to action is for the thousands of other cities still nervously eyeing the future and failing to act.

Where Do You Start?

In my view, it always starts with recognition of the problem. If your city technology continues to fall below expectations, costs too much money to maintain, and isn’t able to keep up with the actual needs of the city, then you have an environment that is ready for reinvention.

Next, you need to take a look at leadership. Is your technology leader, CIO or IT director, skilled and empowered to shift the existing struggling paradigm? Do they have what it takes to change the game and embrace innovation such as that which is happening in the private sector?

Recognition of the problem and ensuring you have the right leadership are two important starting points. Unfortunately that’s when the hard work starts.

In America we don’t have a history of letting big problems deter our resolve to find solutions. On the contrary, it’s in our DNA to face these problems head on. Our cities are entering a crisis period—many are already there—and these problems need solutions. While these seemingly intractable challenges will take multidimensional approaches to solve, I’ll bet technology will be a significant contributor.

Now that we have a context for the problem and the opportunity, the important work of identifying and executing many of the new ways of delivering government technology must begin. In the coming weeks and months, I will be sharing with EfficientGov readers our experiences in Palo Alto, including lessons from the field, innovations being tested around the country, and simple steps that your city can take to transform itself into a “digital city.”

We can’t keep doing city technology the same way. That would be insane.


CBS News Radio Interview on Open Budget Sept 20, 2012

September 20, 2012 - 10:00 am


Palo Alto’s Open Data Platform: What Transparency Looks Like?

August 3, 2012 - 9:15 pm

Pete Peterson, Executive Director of the Davenport Institute for Public Engagement and Civic Leadership at Pepperdine’s School of Public Policy, writes about his discussion with Jonathan Reichental on city government innovation and, in particular, his observations on Palo Alto’s open data work so far.

Read here:


Interview with Mashable on Palo Alto Open Data Initiative

July 31, 2012 - 10:21 pm

Mashable spoke with Palo Alto’s Chief Information Officer Jonathan Reichental and City Manager James Keene, who are at the forefront of the city’s open data initiative, to learn more about the project.


How Palo Alto is leading the digital city movement [GovFresh Interview]

June 17, 2012 - 1:43 pm

Luke Fretwell, founder and editor of GovFresh, conducted an interview with me on the work we are doing at the City of Palo Alto in rethinking and reinventing the delivery of local government. In a wide ranging discussion we cover topics such as open data, hackathons, cultural change, and the importance of leadership support.

You can listen to the interview here: CLICK HERE.


Palo Alto Weekly Cover Story: Building a Digital City

March 30, 2012 - 12:34 pm

I am thrilled that our vision for Palo Alto as a leading digital city is a cover story today in the Palo Alto Weekly. The story does a great job of covering the highlights of our work over the past few months. We’re experimenting with new ways of delivering service in local government and it’s getting the attention of media, our community, and other cities. Mayor Yeh, City Manager Keene, and I couldn’t be more pleased with our progress. We’re ready to take this work to the next level. Links to story attached.

Virtual Version:


My TEDx Talk: How the Web is redefining privacy

October 15, 2011 - 3:10 pm

Over the past 10 years, the web has become an increasingly ubiquitous and useful utility for hundreds of millions of people across the world. But as I have discovered while rambling across the rich terrain of the web, many of its benefits come at the cost of privacy. In the following short presentation, I wonder if the web has ultimately become our least private domain and whether, in fact, that may be a good thing.


Spoiler alert: The mouse dies. Touch and gesture take center stage

September 29, 2011 - 9:00 am

The moment that sealed the future of human-computer interaction (HCI) for me happened just a few months ago. I was driving my car, carrying a few friends and their children. One child, an 8-year-old, pointed to the small LCD screen on the dashboard and asked me whether the settings were controlled by touching the screen. They were not. The settings were controlled by a rotary button nowhere near the screen. It was placed conveniently between the driver and passenger seats. An obvious location in a car built at the tail-end of an era when humans most frequently interacted with technology through physical switches and levers.

The screen could certainly have been one controlled by touch, and it is likely a safe bet that a newer model of my car has that very feature. However, what was more noteworthy was the fact that this child was assuming the settings could be changed simply by passing a finger over an icon on the screen. My epiphany: for this child’s generation, a rotary button was simply old school.

This child is growing up in an environment where people are increasingly interacting with devices by touching screens. Smartphones and tablets are certainly significant innovations in areas such as mobility and convenience. But these devices are also ushering in an era that shifts everyone’s expectations of how we engage in the use of technology. Children raised in a world where technology will be pervasive will touch surfaces, make gestures, or simply show up in order for systems to respond to their needs.

This means we must rethink how we build software, implement hardware, and design interfaces. If you are in any of the professions or businesses related to these activities, there are significant opportunities, challenges and retooling needs ahead.

It also means the days of the mouse are probably numbered. Long live the mouse.

The good old days of the mouse and keyboard

Probably like most of you, I have never formally learned to type, but I have been typing since I was very young, and I can pound out quite a few words per minute. I started on an electric typewriter that belonged to my dad. When my oldest brother brought home our first computer, a Commodore VIC-20, my transition was seamless. Within weeks, I was impressing relatives by writing small software programs that did little more than change the color of the screen or make a sound when the spacebar was pressed.

Later, my brother brought home the first Apple Macintosh. This blew me away. For the first time I could create pictures using a mouse and icons. I thought it was magical that I could click on an icon and then click on the canvas, hold the mouse button down, and pull downward and to the right to create a box shape.

Imagine my disappointment when I arrived in college and we began to learn a spreadsheet program using complex keyboard combinations.

Fortunately, when I joined the workforce, Microsoft Windows 3.1 was beginning to roll out in earnest.

The prospect of the demise of the mouse may be disturbing to many, not least of whom is me. To this day, even with my laptop, if I want to be the most productive, I will plug in a wireless mouse. It is how I work best. Or at least, it is currently the most effective way for me.

For most of us, we have grown up using a mouse and a keyboard to interact with computers. It has been this way for a long time, and we have probably assumed it would continue to be that way. However, while the keyboard probably has considerable life left in it, the mouse is likely dead.

Fortunately, while the trend suggests mouse extinction, we can momentarily relax, as it is not imminent.

But what about voice?

From science fiction to futurist projections, it has always been assumed that the future of human-computer interaction would largely be driven by using our voices. Movies over decades have reinforced this image, and it has seemed quite plausible. We were more likely to see a door open via voice rather than a wave. After all, it appears to be the most intuitive and requires the least amount of effort.

Today, voice recognition software has come a long way. For example, accuracy and performance when dictating to a computer are quite remarkable. If you have broken your arms, this can be a highly efficient way to get things done on a computer. But despite having some success and filling important niches, broad-based voice interaction has simply not prospered.

It may be that a world in which we control and communicate with technology via voice is yet to come, but my guess is that it will likely complement other forms of interaction instead of being the dominant method.

There are other ways we may interact, too, such as via eye-control and direct brain interaction, but these technologies remain largely in the lab, niche-based, or currently out of reach for general use.

The future of HCI belongs to touch and gesture

It is a joy to watch how people use their touch-enabled devices. Flicking through emails and songs seems so natural, as does expanding pictures by using an outward pinching gesture. Ever seen how quickly someone — particularly a child — intuitively gets the interface the first time they use touch? I have yet to meet someone who says they hate touch. Moreover, we are more likely to hear people say just how much they enjoy the ease of use. Touch (and multi-touch) has unleashed innovation and enabled completely new use cases for applications, utilities and gaming.

While not yet as pervasive, gesture-based computing (in the sense of computers interpreting body movements or emotions) is beginning to emerge in the mainstream. Anyone who has ever used Microsoft Kinect will be able to vouch for how compelling an experience it is. The technology responds adequately when we jump or duck. It recognizes us. It appears to have eyes, and gestures matter.

And let us not forget, too, that this is version 1.0.

The movie “Minority Report” teased us about a possible gesture-based future: the ability to manipulate images of objects in mid air, to pile documents in a virtual heap, and to cast aside less useful information. Today many of us can experience its early potential. Now imagine that technology embedded in the world around us.

The future isn’t what it used to be

My bet is that in a world of increasingly pervasive technology, humans will interact with devices via touch and gestures — whether they are in your home or car, the supermarket, your workplace, the gym, a cockpit, or carried on your person. When we see a screen with options, we will expect to control those options by touch. Where it makes sense, we will use a specific gesture to elicit a response from some device, such as (dare I say it) a robot! And, yes, at times we may even use voice. However, to me, voice in combination with other behaviors is more obvious than voice alone.

But this is not some vision of a distant future. In my view, the touch and gesture era is right ahead of us.

What you can do now

Many programmers and designers are responding to the unique needs of touch-enabled devices. They know, for example, that a paradigm of drop-down menus and double-clicks is probably the wrong set of conventions to use in this new world of swipes and pinches. After all, millions of people are already downloading millions of applications for their haptic-ready smartphones and tablets (and as the drumbeat of consumerization continues, they will also want their enterprise applications to work this way, too). But viewing the future through too narrow a lens would be an error. Touch and gesture-based computing forces us to rethink interactivity and technology design on a whole new scale.

How might you design a solution if you knew your users would exclusively interact with it via touch and gesture, and that it might also need to be accessed in a variety of contexts and on a multitude of form factors?

At a minimum, it will bring software developers even closer to graphical interface designers and vice versa. Sometimes the skillsets will blur, and often they will be one and the same.

If you are an IT leader, your mobile strategy will need to include how your applications must change to accommodate the new ways your users will interact with devices. You will also need to consider new talent to take on these new needs.

The need for great interface design will increase, and there will likely be job growth in this area. In addition, as our world becomes increasingly run by and dependent upon software, technology architects and engineers will remain in high demand.

Touch and gesture-based computing are yet more ways in which innovation does not let us rest. It keeps the pace of change, already on an accelerated trajectory, even more relentless. But the promise is the reward. New ways to engage with technology enable novel ways to use it to enhance our lives. Simplifying the interface opens up technology so it becomes even more accessible, lowering the complexity level and allowing more people to participate and benefit from its value.

Those who read my blog know that I believe we are in a golden age of technology and innovation. It is only going to get more interesting in the months and years ahead.

Are you ready? I know there’s a whole new generation that certainly is!


With IT leadership, the “how” is as important as the “what”

August 24, 2011 - 9:00 am

The other day my IT operations leader entered my office in a state of confusion. He had just been reviewing our uptime statistics and was baffled by what he saw.

In 2010, on one particular web stack, we had an uptime of 99.88% (which translates to about 10 hours of downtime over the year). But when he looked at our data for 2011 to date, we had 100% uptime. While clearly glowing over such a result, his confusion stemmed from the fact that we had not implemented any specific technology or fixes in this stack to garner such impressive results. He said: “I am very proud of these results. I just don’t know what we did to achieve them.”
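
As an aside, the parenthetical arithmetic is easy to check. A minimal sketch, assuming a full calendar year of 8,760 hours:

```python
# The arithmetic behind "99.88% uptime is roughly 10 hours of downtime":
# downtime = (1 - uptime_fraction) * hours_in_period.

HOURS_PER_YEAR = 365 * 24  # 8,760 hours in a non-leap year

def downtime_hours(uptime_percent, period_hours=HOURS_PER_YEAR):
    """Convert an uptime percentage into hours of downtime for the period."""
    return (1 - uptime_percent / 100) * period_hours

print(round(downtime_hours(99.88), 1))  # about 10.5 hours over a year
```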

In this instance he was asking the wrong question. It was not what we did. It was how we did it.

Doing things right

Our IT strategy is and will always be focused on doing the right things. Getting positive results is the bottom line. But while doing the right things is essential, it can be equally important to do things the right way.

It is my belief that a fleshed-out IT strategy reconciles predictability with innovation. It will seldom fly to have just one or the other. Both are required, and they must feed off each other.

The core challenge essential to implementing both is finding the right blend for your organization. I have written about it here. In the first year of our IT transformation much effort was expended on putting in place good process to support the right level of predictability. It’s a work in progress.

Getting the right level of process consistent with culture and organizational needs is a science unto itself.

The IT team made good progress in process areas such as IT governance, project management best practices, IT service management, business analysis and change management. It is in the latter that we gained particularly positive results.

Managing change

At its core, change management is about moving from one state to another to achieve a desired result while being adequately prepared and managing to an acceptable level of risk. It is also an important vehicle for communications between individuals and across teams.

Put bluntly, change management is good business.

Change happens all the time within IT. What contributes to the definition of a world-class IT organization is how that change is accomplished.

As O’Reilly IT entered 2011, we decided to be very deliberate about change. We agreed that we would be hyper-judicious in the infrastructure changes we made. We became priority junkies. Every time a change was identified we asked questions such as whether it was a priority, if there were alternatives, and studied the consequences of not making the change (see the change tool later in the blog).

And as we did that, something extraordinary happened.

The IT operations team started to get the most important projects deployed. Distractions became manageable and the priorities process kept everyone on track.

But most of all, we experienced increased infrastructure stabilization.

Of course, some stabilization occurred because the improvements that were made were applied through a rigorous change management process. But beyond that, there was greater stability because less unnecessary change was being made.

Change management was helping us make changes successfully and it was also helping us to determine what changes not to make.

Good process still gets insufficient focus

In IT, most of the time technology gets all the press. We get excited by new innovations and start-ups that introduce cool new capabilities. We are thrilled when a big player disrupts the market with something really compelling. And we should. We live in amazing times and new technology is a big part of that.

But what is often lost in the enterprise is that while technology represents a part of change—albeit a critical part—the processes used to implement and manage that technology are as important as, and often more important than, the technology itself.

I would guess we have all seen a great technology fail in an organization because of non-technology reasons. At the same time, I bet we have all seen how good technology coupled with good processes has resulted in excellent results.

When my IT operations leader observed great results without any new technology to explain them, he was failing to recognize how we were working. For all involved, it provided a rewarding “aha” moment.

Quick tool to manage change

I will conclude by sharing a brief tool that both IT and business can use for managing technology-related change. These are the minimum questions that must be asked for every change. They are simple questions, but all too often one or more is omitted when embarking on a change that expends scarce enterprise resources:

  1. Why [Governance]: Is the change aligned and essential to achieve business objectives?
  2. What [Measurable Outcome]: Is it understood whether it is a technology or process (or both) that will provide the desired result? Can the outcome be measured?
  3. Who [Resourcing]: Have the appropriate participants been identified for this change?
  4. How [Methodology]: What approaches have been identified to execute and manage this change?
  5. When [Prioritization]: Has sequencing been agreed to relative to all other objectives?

If both IT and the business are in agreement on the answers to each of these questions, you’ve just taken your IT management up a few notches. And you might just find a few people surprised by the positive results.
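The five questions above lend themselves to a lightweight checklist. Here is a minimal sketch in Python of how a team might record and gate a change request on them; the field names and helper functions are purely illustrative, not part of any formal methodology:

```python
from dataclasses import dataclass, fields
from typing import Optional

@dataclass
class ChangeRequest:
    """One proposed change, with an agreed answer recorded for each of the
    five questions. None means the question has not yet been answered."""
    why_governance: Optional[str] = None        # aligned to business objectives?
    what_outcome: Optional[str] = None          # measurable outcome defined?
    who_resourcing: Optional[str] = None        # participants identified?
    how_methodology: Optional[str] = None       # execution approach agreed?
    when_prioritization: Optional[str] = None   # sequencing agreed?

def unanswered(change: ChangeRequest) -> list:
    """Return the names of questions still lacking an agreed answer."""
    return [f.name for f in fields(change) if getattr(change, f.name) is None]

def ready_to_proceed(change: ChangeRequest) -> bool:
    """A change proceeds only when IT and the business agree on all five."""
    return not unanswered(change)
```

For example, a request with only the "why" and "what" answered would report `who_resourcing`, `how_methodology`, and `when_prioritization` as still open, making the omissions visible before scarce resources are spent.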


Will your business survive the digital revolution?

June 3, 2011 - 9:00 am

Over the last few years we’ve watched in giddy disbelief as a web-based social network launched from a dorm room at Harvard University unexpectedly found its way to be an enabler of a Middle East uprising. We’ve seen how new types of media have propelled people and events into the spotlight and even helped elect a U.S. president. We’ve looked in awe as mobile devices connected to a ubiquitous network have brought global commerce to the most remote parts of the developing world. We’ve seen 100-year-old businesses vanish as cocky upstarts replace their once unshaken dominance. We’ve delighted as citizens have been empowered by the new ease with which they can leverage recently liberated stores of data held by governments.

With just these few observations it’s clear to all of us that technology is no longer just in support of our lives and organizations; it’s taking a commanding and empowering position. And it’s vital that we all fully understand just how profound these changes truly are (and will be). The very survival of your organization likely depends on it.

Are we at the start or the end of this technology revolution?

Observing these incredible events unfold may lead us to believe we’ve reached a new pinnacle of technological innovation. Many might believe that we’re peaking in our capacity to make amazing things happen. To them I say: we’ve barely even started.

From economics to democracy, from health to entertainment, from retail to education, and everything else in-between, something remarkable is happening.

In my view the events described here are just the beginning of a seismic shift in our human experience. Indeed, these innovations are not reserved for a single nation or continent. This technology-based revolution is the first to quickly reach and impact every corner of the planet.

Every generation believes it lives through remarkable and changing times. And that is probably true. But the largest transformations, like the agricultural and industrial revolutions before us, don’t happen that often. These changes are a railroad switch that shifts the course of human destiny. Some have dubbed our era the information revolution. But the emergence of the information age has merely been the precursor and a glimmer of things to come.

The true revolution is the convergence of many things. Revolutions require more than just a few elements to be in place. Historically they have required a unique alignment of qualities such as economic and political conditions, readiness for change, demographics and a catalyst.

We see much of that today. Of course, today the catalyst is the Internet. It’s also the ease with which so many of us can now produce digital innovation (creating new value through electronic, non-analog means). It’s the availability of low-cost, ubiquitous global communication networks with an abundance of connected devices. It’s close-to-zero-cost cloud-based storage. Low-cost storage makes it easy to retain massive volumes of data, and when that is coupled with the many new opportunities to collect data, new uses and value can be derived from it.

There is a new world order unique to our time that is also enabling this change: not least the emergence of prosperity in many parts of the world and the breathtaking rise of the BRIC nations and others. This prosperity is creating a new class of educated, global participants. That means more competition, and it means more innovation. It’s all these things and more converging to produce a significant technology-based social and business disruption.

As this technology revolution unfolds, does your business have a survival plan?

The evidence is clear

The signals are in both the destruction of existing paradigms and in the creation of completely new ones. We’re watching entire industries disappear or be reinvented through digital transformation: newspapers, books, movies, music, travel agents, photography, telecommunication companies, healthcare, fund-raising, stock-trading, retail, real estate, and on and on.

Digital innovation has few geographic boundaries, so the disruptor can emerge from almost any place on earth.

Completely new models are emerging: location-based services, mobile apps, gamification, payment systems and new forms of payment, cloud computing, big data analysis and visualization, recommendation engines, near-field communications, real-time knowledge, tablets and other new form factors, augmented reality, gesture-based computing, personal medicine, large scale global social networks, microblogging and more. Many of these did not exist five years ago and many more will exist in the next five years. In fact, the next major disruptor is probably already underway. This kind of change is equally exciting and terrifying for organizations.

Why it is different this time

When the Walkman became the Discman, the music industry flourished. But when the digital MP3 player was introduced, the music industry was fundamentally and forever reinvented. Digital transformations are not subtle or calm. They are, in equal measure, painful, chaotic, and exciting.

When mobile phones were introduced they enabled people to untether themselves from a fixed wire and talk almost anywhere. That was useful and convenient. But when smartphones freely enable the coordination of people and events that facilitates the overthrow of a corrupt government, this is not business as usual. That’s a fundamental shift in how humans communicate and coordinate their activities.

It will be a rough ride

Sure, it won’t all be rosy, and bad people will do bad things with this technology. But that’s certainly not news. The vulnerabilities will grow, but so will our ability to fight attacks. Opportunities in security will remain plentiful.

There will also be booms, bubbles, and busts. That’s a normal part of the economic lifecycle. In fact, outside of the obvious pain it causes, a bust can be a valuable response to irrationality in the market. We will see many of these cycles through this transformation, but I believe we will net out with a continued exponential growth in digital innovation.

The big stuff is yet to come

When you observe how digitization causes significant economic restructuring and the emergence of completely new forms of business, and you factor in an entirely new level of social connectedness, it’s hard not to conclude that big things are ahead.

It’s also easy to be unfazed by the digital change underway, particularly if you’re working deep within it. It’s equally easy to become fatigued and even cynical about further change. But step back, rise above the chaos and noise, and the digital transformation reveals itself as a palpable societal disruption.

At the heart of this blog is not a regurgitation of change that many of us already recognize and embrace; rather, it’s an urging of each of us not to underestimate this transformational shift. It remains neutral on, while still recognizing, the social and economic negatives that may result. Big shifts like these do evoke, for example, strong feelings of nationalism (somewhat ironically). But I’ll steer away from that subject for now.

Failure to anticipate, prepare, and respond sufficiently is a significant organizational risk. In other words, delivering your product or service to the market of yesterday and today without constantly exploring reinvention for the market of tomorrow may well be business suicide. And while that has always been largely true, it has seldom been so necessary and urgent.

Once we recognize the magnitude of change that digital innovation is causing and may bring in the months and years ahead, it will help us to think bigger and to think in ways that may previously have seemed absurd.

As inventors and facilitators of the future, we would do ourselves a great disservice by underestimating the change.

The digital revolution: my own personal experiences

Let’s just take a quick look at my world for a moment. In many areas of my life it’s fascinating comparing how I did things in 2001 vs. how I do them now in 2011. By the way, it’s worth noting that while I immerse myself in technology and innovation through my work, I’m not particularly unique in the way I use technology outside of work.

So let’s take a look at some of the changes over the course of 10 years: I no longer wear a watch. No need, I get time from my smartphone. I got rid of my landline phone. My phone is my smartphone. I never go to a bank. Done online. I don’t know anyone’s phone number by heart. I select a name and my phone dials the number. Outside of a radius of a few miles, I don’t know how to get anywhere anymore without my GPS. I never use a map. I barely mail a letter. My use for stamps is diminishing. I seldom print anything. Everything that can be reserved, I do online. I don’t watch scheduled TV. I watch shows off my digital video recorder or computer when I want (in HD, no less). I use my smartphone for fewer and fewer voice calls. I text. I read, take classes, post photos, write, research, play, watch movies, listen to music, comparison shop, order insurance, complain and more all online.

I’m pretty sure your experiences are fairly similar.

Perhaps it is a little bit of an exaggeration, but I mostly only emailed and consumed static content online in 2001.

Almost every one of these areas represents an industry. And as a result of these enabled behavioral changes over the course of a mere 10 years, within these industries many organizations have been created and destroyed.

If this kind of transformation can happen in the past 10 years, with everything we know about how things are trending, what might our lives look like 10 years from now? While not necessarily a novel question, I’m simply suggesting each of us is being forced to think bigger and more innovatively than ever before about the realities and possibilities of the future.

So what should organizations do?

I’m confident most enlightened organizations have some form of a strategy in place. That’s good news. For those that don’t or are hesitant, it’s time to act. In either case, the following are just a few fundamentals worth considering:

  • Recognize the magnitude of the digital revolution in acceptance and in action.
  • Invest in understanding how your organization can anticipate and respond quickly to change.
  • Monitor and interpret trends and new technology entrants.
  • Audit your vulnerabilities and score progress and risk on a regular basis.
  • Prepare by taking greater risks.
  • Innovate as standard practice (this doesn’t just happen, you need a strategy).
  • Make bold changes in order to continue to succeed when disruption is a certainty.
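One of the bullets above suggests auditing your vulnerabilities and scoring progress and risk on a regular basis. As a minimal sketch of what such a scorecard might look like, here is a weighted-average risk score in Python; the categories, weights, and ratings are invented for illustration, not a prescribed framework:

```python
# Each category gets a 1-5 exposure rating at audit time; the weights
# reflect how much each category matters to this hypothetical business.
WEIGHTS = {
    "channel_disruption": 0.4,   # could a digital entrant bypass us?
    "legacy_dependence":  0.3,   # how tied are we to aging systems?
    "talent_readiness":   0.2,   # can our people execute new models?
    "data_capability":    0.1,   # can we collect and use our data?
}

def risk_score(ratings: dict) -> float:
    """Weighted average of 1-5 exposure ratings; higher means more exposed."""
    return sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS)

# One quarter's audit ratings (illustrative).
q1 = {"channel_disruption": 4, "legacy_dependence": 3,
      "talent_readiness": 2, "data_capability": 3}
print(round(risk_score(q1), 2))  # 3.2
```

The value of a sketch like this isn’t the number itself; it’s that scoring the same categories every quarter makes progress, or slippage, visible over time.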

Technology used to be the domain of a few. Now it’s the fabric woven into how we all live, work, and play. Today it has the power to create and destroy value in an unprecedented manner. That’s a big deal for every organization.

It’s likely a very big deal for you, too.


The Future of Technology and Its Impact on Work

May 17, 2011 - 2:19 am

Here’s a 40-minute presentation and interview I gave at the Center for Technology, Entertainment, and Media (CTEM) at the Fuqua School of Business at Duke University. The video covers a range of subjects including demographics and technology trends that will emerge over the next 5-10 years and what will be required to succeed in the workplace of the future.


Why the cloud may finally end the reign of the work computer

April 20, 2011 - 9:00 am

It’s been a debate within organizations as long as I can remember: whether it’s possible to support a workforce that has the choice to use their own computers to perform their work. Recently the discussion has reached new levels of excitement as some big name organizations have initiated pilot programs. For IT leaders it’s a prospect that’s both compelling and daunting.

Technology developments over the years have made software more hardware agnostic, such as the introduction of the web browser and Java. Personal computers have largely become commodity items and their reliability has significantly improved. Yet, despite these events, bringing your own computer (BYOC) to work has remained an elusive goal.

Why bring your own computer to work?

From an IT leader’s perspective, the reasons for supporting BYOC are pretty clear. In an environment where CEOs want more of the organization’s dollars assigned to value-creating investments and innovation, the ongoing cost of asset management continues to be an unfortunate overhead. From procurement and assignment to repairs and disposal, managing large numbers of personal computers represents a significant dollar amount on a CIO’s budget.

The second driver is the desire of employees to use the equipment they are most comfortable with to do their jobs. We know that for most, a personal computer is not simply a black box. From wallpaper to icon positions, a computer often represents an extension of the individual. If anyone needs more convincing, just try and pry an Apple computer away from its user and replace it with a Windows machine (and vice versa). People have preferences. Enterprise-provided computers are a reluctantly accepted reality.

Why can’t we bring our own computers to work?

With these compelling reasons and more supporting BYOC, why has it not happened? The first reason that comes to mind for most IT leaders is the nightmare of trying to support hardware from a myriad of vendors. It flies in the face of standardization, which largely helps to keep costs and complexity down. In addition, organizations have continued to build solutions that rely on specific software and hardware requirements and configurations. Finally, there is both a real and perceived loss of control that makes most security and risk professionals shudder.

With all that said, there are now some substantive reasons to believe BYOC may soon become a reality for many organizations.

Times they are a-changing

[Many of you can skip this brief history recap] When the web browser emerged in the 1990s, there was some optimism that it would herald the beginning of a world where software would largely become hardware agnostic. Many believed it would make the operating system (OS) largely irrelevant. Of course we know this didn’t happen, and software vendors continued to build OS-dependent solutions and organizations recommitted to large-scale, in-house ERP implementations that created vendor lock-ins. At the time, browser technology was inadequate, hosted enterprise applications were weak and often absent for many business functions, and broadband was expensive, inconsistent, and often unreliable across the U.S.

Skip forward and the situation is markedly different. Today we have robust browsers and supporting languages, reliable broadband, and enterprise-class applications that are delivered from hosted providers. It’s also not uncommon anymore for staff to use non-business provided, cloud-based consumer applications to perform their work.

Oh to be a start-up! If we could all redo our businesses today, we’d likely avoid building our own data centers and most of our applications. This is one of the promises of cloud computing. And while there will be considerable switching costs for existing organizations, the trend suggests a future where major business functions that are provided by technology will largely be non-competitive, on-demand utilities. In this future state it’s entirely possible that hardware independence will become a viable reality. With the application, data, business logic, and security all provisioned in the cloud, the computer really does simply become a portal to information and utility.

Smartphones are already a “bring your own computer” to work device

The smartphone demonstrates all the characteristics of the cloud-provisioned services I’ve discussed. In many organizations bringing your own smartphone to work is standard practice. Often the employee purchases the device, gets vendor support, and pays for the service themselves (a large number of organizations reimburse the service cost). It’s a model that may be emulated with personal computers. (That is, if smartphones don’t evolve to become the personal computer. That’s another possible outcome.)

I believe fully-embraced cloud computing makes BYOC entirely possible. There will continue to be resistance, and indeed there will be industries where security and control are so inflexible that BYOC will be difficult to attain. There will also be cultural issues. We’ll need to overcome the notion that providing a computer is an organizational responsibility. There was a time when most organizations provided sales-people with cars (some still do). Today we expect employees to provide and maintain their own cars, but we provide mileage reimbursement when a car is used for business purposes. Could there be a similar model for employees who use their own computers? Today, for BYOC, some enterprises simply provide a stipend. What works and what doesn’t will need to be figured out.

So what now?

So what are the takeaways from all of this? First, BYOC is a real likelihood for many organizations, and it’s time for IT leadership to grapple with the implications. Second, the emergence of cloud computing will have unanticipated downstream impacts in organizations, and strategies to address those issues will need to be created. Lastly, we’ve already entered into a slow and painful convergence between smartphones, personal computers, consumer applications and devices, and cloud computing. This needs to be reconciled appropriately for each industry and organization. And it has to happen sooner rather than later.

When the dust settles, the provision of computing services in the enterprise will be entirely different. IT leadership had better be prepared.


Why do so many enterprises still have difficulty implementing new technologies?

April 6, 2011 - 9:00 am

IT innovation abounds! We live in a spectacular time. Change appears to be happening rapidly. Market barriers for new entrants have come down. Got an idea? You can make it happen. But despite all the ebullience, much of our innovation still remains incremental. It’s more often evolutionary rather than revolutionary. In fact, that’s just the way it’s always been. New knowledge is created at its natural pace and new insights build upon it. Occasionally there is a ground shift and a new branch of knowledge emerges that itself spawns new products and services. In the IT business, we see this every few years.

Sure, we should give credit where it’s due. The IT industry is at the leading edge of innovation when compared with other industries.

I write here, not about the IT innovation that we see happening in businesses every day and not about the important incremental innovation that helps businesses move forward. I’m referring to breakthrough innovation — the kind of innovation that reinvents everyday things. Of course it happens eventually, but it takes a long time.

The reality is that organizations are only capable and willing to adopt technology at their pace. For many of us — suppliers, managers, and implementers of technology — it can be valuable to understand why this might be the case. No matter how much you fight it, the rate of technology adoption at enterprises is a throttle on the velocity of new innovation that can be introduced by technology providers. In my view, were this not the case, I imagine we may already have had pervasive teleportation and invisibility cloaks at our disposal.

Over my career working in and observing multiple enterprises, I’ve noted some consistent trends that explain their speed of technology adoption. It’s fair to say that there is a spread, but the majority in the bell curve move at a slow rate. Of course, there are always clear exceptions and we have to recognize those trailblazers, too. However, even the first movers are constrained by the majority. For example, with a social application: if your organization is the only one using it, its value may be considerably weakened by the absence of the network effect.

Below I briefly discuss five reasons why enterprises continue to have a slow adoption rate for innovative technology. I’ll admit there are no surprises in this list, and frankly they are quite obvious to most of us. However, I think they are worth calling attention to again, particularly since we are in an impressive period of IT innovation. I’ve also added my own thoughts on how they can be addressed.

1. Cost

Decision-makers have many choices when investing scarce dollars on IT projects. In most organizations it’s a prioritization process that nobody enjoys. But it’s essential. Many great ideas fall by the wayside and never see the light of day in favor of more pressing enterprise needs. In this context, broad implementation of new technology — not research and development efforts — can have real problems securing funds.

Additionally, a new solution is often more expensive because of the change that needs to happen. It’s a bigger proposition than an upgrade, an enhancement, or the roll-out of a commodity-type ERP. Other costs, such as risk and the implementation unknowns, can provide a disincentive to decision-makers already jaded by too many failed IT projects.

ADVICE: Despite these constraints, many enlightened organizations still commit funds to high-risk, new technology projects, often by using dollars set aside specifically for these special projects. Decide whether all projects should go through standard IT governance, or whether there should be an exception process triggered by technology that meets certain high-risk, high-uncertainty criteria.

2. Complexity

Today, fewer and fewer solutions remain islands within the IT infrastructure. There are often so many inter-dependencies that even a small change has downstream impacts that must be considered. Introducing new technology into these environments is seldom a trivial exercise. It’s also a reason why so many decision-makers prefer single-vendor stacks. Sure, standards have improved the situation immensely, but we’re still far from a time when customizations aren’t required or are at least confined to the administrative layer.

ADVICE: In many ways, this limitation is aligned closely with cost. Complexity becomes less of an issue if you’re prepared to invest in the effort. If possible, put some funds exclusively for this exploratory work in your budget. Think about investing in a lab environment where ideas can be safely explored. Prototypes are a great way to win decision-makers over.

3. Resistance

While both cost and complexity are largely quantitative inputs, there are also a number of human factors that greatly influence IT decisions.

There is an unfortunate twist to our period of hyper-innovation. While we embrace and support it — we love new toys — there’s a more sober component to new technology introduction that cannot be overlooked. It’s similar to that moment at a buffet when you know you’d like to try more, but you’re simply too full. Humans have a cap on the amount of new technology they are able to consume. Introduce too many new applications and they will be rejected.

This applies to system improvements too. As we’ve seen so publicly demonstrated as a consequence of Facebook changes, make too many far-reaching modifications and you risk a user rebellion. That’s a recipe for failure.

ADVICE: Every CIO needs to understand, for his or her organization, the pace at which new capabilities can be deployed. It’s probably a lot slower than we all think. Spend time to discuss different views with a variety of stakeholders. Analyze historical trends. Monitor usage as products get deployed. Over time it will become clear what the tipping point is. Your users will quickly let you know.

4. Legacy

We are wedded to the past. It has a lot to do with comfort and trust. We like the things we know more than things that are new and unknown. There’s a reason we go back to that tried and tested Excel formula when we know we have the same capability in the latest ERP system. There’s a reason we continue to use email for seeking answers when our organizations have spent millions on elaborate knowledge management systems. There’s clear value in legacy systems.

New technology often has to compete with these older solutions. For many of your users, you’ll need to pry them away from the old applications kicking and screaming. In some instances resistance will be so fierce, you’ll be forced to concede.

The net effect? Legacy systems present a limitation to the introduction of new technology. It may not happen at the time of deployment. It’s just as likely to happen at IT project governance when the decisions are being made on what projects to invest in. A debate may ensue that argues in favor of the legacy solution and that will kill the new technology before it ever sees the light of day.

ADVICE: Really focus on the business case. And reinforce it over and over. Make sure you have air-tight evidence for the return on investment (but recognize the need for a small number of leading-edge projects to move forward without all the evidence). Numbers talk, particularly dollars. Championing should come from many different leaders. Make a strong case, but ultimately respect the organization’s choice.

5. Politics

Oh, the joy of organizational politics! It should come as no surprise that politics plays an important role in IT decision-making. Sometimes it can be an asset. For example, escalating up a hierarchy to leverage leadership perspective can often be a good way of getting tough decisions made. But it can too often be a liability. For example, individual or team self-interest can result in vendor selections that don’t reflect evidence gained in requirements gathering.

Reconciling organizational and individual interests is a messy business. And it’s highly complex. I imagine many of us can tell our own stories of how we observed decisions being made that had little basis in reasonable logic. We’d like to pretend it isn’t a factor, but all too often it is.

Negative organizational politics can hinder IT innovation. There is considerable value in the skill of those who can navigate within those constraints and turn them into a positive outcome.

ADVICE: Organizational politics shouldn’t be viewed as always being negative. It’s important to recognize the role it plays in the process of introducing new technology and then work to channel it into a positive force. Find out who your allies are and partner with them to help make a business case. Observe, listen, and learn about your organization’s dynamics. Make note of what works and what doesn’t, and leverage that knowledge to navigate the organization’s politics. It’s not easy, but it’s an excellent skill for those who can master it.

These organizational constraints are presented not to suggest that new technology seldom gets introduced to the enterprise. Of course it does. The real effect of these limitations is that they slow the rate of introduction. I’m also suggesting that this slow rate, when compounded at a macroeconomic level, has a significant impact on the overall speed at which new technology innovation gets integrated into each of our lives.

I recognize that this blog doesn’t explore the impact of new innovation in the consumer space on the enterprise (this phenomenon has been called consumerization). That’s certainly an important input. While many of the limitations I’ve discussed are still relevant, there’s a valuable research effort to be done to fully understand what the impact of consumerization will be on enterprise IT innovation over the medium to long term.

Just in case you were wondering, invisibility cloak research and prototyping is well underway. Just use your favorite search engine to look up the subject. You might be as surprised and impressed as I was.


Process management blurs the line between IT and business

March 29, 2011 - 9:00 am

Business process management (BPM) and more specifically business process optimization (BPO) is about fully understanding existing business processes and then applying agreed-upon improved approaches to support market goals. Rather than exploring BPO from the viewpoint of the business, here I’ll briefly explore some of the motivations and benefits from an IT perspective.

Almost every business change has a technology impact

There are very few IT systems today that exist in isolation within an organization. Systems interact because they often require data from each other and they are interdependent in terms of sequential steps in a business and technology process. As a result, a change in one system invariably has a downstream impact on one or more other systems or processes. Often, the consequences of these changes are poorly understood by both IT and business stakeholders. Put another way: in interdependent complex systems and processes, there is seldom the notion of a small change.
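This point about interdependence can be made concrete: if you record which systems feed which, the full downstream impact of a supposedly small change is a simple graph traversal. The sketch below, with entirely hypothetical system names and links, shows one way to compute it:

```python
from collections import deque

# Directed edges: each system maps to the systems that consume its data.
# The systems and dependencies here are purely illustrative.
DOWNSTREAM = {
    "hr_system":      ["payroll", "directory"],
    "payroll":        ["general_ledger"],
    "directory":      ["email", "badge_access"],
    "general_ledger": [],
    "email":          [],
    "badge_access":   [],
}

def downstream_impact(changed: str, graph: dict) -> set:
    """Breadth-first traversal: every system transitively affected by a
    change to `changed` (not including `changed` itself)."""
    impacted = set()
    queue = deque(graph.get(changed, []))
    while queue:
        system = queue.popleft()
        if system not in impacted:
            impacted.add(system)
            queue.extend(graph.get(system, []))
    return impacted
```

In this toy map, a "small change" to `hr_system` touches all five other systems, which is exactly the conversation that should happen before the change is approved.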

Once both IT and business stakeholders recognize this, there is an opportunity to turn it into a highly positive outcome.

IT must be perpetual teachers and learners

As is the case in achieving many of the objectives of an IT strategy, it begins with communications. Every contact between IT and the business is an opportunity to teach and to learn. This is a reciprocal interaction. When I hear or read a sentence that begins, “Could you make a small change for me…” I know we’re already starting from a bad place. Unless the requester fully understands the internal complexity of all the interdependent systems and the potential impacts (which is rare), it’s presumptuous for him or her to estimate the scale of the change. Conversely, any IT person who minimizes the impact of a change without fully understanding the potential impact does a disservice in setting expectations that may not be met.

For IT requests, it’s best and safe to assume that a change will have impact, but the scale of that change will not be known until reasonable diligence is performed. That’s a much better starting point.

Let’s now assume that the change is not inconsequential. Two opportunities present themselves.

IT is an important business facilitator

First, stakeholders that are impacted by the change should be brought together to discuss the impact. I’m always surprised how these meetings reveal gaps in everyone’s understanding of business processes between departments. To me, this is where IT can shine as the connective tissue within an organization. More than ever, technology forces organizations to better understand and agree on processes — and that’s often well before the subject of supporting technology is even relevant to the conversation.

Use this opportunity to surface the entire process and for everyone to understand the impacts of any change. Improvements to the process very often emerge. IT has suddenly motivated business process optimization.

There is no such thing as too much process documentation

Second, assuming no documentation exists, this is the right time to map the process. If you’re like many organizations, your IT systems grew organically with little emphasis placed on business process design. My guess is that comprehensive, high-quality, current process documentation is uncommon. It’s never too late to start. If you have business stakeholders in a room discussing and agreeing on the current and future process, this is the time to document it. There is a burgeoning market for tools and support to help enable and simplify this work.

Ultimately, documented processes make it easier to build the right software and to make changes with less overhead activities in the future.

The essential roles of business analyst and solutions architect

It’s this emphasis on understanding and documenting business processes, and its attendant benefits, that supports the expanded roles of both the business analyst and the solutions architect. These two roles, staffed with the right amount of capacity for your organization’s demand, will be essential to succeeding with your IT strategy and growing the business. In many organizations, the business analyst for this work may or may not sit in IT, further blurring the lines between where IT responsibilities end and business responsibilities begin.

Perhaps it’s possible that in the not too distant future we’ll look at IT as part of the business and not as a separate entity in the manner it is today. It just might be the increased emphasis on business process management that acts as the catalyst.


It’s Time for IT to Ask More of the Right Questions

March 16, 2011 - 9:00 am

Today, the IT department is often a victim of its success. With technology increasingly at the center of business initiatives, there is an insatiable demand for services. And while most IT professionals come to work each day to be productive and add value, more often than not, it’s an uphill battle to keep internal customers happy. Working either harder or smarter hasn’t necessarily produced the customer satisfaction dividend anticipated. Moreover, it has served to increase expectations of what can be provided and it has continued to raise the bar for IT.

Typically, IT will deliver the right thing at the right time (as long as there is leadership support and good requirements), but it can be painful getting there. Internal customers will be happy to get their solution, but they might not be happy with the manner in which it was done. It’s a perception issue. IT is too often judged almost exclusively on how something was produced rather than on what was delivered.

Should IT be chasing kudos or trying to get the job done right?

In the service business, success is often measured by having happy customers. In the marketplace, happy customers are repeat customers. Organizations with internal service departments are not usually subject to these types of competitive pressures. Sure, cost must be managed; otherwise a service may be better performed outside the business. But even where cost is higher, organizations continue to enjoy the benefits and pay the premium of keeping many services internal. For example, they can exert maximum control and are not subject to continued contractual interpretations and disputes. That said, if you’re a captive cost center, quality customer service has to be driven by something else, such as culture, incentives, or vision. In other words: it’s a choice.

If an IT team is delivering quality services and products but still not meeting, say, the speed of service expected, that might be an acceptable trade-off. In other businesses, quality may suffer in favor of speed. In project management, there is a maxim known as the triple constraint: changing any one of speed, cost, and scope usually impacts the others. In service delivery, the triple constraint is often quality, speed, and customer satisfaction (underlying these is a fourth, inadequately addressed component: risk).

It’s a worthy goal to be both a world-class customer service provider and a producer of high quality products and services. It’s possible to manage the service triple constraint without too many trade-offs. But to be that organization requires an important operating principle: IT must rarely be the arbiter of priorities. That role must live squarely outside of IT.

Changing IT from an organization of “no” into an organization of “go”

I’ve seen it repeated throughout my 20-year IT career: internal customers come to the IT team with a need and it’s IT who says it can’t be done. Customers get frustrated and they have a poor view of the IT team. Usually they are saying “no” because of a capacity issue rather than a technical limitation. When IT says no to a customer, what they’re really saying is that something else is more important. That’s IT being an arbiter of priorities.

Yes, it goes back to IT governance, something I’ve discussed as being absolutely essential to business success.

But while IT governance can work as a process at the leadership level, it will fail when the IT team doesn’t have the understanding and the language of the process to support it as it manifests downstream.

When confronted with a priority decision, an IT staffer needs to move arbitration back to the business.

The staffer typically wants to know what to do, not whether they should do it.

Therefore, you must transition your staff from saying “no” to asking questions about priority and capacity. It certainly can be the case that more than one request has priority. If so, it’s now a question of investment. Spend more and you’ll get more resources.
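To make that shift concrete, here is a minimal sketch of what “asking about priority and capacity” might look like as an intake process. The request names, priorities, estimates, and capacity figure below are purely hypothetical, invented for illustration; they are not from this post.

```python
# Illustrative sketch: instead of IT saying "no", surface the trade-off
# so the business can arbitrate priorities. All names and numbers are
# hypothetical examples, not a real system.

def triage(requests, capacity_hours):
    """Accept requests in business-assigned priority order until capacity
    runs out; everything else goes back to the business for arbitration."""
    accepted, needs_arbitration = [], []
    remaining = capacity_hours
    for req in sorted(requests, key=lambda r: r["priority"]):
        if req["estimate_hours"] <= remaining:
            remaining -= req["estimate_hours"]
            accepted.append(req["name"])
        else:
            # Not "no" -- a question for the business: raise this
            # request's priority, add investment, or defer it?
            needs_arbitration.append(req["name"])
    return accepted, needs_arbitration

requests = [
    {"name": "CRM report",   "priority": 1, "estimate_hours": 40},
    {"name": "Portal tweak", "priority": 2, "estimate_hours": 80},
    {"name": "Data feed",    "priority": 3, "estimate_hours": 60},
]
print(triage(requests, capacity_hours=100))
```

The point of the sketch is the second branch: the overflow list doesn’t get a flat refusal from IT; it goes back to the business with a question about priority or investment.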

Bottom line: these decisions are made by the business, not by the IT staffer who’s just trying to do the right thing.

Internal end-users and IT may never have a love affair, but if roles are better defined and understood, all parties will be less frustrated, have greater empathy for where the other side is coming from, and customer satisfaction will be firmly focused on the quality of the product or service being provided.


Seldom a love story: IT and end users

March 7, 2011 - 9:00 am



3 essential skills for IT professionals

March 2, 2011 - 9:00 am

Whether you are preparing for a career in information technology (IT) or you are a seasoned professional, it’s important to know what skill needs are emerging in the marketplace. As I review the technology and business landscape, I’ve made some observations about what I believe will be increasingly valuable proficiencies to bring to the table.

Demand for certain skills, or an increased focus in specific areas, is being driven by forces such as the commoditization of IT, which is moving many countries toward more right-brained job economies; the data deluge, which presents considerable opportunity to understand business in completely new ways; and rigorous competition in the marketplace, which is forcing greater velocity in the generation of new service and product ideas.

Many roles within IT will continue to be valuable but may be more sensitive in the long run to the business landscape shifts we are experiencing. Rather than a decreasing need overall, I predict we’ll continue to see a greater role for IT in the future as well as IT skills being an important part of almost every information worker’s inventory of capabilities.

It’s also fair to point out that we’ll see new skills emerge that we can’t even imagine right now. For example, in the mid-1990s it would have been near impossible to predict skills in search engine optimization (SEO) or the whole range of IT careers that have spawned from social media.

The following three skill areas will find high demand in the marketplace either as standalone careers or in combination with other skills.

1. Coordination

In the context of IT, coordination is a skill set that provides guidance and oversight for the smooth interaction of multiple activities and their positive outcomes. It certainly includes project management, but it’s not limited to it. People who can bridge relationships between disparate participants (such as developers at an offshore location and testers at a local facility), accommodate cultural differences, advocate for collective success, and expedite answers to questions and concerns offer significant value.

IT coordination skills can live equally in the business, the IT organization, or a third-party provider. In a world where achieving results can often require the participation of a multitude of loosely related resources, effective coordination skills are paramount.

Acquiring great coordination proficiency certainly comes with experience, but preparation should include focusing on negotiation skills and communications in general; problem solving techniques; understanding the fundamentals of project management; and acquiring time management and prioritization methods.

2. Analysis

Our digital world is creating mountains of new data. In fact, we are experiencing exponential growth in its volume. As an example, every two days now, we create as much information as we did from the dawn of civilization up until 2003. It’s both a challenge and an opportunity. The challenge is clearly making sense of it. The opportunity is using findings in the data for competitive advantage.
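A quick back-of-the-envelope calculation shows how that exponential growth compounds. The starting volume and doubling period below are purely illustrative, not figures from any study:

```python
# Compound-growth arithmetic for data volume. If stored data doubles
# every two years, a starting volume grows roughly a thousandfold over
# twenty years (2^10 = 1024). All numbers here are illustrative.

def projected_volume(start, years, doubling_period=2.0):
    """Volume after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

start = 1.0  # e.g., 1 exabyte today (hypothetical)
for horizon in (2, 10, 20):
    print(f"after {horizon:2d} years: {projected_volume(start, horizon):7.1f} EB")
# after  2 years:     2.0 EB
# after 10 years:    32.0 EB
# after 20 years:  1024.0 EB
```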

It’s becoming clear that large volumes of data can reveal new insights that were previously unknown. As examples, analysis performed on unstructured data scattered across the web can reveal sentiment on people and products. Examining the patterns within social network connections can tell us a lot about where authority resides.
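As a toy illustration of the second point, even a simple in-degree count over a follower graph hints at where authority resides. Real analyses use richer measures (PageRank-style scores, for instance), and the names and links below are invented for the example:

```python
# A toy illustration of mining a social graph for authority: the account
# that the most others link to has the highest in-degree. The names and
# follower relationships are hypothetical.
from collections import Counter

follows = [  # (follower, followed) pairs -- invented data
    ("ann", "dana"), ("bob", "dana"), ("carl", "dana"),
    ("dana", "ann"), ("bob", "ann"), ("carl", "bob"),
]

in_degree = Counter(followed for _, followed in follows)
authority, count = in_degree.most_common(1)[0]
print(authority, count)  # dana 3 -- dana is followed by three accounts
```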

It’s within this new context that we see demand for people with skills to identify and extract valuable data; perform extensive analysis on it; discover patterns and hidden secrets contained within; and make sense of it for decision-making purposes.

Acquiring these skills includes training in critical thinking, statistics, analysis tools, and presenting quality findings through writing and visualization.

3. Innovation

We’ve seen large parts of IT turn into commoditized products and services. As an example, email is not a competitive advantage and it’s largely dominated by one vendor. Whether you keep that capability and its attendant skills in-house is largely a cost and risk decision. Many organizations are reviewing their internal IT capabilities and concluding that, unless those capabilities create new value and a distinctive advantage, they simply remain a necessary cost center.

IT leaders are being tasked to reduce the cost-center component to a minimum while ramping up the competitive elements of technology. The C-suite is requiring the IT organization to commit the biggest percentage of its available capacity to partnership activities with the business in creating new opportunities. It’s this driver that is increasing the demand for innovation skills.

Innovation is the most abstract of the three skill areas in this blog as it is often the hardest to quantify. But it does include a wide range of skills that contribute to the conversion of ideas into net new business value. These include research, applied research, product evaluation and recommendations, problem solving, championing a new idea, and building a business case for investment that includes cost-benefit analysis.

As you consider your IT career, you might conclude that none of these skills are central to your interest. That’s okay, too. My view is that, should you choose another IT path, it’s still worth considering whether any of these three areas can complement your core interest. Whether you want to be or continue to be a programmer, business analyst, system administrator, or quality assurance analyst, adding one or more of the skills above can only add to your advantage.

We’re guaranteed that the needs of the IT jobs marketplace will continue to change, but if each of us is ready to acquire new skills, a career in IT will remain one of the most lucrative and exciting of the professions.


Mind-blowing, world-changing technology by the numbers

February 16, 2011 - 9:00 am


The impact of IT decisions on organizational culture

February 7, 2011 - 12:01 am

It’s said that with great power comes great responsibility. Among business functions, the IT group has disproportionate control over what can and can’t happen in an organization.

Let’s take instant messaging as an example. Assuming IT makes the final decision (which is often the case), enabling instant messaging both internally and with external parties can fundamentally change the way a business communicates. But prohibit instant messaging (which still happens today), and the IT organization is fundamentally dictating how communication will take place. That’s considerable power for a business function, and it must be managed carefully.

Should IT dictate how everyone works?

The obvious and most visible control the IT organization has is over product choices. If you’re reading this blog on a work computer, it’s likely you didn’t choose that model yourself. If your software is locked down, I’d guess you didn’t select the browser or the word processor or the email client. Fortunately, some organizations allow staff to download applications of their choice, but these are often the freebies, not the major business solutions. For those, there is an approval path and it usually leads back to the IT organization.

Given that the IT organization has chosen many of your basic information tools, it has already predetermined what you can and can’t do.

Advice: Non-IT staff should participate in your standards and product selection processes. Also, in addition to understanding what a technology will enable, it’s important to explore and articulate what it will limit.

Is IT enabling or killing ideas?

I’ve written often about the critical importance of IT governance. With all its significant benefits, the risk with IT governance is that it becomes the opposite of what is intended. The process by which a new idea becomes a great new product or solution can live or die with IT. That’s the power the IT organization has within a business. Executives know this, and it is often a source of considerable frustration.

IT is often characterized as an enabler. While that’s mostly accurate, to what degree have we acknowledged that, through our decisions, IT can also be a limiter? And more importantly, how do we manage that risk?

Advice: Don’t lock your IT governance down. It must be a process that quickly changes as constraints are identified. Stick to the principles but constantly look for ways to reduce friction.

Does your IT organization move fast enough?

Does the IT organization move as fast as the business? Ideally yes, but we all know that the sobering reality is that IT, largely due to its popularity as an enabler (read: demand for IT almost always exceeds supply of capacity), is often a bottleneck. I’ve seen it too many times: a request won’t even make it to IT because there is a perception that it will never get done or at least not get done in a timely manner. If this is the perception in your organization, then IT is stifling innovation. In other words, IT is limiting the possibility of a favorable cultural quality.

Advice: Look for ways to manage and prioritize ideas outside of those deeply aligned with business strategy. It’s important to explore and continuously evolve your processes to be more agile.

Living with your IT decisions

I’ve touched on a few examples here, but it must be clear to you now that almost all IT decisions today can have lasting negative implications (we already assume that we understand the benefits of our decisions). Choices like an ERP suite or architectural decisions around how data is stored and shared can have profound impacts on the ability for stakeholders to make timely business decisions. In many cases, and particularly in our new business environment, these IT decisions can make or break a business.

Smart organizations foster the culture they want. They make deliberate decisions to encourage and discourage certain behaviors. Today, business brand and culture are often intertwined and those that get it right consistently win in the marketplace.

While I believe we recognize the limiting qualities of IT decisions, I’d suggest we’ve insufficiently studied the degree to which those decisions in aggregate can have a large influence on organizational culture.

It might be time to better understand the relationship between the culture of your organization and the IT decisions that are being made.


Mobile in the enterprise changes everything

January 26, 2011 - 9:00 am

By now it’s clear that mobile is the new global frontier for computing. People on every continent are embracing mobile as the primary method for electronic communications.

Increasingly, many are also using mobile computing for a myriad of day-to-day activities, from purchasing products and services to testing blood glucose levels. Five billion people now use cellphones — about 62 percent of the planet’s population — compared to less than two billion who have a personal computer.

Within just a few years more people will access the Internet from a mobile device than from any other technology.

In developing nations the cellphone is a tool of empowerment; it has the power to change economic and political landscapes. In developed nations it is disrupting existing business models and introducing completely new ones. Mobile is enabling us to reinvent everything from healthcare to payment systems.

Smartphones, a subset of the mobile market at a little less than one billion users and growing quickly, have become a domain of hyper-innovation. Fierce competition among big players such as Apple, Google, Microsoft, and Research in Motion (RIM) is driving the rapid delivery of new, innovative capabilities. The lifespan of a new device or version of an operating system is being compressed. The market appetite continues to grow and there are no signs of fatigue. In fact, the accompanying mobile applications industry is booming. In just over two years, consumers have downloaded 10 billion apps from Apple alone.

Suddenly the PC looks like yesterday, while mobile is today and tomorrow.

It’s time to act

For an IT leader, mobile is a game-changer. Unlike many other emerging technologies where an immediate strategy is not a concern, mobile is front and center now to your users and customers. This requires new thinking with regard to how data is accessed and presented, how applications are architected, what kind of technical talent is brought on board, and how companies can meet the increasingly high expectations of users.

As if there wasn’t enough pressure already!

There are two audiences for mobile applications and capability: your internal audience and the customers you serve in the marketplace. Both have different needs, but both have expectations that have already been set. The benchmark is not your best internal web-based app; instead, it is the most recent best-in-class mobile app that any one of your users or customers has downloaded. While your internal users might be more forgiving of a less-than-optimal user experience, you’re pretty much guaranteed your external users won’t be.

Many IT leaders are not yet fully embracing mobile (I’m not talking about email and calendar access here — those are the essential basics). Part of the reluctance to fully come to terms with mobile is simply change fatigue. Remember, the move to the web is still in full swing and many organizations still struggle with core system integration issues. But another part of the issue is that the magnitude of the change ahead is not well understood. A mobile strategy is not the equivalent of making your web applications accessible via a mobile device. In the short-term, that may suffice for some (but barely). In the medium-term a mobile strategy means thinking completely differently about the user experience.

In the world of mobile, IT leaders and business stakeholders must consider how new capabilities such as geolocation, sensors, near field communications, cameras, voice, and touch can be integrated into functionality. It also means that core issues such as security, device form-factor, and limited screen real estate must be addressed.

Mobile crashed the party

For a time there was a positive trajectory in which the ubiquity of browsers on computers was making application development almost hardware-agnostic. This was a great story and it had a decent run. The proliferation of devices and operating systems in the mobile space is a considerable spoiler. Now, any mobile application worth its salt must have versions, at a minimum, for the web, for the iPhone, and for Android (and that’s before you consider others such as the BlackBerry, iPad, and Windows Phone). That may mean multiple development and design efforts.

In other words, just when IT leaders were beginning to see some platform stability, everything changed.

Not all industries will need to adopt mobile strategies at the same rate, and not all will have to provide solutions for their end-users in the near term. There is no question that if your product or service is business-to-consumer and you already support a good deal of your business via the web, this scenario demands an aggressive approach to mobile. While that offers some consolation for everyone else, the reprieve is merely temporary.

Every business and every IT leader will need to quickly find the right response to the momentum that is nothing less than a mobile revolution.

At the end of the day, those of us who work with technology do it because of these types of major disruptions. The move to mobile represents yet another technology cycle that we must embrace. These cycles often start and end in different places. Who could have imagined that the web would change so much about our world in the way it has? I think it’s fair to say that mobile has the capacity to change the world in ways we cannot even fathom today.

I don’t know about you, but that makes me excited about our industry and the future.


3 types of IT leaders: maverick, innovator, guarantor

January 17, 2011 - 9:00 am

There is little recognition that the operating profile of IT leaders can vastly differ from organization to organization. This is most pronounced when studying how technology vendors sell to this audience. It can often appear there is simply one type of person leading every IT organization. Variations in needs are seldom reflected in the way products are sold.

There is an array of independent inputs that determine the style of each leader. Take, for example, the industry in which the person works. The approach of a CIO who leads a B2B industrial-products business is going to be vastly different from that of one who runs an IT department at a university. Now also consider the culture of a business. It’s not possible to apply the same style to leading IT at a highly risk-tolerant, innovative tech company as to providing the essential needs of a conservative, low-risk-tolerance insurance giant.

For many of you, this might sound obvious. But why then do marketers, analysts, consultants, and so many pundits (I’m probably guilty here too) so often sell to this community like it’s one dimensional?

I don’t mean to generalize too much. We should certainly recognize the brilliant job so many salespeople do. Rather, the advice in this post is for the group of salespeople who could benefit from thinking differently about the diversity among IT leaders.

The following guidance can also be used by recruiters when thinking about filling IT leadership roles. In this instance, it can be asked: do the characteristics of the organization align to the skills, experiences, and personality of the person being hired?

Finally, if you work with or for an IT leader, it might help you in thinking about how to manage the relationship in a positive way.

Here I present my vastly condensed categorization schema for the IT leader:

1. The maverick

This IT leader works for an organization that thrives on taking risks. You’re likely to see lower levels of vendor standardization; this IT leader likes to try lots of different products and the organization’s broad portfolio of hardware and software reflects that.

The maverick IT leader is likely to have a higher level of comfort with open source and with quickly adopting less mature technologies. The background of this IT leader is likely technology-based and he/she has extensive IT knowledge.

The environment requires this person to move fast. Sitting on long, protracted RFP submission proposals, for example, will not go over well, nor likely be a common approach. Speed and agility are popular qualities with this IT leader, but there is a trade-off with standardization, repeatable processes, and predictability. Often this person succeeds with the sheer brute force of determination. But this benefit can often come at a price.

Advice: When working with this IT leader, be conscious of his/her low patience and weaker long-term commitment to any one direction.

2. The diligent innovator

This IT leader operates in an enlightened organization. He/she understands that IT innovation can bring considerable benefits, but this leader doesn’t necessarily make a first-mover play.

In this organization, occasional managed risk is supported with the caveat that homework is done and a back-out strategy exists. This IT leader is often asked to be agile in responding to needs while also being encouraged to push back on requests that don’t align with business objectives or may disproportionately introduce unnecessary complexity. It’s often a hard place to operate because the pull to take greater risks must be balanced with diligent decision-making. This can often result in a slower pace of activity, or in the worst case, in an impasse. The focus on diligence with underlying encouragement to innovate makes this a popular posture of IT leaders, but it can be the hardest of the IT leader categories to succeed in.

Advice: Be sure to provide this IT leader with plenty of assurances, good quality information, and support throughout any initiative.

3. The rock-steady guarantor

The ask of this IT leader is often the simplest: keep the essential systems running, don’t take too many risks, and keep the technologies moderately current. This person doesn’t need everyone to have the latest versions of software. They keep a close eye on new developments, but almost always take a late-majority approach to implementation.

While it sounds like this IT leader has it the easiest, that is the furthest thing from the truth. This person is being asked to keep everything working. Disruptions and surprises are not well received by management. Naturally, this makes the IT leader less agile, forces processes to be more bureaucratic, and makes change much harder to accomplish.

For most of history, this organizational profile has succeeded by being conservative and moving at glacial speed. The jury is out on whether this method is sustainable in today’s economic environment. The IT leader at the helm of this type of organization has considerable challenges ahead. He/she will see increased pressure to operate in a way that has been historically inconsistent with the risk profile of this type of business. A large number of CIOs fall into this category.

Advice: This IT leader requires a considerable volume of analysis to make decisions. Be sympathetic to rigorous approval paths, and prepare to support commitment to projects in the long-term.

I expect most IT leaders will have styles that overlap among all three categories, but it is highly likely that the predominant characteristics live in one of them. Of course, I’m really interested to hear from anyone who thinks they know an IT leader who doesn’t belong in any of these categories.

A short blog post can never do justice to an important discussion. I’ve left out a lot here, such as budget control and who the CIO reports to. But what I’m trying to do is raise awareness and provoke a dialogue. There isn’t a one-IT-leader-fits-all model. IT leaders are fundamentally different based on the organizations they lead.

Knowing and considering the subtle and not-so-subtle differences with each IT leader will help marketers better reach and resonate with them. It will help anyone who works with the leader to have more successful interactions and outcomes. Ultimately, it will be better for the IT leader and the organization.


Why is IT governance so difficult to implement?

January 11, 2011 - 9:00 am

“Governance? Process? Yuck, nasty words!”

Those are the actual opening words I used in an introductory email to business leaders whom I was inviting to participate in an important new process. What I was introducing had the potential to become antagonistic bureaucracy and could seriously backfire. So I was treading carefully but knew I had to move forward to enable our desired IT transformation and sustainable, effective ongoing operations.

But let’s step back and understand the origin of my motivation for this new process.

As an IT leader, do these statements sound familiar? Demand for IT capacity far exceeds the ability for the IT team to deliver. Everyone who makes a request considers theirs to be a priority and wants it done as soon as possible. There’s considerable frustration from requesters that their work is not getting addressed or has been put on-hold to address other priorities. The IT team is highly stressed, lacks direction, and morale is low.

The good news is that IT remains a highly valuable and in-demand business resource. The bad news is that all too often these statements reflect the state of IT operations for many businesses. And while it can continue like this, the costs are all too well known. IT becomes the bottleneck to business growth (yikes!) and effective operations and nobody is happy: not the IT team and not the business.

If you want to transform IT and fix these all-too-common issues, a new approach must be adopted. The trouble is that by introducing the methods to fix these problems, in the short-term it can be tough to win buy-in. In effect, you have to introduce a modicum of bureaucracy which will often arouse aversion by business leaders. Getting past those first few months and demonstrating success will help you turn the corner and transform the way IT is delivered. It’s not easy.

So what is this process?

What I’m describing here is IT governance. In simple terms, IT governance is a process that ensures that IT capacity is working on the right things at the right time to enable business goals. It’s a set of controls that focus on organizational success while managing associated risks. Sounds simple, right? The devil is in the details! While nobody could argue that any process that aligns IT to business goals is the right strategy, it’s the change required and the compromises on the part of business leaders that can derail this most worthy of efforts.

Why is IT governance so difficult to implement?

Business leaders want to do the right thing. They want the business to succeed and they will work hard to make that happen. But all too often, they are motivated and rewarded by having their small part of the organization succeed. IT governance requires that the scarce resource of technology capacity be diligently distributed across the organization for overall business success. In other words, it requires that IT cannot be allocated on the basis of individual team needs but rather on collective, organizational goals. (Of course, we recognize that a small percentage of IT budget should be set aside for specific team needs and in many organizations each team gets a dedicated amount of cash for that very reason).

How does IT governance work?

IT governance works like this: all technology investment requests are brought to a central authority–let’s call it the Governance Review Board (GRB)–and the merit of every request is debated (and/or processed through the lens of strategic alignment) and a decision is arrived upon. Membership of the board is made up of senior members of the organization that represent every function.
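As a rough illustration of how a board like this might compare competing requests, here is a minimal sketch in Python. The request names, the 1-to-5 scoring scale, and the alignment-per-dollar heuristic are all invented for illustration; a real GRB weighs many more factors, such as risk, dependencies, and timing.

```python
# Hypothetical sketch of how a Governance Review Board might rank
# competing IT investment requests. Names, scores, and the ranking
# heuristic are illustrative assumptions, not a prescribed method.
from dataclasses import dataclass

@dataclass
class Request:
    name: str
    strategic_alignment: int  # 1 (low) to 5 (high) fit with business goals
    estimated_cost: float     # estimated cost in thousands of dollars

def prioritize(requests):
    # Favor requests that deliver the most strategic alignment per dollar.
    return sorted(
        requests,
        key=lambda r: r.strategic_alignment / r.estimated_cost,
        reverse=True,
    )

backlog = [
    Request("CRM upgrade", strategic_alignment=5, estimated_cost=120.0),
    Request("Team wiki", strategic_alignment=2, estimated_cost=10.0),
    Request("ERP reporting module", strategic_alignment=4, estimated_cost=200.0),
]

for r in prioritize(backlog):
    print(r.name)
```

The point of even a toy model like this is that every request is scored against organizational goals, not against the requesting team's enthusiasm.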

What is the core value that ensures IT governance will work? The ability to compromise.

If participants are focused on the success of the entire business, compromise becomes easier. Those not used to this type of approach will initially be frustrated, and that’s why the first few months are essential. You have to demonstrate that this is a better way to manage your scarce IT resources. If it works well, it solves most of the issues described earlier in my post. Seriously!

I’m passionate about IT governance because I see the enormous value it has for every type of business. If IT governance doesn’t succeed, it will radically hinder your ability to move your enterprise forward. That’s no overstatement.

Is IT governance optional?

There are many ways to implement IT governance, but the principles remain the same. While we can debate the method of implementation, I’ll go to bat to suggest that we cannot debate the essential value of IT governance. Regardless of the size of your business, some form of IT governance must be part of your organizational processes.

Bureaucratic? Sure. Essential? Definitely.


Can good IT managers make great business leaders?

January 8, 2011 - 2:20 pm

Recently, many people have been pointing out to me how my writings and the attendant observations on technology have been resonating outside of the IT community. Specifically I’m told, the subjects I’m writing on have meaning and value as general business management content. As I pondered on this notion, it struck me (and it’s obvious in hindsight) that in a world where technology is a fundamental foundation of almost all business, there’s not a great deal of difference between the skills required for good IT management and that of general management.

However, as true as that might be, as I further considered the thought, I concluded that good IT management doesn’t necessarily equate to great business leadership. We hear it all the time: today, CIOs and IT leaders must be able to partner with other members of the C-suite and in addition to running the operations of IT, be able to grow the business through IT enablement.

After all, the CIO is first and foremost a business leader.

Here’s the ask: the CEO wants more value from IT, the COO wants optimized operations, and the CFO wants it all at the least possible cost.

This requires the CIO to understand business and — surprise — have some form of general business background. That being recognized, the most common path to IT leadership is still through the IT organization, and that means the CIO’s strength may be of a technical nature with a nuanced flavor of management. That can often present a problem.

It’s important to recognize that to run an IT project or to manage a team of IT developers requires good management techniques. But all too often, IT professionals exist and operate in a vacuum resulting in a variation of management absent of inputs such as market forces. In other words, the typical IT manager, for example, may never be exposed to a P&L statement.

This is not by intention, but comes about as a result of how almost every IT organization operates. Largely shielded from the real work of the business, IT has both the convenience and the limitation of working with internal sponsors who are captive customers with no choice of supplier. That couldn’t be more different from the experience of leaders who were groomed in and work within the open marketplace. Put another way, acquiring management skills within the IT organization may result in a myopic view of general management.

The best IT managers I’ve seen have a background of both IT and general management. Many IT managers do not get to work in a non-IT environment. But the IT managers that do best are often those that have had more business exposure than their peers. Take that as a tip for any aspiring IT manager.

I’m not suggesting for a moment that a great IT leader doesn’t need a technical background or a good understanding of technology. That’s necessary and expected. After all, one assumes that the reason for wanting to be an IT leader stems from a passion for technology often reflected in a life of mild obsessiveness with geekdom. What I am suggesting is that a technical background with IT management skills may not be enough to cut it as a great business leader.

To succeed, an IT leader must learn to talk in the language of business. For example, cloud computing is about potential cost reduction and new business opportunities, not some abstract technology term that introduces a suite of complex new service models. The latter has a place at your IT team meetings, but will do little to invoke attention in the board room.

A great IT leader is also a salesperson who takes an idea and inspires the audience. He or she must drive emotional commitment from a team and sell a vision that people can buy. That killer combination of communicating IT innovation in business terms, understanding the numbers, and eliciting belief from the C-suite can form the backbone of highly effective business leadership.

Without these skills the CIO can often be relegated to order-taker and maintenance guy.

Being a good IT manager is hard. Being a great business leader is harder. What separates them is not just the ability to continually and uniquely inspire, but first to be a really well-informed and skilled business manager. Get the basics right, learn the business, understand the financial aspects, think big picture, talk the talk, inspire through your values, and then deliver. Hit many of these on the head and you just might shine as a business leader in the C-suite.


The 2010 technology of the year is …

December 21, 2010 - 9:00 am

While Facebook and the iPad garnered considerable attention this year — and rightly so — it is the free micro-blogging service Twitter that gets my 2010 accolade for the most important technology product of the year.

Now with more than 175 million subscribers, an estimated dollar value that is double that of the New York Times, and 25 billion tweets this year alone, Twitter is becoming a formidable disrupter in multiple domains, including media and the enterprise.

In June of this year, responding to several of my friends and colleagues who were simply confounded by the merit of Twitter, I posted my first blog on the topic. Looking back now, six months later, I see that even I significantly underestimated the value of the service.

Why Twitter?

Twitter finally meets the two essential criteria for business success:

1. Is there a viable revenue model?

To that I say a resounding yes! This year, Twitter began the rollout of their suite of promotion features. A form of advertising, Twitter promotions call out sponsored hashtags and help to serve up associated tweets. As Evan Williams (Twitter co-founder) pointed out at the Web 2.0 Summit in November, a considerable challenge right now is managing the excessive demand by brands to have their products and services promoted. He also pointed out that there are many more ways to monetize Twitter that are in the works.

2. Does the service have sustainable utility for its users?

Once again, Twitter has proven this to be the case over and over again. I’ll spend the remainder of this post exploring this point.

Twitter as a communications tool

There are few websites or TV commercials now that don’t adorn themselves with the Facebook and Twitter logos. These services are quickly becoming the new destinations or originating points for people interested in learning more about products and services. Twitter, with its small footprint and timeliness advantage, has the ability to uniquely reach and drive sales to a global audience. For a broader set of marketers such as politicians, governments, entertainers, charities, media outlets, and non-governmental agencies, the service provides a new and valuable channel to spread a message.

I personally use Twitter to communicate my ideas and to highlight items of interest to my followers. I also enjoy reading tweets from those I follow that are both informative and entertaining (side note: like many of you, I’ve completely dropped the use of RSS for pushed content as a result of Twitter). It’s also a knowledge discovery tool for me (more on that later below).

The usage of Twitter during the Iranian presidential protests in 2009 hints at the promise of a frictionless channel that rides above the limits of traditional communication tools.

Twitter as a disrupter of existing media

If you’ve had the chance to play with Flipboard for the iPad, it’s clear to see that pulling in a Twitter stream illuminates the real-time zeitgeist in ways never possible before. It presents person-specific interests and provides options for content, such as video, that can be explored further if desired.

Too often we take an existing media and simply present the same content in a different digital context. Great innovation uses digitization for reinvention. For example, we shouldn’t simply bring TV to the Internet; it should be different and use the unique capabilities of digitization to make it even more compelling. In Twitter, for example, the ability to serve up news in small chunks from a plethora of pundits results in the reinvention of news distribution. That’s neat.

Twitter as a competitor to Facebook and Google

The September facelift of Twitter on the web, which included inline video and photographs, was suggestive of what may lie ahead. Rather than being limited to basic micro-blogging capability, the revised Twitter is a compelling place to share media and send and receive direct messages. Improved mobile accessibility and usability extend these capabilities beyond the desktop, too.

Twitter has become a destination to discover and find things. Some of that is by push (e.g. you follow a link someone shared), but increasingly it offers benefits in pull (e.g. you do a search for something). While the demise of Google search is not imminent, Twitter is a search paradigm disrupter that can’t be ignored.

Twitter is natively a social network. It easily connects people and interests. Once again, while not a Facebook killer yet, a few additional features would align it against the core value propositions of Facebook, potentially in a decidedly more compelling manner.

One can easily deduce why both Google and Facebook have been vying to acquire Twitter.

Of course it’s not all perfect. Twitter has a lot of work to do. They continue to have service outage issues when utilization spikes. A symptom of success no doubt, but an excuse that is long past its free-pass status. In the same interview cited earlier, Evan Williams spoke about the need — which they are working on — to have more meaningful or relevant tweets somehow rise above other less valuable content. One survey found that 40 percent of tweets are “pointless babble.” That’s a lot of noise if you’re trying to get real value from the service.

Fundamentally Twitter is important because it takes traditional concepts such as marketing and messaging and forces us to rethink them. Its API enables powerful data analysis of trends and discovery of patterns. It has spawned an ecosystem of more than 300,000 integrated apps that extend its capabilities. It’s even sparked a healthy number of copycats, both in the consumer space (e.g. Plurk) and in the enterprise (e.g. Yammer and Socialcast).
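To make the data-analysis point concrete, here is a minimal sketch of the kind of trend analysis a stream of tweets enables: counting hashtag frequency across a batch of messages. The sample tweets and the function name are invented for illustration; a real pipeline would pull live data from Twitter's API rather than a hard-coded list.

```python
# Illustrative sketch only: the tweets below are made up, and a real
# trend pipeline would fetch data from the Twitter API.
from collections import Counter

def hashtag_counts(tweets):
    """Count hashtag occurrences across a batch of tweet texts."""
    tags = []
    for text in tweets:
        tags.extend(word.lower() for word in text.split() if word.startswith("#"))
    return Counter(tags)

sample = [
    "Loving the new #cloud strategy",
    "#cloud adoption keeps accelerating",
    "Is #socialmedia overhyped?",
]

print(hashtag_counts(sample).most_common(1))  # → [('#cloud', 2)]
```

Even this toy version hints at why the firehose of tweets is valuable: aggregate enough of them and previously invisible patterns of interest surface almost for free.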

I recognize Twitter as my 2010 technology product of the year for many of the reasons above, but specifically it is because of its potential. If the company makes a few smart decisions over the next few months and beyond, Twitter has the power to be profoundly important in many areas of our lives.


My top 5 predictions for CIOs in 2011

December 15, 2010 - 9:00 am

We are living in amazing times. Technology is changing the way we work and play at a considerable pace and there is no letup in sight. Rather, the change we anticipate ahead will be greater and more profound than anything that has come before. If you, like me, are lucky enough to be part of implementing that change then you’ll likely agree that we are extra fortunate.

To me, being a CIO in the early part of the 21st century couldn’t be further from being in “just a job.” If you’re doing it right and having fun while you’re doing it, you and your team can be inventors of the future. And that’s really important and interesting work.

As we look to 2011, the to-do list and choices for CIOs are getting longer and more complex. The pace of change is adding a level of uncertainty that doesn’t make any specific path clear. Knowing this, as most of us do, is not particularly helpful. But that’s not the point to focus on: the enlightened CIO must help go after the most valuable projects and be a trusted adviser to those who commit dollars to organizational goals.

It’s in this context that I present my top 5 predictions for CIOs in 2011. I’ve pondered whether they should be characterized as predictions. Regardless of what we call them, these areas will be featured on most CIO agendas in the year ahead. Think of them as unavoidable big ticket items that will consume considerable discussion and may be deserving of a deliberate strategy.

1. Cloud computing enters the mainstream

Okay, so one doesn’t need to be a soothsayer to know that cloud computing is at a point of inflection. Emerging from a period of hype and niche investment, cloud computing is positioning as a transformative and central technology in the arsenal of enablers of value.

Worthy of particular note, with mobile increasingly at the center of our computing future, a strategy for the mobile cloud will be an essential subset of this space.

I’ve said it before, if the CIO is not driving the agenda on cloud in 2011, there are many in the C-suite who will be. This is because cloud computing provides solutions for reducing cost, simplifying and optimizing infrastructure, and shifting the role of the CIO from back-office manager to enabler of business opportunity.

The risk is no longer the cloud. The risk is not having the cloud as a priority in your strategy.

2. Real business intelligence

I have a term for business intelligence that I prefer and I believe conveys a more urgent sense of its value: I call it unleashing data. Somewhere on some system in your organization lie answers and patterns in data that could be worth millions of dollars. In an era where we create more data every two days than was created from the start of recorded history to 2003 (apparently that’s about five exabytes of data), to say data is underutilized is a gross understatement.

Now, more than ever, we have tools to mine organizational data — whether structured or unstructured — and unleash its enormous value. What strikes me about business intelligence is that the CIO doesn’t have to create anything new; it’s about using what already exists.

3. The cost and value of technology

A notable manifestation of our recession recovery is the absence of rigorous business investment. Put another way, businesses have been shell-shocked into hoarding their profits at the cost of spending on necessary technology maintenance and new systems. Rather, the modus operandi is conservative spending and trying to get more technology value with less cost. CIOs are feeling it.

The year ahead will likely continue this trend as the economy remains unstable and uncertain. It’s not the end of the world for CIOs, but it does mean that more work must be applied to developing watertight business cases and to increasing the innovative use of technology. For many CIOs, this trend will demand business skills they may find challenging. Break open that old college business textbook. You might need it.

4. Integrating social into the enterprise

While I don’t think that integrating social computing deep into existing systems will hit an inflection point in 2011, nonetheless I believe this will be the year where the subject gets increasing attention both in the CIO discourse and in the emergence of new supporting technology.

The business advantages of social capabilities such as internal crowdsourcing, collaborative virtual spaces, video-on-the-desktop, social network analysis, creating serendipity, and consensus building are being gradually proven out on an ad hoc basis. The future will demand that a deliberate and rigorous plan be applied to it. The time to begin strategizing on a path forward begins now.

5. Temporary staffing

If you’re an IT contractor, 2011 will likely continue to be a good year for you. Closely aligned with prediction No. 3, CIOs are increasingly reluctant to fill openings with full-time employees. Loath to risk further layoffs in the future, they continue to be highly conservative about growing the ranks. Market confidence will need to be restored before we see a sizeable shift to full-time employee hiring in the IT sector.

As a result, CIOs will be managing more hybrid-staffed organizations. These organizations will comprise full-time employees, contractors, and outsourced services. While not radically different from many IT organizations today, what makes 2011 different is the uncertainty around the extent and duration of the contractor requirements. Will it be permanent? What effects will it have on institutional knowledge, loyalty, and existing staff?

You may agree or disagree with my predictions and you may believe I left something big out. I’m confident that’s true. So I’d like to hear from you. Add your comment below if you think there is another prediction that every CIO must be aware of for 2011.

As I did in 2010, I’ll revisit these late in 2011 and assess how they fared as the top ticket items for the CIO during the year.


Taking stock of my 2010 tech predictions

December 8, 2010 - 9:00 am

Writing predictions is always fun because if you get one right it makes you look like you have extraordinary psychic skills. And if you get it wrong, well, so what? How could you really know?

On a more serious note, publishing predictions can be valuable content if they are formed by using unique insights to aggregate qualitative and quantitative observations, such as interviews with industry leaders or analyzing trend data.

In January 2010, I published a list of my technology predictions for the year ahead. I’ve been approached by many of you to revisit those predictions and state a verdict on each. While I hit a few right on, I certainly over-estimated the rate of change. In addition, like many of us who make predictions, we’re often caught off guard by the introduction of an unanticipated disruptor (read: iPad!). But that’s what makes our industry fun. Making technology predictions is somewhat of an oxymoron: innovation is by nature unpredictable.

So here are my top 10 technology predictions for 2010 and the assessment of how things actually turned out:

1. Software-as-a-Service (SaaS)

Prediction: 2010 will be a big year for providers of software as a service (SaaS). The obvious big names in this space will release new offerings to compete with popular desktop applications.

Verdict: This was a slam dunk, although admittedly one of the easier predictions. Gartner predicts that the SaaS market will have grown an outstanding 14 percent in 2010.

2. Netbooks

Prediction: This popular form-factor will have outstanding sales and may even surpass laptop sales by year-end. Given its remarkably low cost, we will likely see more offerings that make netbooks available for free.

Verdict: Wrong. Netbook sales in 2010 took a nose-dive. After outstanding sales in 2009, interest waned and the market crashed. Some blamed the introduction of the iPad, although many have argued that the audiences aren’t necessarily the same. However, there is some agreement that the introduction of the iPad and the tablet format in general did impact the overall sales of portable computers as a broader category.

3. Cloud services

Prediction: An obvious growth area in 2010; we will see new and expanded services from all the usual suspects. Expect major announcements from large businesses and government agencies choosing to move some of their core applications and data to the cloud.

Verdict: Correct. An easy call. Major adoption and spending continues. This week alone the General Services Administration announced it is moving all 15,000 of its employees to Gmail and Google Apps. The continued expanded use of the cloud is anticipated to grow in leaps and bounds in the years ahead.

4. Mobile money

Prediction: By late 2010, paying for products and services via a mobile device such as a cellphone will begin to emerge in the mainstream U.S. Multiple flavors will be available including custom applications and text messaging.

Verdict: While we see progress in this area, I was too optimistic on where we would be by the end of the year. There is no doubt that there is considerable innovation taking place, but we are still a distance from seeing broad adoption of mobile money in the U.S. I do continue to believe strongly that this will be a major area of growth in the years ahead.

5. Free software

Prediction: If current trends continue, it’s quite possible that all software will be available in some form for free, but 2010 will be the first year that this trend reaches a point of inflection. A combination of enterprise-class open source, freemium, freeware, ad-supported, and alternate revenue-model software will have a lasting and destructive impact on the notion of license-paid software.

Verdict: Mixed. The big winner in 2010 was open source. We continue to see high adoption rates in the enterprise from products such as WordPress and Ubuntu (implementation and maintenance costs assumed). Worth watching are those organizations that are permitting staff to bring their own home computers to work that in effect offloads enterprise software costs to the employee.

6. Harvesting the social graph and Web-Squared

Prediction: 2010 will see the introduction of the first widely available and easily usable products for better understanding the mass of unstructured data being accumulated across public and private clouds. The emergence of intelligent solutions to interpret massive related and un-related data in order to create forecasts and identify trends will help people make more sense of the world and see previously hidden signals.

Verdict: Some momentum building here, but no critical mass. A notable player, Hadoop has seen considerable growth over the year. But easy-to-use, mass-adopted applications are still a distance away.

7. More video

Prediction: Continued investment in video infrastructures will see greater use in work and non-work environments. It will be more common (but still not ubiquitous) to have video conversations with colleagues and external parties such as customers and suppliers.

Verdict: Notwithstanding the fascinating war-of-words this year between Adobe and Apple on the strategy to support video on devices, this was an outstanding year for continued broad adoption of video. According to a survey conducted by LifeSize, 51 percent of mid-sized enterprises polled use video conferencing and the majority of those that don’t are planning to use it.

8. Green IT

Prediction: This may be an inconsistent area of investment as continued tight budgets and more immediate costs (e.g. migration to updated operating systems) distract from major green initiatives. However, going into 2011 and beyond, broad adoption of virtualization and further movement toward hosting in the cloud may help organizations lower their data center carbon footprint.

Verdict: Mixed. As I predicted, the focus on tightening costs for many CIOs forced some distraction from this subject, although greater use of cloud computing, broader recognition and pursuit of LEED certification, and better system efficiencies in general remained consistent themes.

9. Mobile Location-based services (LBS) and augmented reality (AR)

Prediction: Expect to see an extraordinary number of start-ups and existing technology companies offering mobile LBS-related services. Proximity-based solutions will become more common. Mobile devices will begin to offer compelling overlay data for the real world that help people with existing and new activities. Lots of noise and confusion will ensue as both consumers and providers try to figure out acceptable services. For example: How will people respond when they stroll through a mall and are bombarded with text messages from different retail stores?

Verdict: Correct. Both LBS and AR continue to be exciting and innovative technologies that are growing in leaps and bounds. The exceptional successes of both FourSquare and Gowalla and the use of places in both Facebook and Twitter are indicative of consumer interest.

10. Social spaghetti integration

Prediction: More social features will begin to show up in enterprise resource planning (ERP) apps. New and increased support for ERP solutions that, for example, integrate social networking, will see a further blurring of the lines between work and non-work applications and activities.

Verdict: Mixed. With two major standouts, Salesforce Chatter and Facebook integration, this space is seeing growing interest but a considerable lack of coordination and direction. At the Web 2.0 Summit in San Francisco in November, Mark Zuckerberg, CEO of Facebook, went to great lengths to suggest that social is coming to all applications over the next few years. This will be a fascinating area to watch and participate in.


Cloud computing’s fear factor: acknowledge, reduce, move on

December 1, 2010 - 9:00 am

In my 20 years providing IT services to organizations, I’ve never seen a technology that is so equally transformational and feared as cloud computing. I am hard pressed to find anything comparable in the past, bar perhaps the Internet itself, which has the power to positively re-engineer the manner in which technology supports organizational goals. But perhaps because of a combination of issues, such as negative pundit messaging and well-founded suspicion about wide-scale technology and organizational readiness, cloud computing appears to be the most feared of the big technology innovations.

There continues to be plenty of disagreement on a definition of cloud computing. There’s no doubt that for some it means something quite conservative, such as low-cost data storage at an external provider, and for others it’s as grand as the enablement of completely new business opportunities. Any time the definition of a domain is so broadly spread, you know it’s in the very early phase of its maturity. In time, I anticipate we’ll get sub-categories that will help to clarify the space.

To me, cloud computing today means organizations have the opportunity to redefine what and how technology value is provided internally versus what can be sourced externally. It’s an existential question, and that’s why it’s so incredibly important. Do it right and CIOs have the ability to transform IT from a back-office provider to a real business partner. Since that’s clearly where the C-suite wants IT to play, the value of cloud couldn’t be any more self-evident.

But why then, if cloud computing is so potentially important and valuable, is it so feared?

Given its existential property, perhaps IT leaders feel obligated to a self-preservation reaction. While this may be true in some instances, it’s clearly temporal. In this case resistance is futile. If you’re not proactively addressing a cloud computing strategy, my bet is that in the not-too-distant future the CFO or COO will force you to be reactive.

Much of the fear is based on issues that are warranted. Clearly we must recognize the relative immaturity of some of the technologies involved (but also acknowledge that limitations today go away over time). If you’re going to move a major business function such as email or a productivity suite into the cloud, you need up-time guarantees and confidence in being able to easily liberate your data should you decide to move vendors. You need recourse if your cloud provider goes under. That’s not a trivial issue. As more providers emerge, more will fail (sadly that’s the marketplace), and so the risk to an organizational function is greater.

You also need to be aware of and mitigate your security concerns. It's possible the security risk is over-stated. Most of us do personal online banking don't we? And aren't huge components of our infrastructure such as energy, financial markets, and the military already large consumers of the cloud? (Little consolation, I agree, when there is a breach — but a fact on the ground you can't deny). I argue that in the short-term these issues are about deliberate and diligent organizational planning and in the long-term it's simply about normal business continuity design. When something innovative becomes widely adopted, it just becomes business as normal.

These fears are sensible concerns, but some of the anxiety about cloud computing is wrongly perpetuated. Even amid intense innovation from start-ups to big established technology providers, with all its attendant marketing, strong messaging persists that cloud is still only for limited use. Most often we hear that the cloud is a place where you can test applications or rent temporary storage. It's time for us to move beyond that message. The evidence couldn't be any stronger: major organizations are successfully moving large functions to the cloud, and analysis shows that IT budgets will commit increasing amounts to implementation over the next five years.

What’s the point here? What I’m trying to say is that there are legitimate reasons to approach cloud computing with care and planning, but we should be aware that many of us are being too conservative by continuing to be consumed by irrational fear. Until some of our legitimate fears are addressed and we can also overcome our misconceptions, the great promise of cloud computing to our organizations will be limited.

The cloud computing fear factor can’t be overstated. It’s time to acknowledge it, reduce it, and move on.


The four pillars of O’Reilly’s IT strategy

November 23, 2010 - 9:00 am

This week I delivered the detailed framework for a 3-year IT strategy for O'Reilly Media, Inc. The strategy is the culmination of several months' work to fully understand the current state of the business and the vision for its future. Together with the goals for growth, the strategy focuses on many of today's common IT requirements: delivering more for less, increasing agility, providing greater access to decision-enabling data, and improving customer service. It also directly addresses stress points in the existing technology environment and forms the basis for the IT organizational design required to support future business goals.

As I wrote about in a previous blog, it was essential that the context of this strategy consider O’Reilly’s culture of innovation while introducing the right level of predictability. Too much of either unmanaged innovation or codified predictability could limit our ability to grow and, in my view, be a recipe for IT failure.

While there is considerable depth and breadth to the strategy, I will share, in simplified form, the four core concepts on which it is built. Each is essential to move us forward. I'm not giving away any secrets here, as these are all fundamental concepts. But it does achieve my objectives of being highly transparent in our thinking and of providing ideas to others.

The four pillars of our IT strategy are:

1. Governance

IT governance is all about making smart choices in allocating scarce technology resources and being accountable for the resulting performance of those decisions. These choices include those that consider cost, risk, and strategic alignment. While governance almost always exists in some form — i.e. without it being explicit, somehow decisions get made — maturity and predictability of process will really only be achieved by clearly understood and agreed upon governance processes. We’ll focus on the right quotient of governance, lest it stifle and suffocate the things we do really well.

2. Architecture

As systems become increasingly interdependent and a small change in one application can have significant downstream impacts, it’s no longer possible to take a narrow, single-system view of solution development. New requests must be handled with an end-to-end process mindset. Introducing new capability will now require an architectural perspective that considers qualities such as reuse, standards, sustainability, and data use. Over the medium- to long-term, smart architecture can lead to higher-quality solutions and reduced overall costs.

3. Strategic sourcing

Contrary to popular belief, strategic sourcing does not automatically equate to either staff reductions or outsourcing. Sure, unfortunately for many organizations this is the way it has manifested, but for many others it is about creating flexibility in identifying and temporarily acquiring talent from wherever it can be provided when it is needed. That talent may be internal, for example: Is there someone outside the IT department but within the business who can help with a project on a temporary basis? But it could also mean quickly finding a scarce development resource in Argentina. We’ll use strategic sourcing as a supplemental talent management approach to developing and supporting technology solutions.

4. Hybrid cloud

Historically, many organizations, including O’Reilly Media, built and hosted their own IT solutions. There’s often good reason to do this, particularly those systems that use proprietary innovation and are essential for market differentiation. Outside of this category, IT has become increasingly commoditized — i.e. basic services offer no competitive advantage but are essential to core business functions (think of email or file storage as examples). Utilizing more commodity-based IT products and services enables the IT organization to elevate its value proposition: to work on the most complex business problems and be a true enabler of business growth. At O’Reilly Media we’ll continue to build out our internal cloud infrastructure and pursue more external cloud capability and software-as-a-service solutions.

We’re under no illusion that making significant progress in all four of these areas will be easy. There’s a level of change management that will challenge us in new ways. But we’ll gauge the pace as we progress and make corrections as necessary. To me, inherent to our strategy is the capacity for flexibility. It’s not possible to get everything right, but it is essential to quickly correct when things go wrong.

I’ll continue to report on our progress and I welcome your feedback.


5 cloud computing conundrums

November 3, 2010 - 9:00 am

With all the attention being paid to both public and private cloud computing these days, it would be easy to believe that it offers a panacea for the woes of every CIO. If only! The reality of designing and implementing a cloud strategy, particularly the public component, is far more complex than any technology vendor or analyst paper would have you believe. Faced with an array of trade-offs, public cloud computing is creating considerable challenges for CIOs and their teams.

Like every new technology paradigm that has come before, cloud computing presents both clear advantages and near-term limitations that need to be addressed. (I deliberately say near-term, as IT innovation has a neat way of figuring stuff out eventually; sadly, not always when you need it, and certainly not for the benefit of early adopters.)

With the C-suite continuing to apply pressure to get more value from IT and reduce cost, moving technology services into an externally hosted environment or subscribing to an online business solution can be a quick and convenient win. But can a strategy like this be applied successfully in a repeatable fashion without significant trade-offs?

While every business needs to consider public cloud computing in the context of its own needs and risk profile, I’ve identified a sample of puzzles that most CIOs will likely need to address. There are many others of course, but these should be sufficiently provocative.

Puzzle #1: Create flexibility by being less flexible

Moving capability to the cloud can provide clear advantages such as storage elasticity (the ability to increase or decrease needs as necessary and only pay for the amount used) and pay-per-feature options. But these flexibilities may come at the price of vendor lock-in and limiting feature sets. Will this compromise be acceptable? Difficulty level: Medium.
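The elasticity trade-off described above can be made concrete with a toy cost model. The sketch below is purely illustrative; the rates, capacity figure, and monthly usage numbers are all invented to show how pay-per-use can undercut fixed provisioning when demand fluctuates.

```python
# Hypothetical comparison of fixed provisioned storage vs. elastic
# pay-per-use storage over a year of fluctuating demand.
# All rates and usage figures are invented for illustration.

PROVISIONED_CAPACITY_GB = 500   # fixed capacity sized for peak demand
PROVISIONED_RATE = 0.08         # $/GB/month for owned capacity
ELASTIC_RATE = 0.10             # $/GB/month, billed only on actual use

monthly_usage_gb = [120, 150, 180, 400, 460, 300,
                    220, 200, 250, 310, 480, 350]

# Fixed model: you pay for full capacity every month, used or not.
fixed_cost = PROVISIONED_CAPACITY_GB * PROVISIONED_RATE * len(monthly_usage_gb)

# Elastic model: you pay only for what you consumed each month.
elastic_cost = sum(gb * ELASTIC_RATE for gb in monthly_usage_gb)

print(f"Fixed provisioning:  ${fixed_cost:,.2f}")
print(f"Elastic pay-per-use: ${elastic_cost:,.2f}")
```

Even with a higher per-gigabyte rate, the elastic model wins here because capacity sized for peak sits idle most months; the lock-in question is whether that saving survives a forced migration later.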

Puzzle #2: Determine the cost of an existing IT solution

Whether an IT service should remain internal or be hosted in the cloud requires a level of cost accounting (the true costs of labor, utilities, backups, disaster recovery etc.), which is seldom applied to the cost of running a technology service. This puzzle requires the CIO to understand and allocate the appropriate costs for each service being considered for the cloud. Hint: Don’t forget to include opportunity costs. Difficulty level: High.
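The cost-accounting exercise Puzzle #2 describes amounts to rolling up the fully loaded annual cost of one service. A minimal sketch, with every line item and dollar figure hypothetical:

```python
# Rolling up the true annual cost of one internally hosted service.
# All line items and figures below are hypothetical.

def annual_service_cost(line_items: dict) -> float:
    """Sum the fully loaded annual cost of an internal service."""
    return sum(line_items.values())

email_service = {
    "labor": 95_000,             # admin time allocated to this service
    "hardware_amortization": 18_000,
    "software_licenses": 22_000,
    "utilities_and_space": 9_000,
    "backups_and_dr": 14_000,    # backups and disaster recovery
    "opportunity_cost": 30_000,  # staff time diverted from new projects
}

total = annual_service_cost(email_service)
per_user = total / 400           # hypothetical 400 mailboxes
print(f"Total: ${total:,}  Per user per year: ${per_user:,.2f}")
```

The per-user figure is what you would set against a SaaS vendor's per-seat price; omitting a line item like opportunity cost is exactly how internal services end up looking artificially cheap.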

Puzzle #3: Simplify the environment by introducing more complexity

Move a complex business process to a software-as-a-service (SaaS) provider and you immediately eliminate the complexity of developing, managing, and hosting the solution internally. However, move lots of processes to a variety of providers and you may introduce challenges in getting these applications to interface with each other. You also provide a considerably less unified experience to the user. While standard APIs ease the flow of data, supporting disparate vendor solutions adds a new level of complexity. Difficulty level: Medium.

Puzzle #4: Provide assurances of sustainability in a domain of uncertainty

Public cloud solutions remain largely nascent and unproven over the long term. With the benefits so compelling, it can be hard to resist moving forward with what may appear to be a great fit. With little ability to ensure that the solution will be available in the long-term, the challenge is to receive and provide assurances to already skeptical stakeholders. Difficulty level: High.

Puzzle #5: Maintain security while reducing it

Providing a secure computing environment is the priority of every CIO. With threats increasing and becoming ever more elaborate, this is a space with little room for error or oversight. By moving services to the cloud, you may essentially be outsourcing your security. Difficulty level: High.

One could assume from this posting that I’m not supportive of the movement to the public cloud. But nothing could be further from the truth. The opportunities such as lower cost, increased agility, and new business possibilities are obvious and compelling.  Given what’s at stake, a deliberate and diligent approach is absolutely essential. It’s clearly not all or nothing; migrate only what makes sense. Since moving services to the public cloud is often a unidirectional process (they’re unlikely to move back in-house without significant cost and serious disruption) it’s important to avoid buyer’s remorse.

If you’ve solved some of these puzzles I’d love to hear how you did it and any trade-offs you had to make. I’m also interested in other conundrums that cloud computing presents. Tweet to @reichental.


IT transformations must begin with hearts and minds

October 11, 2010 - 9:00 am

The role of the information technology (IT) department is changing. In simpler times it was the bastion of back-office services like data storage, network operations, and ERP systems. Today, both its purpose and the demands placed upon it are quickly evolving. Driven largely by economics, the IT function is outsourcing many of its commodity-type activities; looking for ways to rein in out-of-control support costs; and being asked to be more central in helping to enable new business opportunities. Simply put: the C-suite is demanding more value on its IT spend.

For many IT departments, moving from a largely back-office role to being an enabler of business growth requires nothing less than an IT transformation. This can often translate to painful, but essential change in the way IT is sourced, organized, and operated. But more importantly, it is about shifting the mix of IT dollars spent away from maintenance and into new investment. A successful IT transformation should result in 60 percent or more of all IT spend being available for new projects that can be directly tied to business growth.

Getting there is not easy.

Many IT leaders tasked with this directive leap deep into the strategy by quickly shifting priorities, shutting down projects, and using sheer brute force to change the dynamics. This approach can work, but it will come with a price.

Like all change, and given its particularly complex nature, an IT transformation must be managed in a deliberate and multidimensional manner. Sure, the heavy lifting is essential, but it should not be the first thing that gets done. This kind of radical change must start with the CIO and his or her managers engaging in collaborative discussions concurrently across the business and with the IT team. As the impact of the change will be experienced by almost everyone, setting expectations and getting as many people as possible bought into the strategy at the outset is essential. An IT transformation will be tough, but it will go smoother and will be better understood and accepted when leadership has won hearts and minds.

As CIO of O'Reilly Media, I'm leading our own IT transformation. Our desire to get more done, more quickly, and to continue to be at the leading edge of innovation in the business areas in which we compete requires nothing less than a significant shift in how we execute our IT function. We'll keep doing the things we do well, but we will take a careful look at everything else.

Over the next few months, I'll be blogging candidly about our experiences: both what is working and where we are being challenged. I want you, the O'Reilly Media community, to be part of the conversation in this change. We've started to work on hearts and minds, and that also means we've got to do a lot of listening. So go ahead, tell us what you think.


Foursquare is a smart game changer

June 21, 2010 - 11:27 pm

Deciding to share one's location using a mobile device is not a new phenomenon.  Several geolocation applications, meaning those that use location awareness as their core function, have served this market for some time. For the most part, the value here is that of forced serendipity.  For example: if you're my friend and you decide to share your location and I am in the area, perhaps I can drop by and have a chat. In many cities this is how groups of friends are assembling. Not through a process of phone calls and lengthy coordination (how old school!); no, today for many it's about meeting up by displaying your location and hoping you are discovered, or by discovering the location of others. Clearly it's not for everyone and, consequently, to date it has had a relatively niche following.

While many vendors have entered this market and have had some limited success (measured through usage and not revenue, mind you) a new entrant to this space has provided an unexpected boost and it may just be their unique features that propel geolocation applications into the pseudo-mainstream.

Recently Foursquare, a geolocation startup, entered the market with the same basic two features available from the incumbents: the ability to declare one's location (called checking in) and to view the location of others. But here's the twist. Foursquare has made it a game. And it's strangely addictive. Declaring one's location can result in two types of prizes. First, for specific behaviors you get badges (it feels a little like the Boy Scouts, and I think that isn't a coincidence). These behaviors include checking in to a location late at night or declaring your location in two different cities on the same day.  The badges show up on a website and are visible to others, including through their mobile devices.  The more badges you collect, the more bragging rights you get (like I said, it's not for everyone). The second type of prize relates to the frequency of checking in to the same location. The person who has checked in the most to that location becomes its mayor.  Generally there are no privileges to that designation other than the fact that you are the mayor.

With me so far?

I don’t want to dwell on whether there is any merit to what I’ve described here so far.  I’ll let you be the judge of that.  But here is where I think Foursquare and its gaming approach gets interesting: they can actually make money from this!

People who sell things love customer loyalty, and for good reason: retaining loyal customers is cheaper than acquiring new ones. So someone who frequents, say, the same coffeehouse would be considered a loyal customer. Visit the store enough and, with Foursquare, that person can become the mayor.  Now the person has loyalty credentials which may be valuable to the owner of the coffeehouse. For example, why not reward that person for their loyalty and further reinforce it? Organizations do this all the time. All the individual needs to do is show the coffeehouse owner the mayor designation on their mobile device. But let's take this a little further. If frequency can lead to perks, then there is the possibility to generate more business as individuals try to earn mayor status. Or the owner may simply reward the individual for checking in at all, providing additional incentive to visit the store in the first place.
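The check-in and mayor mechanics described above reduce to a per-venue counter. A toy model (not Foursquare's actual implementation; all names and data are made up):

```python
# Toy model of check-in mechanics: each check-in increments a per-venue,
# per-user counter, and the user with the most check-ins at a venue
# holds the "mayor" title.
from collections import Counter, defaultdict

checkins = defaultdict(Counter)  # venue -> Counter of check-ins per user

def check_in(user, venue):
    """Record one check-in by a user at a venue."""
    checkins[venue][user] += 1

def mayor(venue):
    """Return the user with the most check-ins at a venue, or None."""
    counts = checkins[venue]
    return counts.most_common(1)[0][0] if counts else None

for user, venue in [("ana", "coffeehouse"), ("ana", "coffeehouse"),
                    ("bob", "coffeehouse"), ("ana", "coffeehouse"),
                    ("bob", "diner")]:
    check_in(user, venue)

print(mayor("coffeehouse"))  # ana, with 3 check-ins to bob's 1
```

The business hook is that this counter doubles as a verifiable loyalty record: the coffeehouse owner can trust the mayor title without running a loyalty program of their own.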

So what’s my take on this? It’s a win-win. Foursquare makes money by becoming a marketing vehicle for businesses (and uses a smart business model not based entirely on ad revenue) and the individual attains rewards for simply checking into a location.  I’m not sure that I will be a power user of Foursquare, but I like this approach.  In social computing, monetization has been elusive and these guys have put their finger on something really smart. And that’s a game changer. Let’s see what happens.


To Tweet or not to Tweet?

June 16, 2010 - 8:00 am

Since it launched in late 2006, I've been a registered member of Twitter, the popular microblogging service that limits posts to 140 characters. However, I've only recently started to use it on a regular basis, and I've suddenly found it quite useful. Many of the folks I socialize with are confounded by its value; they cannot see why people post the details of their most inane activities, and they are equally baffled by those who read the postings. I do neither of these things and yet I am able to derive value from it. To explain how, I thought it would be worthwhile to briefly outline the reasons why I think it is a rather compelling service.

While 40% of Twitter content is considered pointless babble according to one study, the same report put the combination of news, conversation, and pass-along value at approximately 40%. The content of my interest likely falls into news and pass-along value, which together appear to be around 12%. That is a small proportion of the content, but when you consider that tweets, the messages posted to Twitter, number around 50 million per day, suddenly 12% is a very large number. I don't read a lot of what is on Twitter and, frankly, I'm just not interested in most of it. However, the small group of people and organizations I follow disproportionately represent the 12% of content I am interested in.

So why would I read content posted on Twitter? There are three main categories that currently appeal to me. First is information that is newsworthy or timely. I find Twitter to be quite effective around alerting me to items of interest in a succinct manner. Typically a tweet will include a link for more information, but by scanning the initial tweet I can quickly determine whether following the link is worthwhile. The second category is getting a novel perspective from someone whose opinion I respect. The last category is pure entertainment. I follow a number of comedians and I find their random musings pleasantly diverting. Clearly there are many other reasons beyond my own why people read Twitter which include education, notifications, surveys, and fund-raising.

The final item I want to address is why someone like me would write a tweet (remember, I've only been discussing why people read tweets so far). I think it's highly flattering when someone decides to follow me on Twitter. My assumption is that they have made this choice because they believe I may repost other tweets of value or might have something of value to say myself. Without a scientific analysis, I would say the mix of my tweets is roughly 66/33, respectively. And while I don't have millions of followers, I am always humbled when someone retweets (forwards) an original post of mine.

It’s too early to tell whether Twitter has staying power. But it’s evident that it is more than a novelty. Its current subscriber growth rate is spectacular, while its future is far from certain.

You can follow me at


First impressions of the Apple iPad

June 14, 2010 - 8:00 am

The Apple iPad is currently selling at a rate of one every three seconds. Since its launch in April, over two million have been sold. And these are numbers for the United States alone! It is only beginning to become available in other countries, so the pace of sales will gather further speed. In a good economy this kind of sales volume would be impressive. In a bad economy it's close to phenomenal.

Of course, because of my work and interest in new technology I had to get one too. There is an obvious attraction to gadgets and new shiny things for many of us, but it seems to me that the unprecedented interest in the iPad goes way beyond a niche level of curiosity. The iPad, despite some early criticism about its name and some folks saying it was simply "a big iPhone," has emerged victorious in the battle for hearts and minds (and, frankly, for people parting with their cash). Apple is tapping into something important, and those of us in the technology sector had better figure it out soon.

I will admit I am impressed. Apple, among a relatively small set of organizations, consistently delivers a high-quality product. The iPad does not disappoint. I don't intend to review the product here, but simply to convey some of my initial observations.

The iPad is certainly not a replacement for any other device that I have. That's not a surprise. Apple's intent, like that of others entering the tablet computing space, is to deliberately create a new platform, not to replace an existing one. Yes, it creates a new revenue channel for these providers, but it also fills some gaps that neither the PC nor the smartphone can easily meet. The iPad positions itself for easy consumption of content, whether text, audio, pictures, or video, and it is simple enough to pick up, boot in a few seconds, and use without instruction. The beautiful screen, the touch-screen interface, and its overall ease of use are highly attractive features. Surprisingly, it innovates by doing less and doing it better. The product isn't perfect, and as a technologist who likes to personalize his computing environment, I find it feels more like an appliance with limited ability to customize. However, as a version 1.0, it's pretty slick.

To complement the hardware, Apple has created and championed an ecosystem of third-party developers who are deploying highly innovative solutions that are, once again, generally easy to access and use. Apple won't be the only player in this market, but for now it has a considerable head start. Keep an eye out for other big technology shops making announcements in tablet computing. The competition will be robust and ultimately healthy for the consumer.

While it is too early to tell, the real promise of the iPad is yet to come. It will be used in ways that none of us can think of right now. To me that’s the most exciting thing about new technology: it creates the turns that guide the future. I can’t wait to see what’s next.


Why the next big thing may, in fact, be a really big thing

March 11, 2010 - 8:00 am

Every second, the Large Hadron Collider (LHC) near Geneva, Switzerland produces 40 terabytes of data.  That's more data than can currently be stored and analyzed.  The scientists working on the project are forced to collect just a slim set of the data and hesitantly ignore the rest.  While this example is at the top end of the data deluge that our increasingly digitized world is creating, we can all relate to other, closer-to-home examples:  Every minute, 20 hours of video are uploaded to the popular video-sharing site, YouTube.  On Facebook, the biggest and most popular social networking site, 2 billion photos are uploaded each month.  Whether it is our text messages (more are sent each day than there are people on earth), or credit card transactions that reveal the intimate details of our purchasing behaviors, or recorded entries in search engines that tell us so much about the human experience, or the cameras that photograph and video us as we go about our daily activities, the quantity of data that humanity is collecting and storing is staggering.  And it's increasing exponentially.  We are moving from data scarcity to data abundance and disrupting our conventional view of economics.  We call it big data.

You might ask why is big data happening and what are the implications?

This data deluge is being driven by a number of factors.  First, we are increasingly migrating from an analog to a digital world.  We don't use the gelatin silver process for photos anymore; we store them as bits in an ethereal cloud.  The same applies to such things as movies, music, newspapers, books, our bank records, our bills, and our airline tickets.  It's a very long list.  Second, it has become cheap to store digital data.  At the time of writing, the estimated cost of a gigabyte is 10 cents.  The hard drive I am writing this blog on has 100 GB, a total cost of $10, and it holds every digital photo I have taken and every song I have purchased, with plenty of room left over.  Finally, the tools for us to produce content, store it, and share it have become highly accessible and, quite often, free!  Today, any one of us can be a publisher or a credit-card-collecting retailer.  This creates enormous amounts of data.

Collecting and sharing all this data clearly presents a number of challenges and, in my view, incredible innovation opportunities.  This is not last year's data mining.  This is data mining on steroids!  While I won't dwell on the challenges, they are clear: increasing transparency in our lives potentially results in less privacy.  More information and equal access to it means that the value of some intellectual property is trending downward.  Easy access to content and communication tools means even the bad guys have a voice on the global stage.  And while the risks and challenges are many, I want to turn our attention to the innovation opportunities.

In order to track the spread of the H1N1 flu virus, the Centers for Disease Control and Prevention (CDC) was able to leverage the search entries of a popular search engine.  Since people search for things like symptoms when they are sick, and since the search engine knows where each search originates, large volumes of searches can produce a beautiful visual of how a disease is spreading, both over time and in what direction.  It's the invisible knowledge that the data deluge unveils that makes me excited about this area.
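At its core, the flu-tracking idea is a simple aggregation: count symptom-related queries per region per time window and watch the counts move. A minimal sketch, with an entirely invented query log and region names:

```python
# Simplified sketch of the aggregation behind flu-trend maps:
# count symptom-related queries per region per week.
# The query log, regions, and terms are entirely invented.
from collections import defaultdict

SYMPTOM_TERMS = {"fever", "flu symptoms", "cough", "chills"}

query_log = [
    ("2009-W18", "california", "flu symptoms"),
    ("2009-W18", "california", "fever"),
    ("2009-W18", "texas", "cheap flights"),   # unrelated noise
    ("2009-W19", "texas", "fever"),
    ("2009-W19", "texas", "chills"),
    ("2009-W19", "texas", "cough"),
]

counts = defaultdict(int)
for week, region, query in query_log:
    if query in SYMPTOM_TERMS:
        counts[(week, region)] += 1

for (week, region), n in sorted(counts.items()):
    print(week, region, n)
```

Plotted on a map over successive weeks, rising counts shifting from one region to the next are what produce the spreading-disease visual described above.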

I’ve identified five net new possibilities that big data presents:

  1. Answer formerly unanswerable questions. With so much data being collected in so many different ways, it is now possible to ask and get answers to questions we just couldn’t before.  For example, if we analyze the social connections between employees that their digital footprints create, might we be able to identify specific knowledge domain experts? Sometimes the questions will be deliberate, but I’ll bet we’ll accidentally be able to answer insightful questions we didn’t intend to.
  2. The formulation of new questions. Now that we know certain data is being collected, what new questions could we ask? Imagine how empowering that could be to C-suite executives!
  3. More informed, evidence-based decision making. How many times have you wished you had more detail on a subject in order to make a better decision? Better data can mean better and timelier decisions.  Big data opens up a whole new opportunity for competitive advantage.
  4. Democratization of data. We no longer need to build systems that silo critical data and create an enterprise digital divide.  Aggregated data in volume can be easily made available so everyone can benefit.  It can be re-purposed, resulting in new value that goes way beyond its initial intent.  For example, governments all over the world are embracing open-data policies, giving the electorate unprecedented access to local, state, and federal insight.  Local community groups are using the data to combat crime and to discover societal inequalities (is this neighborhood getting unfair access to a resource?).  Give data to your employees and they may tell you something about your organization that reduces costs or builds your next billion-dollar product or service.
  5. Visualization of invisible knowledge. Big data creates amazing and valuable visualizations.  And these visualizations unveil secrets that were previously hidden.  For example, tag clouds (groups of words that become larger the more each word is used) tell us what people are talking about on social networks.  A map of the world superimposed with real-time stock trading data flows tells us a lot about global commerce.
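The tag-cloud idea in point 5 is mechanically simple: count word frequencies, then map each count into a font-size range. A minimal sketch, with an invented snippet of text and an assumed 12pt-to-48pt size range:

```python
# Minimal tag-cloud mechanics: count the words, then linearly scale
# each count into a font-size range. Text and sizes are illustrative.
from collections import Counter

text = ("big data big questions big opportunity data visualization "
        "data data insight")

counts = Counter(text.split())
min_n, max_n = min(counts.values()), max(counts.values())
MIN_PT, MAX_PT = 12, 48  # assumed smallest and largest font sizes

def font_size(n):
    """Linearly map a word count onto the font-size range."""
    if max_n == min_n:
        return MAX_PT
    return MIN_PT + round((n - min_n) * (MAX_PT - MIN_PT) / (max_n - min_n))

for word, n in counts.most_common():
    print(f"{word}: {n} uses -> {font_size(n)}pt")
```

Real tag clouds often use logarithmic rather than linear scaling so a few very frequent words don't dwarf everything else, but the frequency-to-size mapping is the whole trick.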

Of course I’m only skimming the surface here.  What I’m trying to convey in this introductory blog on the subject is that big data is a big deal.  Moreover, rather than viewing it as a threat–which many folks will–this is an opportunity for incredible innovation.  Big data will bring entirely new value propositions and it will force the reinvention of entire industries and business models.

In my view, big data is the next big thing.


Why digital abundance may be good for quality

February 15, 2010 - 8:00 am

If you’re reading this blog entry then you’ve made an important decision.  You’ve decided that among all the different ways you could spend your time, reading my blog is a worthwhile option.  So, in the same way those in the airline business do, I say to you: we know you had a choice of blogs to read today, so thank you for choosing to read my blog.

We’ll come back to this thought in just a moment.

Classic economics deals with demand in the face of scarcity.  In business school it is pounded into you that price is a function of the demand for something relative to its supply.  In the digital arena of consumer-produced (largely free) content, these principles have been turned on their head: scarcity has become abundance.  Supply massively outweighs demand.  The price, in this case, becomes the opportunity cost.  With so much supply, we can only consume so much and, therefore, must sacrifice one thing in favor of another.  Competition for our limited mental capacity has created a new economy: the attention economy.  For example, with thousands of blogs to read and a finite number of hours in the day, it is simply not possible to consume anything but the smallest, most trivial amount of content.  Might this partly explain the popularity of Twitter: that short bursts of content from multiple sources provide an illusion of editorial breadth and depth?

Ok, so having more choices is not new; there have always been more newspapers or more movies to watch than any one individual could ever get to in a given period.  What makes our choices infinitely more complex is the magnitude of choices.  Today we can get our news from thousands of sources.  The entry barriers to content creation have largely been eliminated resulting in the ability for each of us to be a printing press or movie director.

So what?

A popular argument is that more content doesn’t mean better content.  It may just mean lots of really bad content.  That is largely true: lots of videos of beer pong tricks are often nothing more than a collection of bad videos of beer pong tricks.  But it’s not an argument that proves digital abundance is the same as digital waste.  I propose that the same market forces are at play that help individuals make choices: collectively, a large group will review and surface the things they like and relegate the things they don’t to the far end of the long tail.  For content to emerge above the noise requires a kind of pseudo-Darwinian selection.  Further, I argue that abundance may in fact improve the quality of content.  Sure, in an economy of abundance there will be a lot of videos of cats playing the piano, but among the volume of garbage there will be a handful that are quite good.  And if you’re a person who finds cats playing pianos funny, that’s a good thing.

If you are a creator of digital content, you still need to create content people want.  And that means you need to innovate to stand out.  You need to produce quality, defined in so many different ways, and add value for the consumer.  In an economy of digital abundance, innovation is king.

If you’ve read this far in my blog, again I thank you.  You had a lot of choices, and for some reason you chose to read my blog.  In an environment of millions of blogs and many other ways to spend your time, you chose to spend it here.  And that proves, in some small way, that the ease with which one can create bad content hasn’t resulted in a dearth of digital content of value to you.


My personal Top 10 technology predictions for 2010

January 1, 2010 - 8:00 am

One of my responsibilities as the Director of IT Innovations at PwC is to spend a good deal of time researching and developing insights on the impact of emerging technologies. This year, for the first time, I thought it might be fun and, frankly, quite useful to share with you my thoughts on what I believe may be the big IT trends in 2010. While I was somewhat tempted to be bold and creative in my forecast, I decided to ground the Top 10 in areas that have some real momentum. If you agree with the predictions, what might that mean for your work and your industry? In what area do you think I got it completely wrong? I’d love to know what you think.

1. Software as a Service
2010 will be a big year for providers of software as a service (SaaS). The obvious big names in this space will release new offerings to compete with popular desktop applications. New and existing operating systems that are built primarily to support the SaaS model will begin to be more widely accepted and adopted.

2. Netbooks
This popular form-factor will have outstanding sales and may even surpass laptop sales by year-end. Given its remarkably low cost, we will likely see more offerings that bundle free netbooks with other services.  In addition, the ubiquity of embedded webcams will drive further use of personal video in both non-work and work environments.

3. Cloud Services
An obvious growth area in 2010: we will see new and expanded services from all the usual suspects. Expect major announcements from large businesses and government agencies choosing to move some of their core applications and data to the cloud.

4. Mobile Money
By late 2010, paying for products and services via a mobile device such as a cellphone will begin to enter the US mainstream. Multiple flavors will be available, including custom applications and text messaging. More likely in 2011-12 is the emergence of banking services from the big telcos. Rather than simply being middlemen, the telecommunication companies may announce banking divisions.

5. Free Software
If current trends continue, it’s quite possible that all software will be available in some form of free, but 2010 will be the first year that this trend reaches a point of inflection. A combination of enterprise-class open source, freemium, freeware, ad-supported, and alternate-revenue-model software will have a lasting and destructive impact on the notion of license-paid software.

6. Harvesting the Social Graph and Web-Squared
2010 will see the introduction of the first widely available and easily usable products for better understanding the mass of unstructured data being accumulated across public and private clouds. The emergence of intelligent solutions that interpret massive amounts of related and unrelated data in order to create forecasts and identify trends will help people make more sense of the world and see previously hidden signals.

7. More Video
Continued investment in video infrastructures will see greater use in work and non-work environments. It will be more common (but still not ubiquitous) to have video conversations with colleagues and external parties such as customers and suppliers. Rigorous competition in this space between the major players and many start-ups will continue to push the price down for high-quality video. Greater use of PCs and Netbooks with Web-cams will continue towards critical mass. In addition, content creation will continue its profound migration from text to video, further consuming bandwidth and forcing more enterprise investment in network infrastructure.

8. Green IT
This may be an inconsistent area of investment, as continued tight budgets and more immediate costs (e.g., migration to an updated operating system) distract from major green initiatives. However, going into 2011 and beyond, broad adoption of virtualization and further movement towards hosting in the cloud may help organizations lower their data center carbon footprint.

9. Mobile Location-based Services (LBS) and Augmented Reality
Expect to see an extraordinary number of start-ups and existing technology companies offering mobile LBS-related services. Proximity-based solutions will become more common. Mobile devices will begin to offer compelling overlay data for the real world that help people with existing and new activities. Lots of noise and confusion will ensue as both consumers and providers try to figure out acceptable services. For example: how will people respond when they stroll through a mall and are bombarded with text messages from different retail stores?

10. Social Spaghetti Integration
More social features will begin to show up in ERP apps. New and increased support for ERP solutions that integrate social networking, for example, will further blur the lines between work and non-work applications and activities.

Do you agree or disagree with any of my predictions? I’d love to know what you think.


CIO Day, Amsterdam, Nov 16-17, 2009

November 18, 2009 - 8:00 am

On November 16 and 17, I was delighted to participate in an event just south of Amsterdam, Netherlands, called CIO Day. It is an annual one-and-a-half-day event for the CIOs of primarily Dutch businesses, with some other European organizations represented. The theme of this year’s event was “Chief Impact Officer” and focused on how the CIO can bring increasing value to an organization. Despite the tough economic conditions, 600 people attended. The purpose of my participation was to host a roundtable on emerging technologies with CIOs and to be part of a question-and-answer panel later in the day. The roundtable was well attended and, from what I can gather from the feedback, well received. Here is a summary of the discussion.

The discussion began with an overview of several macro trends. Increasing access to broadband across the globe, and the fact that four times as many mobile phones as PCs are connected to the Internet, are further increasing the value, capability, and reach of the Web. However, there is recognition that many parts of the world still have poor Internet access; global organizations are often forced to build applications for the weakest connection. This picture is gradually changing as broadband becomes increasingly ubiquitous. The cost of technology continues to decline; in some pricing models netbooks, for example, are given away free with other services. Free and low-cost hardware and software, subsidized through other means such as advertisements or optional premium services, together with the recent recognition of open source software as a legitimate route for the enterprise, are further contributing to the commoditization and reach of information and communication technology (ICT). ICT on its own may not confer competitive advantage, so innovation enabled by technology must become a core focus of the Chief Information Officer (CIO). In this light, it was also suggested that CIOs explore the creation of small IT innovation teams that focus on the potential for emerging technologies to, for example, reduce cost, disrupt existing business models, or create new opportunities.

A major trend that may even trump climate change and the energy crisis is demographic change. Aging populations, a shrinking tax base, migratory patterns, and hyper-connected, computer-literate youth cultures are major societal game-changers.

The emergence of broadly accepted computing standards is easing data interchange, which in turn increases the value of data. This, coupled with the rapid accumulation of data through sensors and the myriad transactions that occur within individual and interconnected systems, is resulting in a data deluge. CIOs must look to leverage the data deluge in a positive way by evaluating data analysis solutions and exploring techniques such as mash-ups.

Before discussing specific emerging technologies, the CIOs were asked to consider the profile category in which they currently viewed themselves with regard to new technology adoption. The three categories were: (a) aggressive, (b) moderate, and (c) conservative. Recognizing a specific organization’s approach is a helpful lens through which to view emerging technologies; not all technologies will have the same impact on individual organizations, depending on their profile.

When the conversation turned to specific technologies, the following areas were briefly discussed: cloud computing (leveraging computing services, typically from an externally hosted provider, on an as-needed basis), where the group represented all perspectives from first-movers to watchful waiters; increasing use of enterprise open-source software (often free, community-developed software), with many participants already leveraging it; consumerization (clearly a behavior, not a technology, in which employees bring their own computing hardware and software to work), with all members recognizing this in some form at their organization but with clearly different risk appetites; “video everywhere,” with some divergent views on when and how this would manifest in the enterprise; and the rapid adoption of mobile technologies and the attendant advantages brought about by location awareness. The latter was the least explored but the most interesting to the group in terms of potential business opportunity.

Overall, the discussion was far-reaching in its scope of subjects. However, the objective of raising a number of valuable thought-provoking items related to emerging technologies was met.


Why it may be time to put your head in the cloud

September 23, 2009 - 8:00 am

One of the most widely discussed technology buzzwords of 2009 has been Cloud Computing. However, unlike other hyped technology, this one exhibits significant substance. It is worth understanding what it is and why it matters.

While the term is relatively new, some of the core features have existed in different forms for some time. Cloud computing is all about hosting business technology, typically in an external environment, with application support, infrastructure, and operations managed by a third party. Does anyone remember application service providers (ASPs)? Since the enterprise is less concerned with where the service lives and more focused on the functionality it enables, the service is said to exist somewhere in the cloud, with complexity hidden from view. Similar to the shared nature of municipal power stations, enterprises are able to tap into a supply of technology resources as needed. Since it is on-demand, cost is determined by consumption and the required level of service. Simply put, when utilization of IT resources increases, so do costs; equally, costs decline as usage decreases.

Cloud computing provides an organization with scalable and disposable technology services without the need to procure and manage a large internal infrastructure. It provides a pay-as-you-go model for software applications, software services, and full-service application development environments. These can be complemented as needed with the ability to easily reconfigure performance, bandwidth, and storage. Need more space? Simply request it!
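The pay-as-you-go idea is easy to sketch in code. The following is a minimal illustration of usage-based billing; the rates and tiers here are entirely hypothetical, not any vendor’s actual pricing:

```python
# Minimal sketch of usage-based cloud billing: cost is purely a function
# of what was consumed. All rates are hypothetical, for illustration only.
def monthly_bill(compute_hours: float, storage_gb: float,
                 rate_per_hour: float = 0.10, rate_per_gb: float = 0.15) -> float:
    """No fixed infrastructure fee: the bill tracks consumption."""
    return compute_hours * rate_per_hour + storage_gb * rate_per_gb

# Usage rises, cost rises; usage falls, cost falls.
busy_month = monthly_bill(compute_hours=1_000, storage_gb=500)   # 100 + 75
quiet_month = monthly_bill(compute_hours=100, storage_gb=500)    # 10 + 75
print(busy_month, quiet_month)  # 175.0 85.0
```

The contrast with traditional procurement is the point: in the quiet month the bill simply shrinks, with no idle servers to pay for.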

The advantages of cloud computing largely depend on the size and nature of the business. For a small organization or start-up, the benefits are clear: low-cost (often free) applications, no infrastructure costs, and considerable speed in making essential, core IT services available to users. Larger organizations have additional considerations when evaluating cloud computing as an option. Typically, their processes are more complex and therefore require a greater degree of systems integration. Currently, this integration makes the cloud option riskier; however, this will become less of an issue in the future.

Finally, in all scenarios, security concerns remain central to any hosting solution (but when has security not been a concern?). The organization must assess the ability for the cloud vendor to ensure the integrity of data and applications. For most organizations, cloud computing is not an all-or-nothing proposition. It is likely that applications and data inside and outside the firewall will co-exist in a hybrid manner.

Today we already see this model implemented across businesses. In many ways, large organizations have the ability to evolve their own infrastructures to provide an internal cloud. IT architects and other technology professionals need to consider a cloud computing delivery model to solve business problems. Business leaders should be informed about the opportunities, costs, security and risk considerations.

What should you do now? If you’re a business or technology professional, I encourage you to learn more about cloud computing. This important technology paradigm is rapidly rising in importance and will become an essential part of your technology toolkit. It’s time to put your head in the cloud!


The Metaverse as a business platform

March 31, 2007 - 8:00 am

1. Introduction

Mankind’s new frontier is neither the far reaches of outer space nor deep oceans, but rather a limitless, ethereal virtuality that we are collectively imagining one pixel at a time. This is the Metaverse, an immense and immersive 3D virtual rendering of the real world, populated by millions for the purposes of gaming and, increasingly, social and economic pursuits. It provides the freedom to live dreams and will extend our notion of the self.

The Metaverse represents a disruptive technology through a significant extension of the existing 2D-Internet into a multifarious and emotive 3D-Internet. The speed and graphic capabilities of personal computers and gaming devices coupled with the ubiquity of broadband access to the Internet is unleashing a dramatic new world of opportunities and challenges.

The Metaverse eliminates real-world constraints, and in doing so it will redefine our view of the real world through new realities created in the virtual space. The breathtaking rate of change within these virtual worlds outpaces that of the real world by an order of magnitude. In its rapidly evolving form, the Metaverse hints at a compelling environment in which to conduct business. Organizations are jumping in with the speculative promise of exploiting a new channel to market and sell their wares. But how compelling is this notion, and what is the sustainable value proposition?

When technology catches up with great ideas, profound things can happen. Now is one of those times.

2. What is the Metaverse?

The term Metaverse (a blend of “meta” and “universe”) was coined in 1992 by Neal Stephenson in his book Snow Crash, which envisions how a virtual reality (VR) based Internet may evolve in the future. The notion of VR – a computer-based simulation of a real environment, with many products and solutions already in use – is not a new phenomenon. Today’s Metaverse brings together high-quality graphics, ubiquitous access to the Internet, the significant processing power of personal computers and gaming devices, and the imagination of millions to create massive, immersive virtual worlds. Interaction is facilitated through the use of an avatar – a customizable, electronic persona that represents the individual. Barriers of space and time are largely eliminated since many of the constraints of the physical world become irrelevant. In effect, society is creating a new frontier that brings with it all the challenges and opportunities of settling a new world, but without many of its physical constraints. The Metaverse is taking the form of massively multiplayer online role-playing games (MMORPGs) such as World of Warcraft; non-gaming worlds such as Second Life and There; and Habbo, a mash-up of social networking and virtual world.

3. The Metaverse as a Business Channel

While renderings of the Metaverse have existed for some time in such simulations as Electronic Arts’ The Sims and Microsoft’s Flight Simulator, a critical mass of processing power and global network connectivity has exponentially elevated its potential, making it an impressive proposition in multiple domains. In addition, its pervasiveness on gaming devices and personal computers, and its broad acceptance and use across the globe, ensure a captive and willing user base. For consumers willing to part with their cash, and a market with a voracious appetite for new channels, the Metaverse becomes a compelling venue for consumerism. A new business platform is born.

Businesses in the Metaverse can be broadly defined as native and non-native. A non-native business refers to a real world organization building a presence in the Metaverse. That is, this world represents another platform for an existing business much like the Web has provided. Native businesses refer to those start-ups that are building businesses codified into the Metaverse fabric. These include new propositions that sell products and services that only exist in a virtual world, for example vendors that sell animations for changing or extending avatar movement capability.

So who are these businesses and what kinds of things are they doing? This article looks specifically at Linden Lab’s Second Life, the product that best encapsulates the virtual world zeitgeist.  Second Life is differentiated from other virtual worlds in that the intellectual property created by participants is retained by them. Commerce is conducted between participants in a manner that is independent of the platform provider. This is in contrast to a virtual world such as Sony’s Home product, in which the vendor, Sony in this case, is the only seller. The authors suspect that this business model will change in favor of something closer to Second Life.

At the time of writing, there are over 3,100 business entities in existence in Second Life. They range from big non-native names such as Adidas, Calvin Klein, and Toyota to native vendors such as Anshe Chung Studios (a virtual real estate agency and developer), Cranial Tap (a designer of virtual environments and “in-world” applications), and OurBank (a full-service bank for the Second Life currency, Linden Dollars). Business activities for both non-native and native organizations range from selling real and virtual products to recruitment, banking, marketing, real estate, and training. The following are all recent examples of Second Life business: Starwood, the hotel chain that owns Sheraton and Westin, built a concept hotel “in-world” and asked members to visit the building and provide feedback well before a single real brick was used in construction. Urban Outfitters sells real clothes that are shipped in the real world, but while waiting, the consumer’s avatar can enjoy the clothing. Reuters, the international news bureau, has an elaborate news presence “in-world” and has even assigned a full-time journalist to cover events throughout the Second Life universe. Anshe Chung, most notable as the first native business to supposedly create a millionaire entrepreneur, buys virtual land, develops on it, and then resells it at a handsome profit. The Versuvius Group specializes in a range of native-only services such as machinimatography (the process of creating movies “filmed” in the virtual world) and designing fashion clothing for avatars.

As of March 2007, about $2 million per day was being spent in this virtual world. During 2006, the economy grew by 270%. Residents number just over five million and are growing by over a million per month. Currently, the return rate of those who sign up for membership is about 10%, which still represents approximately 500,000 regular users. Do the math: if both the economic and population growth continue, any reasonable extrapolation suggests a billion-dollar virtual economy within just a few years. Does the value proposition of the Metaverse need any more convincing? But beware: any diligent industry analyst or economist should be quick to warn of hype or the potential for a sudden bust.
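The back-of-the-envelope math above is easy to check. The daily-spend and 2006 growth figures are from the paragraph; the 50% growth rate in the sketch below is a deliberately conservative, hypothetical assumption, far below the 270% observed:

```python
# Back-of-the-envelope check of the billion-dollar claim.
# Source figures: ~$2M/day spent (March 2007), 270% growth during 2006.
daily_spend = 2_000_000                 # USD per day
annual_run_rate = daily_spend * 365     # current annualized spend
print(f"Current run rate: ${annual_run_rate / 1e6:.0f}M/year")  # ~$730M/year

# Even if growth slows dramatically, the economy passes $1B quickly.
# Assume a hypothetical 50% annual growth rate (well under 270%):
growth = 0.50
years, economy = 0, annual_run_rate
while economy < 1_000_000_000:
    economy *= 1 + growth
    years += 1
print(f"Crosses $1B after {years} year(s)")  # 1 year at 50% growth
```

At the actual 270% rate the crossover comes even sooner, which is the essence of the extrapolation in the paragraph above.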

4. Servicing Businesses in the Metaverse

With both non-native and native businesses flourishing in the Metaverse, new and ambiguous tax and financial implications emerge. Should the tax laws of the real world apply in the virtual world? Congress is already on the case (Reuters, 2006). New business models mean new financial challenges. For example, a digital object, created at time of purchase and reproduced in an unlimited fashion without cost or effort, essentially redefines the notion of inventory. While many new questions are raised, it is clear that as organizations develop a presence in virtual worlds, they will require the same type of services often requested in the real world such as business process improvements, risk management, controls and compliance, financial effectiveness, and business modeling.

As businesses have developed an online presence on the Web, many new supporting services have emerged, such as Web designers and developers, hosting services, Web traffic analysts, and search engine optimizers (SEOs). The same trend is expected as firms move into, or are created within, the Metaverse. Consider such services as those that help with developing digital products, digital property protection, marketing in a virtual environment, specialized tax and audit services, Metaverse cultural and etiquette training, and avatar training. New career opportunities will abound. Organizations of all sizes will require advice on their endeavors; demand for advisory services will be diverse and abundant.

While we use the real world and the Web to make these assumptions about probable services, the most remarkable conclusion is that services will emerge that we cannot yet imagine, but in time will be born in this new virtual reality.

5. The Challenges Ahead for Doing Business in the Metaverse

The Metaverse currently exists in a state of low-level chaos and altruistic self-governance. While this is momentarily sustainable, as a critical mass of legitimate businesses establish themselves within these worlds and attempt to gain trust in the marketplace, it is likely that people and communities will band together to form appropriate governance. Might there be an equivalent of the World Trade Organization (WTO) or United Nations (UN) for the Metaverse? In addition, how will law and order be established, monitored, and enforced? Rather than develop law in a piecemeal manner, there may be an opportunity to establish a general agreement on acceptable behavior – something that leverages thousands of years of human trial and error to the rapid benefit of all virtual worlds.

In the real world, government agencies and institutions expend considerable effort and cost to ensure that a single identity is maintained by each individual. Proving the identity of the individual is a fundamental requirement for the tenets of civilized society to function and, particularly in the context of this article, for most, if not all, contemporary business transactions. The Metaverse largely relies on mapping an identity to an e-mail address or a credit card number. As the paradigm matures, this method of identity authentication will become massively inadequate: impersonation, for example, will be easier and therefore more prevalent, and discerning between a human and a computer controlling an avatar will be difficult. New forms of identity management will need to evolve, along with methods of detecting fraud and identity theft. In addition, is it possible that an individual will have multiple identities based on different needs or roles? In fact, individuals could conceivably have hundreds of identities – for example, avatars that perform specific tasks.

It is predictable that those intent on crime or simple mischief will make their way to the Metaverse. Note the recent vandalism of the Second Life campaign headquarters of Presidential hopeful John Edwards (ABC7, 2007). While bringing down a Web site through a denial-of-service attack, or using a phishing scam (a fake Web site that mimics a real one, such as an individual’s online bank, to entice a user to enter his password so the perpetrator can steal it), is painful and often costly, the implications in a virtual world are much larger. In the Metaverse, there will be complex interdependencies between many different types of processes and digital objects. Depending on the level of that complexity, disruptions to a part of its digital ecosystem could be the virtual equivalent of an interruption to the energy supply chain in the real world. Businesses could be destroyed, identities modified or eliminated, and digital factories shut down. Rogue avatars could be injected to roam and cause havoc, resulting in the collapse of the system’s integrity. Law enforcement may need to operate between two or more worlds, both real and virtual, identifying and disabling digital criminals and stopping their human sponsors. Might this result in the creation of enforcement avatars or a Metaverse army intent on protecting their owners’ digital assets, going as far as preemptively attacking other virtual worlds to thwart future invasions? Let’s stop there lest we get caught up in our own speculative hyperbole.

The Metaverse presents new opportunities and challenges in almost every discipline. It clearly represents a value proposition for businesses. The fundamental question is: can it be credible and sustainable? Since the rate of change in the Metaverse is progressing at an accelerated time frame compared to the “real world” it won’t be long before we know the answer.

6. References

ABC7 (2007). John Edwards’ Second Life HQ Vandalized. Retrieved March 31, 2007, from

Reuters (2006). US Congress launches probe into virtual economies. Retrieved March 31, 2007, from

Authors: Jonathan Reichental, Ph.D., and Robert G. Eccles, Ph.D.

Originally published in the Harvard Interactive Media Review.