Your customers are digital - are you?

bryang
In the hopefully unlikely event that your company executives still need convincing that the internet will transform your business, the past few days have provided further evidence of the accelerating changes brought about by consumers moving to the web.

Only last week, I highlighted, "Which businesses are ready to thrive in the internet era?" as one of the big IT questions for 2013. We're already finding out some of those that are not.

Camera retailer Jessops is the latest high-street retailer to suffer, following Comet at the end of last year - both household names that failed to adapt to the digital world.

In the latest round of supermarket financial results this week, Morrisons was the big loser, with its lack of an online presence or a multi-channel strategy cited as a primary cause of its sales decline. Meanwhile, Tesco and Sainsbury's both highlighted impressive online growth as key factors in their performance.

In a generally depressed retail environment before Christmas, John Lewis stood out with 44% year-on-year growth in web sales, which made a major contribution to 13% growth in December revenue.

John Lewis is so bullish about its digital plans that it has even coined the phrase "omni-channel" strategy - which does rather make it sound like Malcolm Tucker from TV political satire The Thick of It has taken over - but it shows the extent of the department store's ambition.

Plenty of experts now predict that this online shift will not be the evolutionary process many firms hope for, but more of a cliff. Consumer behaviour is changing so rapidly that one day you may just find your business goes straight down - in which case, there can be no more excuses for delaying the IT investments needed.

Consumer behaviour is the key consideration. It's not that the internet is changing businesses, it's changing the way your customers think and shop and buy. And when consumers are prevented by a particular company from thinking and shopping and buying the digital way they want to, they just go elsewhere. Bang - goodbye.

The digital King Canutes who cling desperately to their old pre-digital business models are not holding back the tide of web and mobile, even if they think they are.

Take the obvious example of the music industry. It resorted to copyright and intellectual property law, and to the rhetoric of turning customers into criminals, to prevent its formerly lucrative models from disappearing.

Music firms that claim they are simply stopping piracy miss the point. The reason consumers opted for illegal downloads is that they were prevented from getting reasonably priced, easily available legal downloads. It wasn't about rampant illegality; it was about consumer behaviour changing and the music industry wanting to prevent that change.

If you're doing anything similar, you're on a downhill slope to the cliff. Stop now.

For many, it will be a painful change. Digital often means lower profits. But no digital means no profits. Just ask Jessops.

Six big questions for 2013

bryang
The big flaw in most of the IT predictions you will read at this time of year is that technology is no respecter of the calendar. Fast moving it may be, but it's not as if the issues faced by IT leaders have changed suddenly since we all departed for the season of festive overeating.

For most IT professionals, the issues for 2013 are not that different from those they faced in 2012 - managing or reducing budgets; delivering innovation; and explaining to business colleagues how to use IT to improve competitive advantage, productivity and customer service.

The three fastest developing technology trends will remain the same this year too - expect mobile / consumerisation, cloud and big data to appear frequently in the headlines, along with perennials such as information security.

But 2013 is likely to be a time when certain big questions affecting corporate IT are going to have to be answered. Here's a personal rundown of six of the biggest:

What next for HP?

The world's largest IT company is a mess. Despite annual revenue of more than $120bn, its market value is now less than $30bn - less than it has spent on acquisitions over the past five years. Not only has HP destroyed the value in major purchases such as EDS, Palm and Autonomy, it has at the same time reduced its own value to less than it spent in the first place. That's a phenomenal vote of no confidence from investors, let alone customers.

Years of musical chairs in the boardroom, ever-shifting strategy, unclear technology priorities, and vain attempts to maintain both a consumer and enterprise business, have put HP in a worryingly weak position. The latest CEO, former eBay chief Meg Whitman, is engaged in a "multi-year turnaround" plan, but investors are losing patience. Will HP conduct a fire sale, be broken up, or even itself be acquired this year? You can't rule out any option.

Will Microsoft get it?

Microsoft remains the dominant supplier of enterprise IT software, and that's not going to change in the course of 2013. But IT decision-makers want to see evidence that Redmond understands the fast-changing demands they are dealing with. Even some of the most loyal supporters of Microsoft these days agree that the company just doesn't "get it" with the evolving IT landscape. If Windows 8 is meant to be the blueprint for how Microsoft will span the corporate and consumer worlds, then the supplier is no nearer to proving it will remain the inevitable choice for productivity, communications or collaboration.

Many CIOs are publicly declaring their desire to move away from Windows, stating that the forced move from XP to Windows 7 will be their last migration. Windows Server, Office and SharePoint will ensure a continued presence in the IT department, but as users increasingly see non-Windows interfaces in their day-to-day work, that long-held lock-in starts to evaporate and it becomes easier to move to alternatives. Microsoft's influence will not go away in 2013, but the decisions the firm makes this year will determine how much longer that influence continues.

Can the reformers in government IT win?

This is a pivotal time for the reform of government IT. In 2012, after much politicking (of both the small "p" and capital "P" varieties), important foundations were laid - the new digital strategy, the open standards policy, and the G-Cloud project, to name but three. But now those plans need to be turned into actions, with a further battle looming between departmental CIOs and the central Cabinet Office team.

A recent reorganisation of responsibilities has thrown the future of the CIO role into question - but will those CIOs be happy to simply roll over (or retire) and have their roles diminished? It's unlikely. But the reform of government IT needs to be in place before the next election in 2015, and for that to happen, much progress is needed this year.

Can the UK become a world-leading digital economy?

For the last 15 years, successive prime ministers have made grandstanding speeches pledging that the UK's future is as a world-leading digital economy. And to be fair, as a country we've not done too badly: we are the biggest online shoppers in Europe; we're just about keeping up on high-speed broadband (unless you live in rural areas); and we have enthusiastically embraced all things mobile (even if we are rather late to the 4G party).

But these things have happened because of you and me, and not because of any policies put in place by those eager politicians. It's all very well leaving digital progress to "the market" and consumer demand, but this is the new industrial revolution, not the latest fashion.

We need a government industrial policy for technology that puts in place a framework that incentivises global companies to base their cloud operations in the UK, and an environment in which UK digital businesses can thrive and not all end up acquired by overseas rivals. In particular, we need significant investment in the IT skills base, otherwise our essential digital skills will eventually be sourced from abroad.

Let's not knock initiatives such as the cash going into Tech City in London, or the small hand-outs from the Technology Strategy Board to help fund new ventures - but equally, let's be honest and say they are a drop in the ocean; they should be peripheral measures supporting a broader plan for the digital economy.

We need the government this year to put in place real actions, not just words and pennies. As the financial services industry makes its slow but inevitable move to the East, we need to build the digital economy to replace it.

Can IT departments cope with the new tech-savvy user?

The era of the command-and-control IT department is dead - even if there are still plenty of examples that refuse to die. Consumerisation and bring your own device (BYOD) policies are here to stay, and will happen whether the IT manager wants them to or not. If you resist this wave, your inevitable successor will embrace it instead.

Forward-thinking IT leaders are redefining the relationship between IT and its internal customers. IT budgets no longer necessarily live in the IT department - marketing chiefs in particular are becoming increasingly influential as firms look to use social technology, the web and mobile to improve customer engagement.

For those IT departments determined to be digital King Canutes, and stick with their locked-down, process-controlled, "them and us" approach to users, this year will be one of your last.

Which businesses are ready to thrive in the internet era?

It's now a statement of the obvious to say that whole industries are being transformed by the internet. No self-respecting CEO can deny that the web is changing their business, and in particular its relationship with customers.

But for many long-established, hitherto successful companies, the spectre of legacy IT holding them back looms large. If you look at most of the household name firms that are struggling, have disappeared or gone bust, then a failure to adapt to the web was central to their collapse - Comet, HMV, Woolworths, Kodak, Borders; the list is long and growing.

In industries such as entertainment or travel, new entrants are destroying old dominance because they have started from scratch with modern web technologies and don't have an anchor in the mud of outdated legacy IT.

The big banks in particular face a huge dilemma - most have, at the heart of their IT, systems that were developed 10, 20 or even 30 years ago. Their hideously complex IT infrastructures are barely able to cope with online or mobile banking - look what happened to Royal Bank of Scotland and NatWest last year, when that combination of complexity and legacy brought the banks to their knees.

As the economy slowly, agonisingly recovers, the firms that thrive will be those that recognise the dramatic changes in customer behaviour brought about by technology, and invest in IT accordingly.

Are government CIOs being pushed to the sidelines?

bryang
The latest step in the government's IT leadership shake-up was announced last night, with former deputy government CIO Liam Maxwell appointed as the first government chief technology officer (CTO). As revealed by Computer Weekly, Maxwell now reports to Government Digital Service director Mike Bracken.

What's in a job title, you might ask? In this case, potentially quite a lot.

Computer Weekly has seen a copy of the internal Cabinet Office announcement of Maxwell's new role, sent out by Bracken's boss, government chief operating officer Stephen Kelly.

His memo is significant for what it does not say, perhaps more than what it does.

Here is what Kelly said:

"I am delighted to announce that Liam Maxwell has been appointed Chief Technology Officer (CTO) for HM Government. Liam will lead the CTO council and executive in government, leading the design and definition of future technology platforms. He will play a lead role in defining the future technologies required for digitally delivered public services, and the successful IT Reform group will become part of our digital team.

"Through the Government Digital Service, we have created a network of digital leaders who are senior operational leaders in their departments, able to identify and drive through the creation of new digital services. The CTO will support these operational leaders to appreciate the technologies required to achieve the change we need across the government's digital estate and synchronise business process improvement.

"Each department already has its own CTO or someone very close to that role. Together with the CIO and Digital Leaders, they form a powerful combination to achieve our transition to Digital by Default. This combination of technical and business vision to support operational leaders driving change. Liam will report to Mike Bracken who is leading the Government's digital transformation agenda."
The memo has all the key phrases you would expect of any organisation looking to transform its approach to IT: "...leading the design and definition of future technology platforms"; "...a network of digital leaders who are senior operational leaders in their departments"; "This combination of technical and business vision".

Can you see what's missing from this statement of reforming intent? The phrase "chief information officer". You know, the highly paid CIOs who sit in every government department, leading the delivery of IT to support public services. Yes, them. Barely mentioned, just once, in passing.

Insiders say Stephen Kelly sees no need for the role of CIO. He has already reduced the responsibilities of government CIO Andy Nelson - and the obvious implication of his latest pronouncement is that he intends to do the same to the other departmental CIOs too.

The vision that is emerging is one where Mike Bracken and his team lead the transformation of digitally delivered public services, while Maxwell and his network of digital leaders / CTOs change the way the supporting technology is delivered.

That means lots of cloud, lots of external hosting, plenty of open source, more standardisation, fewer big multi-year development projects like the floundering Universal Credit.

The IT team at the Cabinet Office looks like it increasingly wants to centralise IT decision-making across Whitehall, with departmental CTOs in place to manage the actual technology and the relationship with their suppliers.

Such centralisation eliminates the traditional CIO role completely - at least in the way it has been implemented across government.

One senior source told me recently they expect there will be no such thing as a government CIO in future, nor will there be the need for one. Kelly's announcement can be seen as a first step down that road.


Government IT reform - now the real battle starts

bryang
Despite the understandable scepticism that was aired about the government's IT reform plans when the coalition came to power, it is worth reflecting on some of the achievements that have been delivered this year.

Three initiatives in particular - the new digital strategy, the open standards policy and the G-Cloud project - promise to end several of the most frequently cited criticisms of past government IT: too costly and inefficient; not enough SMEs; the restrictions of European procurement rules; no open source; no open standards; too much lock-in to big incumbents; and not enough use of modern software development techniques.

It would be wrong to underestimate how far the IT reformers in Whitehall have come to reach this point. But the real battles lie ahead.

It is all very well putting policies in place, but getting the big government departments to follow them is another matter. Insiders say there is a growing backlash from departmental IT teams to the dictates from the Cabinet Office, whose minister Francis Maude is very much the political driving force behind the much-needed IT reforms.

There is a triumvirate of influence across Whitehall. In one corner, Mike Bracken and his Government Digital Service, which is leading the "digital by default" strategy to deliver public services online. In another is deputy government CIO Liam Maxwell, who has been leading the changes in strategy and supplier relationships, and driving the effort to overhaul the way departments purchase and use technology. And finally, there are the departmental CIOs, who are, ultimately, the budget holders for where the £16bn of annual IT spending goes.

The tensions across that triangle are growing. And now, a major reorganisation at the Cabinet Office has provoked fears among reformers that the dissenting voices of those who want to slow change, or protect the status quo, are gaining influence.

Government CIO Andy Nelson is apparently being sidelined - which is not a great surprise. He is liked and respected across the Whitehall IT community, but even when he was appointed it was clear that the role was increasingly that of a figurehead to lead the IT profession, rather than deliver IT change. Nelson also has some big challenges in his department, the Ministry of Justice, to focus on.

If the shake-up proceeds in the way that documents seen by Computer Weekly suggest, Liam Maxwell will now report to Mike Bracken. Previously theirs was a peer-to-peer relationship - so how will its dynamics and priorities change?

It also seems that Maxwell's responsibility for liaising with departments is being taken away, and given to Cabinet Office executive director Lesley Hume. Insiders say Hume is well liked, and is perhaps seen as more of a diplomat than Maxwell and therefore more acceptable to departmental CIOs. But those insiders also suggest she has yet to prove her ability to deliver the IT changes needed.

More significantly, Hume will report to chief procurement officer Bill Crothers - labelled "old school" by former G-Cloud director Chris Chant, who still takes a keen interest in the reforms he championed before he retired to a French country idyll.

Rumour has it that Crothers is seen as more favourable to dealing with larger suppliers.

There is an immediate conundrum in having the driver of change - Liam Maxwell - reporting to a different person from Hume, who has responsibility for making that change happen in departments. And what does it say that IT strategy delivery is to be a function of procurement, not of the CIO or of the leaders of reform?

Furthermore, it appears that the G-Cloud team may also find itself reporting to Crothers - a move that insiders fear could seriously undermine the project and its role in the wider reform agenda.

The shake-up has been instigated by the new Cabinet Office chief operating officer, Stephen Kelly, formerly the CEO of software firm Micro Focus. Kelly is believed to see little point in the role of a central CIO across government, but his views on the wider IT reforms have yet to be publicly aired.

Among Whitehall CIOs there are rumours of frustration with the mixed messages they are receiving. An apocryphal story doing the rounds is of the CIO of a major department being encouraged by one part of the Cabinet Office to avoid certain IT suppliers, and by another part to avoid certain others, until he had such a long list of suppliers he was told not to use that he exploded with a "Who the f*** am I meant to buy from then?"

The window of opportunity for the reformers is getting shorter. If fundamental, irreversible change is not in place by the next election in 2015, it could be stymied for years. If minister Maude gets reshuffled, the political willpower for reform could disappear - although some insiders say he wants to retain responsibility for the IT spending controls he put in place, even if he ends up in another ministerial post.

Internal civil service politics is brewing, and it is no longer clear where the real power and influence in driving IT reform lies. Insiders fear the impending shake-up in the IT leadership organisation threatens to confuse and dilute ownership of the reforms.

Next year is going to be critical - can the policies put in place to enable change be turned into actual changes in IT strategy, delivery, purchasing and supplier relations? The words of one insider put into context the need for the reformers to win the battle:

"I just try to keep reminding myself why I'm trying to help do this: the poor old taxpayer working their socks off in some boring job in the belief their taxes are going into something important like education or health - not bloated and poorly designed IT."

Does outsourcing work? Does anyone actually know?

bryang
So, does outsourcing work? Can anybody actually provide a definitive answer, even after two decades of often knee-jerk outsourcing in organisations of all sizes across the UK?

The biggest focus on this question is underway in the local government sector. Councils in Cornwall, Somerset and Barnet in London have been embroiled in controversy over plans to hand huge swathes of public services to the private sector to run. Councillors' heads have rolled, as some feel that outsourcing plans have been forced through without sufficient public debate.

Outsourcing in local authorities is becoming an increasingly religious issue - you either believe in it no matter what, or you don't.

Even existing, long-term initiatives such as those in Birmingham and Liverpool have come under scrutiny. Questions of value for money and quality of local services are at the heart of the issue. In Barnet, services outsourced to Capita will be provided from locations as far away from the borough as Belfast and Carlisle, with 70% of the affected staff likely to lose their jobs.

The finances behind Barnet provide an insight into how these deals work.

Barnet council has made its case for outsourcing on the basis of saving 18% of its back-office costs. Yet the actual savings that will be made are about 45% - the difference is retained by Capita as profit.
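
To make that split concrete, here is a back-of-the-envelope sketch - the £100m baseline is a hypothetical figure of my own, since the council's case was stated only in percentages:

```python
# Hypothetical baseline: Barnet's figures are percentages only,
# so assume a notional £100m annual back-office cost.
back_office_cost = 100_000_000  # pounds, illustrative

council_saving = back_office_cost * 18 // 100   # the 18% saving the council cited
actual_saving = back_office_cost * 45 // 100    # the ~45% reduction actually made

supplier_margin = actual_saving - council_saving  # retained by the supplier as profit

print(f"Saving passed to council:  £{council_saving:,}")   # £18,000,000
print(f"Total cost reduction:      £{actual_saving:,}")    # £45,000,000
print(f"Retained by the supplier:  £{supplier_margin:,}")  # £27,000,000
```

On that notional £100m of costs, the council sees £18m of the saving while the supplier keeps £27m - 60% of the total reduction.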

Compare that with the approach at Sunderland City Council. Here, the local authority has established its IT systems as a profit-making initiative, offering cloud and datacentre services to local businesses, as well as other public sector bodies such as NHS and fire services.

Taking another approach, Whitehall is pushing the mutual model - where public sector, private sector and employees share ownership of the organisation. We're likely to see several IT shared services operations from government being mutualised in the near future.

But which is best? Outsourcing cuts costs but history suggests that it ultimately benefits the supplier most. Selling your own services is high risk, but if done well provides a valuable service to the community. Mutualisation sounds good, but is unproven.

Recent examples suggest the decision comes down to the willpower of the individuals - typically politically motivated - driving the change. Is that really the way to make such fundamental, often irreversible decisions?

The reality is that austerity is often used as an excuse to drive ideological changes without enough consideration of the real alternatives. Public sector IT chiefs need to bring clarity and an evidence-based approach to balance the whims of their political masters.

The economics of the cloud in action

bryang
Compare and contrast the following excerpts from Computer Weekly news stories in the past few days:

27 November: Google is expanding its infrastructure-as-a-service cloud computing platform into Europe and cutting the price of its cloud-based storage by 20%. From 1 December, the price for up to 1TB of storage will be reduced by $0.025 per gigabyte per month, to $0.095. This is $0.025 cheaper than Amazon's equivalent S3 storage cloud.

29 November: In response to Google's announcement, Amazon is cutting S3 pricing by about 25% across the board, effective from 1 December. Customers will be able to store data in the cloud for about $0.09 per gigabyte.

28 November: Windows users can expect a 15% increase in the cost of licensing key Microsoft products as the software giant raises its prices. From December 1, Microsoft will increase the price of per-user licensing of several products including Exchange, Lync, Windows Server and Terminal Services.
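
Taken at face value, the per-gigabyte storage prices above translate into monthly bills like this - a rough sketch only, since real cloud invoices also include request and data-transfer charges:

```python
# List prices per GB per month, from the excerpts above
google_price = 0.095  # Google Cloud Storage, from 1 December
amazon_price = 0.09   # Amazon S3, approximate post-cut price

storage_gb = 1024  # one terabyte

print(f"Google: ${storage_gb * google_price:.2f}/month")  # $97.28/month
print(f"Amazon: ${storage_gb * amazon_price:.2f}/month")  # $92.16/month
```

A difference of about $5 a month per terabyte - small for any one customer, but exactly the kind of margin a commodity price war is fought over.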

Here we have the co-existence of two eras of information technology - and the question it poses is how long that co-existence can continue.

The commoditisation of computing power and storage through the cloud is spreading at pace, bringing increased competition, driving down prices and improving service levels. It's happening now - Amazon says that at the latest count S3 stores 1.3 trillion objects and receives 800,000 requests per second. That's not a niche or specialist business, it's mainstream computing.

Microsoft, meanwhile, unilaterally puts up prices, because it can. Earlier this year, the software giant increased its volume licensing prices to corporate customers by up to 25%. Why? Because it can. But the firm also cut the costs of its cloud-based Azure and Office 365 products. Why? Because it had to. That's what competition and commoditisation forced Microsoft to do.

It won't take long for IT leaders to notice the difference in competitive behaviour between old-era suppliers such as Microsoft and new-era suppliers such as Amazon and Google. It might take them a lot longer to ditch the old and switch to the new, but with economics like those described above, it is inevitable.

Microsoft isn't the only one - SAP and Oracle have both come under fire from customers over their pricing policies.

The difference is that as these established suppliers grow, so they put their prices up further or make their terms more complex. In the cloud, the more you grow, the greater the economies of scale, and the more you can pass those on to customers in the form of lower unit prices.

Forget the issues of functionality, of branding or legacy that protect the user base of large suppliers. They are, ultimately, transient. It is simple economics that demonstrates the scale of the changes that the cloud is going to bring, at increasing pace, to IT spending priorities in the next five years.


HP vs Autonomy - a high-stakes game with a big loser

bryang
Congratulations to HP for creating a global flood of words about its increasingly controversial purchase of British software company Autonomy last year.

The accusations of "questionable accounting and business practices" that led to HP making an $8.8bn write-off on the $12bn acquisition have shocked and surprised most people who followed the rise of Autonomy into the FTSE 100 as the UK's leading home-grown software firm.

Already there is something of a transatlantic divide emerging, with US media reports adopting a tone of, "How can this foreign company do such an awful thing to our HP?", while the UK press goes for, "How can they have done this to our Autonomy?"

But there seem to be several vital issues on which there is general agreement, and they have to be the starting point for any attempt to work out just what is going on.

  • HP overpaid for Autonomy in the first place.
  • HP has a terrible track record of losing value from major acquisitions - look at EDS and Palm, for example.
  • HP was on track to lose huge value in Autonomy anyway, with large numbers of former staff quitting the company since the purchase.
  • HP's announcement of the accusations completely overshadowed yet more dreadful financial results, with areas of its business haemorrhaging revenue - personal systems sales down 14%; servers, storage and networking down 9%; services down 6%; even the cash cow of printers was down 5%. Ironically, software revenue - which includes Autonomy - was the only bright spot, growing 14%.
  • Plenty of people are coming out of the woodwork now to say Autonomy has long been viewed as using creative accounting techniques - but it was a quoted company, regularly audited and given a clean bill of health, and it went through thorough due diligence by HP's advisors, which failed to uncover any discrepancies of the kind now alleged.
  • HP, despite the gravity of its accusations and the size of the financial hit, did not tell Autonomy founder Mike Lynch, nor any public authorities, about what it had discovered before issuing a press release detailing its claims. That in itself seems an extraordinary decision.
The question everyone is asking is how alleged accounting malpractice on a scale that warrants an $8.8bn write-down could have gone unnoticed by auditors and financial advisors, both before HP's acquisition and in the year since.

If the allegations are proved - and it will probably take years of detailed and costly court cases before we ever get to that point - it would raise issues around the whole corporate governance regime in the UK, were a FTSE-listed company to have got away with such practices for so long. The worst case is if Autonomy turns out to be a British Enron or Worldcom.

If the allegations are overblown - well, HP's reputation would fall off a cliff. If CEO Meg Whitman and HP's board harboured any doubts about the accusations, or are proved to have totally over-reacted to some questionable but not illegal accounting, it could be the final straw for a company in trouble whose investors are already concerned over its continued decline.

Someone is going to be a very big loser when this high-stakes game is played out.


Tax transparency will be a growing issue for government IT suppliers

bryang
Considering the amount of taxpayers' cash that goes to IT suppliers - some £16bn per year at latest estimates - it was inevitable that the big multinational systems integrators that dominate Whitehall IT would come under scrutiny over their tax payments.

The vultures have been circling over big US firms such as Google, Amazon and Starbucks, which have been questioned by MPs over the ludicrously small amounts of corporation tax they pay in the UK, despite the millions (or even billions) of pounds they make from UK customers.

Private Eye has now pointed the finger at IT suppliers, by naming IBM, HP, Fujitsu and Capgemini as likely candidates paying less tax than perhaps they should.

And Computer Weekly contributor Mark Ballard has shown how CSC paid just 0.5% tax on £1.5bn income earned from a 10-year outsourcing deal with Royal Mail signed in 2003.
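
The scale of that figure is easier to see as a sum - a back-of-the-envelope illustration of my own, applying the reported rate to the whole income figure:

```python
income = 1_500_000_000       # £1.5bn earned over the 10-year Royal Mail deal
effective_tax_rate = 0.005   # the 0.5% rate reported

tax_paid = income * effective_tax_rate
print(f"Implied tax paid: £{tax_paid:,.0f}")  # £7,500,000
```

Roughly £7.5m of tax on £1.5bn of income - which is the kind of discrepancy the scrutiny is about.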

It's important to point out that none of this is illegal - it's all clever accounting, using tricks like paying internal charges to an overseas head office to reduce profits and minimise corporation tax.

But when it comes to supplying government, you get into a whole new moral maze.

Should government be giving taxpayers' money to IT suppliers who pay very little of that back in tax to the state coffers? And is it, therefore, reasonable to consider the amount of tax paid by a supplier relative to the value of their contracts when assessing their bids for new IT projects?

As disclosed by Computer Weekly, government officials are considering whether or not it is feasible to force suppliers to include details of their tax payments relative to UK revenues when bidding for contracts.

Even if there are legal barriers that make such a move impossible, there is enough data in Whitehall to expose any discrepancies in suppliers' contributions to UK plc.

The Cabinet Office is trying to make the whole IT procurement process more transparent - wouldn't it be good to have some sort of online resource that compares major suppliers' tax payments (sourced through HM Revenue & Customs' IT systems) with the value of the contracts they hold?

You can imagine the Government Digital Service putting together a website in pretty rapid time to display that data - wouldn't it make for some interesting conversations between government CIOs and their suppliers if they could call up a nice graph of that information during contract negotiations?
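The comparison such a website would make is simple arithmetic. The sketch below is a hypothetical illustration only - the supplier names and figures are invented (Supplier A mirrors the 0.5% effective rate reported in the CSC/Royal Mail case), and a real version would draw on HMRC and Cabinet Office spend data:

```python
# Hypothetical illustration: compare each supplier's UK corporation tax
# paid against the value of its government contracts over a period.
# All names and figures are invented for the sketch.
suppliers = [
    # (name, contract value £m, corporation tax paid £m)
    ("Supplier A", 1500, 7.5),
    ("Supplier B", 800, 90.0),
]

def tax_ratio(contract_value_m, tax_paid_m):
    """Tax paid as a percentage of contract value."""
    return 100.0 * tax_paid_m / contract_value_m

for name, value, tax in suppliers:
    print(f"{name}: {tax_ratio(value, tax):.2f}% of contract value paid in tax")
```

On these invented numbers, Supplier A's ratio works out at 0.5% and Supplier B's at 11.25% - exactly the kind of contrast a CIO might want on screen during a contract negotiation.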

There's bound to be more to come on this issue - the big systems integrators would be well advised if they voluntarily adopted a policy of transparency over their real value to UK plc.


Universal Credit - the last failure of the old IT regime, or a boost for the new?

There are two things that often signal a major government IT project on the brink of disaster. First, streams of leaks appear suggesting little problems here and rather bigger problems there; and second, the relevant Whitehall press office tells journalists it is not going to provide a "running commentary" on progress. That's been the story of Universal Credit (UC) for the past couple of weeks.

First, Computer Weekly revealed that several senior executives running the programme had departed. Officially, this is because the project has moved to a "different phase" and requires "different skill sets".

Talk privately to almost anyone with knowledge of the project and it's clear that was not the reason.

Since then, more stories have circulated about UC running late and being over budget. There seems little doubt it is a priority project to sort out, not least because the government's flagship welfare reform hangs on its success.

The new Department for Work and Pensions (DWP) CIO, Philip Langsdale, has a track record of turning round problem projects, having transformed the technology at Heathrow after the fiasco of Terminal 5's opening. He was even appointed by BAA to put together the emergency plan to cope with extreme weather after the airport was shut for days due to snow two winters ago.

Langsdale is conducting a thorough overhaul of UC - both internally and with the major suppliers involved, whose relationship with the DWP had become far too cosy.

While critics will, justifiably, shake their heads and cite yet another looming government IT disaster, the situation with UC has wider ramifications for the future of Whitehall IT.

There are plenty of good things going on in central government IT. The digital strategy, led by Government Digital Service director Mike Bracken, is using open-source technology and agile development to deliver high-quality projects at low cost and on short timescales. The new open standards policy should reduce costs and supplier lock-in. And the G-Cloud is opening up the market to small suppliers that would previously have been unable to compete against the big systems integrators (SIs) that have historically dominated in central government.

Those three initiatives alone have already slain a number of sacred cows belonging to the old school who said you couldn't do things that way.

It would be a terrible shame if a disastrous UC project overshadowed those achievements and created the impression that nothing has changed.

However, it's also important to point out that UC is perhaps the last of the mega-projects that was set up under the old rules. It contracted the usual SI suspects into huge, multimillion-pound, multi-year deals. It bypassed the spending controls put in place by the Cabinet Office - rumour has it that work and pensions secretary Iain Duncan Smith personally authorised the avoidance of those rules.

So a very public failure for UC might also be an opportunity for some "I told you so's" behind closed doors in the corridors of Whitehall.

Universal Credit is not yet a failed IT project, and Langsdale has time to turn it round. But if it does fail, it could become the final failure of a failed IT regime that is being consigned to the past.

We need more visionary IT leaders

When IT leaders explain their reasons for not moving to a new technology, it is almost always due to immediate or short-term perceived problems.

Should you move to the cloud? Security is a concern. Should you overhaul your information security? Ooh, too risky, what if we get hit by cyber attackers? Should you let staff use their own devices? That's not policy.

Whatever happened to long-term, visionary planning?

Think about what the IT world will be like in 10 years, perhaps even in five. Cloud will be ubiquitous in how organisations use IT. Emerging technologies like micro-virtualisation or advanced encryption will make data more secure than ever. Bring your own device (BYOD) will be the default for user access to corporate systems.

We all know these things, even if only as a gut feel. Looking further ahead makes the apparent problems faced today seem smaller, and puts them into better perspective. Working toward that vision gives a motivation to find solutions, and a reason to make them happen.

But still people stick to their complex legacy systems, and see only the hurdles in front of them, not the finishing line that can be reached with innovation and vision. Banks stick to old mainframes running 20-year-old software, because there's no three-year business case to justify replacing them. Royal Bank of Scotland can tell you what happens then.

IT leadership in large companies has become an increasingly short-term role. We often see CIOs coming in for a three to five year period to manage a programme of technology change. There seem to be fewer CIOs who offer a 10-year vision of the way that IT will change their business or industry sector, and set about working towards that vision.

There are, of course, exceptions. Barclays, for example, is encouraging staff to collaborate on ideas and support innovative projects. "It is okay to fail. Scar tissue is a good thing," says the bank's European CIO Anthony Watson.

IT suppliers don't help either, focused as they are on the next deal for their latest product. What's more, according to a survey this week, 39% of IT staff lose at least one working day per week tackling IT problems and chasing suppliers, while 69% have dropped suppliers in the past year because of customer service shortfalls.

The opportunities for business to grow through innovative uses of technology are greater than ever, and that needs IT leaders with long-term vision, working with suppliers who can share and support that goal. Surely, that's not too difficult to achieve, is it?

The government's open standards policy is bold, important and very carefully written

The government has finally released its policy for open standards in IT - after an often controversial consultation process - and it will surprise and delight many observers who expected a meek compromise to the lobbying power of the software industry.

The new "Open Standards Principles" are bold, important, and clearly written with a smart lawyer and a clever linguist looking over the shoulder of the author. They are mandatory immediately for all central government IT purchases. And they will worry the big incumbent suppliers who have been used to a long-term lock-in to their products.

Here are a few of the boldest highlights from the policy document:

"The product choice made by a government body must not force other users, delivery partners or government bodies, to buy the same product."

This is hugely significant. Think how many times you have bought the same software because the cost of integrating any alternative would negate the potential benefits of a product that may be better than what you currently have. Want to buy a Linux server, but can't because it doesn't integrate with your Windows Server software? That's no longer allowed.

"Government bodies should expose application programming interfaces (APIs) for its services to enable value-added services to be built on government information and data."

The idea of "government as a platform" has been discussed for some time, but formally encouraging APIs for private sector firms to develop services around public data takes that a big step forward. It is, admittedly, a very Tory policy - using IT to push public services out of the centre and to involve businesses in their delivery - but from a technology perspective it is forward thinking and far reaching in its implications.

"For government bodies that are identified as not adhering to the Open Standards Principles (e.g. through transparent reporting or spend controls cases), Cabinet Office may consider lowering the threshold for IT spend controls until alignment is demonstrated."

In other words, if you don't comply, the Cabinet Office will make you justify every minor piece of IT spend until you get so fed up with the scrutiny that you go along with the policy.

"As part of examining the total cost of ownership of a government IT solution, the costs of exit for a component should be estimated at the start of implementation. As unlocking costs are identified, these must be associated with the incumbent supplier/system and not be associated with cost of new IT projects."

This is really clever. Proprietary software suppliers have long been protected by the prohibitive cost of moving away from their products. That cost is always considered as part of the business case for a new project - so the price of moving from incumbent supplier X to new supplier Y becomes part of the cost of moving to supplier Y, and usually that makes such a move unaffordable. Instead, the government will include the cost of moving away from supplier X as part of the initial business case for buying from them in the first place. In other words, the cost of moving away from an incumbent supplier is added to the purchase price for that supplier. That's smart - and scary for a lot of supplier Xs.
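The effect of reallocating exit costs is easiest to see with numbers. This is a hypothetical worked example - the bid and migration figures are invented purely to show how the same three numbers produce opposite winners under the old and new rules:

```python
# Hypothetical illustration of the exit-cost rule in the Open Standards
# Principles. All figures are invented.
incumbent_bid = 10.0   # £m, incumbent supplier X's price for the new project
challenger_bid = 8.0   # £m, challenger supplier Y's price
exit_cost = 3.0        # £m, cost of migrating away from X's products

# Old rules: the cost of leaving X is loaded onto the business case for Y,
# so the cheaper challenger looks more expensive.
old_x = incumbent_bid                  # 10.0
old_y = challenger_bid + exit_cost     # 11.0 -> X "wins"

# New rules: the exit cost is attributed to the incumbent at the outset,
# as part of the total cost of ownership of choosing X in the first place.
new_x = incumbent_bid + exit_cost      # 13.0
new_y = challenger_bid                 # 8.0 -> Y wins

print(f"Old rules: X £{old_x}m vs Y £{old_y}m")
print(f"New rules: X £{new_x}m vs Y £{new_y}m")
```

Same three numbers, opposite outcome - which is precisely why this clause should worry a lot of supplier Xs.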

"Other than for reasons of national security, essential government extensions or variations to open standards for software interoperability, data or document formats must themselves be made available under an open licence and be publicly shared to enable others to build upon them."

In other words - if a supplier has to spend money to integrate their product to an existing, policy compliant, open-standards-based product, not only are they banned from passing that cost on to the government buyer, but they must also offer the result of their integration work for free to other government buyers. That's going to hurt a proprietary software provider.

"Rights essential to implementation of the standard, and for interfacing with other implementations which have adopted that same standard, are licensed on a royalty free basis that is compatible with both open source and proprietary licensed solutions."

This is perhaps the most controversial policy of all. Much of the heated debate in the consultation process came from well-funded lobbying by the big software suppliers (you know who they are) to convince government that software provided under a Frand (fair, reasonable and non-discriminatory) licensing policy could be defined as an open standard.

In effect, their argument was that even if you have to pay a royalty to a third-party for their ownership of all or part of a standard, then it was still an open standard. The Cabinet Office disagreed. They disagreed before the consultation, and went through the consultation so they could justify to somebody else's lawyers that they had properly considered the arguments over Frand. And then they came up with a policy that disagreed anyway.

This clause is the essence of one of the primary goals of the whole open standards policy - to create a level playing field between proprietary and open-source software.

It's worth stating that royalty-free does not mean that you have to use free software to be considered open. But it does mean that to be considered open, you cannot include a standard that costs money. It means that proprietary, patented standards are not considered open if government is paying a royalty (whether overtly or hidden in pricing) for their use.

Open source software, by the nature of its licensing, cannot include royalty-encumbered standards. If an open-source supplier wanted to interoperate with a proprietary product using a Frand-based standard, it would be prohibited from doing so by the fact that open source has to be publicly shared, and you're not allowed to publicly share software that carries a royalty. So the open-source product is prevented from integrating with the proprietary, Frand-based "open" product.

So the rejection of Frand is going to worry the big incumbent suppliers more than anything.

Of course, a policy has to be implemented, and the big test will be the first time that Microsoft, or Oracle, or whoever, loses a deal because they are not considered to be open standards compliant, and then we'll see how much they want to fight the decision through the courts, and whether the government has the stomach for that fight.

But by producing the open standards principles that it has, the Cabinet Office has signalled that's a fight it is willing to take on. Seconds out...


Windows 8 - the eminently sensible product launch of the week

Despite all the hype about iPad minis, there's no doubt the most significant product launch of the week for IT managers is the release of Windows 8.

Unlike Apple's "top secret but with lots of leaks" approach to new products, we know pretty much everything there is to know about Windows 8 already.

We know that it's going to confuse a lot of users with its attempt to combine a new touch-oriented interface with the conventional method.

We know it's going to present challenges for software developers who now have the ARM-based Windows RT version for tablets to consider too.

And we know it is going to be scrutinised more than ever as to whether it is good enough to attract consumer sales away from the iPad as the bring-your-own-device (BYOD) trend continues in the workplace.

The big pitch

The big pitch for Windows 8 is promising, and hugely significant for Microsoft. This operating system is all about extending the Windows ecosystem to encompass everything from smartphones to tablets to PCs and on up to servers.

There's no denying that is going to be an attractive option for IT managers looking to eliminate the complexity of multiple architectures and multiple environments. It's an eminently sensible thing to do.

But while "eminently sensible" has been the underlying theme of Microsoft's long track record in corporate IT, more is expected of our technology these days.

Does it matter to IT decision-makers that Microsoft is no longer cool? When it comes to return on investment and making the business case, almost certainly not.

But when it comes to meeting the heightened (if not always realistic) expectations of users and business executives, perhaps it does.

Arguably, it would be a more significant move if Microsoft were to release Office for iOS and Android, and also an Active Directory client for those tablet environments. Increasingly, the corporate commitment to Microsoft comes less from the Windows PC, and more from Windows Server, Active Directory, SharePoint and other back-office infrastructure software - and from the integration between Office and those products. What serious alternatives to Windows Server are there, for example?

As browser-based applications become increasingly the norm for businesses, the user operating system is less important. Even Office is becoming increasingly cloud-based.

Microsoft is betting its future on the fact that users might not be enthused about Windows on a tablet, but at least will be comfortable with it if driven in that direction by the IT department.

Skimming the Surface

There are big question marks over the depth of Microsoft's commitment to its Surface tablet - for a start, it's only available online or in a Microsoft store (when was the last time you saw one of those?). There's no mass-market distribution strategy for the product, and Microsoft has to tread a fine line between encouraging its hardware partners to make Windows 8 tablets of their own and pushing Surface. Surely if there were an aggressive plan to make Surface a real rival to the iPad, Microsoft would be exploiting its established retail distribution channels for the Xbox?

You can see that companies that look to tablets for line-of-business applications are going to tend towards Surface or its Windows 8 alternatives. In areas like healthcare, education, field sales and others, the integration with the corporate environment is a winner, and users would not expect such devices to be used in their personal life.

But for the general purpose user computing environment, where employees increasingly want to have a dual-use machine that is also their personal device, they are still going to be more likely to opt for Apple or Android, and less likely to be swayed by an IT manager telling them it's better for the company if they choose Windows.

And of course, let's not forget that most corporate IT is still moving from XP to Windows 7 - a major migration to 8 is some time away, and by that time the proliferation of other tablets in the workplace will have grown.

The other things

Microsoft will continue its bluster and hyperbole about Surface - "There's nothing like Microsoft Surface on the market today," CEO Steve Ballmer told the BBC. "The other things have a purpose but they're nothing like the Surface."

The "other things" seem to be doing OK though, and at best Windows 8 will be the number three tablet and smartphone operating system in a market that already has two dominant players.

Apple and Android are not going to kill Microsoft. The Redmond giant isn't going away from the corporate market. But over the next few years, it's less likely that Windows 8, for all its cross-platform standardisation, is going to be the core of Microsoft's success in business. Microsoft's future area of dominance is increasingly going to be back-office based.


Leaders need to be part of the network not apart from it

Last week in Rio de Janeiro saw a gathering of the "informed elite".

That phrase is in quote marks because it's not one that I would ordinarily use, nor in fact is it one that even the person who used it wants to use, as I shall explain.

But it's the impending demise of that phrase that, for me, illuminated the 2012 Global Economic Symposium (GES).

Some background: GES is an annual conference that brings together politicians, bureaucrats, business leaders, academics and experts from around the world to discuss the economic and social challenges we face - and solutions to those challenges. It's a bit like the World Economic Forum in Davos, but without Bono.

I've been fortunate to be involved for the past three events, moderating the annual panel debate on technology related issues, which this year was titled, "Optimising information use through the internet and social media". You can read my submission to the panel here.

At previous events, I've been a little disappointed at the lack of discussion on IT and internet topics, given the fundamental role technology will play in tackling many of the economic problems of the world.

But this year was different - not in that there were more sessions focused on tech, but in the way the web and social media was infused through so many other discussions.

It's clear that among the leaders in Rio last week, there is a mix of fear and excitement around the way that technology is changing our lives, and in particular changing our relationship as citizens and consumers with institutions used to thriving on the mantra "knowledge is power".

In one session I listened to, on "Trust and citizenship in the age of engagement", there was one panellist, a US academic, who was thoroughly dismissive of social media and the power of the crowd, and unable to see how its disruptive nature is threatening the traditional hierarchies that generations have been conditioned to respect. Unable, or unwilling perhaps.

That panel was chaired by Robert Phillips, CEO of the European arm of Edelman, the global PR firm, and also a published author on citizenship and the rise of what he calls digital democracy.

You can read Robert's thoughts on GES2012 here (apologies if it seems like a bit of a love in when you notice he also mentions me in his article, but our views were pretty similar).

It was Robert who used the phrase "informed elites" - derived from the Edelman Trust Barometer, an annual survey into attitudes about the state of trust in business, government, NGOs and media across 25 countries. The study classifies us into "informed elites" and "the masses" - a distinction that made me cringe, and it turns out one that Phillips would also prefer to see removed from the study.

What became clear from GES - and the debate during the session I hosted characterised this too - is the degree to which the internet, social media, and access to information is reversing that distinction and terrifying plenty of those former "informed elites" in the process.

We are moving from a world based on vertical hierarchies and hierarchical controls, to one based on networks.

In the hierarchical world, the informed elites exist because - self evidently - they are better informed than the masses below them in the social, cultural and business hierarchy.

Edelman's research shows that trust in those elites - in governments and businesses in particular - has declined rapidly in recent years, while the fastest growing trusted group is "people like me". No surprise there you might say, given the banking crisis and the age of austerity.

But the forces behind the move from hierarchies to networks go deeper than that.

Thanks to the web and social media, we now have for the first time the informed masses. More information is readily, easily and cheaply available to all, and the more we learn about the elite, the less we like them.

Governments are responding with promises of transparency and openness, but such a policy is shown up for its discreet attempts to act as a new form of hierarchical control when we find out about the things that were meant to remain secret - such as MPs' expense claims, and bankers' bonuses.

The informed elite cannot cope with the idea of the informed masses. If knowledge is power, shared knowledge means shared power. That's a threat.

History proves this. The closest analogy to today's dramatic growth in data volumes and the associated redistribution of information is the aftermath of the invention of the printing press by Gutenberg in 1440. That innovation saw 20 million books printed in the subsequent 50 years, and contributed greatly to massive societal changes that redefined relationships between church, state and individuals. We're seeing an even greater redistribution of information now - and hence, of knowledge, and ultimately of power.

It's no coincidence that social media has grown to such prominence at a time when global hierarchies consolidate upwards into global or regional interest groups. For example, as European countries move towards greater integration within the EU, so their voters feel further removed from governments they already distrust, and become even more alienated by a further level of political hierarchy over which they have seemingly even less influence.

In such circumstances, what do you do? You turn to people like you, who feel the same way. And how do you find them? These days, through social media and the web - tools that have never before been able to bring together people by their millions who feel, instinctively, that something is not right with the hierarchies that control their lives.

There was much talk at GES of the role of leadership, and the failures of leadership that led to the global economic crisis.

I felt such talk missed an important point.

Even with informed masses, you still need good leadership. But today, great leaders are part of the network, not apart from the network. That is a lesson that many of today's leaders have yet to learn.

The elite no longer have a monopoly on the solutions for the world's economic challenges.

Their control and status are being threatened by the bottom-up movement that the internet and social media represent. Top-down hierarchies are being swept away - and that is a challenge that many, perhaps even most, of the so-called informed elites are struggling to come to terms with.

I'm a great believer in the power of technology to improve the world and tackle many of its current problems. But we face a period of time when our leaders see technology as a threat, and the "masses" see it as an opportunity to reshape their relationships with institutions and hierarchies. There will be difficult times ahead as those forces balance out, inevitably, in favour of a networked society.

I expect the nature of the leaders attending GES in, say, 2020, will be very different from those attending last week. I put my trust in the network to make the transition - but for some among the elite, the process will be a very painful one.


Gary McKinnon's legacy

So Gary McKinnon stays free - for now.

At Computer Weekly, we've followed the self-confessed hacker's story for the 10 years it's taken to fight his extradition to the US. Along the way we've seen his cause become an international issue, with prime ministers and presidents discussing his case.

It's for others to discuss the legalities of home secretary Theresa May's decision to rescind the extradition order. It's also for others to debate the approach of the US prosecutors who once told McKinnon they wanted him "to fry".

But it's also important to remember that McKinnon is guilty - something he has never denied. It is right that he should face up to the law, and the consequences of his actions - but it's equally right that those consequences should be proportionate to the crime.

The 10 years since McKinnon came to public attention have put his hacking into a very different context. Governments now do far worse on a regular basis than Gary did. It is easy to argue that the Pentagon would have remained vulnerable to cyber attacks from people with much worse intent were it not for the holes that McKinnon exposed. That's no excuse though, of course.

It's probably not difficult to argue also that the Pentagon and other intelligence services learned a lot about what they can get away with in cross-border cyber intrusion.

And as we've seen in the last year or two, there are plenty of new young Garys out there, operating under the guise of hacktivist groups like Anonymous, still exploiting the security flaws that are all too inherent in modern technology.

The immediate priority for McKinnon is his health. Then he has to face whatever the legal authorities in the UK decide to do about his case. But he also now has an opportunity to put something back into the IT security community, and it would be great to see him put his unwanted notoriety to good use in highlighting to others just how vulnerable our IT systems remain. Nick Leeson, the man who brought down Barings Bank, does a similar thing these days about banking fraud.

But beyond the legalities, and the human cost of McKinnon's 10-year ordeal, there is a lesson for everyone in IT. Information security is now a matter of national security, let alone of business success. Gary McKinnon's case went well beyond the hacking crime he committed. IT security goes well beyond the technicalities of hackers and viruses. If Gary's legacy is to put the topic onto the boardroom agenda of every organisation, then he can, at least, be thanked for that.


When will the traditional model of software licensing die?

When will the traditional model of software licensing die? Surely it is only a matter of time.

This week alone, we've seen Oracle accused of costing customers millions of pounds through its "non-transparent and complicated licensing policy". Legal experts say that firms are being forced to pay huge penalties when using Oracle software in a virtualised environment - a set-up that is increasingly common.

Meanwhile, SAP users in the UK have called for more transparency and better value for money in the supplier's licensing policies. Here too, a user group survey suggests that 95% of SAP customers believe the firm's software licensing is too complicated.

Licensing has always been a point of contention for IT leaders. Computer Weekly was writing about how big software firms would rip off customers through opaque terms and conditions as long ago as the 1990s.

But now, with the growth of the cloud and software as a service, those old models of upfront licence fees with annual maintenance payments look increasingly outdated and inappropriate for a modern IT environment.

One of the biggest culprits is Microsoft, but even the world's biggest software provider is showing early hints of realising the world has changed. CEO Steve Ballmer told shareholders this week that the firm is undergoing a "fundamental shift", and now sees itself as a "devices and services" company.

The implication between the lines, surely, must be that you don't sell devices and services on the same basis as a conventional software licence. It would be a huge change, with enormous financial implications, were Microsoft to move to a subscription-based model more in tune with the pay-as-you-go ethos of the cloud. It clearly won't happen overnight - but if that is the direction of travel, then perhaps even Microsoft is starting to get it right.

Of course, supporters of open source will be smiling smugly at the travails of licence-encumbered users. It is no coincidence that most of the new cloud services - Amazon, Google, Facebook etc - are built on open-source principles. Imagine the cost of an Oracle database licence for Facebook's server infrastructure.

There's a bright future for software companies - their products will power the world and our lives. But there are gloomy prospects for any firms that insist on hanging on to outdated software licensing practices from a different age.

GES2012: Optimising information use through the internet and social media

This year's Global Economic Symposium (GES) takes place in Rio de Janeiro next week, on 16-17 October.

GES is an annual event that invites stakeholders from around the world to discuss global issues, challenges and problems. It's a great coming together of politicians, business leaders, NGOs, and experts across a huge range of topics - like a smaller version of the World Economic Forum in Davos.

I've been fortunate to be invited for the past three events to moderate the session on technology, which this year is titled "Optimising information use through the internet and social media" - a subject that would take far longer than the allotted 90 minutes to discuss in its entirety.

Panellists are invited to submit their views on the topic at hand in advance, and I thought I'd publish my submission here - I'd be interested in your opinions too:

Encourage the innovators, and allow consumers of information to make the choice

There is a fundamental dilemma to consider when looking for solutions to the challenge of "optimising information use through the internet and social media": the internet and social media have grown as "bottom-up" technologies often used by people to bypass traditional social, cultural and establishment controls, yet the control of most of the information that has value to those people remains in the hands of businesses and governments.

So, when considering how to "optimise" information use, one has to look at who wants to use that information, and who has that information.

Typically the "user" is you and me - individual citizens going about their daily lives, requiring information owned by the state, by business, by educational institutions, healthcare organisations, and in the world of social media, by each other.

In most of those organisations, that information has either a commercial value, or more likely a power value - information being power in so many areas of life. Those organisations are loath to provide that information if it means loss of commercial benefit, control, or competitive advantage.

So we are increasingly faced with two opposing sides of this challenge.

On one, the digital King Canutes, who see the internet and social media as a threat to their established models, and will use whatever means - often resorting to the law or legislation - to protect their incumbency. The music and film industries are classic examples of sectors reluctant to change and reflect the new demands of their customers, resorting to lobbying government to impose overly restrictive controls on intellectual property.

On the other side, are the organisations that see information access as an opportunity - for example, to empower people to take better care of their health; to encourage innovation through access to government data; and to boost education and business through open access to research.

The question, therefore, is should governments and other representative bodies use their influence - through legislation, regulation or other measures - to lean one way or the other? The evidence to date is that their efforts to do so are cumbersome, slow, and often inappropriate.

It is the latter organisations - those with open, transparent, accountable attitudes to information use - that are gathering popularity and success. The restrictive, laggard organisations are struggling financially, culturally and often even democratically - witness the Arab Spring as an example of that.

My proposal would be to avoid fresh legislation wherever possible, to allow openness to flourish, and ultimately to allow citizens to choose whether they want to deal with those organisations that restrict or those that encourage information use. Such an organic process is already underway, and the best response of business and governments would be to allow it to continue to its natural conclusion.



Why CIOs need to Like the Facebook way of IT

bryang
This is a PowerPoint slide I've used a lot lately when I've been asked to give talks on the state of IT or "the next big thing", and it usually seems to raise a smile from the audience:

[Slide: the Facebook Like button]

The aim of the slide is to demonstrate to the audience - typically IT leaders - how the game has changed, and that user expectations of corporate IT are now set by consumer-oriented services such as Facebook.

I was interested, therefore, to read this article by Walter Adamson, a US "social media strategist":

Lessons for Enterprise from Facebook and its 1 billion users

I'd endorse the point it's making wholeheartedly.

If your CEO or CFO (or perhaps increasingly these days, your marketing director) questions the value and cost of IT at your organisation, it's likely they will draw comparisons with the likes of Facebook. One billion users, almost no downtime, with an approximate running cost of $1 per active user? Why can't we do that?
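That comparison is easy to sketch as back-of-envelope arithmetic. The Facebook figures are those quoted above; the enterprise figures are purely illustrative assumptions for the sake of the contrast:

```python
# Rough cost-per-active-user comparison. Facebook figures are the
# approximate ones quoted above; the "enterprise" numbers are
# illustrative assumptions, not real data.

facebook_users = 1_000_000_000        # ~1 billion active users
facebook_annual_cost = 1_000_000_000  # ~$1 per active user per year

# Hypothetical in-house enterprise system (assumed figures)
enterprise_users = 50_000
enterprise_annual_cost = 25_000_000   # $25m a year, assumed

def cost_per_user(total_cost, users):
    """Annual running cost per active user, in dollars."""
    return total_cost / users

print(cost_per_user(facebook_annual_cost, facebook_users))      # 1.0
print(cost_per_user(enterprise_annual_cost, enterprise_users))  # 500.0
```

On those (admittedly crude) assumptions, the in-house system costs a few hundred times more per user - which is exactly the gap the CEO will ask about.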

And of course, we in IT know that behind that Like button there is an enormous complexity of software, networks and datacentres. We know that the challenges of delivering hundreds of diverse business applications are different. But not so many IT leaders want to admit that there are lessons to be learned from the likes (pun unintended) of Facebook in how to run corporate IT.

The problem is in the IT supply chain, which over the years has evolved to a point where the basic commodity that passes through that supply chain is complexity.

I don't mean it's a complex supply chain - retail firms have a complex supply chain but it works perfectly well.

A supply chain, at its simplest, is something that takes a basic commodity and passes it through a series of processes and organisations until enough value has been added to sell it to an end consumer for a profit.

In IT, that basic commodity is complexity.

When the corporate IT buyer talks to their IT supplier, and explains their business need, the supplier's response is, "Ooh, that's complicated." And so the buyer ends up purchasing lots of complicated products, and because they are complicated they have to buy lots of complicated services too, to make the complicated products work.

Unfortunately, IT departments just take that commodity, and add process to it. So, when the business manager comes to the IT manager and explains his or her need, the IT manager says, "Ooh, that's complicated."

And so it goes. For most of the history of corporate IT, that poor business user has had no choice but to accept the complexity. Complexity keeps IT professionals in a job. Jargon and acronyms formalise the complexity and reinforce the processes that add complexity to complexity through the supply chain. And of course, complexity keeps the profit margins of IT suppliers high.

But today, the business user can reject the complexity, and ignore the IT department that says "no", and go to the cloud, or point to the websites that are based on open source and commodity cloud services that support millions of users with no front-end of complexity.

As Adamson puts it in his article: "I am sure that there are all sorts of nuances, all sorts of reasons why 'we're different', 'we're more complicated' etc etc but for all intents and purposes Facebook stays up globally for 1 billion customers while very expensive dedicated enterprise systems in a single country serving a minute fraction of users don't stay up."

These are the questions that IT leaders will have to face - and if you aren't already, you will soon.

IT leaders know that the biggest challenge they face in their IT delivery is complexity - that history of poorly integrated, incompatible legacy systems that somehow grew organically over 10 or 20 years. It's the reason the banks have such a huge IT problem and why online banking and ATM systems seem to crash so much more often - constantly bolting more complexity onto an already over-complex, sprawling legacy infrastructure.

IT leaders want to simplify things, to make upgrades easier and technology more agile and flexible.

But it's going to take a bold CIO to say: we need to start from scratch, we need to learn from how a Facebook or a Twitter or a Google has created enormous IT infrastructures from nothing. CIOs need to acknowledge that corporate IT departments have neither the only nor the best way to approach large-scale IT systems, and that there are lessons they can learn.

And those CIOs need to say the same thing to their suppliers, who need to learn that CIOs are no longer delivering IT systems, they are delivering business outcomes, and an annual licence fee with monthly maintenance payment does not deliver a business outcome.

I believe that most CIOs get this. I don't believe many traditional IT suppliers do. And there are problems ahead for corporate IT until that dichotomy is resolved.

HP's latest turnaround plan makes everyone dizzy

bryang
HP has announced so many "turnaround plans" in the last few years that employees and customers must be getting dizzy.

As well as the revolving door to the CEO's office, we've seen HP declare itself a services business, a software business, a hardware business, not a PC business, and then a PC business again.

There was much derision on social media about the supplier's latest claim to be "the world leader in cloud infrastructure" with $4bn of cloud revenue - which may come as a surprise to the likes of Amazon, Google, IBM and others. Just because you sell a bunch of servers that run in a cloud-type environment doesn't make you a cloud provider.

HP has form for taking the latest trend of the day and slapping it as a label on existing products. A few years ago, when green IT was the marketing vogue, the company claimed it had always been the greenest IT supplier in the world and had in fact been a green IT company since the 1960s.

Manhandling the latest buzzword onto your product range is a far cry from the days when HP's slogan was "Invent". Can anyone name the last, genuine invention or innovation that HP has brought to the market ahead of its rivals, without an acquisition?

People who deal with HP on a regular basis remark on the constant reorganisations, the changing faces among their contacts, and the general state of perpetual unrest. One former HP executive recently told me the strategy was all over the place, and that many employees felt that selling off the low-margin PC business would have been the best possible move.

HP has made a series of major acquisitions and somehow reduced the value of all of them - from Compaq, to EDS, and now Autonomy. The integration of those firms has left the company disorganised and unfocused, reliant on its sheer size and market presence to remain on the shortlists of IT leaders, bolstered by the profits from being the world's biggest ink provider.

Current CEO Meg Whitman has at least chosen the route of brutal honesty in her latest turnaround plan, admitting it will take several years to achieve, and that revenue will decline in key areas of the business. Wall Street responded to such honesty by driving the share price down to its lowest point in 10 years.

In May this year I wrote the following, after HP announced 27,000 job cuts: "HP's 2011 annual revenue was $127bn - but its current market value is less than $42bn. If all HP's customers got together, they could buy the company three times over for what they spend with it in a year."

Five months later, HP's market value is now just $28bn. Just as well those HP customers ignored me - today they could buy the company four times over and still share a profit of $15bn.
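The arithmetic behind that claim is simple enough to check, using the figures quoted in the article:

```python
# HP's numbers as quoted above, in billions of US dollars.
annual_revenue = 127  # 2011 annual revenue ($bn)
market_value = 28     # approximate market capitalisation now ($bn)

# How many times over could customers buy the company with one
# year's spend, and how much would be left as "profit"?
times_over = annual_revenue // market_value
left_over = annual_revenue - times_over * market_value

print(times_over)  # 4
print(left_over)   # 15  ($bn remaining)
```

In May, with a $42bn market value, the same sum gave three times over; the fall to $28bn is what moves it to four.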

Is a "multiyear turnaround" as described by Whitman even a possibility in the current economic climate, and with huge disruptive change going on through the cloud and consumerisation? Can any company have four years' grace to be allowed to make what will have to be fundamental changes to its operations, culture and products?

If HP is to invent anything, it needs to invent a new future for itself.

Cloud is an economic opportunity for the UK - does the government know?

bryang
It has long been the case that governments and regulators struggle to keep up with the pace of change in technology. With the growth of cloud computing - the first genuinely globalised, commoditised, off-the-shelf IT service - that challenge threatens to become a serious problem for the European and UK IT sectors.

A new survey of cloud computing users in 50 countries has highlighted the failure of government regulations to keep up with developments as the number one factor eroding confidence in the cloud.

At software-as-a-service provider Salesforce.com's recent Dreamforce user conference, UK customers were critical of the supplier's failure to build a promised datacentre in Europe. For many organisations affected by the European Union's strict data protection laws, that's a showstopper. But should it be?

The European Commission (EC) has at least recognised the problem. This week it announced a new strategy to work with counterparts in the US and Japan to prevent data protection and differing international legal frameworks from hindering a market that the EC estimates could generate €900bn and an additional 3.8 million jobs across the EU by 2020.

But the wheels of such pan-governmental processes turn slowly.

While there are diligent firms that will shun the cloud without guarantees on the physical location of their data, you can bet there are plenty who barely give it a thought, and have sensitive information parked on an Amazon storage system somewhere in the US, because it's the cheapest and easiest place to put it.

The real problem here is that the likes of Salesforce.com and others have so far only considered building a cloud datacentre in the UK or Europe because they have been forced to. We all know that architecturally, with cloud computing it doesn't matter where the physical servers or storage are located.

But the issue for the UK/EU is that we're not seen as a natural location for the big cloud suppliers. The question we should be asking is why? If there are 3.8 million jobs that could be created in an economically depressed Europe, we need rapid incentives for cloud providers to set up here, not new regulations to help when they are forced to.

The financial services sector came to the UK because our location straddling US and Asian time zones in a loosely regulated market made London a highly attractive location. Cloud is a massive economic opportunity for the UK, for the very same reasons. The US and Asia will be happy to spend a few years talking to the EU, while they press ahead and make the most of that opportunity.

We need the government to provide reasons to bring the cloud to the UK now. Tax incentives and planning regulations, for example, that make it easy to build cloud datacentres - no taxpayers' money needed, lots of inward investment created, plus jobs, private sector investment in telecoms infrastructure, a boost for the green energy industry, and the whole cloud ecosystem looking at the UK as a place to be. A few million here and there for a bit of innovation is hardly enough.

Sadly, there is little or no evidence that the government is having such a conversation - or is even aware of the opportunity.

What the cloud means (and it isn't cutting costs)

bryang
What are the first words that come to mind when you think of the cloud? Low cost, perhaps. Pay as you go, maybe. Probably also: not secure, too complex, regulatory headaches, lacking standards, no interoperability.

Ask two CIOs to explain what the cloud means to them, and you'll almost certainly receive two different answers. Ask them what are their concerns about the cloud, and they will be in greater agreement.

Sadly, the cloud is currently going the way of so many great technologies in IT - from initial curiosity, to ensuing enthusiasm, to widespread confusion in the light of a welter of meaningless acronyms and a lack of best practice. IaaS? PaaS? SaaS? You can find a cloud supplier putting "as a service" on the end of pretty much every technology available, to the extent it all becomes rather meaningless.

And now that we have a few early adopters, we even hear that some find moving to the cloud doesn't necessarily save the money they had been promised.

Perhaps part of the problem is that the cloud means so many different things to so many different people. So here is a definition that we think will become increasingly significant: cloud is no more, and no less, than the commoditisation of processing power.

In the same way as the internet commoditised networking, and that smartphones, tablets and laptops are commoditising end-user devices, the cloud is doing the same to servers, storage and the provision of software applications on top.

Commoditisation does, typically, mean lower unit costs. But its significance goes much further - it creates a platform for innovation. Once the big, costly, processor, storage, server stuff is reduced to the level of Amazon offering 1Gb of archive disk space per month for just one US cent, it opens up access to computer power previously inaccessible to start-ups and innovators, and really shakes up markets.

There is enormous competitive advantage to be gained by organisations that understand how to make the most of the opportunities for innovation that the cloud presents. If all you want from the cloud is to save money, then you can do that too, if you get it right. But the potential benefits are so much more.
