Tax transparency will be a growing issue for government IT suppliers

Considering the amount of taxpayers' cash that goes to IT suppliers - some £16bn per year at latest estimates - it was inevitable that the big multinational systems integrators that dominate Whitehall IT would come under scrutiny over their tax payments.

The vultures have been circling over big US firms like Google, Amazon and Starbucks, who have been questioned by MPs over the ludicrously small amounts of corporation tax they pay in the UK, despite the millions (or even billions) of pounds they make from UK customers.

Private Eye has now pointed the finger at IT suppliers, by naming IBM, HP, Fujitsu and Capgemini as likely candidates paying less tax than perhaps they should.

And Computer Weekly contributor Mark Ballard has shown how CSC paid just 0.5% tax on £1.5bn income earned from a 10-year outsourcing deal with Royal Mail signed in 2003.

It's important to point out that none of this is illegal - it's all clever accounting, using tricks like paying internal charges to an overseas head office to reduce profits and minimise corporation tax.

But when it comes to supplying government, you get into a whole new moral maze.

Should government be giving taxpayers' money to IT suppliers who pay very little of that back in tax to the state coffers? And is it, therefore, reasonable to consider the amount of tax paid by a supplier relative to the value of their contracts when assessing their bids for new IT projects?

As disclosed by Computer Weekly, government officials are considering whether or not it is feasible to force suppliers to include details of their tax payments relative to UK revenues when bidding for contracts.

Even if there are legal barriers that make such a move impossible, there is enough data in Whitehall to expose any discrepancies in suppliers' contributions to UK plc.

The Cabinet Office is trying to make the whole IT procurement process more transparent - wouldn't it be good to have some sort of online resource that compares major suppliers' tax payments (sourced through HM Revenue & Customs' IT systems) with the value of the contracts they have?

You can imagine the Government Digital Service putting together a website in pretty rapid time to display that data - wouldn't it make for some interesting conversations between government CIOs and their suppliers if they could call up a nice graph of that information during contract negotiations?

There's bound to be more to come on this issue - the big systems integrators would be well advised if they voluntarily adopted a policy of transparency over their real value to UK plc.


Universal Credit - the last failure of the old IT regime, or a boost for the new?

There are two things that often signal a major government IT project on the brink of disaster. First, streams of leaks appear suggesting little problems here and rather bigger problems there; and second, the relevant Whitehall press office tells journalists it is not going to provide a "running commentary" on progress. That's been the story of Universal Credit (UC) for the past couple of weeks.

First, Computer Weekly revealed that several senior executives running the programme had departed. Officially, this is because the project has moved to a "different phase" and requires "different skill sets".

Talk privately to almost anyone with knowledge of the project and it's clear that was not the reason.

Since then, more stories have circulated about UC running late and being over budget. There seems little doubt it is a priority project to sort out, not least because the government's flagship welfare reform hangs on its success.

The new Department for Work and Pensions (DWP) CIO, Philip Langsdale, has a track record of turning round problem projects, having transformed the technology at Heathrow after the fiasco of Terminal 5's opening. He was even appointed by BAA to put together the emergency plan to cope with extreme weather after the airport was shut for days due to snow two winters ago.

Langsdale is conducting a thorough overhaul of UC - both internally and with the major suppliers involved, whose relationship with the DWP had become far too cosy.

While critics will, justifiably, shake their heads and cite yet another looming government IT disaster, the situation with UC has wider ramifications for the future of Whitehall IT.

There are a lot of good things going on in central government IT. The digital strategy led by Government Digital Service director Mike Bracken is delivering impressive results, using open-source technology and agile development to produce high-quality projects at low cost in short timescales. The new open standards policy should reduce costs and supplier lock-in. And the G-Cloud is opening up the market to small suppliers that would previously have been unable to compete against the big systems integrators (SIs) that have historically dominated central government.

Those three initiatives alone have already slain a number of sacred cows belonging to the old school who said you couldn't do things that way.

It would be a terrible shame if a disastrous UC project overshadowed those achievements and created the impression that nothing has changed.

However, it's also important to point out that UC is perhaps the last of the mega-projects that was set up under the old rules. It contracted the usual SI suspects into huge, multimillion-pound, multi-year deals. It bypassed the spending controls put in place by the Cabinet Office - rumour has it that work and pensions secretary Iain Duncan Smith personally authorised the avoidance of those rules.

So a very public failure for UC might also be an opportunity for some "I told you so's" behind closed doors in the corridors of Whitehall.

Universal Credit is not yet a failed IT project, and Langsdale has time to turn it round. But if it does fail, it could become the final failure of a failed IT regime that is being consigned to the past.

We need more visionary IT leaders

When IT leaders explain their reasons for not moving to a new technology, it is almost always due to immediate or short-term perceived problems.

Should you move to the cloud? Security is a concern. Should you overhaul your information security? Ooh, too risky, what if we get hit by cyber attackers? Should you let staff use their own devices? That's not policy.

Whatever happened to long-term, visionary planning?

Think about what the IT world will be like in 10 years, perhaps even in five. Cloud will be ubiquitous in how organisations use IT. Emerging technologies like micro-virtualisation or advanced encryption will make data more secure than ever. Bring your own device (BYOD) will be the default for user access to corporate systems.

We all know these things, even if only as a gut feel. Looking further ahead makes the apparent problems faced today seem smaller, and puts them into better perspective. Working toward that vision gives a motivation to find solutions, and a reason to make them happen.

But still people stick to their complex legacy systems, and see only the hurdles in front of them, not the finishing line that can be reached with innovation and vision. Banks stick to old mainframes running 20-year-old software, because there's no three-year business case to justify replacing them. Royal Bank of Scotland can tell you what happens then.

IT leadership in large companies has become an increasingly short-term role. We often see CIOs coming in for a three to five year period to manage a programme of technology change. There seem to be fewer CIOs who offer a 10-year vision of the way that IT will change their business or industry sector, and set about working towards that vision.

There are, of course, exceptions. Barclays, for example, is encouraging staff to collaborate on ideas and support innovative projects. "It is okay to fail. Scar tissue is a good thing," says the bank's European CIO Anthony Watson.

IT suppliers don't help either, focused as they are on the next deal for their latest product. What's more, according to a survey this week, 39% of IT staff lose at least one working day per week on tackling IT problems and chasing suppliers, while 69% have dropped suppliers in the past year because of customer service shortfalls.

The opportunities for business to grow through innovative uses of technology are greater than ever, and that needs IT leaders with long-term vision, working with suppliers who can share and support that goal. Surely, that's not too difficult to achieve, is it?

The government's open standards policy is bold, important and very carefully written

The government has finally released its policy for open standards in IT - after an often controversial consultation process - and it will surprise and delight many observers who expected a meek compromise to the lobbying power of the software industry.

The new "Open Standards Principles" are bold, important, and clearly written with a smart lawyer and a clever linguist looking over the shoulder of the author. They are mandatory immediately for all central government IT purchases. And they will worry the big incumbent suppliers who have been used to a long-term lock-in to their products.

Here are a few of the boldest highlights from the policy document:

"The product choice made by a government body must not force other users, delivery partners or government bodies, to buy the same product."

This is hugely significant. Think how many times you have bought the same software because the cost of integrating any alternative would outweigh the potential benefits of a product that may be better than what you currently have. Want to buy a Linux server, but can't because it doesn't integrate with your Windows Server software? That's no longer allowed.

"Government bodies should expose application programming interfaces (APIs) for its services to enable value-added services to be built on government information and data."

The idea of "government as a platform" has been discussed for some time, but formally encouraging APIs for private sector firms to develop services around public data takes that a big step forward. It is, admittedly, a very Tory policy - using IT to push public services out of the centre and to involve businesses in their delivery - but from a technology perspective it is forward thinking and far reaching in its implications.
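To make that idea a little more concrete, here's a minimal sketch of the kind of value-added service the policy envisages - a third party pulling open government data via an API and building something on top. The endpoint, dataset and field names here are entirely hypothetical, invented for illustration only:

```python
import json
import urllib.request

# Hypothetical open-data endpoint - a real government API would publish its
# own URL, schema and authentication requirements.
API_URL = "https://api.example.gov.uk/transport/roadworks?postcode=SW1A"

def busiest_roads(url: str, top_n: int = 3) -> list[tuple[str, int]]:
    """Fetch roadworks records and rank roads by the number of open works."""
    with urllib.request.urlopen(url) as response:
        records = json.load(response)  # assume the API returns a JSON list of records
    counts: dict[str, int] = {}
    for record in records:
        road = record.get("road_name", "unknown")
        counts[road] = counts.get(road, 0) + 1
    return sorted(counts.items(), key=lambda item: item[1], reverse=True)[:top_n]

if __name__ == "__main__":
    for road, open_works in busiest_roads(API_URL):
        print(f"{road}: {open_works} open roadworks")
```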

"For government bodies that are identified as not adhering to the Open Standards Principles (e.g. through transparent reporting or spend controls cases), Cabinet Office may consider lowering the threshold for IT spend controls until alignment is demonstrated."

In other words, if you don't comply, the Cabinet Office will make you justify every minor piece of IT spend until you get so fed up with the scrutiny that you go along with the policy.

"As part of examining the total cost of ownership of a government IT solution, the costs of exit for a component should be estimated at the start of implementation. As unlocking costs are identified, these must be associated with the incumbent supplier/system and not be associated with cost of new IT projects."

This is really clever. Proprietary software suppliers have long been protected by the prohibitive cost of moving away from their products. That cost is always considered as part of the business case for a new project - so the price of moving from incumbent supplier X to new supplier Y becomes part of the cost of moving to supplier Y, and usually that makes such a move unaffordable. Instead, the government will include the cost of moving away from supplier X as part of the initial business case for buying from them in the first place. In other words, the cost of moving away from an incumbent supplier is added to the purchase price for that supplier. That's smart - and scary for a lot of supplier Xs.
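A rough illustration of why this matters - the figures below are invented, purely to show how the same two bids compare when exit costs are charged to the new project (the old approach) versus attributed to the incumbent (the new principle):

```python
# Illustrative figures only - not from any real procurement.
incumbent_renewal_price = 8_000_000   # what supplier X asks to renew its product
challenger_price = 6_000_000          # supplier Y's bid for the same requirement
exit_cost = 3_000_000                 # the cost of migrating away from supplier X

# Old approach: the exit cost is loaded onto the new project's business case,
# so the challenger looks more expensive and the incumbent wins.
old_rules = {
    "incumbent": incumbent_renewal_price,
    "challenger": challenger_price + exit_cost,
}

# Open Standards Principles approach: exit costs are estimated up front and
# attributed to the incumbent, so the two bids are compared on a level footing.
new_rules = {
    "incumbent": incumbent_renewal_price + exit_cost,
    "challenger": challenger_price,
}

for label, bids in (("old rules", old_rules), ("new rules", new_rules)):
    winner = min(bids, key=bids.get)
    print(f"{label}: {bids} -> cheapest bid: {winner}")
```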

"Other than for reasons of national security, essential government extensions or variations to open standards for software interoperability, data or document formats must themselves be made available under an open licence and be publicly shared to enable others to build upon them."

In other words - if a supplier has to spend money to integrate their product to an existing, policy compliant, open-standards-based product, not only are they banned from passing that cost on to the government buyer, but they must also offer the result of their integration work for free to other government buyers. That's going to hurt a proprietary software provider.

"Rights essential to implementation of the standard, and for interfacing with other implementations which have adopted that same standard, are licensed on a royalty free basis that is compatible with both open source and proprietary licensed solutions."

This is perhaps the most controversial policy of all. Much of the heated debate in the consultation process came from well-funded lobbying by the big software suppliers (you know who they are) to convince government that software provided under a Frand (fair, reasonable and non-discriminatory) licensing policy could be defined as an open standard.

In effect, their argument was that even if you have to pay a royalty to a third party for their ownership of all or part of a standard, it was still an open standard. The Cabinet Office disagreed. It disagreed before the consultation, and went through the consultation so it could justify to somebody else's lawyers that it had properly considered the arguments over Frand. And then it came up with a policy that disagreed anyway.

This clause is the essence of one of the primary goals of the whole open standards policy - to create a level playing field between proprietary and open-source software.

It's worth stating that royalty-free does not mean you have to use free software to be considered open. But it does mean that a standard cannot be considered open if it costs money to use. It means that proprietary, patented standards are not considered open if government is paying a royalty (whether overtly or hidden in pricing) for their use.

Open source software, by the nature of its licensing, cannot include royalty-encumbered standards. If an open-source supplier wanted to interoperate with a proprietary product using a Frand-based standard, it would be prohibited from doing so by the fact that open source has to be publicly shared, and you're not allowed to publicly share software that carries a royalty. So the open-source product is prevented from integrating with the proprietary, Frand-based "open" product.

So the rejection of Frand is going to worry the big incumbent suppliers more than anything.

Of course, a policy has to be implemented, and the big test will come the first time that Microsoft, or Oracle, or whoever, loses a deal because they are not considered to be open standards compliant. Then we'll see how much they want to fight the decision through the courts - and whether the government has the stomach for that fight.

But by producing the open standards principles that it has, the Cabinet Office has signalled that's a fight it is willing to take on. Seconds out...


Windows 8 - the eminently sensible product launch of the week

Despite all the hype about iPad minis, there's no doubt the most significant product launch of the week for IT managers is the release of Windows 8.

Unlike Apple's "top secret but with lots of leaks" approach to new products, we know pretty much everything there is to know about Windows 8 already.

We know that it's going to confuse a lot of users with its attempt to combine a new touch-oriented interface with the conventional desktop.

We know it's going to present challenges for software developers who now have the ARM-based Windows RT version for tablets to consider too.

And we know it is going to be scrutinised more than ever as to whether it is good enough to attract consumer sales away from the iPad as the bring-your-own-device (BYOD) trend continues in the workplace.

The big pitch

The big pitch for Windows 8 is promising, and hugely significant for Microsoft. This operating system is all about extending the Windows ecosystem to encompass everything from smartphones to tablets to PCs and on up to servers.

There's no denying that is going to be an attractive option for IT managers looking to eliminate the complexity of multiple architectures and multiple environments. It's an eminently sensible thing to do.

But while "eminently sensible" has been the underlying theme of Microsoft's long track record in corporate IT, more is expected of our technology these days.

Does it matter to IT decision-makers that Microsoft is no longer cool? When it comes to return on investment and making the business case, almost certainly not.

But when it comes to meeting the heightened (if not always realistic) expectations of users and business executives, perhaps it does.

Arguably, it would be a more significant move if Microsoft were to release Office for iOS and Android, and also an Active Directory client for those tablet environments. Increasingly, the corporate commitment to Microsoft comes less from the Windows PC, and more from Windows Server, Active Directory, SharePoint and other back-office infrastructure software - and from the integration between Office and those products. What serious alternatives to Windows Server are there, for example?

As browser-based applications become increasingly the norm for businesses, the user operating system is less important. Even Office is becoming increasingly cloud-based.

Microsoft is betting its future on the assumption that users might not be enthused about Windows on a tablet, but will at least be comfortable with it if driven in that direction by the IT department.

Skimming the Surface

There are big question marks over the depth of Microsoft's commitment to its Surface tablet - for a start, it's only available online or in a Microsoft store (when was the last time you saw one of those?). There's no mass-market distribution strategy for the product, and Microsoft has to tread a fine line between encouraging its hardware partners to make Windows 8 tablets of their own and pushing Surface. Surely, if there were an aggressive plan to make Surface a real rival to the iPad, Microsoft would be exploiting its established retail distribution channels for the Xbox?

You can see that companies that look to tablets for line-of-business applications are going to tend towards Surface or its Windows 8 alternatives. In areas like healthcare, education, field sales and others, the integration with the corporate environment is a winner, and users would not expect such devices to be used in their personal life.

But for the general purpose user computing environment, where employees increasingly want to have a dual-use machine that is also their personal device, they are still going to be more likely to opt for Apple or Android, and less likely to be swayed by an IT manager telling them it's better for the company if they choose Windows.

And of course, let's not forget that most corporate IT is still moving from XP to Windows 7 - a major migration to 8 is some time away, and by that time the proliferation of other tablets in the workplace will have grown.

The other things

Microsoft will continue its bluster and hyperbole about Surface - "There's nothing like Microsoft Surface on the market today," CEO Steve Ballmer told the BBC. "The other things have a purpose but they're nothing like the Surface."

The "other things" seem to be doing OK though, and at best Windows 8 will be the number three tablet and smartphone operating system in a market that already has two dominant players.

Apple and Android are not going to kill Microsoft. The Redmond giant isn't going away from the corporate market. But over the next few years, it's less likely that Windows 8, for all its cross-platform standardisation, is going to be the core of Microsoft's success in business. Microsoft's future area of dominance is increasingly going to be back-office based.







Leaders need to be part of the network not apart from it

Last week in Rio de Janeiro saw a gathering of the "informed elite".

That phrase is in quote marks because it's not one that I would ordinarily use, nor in fact is it one that even the person who used it wants to use, as I shall explain.

But it's the impending demise of that phrase that, for me, illuminated the 2012 Global Economic Symposium (GES).

Some background: GES is an annual conference that brings together politicians, bureaucrats, business leaders, academics and experts from around the world to discuss the economic and social challenges we face - and solutions to those challenges. It's a bit like the World Economic Forum in Davos, but without Bono.

I've been fortunate to be involved for the past three events, moderating the annual panel debate on technology related issues, which this year was titled, "Optimising information use through the internet and social media". You can read my submission to the panel here.

At previous events, I've been a little disappointed at the lack of discussion on IT and internet topics, given the fundamental role technology will play in tackling many of the economic problems of the world.

But this year was different - not in that there were more sessions focused on tech, but in the way the web and social media were infused through so many other discussions.

It's clear that among the leaders in Rio last week, there is a mix of fear and excitement around the way that technology is changing our lives, and in particular changing our relationship as citizens and consumers with institutions used to thriving on the mantra "knowledge is power".

In one session I listened to, on "Trust and citizenship in the age of engagement", there was one panellist, a US academic, who was thoroughly dismissive of social media and the power of the crowd, and unable to see how its disruptive nature is threatening the traditional hierarchies that generations have been conditioned to respect. Unable, or unwilling perhaps.

That panel was chaired by Robert Phillips, CEO of the European arm of Edelman, the global PR firm, and also a published author on citizenship and the rise of what he calls digital democracy.

You can read Robert's thoughts on GES2012 here (apologies if it seems like a bit of a love-in when you notice he also mentions me in his article, but our views were pretty similar).

Robert it was who used the phrase, "informed elites" - derived from the Edelman Trust Barometer, an annual survey into attitudes about the state of trust in business, government, NGOs, and media across 25 countries. The study classifies us into "informed elites" and "the masses" - a distinction that made me cringe - and it turns out one that Phillips also would prefer to see removed from the study.

What became clear from GES - and the debate during the session I hosted characterised this too - is the degree to which the internet, social media, and access to information is reversing that distinction and terrifying plenty of those former "informed elites" in the process.

We are moving from a world based on vertical hierarchies and hierarchical controls, to one based on networks.

In the hierarchical world, the informed elites exist because - self evidently - they are better informed than the masses below them in the social, cultural and business hierarchy.

Edelman's research shows that trust in those elites - in governments and businesses in particular - has declined rapidly in recent years, while the fastest growing trusted group is "people like me". No surprise there you might say, given the banking crisis and the age of austerity.

But the forces behind the move from hierarchies to networks go deeper than that.

Thanks to the web and social media, we now have for the first time the informed masses. More information is readily, easily and cheaply available to all, and the more we learn about the elite, the less we like them.

Governments are responding with promises of transparency and openness, but such a policy is shown up for its discreet attempts to act as a new form of hierarchical control when we find out about the things that were meant to remain secret - such as MPs' expense claims and bankers' bonuses.

The informed elite cannot cope with the idea of the informed masses. If knowledge is power, shared knowledge means shared power. That's a threat.

History proves this. The closest analogy to today's dramatic growth in data volumes and the associated redistribution of information lies in the after-effects of the invention of the printing press by Gutenberg in 1440. That innovation saw 20 million books printed in the subsequent 50 years, and contributed greatly to massive societal changes that redefined relationships between church, state and individuals. We're seeing an even greater redistribution of information now - and hence, of knowledge, and ultimately of power.

It's no coincidence that social media has grown to such prominence at a time when global hierarchies consolidate upwards into global or regional interest groups. For example, as European countries move towards greater integration within the EU, so their voters feel further removed from governments they already distrust, and become even more alienated by a further level of political hierarchy over which they have seemingly even less influence.

In such circumstances, what do you do? You turn to people like you, who feel the same way. And how do you find them? These days, through social media and the web - tools that have never before been able to bring together people by their millions who feel, instinctively, that something is not right with the hierarchies that control their lives.

There was much talk at GES of the role of leadership, and the failures of leadership that led to the global economic crisis.

I felt such talk missed an important point.

Even with informed masses, you still need good leadership. But today, great leaders are part of the network, not apart from the network. That is a lesson that many of today's leaders have yet to learn.

The elite no longer have a monopoly on the solutions for the world's economic challenges.

Their control and status are being threatened by the bottom-up movement that the internet and social media represent. Top-down hierarchies are being swept away - and that is a challenge that many, perhaps even most, of the so-called informed elites are struggling to come to terms with.

I'm a great believer in the power of technology to improve the world and tackle many of its current problems. But we face a period of time when our leaders see technology as a threat, and the "masses" see it as an opportunity to reshape their relationships with institutions and hierarchies. There will be difficult times ahead as those forces balance out, inevitably, in favour of a networked society.

I expect the nature of the leaders attending GES in, say, 2020, will be very different from those attending last week. I put my trust in the network to make the transition - but for some among the elite, the process will be a very painful one.


Gary McKinnon's legacy

So Gary McKinnon stays free - for now.

At Computer Weekly, we've followed the self-confessed hacker's story for the 10 years it's taken to fight his extradition to the US. Along the way we've seen his cause become an international issue, with prime ministers and presidents discussing his case.

It's for others to discuss the legalities of home secretary Theresa May's decision to rescind the extradition order. It's also for others to debate the approach of the US prosecutors who once told McKinnon they wanted him "to fry".

But it's also important to remember that McKinnon is guilty - something he has never denied. It is right that he should face up to the law, and the consequences of his actions - but it's equally right that those consequences should be proportionate to the crime.

The 10 years since McKinnon came to public attention have put his hacking into a very different context. Governments now do far worse on a regular basis than Gary did. It is easy to argue that the Pentagon would have remained vulnerable to cyber attacks from people with much worse intent were it not for the holes that McKinnon exposed. That's no excuse, though, of course.

It's probably not difficult to argue also that the Pentagon and other intelligence services learned a lot about what they can get away with in cross-border cyber intrusion.

And as we've seen in the last year or two, there are plenty of new young Garys out there, operating under the guise of hacktivist groups like Anonymous, still exploiting the security flaws that are all too inherent in modern technology.

The immediate priority for McKinnon is his health. Then he has to face whatever the legal authorities in the UK decide to do about his case. But he also now has an opportunity to put something back into the IT security community, and it would be great to see him put his unwanted notoriety to good use in highlighting to others just how vulnerable our IT systems remain. Nick Leeson, the man who brought down Barings Bank, does a similar thing these days about banking fraud.

But beyond the legalities, and the human cost of McKinnon's 10-year ordeal, there is a lesson for everyone in IT. Information security is now a matter of national security, not just of business success. Gary McKinnon's case went well beyond the hacking crime he committed. IT security goes well beyond the technicalities of hackers and viruses. If Gary's legacy is to put the topic onto the boardroom agenda of every organisation, then he can, at least, be thanked for that.


When will the traditional model of software licensing die?


When will the traditional model of software licensing die? Surely it is only a matter of time.

This week alone, we've seen Oracle accused of costing customers millions of pounds through its "non-transparent and complicated licensing policy". Legal experts say that firms are being forced to pay huge penalties when using Oracle software in a virtualised environment - a set-up that is increasingly common.

Meanwhile, SAP users in the UK have called for more transparency and better value for money in the supplier's licensing policies. Here too, a user group survey suggests that 95% of SAP customers believe the firm's software licensing is too complicated.

Licensing has always been a point of contention for IT leaders. Computer Weekly was writing about how big software firms would rip off customers through opaque terms and conditions as long ago as the 1990s.

But now, with the growth of the cloud and software as a service, those old models of upfront licence fees with annual maintenance payments look increasingly outdated and inappropriate for a modern IT environment.

One of the biggest culprits is Microsoft, but even the world's biggest software provider is showing early hints of realising the world has changed. CEO Steve Ballmer told shareholders this week that the firm is undergoing a "fundamental shift", and now sees itself as a "devices and services company".

The implication between the lines, surely, must be that you don't sell devices and services on the same basis as a conventional software licence. It would be a huge change, with enormous financial implications, were Microsoft to move to a subscription-based model more in tune with the pay-as-you-go ethos of the cloud. It clearly won't happen overnight - but if that is the direction of travel, then perhaps even Microsoft is starting to get it right.

Of course, supporters of open source will be smiling smugly at the travails of licence-encumbered users. It is no coincidence that most of the new cloud services - Amazon, Google, Facebook etc - are built on open-source principles. Imagine the cost of an Oracle database licence for Facebook's server infrastructure.  

There's a bright future for software companies - their products will power the world and our lives. But there are gloomy prospects for any firms that insist on hanging on to outdated software licensing practices from a different age.

GES2012: Optimising information use through the internet and social media

This year's Global Economic Symposium (GES) takes place in Rio de Janeiro next week, on 16-17 October.

GES is an annual event that invites stakeholders from around the world to discuss global issues, challenges and problems. It's a great coming together of politicians, business leaders, NGOs, and experts across a huge range of topics - like a smaller version of the World Economic Forum in Davos.

I've been fortunate to be invited for the past three events to moderate the session on technology, which this year is titled "Optimising information use through the internet and social media" - a subject that would take far longer than the allotted 90 minutes to discuss in its entirety.

Panellists are invited to submit their views on the topic at hand in advance, and I thought I'd publish my submission here - I'd be interested in your opinions too:

Encourage the innovators, and allow consumers of information to make the choice

There is a fundamental dilemma to consider when looking for solutions to the challenge of "optimising information use through the internet and social media": the internet and social media have grown as "bottom-up" technologies often used by people to bypass traditional social, cultural and establishment controls, yet the control of most of the information that has value to those people remains in the hands of businesses and governments.

So, when considering how to "optimise" information use, one has to look at who wants to use that information, and who has that information.

Typically the "user" is you and me - individual citizens going about their daily lives, requiring information owned by the state, by business, by educational institutions, healthcare organisations and, in the world of social media, by each other.

In most of those organisations, that information has either a commercial value, or more likely a power value - information being power in so many areas of life. Those organisations are loath to provide that information if it means loss of commercial benefit, control, or competitive advantage.

So we are increasingly faced with two opposing sides of this challenge.

On one side are the digital King Canutes, who see the internet and social media as a threat to their established models, and will use whatever means - often resorting to the law or legislation - to protect their incumbency. The music and film industries are classic examples of sectors reluctant to change and reflect the new demands of their customers, resorting instead to lobbying government to impose overly restrictive controls on intellectual property.

On the other side are the organisations that see information access as an opportunity - for example, to empower people to take better care of their health; to encourage innovation through access to government data; and to boost education and business through open access to research.

The question, therefore, is should governments and other representative bodies use their influence - through legislation, regulation or other measures - to lean one way or the other? The evidence to date is that their efforts to do so are cumbersome, slow, and often inappropriate.

It is the latter organisations - those with open, transparent, accountable attitudes to information use - that are gathering popularity and success. The restrictive, laggard organisations are struggling financially, culturally and often even democratically - witness the Arab Spring as an example of that.

My proposal would be to avoid fresh legislation wherever possible, to allow openness to flourish, and ultimately to allow citizens to choose whether they want to deal with those organisations that restrict or those that encourage information use. Such an organic process is already underway, and the best response of business and governments would be to allow it to continue to its natural conclusion.



Why CIOs need to Like the Facebook way of IT

This is a PowerPoint slide I've used a lot lately when I've been asked to give talks on the state of IT or "the next big thing", and it usually seems to raise a smile from the audience:

[Slide: a Facebook-style "Like" button]

The aim of the slide is to demonstrate to the audience - typically IT leaders - how the game has changed, and that user expectations of corporate IT are now set by consumer-oriented services such as Facebook.

I was interested, therefore, to read this article by Walter Adamson, a US "social media strategist":

Lessons for Enterprise from Facebook and its 1 billion users

I'd endorse the point it's making wholeheartedly.

If your CEO or CFO (or perhaps increasingly these days, your marketing director) questions the value and cost of IT at your organisation, it's likely they will draw comparisons with the likes of Facebook. One billion users, almost no downtime, with an approximate running cost of $1 per active user? Why can't we do that?
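To see why that comparison stings, here's a back-of-the-envelope calculation - the Facebook figure is the $1 per active user cited above; the enterprise figures are assumptions for illustration only:

```python
# Back-of-the-envelope cost per user - illustrative numbers only.
facebook_cost_per_user = 1.0            # roughly $1 per active user, as cited above

# A hypothetical enterprise running its own line-of-business systems in-house.
enterprise_annual_it_cost = 5_000_000   # assumed annual running cost, in dollars
enterprise_users = 5_000                # assumed number of internal users

enterprise_cost_per_user = enterprise_annual_it_cost / enterprise_users
print(f"Enterprise: ${enterprise_cost_per_user:,.0f} per user per year")  # $1,000
print(f"Facebook:   ${facebook_cost_per_user:,.2f} per user per year")    # $1.00
print(f"Ratio: {enterprise_cost_per_user / facebook_cost_per_user:,.0f}x more expensive")
```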

And of course, we in IT know that behind that Like button there is an enormous complexity of software, networks and datacentres. We know that the challenges of delivering hundreds of diverse business applications are different. But not so many IT leaders want to admit that there are lessons to be learned from the likes (pun unintended) of Facebook in how to run corporate IT.

The problem is in the IT supply chain, which over the years has evolved to a point where the basic commodity that passes through that supply chain is complexity.

I don't mean it's a complex supply chain - retail firms have a complex supply chain but it works perfectly well.

A supply chain, at its simplest, is something that takes a basic commodity and passes it through a series of processes and organisations until enough value has been added to sell it to an end consumer for a profit.

In IT, that basic commodity is complexity.

When the corporate IT buyer talks to their IT supplier, and explains their business need, the supplier's response is, "Ooh, that's complicated." And so the buyer ends up purchasing lots of complicated products, and because they are complicated they have to buy lots of complicated services too, to make the complicated products work.

Unfortunately, IT departments just take that commodity, and add process to it. So, when the business manager comes to the IT manager and explains his or her need, the IT manager says, "Ooh, that's complicated."

And so it goes. For most of the history of corporate IT, that poor business user has had no choice but to accept the complexity. Complexity keeps IT professionals in a job. Jargon and acronyms formalise the complexity and reinforce the processes that add complexity to complexity through the supply chain. And of course, complexity keeps the profit margins of IT suppliers high.

But today, the business user can reject the complexity, ignore the IT department that says "no", and go to the cloud - or point to websites built on open source and commodity cloud services that support millions of users without exposing any of that complexity.

As Adamson puts it in his article: "I am sure that there are all sorts of nuances, all sorts of reasons why 'we're different', 'we're more complicated' etc etc but for all intents and purposes Facebook stays up globally for 1 billion customers while very expensive dedicated enterprise systems in a single country serving a minute fraction of users don't stay up."

These are the questions that IT leaders will have to face - and if you aren't already, you will soon.

IT leaders know that the biggest challenge they face in their IT delivery is complexity - that history of poorly integrated, incompatible legacy systems that somehow grew organically over 10 or 20 years. It's the reason the banks have such a huge IT problem and why online banking and ATM systems seem to crash so much more often - constantly bolting more complexity onto an already over-complex, sprawling legacy infrastructure.

IT leaders want to simplify things, to make upgrades easier and technology more agile and flexible.

But it's going to take a bold CIO to say: we need to start from scratch, and we need to learn from how a Facebook or a Twitter or a Google has created enormous IT infrastructures from nothing. They need to acknowledge that corporate IT departments have neither the only way nor the best way to approach large-scale IT systems, and that there are lessons they can learn.

And those CIOs need to say the same thing to their suppliers, who need to learn that CIOs are no longer delivering IT systems, they are delivering business outcomes, and an annual licence fee with monthly maintenance payment does not deliver a business outcome.

I believe that most CIOs get this. I don't believe many traditional IT suppliers do. And there are problems ahead for corporate IT until that dichotomy is resolved.






HP's latest turnaround plan makes everyone dizzy

HP has announced so many "turnaround plans" in the last few years that employees and customers must be getting dizzy.

As well as the revolving door to the CEO's office, we've seen HP declare itself a services business, a software business, a hardware business, not a PC business, and then a PC business again.

There was much derision on social media about the supplier's latest claim to be "the world leader in cloud infrastructure" with $4bn of cloud revenue - which may come as a surprise to the likes of Amazon, Google, IBM and others. Just because you sell a bunch of servers that run in a cloud-type environment doesn't make you a cloud provider.

HP has form for taking the latest trend of the day and slapping it as a label on existing products. A few years ago, when green IT was the marketing vogue, the company claimed it had always been the greenest IT supplier in the world and had in fact been a green IT company since the 1960s.

Manhandling the latest buzzword onto your product range is a far cry from the days when HP's slogan was "Invent". Can anyone name the last, genuine invention or innovation that HP has brought to the market ahead of its rivals, without an acquisition?

People who deal with HP on a regular basis remark on the constant reorganisations, the changing faces among their contacts, and the general state of perpetual unrest. One former HP executive recently told me the strategy was all over the place, and that many employees felt that selling off the low-margin PC business would have been the best possible move.

HP has made a series of major acquisitions and somehow reduced the value of all of them - from Compaq, to EDS, and now Autonomy. The integration of those firms has left the company disorganised and unfocused, reliant on its sheer size and market presence to remain on the shortlists of IT leaders, bolstered by the profits from being the world's biggest ink provider.

Current CEO Meg Whitman has at least chosen the route of brutal honesty in her latest turnaround plan, admitting it will take several years to achieve, and that revenue will decline in key areas of the business. Wall Street responded to such honesty by driving the share price down to its lowest point in 10 years.

In May this year I wrote the following, after HP announced 27,000 job cuts: "HP's 2011 annual revenue was $127bn - but its current market value is less than $42bn. If all HP's customers got together, they could buy the company three times over for what they spend with it in a year."

Five months later, HP's market value is now just $28bn. Just as well those HP customers ignored me - today they could buy the company four times over and still share a profit of $15bn.

Is a "multiyear turnaround" as described by Whitman even a possibility in the current economic climate, and with huge disruptive change going on through the cloud and consumerisation? Can any company have four years' grace to be allowed to make what will have to be fundamental changes to its operations, culture and products?

If HP is to invent anything, it needs to invent a new future for itself.

Cloud is an economic opportunity for the UK - does the government know?

It has long been the case that governments and regulators struggle to keep up with the pace of change in technology. With the growth of cloud computing - the first genuinely globalised, commoditised, off-the-shelf IT service - that challenge threatens to become a serious problem for the European and UK IT sectors.

A new survey of cloud computing users in 50 countries has highlighted the failure of government regulations to keep up with developments as the number one factor eroding confidence in the cloud.

At software-as-a-service provider Salesforce.com's recent Dreamforce user conference, UK customers were critical of the supplier's failure to build a promised datacentre in Europe. For many organisations affected by the European Union's strict data protection laws, that's a showstopper. But should it be?

The European Commission (EC) has at least recognised the problem. This week it announced a new strategy to work with counterparts in the US and Japan to prevent data protection and differing international legal frameworks from hindering a market that the EC estimates could generate €900bn and an additional 3.8 million jobs across the EU by 2020.

But the wheels of such pan-governmental processes turn slowly.

While there are diligent firms that will shun the cloud without guarantees on the physical location of their data, you can bet there are plenty who barely give it a thought, and have sensitive information parked on an Amazon storage system somewhere in the US, because it's the cheapest and easiest place to put it.

The real problem here is that the likes of Salesforce.com and others have so far only considered building a cloud datacentre in the UK or Europe because they have been forced to. We all know that architecturally, with cloud computing it doesn't matter where the physical servers or storage are located.

But the issue for the UK/EU is that we're not seen as a natural location for the big cloud suppliers. The question we should be asking is why? If there are 3.8 million jobs that could be created in an economically depressed Europe, we need rapid incentives for cloud providers to set up here, not new regulations to help when they are forced to.

The financial services sector came to the UK because our location straddling US and Asian time zones in a loosely regulated market made London a highly attractive location. Cloud is a massive economic opportunity for the UK, for the very same reasons. The US and Asia will be happy to spend a few years talking to the EU, while they press ahead and make the most of that opportunity.

We need the government to provide reasons to bring the cloud to the UK now. Tax incentives and planning regulations, for example, that make it easy to build cloud datacentres - no taxpayers' money needed, lots of inward investment created, plus jobs, private sector investment in telecoms infrastructure, a boost for the green energy industry, and the whole cloud ecosystem looking at the UK as a place to be. A few million here and there for a bit of innovation is hardly enough.

Sadly, there is little or no evidence that the government is having such a conversation - or is even aware of the opportunity.

What the cloud means (and it isn't cutting costs)

What are the first words that come to mind when you think of the cloud? Low cost, perhaps. Pay as you go, maybe. Probably also: not secure, too complex, regulatory headaches, lacking standards, no interoperability.

Ask two CIOs to explain what the cloud means to them, and you'll almost certainly receive two different answers. Ask them what are their concerns about the cloud, and they will be in greater agreement.

Sadly, the cloud is currently going the way of so many great technologies in IT - from initial curiosity, to ensuing enthusiasm, to widespread confusion in the light of a welter of meaningless acronyms and a lack of best practice. IaaS? PaaS? SaaS? You can find a cloud supplier putting "as a service" on the end of pretty much every technology available, to the extent that it all becomes rather meaningless.

And now that we have a few early adopters, we even hear that some find moving to the cloud doesn't necessarily save the money they had been promised.

Perhaps part of the problem is that the cloud means so many different things to so many different people. So here is a definition that we think will become increasingly significant: cloud is no more, and no less, than the commoditisation of processing power.

In the same way that the internet commoditised networking, and smartphones, tablets and laptops are commoditising end-user devices, the cloud is doing the same to servers, storage and the provision of software applications on top.

Commoditisation does, typically, mean lower unit costs. But its significance goes much further - it creates a platform for innovation. Once the big, costly processor, storage and server stuff is reduced to the level of Amazon offering 1GB of archive disk space for just one US cent per month, it opens up access to computing power previously inaccessible to start-ups and innovators, and really shakes up markets.
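As a back-of-the-envelope illustration of what that kind of pricing means in practice - the one-cent-per-gigabyte figure is the one cited above; the archive size is an assumption:

```python
# Rough cost of archiving data at commodity cloud prices.
price_per_gb_month = 0.01     # one US cent per GB per month, as cited above
archive_size_gb = 10_000      # assume a 10TB archive, for illustration
months = 12

annual_cost = price_per_gb_month * archive_size_gb * months
print(f"Archiving {archive_size_gb:,} GB for a year costs about ${annual_cost:,.0f}")
# Roughly $1,200 a year - a sum a start-up can afford without buying a single disk array.
```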

There is enormous competitive advantage to be gained by organisations that understand how to make the most of the opportunities for innovation that the cloud presents. If all you want from the cloud is to save money, then you can do that too, if you get it right. But the potential benefits are so much more.

Time for mobile makers to eliminate business-consumer divide

It's been one of those weeks where it seems the only thing that matters in technology is the fact that someone has produced a slightly thinner smartphone.

There are cynics who say the launch of the iPhone 5 shows why Apple needs to resort to patents to hinder its competitors.

But it's certainly true that the smartphone market has become one of minor, incremental improvements now, rather than the huge leaps that were first catalysed by the release of the original iPhone.

Wouldn't it be good, therefore, if the big mobile makers devoted more of their product development towards the needs of business?

Research and development (R&D) cash for technology products is understandably biased towards consumer products. Gone are the days when IT was created for business first, then morphed into a consumer device. But now, with the growth of consumerisation and bring your own device (BYOD) schemes, there's a gap between the capability of mobile technology and what business needs.

Every IT leader facing demands from employees to use their own devices to access corporate systems will vouch for the problems it continues to present, while no doubt quietly thinking, "If only everyone would be happy using a BlackBerry, like the old days".

The forthcoming launch of Windows 8 could prove to be another catalyst, in that Microsoft is hoping that a common operating system from phone to tablet to desktop will be the answer to the IT department's prayers.

But there's still going to be the challenge of users saying, "Sorry, I don't want a Windows Phone".

There's also the launch of BlackBerry 10, although we wait to see if that will be enough to stem manufacturer RIM's decline.

Nonetheless, there is an opportunity for Apple - or someone in the Apple ecosystem - and for Google/Android to make it easier for those users to turn to the IT team and show how easily and securely their devices can access key applications and data.

The ultimate aim for IT is to remove the distinction between business and consumer technology. Many cloud services already blur those lines. The only reason we're fretting about BYOD is because it is trying to bridge two worlds that have historically been considered diametrically opposed.

But the people using technology on a daily basis no longer see such a distinction - well, other than when consumer products are easy and fun to use, and corporate systems are complex and difficult.

IT R&D - and mobile makers in particular - would benefit themselves and their customers by eliminating that business-consumer divide.

Everybody lost in NHS IT disaster


A degree of ironic congratulation is due to the Department of Health (DoH) and Cabinet Office minister Francis Maude for finally extricating the NHS from its disastrous contract with CSC.

The supplier, which was meant to deliver patient record systems to 160 NHS trusts, has ended up with just 10 trusts to complete.

The DoH says it has saved £1bn by renegotiating the original £3bn deal with CSC, but is reluctant to disclose how much it will ultimately spend with CSC. Let's hope it's not the apparent balance of £2bn - that would give 10 trusts a £200m patient record system each, which would probably be the least value for money IT projects in global healthcare history.

The irony of the congratulation comes in light of the billions that the DoH has spent over 10 years on the National Programme for IT (NPfIT) with such a poor return. There have been success stories - the Choose & Book appointment booking system (eventually), the PACS imaging system, NHS-wide email and the N3 broadband network - but based on the major objective of a national electronic patient record system, NPfIT was an abject failure.

According to a National Audit Office report last year, £6.4bn had been spent on the programme, with a further £4.3bn earmarked. Presumably, the £1bn saving with CSC is simply a reduction in that figure.

But it's not only the NHS and the taxpayer that have lost from NPfIT.

Two of the originally contracted suppliers, Accenture and Fujitsu, quit or were thrown off the programme at significant cost to each after they found they couldn't deliver their commitments and were losing money.

BT was forced to write off £1.6bn in 2009, due mainly to its NHS contract. BT at least had an early opportunity to renegotiate its deal before Maude's austerity-led supplier renegotiations, and is probably feeling pretty smug now as it watches CSC's woes.

CSC itself wrote off $1.5bn - effectively its entire investment in NPfIT - making a substantial loss on the contract, not to mention the reputational disaster it has been for one of the world's largest IT services providers.

When the NPfIT contracts were first awarded, I was told privately by IBM and EDS - then the two biggest outsourcers in the world - that they wouldn't touch the deals with a bargepole because the risks were weighted too heavily against the suppliers.

The architect of those deals, former NHS IT director-general Richard Granger, famously said he would "hold suppliers' feet to the fire until the smell of burning flesh is overpowering." Accenture, Fujitsu, BT and CSC certainly got their feet burned, even years after Granger quit.

Granger set out to reverse the historic perception that IT suppliers had Whitehall over a barrel when negotiating contracts, but hindsight shows that his combative style pushed the balance of risk too far the other way.

Therein lies the only real lesson that can be taken from the whole humiliating history of NPfIT.

It's become a cliché, but major projects have to be a genuine partnership. That requires an intelligent buyer, with sufficient in-house skills to assess suppliers and hold them to account. It also requires suppliers who stop selling products and boxes and packaged services, and understand what it really means to deliver business outcomes. Sadly you would struggle to name a single major IT supplier who would fit that description, even today.

The government's reaction to NPfIT - and other major IT disasters - is to loosen its reliance on big systems integrators, trying to find ways to make it easier for small IT firms to do business with Whitehall. There's a long way to go with that one too, and we will be reading about mega-contracts going to global systems integrators for a while yet.

CSC, meanwhile, gave the most entertainingly positive spin on the culmination of its negotiation with the NHS, calling it "a significant milestone in our relationship with the NHS" and "a renewed commitment by the NHS and CSC to a long-term partnership". Certainly it's a significant milestone - and those 10 trusts are stuck with CSC's Lorenzo patient record system for the long term.

CSC will compete on an equal footing with rivals for every other NHS trust as the NPfIT is dismantled and devolved to local IT purchasing decisions. On that basis alone, the only way is up for CSC.

Everybody lost in the NHS IT debacle. We can only hope everybody learned from the experience.

When software becomes a utility, everything changes - and it will

bryang | 1 Comment
| More

It's a challenge faced so far only by the most ultra-successful software companies, but a major turning point comes when a product becomes a utility.

It doesn't happen often, but there's a big difference when a piece of software goes from something you use to compete against your rivals, to something that the market is widely reliant upon.

Oracle is learning this lesson right now, after its acquisition of Sun Microsystems in 2009 made the company the custodian of Java.

The database giant has been widely criticised for what was seen as a lacklustre response to a major security hole discovered in Java 7, which was already being exploited by malware writers and created a vulnerability in every device that uses that version of Java.

The problem is that almost every device these days uses Java - PCs, laptops, smartphones, tablets; if you're connecting to the web and using a browser, you probably have Java installed. That's a very attractive hole for a virus to target.

Microsoft, of course, went through this process many years ago. It was branded a monopolist in court when aggressive sales tactics tipped over into anti-competitive behaviour. That could only happen because the firm became so successful and so dominant in its market that behaviour previously considered aggressive but acceptable was no longer legally acceptable.

The lessons for Redmond continued when Windows became the favoured target for virus writers, and the company realised it had to take a whole new attitude towards security. Now, 10 years after the launch of its Trustworthy Computing initiative, and with Patch Tuesday part of the IT vocabulary, Microsoft - while not perfect - is seen in many quarters as an example of how to approach software security.

The Windows maker realised it had to change from snarling competitor to responsible citizen and trusted partner. It doesn't always achieve that, but it's come a long way since those times.

Oracle has always been among the most aggressive of IT companies in its approach to selling - formed very much in the mould of ultra-competitive founder Larry Ellison, a man so competitive that he literally rewrote the rules of America's Cup sailing to suit his team.

A former managing director of Oracle's UK operation once looked me in the eye and said, "Believe me, what Larry wants, Larry gets."

Larry certainly wanted Java, but his company is now starting to learn the responsibility that comes with it.

By contrast, Apple is doing its absolute best to avoid its products becoming a utility in any way. The court case that has seen Samsung hit with $1bn damages for violating Apple patents is all about establishing that only Apple can do things the Apple way. And if you want to play the Apple way, you do so only in Apple's walled garden, to Apple's rules, and nobody else in the playground gets to take the ball home at the end of the game. It's going to be very interesting to see if the iPhone maker can maintain that position forever.

We will, at some point in the coming years, start to see cloud providers going through the same learning process. When a cloud service becomes a utility - and seeing as utility computing is one of the big aims of the cloud, it's going to happen - the provider will have to change its behaviour, or potentially face external regulation.

That's why we refer to heavily regulated gas, electricity and water companies as utilities.

With the growing interconnectedness of everything, one supplier becomes dependent on another, which becomes dependent on another, and so on. In a market ruled by ultra-competitors, that doesn't work.

That is why much of the cloud is being built on open source, and why firms like Google choose to donate their software to the open-source community. Suddenly software needs to be something owned by nobody if it's going to be efficient and cost-effective for everybody. That's why the UK government is so keen to adopt open source and open standards and find ways to avoid being locked in to patent-encumbered software products.

Oracle's recent Java security experience is, in the grand scheme of things, a small problem, quickly resolved. But it's a big example of the way the software sector is going to evolve over the next 10 years.


Users will decide the ultimate winner in Apple vs Samsung skirmish

bryang | 1 Comment
| More

In the long term, the Apple vs Samsung patent war will come to be seen as little more than a skirmish in a technology revolution led by users, not by manufacturers.

It's difficult not to take sides after Apple won $1bn damages against Samsung in a California court that sits on the doorstep of the US tech heartland in Silicon Valley - and I'm on the side of Samsung, or at least on the side that says this patent action is damaging to the wider technology industry, will reduce consumer choice and hamper innovation.

As many others with far greater knowledge of patent law have already written in recent days, this case turns on a unique facet of US patent legislation that allows protection of basic elements of design, process and functionality that any normal user would consider obvious. Patents should protect genuine innovations and ideas - they were created to help inventors. That Apple can claim damages because a rival product is also rectangular with rounded corners is absurd.

If Apple, as many predict, launches a television, then I hope that in the same spirit of not copying such elements, an Apple TV will be triangular, or maybe even hexagonal in shape. It's notable how few other countries' legal systems have taken the same one-sided view as the Californian court.

The counter-argument is that Apple is encouraging rivals to innovate, not emulate. But if emulation had been banned throughout the history of technological innovation, we'd have seen unsuccessful attempts to sell five-wheeled cars and fly three-winged aircraft, and 20 different versions of a plug socket in one house. We would have to buy Ford cars, perhaps, or only listen to a Marconi radio.

Without emulation, Apple would say that Microsoft would not have been able to copy the Mac interface when creating Windows - which is exactly what Apple hopes to avoid with Android in the smartphone era.

Of course, without emulation, Xerox would have been the main developer of PC operating system software and Apple would have been prevented from copying its "Wimp" (windows, icons, mouse and pull-down menu) graphical interface.

What Apple is hoping to stop is the inexorable process of standardisation. Hoover would have loved to stop us all buying "hoovers" from rival vacuum cleaner makers, but we have all benefited from such standardisation. We should be grateful to IBM for not preventing Compaq producing personal computers that looked strikingly similar to its own, albeit slightly more beige.

Standardisation is, to some innovators, the enemy. Standardisation takes one person's innovation and commoditises it, reducing its value and profit potential. Of course, standardisation also drives down prices, increases consumer choice and improves competition.

It would be nice to think, as this blog post claims, that all Apple has done is signal to consumers who would not previously have considered Samsung, that the Korean firm's products are very similar to the iPhone and iPad and therefore a worthy alternative.

But I would subscribe to the view that standardisation is essential to revolutionary innovation. I'm a fan of researchers such as Carlota Perez and Simon Wardley, who suggest that commoditisation of technology is an essential pre-requisite to genuine, era-defining, disruptive change and an explosion of innovation. When a non-protected technology is standardised, it becomes a platform for mass innovation.

Apple knows this, and wants to own the standard - it wants the iPhone to be the standard smartphone, and iPad the standard tablet. The iPhone itself proves the process of standards leading to innovation - the iPhone led directly to the wealth of innovation in app development. But Apple wants that to exist in a closed environment that it controls. You can have other closed environments, but you can't call them an App Store, and you can't access them through a rectangular thing with rounded edges and a grid of icons that happens to look like everyone's idea of roughly what a smartphone should be.

Apple's approach has, of course, been phenomenally successful - the most successful technology company in history, as of this moment.

But I would like to think that, in time, users will see through it. They will demand choice, and standardisation, and commoditisation, as well as innovation. And they will continue to choose Apple, and make the company and its investors very rich, because Apple makes damn good products. But they want to choose because of decisions they make on the high street or online, not decisions made in a US court. And they will one day look back and laugh at how stupid it was that one company tried to stop another making a good product because it looked and acted somewhat like its own.


RIM / Blackberry: one upgrade cycle from oblivion

bryang | No Comments
| More

News that the government is close to giving security clearance for use of smartphones other than Blackberry is another small step in Research in Motion's (RIM's) seemingly headlong rush into potential oblivion.

When newly appointed ministers in the coalition government came into Whitehall in 2010, the more tech-savvy among them were asking, "Why can't I use my iPhone?" There was genuine frustration that they were forced to use Blackberry. Since then, it's only been a matter of time, and the impending changes to the "impact level" security clearance scheme will open up the government market to new entrants.

The pressure and expectation now placed on the forthcoming - but much delayed - Blackberry 10 operating system and its associated devices to turn around the ailing mobile maker's fortunes is immense.

If BB10 flops, or is even perceived to be a flop, investors will bail out and the chances of the company surviving in its present form are almost zero. Let's look at the numbers.

RIM's market value peaked in May 2008 at over $78bn. Today it's worth just $3.8bn. The company's book value - its total assets less liabilities - is $9.6bn.

You could buy all of RIM, even at a 100% premium to its current share price, pay off its liabilities, sell all its assets, and be left with a profit of $2bn. You don't get a much bigger turnaround challenge in business than that.
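As a rough illustration, here is that sum as a minimal Python sketch - it assumes RIM's assets could actually be realised at book value, which is a big simplification rather than anything the market figures guarantee.

    # The RIM "buy it and break it up" sum sketched above.
    # Assumes assets could be sold at book value - a big simplification.
    market_value = 3.8e9   # current market capitalisation, $
    book_value = 9.6e9     # total assets less liabilities, $

    purchase_price = market_value * 2          # 100% premium to the current share price
    left_over = book_value - purchase_price
    print(f"Purchase price at a 100% premium: ${purchase_price / 1e9:.1f}bn")         # $7.6bn
    print(f"Left over after liquidation at book value: ${left_over / 1e9:.1f}bn")     # $2.0bn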

If the share price drops further after BB10, the firm becomes a bargain too good to ignore for likely predators such as Microsoft/Nokia, Google/Motorola, Samsung, or even IBM - the latter was recently rumoured to have enquired about RIM's enterprise business.

Look at the numbers another way.

RIM has about 78 million subscribers for Blackberry worldwide. Estimates of the firm's average revenue per user (ARPU) seem to vary depending on where you read them, but at the most simplistic calculation, with total sales of $18.4bn in its fiscal year to March 2012 (and bear in mind that quarterly revenue since has plummeted), that gives an ARPU of $235 per year.

So if you bought RIM today at, say, a 30% premium to its share price (typically enough to persuade shareholders to sell), then upgraded only half of the current subscriber base to a new device running Android or Windows Phone - and even if ARPU dropped by, say, 25% - your first-year post-acquisition revenue would still exceed the purchase price by around $1.8bn.

RIM makes a gross profit margin of around 30% (but a net loss of over $500m in its most recent quarter), so if you strip out much of the firm's costs, you recoup your acquisition price within three years.

I'm not a stock market expert, but that sounds like a good deal to me.
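For what it's worth, here is that sum laid out as a minimal Python sketch - the 30% premium, 50% subscriber retention and 25% ARPU drop are the illustrative assumptions above, not forecasts, and the result shifts with whichever ARPU estimate you use.

    # The acquisition arithmetic above, using the stated assumptions.
    market_value = 3.8e9      # current market capitalisation, $
    subscribers = 78e6        # Blackberry subscribers worldwide
    annual_revenue = 18.4e9   # total sales, fiscal year to March 2012, $
    gross_margin = 0.30       # approximate gross profit margin

    arpu = annual_revenue / subscribers        # roughly $235 per subscriber per year
    purchase_price = market_value * 1.30       # 30% premium to persuade shareholders

    first_year_revenue = (subscribers * 0.50) * (arpu * 0.75)   # half the base, ARPU down 25%
    surplus = first_year_revenue - purchase_price               # around $1.8bn-$2bn, depending on the ARPU estimate
    payback_years = purchase_price / (first_year_revenue * gross_margin)  # comfortably within three years

    print(f"ARPU: ${arpu:.0f}")
    print(f"First-year revenue less purchase price: ${surplus / 1e9:.1f}bn")
    print(f"Years to recoup the purchase from gross profit: {payback_years:.1f}")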

RIM's financial report acknowledges that its business heartland is being chipped away by the rise of bring your own device (BYOD) schemes, so it needs to convince consumers that they want a Blackberry rather than an iPhone or Android device. RIM's competitive edge came from secure email through Blackberry Enterprise Server, but there are now security software packages for other mobile environments too.

Outside of teenagers who love Blackberry Messenger - who aren't exactly business users - fewer people today turn to Blackberry as their smartphone of choice. RIM is trying to attract app developers to make the platform more appealing, but with limited success. With most mobile contracts lasting between 12 and 18 months, Blackberry could be one upgrade cycle away from vanishing.

And it's worth mentioning that the iPhone 5 is expected to be released before BB10, resetting consumer expectations of the smartphone yet again.

In the not-too-distant future, MBA students will be reading extensive case studies about RIM. There's an outside chance they might be reading about a great business turnaround, an example to all firms of how to turn decline into success.

More likely, they will be reading about how a technology company that was once dominant in its market disappeared into oblivion in barely five years, thanks to the relentless pace of innovation and technological change.

Oh, and RIM won't be the only one.


IT exams are a waste of time - let's scrap them

bryang | 1 Comment
| More

Let's save us all some time. After this year's A-level results, please read what we wrote about at the same time last year, and the year before - and, frankly, probably every year for at least the last five years.

We can easily regurgitate the same headlines: "Number of students taking IT-related exams falls again." We could almost use the same words in every story, just change the numbers slightly.

It's beyond a joke, really - ICT and computing GCSEs and A-levels are barely taken seriously any more. Just 297 girls sat the computing A-level, for example. What's the point?

The curriculum for ICT and computing is so poorly perceived that IT employers pay it no attention. Hardly any companies look for new recruits with those qualifications - maths, sciences, even languages are more likely to get you a job in IT.

The government has at least finally recognised that the GCSE curriculum is a waste of time, and education secretary Michael Gove duly scrapped it earlier this year - but it hasn't been replaced, leaving a vacuum that will likely see student numbers drop even further.

So should we bother at all with IT education in schools? Why not just look for students who have done well in the basic science, maths or engineering topics and leave the IT training to employers?

Well, if IT employers still funded sufficient training, maybe we could. But lack of training remains one of the biggest skills issues facing the IT profession.

We are genuinely fed up with having to write the same story every year. Each time, the same commentators and experts bemoan the lack of progress, but nothing changes. We all know what needs to be done: IT employers need more outreach into schools; the IT profession needs to promote better role models to attract kids to study with the aim of a career in IT; and the curriculum needs to reflect the digital skills we will need in 10 years' time, not those we had 10 years ago.

But it's hard to have any confidence whatsoever that it's going to happen soon. Perhaps this demographic timebomb will need to explode before anything happens - but by then it may be too late.

At the very least, let's do one thing now - recognise that the current exams at all levels are a waste of time, and scrap them. Perhaps that, at least, will spur employers, academics and politicians to make the radical changes that IT education needs.


Banks need to tackle IT complexity to avoid further regulation

bryang | 2 Comments
| More

Complexity is the common enemy of every IT leader. Ask any group of IT managers what their biggest day-to-day challenge is, and the answer you will most often hear is the complexity of legacy infrastructure.

For too long, IT departments and their suppliers have used that complexity as a protective suit - when asked, "Can you do this?", the response has been, "Ooh, that's complicated."

For many years it was easy to get away with, because business users and IT buyers were at a disadvantage in their technology knowledge.

But things have changed. Everyone's expectations of technology simplicity are now set by Google, Facebook and eBay. We as IT experts might understand the incredible complexity behind the ability to "Like" the Team GB Olympics page, but all users see is the simplicity of clicking a button.

Those expectations are what IT departments now have to live up to.

Nowhere is this becoming more apparent than in financial services. There are growing calls for IT to be regulated across the banking sector after the Royal Bank of Scotland and NatWest fiasco that prevented customers accessing their accounts.

A report this week by IT trade body Intellect estimates that banks spend 90% of their IT budgets on managing legacy infrastructure - that is simply unsustainable.

Banking systems are among the most complex around - developed over years, often reliant on ageing mainframes and applications that are well past their best-before date, but which do their job day in, day out. Add to that the number of mergers in the sector, with overlapping systems having to be integrated - and add too the customer-facing systems that have been bolted on top to deliver online and mobile banking.

It all works, mostly. And the cost of updating a mostly functioning but ageing system is difficult to justify when the replacement will effectively do the same job, just with newer hardware and software. It's even harder to justify in the middle of an ongoing banking crisis.

But if IT now represents the arteries of the financial world, then those arteries are increasingly sclerotic and one day the blockage will become terminal.

Banks are in an impossible position - needing to spend money to remove complexity with little return on investment. But if they want to avoid regulated IT systems, and have an IT infrastructure with the flexibility to cope with rapidly changing customer requirements, that cash is going to have to be spent.

