Gary McKinnon's legacy

bryang
So Gary McKinnon stays free - for now.

At Computer Weekly, we've followed the self-confessed hacker's story for the 10 years it's taken to fight his extradition to the US. Along the way we've seen his cause become an international issue, with prime ministers and presidents discussing his case.

It's for others to discuss the legalities of home secretary Theresa May's decision to rescind the extradition order. It's also for others to debate the approach of the US prosecutors who once told McKinnon they wanted him "to fry".

But it's also important to remember that McKinnon is guilty - something he has never denied. It is right that he should face up to the law, and the consequences of his actions - but it's equally right that those consequences should be proportionate to the crime.

The 10 years since McKinnon came to public attention have put his hacking into a very different context. Governments now do far worse on a regular basis than Gary did. It is tempting to suggest that the Pentagon would have remained vulnerable to cyber attacks from people with much worse intent were it not for the holes that McKinnon exposed. That's no excuse, though, of course.

It's probably not difficult to argue also that the Pentagon and other intelligence services learned a lot about what they can get away with in cross-border cyber intrusion.

And as we've seen in the last year or two, there are plenty of new young Garys out there, operating under the guise of hacktivist groups like Anonymous, still exploiting the security flaws that are all too inherent in modern technology.

The immediate priority for McKinnon is his health. Then he has to face whatever the legal authorities in the UK decide to do about his case. But he also now has an opportunity to put something back into the IT security community, and it would be great to see him put his unwanted notoriety to good use in highlighting to others just how vulnerable our IT systems remain. Nick Leeson, the man who brought down Barings Bank, does a similar thing these days about banking fraud.

But beyond the legalities, and the human cost of McKinnon's 10-year ordeal, there is a lesson for everyone in IT. Information security is now a matter of national security, not just of business success. Gary McKinnon's case went well beyond the hacking crime he committed. IT security goes well beyond the technicalities of hackers and viruses. If Gary's legacy is to put the topic onto the boardroom agenda of every organisation, then he can, at least, be thanked for that.


When will the traditional model of software licensing die?


When will the traditional model of software licensing die? Surely it is only a matter of time.

This week alone, we've seen Oracle accused of costing customers millions of pounds through its "non-transparent and complicated licensing policy". Legal experts say that firms are being forced to pay huge penalties when using Oracle software in a virtualised environment - a set-up that is increasingly common.

Meanwhile, SAP users in the UK have called for more transparency and better value for money in the supplier's licensing policies. Here too, a user group survey suggests that 95% of SAP customers believe the firm's software licensing is too complicated.

Licensing has always been a point of contention for IT leaders. Computer Weekly was writing about how big software firms would rip off customers through opaque terms and conditions as long ago as the 1990s.

But now, with the growth of the cloud and software as a service, those old models of upfront licence fees with annual maintenance payments look increasingly outdated and inappropriate for a modern IT environment.

One of the biggest culprits is Microsoft, but even the world's biggest software provider is showing early hints of realising the world has changed. CEO Steve Ballmer told shareholders this week that the firm is undergoing a "fundamental shift", and now sees itself as a "devices and services" company.

The implication, reading between the lines, must surely be that you don't sell devices and services on the same basis as a conventional software licence. It would be a huge change, with enormous financial implications, were Microsoft to move to a subscription-based model more in tune with the pay-as-you-go ethos of the cloud. It clearly won't happen overnight - but if that is the direction of travel, then perhaps even Microsoft is starting to get it right.

Of course, supporters of open source will be smiling smugly at the travails of licence-encumbered users. It is no coincidence that most of the new cloud services - Amazon, Google, Facebook etc - are built on open-source principles. Imagine the cost of an Oracle database licence for Facebook's server infrastructure.

There's a bright future for software companies - their products will power the world and our lives. But there are gloomy prospects for any firms that insist on hanging on to outdated software licensing practices from a different age.

GES2012: Optimising information use through the internet and social media

This year's Global Economic Symposium (GES) takes place in Rio de Janeiro next week, on 16-17 October.

GES is an annual event that invites stakeholders from around the world to discuss global issues, challenges and problems. It's a great coming together of politicians, business leaders, NGOs, and experts across a huge range of topics - like a smaller version of the World Economic Forum in Davos.

I've been fortunate to be invited for the past three events to moderate the session on technology, which this year is titled "Optimising information use through the internet and social media" - a subject that would take far longer than the allotted 90 minutes to discuss in its entirety.

Panelists are invited to submit their views on the topic at hand in advance, and I thought I'd publish my submission here - I'd be interested in your opinions too:

Encourage the innovators, and allow consumers of information to make the choice

There is a fundamental dilemma to consider when looking for solutions to the challenge of "optimising information use through the internet and social media": the internet and social media have grown as "bottom-up" technologies often used by people to bypass traditional social, cultural and establishment controls, yet the control of most of the information that has value to those people remains in the hands of businesses and governments.

So, when considering how to "optimise" information use, one has to look at who wants to use that information, and who has that information.

Typically the "user" is you and me - individual citizens going about our daily lives, requiring information owned by the state, by business, by educational institutions, healthcare organisations and, in the world of social media, by each other.

In most of those organisations, that information has either a commercial value or, more likely, a power value - information being power in so many areas of life. Those organisations are loath to provide that information if it means loss of commercial benefit, control, or competitive advantage.

So we are increasingly faced with two opposing sides of this challenge.

On one side are the digital King Canutes, who see the internet and social media as a threat to their established models, and who will use whatever means - often resorting to the law or legislation - to protect their incumbency. The music and film industries are classic examples of sectors reluctant to change and reflect the new demands of their customers, resorting to lobbying government to impose overly restrictive controls on intellectual property.

On the other side are the organisations that see information access as an opportunity - for example, to empower people to take better care of their health; to encourage innovation through access to government data; and to boost education and business through open access to research.

The question, therefore, is should governments and other representative bodies use their influence - through legislation, regulation or other measures - to lean one way or the other? The evidence to date is that their efforts to do so are cumbersome, slow, and often inappropriate.

It is the latter organisations - those with open, transparent, accountable attitudes to information use - that are gathering popularity and success. The restrictive, laggard organisations are struggling financially, culturally and often even democratically - witness the Arab Spring as an example of that.

My proposal would be to avoid fresh legislation wherever possible, to allow openness to flourish, and ultimately to allow citizens to choose whether they want to deal with those organisations that restrict or those that encourage information use. Such an organic process is already underway, and the best response of business and governments would be to allow it to continue to its natural conclusion.



Why CIOs need to Like the Facebook way of IT

This is a PowerPoint slide I've used a lot lately when I've been asked to give talks on the state of IT or "the next big thing", and it usually seems to raise a smile from the audience:

[Slide image: Likeslide.png]

The aim of the slide is to demonstrate to the audience - typically IT leaders - how the game has changed, and that user expectations of corporate IT are now set by consumer-oriented services such as Facebook.

I was interested, therefore, to read this article by Walter Adamson, a US "social media strategist":

Lessons for Enterprise from Facebook and its 1 billion users

I'd endorse the point it's making wholeheartedly.

If your CEO or CFO (or perhaps increasingly these days, your marketing director) questions the value and cost of IT at your organisation, it's likely they will draw comparisons with the likes of Facebook. One billion users, almost no downtime, with an approximate running cost of $1 per active user? Why can't we do that?
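Those headline figures are worth a quick sanity check. A minimal sketch using only the approximate numbers quoted above (one billion users, roughly $1 per active user - not Facebook's actual accounts):

```python
# Approximate figures as quoted in the article
active_users = 1_000_000_000   # ~1 billion users
cost_per_user = 1.0            # ~US$1 running cost per active user

# Implied total running cost, in US$ billions
total_cost_bn = active_users * cost_per_user / 1e9
print(f"~${total_cost_bn:.0f}bn to run Facebook for its entire user base")
```

Roughly a billion dollars a year to serve a billion people - the comparison your CFO is likely to reach for.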

And of course, we in IT know that behind that Like button there is an enormous complexity of software, networks and datacentres. We know that the challenges of delivering hundreds of diverse business applications are different. But not so many IT leaders want to admit that there are lessons to be learned from the likes (pun unintended) of Facebook in how to run corporate IT.

The problem is in the IT supply chain, which over the years has evolved to a point where the basic commodity that passes through that supply chain is complexity.

I don't mean it's a complex supply chain - retail firms have a complex supply chain but it works perfectly well.

A supply chain, at its simplest, is something that takes a basic commodity and passes it through a series of processes and organisations until enough value has been added to sell it to an end consumer for a profit.

In IT, that basic commodity is complexity.

When the corporate IT buyer talks to their IT supplier, and explains their business need, the supplier's response is, "Ooh, that's complicated." And so the buyer ends up purchasing lots of complicated products, and because they are complicated they have to buy lots of complicated services too, to make the complicated products work.

Unfortunately, IT departments just take that commodity, and add process to it. So, when the business manager comes to the IT manager and explains his or her need, the IT manager says, "Ooh, that's complicated."

And so it goes. For most of the history of corporate IT, that poor business user has had no choice but to accept the complexity. Complexity keeps IT professionals in a job. Jargon and acronyms formalise the complexity and reinforce the processes that add complexity to complexity through the supply chain. And of course, complexity keeps the profit margins of IT suppliers high.

But today, the business user can reject the complexity, and ignore the IT department that says "no", and go to the cloud, or point to the websites that are based on open source and commodity cloud services that support millions of users with no front-end of complexity.

As Adamson puts it in his article: "I am sure that there are all sorts of nuances, all sorts of reasons why 'we're different', 'we're more complicated' etc etc but for all intents and purposes Facebook stays up globally for 1 billion customers while very expensive dedicated enterprise systems in a single country serving a minute fraction of users don't stay up."

These are the questions that IT leaders will have to face - and if you aren't facing them already, you soon will be.

IT leaders know that the biggest challenge they face in their IT delivery is complexity - that history of poorly integrated, incompatible legacy systems that somehow grew organically over 10 or 20 years. It's the reason the banks have such a huge IT problem and why online banking and ATM systems seem to crash so much more often - constantly bolting more complexity onto an already over-complex, sprawling legacy infrastructure.

IT leaders want to simplify things, to make upgrades easier and technology more agile and flexible.

But it's going to take a bold CIO to say: we need to start from scratch; we need to learn from how a Facebook or a Twitter or a Google has created enormous IT infrastructures from nothing. CIOs need to acknowledge that corporate IT departments have neither the only nor the best way to approach large-scale IT systems, and that there are lessons they can learn.

And those CIOs need to say the same thing to their suppliers, who need to learn that CIOs are no longer delivering IT systems, they are delivering business outcomes, and an annual licence fee with monthly maintenance payment does not deliver a business outcome.

I believe that most CIOs get this. I don't believe many traditional IT suppliers do. And there are problems ahead for corporate IT until that dichotomy is resolved.

HP's latest turnaround plan makes everyone dizzy

HP has announced so many "turnaround plans" in the last few years that employees and customers must be getting dizzy.

As well as the revolving door to the CEO's office, we've seen HP declare itself a services business, a software business, a hardware business, not a PC business, and then a PC business again.

There was much derision on social media about the supplier's latest claim to be "the world leader in cloud infrastructure" with $4bn of cloud revenue - which may come as a surprise to the likes of Amazon, Google, IBM and others. Just because you sell a bunch of servers that run in a cloud-type environment doesn't make you a cloud provider.

HP has form for taking the latest trend of the day and slapping it as a label on existing products. A few years ago, when green IT was the marketing vogue, the company claimed it had always been the greenest IT supplier in the world and had in fact been a green IT company since the 1960s.

Manhandling the latest buzzword onto your product range is a far cry from the days when HP's slogan was "Invent". Can anyone name the last, genuine invention or innovation that HP has brought to the market ahead of its rivals, without an acquisition?

People who deal with HP on a regular basis remark on the constant reorganisations, the changing faces among their contacts, and the general state of perpetual unrest. One former HP executive recently told me the strategy was all over the place, and that many employees felt that selling off the low-margin PC business would have been the best possible move.

HP has made a series of major acquisitions and somehow reduced the value of all of them - from Compaq, to EDS, and now Autonomy. The integration of those firms has left the company disorganised and unfocused, reliant on its sheer size and market presence to remain on the shortlists of IT leaders, bolstered by the profits from being the world's biggest ink provider.

Current CEO Meg Whitman has at least chosen the route of brutal honesty in her latest turnaround plan, admitting it will take several years to achieve, and that revenue will decline in key areas of the business. Wall Street responded to such honesty by driving the share price down to its lowest point in 10 years.

In May this year I wrote the following, after HP announced 27,000 job cuts: "HP's 2011 annual revenue was $127bn - but its current market value is less than $42bn. If all HP's customers got together, they could buy the company three times over for what they spend with it in a year."

Five months later, HP's market value is now just $28bn. Just as well those HP customers ignored me - today they could buy the company four times over and still share a profit of $15bn.
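The arithmetic behind that claim is simple enough to check - a quick sketch using the figures quoted here ($127bn annual revenue, taken as a proxy for what customers collectively spend with HP in a year, against a $28bn market value):

```python
# Figures from the article, in US$ billions
annual_customer_spend = 127  # HP's 2011 revenue
market_value = 28            # market capitalisation in October 2012

times_over = annual_customer_spend // market_value           # whole acquisitions affordable
surplus = annual_customer_spend - times_over * market_value  # change left over

print(f"Customers could buy HP {times_over} times over, with ${surplus}bn to spare")
```

Four times over, with $15bn left - exactly the figure above.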

Is a "multiyear turnaround" as described by Whitman even a possibility in the current economic climate, and with huge disruptive change going on through the cloud and consumerisation? Can any company have four years' grace to be allowed to make what will have to be fundamental changes to its operations, culture and products?

If HP is to invent anything, it needs to invent a new future for itself.

Cloud is an economic opportunity for the UK - does the government know?

It has long been the case that governments and regulators struggle to keep up with the pace of change in technology. With the growth of cloud computing - the first genuinely globalised, commoditised, off-the-shelf IT service - that challenge threatens to become a serious problem for the European and UK IT sectors.

A new survey of cloud computing users in 50 countries has highlighted the failure of government regulations to keep up with developments as the number one factor eroding confidence in the cloud.

At software-as-a-service provider Salesforce.com's recent Dreamforce user conference, UK customers were critical of the supplier's failure to build a promised datacentre in Europe. For many organisations affected by the European Union's strict data protection laws, that's a showstopper. But should it be?

The European Commission (EC) has at least recognised the problem. This week it announced a new strategy to work with counterparts in the US and Japan to prevent data protection and differing international legal frameworks from hindering a market that the EC estimates could generate €900bn and an additional 3.8 million jobs across the EU by 2020.

But the wheels of such pan-governmental processes turn slowly.

While there are diligent firms that will shun the cloud without guarantees on the physical location of their data, you can bet there are plenty who barely give it a thought, and have sensitive information parked on an Amazon storage system somewhere in the US, because it's the cheapest and easiest place to put it.

The real problem here is that the likes of Salesforce.com and others have so far only considered building a cloud datacentre in the UK or Europe because they have been forced to. We all know that architecturally, with cloud computing it doesn't matter where the physical servers or storage are located.

But the issue for the UK/EU is that we're not seen as a natural location for the big cloud suppliers. The question we should be asking is why? If there are 3.8 million jobs that could be created in an economically depressed Europe, we need rapid incentives for cloud providers to set up here, not new regulations to help when they are forced to.

The financial services sector came to the UK because our location straddling US and Asian time zones in a loosely regulated market made London a highly attractive location. Cloud is a massive economic opportunity for the UK, for the very same reasons. The US and Asia will be happy to spend a few years talking to the EU, while they press ahead and make the most of that opportunity.

We need the government to provide reasons to bring the cloud to the UK now. Tax incentives and planning regulations, for example, that make it easy to build cloud datacentres - no taxpayers' money needed, lots of inward investment created, plus jobs, private sector investment in telecoms infrastructure, a boost for the green energy industry, and the whole cloud ecosystem looking at the UK as a place to be. A few million here and there for a bit of innovation is hardly enough.

Sadly, there is little or no evidence that the government is having such a conversation - or is even aware of the opportunity.

What the cloud means (and it isn't cutting costs)

What are the first words that come to mind when you think of the cloud? Low cost, perhaps. Pay as you go, maybe. Probably also: not secure, too complex, regulatory headaches, lacking standards, no interoperability.

Ask two CIOs to explain what the cloud means to them, and you'll almost certainly receive two different answers. Ask them what their concerns about the cloud are, and they will be in far greater agreement.

Sadly, the cloud is currently going the way of so many great technologies in IT - from initial curiosity, to ensuing enthusiasm, to widespread confusion in the light of a welter of meaningless acronyms and a lack of best practice. IaaS? PaaS? SaaS? You can find a cloud supplier putting "as a service" on the end of pretty much every technology available, to the extent it all becomes rather meaningless.

And now that we have a few early adopters, we even hear that some find moving to the cloud doesn't necessarily save the money they had been promised.

Perhaps part of the problem is that the cloud means so many different things to so many different people. So here is a definition that we think will become increasingly significant: cloud is no more, and no less, than the commoditisation of processing power.

In the same way that the internet commoditised networking, and smartphones, tablets and laptops are commoditising end-user devices, the cloud is doing the same to servers, storage and the provision of software applications on top.

Commoditisation does, typically, mean lower unit costs. But its significance goes much further - it creates a platform for innovation. Once the big, costly processor, storage and server stuff is reduced to the level of Amazon offering 1GB of archive disk space per month for just one US cent, it opens up access to computing power previously inaccessible to start-ups and innovators, and really shakes up markets.
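To make that rate concrete, here's a back-of-the-envelope sketch using only the one-cent-per-gigabyte-per-month figure quoted above (real cloud storage pricing varies by provider, region and tier):

```python
PRICE_PER_GB_MONTH = 0.01  # US$, the archive rate quoted in the article

def monthly_archive_cost(gigabytes: float) -> float:
    """US$ cost of archiving the given volume for one month at the quoted rate."""
    return gigabytes * PRICE_PER_GB_MONTH

print(f"1 TB/month: ${monthly_archive_cost(1_000):,.2f}")      # $10.00
print(f"1 PB/month: ${monthly_archive_cost(1_000_000):,.2f}")  # $10,000.00
```

A terabyte of archive storage for the price of a sandwich is exactly the kind of unit cost that lets a start-up compete with an enterprise datacentre.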

There is enormous competitive advantage to be gained by organisations that understand how to make the most of the opportunities for innovation that the cloud presents. If all you want from the cloud is to save money, then you can do that too, if you get it right. But the potential benefits are so much more.

Time for mobile makers to eliminate business-consumer divide

It's been one of those weeks where it seems the only thing that matters in technology is the fact that someone has produced a slightly thinner smartphone.

There are cynics who say the launch of the iPhone 5 shows why Apple needs to resort to patents to hinder its competitors.

But it's certainly true that the smartphone market has become one of minor, incremental improvements now, rather than the huge leaps that were first catalysed by the release of the original iPhone.

Wouldn't it be good, therefore, if the big mobile makers devoted more of their product development towards the needs of business?

Research and development (R&D) cash for technology products is understandably biased towards consumer products. Gone are the days when IT was created for business first, then morphed into a consumer device. But now, with the growth of consumerisation and bring your own device (BYOD) schemes, there's a gap between the capability of mobile technology and what business needs.

Every IT leader facing demands from employees to use their own devices to access corporate systems will vouch for the problems it continues to present, while no doubt quietly thinking, "If only everyone would be happy using a BlackBerry, like the old days".

The forthcoming launch of Windows 8 could prove to be another catalyst, in that Microsoft is hoping that a common operating system from phone to tablet to desktop will be the answer to the IT department's prayers.

But there's still going to be the challenge of users saying, "Sorry, I don't want a Windows Phone".

There's also the launch of BlackBerry 10, although we wait to see if that will be enough to stem manufacturer RIM's decline.

Nonetheless, it opens up an opportunity for Apple - or someone in the Apple ecosystem - and for Google/Android to make it easier for those users to turn to the IT team and show them how easily and securely they can access key applications and data.

The ultimate aim for IT is to remove the distinction between business and consumer technology. Many cloud services already blur those lines. The only reason we're fretting about BYOD is because it is trying to bridge two worlds that have historically been considered diametrically opposed.

But the people using technology on a daily basis no longer see such a distinction - well, other than when consumer products are easy and fun to use, and corporate systems are complex and difficult.

IT R&D - and mobile makers in particular - would benefit themselves and their customers by eliminating that business-consumer divide.

Everybody lost in NHS IT disaster


A degree of ironic congratulation is due to the Department of Health (DoH) and Cabinet Office minister Francis Maude for finally extricating the NHS from its disastrous contract with CSC.

The supplier, which was meant to deliver patient record systems to 160 NHS trusts, has ended up with just 10 trusts to complete.

The DoH says it has saved £1bn by renegotiating the original £3bn deal with CSC, but is reluctant to disclose how much it will ultimately spend with CSC. Let's hope it's not the apparent balance of £2bn - that would give 10 trusts a £200m patient record system each, which would probably make them the least value-for-money IT projects in global healthcare history.

The irony of the congratulation comes in light of the billions that the DoH has spent over 10 years on the National Programme for IT (NPfIT) with such a poor return. There have been success stories - the Choose & Book appointment booking system (eventually), the PACS imaging system, NHS-wide email and the N3 broadband network - but based on the major objective of a national electronic patient record system, NPfIT was an abject failure.

According to a National Audit Office report last year, £6.4bn had been spent on the programme, with a further £4.3bn earmarked. Presumably, the £1bn saving with CSC is simply a reduction in that figure.

But it's not only the NHS and the taxpayer that have lost from NPfIT.

Two of the originally contracted suppliers, Accenture and Fujitsu, quit or were thrown off the programme at significant cost to each after they found they couldn't deliver their commitments and were losing money.

BT was forced to write off £1.6bn in 2009, due mainly to its NHS contract. BT at least had an early opportunity to renegotiate its deal before Maude's austerity-led supplier renegotiations, and is probably feeling pretty smug now as it watches CSC's woes.

CSC itself wrote off $1.5bn - effectively its entire investment in NPfIT - making a substantial loss on the contract, not to mention the reputational disaster it has been for one of the world's largest IT services providers.

When the NPfIT contracts were first awarded, I was told privately by IBM and EDS - then the two biggest outsourcers in the world - that they wouldn't touch the deals with a bargepole because the risks were weighted too heavily against the suppliers.

The architect of those deals, former NHS IT director-general Richard Granger, famously said he would "hold suppliers' feet to the fire until the smell of burning flesh is overpowering." Accenture, Fujitsu, BT and CSC certainly got their feet burned, even years after Granger quit.

Granger set out to reverse the historic perception that IT suppliers had Whitehall over a barrel when negotiating contracts, but hindsight shows that his combative style pushed the balance of risk too far the other way.

Therein lies the only real lesson that can be taken from the whole humiliating history of NPfIT.

It's become a cliché, but major projects have to be a genuine partnership. That requires an intelligent buyer, with sufficient in-house skills to assess suppliers and hold them to account. It also requires suppliers that stop selling products and boxes and packaged services, and understand what it really means to deliver business outcomes. Sadly, you would struggle to name a single major IT supplier that would fit that description, even today.

The government's reaction to NPfIT - and other major IT disasters - is to loosen its reliance on big system integrators, trying instead to find ways to make it easier for small IT firms to do business with Whitehall. There's a long way to go with that one too, and we will be reading about mega-contracts going to global system integrators for a while yet.

CSC, meanwhile, gave the most entertainingly positive spin on the culmination of its negotiation with the NHS, calling it "a significant milestone in our relationship with the NHS" and "a renewed commitment by the NHS and CSC to a long-term partnership". Certainly it's a significant milestone, and those 10 trusts are stuck with CSC's Lorenzo patient record system for the long term.

CSC will compete on an equal footing with rivals for every other NHS trust as the NPfIT is dismantled and devolved to local IT purchasing decisions. On that basis alone, the only way is up for CSC.

Everybody lost in the NHS IT debacle. We can only hope everybody learned from the experience.


When software becomes a utility, everything changes - and it will


It's a challenge faced so far only by the most ultra-successful software companies, but a major turning point comes when a product becomes a utility.

It doesn't happen often, but there's a big difference when a piece of software goes from something you use to compete against your rivals, to something that the market is widely reliant upon.

Oracle is learning this lesson right now, after its acquisition of Sun Microsystems in 2009 made the company the custodian of Java.

The database giant has been widely criticised for what was seen as a lacklustre response to a major security hole discovered in Java 7, which was already being exploited by malware writers and created a vulnerability in every device that uses that version of Java.

The problem is that almost every device these days uses Java - PCs, laptops, smartphones, tablets; if you're connecting to the web and using a browser, you probably have Java installed. That's a very attractive hole for a virus to target.

Microsoft, of course, went through this process many years ago. It was branded a monopolist in court when aggressive sales tactics became anti-competitive behaviour. That could only happen because the firm became so successful and so dominant in its market that behaviour previously considered aggressive but acceptable was no longer legally acceptable.

The lessons for Redmond continued when Windows became the favoured target for virus writers, and the company realised it had to take a whole new attitude towards security. Now, 10 years after the launch of its Trustworthy Computing initiative, and with Patch Tuesday part of the IT vocabulary, Microsoft - while not perfect - is seen in many quarters as an example of how to approach software security.

The Windows maker realised it had to change from snarling competitor to responsible citizen and trusted partner. It doesn't always achieve that, but it's come a long way since those times.

Oracle has always been among the most aggressive of IT companies in its approach to selling - formed very much in the mould of ultra-competitive founder Larry Ellison, a man so competitive that he literally rewrote the rules of America's Cup sailing to suit his team.

A former managing director of Oracle's UK operation once looked me in the eye and said, "Believe me, what Larry wants, Larry gets."

Larry certainly wanted Java, but his company is now starting to learn the responsibility that comes with it.

By contrast, Apple is doing its absolute best to avoid its products becoming a utility in any way. The court case that has seen Samsung hit with $1bn damages for violating Apple patents is all about establishing that only Apple can do things the Apple way. And if you want to play the Apple way, you do so only in Apple's walled garden, to Apple's rules, and nobody else in the playground gets to take the ball home at the end of the game. It's going to be very interesting to see if the iPhone maker can maintain that position forever.

We will, at some point in the coming years, start to see cloud providers going through the same learning process. When a cloud service becomes a utility - and seeing as utility computing is one of the big aims of the cloud, it's going to happen - the provider will have to change its behaviour, or potentially face external regulation.

That's why we refer to heavily regulated gas, electricity and water companies as utilities.

With the growing interconnectedness of everything, one supplier becomes dependent on another becomes dependent on another, and so on. In a market ruled by ultra-competitors, that doesn't work.

That is why much of the cloud is being built on open source, and why firms like Google choose to donate their software to the open-source community. Suddenly software needs to be something owned by nobody if it's going to be efficient and cost-effective for everybody. That's why the UK government is so keen to adopt open source and open standards and find ways to avoid being locked in to patent-encumbered software products.

Oracle's recent Java security experience is, in the grand scheme of things, a small problem, quickly resolved. But it's a big example of the way the software sector is going to evolve over the next 10 years.


Users will decide the ultimate winner in Apple vs Samsung skirmish

bryang | 1 Comment
| More

In the long term, the Apple vs Samsung patent war will come to be seen as little more than a skirmish in a technology revolution led by users, not by manufacturers.

It's difficult not to take sides after Apple won $1bn damages against Samsung in a California court that sits on the doorstep of the US tech heartland in Silicon Valley - and I'm on the side of Samsung, or at least on the side that says this patent action is damaging to the wider technology industry, will reduce consumer choice and hamper innovation.

As many others with far greater knowledge of patent law have already written in recent days, this case rests on a unique facet of US patent legislation that allows protection of basic elements of design, process and functionality that any normal user would consider obvious. Patents should protect genuine innovations and ideas - they were created to help inventors. That Apple can claim damages because a rival product is also rectangular with rounded corners is absurd.

If Apple, as many predict, launches a television, then I hope that in the same spirit of not copying such elements, an Apple TV will be triangular, or maybe even hexagonal in shape. It's notable how few other countries' legal systems have taken the same one-sided view as the Californian court.

The counter-argument is that Apple is encouraging rivals to innovate, not emulate. But if emulation had been banned throughout the history of technological innovation, we'd have seen unsuccessful attempts to sell five-wheeled cars and three-winged aircraft, and 20 different versions of a plug socket in one house. We would have to buy Ford cars, perhaps, or only listen to a Marconi radio.

Apple would say that, without emulation, Microsoft could not have copied the Mac interface when creating Windows - which is exactly what it hopes to prevent Android doing in the smartphone era.

Of course, without emulation, Xerox would have been the main developer of PC operating system software and Apple would have been prevented from copying its "Wimp" (windows, icons, mouse and pull-down menu) graphical interface.

What Apple is hoping to stop is the inexorable process of standardisation. Hoover would have loved to stop us all buying "hoovers" from rival vacuum cleaner makers, but we have all benefited from such standardisation. We should be grateful to IBM for not preventing Compaq producing personal computers that looked strikingly similar to its own, albeit slightly more beige.

Standardisation is, to some innovators, the enemy. Standardisation takes one person's innovation and commoditises it, reducing its value and profit potential. Of course, standardisation also drives down prices, increases consumer choice and improves competition.

It would be nice to think, as this blog post claims, that all Apple has done is signal to consumers who would not previously have considered Samsung, that the Korean firm's products are very similar to the iPhone and iPad and therefore a worthy alternative.

But I would subscribe to the view that standardisation is essential to revolutionary innovation. I'm a fan of researchers such as Carlota Perez and Simon Wardley, who suggest that commoditisation of technology is an essential pre-requisite to genuine, era-defining, disruptive change and an explosion of innovation. When a non-protected technology is standardised, it becomes a platform for mass innovation.

Apple knows this, and wants to own the standard - it wants the iPhone to be the standard smartphone, and iPad the standard tablet. The iPhone itself proves the process of standards leading to innovation - the iPhone led directly to the wealth of innovation in app development. But Apple wants that to exist in a closed environment that it controls. You can have other closed environments, but you can't call them an App Store, and you can't access them through a rectangular thing with rounded edges and a grid of icons that happens to look like everyone's idea of roughly what a smartphone should be.

Apple's approach has, of course, been phenomenally successful - making it, as of this moment, the most successful technology company in history.

But I would like to think that, in time, users will see through it. They will demand choice, and standardisation, and commoditisation, as well as innovation. And they will continue to choose Apple, and make the company and its investors very rich, because Apple makes damn good products. But they want to choose because of decisions they make on the high street or online, not decisions made in a US court. And they will one day look back and laugh at how stupid it was that one company tried to stop another making a good product because it looked and acted somewhat like its own.


RIM / Blackberry: one upgrade cycle from oblivion

bryang | No Comments
| More

News that the government is close to giving security clearance for use of smartphones other than Blackberry is another small step in Research in Motion's (RIM's) seemingly headlong rush into potential oblivion.

When newly appointed ministers in the coalition government came into Whitehall in 2010, the more tech-savvy among them were asking, "Why can't I use my iPhone?" There was genuine frustration that they were forced to use Blackberry. Since then, it's only been a matter of time, and the impending changes to the "impact level" security clearance scheme will open up the government market to new entrants.

The pressure and expectation now placed on the forthcoming - but much delayed - Blackberry 10 operating system and its associated devices to turn around the ailing mobile maker's fortunes is immense.

If BB10 flops, or is even perceived to be a flop, investors will bail out and the chances of the company surviving in its present form are almost zero. Let's look at the numbers.

RIM's market value peaked in May 2008 at over $78bn. Today it's worth just $3.8bn. The company's book value - its total assets less liabilities - is $9.6bn.

You could buy all of RIM, even at a 100% premium to its current share price, pay off its liabilities, sell all its assets, and be left with a profit of $2bn. You don't get a much bigger turnaround challenge in business than that.

If the share price drops further after BB10, the firm becomes a bargain too good to ignore for likely predators such as Microsoft/Nokia, Google/Motorola, Samsung, or even IBM - the latter was recently rumoured to have enquired about RIM's enterprise business.

Look at the numbers another way.

RIM has about 78 million subscribers for Blackberry worldwide. Estimates of the firm's average revenue per user (ARPU) seem to vary depending on where you read them, but at the most simplistic calculation, with total sales of $18.4bn in its fiscal year to March 2012 (and bear in mind that quarterly revenue since has plummeted), that gives an ARPU of $235 per year.

So if you bought RIM today at, say, a 30% stock price premium (typically sufficient to persuade shareholders to sell), then upgrade only half of the current subscriber base to a new device such as Android or Windows Phone, and even if the ARPU drops by, say, 25%, then your first-year post-acquisition revenue still exceeds the purchase price by $1.8bn.

RIM makes a gross profit margin of around 30% (but a net loss of over $500m in its most recent quarter), so if you strip out much of the firm's costs, you recoup your acquisition price within three years.
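The back-of-envelope sums in the last few paragraphs can be reproduced in a few lines of Python (a sketch using the post's rounded figures - compounding the rounding puts the first-year surplus nearer $1.9bn than $1.8bn, but the point stands):

```python
# Sanity-check of the RIM numbers quoted above. All inputs are the post's
# own figures: $3.8bn market cap, $9.6bn book value, $18.4bn FY2012
# revenue, 78 million Blackberry subscribers.

market_cap = 3.8e9       # RIM's market value today
book_value = 9.6e9       # total assets less liabilities
revenue_fy2012 = 18.4e9  # sales in the fiscal year to March 2012
subscribers = 78e6       # Blackberry subscribers worldwide

# Scenario 1: buy at a 100% premium and liquidate at book value.
price_100 = market_cap * 2
liquidation_profit = book_value - price_100  # roughly $2bn

# Scenario 2: buy at a 30% premium, migrate half the subscriber base to
# another platform, and assume ARPU falls by 25%.
arpu = revenue_fy2012 / subscribers          # roughly $235 per user/year
price_30 = market_cap * 1.3
retained_revenue = (subscribers / 2) * arpu * 0.75
surplus = retained_revenue - price_30        # first-year revenue over price

print(f"ARPU: ${arpu:.0f}/year")
print(f"Liquidation profit: ${liquidation_profit / 1e9:.1f}bn")
print(f"First-year surplus: ${surplus / 1e9:.1f}bn")
```

None of this is investment analysis, of course - just the same arithmetic, written down.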

I'm not a stock market expert, but that sounds like a good deal to me.

RIM's financial report acknowledges that its business heartland is being chipped away by the rise of bring your own device (BYOD) schemes, so it needs to convince consumers that they want a Blackberry rather than an iPhone or Android device. RIM's competitive edge came from the security of email through Blackberry Exchange Server, but there are security software packages for other mobile environments now too.

Outside of teenagers who love Blackberry Messenger - who aren't exactly business users - fewer people today turn to Blackberry as their smartphone of choice. RIM is trying to attract app developers to make the platform more appealing, but with limited success. With most mobile contracts lasting between 12 and 18 months, Blackberry could be one upgrade cycle away from vanishing.

And it's worth mentioning that the latest iPhone 5 is expected to be released before BB10, resetting consumer expectations of their smartphone yet again.

In the not-too-distant future, MBA students will be reading extensive case studies about RIM. There's an outside chance they might be reading about a great business turnaround, an example to all firms of how to turn decline into success.

More likely, they will be reading about how a technology company that was once dominant in its market disappeared into oblivion in barely five years, thanks to the relentless pace of innovation and technological change.

Oh, and RIM won't be the only one.


IT exams are a waste of time - let's scrap them

bryang | 1 Comment
| More

Let's save us all some time. After this year's A-level results, please read what we wrote about at the same time last year, and the year before - and, frankly, probably every year for at least the last five years.

We can easily regurgitate the same headlines: "Number of students taking IT-related exams falls again." We could almost use the same words in every story, just change the numbers slightly.

It's beyond parody, really: ICT and computing GCSEs and A-levels are little more than a joke these days. Just 297 girls sat the computing A-level, for example. What's the point?

The curriculum for ICT and computing is so poorly perceived that IT employers pay it no attention. Hardly any companies look for new recruits with those qualifications - maths, sciences, even languages are more likely to get you a job in IT.

The government has at least finally recognised that the GCSE curriculum is a waste of time, and education secretary Michael Gove duly scrapped it earlier this year - but he hasn't replaced it, leaving a vacuum that will likely see student numbers drop even further.

So should we bother at all with IT education in schools? Why not just look for students who have done well in the basic science, maths or engineering topics and leave the IT training to employers?

Well, if IT employers still funded sufficient training, maybe we could. But lack of training remains one of the biggest skills issues facing the IT profession.

We are genuinely fed up with having to write the same story every year. Each time, the same commentators and experts bemoan the lack of progress, but nothing changes. We all know what needs to be done - IT employers need more outreach into schools; the IT profession needs to promote better role models to attract kids to study with the aim of a career in IT; the curriculum needs to reflect the digital skills we will need in 10 years, not those we had 10 years ago.

But it's hard to have any confidence whatsoever that it's going to happen soon. Perhaps the demographic timebomb will need to explode before anything happens - but by then it may be too late.

At the very least, let's do one thing now - recognise that the current exams at all levels are a waste of time, and scrap them. Perhaps that, at least, will spur employers, academics and politicians to make the radical changes that IT education needs.


Banks need to tackle IT complexity to avoid further regulation

bryang | 2 Comments
| More

Complexity is the common enemy of every IT leader. Ask any group of IT managers what is their biggest day-to-day challenge, and the answer you will most often receive points at the complexity of legacy infrastructure.

For too long, IT departments and their suppliers have used that complexity as a protective suit - when asked, "Can you do this?" the response has been, "Ooh, that's complicated."

For many years, it's been easy to get away with it, as the business users or IT buyers were at a disadvantage in their technology knowledge.

But things have changed. Now, everyone's expectations of technology simplicity are set by Google, Facebook and eBay. We as IT experts might understand the incredible complexity behind the ability to "Like" the Team GB Olympics page, but all users see is the simplicity of clicking a button.

Those expectations are what IT departments now have to live up to.

Nowhere is this becoming more apparent than in financial services. There are growing calls for IT to be regulated across the banking sector after the Royal Bank of Scotland and NatWest fiasco that prevented customers accessing their accounts.

A report this week by IT trade body Intellect estimates that banks spend 90% of their IT budget on managing legacy infrastructure - that is simply unsustainable.

Banking systems are among the most complex around - developed over years, often reliant on ageing mainframes and applications that are well past their best-before date, but which do their job day in, day out. Add to that the number of mergers in the sector, with overlapping systems having to be integrated - and add too the customer-facing systems that have been bolted on top to deliver online and mobile banking.

It all works, mostly. And the cost of updating a mostly functioning but ageing system is difficult to justify when the replacement will effectively do the same job, just with newer hardware and software. It's even harder to justify in the middle of an ongoing banking crisis.

But if IT now represents the arteries of the financial world, then those arteries are increasingly sclerotic and one day the blockage will become terminal.

Banks are in an impossible position - needing to spend money to remove complexity with little return on investment. But if they want to avoid regulated IT systems, and have an IT infrastructure with the flexibility to cope with rapidly changing customer requirements, that cash is going to have to be spent.


Making the Olympics a showcase for IT

bryang | No Comments
| More

So far, the London 2012 Olympics has been a triumph all round. An amazing opening ceremony, Team GB gold medals sprinkled generously around, and even the transport system has coped.

Ironically, one area that has come in for criticism - albeit at a low level - has been aspects of the technology.

Fans at the cycling road race were told not to use their smartphones so much and to use social media less, after the overload on mobile networks meant that GPS data from the bikes were not relayed to broadcasters, leading to criticism of the TV coverage.

The ticketing website continues to attract complaints too - for a population used to real-time online purchasing, attempts to buy last-minute tickets have proved frustrating for all, and the site seems to struggle with the traffic.

We all knew this would be the most connected Games ever, but it just goes to show that for fans, technology is as integral to their Olympic experience as the venues and the sport itself. The London IT team has done a fantastic job overall - as has the BBC IT team behind the most comprehensive coverage of any sporting event in history - but there's a big lesson for the International Olympic Committee and key IT supplier Atos to learn for Rio in 2016.

The approach to IT in London has been understandably low risk - Cisco, for example, was told it had to supply only proven products that had been on the market for several years. The Games was never intended to be a showcase for the latest technologies, no matter how much the public wants to use the latest devices to follow the action.

Looking ahead, surely there is a strong case to be made for cloud-enabling the Olympics. At the moment, organisers have to set up their own IT infrastructure every four years, often replicating with minor tweaks the systems used last time.

A cloud-based Olympics IT set-up would eliminate that effort - and cost - and also provide some consistency and elasticity for online ticket sales.

If the Olympics are a showcase for the best of sport, and for fans to demonstrate their love of technology, there is a great opportunity now to make it a showcase for the best of technology too.


UK's IT skills base gets the thumbs-up - government please take note

bryang | No Comments
| More

As the Olympic Games finally opens in London this week, there are 300 or so highly skilled IT experts about to be out of a job. Thanks to some of the leading lights of the new technology economy, they might not have far to look for a new employer.

Those 300 IT experts, currently employed in the IT department of London 2012 organiser Locog, will of course be rather busy over the next few weeks, part of a 5,000-strong technology team that includes 2,500 volunteers plus staff from the key Games IT suppliers, such as Atos, BT, Samsung, Acer and Cisco.

Perhaps Amazon and Facebook should give them a call - the two US giants both announced major investments in London this week.

The social media firm is opening its first software engineering centre outside the US in London - a major coup for the UK's IT skills base. "London is a perfect fit for Facebook engineering - it's a global hub, and it has a vibrant local start-up community with lots of great technical talent," said Philip Su, who will head up the London team.

The Facebook centre will initially create 22 jobs, but the numbers will grow quickly.

Amazon, meanwhile, is opening an eight-floor development centre, a move described as "a splendid feather in our cap" by London mayor Boris Johnson.

"London is a hotbed of tech talent, and testament to that fact is Amazon choosing the capital as the location for its new global digital media development centre," said the site's managing director, Paula Byrne.

But Amazon and Facebook may find recruitment competition closer to the Olympics home, as news emerged that the media centre in the Olympic Park will be turned into a datacentre and digital incubator after the Games, exploiting the 600km of fibre and copper cabling installed in the building. The winning bidder - iCity, a subsidiary of UK datacentre operator Infinity - will create as many as 6,600 jobs.

Elsewhere, the UK's recession has deepened, with the latest GDP figures showing a further 0.7% contraction in the economy.

Given all these developments, if you were a government in desperate need of private sector growth, what industry sector and which in-demand professional skills would you invest in?


IT fiascos will continue where IT is seen as "back office"

bryang | 1 Comment
| More

The latest in a series of outages, fiascos and scandals has put outsourcing under fresh scrutiny.

First, the RBS/NatWest IT problems led some critics to point the finger at offshore outsourcing. The O2 network outage raised questions over the mobile operator's managed service. And now the G4S Olympics security scandal (with its shameless attempts to blame the IT) has national newspaper commentators debating the worth of outsourcing government services.

Labour leader Ed Miliband has joined in too, questioning the role of G4S and others in outsourcing (or privatising as he prefers to call it) aspects of the police. According to the BBC, "he was not opposed to private sector involvement, but said it should be restricted to back-office functions, such as providing computer systems."

And here's the rub. In every case above, outsourcing of IT has been implicated, even if the problems have nothing to do with the fact IT is outsourced. The telling phrase in Miliband's comment is not the "private sector involvement" but the line "back-office functions, such as computer systems."

The problem here is not the outsourcing of IT. It's the attitude that IT is a back-office function, and therefore unworthy of strategic consideration.

As police officers increasingly rely on IT, and use smartphones as essential tools on the beat, how can that be seen as "back-office"?

All those RBS press releases over the past few years, announcing more job losses in "back-office functions" that included IT, were phrased as if to say, "Don't worry, it's not job losses in anything we do that matters." I doubt many RBS executives thought IT didn't matter when customers couldn't access their money.

Is it possible to outsource IT and for it to still remain strategic? In some areas, yes - physical hosting of servers, running email, even important but mature applications such as payroll and accounts, for example - but it's the attitude and the reasons for outsourcing that matter most.

Company executives need to stop looking at IT as a back-office function. The successful businesses of the future - and indeed the efficient public services - will be those that put technology front and centre in their strategic planning and customer/citizen engagement. Tomorrow's leaders will see technology as a competitive weapon, not an administrative necessity. You don't put your competitive edge in the back office.

Every company should debate the value of outsourcing - but should not question the strategic value of IT.


Why we don't want to write about 'women in IT' anymore

bryang | No Comments
| More

We've been asked a few times why we put together an award and an event to showcase women in IT. It's quite simple - we don't want to have to discuss the issue of women in IT again.

How much better it would be if this were no longer a topic of such debate. That would mean we had a truly diverse workforce in IT, one that reflected the technology users it serves and took advantage of the range of skills available from employees of every age, gender, race or creed in the UK.

It would mean that as we look to find the 250,000 new entrants to the IT profession that are forecast to be needed over the next five years, employers would be able to choose from the widest source of talent possible.

It would also mean that at school, boys and girls equally saw IT as a desirable career to pursue, creating a pipeline of the skills required to develop the UK as a high-tech economy.

So, since none of those things is true at the moment, we thought it was a good idea to recognise the most influential women in UK IT, and to promote them as role models to help get a little closer to the situation described above - one we should be able to take for granted.

Several guests at our event commented that they didn't realise there were so many women in senior IT leadership positions in the UK - just take a look at the 25 names on our list. It's certainly true to say that male IT leaders are far more likely to successfully self-publicise as a means to enhance their career prospects. Women are more likely to just get on with the job, and hope they will be acknowledged as such. We felt it was only right to give some of them that public acknowledgement.

But this is not about promoting women in IT, for the sake of women in IT. As Jane Moran, the Thomson Reuters CIO voted as the most influential woman in UK IT, points out - companies with a more diverse workforce are more successful. This isn't an esoteric do-gooder cause - it's good business sense. More women in IT means their employers make more money. It's as simple as that.

We will be back with another women in IT event next year. Wouldn't it be great if it could be the last one?


Time to invest in the future - and the future is technology, not banks

bryang | No Comments
| More

After the latest banking industry scandal, it is surely time for the government to invest in the future - and that future is surely technology.

For the last 25 years, since Margaret Thatcher deregulated the City of London in the so-called Big Bang, government industrial policy has favoured and nurtured the UK's financial services industry.  For most of that quarter-century, we all benefited, as ever more creative banking products caused a debt-fuelled economic boom of unprecedented proportions. We all know what happened next.

The debt crisis is showing no signs of easing in the near future, and the Eurozone lurches from short-term fix to short-term fix. The latest banking scandals - Libor manipulation, financial mis-selling etc - have led many to believe that the UK's banking industry is institutionally flawed and motivated by avarice.

There is a growing sense of inevitability that the financial services industry is heading east to escape the bitter censure it increasingly receives in the West.

It seems more and more likely that the Eurozone will impose some form of financial transactions tax. David Cameron will try again to resist, but public sentiment towards the banks is in the toilet, and people feel it is time the banks paid for the austerity they induced. Greater regulation is also inevitable.

The long-touted excuse that banks need to make huge profits and pay executives exorbitant bonuses to attract the best people to work in the UK is becoming a lame justification, as those "best people" get exposed for their parts in the latest scandals.

BBC Newsnight correspondent Paul Mason (an ex-Computer Weekly journalist, by the way) wrote a devastating commentary on Barclays, revealing that the bank has taken £3bn of capital out of manufacturing industry, and more than £3bn out of the retail/wholesale sector.

It seems that banks are not only feathering their own nests, but taking the contents of others.

China will be only too keen to encourage a shift in the money markets to the East. It's not going to happen overnight, and there will always be a sizeable finance industry left in the UK, but slowly, inexorably, over the next 20 years or so - perhaps less - the UK will no longer be able to rely on financial services as a cash cow as finance firms move their headquarters to more welcoming environments.

That all means that this and subsequent governments need to decide what sort of economy the UK should have in 20 years' time, and set policy and investment priorities for that now.

Surely, it must be obvious that such a future will be as a high-tech, digital economy. Successive governments have promised to deliver a Digital Britain, but have mostly delivered only rhetoric.

There are obvious things to do: invest further in fibre broadband; speed up the roll-out of 4G mobile networks; offer tax breaks for research and development; develop IT skills at all stages of education; encourage and support more technology clusters such as the much-talked-about Tech City in London - the list goes on, and is not a difficult one to produce.

Financial services is no longer the future for the UK economy - technology is. We need policy-makers, lobbyists, advisers, politicians and decision-makers in government to accept that and act accordingly.


RBS fiasco is a wake-up call to boardrooms of the importance of IT professionals

bryang | No Comments
| More

"Financial services can be thought of as a technology business with a financial domain of expertise."

That quote came from the CIO at investment banking giant JPMorgan Chase, and it's one that senior executives at Royal Bank of Scotland (RBS) would do well to pin on their walls.

After days of conspiracy theories about the software failure that has caused such huge reputational damage to RBS and its NatWest subsidiary, it's becoming clearer what seems to have happened.

Speculative fingers are starting to point at the bank's CA7 mainframe batch processing software from CA Technologies, suggesting that what should have been a routine maintenance patch caused a wholesale collapse of an overnight batch run, leaving hundreds of thousands of financial transactions unreconciled.

The massively complex, massively interconnected nature of banking software systems means the knock-on effects of rolling back to the previous state and re-applying all those transactions have taken days and caused huge customer discontent.

It's not due to offshore outsourcing, or even to non-offshore outsourcing. The biggest failure seems to have come either in testing or through poor contingency planning for how to deal with such a problem.

But it's difficult to get away from the fact that RBS has laid off thousands of IT professionals in the last few years, and almost impossible to think that wouldn't have had repercussions.

The RBS problems should be a wake-up call to the boardroom of every major bank, telling them that their operation is entirely dependent on IT and in particular on the IT resources they have to run their systems.

The RBS fiasco makes it plain that the company cannot scrimp on IT expertise and resources, and that skilled IT professionals are central to its business.

The post mortem on RBS/NatWest will no doubt highlight the importance of testing, of business continuity planning and contingency management. But its biggest lesson should be that IT professionals are now the core of every major organisation, and that boardrooms everywhere need to recognise them as such if they are to avoid the sort of disaster that has hit RBS.

