Archive for March, 2010

A chat with Luis Corrons, technical director at PandaLabs

March 31, 2010

PandaLabs has just published its Quarterly Report, January-March 2010. This gave me the reason and opportunity to chat with Luis Corrons, PandaLabs’ technical director (but don’t let this stop you reading the full report – it’s well worth it).

Luis Corrons, technical director, PandaLabs

We discussed a number of issues. The first was Mariposa, the largest botnet ever known, with more than 12 million infected computers under its command. PandaLabs was instrumental (along with the Canadian security firm Defence Intelligence) in the takedown of Mariposa in February, and the subsequent arrest of three gang leaders at the beginning of March.

Mariposa is not the only recent success against botnets: Microsoft has been successful in ‘crippling’ the Waledac botnet. But there is much debate about how successful Microsoft has actually been. This is because Waledac was effectively disconnected rather than destroyed; in theory, or so it is suggested, Waledac could return. So my first question to Luis was simply, will Mariposa stay down? Luis thinks it will.

“We didn’t just get the command and control servers; with Mariposa we got the guys behind it. The problem is that we can take down the botnet but the criminals are still out there and can start a new botnet – that happens most of the times – but in this case the takedown is permanent. The botnet has been dismantled and the organisers caught.”

Luis explained that botherders are generally very good at hiding their whereabouts.

“These guys would use an anonymous VPN service to communicate with the control servers. We never knew where they were. So we could take over their botnet, but we didn’t know who they were. But on this occasion their leader got nervous. We had control of Mariposa. He wanted it back. He got sloppy. He forgot to use the VPN and tried to regain Mariposa using his own home computer. That gave us his IP address – and we’d got him.”

By that one single, simple error did Mariposa fall. Luis is happy about that. But there are two other aspects of this case that are more worrying. The first is that these ‘hackers’ are not true hackers; they seem to have very little genuine technical expertise. What they had was a botkit, a piece of software that can be bought over the internet. There was a time when script kiddies were more of an annoyance, just internet noise, rather than a serious threat. But the quality of today’s scripts, which are now sophisticated ready-made hacking tools, means that just about any script kiddie wannabe hacker can do serious damage. And that really is worrying.

The second concern is worrying in a different way. Luis doesn’t think there will be a jail sentence for the Mariposa botherders.

“Probably they will be released free, without prison. In Spain it is not illegal to have a botnet, even one comprising more than 10m computers. I know the police in this case; they know the guys are guilty and can be proven guilty; but they don’t think that these guys will go to jail. In fact the police had to accuse them of cyberterrorism, which is not really accurate in this case; but that was the only way they could arrest the leader and confiscate his computers for forensic analysis.”

[Memo to Mandelson: WTF are you doing faffing around with trying to get disconnection for private citizens that you cannot possibly prove to be guilty when Spain cannot even lock up proven criminals that have been instrumental in countless identity thefts all round the world? Don't bother answering that. Just pack your bags and bugger off.]

Next we talked about Aurora. How, I asked him, could anti-malware software protect you from sophisticated spear-phishing coupled with zero-day exploits?

“My basic security advice is to have everything patched and don’t trust anyone. But in this case that wouldn’t have worked. The combination of a zero-day exploit and such a targeted attack – someone you know talking about something you’re both interested in with a link to more interesting information – that’s really, really difficult to resist. The weak link in this is always the user, and in general the user is easy to fool – and that’s why so many people get infected. Even if you know about security, and you know you have to be careful on the internet, no-one is safe when something is really targeted at you. I’m not really optimistic – there is no way to be 100% safe – you can be pretty safe, but you cannot guarantee security. OK, you’ve got your anti-virus and it’s up to date, but they will know which anti-virus you’re using and they will test their trojan against your anti-virus to see if it is detected before they attack you with it. They will have studied your movements and know your weakpoints.”

Paradoxically, in such a situation, your security defences merely confirm your security weaknesses. But it was a nice link to my next subject. I wanted to know Luis’ opinion on my claim that the majority of security product testing is a waste of time. You may recall that I said: “If the product in question is in any way anti-malware, the vendor can simply claim that the product kills 99% of all known germs. The validation process will inevitably prove it to be true and the company has a marketing bonus that is actually meaningless. Why? Because the product will inevitably be tested against the Wild List.” I wanted a view from the coalface and expected protestations – but I got a surprise:

“That’s an interesting topic. Using the Wild List as a test to say this particular anti-virus is good is impossible – and most of the tests used could be considered useless anyway. Most take a bunch of virus samples, scan them with the anti-virus product and look at the results. Is that accurate? Well, yes if you’re measuring signature recognition – but most of the AV companies rely on a whole bunch of other detection measures that aren’t tested: heuristics, behavioural blockers, action in the cloud and others. If a test is a real-life test simulating real-life conditions, it would be good if you could see when and how the threat was stopped. But there are very few people who can do this sort of testing; and it would be really expensive. There are a few AV researchers in universities and elsewhere that are working on this kind of test – but most of the tests we come across are useless because they do not reflect the real situation. But users read them.”

Takeaway

So what can we take away from this interview? We have the technical director of one of the world’s leading anti-malware companies declining to claim that his products would keep you secure from Aurora-like APTs, admitting that you cannot be secure on the internet, and agreeing that most of the product tests we come across are worthless. Frankly, I feel a whole lot safer with this sort of open honesty from a man at the top of the security industry than with the more common ‘we’ll make you 100% safe’ marketing hype we usually come across. PandaLabs just went up in my estimation.

Categories: All, Security Issues

Sustaining West European support in Afghanistan – the CIA’s view

March 28, 2010

The WikiLeaked CIA Red Cell Special Memorandum on “Sustaining West European Support” for the Afghan war is disturbing. It’s not so much the function of the document. After all, it’s basically a standard PR document of the type that all large companies produce under crisis management; and we tend to accept that business needs to do this. No, this is disturbing because of the cynicism involved in national manipulation and the end product of the ‘business’: war and death.

The document starts:

The fall of the Dutch Government over its troop commitment to Afghanistan demonstrates the fragility of European support for the NATO-led ISAF mission.

and the first three paragraph headings explain both the problem and the perceived solution:

Public Apathy Enables Leaders To Ignore Voters. . .

. . . But Casualties Could Precipitate Backlash

Tailoring Messaging Could Forestall or At Least Contain Backlash

At one level it is a fascinating discussion of European attitudes. It talks about France and Germany, the third- and fourth-largest troop providers to the war. In both countries it is the women who are considered the weak or danger point. While women are anti-war, men are either apathetic or vaguely supportive. The French concern is over civilians and refugees. The German concern is more pragmatic: they are “worried about price and principle”.

But all is not lost, the force for change still has some ‘traction’:

The confidence of the French and German publics in President Obama’s ability to handle foreign affairs in general and Afghanistan in particular suggest that they would be receptive to his direct affirmation of their importance to the ISAF mission—and sensitive to direct expressions of disappointment in allies who do not help.

Does this mean that the Americans believe that Europe supports America because it believes in Obama? Oh dear, no. The CIA believes that it can manipulate European loss of favour with Obama to its advantage:

European hand wringing about the President’s lack of attendance at a EU summit and commentary that his absence showed that Europe counted for less suggests that worry about European standing with Washington might provide at least some leverage for sustaining contributions to ISAF.

But I have to admit that it is not the mind-numbing cynicism of this document that worries me most: it is the absence of any need to bolster support in the UK. What does this mean? Does the CIA believe that it has the UK Government so deeply in its pocket that the UK support is a permanent given? Does it mean that the CIA believes it can do no better a job at disinformation and public manipulation than its UK counterparts and the UK government? Think of the Blair/Campbell public manipulation, and any of the lies and falsehoods that have come out of our current government. For me, the biggest concern about this document is that it shows the Americans already consider the UK to be the 51st State.

Categories: All, General Rants

Another sloppy phishing attempt

March 27, 2010

In business, the ultimate accolade is to be head-hunted. On-line, the ultimate accolade is to be spear-phished.

But this is an insult! They could at least make it believable. It’s not the grammar – anyone can make those mistakes. “We are having congestions…” OK, take some cough mixture.

“Due to the congestion in all hotmail users…” Damn, this cough is contagious!

Nor is it the lack of logic.  “Hotmail would be shutting down all unused accounts…” Not much point in writing to me if it’s unused, because I won’t read it.

Nor even is it the sloppiness. “We apologize for any inconvenien”

No. The thing that gives this away as a very poor phishing attempt is the obvious outright lie. “This Email is from Hotmail Customer Care.” Every hotmail user in the world knows there’s no such thing!

Categories: All, Security Issues

Democracy in the United Kingdom

March 25, 2010

The democratic basis of the United Kingdom is under threat. This is happening right now.

What is that democratic basis? It is a trias politica; the separation and independence of the three functions of government: executive, legislature and judiciary. In the UK, the executive is the Cabinet; the legislature is parliament; and the judiciary is the Courts. Enforcement is controlled locally by the police, and internationally by the Armed Forces. They exist to enforce the edicts of the legislature, and whether they get it right or wrong is determined independently by the judiciary.

The first breakdown in this democratic principle came with the castration of parliament: it has had its balls removed by the executive. Parliament no longer matters: it is bullied, steamrolled and ignored by the executive. Its wishes are irrelevant. It might as well not exist – its sole remaining purpose is to make us, the people, believe we have a say in government.

The second breakdown came with the castration of the Cabinet. It no longer matters either. It is peopled by self-seeking Yes Men (and Women) who either want the extra pay and prestige or who are actively jockeying to be the next Leader. But they no longer matter or have any real say in policy: they have absolutely no say in government.

Government is now controlled by the Leader, who has an inner clique whose primary function is to maintain his position. This usually comprises the Home Secretary, the Minister of Justice and the Chancellor. The current variation is that it really comprises only the Secretary of State for Business, Innovation and Skills (worryingly, someone who has no elected mandate from the people whatsoever).

So democracy now comprises an executive of one person and a judiciary.

The independence of the judiciary is also under threat. Judges are ultimately appointed by the executive. Traditionally, they are kept in check by an independent trial jury of twelve citizens – and we already have situations where juries are dispensed with. But the more worrying trend is that the judiciary is being side-lined.

This is being achieved by a gradual (actually, not so gradual) movement away from a presumption of innocence to a presumption of guilt. We are increasingly guilty unless we can prove our innocence; and this in turn is increasingly done either by our presence within, or our absence from, one or more of the increasingly inclusive national databases.

This is most obvious with the National ID Register and ID Cards. Their purpose is to prove that, individually, we are not benefit cheats, illegal immigrants, paedophiles, terrorists or gun-running, money-laundering mafia overlords.

A very close second comes the National DNA database which contains DNA records of more innocent people than criminals. If we haven’t done anything wrong, why should we worry, we are told. But that’s the whole point: in a true democracy it is up to the police to prove to the judiciary that we are guilty of a crime; it is not up to us to prove we are innocent.

This process is now neutering the judiciary. Guilt will no longer be decided by argument and proof, it will be decided by the inclusion or absence of database records. No judgement will be required by judges, merely the pronouncement of statutory sentences. And these sentences come from the executive, which we have already seen to be effectively the Prime Minister.

In other words, shocking as this may sound, we in the UK are on the verge of being ruled by dictatorship. Many of us believe that this is already happening.

Categories: All, General Rants

Sacred cows fall at Pwn2Own

March 25, 2010

One thing in this life is certain: if you set something up, someone will knock it down. That is just what has been happening at the Pwn2Own contest run by security company TippingPoint’s Zero Day Initiative (ZDI) in Vancouver. The impregnability of the iPhone has gone. Security researchers Vincenzo Iozzo from Zynamics GmbH and Ralf-Philipp Weinmann from the University of Luxembourg stole the SMS database from an iPhone that visited a malicious website.

The researchers have declared, under the rules of the competition, that they won’t release details until after Apple has had a chance to patch the vulnerability – but it just proves that nothing is safe, and even iPhone users need to watch where they’re going.

Apple’s Safari browser running on the latest Snow Leopard version of OS X also fell to proven Mac hacker Charlie Miller, again via the process known as drive-by hacking. When a conference organiser pointed Safari at the poisoned web page, Miller’s exploit took control of the MacBook.

Peter Vreugdenhil took down IE8 running on Windows 7. He managed to bypass Windows’ DEP (Data Execution Prevention) and ASLR (Address Space Layout Randomization) protections, which are specifically intended to prevent such attacks.

Firefox on Windows 7 fell to a German researcher known as Nils, using a new zero-day vulnerability. Mozilla has only just released version 3.6 (with commendable speed); so we can expect 3.7 before some laggards even update to 3.6.

All in all, the only target that wasn’t attacked and defeated was Google’s Chrome; presumably because no-one yet has a working exploit. But give it time. What Pwn2Own does is demonstrate that nothing is ultimately secure. We just have to be very, very careful about what we do and where we go, whatever we’re using.

Categories: All, Security News

The electricity of the future? Brown says so!

March 22, 2010

Well, it’s true. The rumours that have been floating around were right. Gordon Brown has delivered his speech claiming that the internet is the electricity of the future, and that universal access to broadband should be a fundamental right. So what of the Digital Economy Bill? Well, we’ll come back to that. First we should at least briefly talk about the main thrust of this speech.

Because, frankly, if we could believe him, and believe in his intent, then it would be a wonderful thing. There will be a new mygov site, opening both national and local government to the people via the coming semantic web, headed by luminaries including Berners-Lee and Martha Lane Fox. Broadband will be brought to 100% of the population (funded by a new £6-per-year tax on all phone-owners). There will be a new web science institute with £30m funding. The people will be able to shape the government of the future! And it will generate mind-boggling savings in the cost of government.

It sounds wonderful, and there will be many detailed analyses of what he said, and declined to say, over the next few days. But my initial thoughts are that what he described will never happen. Oh, we might get interactive government, but it will actually be more closed than ever. Because those who control the pipes control what goes through them; and the Digital Economy Bill will give government full control over those pipes.

It will also force people to accept national ID cards – because that will be the way in which we authenticate ourselves in order to have access to this wonderful new e-government. And if we haven’t got the card, we cannot access government services (which will include paying taxes and claiming benefits and probably even access to our GP).

Speaking in a different context, John Young (a living legend for free speech) has recently commented about freedom of information, “…once laudable and honorable freedom of information processes have evolved into shrewd and lucrative disinformation distribution tools for authoritatives of all stripes — gov, mil, com, edu, org, MSM.” This is how Brown’s open government will evolve, into something that appears to be open but effectively, and efficiently, hides what we aren’t meant to see. At the same time it provides the means to make the National ID Register necessary for all of us to accept. Ultimately, it just concentrates more power into the hands of government and those who control government. If you don’t do what you’re told, we’ll cut off the pipe!

I see far more dangers than benefits in this.

Categories: All, Security Issues

2012 Armageddon Blamed on Twitter

March 18, 2010

Search Google with “social network blamed” and you get the following top hits:

  • Online social networking blamed for rise of AIDS cases
  • More teen troubles blamed on social networking
  • Social Networking Sites Increasingly Blamed in Divorces
  • Mother blames social networking website
  • Social Networks Blamed For $2.25B In Lost Productivity
  • 1 in 5 Divorces Blamed on Facebook
  • Facebook fuelling divorce, research claims
  • Texting, online social networking blamed for poor English skills
  • Economist Blames Twitter for Down Economy

All of this comes from the first two pages, and I’ve omitted most of the others simply because they are different takes on the same headline.

Can this be true? Is social networking the cause of everything wrong in society? Is it what lies behind global warming? (Actually, it probably is, since the energy consumed by all of Google’s, Twitter’s, FaceBook’s, Bing’s etc. servers is quite staggering.)

But isn’t it time to stop the blame game, and take responsibility for ourselves upon ourselves? While there are many individual tragedies behind these headlines, we have to stop trying to blame other people for our own misfortunes. It’s a by-product of socialism gone wrong: the State will provide. If the State provides, where is the incentive for us to provide for ourselves? I’m not saying the State shouldn’t provide in cases of genuine hardship and misfortune; but the by-product is that we have become too happy to say “it isn’t my fault, so it must be someone else’s”.

It’s got to stop. Social ills are caused by social problems; not by social networking. The solution is for us to be more security aware and to teach our children to be aware, not to automatically blame something or someone else.

Having said all this, I am praying for a new headline in a few weeks’ time: “Brown blames it all on Twitter, as he packs his bags”.

Categories: All, Security Issues

Shocking new survey

March 18, 2010

A shocking new survey conducted with a number of random members of the public has exposed surprising and worrying details that should concern everyone:

  • More than half of the population has little more than average computer awareness
  • Nearly all parents whose kids have access to the internet admit to having children who are either male or female
  • More than 90% of hacking victims own a computer
  • Most victims of drive-by hacking say it was because they were using the Internet
  • The entire population of the country is less than the sum of its minority groups
  • Statistically, you are either gay, lesbian, an illegal immigrant single parent, benefit-scrounging gang member hoodie, or not.

There is a serious side to this survey. Within any wide-ranging survey you can always find statistics to suit your own spin. So please take the survey headlines you read with a pinch of salt. Vendors have products to push; publications have magazines to sell; governments have populations to control. In all cases, it is easier if we are frightened. Take care; but not scare.

Categories: All, General Rants

I wish I could live without FaceBook

March 17, 2010

I wish I could live without FaceBook. But it’s become a necessary part of business life. Like many others these days I work from home and live in the sticks. It has its advantages – but commerce isn’t one of them. I get my work from other people. But these are people I never meet. So I need to use social networks to maintain visibility with actual and potential clients.

So I use FaceBook, Twitter and LinkedIn. LinkedIn is great for campaigns, but it requires a degree of planning and is not so good at maintaining a continuous presence. Twitter has a different use: it is invaluable as a resource, for hearing news as soon as it happens and discovering research material that would otherwise be lost on the Internet. But it’s not so good at creating a presence (although incredible at maintaining and increasing the visibility of the already visible). And that leaves FaceBook. FaceBook is the medium for seeing and being seen. Of course, you first need to persuade potential clients to become actual friends, but once that’s achieved it is ideal for staying in touch and making sure everyone knows you’re still alive (and available).

I wish I could live without FaceBook. You see, I just don’t like its attitudes or ethics. There’s the whole thing about its users’ privacy. And there are rumours of foul play at Harvard. And now there’s this disclosure by the EFF:

EFF has posted documents shedding light on how law enforcement agencies use social networking sites to gather information in investigations. The records, obtained from the Internal Revenue Service and Department of Justice Criminal Division, are the first in a series of documents that will be released through a Freedom of Information Act (FOIA) case…

…The Justice Department released a presentation [that details] several social media companies’ data retention practices and responses to law enforcement requests. The presentation notes that Facebook was “often cooperative with emergency requests” while complaining about Twitter’s short data retention policies and refusal to preserve data without legal process.

For “cooperative with emergency requests” I read “we ask, they give”. Twitter, however, is my kinda guy. I just wish I could live without FaceBook.

Categories: All, Security News

The fully virtualized environment

March 15, 2010

[This article was first published by, and is reprinted here with kind permission of, Raconteur (Raconteur on Virtualization, The Times, 2 March 2010). For more information on special reports in The Times newspaper, call Dominic Rodgers on +44 207 033 2106.]

This is the decade of virtualization, the decade in which the whole data centre will be integrated into a single virtual machine. This is less of a prediction than a simple statement of what will inevitably happen. Just as 2 + 3 = 5, so the current combination of ageing and increasingly complex data centres running out of capacity and space, plus rapidly increasing energy costs, plus recession, plus relevant technology inevitably equals virtualization. That is the argument. The reality is that IT, and therefore the business that it serves, will benefit in many other ways. Virtualization is not merely a defensive action against rising costs, but also an affirmative action to improve the performance of the data centre. Put simply, with virtualization you get much more for much less. And that’s what we’re going to look at: the why, how and what of getting more for less in a virtualized data centre.

Why can’t we just carry on as we are?

The status quo in the traditional data centre cannot continue, for several reasons. The first is very simple. As business expands, and its reliance on IT intensifies, so the computing capacity of the data centre must increase. This has generally meant more equipment occupying more floor space and consuming more energy. Many data centres are now physically bursting at the seams. This usually means that a new site must be acquired and established, or that the company must consider a third-party hosting option. Neither may be entirely satisfactory, and both can be very expensive.

Kennedys, a law firm that has doubled in size in the last four years

The problem
The firm only had real-time disaster recovery (DR) capabilities for its primary systems, and the hosted backup service that it was using was becoming increasingly expensive due to growing data volumes.

The solution
Intercept, Gold VMware Authorised Consultants, came up with a plan to virtualize everything in Kennedys’ core system. The motivation behind this decision was cost – not just the cost of data center floor space but, in particular, energy costs. As the scope of the data center plans had increased, the anticipated costs had risen substantially. Virtualization – and the related cost savings – was identified as the key to making the entire project pay off over the long term. Intercept set about virtualizing Kennedys’ servers using VMware ESX Server – a phase which saw the number of physical, production servers reduced from 60 to just seven, with all the attendant savings in floor space, energy, and backup costs. Intercept’s objectives for this stage were fourfold: provisioning a stable, virtualized storage platform to VMware, adding a full range of DR features, protecting data with full transactional consistency, and leveraging service-enabled mirroring to migrate data from the in-house data center to an offsite location.

The result
“We have the infrastructure that we need,” says Kennedys’ IT director Ian Lauwerys, “and we can focus on improving the applications that we run to make us more productive and more competitive. This makes us, as an IT department, more strategic because we’re not worrying about the day-to-day issues any more. We couldn’t do that easily before, but now we can try all sorts of things without the risk of breaking anything. We’ve got the foundation – now we can focus on supporting the business.”

The second reason is also ultimately based on cost. Most existing data centers were built for a different age, one in which energy was relatively cheap. But times have changed. Energy is now expensive and getting more expensive, and we already have warnings of potential power cuts in the relatively near future. So even if the wholesale price of the gas and oil that provides our energy remains stable, which isn’t likely, the increasing demand on our existing energy supply will inevitably force prices further upwards.

This natural price increase is compounded by the data centres’ increasing power consumption. As long ago as 2007, Gartner noted:

These legacy data centers typically were built to a design specification of about 35 to 70 watts per square foot. Current design needs can vary from between 150 to 200 watts per square foot, and by 2011, this could rise to more than 300 watts per square foot. These figures for energy per square foot represent just the energy needed to power the IT equipment; they don’t include the energy needed by air-conditioning systems to remove the heat generated by this equipment. Depending on the tier level and future equipment density plans in the data center, these cooling needs can increase the overall power requirements by an additional 80 per cent to 120 per cent.
U.S. Data Centers: The Calm Before the Storm, Gartner, September 2007
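
Those figures are easier to grasp with a quick back-of-the-envelope calculation. The short Python sketch below is purely illustrative: the floor area, the power densities and the 100 per cent cooling overhead are assumed values chosen from within Gartner’s ranges, not data from the report.

  # Illustrative arithmetic only: floor area and power densities are assumptions.
  def facility_power_kw(floor_area_sqft, watts_per_sqft, cooling_overhead=1.0):
      """Return (IT load, total load) in kW; cooling_overhead is the extra
      fraction consumed by air-conditioning (0.8 to 1.2 in Gartner's figures)."""
      it_load_kw = floor_area_sqft * watts_per_sqft / 1000.0
      return it_load_kw, it_load_kw * (1 + cooling_overhead)

  for density in (70, 200, 300):   # legacy, current and projected W per sq ft
      it, total = facility_power_kw(5000, density)
      print(f"{density:>3} W/sq ft: IT load {it:.0f} kW, total with cooling ~{total:.0f} kW")

For a 5,000 sq ft room, the same floor space that once drew around 350 kW of IT load could soon need 1,500 kW, and roughly double that once cooling is included.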

This combination of rising prices and increasing consumption against a background of either recession or at least economic doldrums means that costs are in danger of spiralling out of control over the next few years. But there is a solution. Traditional data centres are not merely expensive, they are inherently inefficient in their use of resources. Gartner again:

Utilization of infrastructure remains low for most hardware platforms. A typical x86 server uses between five per cent and 10 per cent of its available capacity during a 24-hour period – reduced instruction set computer (RISC) Unix systems are slightly better, at 10 per cent to 20 per cent.
How IT Management Can “Green” the Data Center, Gartner, January 2008

Fredrik Sjostedt, Director – EMEA Product Marketing at VMware, explains it like this: “Over the last 10 years, data centres have become x86 processor-based. The way organizations have been deploying this has been to install one application per server, so that no single application can bring down another or start to consume too many resources to another application’s detriment. The result, with just one application per server, is that each server tends to operate typically at something between five per cent and 15 per cent capacity. But if we translate this to financial terms, it means that any company with 100 servers has spent 90 per cent too much money in relation to actual requirements, and ends up with a lot of wasted resource in both processing and storage capacity tied up in underused servers.”
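
Sjostedt’s “90 per cent too much money” point is simple arithmetic, and worth sanity-checking against your own estate. Here is a minimal sketch; the server count, utilization level and 65 per cent target are assumptions for illustration, not VMware’s figures.

  # Illustrative only: server count and utilization figures are assumptions.
  def consolidation_estimate(servers, avg_utilization, target_utilization=0.65):
      """Estimate idle capacity and how few hosts could carry the same load."""
      used_capacity = servers * avg_utilization        # in 'whole server' units
      idle_fraction = 1 - avg_utilization
      hosts_needed = max(1, round(used_capacity / target_utilization))
      return idle_fraction, hosts_needed

  idle, hosts = consolidation_estimate(servers=100, avg_utilization=0.10)
  print(f"{idle:.0%} of capacity idle; roughly {hosts} well-used hosts could carry the load")
  # -> 90% of capacity idle; roughly 15 well-used hosts could carry the load

Pick your own headroom in place of the assumed 65 per cent and the consolidation ratio follows directly.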

Server virtualization, a key element in virtualizing the data centre, places multiple virtual servers on each physical server. This reduces the number of physical servers required, saves on floor space, and cuts down on energy consumption. Virtualizing the data centre alters the position from one of ‘no room to expand’, to one with ‘ample space and existing physical servers to cater for any necessary expansion’. That’s the first stage: consolidation of a large number of servers that are underused to a much smaller number that are efficiently used. But the management layer in the virtualization software takes it to the next level.

“Let’s say,” says Sjostedt, “that you start a new marketing campaign and set up a new website – and the campaign is wildly more successful than you expected. Instead of hundreds of hits, you get hundreds of thousands of hits. The physical server allocated to that application now needs more resources. This is the Nirvana of virtualization. The management layer in the virtualization software recognises the situation and automatically and seamlessly moves other applications from that physical server to another physical server.” What you have is no longer 100 underused servers, nor really 20 correctly used servers – you have one big virtual machine that uses all of the resources of all of the servers to the best advantage of all of the applications.
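
VMware’s management layer does this with far more sophistication than can be shown here, but a toy rebalancer conveys the idea. The sketch below is an assumption-laden illustration – the threshold, the host names and the greedy placement rule are mine – not the algorithm the virtualization software actually uses.

  # Toy rebalancer: move the smallest VM off any overloaded host onto the least
  # loaded host that can accept it. Purely illustrative; not VMware's algorithm.
  HIGH_WATER = 0.80   # assumed utilization threshold that triggers a migration

  def rebalance(hosts):
      """hosts maps host name -> {vm name: load as a fraction of one host}."""
      moves = []
      for src, vms in hosts.items():
          while sum(vms.values()) > HIGH_WATER and len(vms) > 1:
              vm = min(vms, key=vms.get)                     # cheapest VM to move
              dst = min((h for h in hosts if h != src),
                        key=lambda h: sum(hosts[h].values()))
              if sum(hosts[dst].values()) + vms[vm] > HIGH_WATER:
                  break                                      # nowhere to put it
              hosts[dst][vm] = vms.pop(vm)
              moves.append((vm, src, dst))
      return moves

  cluster = {"esx1": {"web": 0.70, "mail": 0.25}, "esx2": {"db": 0.30}}
  print(rebalance(cluster))   # -> [('mail', 'esx1', 'esx2')]

In VMware’s case the migrations are performed live, without downtime, which is what makes the ‘one big virtual machine’ view of the data centre possible.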

How?

OK, so we know we’ve got to do something, and virtualization seems to offer the best possible route. But it’s not a process to take lightly, and we need a plan of action. Many companies will have already made a start, probably with IT development systems. The IT Department may already understand the benefits – now it has to sell those benefits to senior management and develop an implementation plan.

Alstom, a VMware customer since 2002, decided in early 2009 to upgrade to VMware vSphere 4

The problem
“We knew that if we were serious about increasing our virtualization rates—which ranged from 30 per cent to 76 per cent depending on the country—we would need to move our most resource-intensive tasks onto VMware and manage them more efficiently,” said Rob Jones, Director of Technology for Northern Europe.

The solution
VMware consultants began the professional services engagement by working on-site with Alstom’s IT team to kick-start the project, sharing product knowledge, connecting Alstom to experts within the VMware organization, and planning Alstom’s upgrade to VMware vSphere 4.

Before going live, Jones was eager to test the performance of Alstom’s upgraded virtual infrastructure. The VMware consultants assisted in setting up a test environment in which Alstom initially upgraded six hosts to VMware vSphere 4. There, Alstom gauged the performance of core business applications such as Lotus Notes Domino servers, Blackberry Enterprise servers, and clients for desktop infrastructure, domain controllers, Oracle databases and Citrix Terminal Services.

The result
Just six weeks after launching the engagement with VMware Professional Services, Alstom had finished upgrading its headquarters to VMware vSphere 4. “By upgrading to VMware vSphere 4, we consolidated 21 instances of VMware vCenter Server down to six,” Jones concludes. “We gained the tools to manage our infrastructure more efficiently, and enabled our employees to provision a new virtual machine in as little as 30 minutes.”

  1. Don’t assume you can do it on your own with in-house expertise (if you could, you would have already done so). So choose a VM supplier that can provide the complete virtualized data centre – and stick with it. You really don’t want to have to change suppliers half way. Choose a supplier that can provide an experienced consultancy team and all the possible virtual requirements: servers, desktops, storage, and cloud at the least.
  2. Select an in-house Change Agent Team. You need champions who can bridge the gap between IT and senior business management; who can get management on-board and keep them there. “The things that tend to slow down implementation tend to be last minute nerves or technical issues that are actually communication problems – and they are more likely to come from the business side than the technical side of the company,” says Martin Snellgrove, EMC Consulting Global Virtualisation Director. So you also need to get your Change Agents trained and certified with your VM supplier so that they can anticipate and counter nerves with knowledgeable solutions.
  3. Develop an implementation plan. Don’t try to do it all at once; do it in project waves, a bit at a time, and don’t be afraid to cherry-pick. Include a detailed ROI statement, both for the individual waves and for the full project (a minimal ROI sketch follows this list). Not only will you find that future costs can be predicted accurately, you will also obtain a more accurate picture of what you currently have: discovering orphaned databases and unused, but still networked, servers is not unusual. Make sure the implementation plan takes account of your existing IT projects – hence the need to virtualize in waves. This will also help in getting management buy-in: they won’t feel that they are irrevocably committing themselves too heavily too soon. Nevertheless, your aim is to be able to complete the full virtualization as rapidly as possible.
  4. Implement the first wave. This becomes your proof of concept. It will confirm to senior management that you were right: it will demonstrate the advantages and confirm the full potential ROI. That’s when you get complete management buy-in.
  5. Be aware that this piecemeal approach does have a potential roadblock. You will have managers of the new virtualized areas and managers of the remaining old physical areas; and the old guard can potentially become entrenched in their old ways. Conversely, you can simultaneously start getting new demands from the business side of the company. “It is essential,” says Snellgrove, “that you must start your change and configuration and overall asset management as you transform the data centre so that you remain in control of how and when you would create a new virtual machine. This will stop a now enthusiastic business side demanding, at speed, a mass of new implementations – resulting in a project running away from you before you’ve got full control of it.” The success of virtualization can become its own enemy.
  6. Go back to the implementation plan, and just do it. Result: you have a fully virtualized data centre that is more efficient, costs less to run, requires less maintenance and now has space and capacity for further expansion at a fraction of the earlier cost.
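
As promised in step 3, here is a minimal ROI sketch. Every figure in it is a placeholder to be replaced with numbers from your own audit; the structure – annual savings set against the one-off project cost over a fixed period – is the point, not the values.

  # Minimal ROI sketch; every figure below is a placeholder, not a benchmark.
  def simple_roi(project_cost, annual_savings, years=3):
      """Return net benefit and ROI over the stated period."""
      benefit = annual_savings * years - project_cost
      return benefit, benefit / project_cost

  annual_savings = (
      40_000     # energy and cooling no longer consumed by retired servers
      + 25_000   # deferred hardware refresh and reclaimed floor space
      + 30_000   # admin time released (provisioning, backup, DR testing)
  )
  net, roi = simple_roi(project_cost=150_000, annual_savings=annual_savings)
  print(f"Net benefit over 3 years: £{net:,} (ROI {roi:.0%})")
  # -> Net benefit over 3 years: £135,000 (ROI 90%)

Run the same calculation per wave as well as for the whole programme, so each stage of the plan can stand on its own figures.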

What you get

The benefits of full virtualization will be apparent almost immediately.

Savings

  • capital costs – fewer servers, less hardware investment
  • running costs – less floor space, lower air-conditioning costs, lower maintenance costs; potential software savings from fewer OS licenses and from bundled ISV licenses
  • manpower costs – what typically took weeks now takes hours: provisioning and testing an OS and new applications; backup; full disaster recovery process testing and validation; and just about everything else that used to tie up IT.

Improved IT services

  • IT staff will be able to achieve results in a fraction of the time they used to take. This means that they will be able to take on more work, test out new suggestions, implement improvements and undertake new projects where before they just didn’t have the time.
  • IT management will be released from mundane maintenance and catch-up. Your most experienced and capable IT people will be able to stop and think, to take on a more strategic role within the business: in short, they will be able to support the business rather than just support their department.
  • private cloud options: one example, already used in some installations, is to provide development machines on licence – they can be licensed for a set period, after which they are automatically reclaimed to the pool. This concentrates the minds of the developers and prevents ‘server sprawl’ (a minimal sketch of such a lease-and-reclaim scheme follows this list).
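
The ‘licensed for a set period’ idea in that last point amounts to attaching an expiry date to every development machine and periodically sweeping expired ones back into the pool. Here is a minimal sketch of that lease-and-reclaim pattern; the class, the field names and the seven-day default are all assumptions for illustration, not a VMware feature.

  # Minimal lease-and-reclaim sketch; names and the 7-day default are assumptions.
  from dataclasses import dataclass, field
  from datetime import datetime, timedelta

  @dataclass
  class Lease:
      vm_name: str
      owner: str
      expires: datetime = field(
          default_factory=lambda: datetime.now() + timedelta(days=7))

  def reclaim_expired(leases, now=None):
      """Split leases into (still active, expired); expired VMs return to the pool."""
      now = now or datetime.now()
      active = [l for l in leases if l.expires > now]
      expired = [l for l in leases if l.expires <= now]
      return active, expired

  leases = [Lease("dev-build-01", "alice"),
            Lease("dev-test-02", "bob", expires=datetime(2010, 3, 1))]
  active, expired = reclaim_expired(leases)
  print([l.vm_name for l in expired])   # -> ['dev-test-02']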

Summary

To summarize, in the words of Burton Group’s Chris Wolf: “VMware is data center proven… Virtualization provides too many benefits to stand by and watch others improve their availability and IT processes, while saving on power and server hardware costs as a result of virtualization implementations. What’s virtualization worth? Ask one of your Windows server admins who is struggling to return a critical server to operation on new hardware. Ask a developer who wants to test a piece of his code but is weighing whether the time to stage a system is worth it. Ask a server team in a data center where there is no more physical room or power to add servers… The question should not be what is the cost of virtualization, but rather what is the cost of not incorporating virtualization within your infrastructure.”

Categories: All, Security Issues