If you’re in need of some weekend reading, the RAD Lab has published a new white paper, Above the Clouds: A Berkeley View of Cloud Computing, examining the economics of the cloud model (well worth a read).

Cloud management tools will be high on the list of requirements as companies move to a dynamic infrastructure, but putting out online demos such as this can’t do a startup company any good at all, regardless of the merits of the product.

I thought it about time that I put together my thoughts on cloud computing (if only so I can remember them); what follows is a subset of those thoughts, based on numerous articles I’ve read on the interweb. Whilst many people pontificate on what “cloud computing” actually is and where it applies to particular businesses, i.e. the *aaS’s, the simplest definition is that “the cloud” is simply a bunch of internal or external resources (i.e. hardware or software) that can be utilised on demand to provide services to internal or external users/applications.

Obviously how you use the cloud depends on the type of business you run. The benefits of the cloud for a start-up company are fairly well understood, and this is the most prevalent use in place today. Most start-ups have little need to integrate software running within the cloud with software running outside of it, which makes the cloud ideal for green-field applications. The fact that there are a number of pre-built OS/application stacks, including app servers and databases in almost every possible configuration, means that it’s fairly trivial to deploy this type of application onto the cloud.

However, for the enterprise CIO things are somewhat more problematic. Enterprises have a number of requirements that distinguish them from cloud computing’s early adopters. Enterprises typically have a significant existing base of infrastructure and applications, the majority of which are carried as capital assets. It’s outside the scope of this post to get into the capital expenditure vs. operating expenditure argument, but it’s likely that enterprises will need to utilise both in order to “protect their investments”. This means they’ll need to find ways to integrate and leverage the cloud in order to provide economical ways of dynamically adding capacity to existing environments.

There are, however, a number of questions that an organisation should ask itself before implementing “enterprise cloud computing”.

One: Why – what are the benefits? One of the big wins for an enterprise using cloud computing is agility, i.e. time to market. There may of course be other benefits, such as reduced costs, but the primary benefit in my view is agility. Enterprises typically have long lead times for provisioning hardware and environments; in many cases these can take weeks or even months to be deployed. In environments where developers need extra capacity (for scalability work, testing, evaluating new technology and so on), development and testing are clearly hindered. Cloud computing brings with it the possibility of provisioning resources on a time basis, and doesn’t tie developers to capital assets. The ability to switch virtual servers on and then off when they’re no longer needed (thereby accruing no further costs) gives enterprises a significant time advantage in exploiting resources over those who, for whatever reason, don’t have it.
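That switch-it-on, switch-it-off model is easy to picture in code. Below is a minimal sketch using the boto library against EC2; the AMI ID and key pair name are placeholders, and the “do your testing here” step is an assumption for illustration rather than a recipe.

```python
# Minimal sketch of the "switch it on, use it, switch it off" model using boto.
# The AMI ID and key pair name below are placeholders - substitute your own.
import time
import boto

conn = boto.connect_ec2()  # picks up AWS credentials from the environment/boto config

# Switch a server on only when the extra capacity is needed...
reservation = conn.run_instances('ami-xxxxxxxx',
                                 instance_type='m1.large',
                                 key_name='my-keypair')
instance = reservation.instances[0]

# Wait until the instance is actually up.
while instance.state != 'running':
    time.sleep(10)
    instance.update()

print('Instance %s running at %s' % (instance.id, instance.public_dns_name))

# ... do the testing / batch run / capacity experiment here ...

# ... then switch it off again so it stops accruing charges.
conn.terminate_instances(instance_ids=[instance.id])
```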

Two: Security. This is often the biggest obstacle within an enterprise, though talk to developers in a big enterprise about “security” and they’re usually talking about authentication and authorisation based around Active Directory or LDAP to control access to resources. Considerations like encryption – i.e. the stuff that IT security people and hackers tend to think about when you say the word “security” to them – are usually secondary. Integration with existing access control mechanisms is key, as will be things like data encryption, but that will only be one part of the overall solution.
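To illustrate what developers usually mean here, below is a rough sketch of authenticating a user against an existing corporate directory using the python-ldap library. The server address and DN layout are invented for the example; the point is that anything you deploy into the cloud still needs a route back to (or a federation of) this same directory, with encryption layered on top.

```python
# Rough sketch: authenticating a user against an existing corporate
# LDAP/Active Directory service using python-ldap. The server URL and
# DN structure below are illustrative only.
import ldap

def authenticate(username, password):
    """Return True if the directory accepts a simple bind for this user."""
    server = ldap.initialize('ldaps://ldap.example.internal')
    user_dn = 'uid=%s,ou=people,dc=example,dc=internal' % username
    try:
        server.simple_bind_s(user_dn, password)
        return True
    except ldap.INVALID_CREDENTIALS:
        return False
    finally:
        server.unbind_s()
```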

Three: Integration. Do you need to integrate cloud-based resources with existing resources? Do you need to hook up an internal data centre to these cloud-based resources and be able to use them as if they were just another resource in the internal data centre? That has lots of implications, the most significant one being that all of the things I’ve already done in my data centre to meet the security needs alluded to in point two need to apply to the new cloud resources as well. Are the means available to transparently integrate internal resources with resources in the cloud? The lower the barrier to adoption, the better.

Four: Costs. Not all applications will show a cost benefit from running within the cloud, for reasons detailed below. It’s easy to listen to evangelists espousing the virtues of cloud computing, and a number of them seem to think that costs will be a major driver. This may well be true, and the applications most likely to see a cost benefit are those that need more “grunt” at specific times of day, such as batch processes. Grid computing is an ideal candidate as (in my experience) there is usually a large requirement for compute power across multiple applications running at the same times of the day. It’s also interesting to note that companies such as GigaSpaces now offer a pay-per-use licence fee as below (note: this is in addition to the Amazon charges).

  1. $0.20 for GigaSpaces running on Amazon Small Instances
  2. $0.80 for GigaSpaces running on Amazon Large Instances
  3. $1.60 for GigaSpaces running on Amazon Extra-Large Instances
  4. $0.20 for GigaSpaces running on Amazon High-CPU Medium Instances
  5. $1.60 for GigaSpaces running on Amazon High-CPU Extra Large Instances

*Pricing is per GigaSpaces Amazon Machine Image (AMI) instance-hour consumed for each instance type. Partial instance-hours are billed as full hours.

One EC2 compute unit is roughly equivalent to a 1.0–1.2GHz CPU, and an extra-large instance provides eight compute units (four virtual cores with two compute units each).

The charge for each extra-large GigaSpaces instance is $1.60 per hour, which works out as:

$1.60 × 24 × 7 = $268.80 for one week

or $268.80 × 52 = $13,977.60 per year per instance – for four extra-large instances this is $13,977.60 × 4 = $55,910.40.

The in-house subscription cost has support built in, whereas there is a $5,000 charge for the equivalent Gold support in the cloud. This makes deploying to the cloud on subscription and deploying internally on subscription cost approximately the same.

Although it may seem that utilising the cloud is not necessarily cheaper, the price advantage comes from the utility compute, or compute-on-demand, model. You only pay for what you use, when you use it, so in reality using the cloud should work out cheaper as utilisation and usage change across different times of the year. This means less hardware and fewer software licences are used, which in effect gives a twofold saving. Be aware that there are also bandwidth charges which need to be factored into the equation, depending on your needs.
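To make that concrete, here’s the arithmetic above as a small Python sketch, with a hypothetical part-time usage pattern added to show where the on-demand saving comes from. The $1.60 rate and $5,000 support charge are taken from the figures above; the 12-hours-a-day, weekdays-only pattern is purely an assumed example.

```python
# Back-of-envelope cost comparison for GigaSpaces on EC2 extra-large instances.
# The $1.60/hour rate and $5,000 Gold support figure come from the post above;
# the 12-hours-a-day, weekdays-only usage pattern is an assumed example.
RATE_PER_HOUR = 1.60      # GigaSpaces on an EC2 extra-large instance
INSTANCES = 4
GOLD_SUPPORT = 5000.0     # annual support charge for the cloud deployment

def annual_cost(hours_per_week):
    return RATE_PER_HOUR * INSTANCES * hours_per_week * 52 + GOLD_SUPPORT

always_on = annual_cost(24 * 7)      # running 24/7 all year
office_hours = annual_cost(12 * 5)   # e.g. 12 hours a day, weekdays only

print('24/7:          $%.2f per year' % always_on)
print('12h x 5 days:  $%.2f per year' % office_hours)
```

Run 24/7 this reproduces the $55,910.40 figure above (plus support); run the same instances only when they’re actually needed and the bill drops by more than half, which is the essence of the utility model.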

So the summary of this post is: do your homework before you dive into cloud computing. You really need to work out what benefits you’re trying to achieve, and if it’s cost savings only then you need to be especially careful. Cloud vendors charge for CPU, storage, upload, download and bandwidth, and if you’re utilising these 24/7 you could well end up paying more than you do presently.
I personally see the main benefit as agility and cost savings as secondary; of course, for some enterprises, moving from capital expenditure to operating expenditure could have enough benefits in itself to make this worthwhile.

I was invited to Kidda’s Going Up album launch party yesterday and, although I couldn’t make it for family reasons, by all accounts it was very well received, with the album getting rave reviews.

You can check out the album on his MySpace page or visit his personal webpage.

Get out (or online) and spend your money – you can buy the album at any good high street or online record shop.

You’ve probably seen more of Kidda’s work than you realise; he’s worked with a number of top bands (check out YouTube for videos) as well as in television (he recently created the music for the new Bacardi advert).

hat tip – well done Steve and good luck with the album, hope you get the success you deserve

Netflix have offered a million-dollar prize to anyone who can improve their prediction algorithm by 10%.

The leaderboard is already showing a 5.58% improvement; this is likely to be one of the best million dollars Netflix has ever spent.

Inventor and singularity evangelist Ray Kurzweil talks at TED, illustrating the increasingly exponential evolution of technology and predicting a sharp rise in computing capability, robotics and life expectancy within the next 15 years. He outlines the “shocking ways” we’ll use technology to augment our own capabilities, forever blurring the lines between human and machine.

Just got home from watching Martin Scorsese’s The Departed and thoroughly recommend it. It’s a real return to form, with excellent performances throughout. I’ve never really been a DiCaprio fan and thought Nicholson was past his best, but they were both excellent in this, as were the supporting cast.

I gave up playing computer games once it became obvious that I couldn’t get anywhere without reading the instruction manual and then dedicating hour upon hour to learning strategy and shortcuts (oh, and having three kids didn’t help). However, with the upcoming Wii from Nintendo that just may change; an article in this week’s Economist states:
“The main problem with modern games, he says, is that they require players to invest enormous amounts of time. As lifestyles have become busier, leaving less time for gaming, the industry has moved towards epic games which take dozens of hours to complete. This is leading some occasional gamers to stop playing and deterring non-gamers from giving it a try, says Mr Iwata. There are other factors too: novices are put off by the need to master complex controllers, festooned with buttons, triggers and joysticks. And not everyone wants to escape into a fantasy gaming world. “That attracts avid gamers,” he says, but can make it “difficult for people to become interested in games”.

“Nintendo set out to reach beyond existing gamers and expand the market. This would involve simpler games that could be played for a few minutes at a time and would appeal to non-gamers or casual gamers (who play simple games on the web but would not dream of buying a console). They would be based on new, easy-to-use controls. And they would rely on real-life rather than escapist scenarios. This was not an entirely new approach: dancing games that use cameras or dance mats as controllers have proved popular in recent years. But Nintendo began to design entire games consoles around such ideas.”

Looks like I’ve got a present on my xmas list (for the kids of course 🙂 )

Yes I know iTV is due out early next year but I thought I’d get the jump on this by purchasing the Elgato EyeHome device.

I managed to get one with a wireless bridge for 80 GBP, probably due to Elgato dropping the EyeHome as soon as Apple announced iTV (there’s been some talk on the forums that Apple and Elgato have partnered to produce the iTV).

Set-up of the EyeHome was extremely simple, and within 5 minutes I was streaming movies, iTunes and iPhoto from my iMac to my TV. It’s also fairly easy to set up streaming from my EyeTV DTT by getting VLC to do the streaming to a web browser.

Some EyeHome drawbacks I can see are:

• general UI weaknesses (e.g. sorting limitations, awkward menu navigation)
• no menu/navigation with DVD content (not really EyeHome’s problem)
• no H.264 or DRM support (hurray – I don’t buy DRM’d media)
• can’t remotely control/program EyeTV
• clumsy content navigation (e.g. fast forward/rewind)
• no stop/resume memory

The combination of those last two items makes it frustrating to return to previous locations after stopping playback for any reason.

Hopefully iTV will support the iTunes equivalent of “remember playback position”. And being able to interact with EyeTV would be ideal, probably dependent on Apple/Elgato cooperation.

EyeHome has no trouble handling any supported media streamed over my not-fully-optimised 802.11g WLAN.

I don’t want a multi-purpose computer or digital media mass storage devices in the living room, so the EyeHome is serving me well enough even with its flaws, especially as I don’t have an HD television.
Right now it would be foolish to say for sure that iTV won’t be a worthwhile upgrade, but I doubt I’ll be in any hurry.

Where I work requires that I log on to Windows to access applications that just won't run under WINE. This was a complete pain, as I found myself constantly rebooting between Ubuntu and Windows. However, I've recently been using the excellent coLinux:

"Cooperative Linux is the first working free and open source method for optimally running Linux on Microsoft Windows natively. More generally, Cooperative Linux (short-named coLinux) is a port of the Linux kernel that allows it to run cooperatively alongside another operating system on a single machine"

As I already had a Linux partition, I decided against using the Gentoo or Debian images they provide. Here are the steps I had to go through to get this up and running.

  1. Downloaded the Cooperative Linux installation executable v0.6.2 (which uses the 2.6 version of the kernel)
  2. Downloaded and installed WinPCap
  3. Installed into C:\CoLinux
  4. Downloaded dmdiag from the Microsoft Windows 2000 Resource Kit to find out the Linux partition's device name. The tool can be downloaded for free from Microsoft.
  5. Ran dmdiag and noted the device entry for the Linux partition from output like this:

    \Device\Harddisk0\DP(2)0x843fbb800-0x577374c00+2 (Device)
    \Device\Harddisk0\DP(3)0xdbb338200-0x3dc57e00+3 (Device)
    \Device\Harddisk0\DR0 (Device)
    \Device\Harddisk0\Partition0 (SymbolicLink) -> \Device\Harddisk0\DR0
    \Device\Harddisk0\Partition1 (SymbolicLink) -> \Device\HarddiskVolume1
    \Device\Harddisk0\Partition2 (SymbolicLink) -> \Device\HarddiskVolume2
    \Device\Harddisk0\Partition3 (SymbolicLink) -> \Device\HarddiskVolume3

  6. Changed the config file to include the Linux partition (you can also add your CD drive at this point).
  7. I'm using DHCP, so I simply set up a network bridge between my existing network adaptor and the TAP virtual adaptor installed by coLinux.
  8. Started up coLinux from the Windows command line; it basically starts a coLinux console where you can find out your IP address.
  9. Started up a PuTTY session, exported HTTP_proxy (as I'm behind a firewall) and ran apt-get update.

Hey presto – worked a treat. Recommended.