In their 2010 Handbook of Cloud Computing, Furht and Escalante defined the technology as “a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the internet.” Cloud computing is considered the next natural step in the evolution of on-demand information technology services and products (Vouk 2008), and its development has become a significant technology trend that many experts expect will reshape information technology (IT) processes and the IT marketplace (Furht & Escalante 2010). Essentially, cloud computing enables users to employ a variety of devices, including PCs, laptops, smartphones, and PDAs, to access programs, storage, and application development platforms over the internet, via services offered by cloud computing providers.
Although the concept of cloud computing is considered a relatively new offering, the idea of providing computer resources through a global network is rooted in the 1960s. The notion of an “intergalactic computer network” was first introduced in the late 1960s by the American computer scientist J.C.R. Licklider, who was instrumental in the development of ARPANET (Advanced Research Projects Agency Network) in 1969 (Kandukuri 2009). Licklider’s vision was a technology through which everyone on the globe could be interconnected, accessing programs and data at any site, from anywhere (Kandukuri 2009).
Although Licklider’s vision seems to clearly describe the notion of cloud computing, other experts attribute the concept to computer pioneer John McCarthy, who wrote that “computation may someday be organized as a public utility” (Sourya 2011). In the mid-1990s the term ‘grid’ was coined to describe technologies that would allow consumers to obtain computing power on demand (Foster 2008). Grid