Wednesday, August 31, 2011

Cloud Computing : Languages Supported by Google App Engine

Languages Supported by Google App Engine

Author: Kaushik

As we learn the basics of writing an application on Google App Engine, a critical aspect to understand is its language support. Being a Cloud environment, there are limitations on what the App Engine supports. Note the difference between a hosting provider like Rackspace, which can support any platform, language or environment, and a Cloud platform like the App Engine, which has limitations in terms of what it supports.

[caption id="attachment_265" align="aligncenter" width="300" caption="Cloud Computing Australia : Cloud Computing - Languages Supported by Google App"]Cloud Computing Australia : Cloud Computing - Languages Supported by Google App [/caption]

Google currently supports the following languages:

Java:

  • App Engine runs Java web applications in a Java 6 environment

  • App Engine supports frameworks like Struts 2 and Spring MVC

  • The App Engine will invoke the Servlet class of the application to handle requests

  • The Java application executes in a Sandbox environment

  • The Application can execute code, store data in the App Engine Data Store, service web requests and prepare responses

  • However, the Application cannot write to the filesystem, open a socket or access another host directly, spawn a sub-process or thread, or make system calls

  • The supported set of JRE classes is listed in the JRE Class Whitelist

  • Download the App Engine Java SDK, which includes a development server and all the services of the App Engine, including the Data Store.

  • If you are using Eclipse, download the App Engine plugin for Eclipse


Python:

  • The App Engine runtime environment uses Python version 2.5.2

  • The Python environment includes support for the standard Python libraries

  • Third party frameworks like Django are supported

  • Since the App Engine provides a sandbox environment, the following are not supported: opening a socket, writing to the file system and making system calls

  • Only pure Python is supported; extensions written in C are not

  • APIs are available for access to the Datastore, Google Accounts and Email services

  • The App Engine Python SDK includes a server application which provides a complete App Engine environment on your computer.

  • Download the App Engine SDK for Windows
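The Python frameworks the runtime supports (webapp, Django) are built on the standard WSGI interface. A minimal, self-contained sketch of a WSGI handler, runnable outside App Engine as plain Python (the greeting text and names are illustrative):

```python
def application(environ, start_response):
    # A WSGI app receives the request environment and a callback for
    # setting the status line and response headers, then returns the body.
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'Hello from the sandbox!']
```

Frameworks such as Django wrap this interface with routing and request/response objects.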

Other JVM Languages (Groovy, Scala, JRuby, Clojure):
All aspects mentioned for Java apply to the JVM-based languages as well.

More details available in the article on the Technology Trends Blog - Languages Supported by Google App Engine

Article Source: articles/languages-supported-by-google-app-engine-2797427.html

About the Author

Kaushik Raghupathi is a senior IT Professional and Project Manager working out of India. Over the years he has worked on numerous IT projects with large sized teams. He is personally very fascinated around Learning methodologies in general and specifically around Community Based Learning. He is currently experimenting the concepts by working with students in this area.

Technology Trends Blog

Cloud Computing : 'Hello World' Google App Engine Application in 5 mins

Your First Hello World Google App Engine Application in 5 mins


Moving on with our series on getting started with Google App Engine, in this article we will talk about getting your first Hello World application running and deployed on the App Engine. We will use Python for this example, and if you are already well versed in Python you can skip the basics and move directly to the deployment steps for App Engine.

[caption id="attachment_258" align="aligncenter" width="300" caption="Cloud Computing Australia : Cloud Computing - Hello World Google App Engine Application in 5 mins"]Cloud Computing Australia : Cloud Computing - Hello World Google App Engine Application in 5 mins[/caption]

Step 1: Download and install the Google App Engine SDK for Python
Here is the link for the Windows version.

Step 2: Create a simple helloworld python program

  • Create a directory named helloworld

  • Create a file named helloworld.py in this directory which contains the code below

print 'Content-Type: text/plain'
print ''
print 'Hello, world!'

  • Create a configuration file called app.yaml in the directory with the following content

application: mytestapphelloworld
version: 1
runtime: python
api_version: 1

handlers:
- url: /.*
  script: helloworld.py
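The helloworld script above follows the CGI convention: header lines, then a blank separator line, then the body, all written to standard output. A runnable sketch of that response structure in plain Python (the function name is illustrative):

```python
def cgi_response(body, content_type='text/plain'):
    # A CGI script emits headers, a blank separator line, then the body.
    return 'Content-Type: %s\n\n%s' % (content_type, body)

print(cgi_response('Hello, world!'))
```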

Step 3: Run the Hello World Application using the Google App Engine Launcher

  • Click on the Programs Menu and start the Google App Engine Launcher

  • Click on File Menu -> Add Existing Application

  • Add the HelloWorld directory as the application

  • Click Run to start the application and click on Browse to open the web page on the browser

  • You can also visit http://localhost:8080/ to check the live running application

Step 4: Deploy your app on App Engine

  • Go to the App Engine URL

  • Login using the App Engine account that you created in our earlier exercise

  • Create a new application by clicking on Create Application button

  • Use mytestapphelloworld as the application ID. If this is not available use a unique ID that you can remember

  • Update the app.yaml file with your application ID from above

  • Change the line below, replacing mytestapphelloworld with your application ID.

application: mytestapphelloworld

  • Run the following command: appcfg.py update helloworld/

  • Enter your Google ID/password at the prompt.

  • Your deployed application is available at
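If you script your deployments, the application-ID edit from Step 4 can be automated. A minimal sketch in plain Python (purely illustrative; this helper is not part of the SDK):

```python
def set_application_id(yaml_text, new_id):
    # Rewrite the 'application:' line of an app.yaml string,
    # leaving every other line untouched.
    out = []
    for line in yaml_text.splitlines():
        if line.startswith('application:'):
            line = 'application: ' + new_id
        out.append(line)
    return '\n'.join(out)
```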
Learn more at Technology Trends Blog



Cloud Computing : Getting a Google App Engine Account

Getting a Google App Engine Account


Now that you have a basic understanding of what Google App Engine is and where and how you can use it, let's get started with how to set up an account for Google App Engine.

[caption id="attachment_253" align="aligncenter" width="300" caption="Cloud Computing : Getting a Google App Engine Account"]Cloud Computing : Getting a Google App Engine Account[/caption]

It's actually pretty simple. As with all Google products, if you have a Gmail ID you already have access to Google App Engine. You can use your Gmail ID and password to access the App Engine account. Makes it really simple, doesn't it? Many folks underestimate the impact of the universal access the Gmail ID provides to so many services and products. If you are like me, you hate to register at every different service provider and site out there. This alone is one reason you should try out App Engine: you already have access to it.

Step 1. Go to the URL for App Engine:

Step 2. If you already have a Google Account and are signed in, you will see something like the screen above with your user ID already populated. All you need to do is type in your password and login.

Step 3. Login to the dashboard.
After you login you should see something like the screen below. You may not see any applications, since you have none deployed yet.

That's it. You are all set to get started with building your first Google App Engine App.

Get more details at our blog. TheTechTrendz Blog is a technology blog focused on tracking cutting-edge technology trends and the overall direction of technology. Follow the blog for interesting articles on technology trends.



Tuesday, August 30, 2011

Cloud Computing: What is Google App Engine ?

Cloud Computing: What is Google App Engine ?


Google App Engine is an online platform to host your web application. It is what is popularly called a Cloud Computing platform to host and enable your web applications on the cloud. Essentially having a cloud platform for hosting your applications abstracts you from worrying about the various platform setup and environment related issues like server configuration, network configuration, bandwidth requirements among others.

[caption id="attachment_247" align="aligncenter" width="300" caption="Cloud Computing Australia : Cloud Computing - What is Google App Engine ?"]Cloud Computing Australia : Cloud Computing - What is Google App Engine ?[/caption]

 What languages are supported ?

If there is one limitation of the platform, it is the language support. The recent addition of Java support has been a huge plus and should significantly improve adoption of the platform.

Currently the App Engine supports Python and Java (JVM).

What this means is that App Engine applications can also be written in Java or any JVM-compatible language (e.g. JRuby, Groovy, Scala, etc.). The Java runtime is Java 6.

App Engine's Python runtime supports Python 2.5

What Frameworks are supported ?

Using frameworks to speed up the development cycle has become such a key necessity that it is critical to check framework support in any cloud environment you want to work with. Google App Engine supports Struts 2 and Spring MVC for Java, and most Python web frameworks, including Django version 1.0.2.

How much does it cost ?

This is the greatest thing about the App Engine. It's free if you use less than 500 MB of space and have fewer than 5 million page views a month. This is the reason we rank it as the #1 platform for building cloud applications, especially for student projects or small POC applications.

You can build around 10 applications for each Google account that you have, and nobody stops you from creating any number of Google logins, so you can always get as many free apps as you want.

For Apps where you expect more usage and need more space you have to pay per unit of the resource that you are consuming. So really it is a pay based on usage model which is still highly cost effective. I do not see a need for this unless you are writing commercial apps and for most cases the free limits are good enough.
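To make the free-tier arithmetic concrete, a quota check could be sketched like this (limit figures are the 2011-era numbers quoted above; the function name is illustrative):

```python
FREE_STORAGE_MB = 500
FREE_PAGE_VIEWS_PER_MONTH = 5_000_000

def within_free_quota(storage_mb, monthly_page_views):
    # Free as long as the app stays under both limits described above;
    # beyond them you pay per unit of each resource consumed.
    return (storage_mb < FREE_STORAGE_MB
            and monthly_page_views < FREE_PAGE_VIEWS_PER_MONTH)
```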

How to get started ?

All you need is a Gmail or a Google account. That's it. If you have one, you are ready to get started with building your first cloud app. If not, you can always create an account; it just takes a couple of minutes.

More Details at our blog



Cloud Computing : Save cost on development and test environments

Save significant cost on development and test environments using Cloud Computing


Cloud Computing is set to revolutionise the business environment, particularly the way we use information technology products and services.

Could it be the next stage in outsourcing? Outsourcing certainly gripped the information technology arena in the last ten years, and saw companies essentially outsource all their software development/IT support requirements to offshore and even onshore companies to allegedly reduce cost.

Cloud Computing Australia : Cloud Computing - Save cost on development and test
Whether these companies derived the so called cost benefits and still maintained quality is highly debatable.

Along comes Cloud Computing, which essentially postulates that you can pay for IT services as a utility, on a pay-as-you-use basis, hosted in a cloud. It is essentially another form of outsourcing where this time you pay a provider either to use their software applications via the Internet (Google Apps is an example), known as Software as a Service (SaaS); or to use their Platform as a Service (PaaS) to build and host your web-based application (Google App Engine is an example); or to use their Infrastructure as a Service (IaaS), which essentially allows you to pay a provider to host an image of your application, negating the need for you to own your own servers (Amazon EC2 is an example of this kind of service).

Before signing up to any of these services, it is important that a company clearly understands the cost benefits, and identifies and implements an entry and exit strategy capturing how to move to another vendor if it becomes necessary, or to bring the service back in house.

Hosting sensitive data off site must also be considered very carefully, ensuring that no regulatory requirements are breached and that the customer data act is adhered to in any such implementation. Non-mission-critical systems are probably the best candidates to consider first for implementation in the cloud.

Test and Development Environments

Of particular interest must be IaaS, because companies spend a significant part of their yearly IT budget (sometimes as much as 30 to 40%) on purchasing new hardware to host new applications, in response to new business requirements.

The cost of maintaining such hardware can also be quite expensive from software licensing to support costs etc.

Test and development environments, which are not necessarily mission critical (since they are not live environments), can certainly be procured on a pay-as-you-use basis with little or minimal risk, which could reduce IT spending by as much as a quarter or even half.

As proven with virtualisation, building test and development environments becomes an easily repeatable process and likened to a factory floor where environments are rolled out on demand.

The same principle applies to having your development and test environments in the Cloud using the IaaS model, and the benefits are as follows:

  • Test and development environments tend to be required for fixed periods of time through the lifecycle of a Project and are left idle at other times. You only pay to use these environments when you need to, which can significantly reduce IT spending.

  • The effort and time required to set up test and development environments is significantly reduced, as you only need to access your server via the cloud and create your own environments. There is no need to wait for three weeks for the server to be delivered, another two weeks to be set up in the data centre etc.

  • The cost of support is eliminated: from setting up the server in the data centre, configuring it, installing the operating system, installing other security and monitoring software, to setting it up on your LAN, etc.

  • The effort required to decommission and re-use these test environments is also minimal.

  • Since live data is normally not used for testing or development (de-sensitised data can be used instead), there are no regulatory requirements to adhere to.

  • Most IaaS cloud computing vendors offer 99%+ up time, back-up facilities, monitoring, etc., which ensures that once your environments are created, you only need to focus on carrying out your development and testing.

  • A company can embark on several IT Projects at any one time, because there is no longer the constraint of available hardware (or rather the need to provision more hardware) to host test and development environments.

Efficiently provisioning, building, managing and scheduling your test/development environments certainly contributes significantly to the success or failure of IT Software Projects.

Paying for your infrastructure as a service at least, eliminates the effort and cost of provisioning, installing and supporting the hardware to host your test and development environments.

Author: V.C.Waturuocha MSC – Freelance IT Environments/Infrastructure Project Manager.


About the Author

  • Freelance IT Environments Manager & Technical Project Manager

  • Previously Global FX technology manager at Citibank International

  • Worked in several sectors to include energy, utilities, Investment banking, railway transport, software development, healthcare, manufacturing, research, insurance and the gaming sectors.

  • Currently focussed on implementing Cloud Computing solutions for a UK based retail bank to reduce their spending on test and development environments.

Private Cloud Computing: A Game Changer for Disaster Recovery

Private Cloud Computing: A Game Changer for Disaster Recovery

By Mike T Klein

Private cloud computing offers a number of significant advantages - including lower costs, faster server deployments, and higher levels of resiliency. What is often overlooked is how the private cloud can dramatically change the game for IT disaster recovery in terms of significantly lower costs, faster recovery times, and enhanced testability.


[caption id="attachment_240" align="aligncenter" width="428" caption="Cloud Computing Australia : Private Cloud Computing - A Game Changer for Disaster"]Cloud Computing Australia : Private Cloud Computing - A Game Changer for Disaster [/caption]


Before we talk about the private cloud, let's explore the challenges of IT disaster recovery for traditional server systems.

Most legacy IT systems are comprised of a heterogeneous set of hardware platforms - added to the system over time - with different processors, memory, drives, BIOS, and I/O systems. In a production environment, these heterogeneous systems work as designed, and the applications are loaded onto the servers and maintained and patched over time.

Offsite backups of these heterogeneous systems can be performed and safely stored at an offsite location. There are really 2 options for backing up and restoring the systems:

1) Back up the data only - where the files are backed up from the local server hard drives to the offsite location either through tapes, online or between data centers over a dedicated fiber connection. The goal is to assure that all of the data is captured and recoverable. To recover the server in the case of a disaster, the operating system needs to be reloaded and patched to the same level as the production server, the applications need to be reloaded, re-patched, and configured, and then the backed up data can be restored to the server. Reloading the operating system and applications can be a time consuming process, and assuring that the system and applications are patched to the same levels as the production server can be subject to human memory and error - both of which can lengthen the recovery time. (This is why I hate upgrading my laptop hardware. I have to invest days to get a new laptop to match the configuration of my old laptop).

2) Bare Metal Restore - a much faster way to recover the entire system. BMR creates an entire snapshot of the operating system, applications, system registry and data files, and restores the entire system on similar hardware exactly as it was configured in the production system. The gotcha is the "similar hardware" requirement. This often requires the same CPU version, BIOS, and I/O configuration to assure the recovery will be operational. In a heterogeneous server environment, duplicate servers need to be on hand to execute a bare metal restoration for disaster recovery. As a result, IT disaster recovery for heterogeneous server systems either sacrifices recovery time or requires the hardware investment to be fully duplicated for a bare metal restoration to be successful.

Enter disaster recovery for private cloud computing. First, with all of the discussion about "cloud computing", let me define what I mean by private cloud computing. Private Cloud computing is a virtualized server environment that is:

Designed for rapid server deployment - as with both public and private clouds, one of the key advantages of cloud computing is that servers can be turned up & spun down at the drop of a hat.

Dedicated - the hardware, data storage and network are dedicated to a single client or company and not shared between different users.

Secure - Because the network is dedicated to a single client, it is connected only to that client's dedicated servers and storage.

Compliant - with the dedicated secure environment, PCI, HIPAA, and SOX compliance is easily achieved.

As opposed to public cloud computing paradigms, which are generally deployed as web servers or development systems, private cloud computing systems are preferred by mid and large size enterprises because they meet the security and compliance requirements of these larger organizations and their customers.

When production applications are loaded and running on a private cloud, they enjoy a couple of key attributes which dramatically redefine the approach to disaster recovery:

1) The servers are virtualized, thereby abstracting the operating system and applications from the hardware.

2) Typically (but not required) the cloud runs on a common set of hardware hosts - and the private cloud footprint can be expanded by simply adding an additional host.

3) Many larger private cloud implementations are running with a dedicated SAN and dedicated cloud controller. The virtualization in the private cloud provides the benefits of bare metal restoration without being tied to particular hardware. The virtual server can be backed up as a "snapshot" including the operating system, applications, system registry and data - and restored on another hardware host very quickly.

This opens up 4 options for disaster recovery, depending on the recovery time objective goal.

1) Offsite Backup - The simplest and fastest way to assure that the data is safe and offsite is to back up the servers to a second data center that is geographically distanced from the production site. If a disaster occurs, new hardware will need to be located to run the system on, which can extend the recovery time depending on the hardware availability at the time of disaster.

2) Dedicated Warm Site Disaster Recovery - This involves placing hardware servers at the offsite data center. If a disaster occurs, the backed up virtual servers can be quickly restored to the host platforms. One advantage to note here is that the hardware does not need to match the production hardware. The disaster recovery site can use a scaled down set of hardware to host a select number of virtual servers or run at a slower throughput than the production environment.

3) Shared Warm Site Disaster Recovery - In this case, the private cloud provider delivers the disaster recovery hardware at a separate data center and "shares" the hardware among a number of clients on a "first declared, first served" basis. Because most disaster recovery hardware sits idle and clients typically don't experience a production disaster at the same time, the warm site servers can be offered at a fraction of the cost of a dedicated solution by sharing the platforms across customers.

4) Hot Site SAN-SAN Replication - Although more expensive than warm site disaster recovery, SAN-SAN replication between clouds at the production and disaster recovery sites provides the fastest recovery and lowest data latency between systems. Depending on the recovery objectives, the secondary SAN can be more cost effective in terms of the amount and type of storage, and the number and size of physical hardware servers can also be scaled back to accommodate a lower performance solution in case of a disaster.

Conclusion: An often overlooked benefit of private cloud computing is how it changes the IT disaster recovery game. Once applications are in production in a private cloud, disaster recovery across data centers can be done at a fraction of the cost compared to traditional heterogeneous systems, and deliver much faster recovery times.

Mike Klein is the President and Chief Operating Officer at Online Tech, a leading managed data center operator in the Midwest. Online Tech offers a full range of colocation, managed server and private cloud hosting in their SAS-70 secure and reliable multi-tenant data centers across the Midwest. Visit for more information.


Cloud Computing : What is a Virtual Private Cloud?

What is a Virtual Private Cloud?

Author: OnlineTech

For many enterprises, the private cloud is the first choice when it comes to cloud computing. Private cloud computing offers a hardware and network environment that is dedicated solely to the enterprise.  Whether delivered through a hosted private cloud or internal private cloud, there is an assurance that the hardware, data storage and network security are dedicated to a single organization and not shared with other users.  When it comes to highly sensitive corporate data, personal customer lists, PCI or HIPAA certified environments, the private cloud is preferred.


[caption id="attachment_234" align="aligncenter" width="300" caption="Cloud Computing Australia : Cloud Computing - What is a Virtual Private Cloud?"]Cloud Computing Australia : Cloud Computing - What is a Virtual Private Cloud?[/caption]


At the same time, businesses are recognizing the benefit of the public cloud for some of their applications.  Many small and mid-size businesses need the quality of an enterprise private cloud in terms of high availability, redundancy and security with the affordability of a shared cloud.  Enterprises need the ability to move non-critical, non-sensitive computing and development work out of their private cloud into a shared cloud but need to achieve a similar level of availability and security.

There are a number of public cloud options available. Some, like Amazon's EC2, are basic cloud servers designed to provide computing services for non-critical applications like web or development servers. Others are designed to deliver the high availability and security that enterprises require in their private clouds. The latter we call a Virtual Private Cloud.
Wikipedia defines a Virtual Private Cloud as 'a private cloud existing in a shared or public cloud'.

To expand on that, we consider a Virtual Private Cloud to be one or more cloud servers in a shared environment that deliver the same level of data protection and network security as a dedicated private cloud. The virtual private cloud can sit on the same VLAN with a dedicated private cloud and managed servers to provide a completely integrated system.

In addition, by connecting a virtual private cloud into a hosted private cloud on a dedicated VLAN behind a dedicated firewall, the virtual private cloud provides a tremendous level of flexibility and expandability beyond the private cloud's dedicated hardware, without compromising the data integrity or network security of the private cloud.


About the Author

Online Tech owns and manages premier Michigan data centers. Offering managed dedicated servers, private cloud hosting, IT disaster recovery, and Michigan colocation, Online Tech helps their clients dramatically reduce IT data center costs, operational risks and downtime. Industry leaders trust Online Tech's HIPAA Compliant, SAS 70 certified data centers to ensure their servers are always on, always online, and always safe.

Monday, August 29, 2011

Cloud Computing : Transition to Private Cloud Computing

5 Reasons Why Your Company Should Transition to Private Cloud Computing


One of the biggest IT buzzwords of the past few years, 'cloud computing' has brought incredible promise to the world of information technology. Interest in cloud computing has led to widespread curiosity and awareness of public cloud services. Pooled computing resources, divided and allocated to different users, have been proven in the development and testing realm of IT to have tremendous benefits. Those benefits, however, have been overshadowed by concerns about a lack of security and uptime assurances among enterprise IT executives.


[caption id="attachment_226" align="aligncenter" width="300" caption="Cloud Computing Australia : Cloud Computing - Transition to Private Cloud Computing"]Cloud Computing Australia : Cloud Computing - Transition to Private Cloud Computing[/caption]


The private cloud aims to lessen, if not eliminate, those concerns by dedicating exclusive hardware to each user. Instead of your data being stored off 'in the cloud' somewhere, one can point to a group of servers in a data center and say, 'that is your data, and only your data, on those servers,' alleviating security concerns. It is a 'best of both worlds' solution that can bring important benefits like improved uptime and reliability at a lower cost than on traditional IT infrastructure.

In fact, according to Yankee Group's recent survey on cloud computing, private cloud computing is preferred 2:1 over fully managed public cloud solutions. 67% of respondents preferred the private cloud, whereas only 28% preferred a fully managed public cloud, 21% preferred an unmanaged public cloud, and 8% were looking to a hybrid cloud solution.

That is why we compiled the top 5 reasons why your company should rethink your IT infrastructure and look into the benefits of private cloud computing.

5. Dedicated hardware means increased security. Much like a dedicated or colocated server, the security of your private cloud depends on a variety of factors. However, if you have the proper physical security, anti-virus software, and firewall rules in place, you can rest assured your data is as safe as if it were sitting right next to your desk. With a private cloud, you know where your servers are located and that the proper physical and network security is in place. You can meet and talk to those in charge of providing support for your hardware and come visit it if you like.

4. The transition from physical to virtual servers leads to better flexibility. This is one of the most alluring benefits of cloud computing. The ability to spin up and tear down a server in a matter of minutes is incredibly powerful and useful. No longer is there any wasted effort in trying to size a server beforehand when you can create a server on the fly. Need more disk space? More RAM? More CPU? No problem. With private cloud computing, you can reallocate resources in moments without worrying about finding a physical server that will have the resources your new server needs.

3. Fully utilize your hardware with better resource management. Virtualization significantly increases the value of your physical server hardware. Instead of having 5 servers that average 10% CPU utilization, you can virtualize the 5 servers on one physical server, sharing the resources. This decreases rack space, power usage, and is easier to manage. This also allows you to create copies of your servers and have them up and running very quickly, now that they have been virtualized. If you have the proper resource management tools installed on your server, you can automatically allocate the appropriate resources to a server when it needs it or turn off unused servers during low usage: an extraordinarily powerful and efficient way to manage your servers.

2. Virtual servers combined with a SAN allow for improved protection against disasters. When you connect a SAN to your private cloud, incredible redundancy can be achieved. Not only can you load balance between servers, automatically shifting server resources between servers on the fly, but in an N+1 environment (having at least 1 more server than absolutely necessary), you can shut down one server without causing downtime. Imagine performing maintenance on your server like adding more RAM, replacing a hard drive, or upgrading software, without experiencing any downtime. When configured correctly you could power off one server and it would automatically shift the virtual servers over to an available server in your cloud. Taking your disaster protection up one level, you could have another SAN in another data center and perform SAN to SAN replication for a hot site DR environment capable of full recovery in less than an hour.

1. Switching to private cloud computing will save you time and money. The best part about a private cloud is that not only do you get all of the great benefits of virtualization and security, but it can be cheaper and less of a hassle than hosting your own servers or buying dedicated servers. If your company has more than 2 servers, it could benefit from virtualization. If your company has more than 10 servers, it could benefit from private cloud computing with a dedicated SAN and multiple physical host servers. The public cloud revolutionized Information Technology forever; the private cloud brings those benefits to the masses.
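The consolidation arithmetic in point 3 can be sketched in a few lines of Python. The utilization figures are the illustrative ones from the text, and the flat per-guest hypervisor overhead is an assumption, not a measured value:

```python
# Estimate the effect of consolidating underutilized physical servers
# onto a single virtualization host. All figures are illustrative.

def consolidated_utilization(cpu_utilizations, overhead=0.05):
    """Sum per-server CPU loads plus an assumed flat hypervisor overhead per guest."""
    return sum(u + overhead for u in cpu_utilizations)

# Five servers averaging 10% CPU each, as in the example above.
servers = [0.10] * 5
host_load = consolidated_utilization(servers)

print(f"One host would run at about {host_load:.0%} CPU")  # ~75%
print(f"Physical servers eliminated: {len(servers) - 1}")
```

Even with a generous 5% overhead per guest, the combined load fits comfortably on one host, which is why the rack space and power savings follow almost automatically.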

Conclusion: Save Money, Save Time, Sleep Easy: Transition to Private Cloud Computing.
Private cloud hosting is still a relatively new concept, but it is based on technology that has been around for a while and has proven itself for years. Besides providing the tremendous benefits of pooled computing resources and virtualization, it maintains the security and reliability of a normal dedicated server. The private cloud is not for everyone, but a managed data center operator can make the transition to private cloud computing an affordable process and enable you to fully experience all of the great benefits of the private cloud.


About the Author

Online Tech offers a full spectrum of hosting solutions including basic colocation, managed colocation, managed dedicated servers and private cloud hosting in our Michigan data centers. We can deliver our data center solutions more cost effectively, with lower overhead, less risk and better support than most IT departments can do themselves.

Cloud Computing : Building the Private Cloud

Building the Private Cloud

Author: OnlineTech
A private cloud combines the benefits of cloud computing, flexibility and cost effectiveness, with the security, data integrity, and service level agreements (SLAs) of a SAS 70-certified, managed dedicated server environment.

[caption id="attachment_219" align="aligncenter" width="300" caption="Cloud Computing Australia : Cloud Computing - Building the Private Cloud"]Cloud Computing Australia : Cloud Computing - Building the Private Cloud[/caption]

 Private cloud hosting delivers dedicated servers, storage, and networks, so each client is assured an exclusive framework that offers data integrity and network security. This eliminates the concerns inherent to a public cloud environment, such as other clients' applications running on the same server.

Your company can share applications, storage, and network resources in a secure, dedicated environment that is physically separated from your provider's other client environments. The hardware, SAN, virtualization, and operating systems are configured and managed by the provider to your unique specifications.

Some providers' private cloud offerings utilize VMware as the engine for their virtualization platform. VMware is the industry leader in virtualization, is considered enterprise-grade, and is fourth-generation virtualization technology. VMware and a private cloud solution together provide the following benefits:

Reduce costs while increasing operational efficiencies - Quickly provision cloud resources for business critical applications as needed, and save your company the cost and inefficiency of additional underutilized hardware purchases, data center builds, and staffing.

Easily add and tune computing resources - Virtualization services allow easy scaling for each private cloud environment. A provider can dynamically add and tune computing resources for its private cloud clients as the demands on their applications grow and change. This eliminates the need for permanent investment when a temporary demand spike can be accommodated by accessing the resource pool.

Use experience to build your customized environments - A provider's experience in multi-server environments, shared data storage, network security, and virtualization means the operations team is ready to deliver private cloud solutions to meet your specific needs.


About the Author

Online Tech offers a full spectrum of hosting solutions including basic colocation, managed colocation, managed dedicated servers and private cloud hosting in our Michigan colocation data centers. We can deliver our data center solutions more cost effectively, with lower overhead, less risk and better support than most IT departments can do themselves.

Cloud Computing : Amazon EC2 and Microsoft Azure

Comparing Amazon EC2 and Microsoft Azure


In one of our earlier posts we did a comparison between Google App Engine and EC2 based on some of the key parameters driving a Cloud Computing platform. In this post we will focus on a comparison between Amazon EC2 and Microsoft Azure.

[caption id="attachment_210" align="aligncenter" width="323" caption="Cloud Computing Australia : Cloud Computing - Comparing Amazon EC2 and Microsoft"]Cloud Computing Australia : Cloud Computing - Comparing Amazon EC2 and Microsoft [/caption]

 Language/Platforms Supported

  • Amazon EC2 provides support for setting up virtualized instances of operating systems, applications and databases, which essentially implies that it can support pretty much all the standard environments: Windows (Server 2000/2008), Linux flavors, OpenSolaris, multiple databases (DB2, Oracle, SQL Server), and various web and application servers.

  • Microsoft Azure, on the other hand, is purely a Windows environment, with Windows Server 2008 as the primary environment; all languages and applications available on Windows Server are supported. While Microsoft has released SDKs to support interop for other languages like Java, PHP and Ruby on Azure, I would still consider it primarily a first-choice .NET environment.

  • Amazon is clearly a much more flexible environment and can support a wide variety of applications. On the other hand, if you have a Microsoft-based platform or application, Azure will be a strong consideration.

Deployment/Setup Complexity

  • Setup for Amazon EC2 involves creating an image and configuring a virtual instance to include the operating system and application. The setup and deployment activities are similar in nature to setting up a virtualized environment. Amazon provides command line tools to help support this.

  • Windows Azure offers two variations in terms of deployment: a hosted model similar to Amazon EC2, and an alternate Azure Platform Appliance, which allows a local deployment of Windows Azure on a company's own servers to set up a local virtual cloud. Azure also provides an integrated approach to deployment directly from Visual Studio, which allows you to deploy the solution onto Azure.

  • Integrated deployment options within developer tools significantly reduce the deployment complexity for Azure-based apps. In addition, much of the VM configuration that needs to be set up on Amazon is not required on Azure.

Performance and Scalability

  • Amazon EC2 is a mature platform and has been proven to provide significant scalability options for cloud applications. Apart from auto scaling options, custom configurations pretty much allow you to scale to any level. Also, since EC2 deploys applications as virtual instances, it provides an opportunity to completely control and scale your instance and application.

  • Azure, while not a traditional VM, provides a VM-like environment for hosting applications. It provides auto scaling options and also the ability to choose various configurations for the VMs based on resource needs. SQL Azure functions as a cloud database in a shared environment, whereas Amazon RDS can be controlled per instance.

  • EC2 clearly provides a more configurable environment to scale up and scale out your applications with much more granular control over your applications and environment.

Stay posted for an upcoming blog post with a 3-way comparison/cheatsheet on how to make a choice between the three major players in PaaS: Amazon EC2, App Engine, and Azure.

Check our Blog Technology Trendz for more details.



About the Author

Kaushik Raghupathi is a senior IT Professional and Project Manager working out of India. Over the years he has worked on numerous IT projects with large sized teams. He is personally very fascinated around Learning methodologies in general and specifically around Community Based Learning. He is currently experimenting the concepts by working with students in this area.


Key Events in the history of Cloud Computing

5 Key Events in the history of Cloud Computing


While we have been evaluating in our blog posts the various features available on popular cloud computing platforms today, I thought it might be a good idea to look back at where all this began and trace some of the key events in the progress of cloud computing. Amazon, like other Internet companies in the period of the dot-com bubble, was left with large amounts of underutilized computing infrastructure; reports suggest less than 10% of the server infrastructure of many companies was being used. Amazon may have seen cloud computing as a way to offer these unused resources as a utility computing service when it launched S3 as the first true cloud computing service in March 2006.

[caption id="attachment_206" align="aligncenter" width="300" caption="Cloud Computing Australia : 5 Key Events in the history of Cloud Computing"]Cloud Computing Australia : 5 Key Events in the history of Cloud Computing[/caption]

1. Launch of Amazon Web Services in July 2002
The initial version of AWS in 2002 was focused more on making information available from Amazon to partners through a web services model, with programmatic and developer support, and was very much centered on Amazon as a retailer. While this set the stage for the next steps, the launch of S3 was the true step towards building a cloud platform.

2. S3 Launches in March 2006
Here are some interesting articles on the launch of S3 in 2006. The real breakthrough, however, was the pricing model for S3, which defined the 'pay-per-use' model that has now become the de facto standard for cloud pricing. The launch of S3 also marked the shift of Amazon from being just a retailer to a strong player in the technology space.

3. EC2 Launches in August 2006
EC2 had a much quieter launch in August 2006, but I would argue it had the bigger impact by making core computing infrastructure available. This completed the loop on enabling a more complete cloud infrastructure. In fact, at the time analysts had some difficulty understanding what the big deal was, and thought it looked similar to other hosting services available online, only with a different pricing model.

4. Launch of Google App Engine in April 2008
The launch of Google App Engine in 2008 marked the entry of the first pure-play technology company into the cloud computing market. Google, a dominant Internet company, entering this market was clearly a major step towards widespread adoption of cloud computing. As with all their other products, they introduced radical pricing models, with a free entry-level plan and extremely low-cost computing and storage services that are currently among the lowest in the market.

5. Windows Azure launches Beta in Nov 2009
The entry of Microsoft into cloud computing is a clear indication of the growth of the space. For a long time Microsoft did not accept the Internet and the web as a significant market, and continued to focus on the desktop. The launch of Azure suggests a realization that a clear shift is taking place; it is a key event in the history of cloud computing, with the largest software company making a small but significant move to the web.

Check our Blog Technology Trendz for more details.


About the Author

Kaushik Raghupathi is a senior IT Professional and Project Manager working out of India. Over the years he has worked on numerous IT projects with large sized teams. He is personally very fascinated around Learning methodologies in general and specifically around Community Based Learning. He is currently experimenting the concepts by working with students in this area.


History of Cloud Computing

History - Discovery of Cloud Computing

Author:Tyler Farell

Widespread study is the reason for the breakthrough of cloud computing. Many of us wonder when and where it truly began. Believe it or not, the original idea can be traced back to the 1960s.

[caption id="attachment_201" align="aligncenter" width="300" caption="Cloud Computing Australia : History - Discovery of Cloud Computing"]Cloud Computing Australia : History - Discovery of Cloud Computing[/caption]

The discovery of cloud computing is a product of the partnership of IT professionals working to improve the computer hosting services that existed before. The fundamental concept can be traced back to the 1960s, when John McCarthy suggested that 'computation may someday be organized as a public utility.' Nearly all the modern characteristics of cloud computing, the comparison to the electricity industry, and the use of private, public, community, and government forms were systematically investigated in the book The Challenge of the Computer Utility by Douglas Parkhill in 1966.

The actual word 'cloud' came into use in the 1990s, when one telecommunications company offered Virtual Private Network (VPN) services with quality of service comparable to dedicated point-to-point data circuits, but at a much lower charge. By shifting traffic to balance utilization as they saw fit, they were able to use their overall network bandwidth more efficiently. The cloud symbol was used to mark the demarcation point between the responsibility of the cloud provider and that of the cloud user. Cloud computing extends this boundary to cover servers as well as the network infrastructure. It is a natural advancement of the widespread adoption of virtualization, service-oriented design, and autonomic and utility computing. The fine points are hidden from potential cloud users, who no longer need proficiency in, or direct control over, the underlying technology infrastructure.

In 2006, having found that the new cloud design produced major internal efficiency enhancements, Amazon kicked off a product development effort to offer it to external clients, and launched Amazon Web Services (AWS) as a utility computing service. By mid-2008, Gartner saw an opportunity for cloud computing 'to shape the relationship among consumers of IT services, those who use IT services and those who sell them.' Gartner observed that many organizations were switching from company-owned hardware and software assets to per-use service-based models, and this great shift resulted in the increased popularity of cloud computing.


About the Author

Tyler Farell is an auto enthusiast who writes on cloud computing related news and topics. For more information visit: Cloud Computing

Cloud Computing : Google App Engine

Cloud Computing, Google App Engine: How big is the market Really ?


This year has been the year of cloud computing. You would need to be literally hiding under a rock not to have heard about it. It's everywhere: enterprises are moving to the cloud, small startup companies are moving their services to the cloud, and it seems like the whole world will pretty much be on the cloud soon. Is it really that big? It might be a good time to evaluate how big the adoption really is, who the big players are, and what the competition looks like.

[caption id="attachment_197" align="aligncenter" width="300" caption="Cloud Computing Australia : Cloud Computing, Google App Engine: How big is the market Really ?"]Cloud Computing Australia : Cloud Computing, Google App Engine: How big is the market Really ?[/caption]

Google App Engine, Microsoft Azure, and in some ways Amazon EC2 all fall into a specific sub-category within cloud computing known as Platform as a Service (PaaS). Platform as a Service typically applies to platforms which support the deployment of applications without the underlying cost of servers and other hosting infrastructure. What differentiates PaaS from regular hosting is the use of specific proprietary platform APIs and development tools, which typically provide a level of abstraction that reduces development and maintenance requirements significantly.

Platform as a Service Adoption
Based on a survey from Forrester, Platform as a Service is a measly 2% of the overall application/platform landscape. Check this graphic for more details.

While the survey is a little dated, I do think it accurately reflects the current adoption levels in the market; most of the discussion of rapid adoption of the cloud is more around hosting and may not be around PaaS. The rest of it is the typical Hype Cycle in play.

App Engine/Azure/EC2 Adoption

Now, within the entire cloud platform space, it is interesting to evaluate how the various players stack up against each other. A similar Forrester survey of developers using the various stacks breaks down as Amazon EC2 at 41%, Microsoft Azure at 10.2%, and App Engine at 8.2%.

What does this all mean ?
Well, it means that adoption is still in its infancy. While there is a lot of excitement around cloud platforms and they show a lot of promise, there is still some way to go before these platforms become a mainstream choice. It also means that this is the right time to get started, playing around with the tools and understanding the platforms while the standards are still evolving. While enterprises may take a while to catch up, these are great platforms for small companies, individual developers, students and educators to adopt and learn on while the platforms evolve.

Check more on the Technology Trends Blog


About the Author

Kaushik Raghupathi is a senior IT Professional and Project Manager working out of India. Over the years he has worked on numerous IT projects with large sized teams. He is personally very fascinated around Learning methodologies in general and specifically around Community Based Learning. He is currently experimenting the concepts by working with students in this area.


Cloud Computing : MicroSoft Windows Azure

What is Windows Azure?

Author: dwspriya

Windows Azure is Microsoft's cloud services operating system, providing service hosting, service management, and a development environment for the Windows Azure platform. It enables clients to develop, manage, and host web applications online with the help of Microsoft's data centers.

[caption id="attachment_193" align="aligncenter" width="300" caption="Cloud Computing Australia : Cloud Computing - MicroSoft Windows Azure"]Cloud Computing Australia : Cloud Computing - MicroSoft Windows Azure[/caption]

Windows Azure is able to support several languages and can integrate with your existing on-premises environment. At present, it is sold commercially in 41 countries, including the United States, Canada, the UK, and Australia.

There are three main components in Windows Azure: compute, storage, and fabric. Compute provides an environment for computation, while storage deals mainly with providing storage for large-scale data. Fabric refers to the physical makeup of the platform, as it is composed of a network of servers and switches. A content delivery network (CDN) service is also offered along with Windows Azure to allow content delivery from Azure storage to end users.

With the use of Windows Azure, developers can run applications and even store data on servers that are owned and operated by Microsoft. Such cloud applications can be valuable for both consumers and businesses.

There are several benefits associated with the use of Windows Azure. For one thing, it helps improve efficiency. It does not only improve the productivity of a business but can also reduce up-front costs. In fact, Windows Azure can help reduce the total cost of operations by as much as 40% within a period of three years.

Since Windows Azure supports common languages, protocols, and standards, it is easy to operate, and you can still make use of your skills in .NET, PHP, and Java in creating and managing applications on the web.

Another advantage of using Windows Azure is that you get to focus more on giving quality services to your clients instead of worrying about technological and operational obstacles. You can also make the most of the development tools provided, as well as the automated service management, to meet the needs of your customers at a faster rate. Of course, Windows has always provided reliable services, and you can be sure of a first-rate experience with Windows Azure.


About the Author

In order to find out more on bluehost hosting and similar website and webmaster related guides, check out hosting reviews

Sunday, August 28, 2011

Cloud Computing : Who Should Use Amazon Ec2?

Who Should Use Amazon Ec2?

Author:Zach Dexter

Amazon's Elastic Compute Cloud, or EC2 for short, is a network of Amazon's server hardware that customers may access as a web service. EC2 geographically distributes the resources needed to serve your application, and it is capable of rapidly bringing new server instances online. Customers only pay for the bandwidth and space that they actually use; this, Amazon says, is more cost-effective than in-house or hosted servers. Amazon's foray into cloud computing may seem like a blessing for customers and a breakthrough in the web hosting industry. The technology is certainly impressive. But for the average small business or personal website, traditional non-cloud web hosting is less time-consuming and more economical.

[caption id="attachment_188" align="aligncenter" width="300" caption="Cloud Computing Australia : Cloud Computing - Who Should Use Amazon Ec2 ?"]Cloud Computing Australia : Cloud Computing - Who Should Use Amazon Ec2 ?[/caption]

EC2 works by creating multiple instances of an Amazon Machine Image - essentially, the files that comprise a web application - from a base copy that resides on Amazon's Simple Storage Service. Via the API or one of the prebuilt API implementations available, users may create huge numbers of instances of the original Amazon Machine Image. Each instance runs on a different server. These servers are based in multiple locations around the world so that your application does not go down in the event of a natural disaster or other crippling event at a single location.

These features are ideal for high-demand web applications with many thousands of users.  Business-critical applications and high-traffic web sites may find significant cost savings in EC2 because it reduces overhead.  Instead of physically setting up new servers in the company datacenter or coordinating the physical installation of new servers at a remote datacenter, companies signed up for EC2 simply ask an employee to log on and press a few buttons via an API implementation - such as the EC2 extension for Mozilla Firefox.  Check out this screencast on YouTube that shows just how easy it is to bring a new Linux instance online (opens in a new window):
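The request described above, starting several copies of one machine image through the API, can be sketched as follows. The AMI ID, instance type, and region are placeholder assumptions, and the actual API call (shown via the modern boto3 SDK, which post-dates this article) is commented out because it requires AWS credentials:

```python
# Sketch of launching several instances of one Amazon Machine Image
# through the EC2 API. The AMI ID and instance type are placeholders.

def build_run_instances_request(ami_id, count, instance_type="t2.micro"):
    """Assemble the parameters for an EC2 RunInstances API call."""
    return {
        "ImageId": ami_id,
        "MinCount": count,  # the request fails if fewer instances can start
        "MaxCount": count,
        "InstanceType": instance_type,
    }

params = build_run_instances_request("ami-12345678", count=3)
print(params["MaxCount"])  # 3

# With AWS credentials configured, the actual call would be:
#   import boto3
#   ec2 = boto3.client("ec2", region_name="us-east-1")
#   response = ec2.run_instances(**params)
```

The point of the sketch is how little there is to it: scaling from 3 to 300 instances is a change to one number, which is exactly the "press a few buttons" convenience the paragraph above describes.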

But say that you’re a small business with one employee in charge of managing your servers.  In this case, you should not use EC2 unless your application is extremely large.  What overhead will you really eliminate?  Dedicated server providers like SoftLayer already make it easy enough to bring a new server online in an hour or two.  Small businesses can usually scale up at a slower rate without losing revenue.  Most significantly, however, bandwidth charges at EC2 actually exceed those of many dedicated hosting companies, though disk space is reasonably priced.

See for yourself at Amazon's website:  calculate whether you would save money with Amazon’s EC2 (opens in a new window).
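For a back-of-the-envelope version of that calculation, the sketch below compares a pay-per-use bill against a flat-rate dedicated server. All prices here are illustrative assumptions, not Amazon's or any host's actual rates:

```python
# Rough monthly cost comparison between pay-per-use cloud hosting and a
# flat-rate dedicated server. All rates below are illustrative assumptions.

def cloud_cost(bandwidth_gb, storage_gb, per_gb_bw=0.10, per_gb_storage=0.10):
    """Monthly pay-per-use charge: bandwidth plus storage, billed per GB."""
    return bandwidth_gb * per_gb_bw + storage_gb * per_gb_storage

def dedicated_cost(flat_rate=150.0):
    """Monthly flat rate for a dedicated server, bandwidth included."""
    return flat_rate

# A moderately trafficked site: 2 TB of transfer, 100 GB of storage.
pay_per_use = cloud_cost(bandwidth_gb=2000, storage_gb=100)
print(f"Pay-per-use: ${pay_per_use:.2f}/month")  # $210.00
print(f"Dedicated:   ${dedicated_cost():.2f}/month")
print("Cloud wins" if pay_per_use < dedicated_cost() else "Dedicated wins")
```

With these assumed rates, the moderately trafficked site pays more on pay-per-use, while a near-zero-traffic portfolio site would pay almost nothing; that is the crossover the article is describing.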

If you run a moderately-trafficked website, you will likely find that EC2 actually costs more - and that signing up will not do much to reduce your overhead.  If you run a simple business about-us site or a personal portfolio, you will probably end up with a tiny charge - a dollar or even less - per month.  But EC2 doesn’t come with the friendly control panels and easy site-management tools that shared hosts have.  You would also have to purchase extra software (like cPanel or DirectAdmin if you want a control panel), download an API implementation, and go through the hassle of learning to properly configure network settings and work with the quirks of virtual computing.

In the end, the best value for small and medium-sized web applications is a virtual private server or a physical dedicated server.  Unless you are familiar with configuring web servers, the best value for personal pages and portfolios is shared hosting.  And for the big companies, business-critical applications and very large websites, the Amazon Elastic Compute Cloud will likely save money in the long term by reducing overhead and increasing reliability and scalability.


About the Author

The author writes a blog about trends and news in the web hosting industry. The Top Three Awards, given or renewed each July, are a resource for people who are searching for tested, reliable web hosts.

Cloud Computing Green

Is Cloud Computing Green?

Author: Rick Blaisdell

The promised IaaS article is currently in progress. I blame the delay on my friend Greg from the Green industry :) .

[caption id="attachment_184" align="aligncenter" width="300" caption="Cloud Computing Australia : Is Cloud Computing Green ?"]Cloud Computing Australia : Is Cloud Computing Green ?[/caption]

Last week, I met up with Greg and gave him my usual Kool-Aid cloud speech on cloud savings, scalability, built-in reliability, disaster recovery and security. He looked at me and said, "that sounds exciting," but judging by his facial expression there did not seem to be much excitement, so to get some life out of him, I said, "of course, there are significant energy savings!" Now "Green" Greg sparked to life, and for the next two hours all he wanted to hear were details on how the cloud saves energy.

When deciding whether an organization is ready for adopting cloud technologies, most people will take into consideration aspects such as financial savings, how well it suits their company, how much it will simplify their IT processes, etc.

What many decision makers do not take into account happens to be a great benefit of cloud computing: it is green! Cloud computing comes with the great advantage of providing great energy savings, a fact which translates into being environmentally friendly.

Are you aware of how much energy IT systems consume and what impact they have on the environment? Did you know that the simple act of performing a Google search can be quantified in terms of carbon footprints? It is easy to imagine then that the maintenance of complex IT infrastructures consumes a significant amount of energy.

According to Gartner, the IT industry consumes at least 2% of all global energy use. Since IT processes have negative effects in the real world, you might want to consider minimizing those by taking into account the implementation of cloud computing. Yes, another great reason to do so…

Here are just a few motives why cloud computing technologies are more environmentally friendly than other types of IT infrastructures and how they contribute to sustainable living:

  • Recent studies have demonstrated that the environmental footprint of using cloud computing as compared to having an internal IT system can be reduced by 90%. This is definitely the case in my experience: with a recent IT migration to SaaS, I have been able to eliminate over 55 servers, onsite backup and cooling systems, thus saving over $700 a month by reducing servers, Storage Area Networks and cooling systems.

  • Data centers that utilize cloud technologies are more efficient than traditional data centers. Energy is usually lost through server underutilization, because most of the time these servers are just idle, but in a cloud environment the system is managed to run at the highest efficiency. It would be like 80 trucks driving down the road, all driving to the same location and each carrying 1% of their capacity, instead of one truck loaded to 80% capacity. In this scenario, the remaining 79 trucks are just wasting energy.

  • In addition, data center planning allows better power utilization. In traditional data centers, there can be cooling problems and you can run out of space for more servers. There is also a consortium of providers – the Green Grid – which encourages its members to optimize their data centers to minimize power consumption.

  • According to a recent Microsoft report, cloud computing can help with energy reductions through the employing of large scale virtualization. Also, software architecture can be optimized so that it provides the same functionality with less energy.

  • Service providers in cloud computing need to keep their expenses down, so they must ensure there is no waste of energy. Their focus is on performance so they provide the maximum of services with the least resources, energy included, which ultimately results in lower costs to you, the customer.
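The truck analogy in the list above can be restated as a simple utilization calculation. The figures are the illustrative ones from the analogy, not measurements:

```python
# The truck analogy as a utilization calculation: the same total workload
# spread across many idle servers versus packed onto one host.

def fleet_efficiency(workload, capacity_per_server, servers):
    """Fraction of total provisioned capacity actually doing work."""
    return workload / (capacity_per_server * servers)

# 80 servers each carrying 1% of their capacity...
spread_out = fleet_efficiency(workload=0.8, capacity_per_server=1.0, servers=80)
# ...versus one server loaded to 80% of its capacity.
packed = fleet_efficiency(workload=0.8, capacity_per_server=1.0, servers=1)

print(f"Spread across 80 servers: {spread_out:.0%} utilization")  # 1%
print(f"Packed onto 1 server:     {packed:.0%} utilization")      # 80%
```

Since idle hardware still draws a large share of its peak power, moving the fleet from 1% to 80% utilization is where most of the energy savings described above come from.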

In recent years, technology has improved immensely, taking the environment into account and providing a solution for those worried about carbon footprints and the impact of technology on the environment. And with the booming development of cloud computing technologies, it is now easier to do so.

Achieving energy savings and carbon emissions reduction is possible with the help of technologies such as SaaS, IaaS and PaaS and the on demand allocation of resources in cloud computing.

"Green" Greg now feels much better about the cloud! :) For those of you who have implemented these technologies, how much did the estimated energy savings impact the decision to do so? Are you interested in knowing how you can calculate energy savings?


About the Author

Cloud Computing Expert, Entrepreneur and Strategy Consultant

Cloud Computing : Closer Look at the iCloud

Taking a Closer Look at the iCloud

Author: Rick Blaisdell

Apple is set to change the technology world again. And this time it is entering the cloud computing realm. With the recent announcement of the iCloud service, it has embarked in a crusade to change the way customers store and access their files and applications.

[caption id="attachment_179" align="aligncenter" width="300" caption="Cloud Computing Australia : Taking a Closer Look at the iCloud"]Cloud Computing Australia : Taking a Closer Look at the iCloud[/caption]

iCloud is an Internet-based service which stores and automatically synchronizes content, in order for it to become available on all Apple devices: iMacs, iPods, iPads, etc. The supported content ranges from music, photos, books and documents, to email, contacts, calendar and bookmarks.

Below I take a look at the major benefits and what I consider to be disadvantages of the new service:

  • The synchronizing of documents is the obvious advantage of the iCloud. Since no one is using just one digital device, this step is quite necessary, in order to allow people to have access to files easily and conveniently. Having to remember what files you stored on what device is not very practical.

  • Another advantage is that Apple offers 5 GB of free storage space, which does not count music, apps, books and photos. Personally, I'm curious how fast people will use up those 5 GB.

  • Furthermore, it syncs documents created with third-party apps. Will this have any impact on developing apps for iCloud? Or is this all handled at the iOS level?

  • On the other side of the coin, being able to store songs not bought from iTunes, in exchange for an annual fee through the iTunes Match service (with a 25,000-song limit), has brought up the subject of piracy. However, I would expect they have probably worked this out.

  • Moreover, there is no iCloud support for XP users; the service requires either Vista or Windows 7. So in the end, Apple is actually pushing Windows users to upgrade to Microsoft's latest OS.

All in all, iCloud provides easy access to the entire content one might need, while taking advantage of the mobility, reliability, and scalability characteristic of cloud computing. Also, if you lose one device, you can take advantage of the backup capabilities offered by the cloud – no files are lost.

People are quite skeptical about whether the iCloud will prove to be a popular product, since this is not the first time Apple has tried to achieve success in this area (MobileMe was quite disappointing). I believe that this service will boost the popularity of cloud computing even more and will have a major impact on the life of the average Internet user. Cloud computing is becoming more and more a part of everyday life.

Article Source:

About the Author

Cloud Computing Expert, Entrepreneur and Strategy Consultant

Cloud Computing Trends

Top 5 Cloud Computing Trends

Author: Rick Blaisdell

Since cloud computing seems to be on everyone's lips, it is quite hard to tell what the latest trends are and in what direction it is developing. Sometimes it seems there is more confusion than clarification. Should the cloud be private or hybrid? Should you choose IaaS, SaaS or PaaS?

[caption id="attachment_176" align="aligncenter" width="300" caption="Cloud Computing Australia : Top 5 Cloud Computing Trends"]Cloud Computing Australia : Top 5 Cloud Computing Trends[/caption]

In past articles I have discussed the advantages and disadvantages of each cloud technology. Now, let's take a look at the latest trends in cloud computing.

I am generalizing here what I believe to be the top 5 cloud computing trends:

  1. Cloud adoption will accelerate.

It is expected that both large enterprises and SMBs will adopt cloud computing technology at an exponential rate.

  2. Cloud security will no longer be an issue.

This is related directly to the first point, as IT professionals discover that a managed cloud can be more secure than a physical environment managed by your own IT staff, who are responsible for many other IT projects.

  3. Custom cloud computing services.

Cloud migrations span from physical environments to SaaS, PaaS and IaaS. That is a lot of ground to cover for an IT firm trying to be your cloud expert. Outsourced IT organizations will concentrate on automating very specific migrations and become the experts in those types of migrations. An example is outsourcing your Exchange environment: this is one of the most painful cloud migrations, and IT companies focus on just this type of migration, offering services and automated software to make sure it is smooth and painless.

  4. Custom software development will shift towards the cloud.

Legacy software applications will need to be refactored to run efficiently in cloud environments. This will increase software development, and outsourcing will experience a boom.

  5. Innovation.

Probably the most important one :) Innovation will drive down cloud computing costs, increase security and help with migration from physical environments to the cloud. As cloud computing innovation continues, it will be difficult to argue that companies should not move to the cloud.

Moreover, I also believe that an alignment of standards is necessary – so far there are organizations such as The Green Grid and the Cloud Security Alliance, but a comprehensive guide/entity to which most cloud providers adhere is yet to be created.

All in all, I believe that more businesses will get over the fear of embracing cloud computing as IT directors start to fully understand how their businesses could benefit from this new technology. I am expecting even wider cloud adoption at an ever faster rate.

Article Source:

About the Author

Cloud Computing Expert, Entrepreneur and Strategy Consultant

Adoption of the Cloud Computing

How Colocation Can Lead to Widespread Adoption of the Cloud

By Yan Ness

Cloud computing is a type of shared computing where one utilizes a hosting provider's large-scale computing infrastructure. In other words, a provider with thousands of interlinked servers will rent out its massive computing capability to customers looking to pay only for the computing that they use.

[caption id="attachment_166" align="aligncenter" width="300" caption="Cloud Computing Australia : Adoption of the Cloud Computing"]Cloud Computing Australia : Adoption of the Cloud Computing[/caption]

One of the benefits of cloud computing is the flexibility it provides. In theory, you can use 1 unit (CPU, storage, RAM or data transfer) one second and hundreds of units the next, and you pay only for the units you use, with no minimum or fixed fee. There is no commitment and you can stop using/paying whenever you want.
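To make the pay-per-use idea concrete, here is a toy billing calculation. The rate and the usage numbers are invented for illustration; real providers meter and price in their own units:

```python
def metered_bill(units_per_second, rate_per_unit):
    """Charge only for units actually consumed each second;
    no minimum fee and no fixed fee."""
    return sum(units_per_second) * rate_per_unit

# 1 unit one second, 100 units the next, then idle:
cost = metered_bill([1, 100, 0], 0.002)  # 101 units x $0.002 = $0.202
```

When usage drops to zero, so does the bill, which is exactly the "stop paying whenever you want" property described above.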

The only investment and commitment you must make is writing your software to utilize the cloud provider's computing architecture. Each time you switch cloud providers, you have to rewrite your software to match the new provider's application programming interface (API).
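To illustrate that lock-in cost, a common mitigation is to isolate provider-specific calls behind a thin adapter, so that switching providers means rewriting one class rather than the whole application. This is a minimal sketch; the provider names and the commented-out SDK calls are invented stand-ins, not real libraries:

```python
class CloudStorage:
    """Provider-agnostic interface the application codes against."""
    def put(self, key, data):
        raise NotImplementedError

class ProviderA(CloudStorage):
    """Adapter for a hypothetical 'Provider A' cloud API."""
    def __init__(self):
        self._store = {}          # stands in for Provider A's backend
    def put(self, key, data):
        # Real code would call e.g. provider_a_sdk.upload(bucket, key, data)
        self._store[key] = data
        return f"a://{key}"

class ProviderB(CloudStorage):
    """Adapter for a hypothetical 'Provider B' cloud API."""
    def __init__(self):
        self._store = {}
    def put(self, key, data):
        # Real code would call e.g. provider_b_sdk.put_blob(container, key, data)
        self._store[key] = data
        return f"b://{key}"

def save_report(storage: CloudStorage, name, body):
    # Application logic is unchanged regardless of provider.
    return storage.put(name, body)
```

Without such a layer, every `put`-style call in the codebase is written directly against one vendor's API, which is precisely why switching providers means a rewrite.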

The three enticing promises of cloud computing are:

Expandability - You can rapidly expand your computing infrastructure as the demand for your site or application expands.

Pay only for what you use - You only have to pay for the processing capability you use.

Flexibility - Most cloud services require no minimum use commitment, so you can stop the service at any time.

The Problem with the Cloud

Cloud providers have two disadvantages that must be overcome before the three benefits above apply to just about ALL computing. In essence, until they solve these problems, adoption will be narrow. Once they solve these challenges, we believe cloud computing will be ubiquitous.

The two primary issues are:

The service level agreements offered with cloud computing lack substantial guarantees or assurances. Some of this has to do with the immaturity of the cloud computing model, as providers are still figuring out how they can deliver performance guarantees in a shared environment that can be affected by macro events.

Internet and computing usage spikes on Black Monday or catastrophic news events are unpredictable - and planning enough computing overhead for these non-normal events can be costly for cloud vendors.

We believe this SLA issue can be solved by technology. Using sophisticated monitoring and metering, and automated price bidding for computing cycles during heavy-demand periods, cloud providers will be able to manage the allocation of compute resources in real time and throttle back second-tier customers to deliver against an SLA for top-tier users.
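A minimal sketch of that throttling idea, assuming a simple two-tier model in which top-tier demand is satisfied first and second-tier customers share whatever capacity remains, pro rata. The policy and all numbers are hypothetical, not any provider's actual scheduler:

```python
def allocate(capacity, demands):
    """Tier-priority allocation of compute units.
    demands maps customer -> (tier, requested_units)."""
    tier1 = {c: d for c, (t, d) in demands.items() if t == 1}
    tier2 = {c: d for c, (t, d) in demands.items() if t == 2}
    alloc = {}
    for cust, want in tier1.items():     # top tier first: the SLA guarantee
        alloc[cust] = min(want, capacity)
        capacity -= alloc[cust]
    total2 = sum(tier2.values())
    for cust, want in tier2.items():     # second tier throttled pro rata
        alloc[cust] = 0 if total2 == 0 else min(want, capacity * want // total2)
    return alloc
```

With 100 units of capacity, a tier-1 customer wanting 60 gets all 60, while two tier-2 customers wanting 50 each are throttled to 20 apiece; during a demand spike, only the second tier feels the squeeze.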

The second challenge is much tougher. For most businesses, cloud computing is too new and untried to risk their business life on it. It reminds me of when the Internet first appeared. People were very nervous about putting their data "on the net" or providing personal information on a web page. Now you can't live without it (ever booked an airline ticket?).

Once the SLA problem is addressed, the benefits may outweigh this trust issue, but until then adoption will be very sparse. And then again, maybe the Cloud will never overcome the trust issue.

How to Solve The Trust Issue

To accelerate the adoption of the cloud in corporate computing we need to overcome the trust issue. The colocation operator is the answer to this challenge - they can provide a very powerful and cost-effective channel to accelerate adoption.

Why do people colocate with their local managed data center operator, rather than go with the cloud, if it's an option? Trust.

So, the question becomes how can the cloud leverage the trust delivered by colocation operators and use colocation as a channel to rapidly grow the market?

The answer is to develop an appliance that colocation operators can sell and manage for their clients - one that provides a certain amount of "umph" (CPU, memory, storage, number of servers, etc.). The appliance would carry the cloud vendor's API to easily run cloud-designed applications, and could be designed with built-in metering that allows clients to pay only for what they use.

But the real opportunity is the ability to let an application running in the colocated appliance "spill over" into the cloud, creating new servers and using resources in the cloud, outside the colocation data center.

Imagine, if you will, an Amazon cloud appliance. A Software-as-a-Service vendor can build their prototype on Amazon's cloud against Amazon's API. When the time comes for production, the vendor can choose to continue running in the cloud, or move to a colocated cloud appliance if the location and security of their data becomes a trust issue for their clients, for instance under SOX, HIPAA or PCI requirements. Should demand exceed their computing capacity, they could make the call on which part, if any, they want to source to the Cloud.
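The spill-over decision described above could be sketched as a toy placement policy: fill the colocated appliance first, burst only the overflow to the public cloud, and keep compliance-sensitive work local. The function, its parameters, and the policy itself are hypothetical illustrations, not any vendor's product:

```python
def place_workload(units_needed, appliance_free, compliance_sensitive):
    """Split a workload between the colocated appliance and the cloud.
    Regulated data (e.g. under SOX/HIPAA/PCI) never leaves the
    colocation data center in this toy policy."""
    on_appliance = min(units_needed, appliance_free)
    if compliance_sensitive:
        # No spill-over permitted: any overflow must wait or be rejected.
        return {"appliance": on_appliance, "cloud": 0}
    # Unregulated overflow bursts to the public cloud.
    return {"appliance": on_appliance, "cloud": units_needed - on_appliance}
```

The appealing property is that the client's trusted colocation footprint handles the steady-state load, while the cloud absorbs only the unpredictable peaks.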

It's at that point that the client will really get the value of expandability. If they've had experience with the appliance and with the colocation operator, that can bridge the trust gap. Now they are in a private cloud, and the colocation partner is there to do the support and provide the services that mainstream applications and users demand at this early stage of cloud computing.


Some view the Cloud as a threat to colocation operators' business model.

However, colocation is a service business that adapts to provide an expanding range of services and solutions - by managing hardware, network and software infrastructures on a scale with a level of responsiveness and support that is difficult for most companies to do themselves. The cloud is just another evolution in the process.

The real question is how can the cloud leverage the colocation model to accelerate the cloud market?

Yan Ness is CEO of Online Tech, the Midwest's premier managed data center operator, and has more than 20 years of experience launching and managing high-tech companies, from startup to scale. In 2003, Yan led a group of investors to acquire Online Tech and has since delivered a range of data center services, from midwest colocation to private cloud hosting, from its SAS 70-audited data centers.

Article Source: