Aug 31, 2011
 

Your First Hello World Google App Engine Application in 5 mins

Author: Kaushik

Continuing our series on getting started with Google App Engine, in this article we will get your first Hello World application running and deployed on App Engine. We will use Python for this example; if you are already well versed in Python, you can skip the basics and move directly to the deployment steps for App Engine.


Step 1: Download and install the Google App Engine SDK for Python
Here is the link for the Windows version.

Step 2: Create a simple helloworld python program

  • Create a directory named helloworld
  • Create a file in this directory named helloworld.py containing the code below

print 'Content-Type: text/plain'
print ''
print 'Hello, world!'

  • Create a configuration file called app.yaml in the same directory with the following content

application: mytestapphelloworld
version: 1
runtime: python
api_version: 1

handlers:
- url: /.*
  script: helloworld.py

Step 3: Run the Hello World Application using the Google App Engine Launcher

  • Click on the Programs Menu and start the Google App Engine Launcher
  • Click on File Menu -> Add Existing Application
  • Add the helloworld directory as the application
  • Click Run to start the application and click Browse to open the web page in the browser
  • You can also visit http://localhost:8080/ to see the live running application
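If you prefer the command line, the SDK also ships with a development web server. A minimal invocation (assuming the SDK's tools are on your PATH) looks like the line below, which serves the app at http://localhost:8080/ by default:

dev_appserver.py helloworld/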

Step 4: Deploy your app on App Engine

  • Go to the App Engine url at https://appengine.google.com/
  • Log in using the App Engine account you created in our earlier exercise
  • Create a new application by clicking on Create Application button
  • Use mytestapphelloworld as the application ID. If this is not available use a unique ID that you can remember
  • Update the app.yaml file with your application ID from above
  • Change the line below, replacing mytestapphelloworld with your application ID.

application: mytestapphelloworld

  • Run the following command

appcfg.py update helloworld/

  • Enter your Google ID/password at the prompt.
  • Your deployed application is available at

http://mytestapphelloworld.appspot.com
Learn more at Technology Trends Blog

Article Source: http://www.articlesbase.com/online-education-articles/your-first-hello-world-google-app-engine-application-in-5-mins-2786396.html

About the Author

Kaushik Raghupathi is a senior IT professional and project manager working out of India. Over the years he has worked on numerous IT projects with large teams. He is personally very fascinated by learning methodologies in general, and specifically by community-based learning. He is currently experimenting with these concepts by working with students in this area.

http://www.peepaal.org

Aug 31, 2011
 

Getting a Google App Engine Account

Author: Kaushik

Now that you have a basic understanding of what Google App Engine is and where and how you can use it, let's get started with setting up a Google App Engine account.


It's actually pretty simple. As with all Google products, if you have a Gmail ID you already have access to Google App Engine. You can use your Gmail ID and password to access the App Engine account. Makes it really simple, doesn't it? Many folks underestimate the impact of the universal access a Gmail ID provides to so many services and products. If you are like me, you hate having to register at every different service provider and site out there. That alone is one reason to try out App Engine: you already have access to it.

Step 1. Go to the URL for App Engine: https://appengine.google.com/

Step 2. If you already have a Google Account and are signed in, you will see something like the screen above with your user ID already populated. All you need to do is type in your password and log in.

Step 3. Log in to the dashboard.
After you log in you should see something like the screen below. You may not see any applications, since you have none deployed yet.

That’s it. You are all set to get started with building your first Google App Engine App.

Get more details at our blog TheTechTrendz.com. TheTechTrendz is a technology blog focused on tracking cutting-edge technology trends and the overall direction of technology. Follow the blog for interesting articles on technology trends.

Article Source: http://www.articlesbase.com/online-education-articles/getting-a-google-app-engine-account-2780210.html

About the Author
Kaushik Raghupathi is a senior IT professional and project manager working out of India. Over the years he has worked on numerous IT projects with large teams. He is personally very fascinated by learning methodologies in general, and specifically by community-based learning. He is currently experimenting with these concepts by working with students in this area.

http://www.peepaal.org

Aug 31, 2011
 

Cloud Computing: What is Google App Engine?

Author: Kaushik

Google App Engine is an online platform to host your web applications. It is what is popularly called a Cloud Computing platform, hosting and enabling your web applications on the cloud. Essentially, having a cloud platform host your applications abstracts you from worrying about the various platform setup and environment-related issues like server configuration, network configuration and bandwidth requirements, among others.


What languages are supported?

If there is one limitation of the platform, it is language support. The recent addition of Java support has been a huge plus and should significantly improve adoption of the platform.

Currently the App Engine supports Python and Java (JVM).

What this means is that App Engine applications can also be written in Java or any JVM-compatible language (e.g. JRuby, Groovy, Scala, etc.). It supports Java runtime 6.

App Engine's Python runtime supports Python 2.5.
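To give a flavour of what a Python 2.5 App Engine app typically looks like beyond a bare CGI script, here is a minimal sketch using the webapp framework bundled with the SDK (the class name and greeting text are just illustrative):

from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class MainPage(webapp.RequestHandler):
    # Handle GET requests to the root URL.
    def get(self):
        self.response.headers['Content-Type'] = 'text/plain'
        self.response.out.write('Hello from the Python 2.5 runtime!')

# Map the root URL to the handler above.
application = webapp.WSGIApplication([('/', MainPage)], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()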

What frameworks are supported?

Using frameworks to speed up the development cycle has become such a necessity that it is critical to check framework support in any cloud environment you want to work with. Google App Engine supports Struts 2 and Spring MVC for Java, and most Python web frameworks, including Django 1.0.2.
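For the Python runtime, picking a specific bundled Django release is typically done before Django is imported anywhere. A minimal sketch (assuming the SDK bundles the Django version you request) might look like this:

from google.appengine.dist import use_library
# Ask the SDK to make its bundled Django 1.0 release importable
# (this must run before any 'import django' statement).
use_library('django', '1.0')

import django
import logging
logging.info('Using Django %s', django.get_version())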

How much does it cost?

This is the greatest thing about App Engine. It's free if you use less than 500 MB of storage and have fewer than 5 million page views a month. This is the reason we rank it as the #1 platform for building cloud applications, especially for student projects or small POC applications.

You can build around 10 applications for each Google account that you have, and nobody stops you from creating any number of Google logins, so you can always get as many free apps as you want.

For apps where you expect more usage and need more space, you pay per unit of the resources you consume. So it really is a pay-per-usage model, which is still highly cost effective. I do not see a need for this unless you are writing commercial apps; in most cases the free limits are good enough.

How to get started?

All you need is a Gmail or Google account. That's it. If you have one, you are ready to get started with building your first cloud app. If not, you can always get an account; it just takes a couple of minutes.

More Details at our blog TheTechTrendz.com

Article Source: http://www.articlesbase.com/online-education-articles/cloud-computing-what-is-google-app-engine-2762392.html

About the Author

Kaushik Raghupathi is a senior IT professional and project manager working out of India. Over the years he has worked on numerous IT projects with large teams. He is personally very fascinated by learning methodologies in general, and specifically by community-based learning. He is currently experimenting with these concepts by working with students in this area.

http://www.peepaal.org

Aug 31, 2011
 

Save significant cost on development and test environments using Cloud Computing

Author: V.C. Waturuocha

Cloud Computing is reported to be set to revolutionise the business environment, particularly the way we use information technology products and services.

Could it be the next stage in outsourcing? Outsourcing certainly gripped the information technology arena over the last ten years and saw companies essentially outsourcing all their software development and IT support requirements to offshore, and even onshore, companies to allegedly reduce cost.

 


Whether these companies derived the so-called cost benefits and still maintained quality is highly debatable.

Along comes Cloud Computing, which essentially postulates that you can pay for IT services as a utility, on a pay-as-you-use basis, hosted in a cloud. It is essentially another form of outsourcing, where this time you pay a provider either to use their software applications via the Internet, known as Software as a Service (SaaS) (Google Apps is an example); or to use their Platform as a Service (PaaS) to build and host your web-based application (Google App Engine is an example); or to use their Infrastructure as a Service (IaaS), which essentially allows you to pay a provider to host an image of your application, negating the need for you to own your own servers (Amazon EC2 is an example of this kind of service).

Before signing up to any of these services, it is important that a company clearly understands the cost benefits, and identifies and implements an entry and exit strategy capturing how to move to another vendor if it becomes necessary, or how to bring the service back in house.

Hosting sensitive data off site must also be considered very carefully, to ensure that no regulatory requirements are breached and that customer data protection rules are adhered to in any such implementation. Non-mission-critical systems are probably the best candidates to be implemented in the cloud first.

Test and Development Environments

Of particular interest must be IaaS, because companies spend a significant part of their yearly IT budget (sometimes as much as 30 to 40%) on purchasing new hardware to host new applications in response to new business requirements.

The cost of maintaining such hardware, from software licensing to support costs, can also be quite significant.

Test and development environments, which are not necessarily mission critical (since they are not live environments), can certainly be procured on a pay-as-you-use basis with little or minimal risk, which could reduce IT spending on them by a quarter to a half.

As proven with virtualisation, building test and development environments becomes an easily repeatable process, akin to a factory floor where environments are rolled out on demand.
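To make the "rolled out on demand" point concrete, here is a minimal sketch of provisioning and later tearing down a throwaway test server on an IaaS provider, using the boto Python library against Amazon EC2 (the region, AMI ID, instance type and key pair name are placeholders, not recommendations):

import boto.ec2

# Connect to a region; AWS credentials are read from the environment or boto config.
conn = boto.ec2.connect_to_region('us-east-1')

# Launch a small, disposable instance to act as a test environment.
reservation = conn.run_instances(
    'ami-12345678',              # placeholder AMI ID
    instance_type='m1.small',
    key_name='my-test-keypair')  # placeholder key pair
instance = reservation.instances[0]
print 'Launched test instance:', instance.id

# Later, when the test cycle is over, terminate it so it stops costing money.
conn.terminate_instances(instance_ids=[instance.id])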

The same principle applies to having your development and test environments in the cloud using the IaaS model, and the benefits are as follows:

  • Test and development environments tend to be required for fixed periods of time through the lifecycle of a Project and are left idle at other times. You only pay to use these environments when you need to, which can significantly reduce IT spending.
  • The effort and time required to set up test and development environments is significantly reduced, as you only need to access your server via the cloud and create your own environments. There is no need to wait three weeks for the server to be delivered, then another two weeks for it to be set up in the data centre, and so on.
  • The cost of support, from setting up the server in the data centre, configuring it, installing the operating system, installing other security and monitoring software, setting it up on your LAN and so on, is largely eliminated.
  • The effort required to decommission and re-use these test environments is also minimal.
  • Since live data is normally not used for testing or development (desensitised data can be used instead), there are no regulatory requirements to adhere to.
  • Most IaaS cloud computing vendors offer 99–100% uptime, backup facilities, monitoring and so on, which ensures that once your environments are created, you only need to focus on carrying out your development and testing.
  • A company can embark on several IT Projects at any one time, because there is no longer the constraint of available hardware (or rather the need to provision more hardware) to host test and development environments.

Efficiently provisioning, building, managing and scheduling your test/development environments certainly contributes significantly to the success or failure of IT Software Projects.

Paying for your infrastructure as a service at least eliminates the effort and cost of provisioning, installing and supporting the hardware to host your test and development environments.

Author: V.C.Waturuocha MSC – Freelance IT Environments/Infrastructure Project Manager.

Article Source: http://www.articlesbase.com/information-technology-articles/save-significant-cost-on-development-and-test-environments-using-cloud-computing-3458580.html

About the Author
V.C.Waturuocha

  • Freelance IT Environments Manager & Technical Project Manager
  • Previously Global FX technology manager at Citibank International
  • Has worked in several sectors including energy, utilities, investment banking, railway transport, software development, healthcare, manufacturing, research, insurance and gaming.
  • Currently focussed on implementing Cloud Computing solutions for a UK based retail bank to reduce their spending on test and development environments.
Aug 31, 2011
 

Private Cloud Computing: A Game Changer for Disaster Recovery

By Mike T Klein

Private cloud computing offers a number of significant advantages – including lower costs, faster server deployments, and higher levels of resiliency. What is often overlooked is how the private cloud can dramatically change the game for IT disaster recovery, in terms of significantly lower costs, faster recovery times, and enhanced testability.

 


 

Before we talk about the private cloud, let’s explore the challenges of IT disaster recovery for traditional server systems.

Most legacy IT systems are comprised of a heterogeneous set of hardware platforms – added to the system over time – with different processors, memory, drives, BIOS, and I/O systems. In a production environment, these heterogeneous systems work as designed, and the applications are loaded onto the servers and maintained and patched over time.

Offsite backups of these heterogeneous systems can be performed and safely stored at an offsite location. There are really 2 options for backing up and restoring the systems:

1) Back up the data only – where the files are backed up from the local server hard drives to the offsite location either through tapes, online or between data centers over a dedicated fiber connection. The goal is to assure that all of the data is captured and recoverable. To recover the server in the case of a disaster, the operating system needs to be reloaded and patched to the same level as the production server, the applications need to be reloaded, re-patched, and configured, and then the backed up data can be restored to the server. Reloading the operating system and applications can be a time consuming process, and assuring that the system and applications are patched to the same levels as the production server can be subject to human memory and error – both of which can lengthen the recovery time. (This is why I hate upgrading my laptop hardware. I have to invest days to get a new laptop to match the configuration of my old laptop).

2) Bare Metal Restore – a much faster way to recover the entire system. BMR creates an entire snapshot of the operating system, applications, system registry and data files, and restores the entire system on similar hardware exactly as it was configured in the production system. The gotcha is the “similar hardware” requirement. This often requires the same CPU version, BIOS, and I/O configuration to assure the recovery will be operational. In a heterogeneous server environment, duplicate servers need to be on-hand to execute a bare metal restoration for disaster recovery. As a result, IT disaster recovery for heterogeneous server systems either sacrifices recovery time or requires that the hardware investment be fully duplicated for a bare metal restoration to be successful.

Enter disaster recovery for private cloud computing. First, with all of the discussion about “cloud computing”, let me define what I mean by private cloud computing. Private Cloud computing is a virtualized server environment that is:

Designed for rapid server deployment – as with both public and private clouds, one of the key advantages of cloud computing is that servers can be turned up & spun down at the drop of a hat.

Dedicated – the hardware, data storage and network are dedicated to a single client or company and not shared between different users.

Secure – Because the network is dedicated to a single client, it is connected only to that client’s dedicated servers and storage.

Compliant – with the dedicated secure environment, PCI, HIPAA, and SOX compliance is easily achieved.

As opposed to public cloud computing paradigms, which are generally deployed as web servers or development systems, private cloud computing systems are preferred by mid and large size enterprises because they meet the security and compliance requirements of these larger organizations and their customers.

When production applications are loaded and running on a private cloud, they enjoy a few key attributes which dramatically redefine the approach to disaster recovery:

1) The servers are virtualized, thereby abstracting the operating system and applications from the hardware.

2) Typically (but not required) the cloud runs on a common set of hardware hosts – and the private cloud footprint can be expanded by simply adding an additional host.

3) Many larger private cloud implementations are running with a dedicated SAN and dedicated cloud controller. The virtualization in the private cloud provides the benefits of bare metal restoration without being tied to particular hardware. The virtual server can be backed up as a “snapshot” including the operating system, applications, system registry and data – and restored on another hardware host very quickly.

This opens up 4 options for disaster recovery, depending on the recovery time objective.

1) Offsite Backup – The simplest and fastest way to assure that the data is safe and offsite is to back up the servers to a second data center that is geographically distant from the production site. If a disaster occurs, new hardware will need to be located to run the system on, which can extend the recovery time depending on the hardware availability at the time of disaster.

2) Dedicated Warm Site Disaster Recovery – This involves placing hardware servers at the offsite data center. If a disaster occurs, the backed up virtual servers can be quickly restored to the host platforms. One advantage to note here is that the hardware does not need to match the production hardware. The disaster recovery site can use a scaled down set of hardware to host a select number of virtual servers or run at a slower throughput than the production environment.

3) Shared Warm Site Disaster Recovery – In this case, the private cloud provider delivers the disaster recovery hardware at a separate data center and “shares” the hardware among a number of clients on a “first declared, first served” basis. Because most disaster recovery hardware sits idle and clients typically don’t experience a production disaster at the same time, the warm site servers can be offered at a fraction of the cost of a dedicated solution by sharing the platforms across customers.

4) Hot Site SAN-SAN Replication – Although more expensive than warm site disaster recovery, SAN-SAN replication between clouds at the production and disaster recovery sites provides the fastest recovery and lowest data latency between systems. Depending on the recovery objectives, the secondary SAN can be more cost effective in terms of the amount and type of storage, and the number and size of physical hardware servers can also be scaled back to accommodate a lower performance solution in case of a disaster.

Conclusion: An often overlooked benefit of private cloud computing is how it changes the IT disaster recovery game. Once applications are in production in a private cloud, disaster recovery across data centers can be done at a fraction of the cost compared to traditional heterogeneous systems, and deliver much faster recovery times.

Mike Klein is the President and Chief Operating Officer at Online Tech, a leading managed data center operator in the Midwest. Online Tech offers a full range of colocation, managed server and private cloud hosting in their SAS-70 secure and reliable multi-tenant data centers across the Midwest. Visit http://www.onlinetech.com for more information.

Article Source: http://EzineArticles.com/?expert=Mike_T_Klein
http://EzineArticles.com/?Private-Cloud-Computing:-A-Game-Changer-for-Disaster-Recovery&id=5105962

Aug 30, 2011
 

What is a Virtual Private Cloud?

Author: OnlineTech

For many enterprises, the private cloud is the first choice when it comes to cloud computing. Private cloud computing offers a hardware and network environment that is dedicated solely to the enterprise.  Whether delivered through a hosted private cloud or internal private cloud, there is an assurance that the hardware, data storage and network security are dedicated to a single organization and not shared with other users.  When it comes to highly sensitive corporate data, personal customer lists, PCI or HIPAA certified environments, the private cloud is preferred.

 


 

At the same time, businesses are recognizing the benefit of the public cloud for some of their applications.  Many small and mid-size businesses need the quality of an enterprise private cloud in terms of high availability, redundancy and security with the affordability of a shared cloud.  Enterprises need the ability to move non-critical, non-sensitive computing and development work out of their private cloud into a shared cloud but need to achieve a similar level of availability and security.

There are a number of public cloud options available. Some, like Amazon’s EC2, are basic cloud servers designed to provide computing services for non-critical applications like web or development servers. Others are designed to deliver the high availability and security that enterprises require in their private clouds. The latter we call a Virtual Private Cloud.
Wikipedia defines a Virtual Private Cloud as ‘a private cloud existing in a shared or public cloud’.

To expand on that, we consider a Virtual Private Cloud to be one or more cloud servers in a shared environment that deliver the same level of data protection and network security as a dedicated private cloud. The virtual private cloud can sit on the same VLAN as a dedicated private cloud and managed servers, providing a completely integrated system.

In addition, by connecting a virtual private cloud into a hosted private cloud on a dedicated VLAN behind a dedicated firewall, the virtual private cloud provides a tremendous level of flexibility and expandability beyond the private cloud’s dedicated hardware, without compromising the data integrity or network security of the private cloud.

Article Source: http://www.articlesbase.com/information-technology-articles/what-is-a-virtual-private-cloud-4252996.html

About the Author

Online Tech (www.OnlineTech.com) owns and manages premier Michigan data centers. Offering managed dedicated servers, private cloud hosting, IT disaster recovery, and Michigan colocation, Online Tech helps their clients dramatically reduce IT data center costs, operational risks and downtime. Industry leaders trust Online Tech’s HIPAA Compliant, SAS 70 certified data centers to ensure their servers are always on, always online, and always safe.

Aug 30, 2011
 

5 Reasons Why Your Company Should Transition to Private Cloud Computing

Author: OnlineTech

One of the biggest IT buzz words of the past few years, ‘cloud computing’ has brought incredible promise to the world of information technology. Interest in cloud computing has led to widespread curiosity and awareness of public cloud services. Pooled computing resources, divided and allocated to different users, have been proven in the development and testing realm of IT to have tremendous benefits. Those benefits, however, have been overshadowed by concerns among enterprise IT executives about a lack of security and uptime assurances.

 


 

The private cloud aims to lessen, if not eliminate, those concerns by dedicating exclusive hardware to each user. Instead of your data being stored off ‘in the cloud’ somewhere, one can point to a group of servers in a data center and say, ‘that is your data, and only your data, on those servers,’ alleviating security concerns. It is a ‘best of both worlds’ solution that can bring important benefits like improved uptime and reliability at a lower cost than on traditional IT infrastructure.

In fact, according to Yankee Group’s recent survey on cloud computing, private cloud computing is preferred 2:1 over fully managed public cloud solutions. 67% of respondents preferred the private cloud, whereas only 28% preferred a fully managed public cloud, 21% preferred an unmanaged public cloud, and 8% were looking to a hybrid cloud solution.

That is why we compiled the top 5 reasons why your company should rethink your IT infrastructure and look into the benefits of private cloud computing.

5. Dedicated hardware means increased security. Much like a dedicated or colocated server, the security of your private cloud depends on a variety of factors. However, if you have the proper physical security, anti-virus software, and firewall rules in place, you can rest assured your data is as safe as if it were sitting right next to your desk. With a private cloud, you know where your servers are located and that the proper physical and network security is in place. You can meet and talk to those in charge of providing support for your hardware and come visit it if you like.

4. The transition from physical to virtual servers leads to better flexibility. This is one of the most alluring benefits of cloud computing. The ability to spin up and tear down a server in a matter of minutes is incredibly powerful and useful. No longer is there any wasted effort in trying to size a server beforehand when you can create a server on the fly. Need more disk space? More RAM? More CPU? No problem. With private cloud computing, you can reallocate resources in moments without worrying about finding a physical server that will have the resources your new server needs.

3. Fully utilize your hardware with better resource management. Virtualization significantly increases the value of your physical server hardware. Instead of having 5 servers that average 10% CPU utilization, you can virtualize the 5 servers on one physical server, sharing the resources. This decreases rack space, power usage, and is easier to manage. This also allows you to create copies of your servers and have them up and running very quickly, now that they have been virtualized. If you have the proper resource management tools installed on your server, you can automatically allocate the appropriate resources to a server when it needs it or turn off unused servers during low usage; an extraordinarily powerful and efficient way to manage your servers.

2. Virtual servers combined with a SAN allow for improved protection against disasters. When you connect a SAN to your private cloud, incredible redundancy can be achieved. Not only can you load balance between servers, automatically shifting server resources between servers on the fly, but in an N+1 environment (having at least one more server than absolutely necessary), you can shut down one server without causing downtime. Imagine performing maintenance on your server like adding more RAM, replacing a hard drive, or upgrading software, without experiencing any downtime. When configured correctly you could power off one server and it would automatically shift the virtual servers over to an available server in your cloud. Taking your disaster protection up one level, you could have another SAN in another data center and perform SAN to SAN replication for a hot site DR environment capable of full recovery in less than an hour.

1. Switching to private cloud computing will save you time and money. The best part about a private cloud is that not only do you get all of the great benefits of virtualization and security, but it can be cheaper and less of a hassle than hosting your own servers or buying dedicated servers. If your company has more than 2 servers, it could benefit from virtualization. If your company has more than 10 servers, it could benefit from private cloud computing with a dedicated SAN and multiple physical host servers. The public cloud revolutionized Information Technology forever; the private cloud brings the benefits to the masses.

Conclusion: Save Money, Save Time, Sleep Easy: Transition to Private Cloud Computing.
Private cloud hosting is still a relatively new concept, but it is based on technology that has been around for a while and has proven itself for years. Besides providing some tremendous benefits of pooled computing resources and virtualization, it maintains the security and reliability of a normal dedicated server. The private cloud is not for everyone, but a managed data center operator can make the transition to private cloud computing an affordable process and enable you to fully experience all of the great benefits of the private cloud.

Article Source: http://www.articlesbase.com/technology-articles/5-reasons-why-your-company-should-transition-to-private-cloud-computing-3811960.html

About the Author

Online Tech offers a full spectrum of hosting solutions including basic colocation, managed colocation, managed dedicated servers and private cloud hosting in our Michigan data centers. We can deliver our data center solutions more cost effectively, with lower overhead, less risk and better support than most IT departments can do themselves.

Aug 30, 2011
 

Building the Private Cloud

Author: OnlineTech
A private cloud combines the benefits of cloud computing (flexibility and cost effectiveness) with the security, data integrity, and service level agreements (SLAs) of a SAS 70-certified, managed dedicated server environment.


 Private cloud hosting delivers dedicated servers, storage, and networks, so each client is assured an exclusive framework that offers data integrity and network security. This eliminates the concerns inherent to a public cloud environment, such as other clients’ applications running on the same server.

Your company can share applications, storage, and network resources in a secure, dedicated environment that is physically separated from your provider’s other client environments. The hardware, SAN, virtualization, and operating systems are configured and managed by the provider to your unique specifications.

Some providers’ private cloud offerings utilize VMware as the engine for their virtualization platform. VMware is the industry leader in virtualization, is considered enterprise-grade, and is fourth-generation virtualization technology. VMware and a private cloud solution together provide the following benefits:

Reduce costs while increasing operational efficiencies – Quickly provision cloud resources for business critical applications as needed, and save your company the cost and inefficiency of additional underutilized hardware purchases, data center builds, and staffing.

Easily add and tune computing resources – Virtualization services allow easy scaling for each private cloud environment. A provider can dynamically add and tune computing resources for their private cloud clients as the demands on their applications grow and change. It eliminates the need for permanent investment when a temporary demand spike can be accommodated by accessing the resource pool.

Use experience to build your customized environments – A provider’s experience in multi-server environments, shared data storage, network security, and virtualization mean the operations team is ready to deliver private cloud solutions to meet your specific needs.

Article Source: http://www.articlesbase.com/technology-articles/building-the-private-cloud-3780595.html

About the Author

Online Tech offers a full spectrum of hosting solutions including basic colocation, managed colocation, managed dedicated servers and private cloud hosting in our Michigan colocation data centers. We can deliver our data center solutions more cost effectively, with lower overhead, less risk and better support than most IT departments can do themselves.

Aug 30, 2011
 

Comparing Amazon EC2 and Microsoft Azure

Author: Kaushik

In one of our earlier posts we did a comparison between Google App Engine and EC2 based on some of the key parameters driving a Cloud Computing platform. In this post we will focus on a comparison between Amazon EC2 and Microsoft Azure.


Languages/Platforms Supported

  • Amazon EC2 provides support for setting up virtualized instances of operating systems, applications and databases, which essentially implies that it can support pretty much all the standard environments: Windows (Server 2000/2008), Linux flavors, OpenSolaris, multiple databases (DB2, Oracle, SQL Server) and various web and application servers.
  • Microsoft Azure, on the other hand, is purely a Windows environment, with Windows Server 2008 as the primary environment; all languages and applications available on Windows Server are supported. While Microsoft has released SDKs to support interop for other languages like Java, PHP and Ruby on Azure, I would still consider it primarily a first-choice .NET environment.
  • Amazon is clearly a much more flexible environment and can support a wide variety of applications. On the other hand, if you have a Microsoft-based platform or application, Azure will be a strong consideration.

Deployment/Setup Complexity

  • Setup for Amazon EC2 involves setting up an image and configuring a virtual instance that includes the operating system and application. The setup and deployment activities are similar in nature to setting up a virtualized environment. Amazon provides command line tools which help support this.
  • Windows Azure offers two variations in terms of deployment: a hosted model similar to Amazon EC2, and the alternate Azure Platform Appliance, which allows a local deployment of Windows Azure within a company's premises on local servers to set up a local virtual cloud. Azure provides an integrated deployment approach directly from Visual Studio, which allows you to deploy the solution on Azure.
  • Integrated deployment options with developer tools significantly reduce the deployment complexity for Azure-based apps. In addition, much of the VM configuration that needs to be set up on Amazon is not required on Azure.

Performance and Scalability

  • Amazon EC2 is a mature platform and has been proven to provide significant scalability options for cloud applications. Apart from auto-scaling options, custom configuration allows you to scale to pretty much any level. Also, since EC2 deploys applications as virtual instances, it gives you the opportunity to completely control and scale your instance and application.
  • Azure, while not a traditional VM, provides a VM-like environment for hosting applications. It provides auto-scaling options and also the ability to choose various configurations for the VMs based on resource needs. SQL Azure functions as a cloud database in a shared environment, whereas Amazon RDS can be controlled per instance.
  • EC2 clearly provides a more configurable environment to scale up and scale out your applications, with much more granular control over your applications and environment.

Stay tuned for an upcoming blog post with a 3-way comparison/cheat sheet on how to choose between the 3 major cloud platform players: Amazon EC2, App Engine and Azure.

Check our Blog Technology Trendz for more details.

 

Article Source: http://www.articlesbase.com/information-technology-articles/comparing-amazon-ec2-and-microsoft-azure-2912388.html

About the Author

Kaushik Raghupathi is a senior IT professional and project manager working out of India. Over the years he has worked on numerous IT projects with large teams. He is personally very fascinated by learning methodologies in general, and specifically by community-based learning. He is currently experimenting with these concepts by working with students in this area.

Check our Blog Technology Trendz for more details.

Aug 30, 2011
 

5 Key Events in the history of Cloud Computing

Author: Kaushik

While we have been evaluating in our blog posts the various features available on popular cloud computing platforms today, I thought it might be a good idea to look back at where this began and trace some of the key events in the progress of cloud computing. Amazon, like other Internet companies in the period of the dot-com bubble, was left with large amounts of underutilized computing infrastructure; reports suggest less than 10% of the server infrastructure of many companies was being used. Amazon may have seen cloud computing as a way to offer these unused resources as a utility computing service when it launched S3 as the first true cloud computing service in March 2006.


1. Launch of Amazon Web Services in July 2002
The initial version of AWS in 2002 was focused more on making information available from Amazon to partners through a web services model with programmatic and developer support and was very focused on Amazon as a retailer. While this set the stage for the next steps the launch of S3 was the true step towards building a cloud platform.

2. S3 Launches in March 2006
Here are some interesting articles on the launch of S3 in 2006. The real breakthrough, however, was the pricing model for S3, which defined the ‘pay-per-use’ model that has now become the de facto standard for cloud pricing. The launch of S3 also marked Amazon’s shift from being just a retailer to a strong player in the technology space.

3. EC2 Launches in August 2006
EC2 had a much quieter launch in August 2006, but I would argue it had the bigger impact by making core computing infrastructure available. This completed the loop, making a more complete cloud infrastructure available. In fact, at that time analysts had some difficulty understanding what the big deal was, and thought it looked similar to other hosting services available online, only with a different pricing model.

4. Launch of Google App Engine in April 2008
The launch of Google App Engine in 2008 was the entry of the first pure-play technology company into the cloud computing market. Google, a dominant Internet company, entering this market was clearly a major step towards widespread adoption of cloud computing. As with all their other products, they introduced radical pricing models, with a free entry-level plan and extremely low-cost computing and storage services that are currently among the lowest in the market.

5. Windows Azure launches Beta in Nov 2009
The entry of Microsoft into cloud computing is a clear indication of the growth of the space. For a long time Microsoft did not accept the Internet and the web as a significant market, and continued to focus on the desktop market. I think this is a realization that a clear shift is taking place. The launch of Azure is a key event in the history of cloud computing, with the largest software company making a small but significant shift to the web.

Check our Blog Technology Trendz for more details.

Article Source: http://www.articlesbase.com/information-technology-articles/5-key-events-in-the-history-of-cloud-computing-2912369.html

About the Author

Kaushik Raghupathi is a senior IT professional and project manager working out of India. Over the years he has worked on numerous IT projects with large teams. He is personally very fascinated by learning methodologies in general, and specifically by community-based learning. He is currently experimenting with these concepts by working with students in this area.

Technology Trendz Blog