Friday, March 30, 2012

Liferay installation on the cloud

Installing Liferay 6.0.5 Community Edition Bundled With Tomcat

1. Install Java JDK
a) First we will create a directory for Java:
mkdir /usr/java
b) Now we must download the JDK.
For this tutorial I will be using the 64-bit version; if you are using the 32-bit version of Ubuntu you will need to download the i586 version.
Go to the Oracle Java download page and download 'jdk-6u21-linux-x64.bin' to the java directory you created in step (a).
Alternatively, on yum-based systems you can install a packaged JDK (OpenJDK rather than the Oracle JDK) directly:
yum install java-1.6.0-openjdk
c) Now we will make the bin file executable and run it:
cd /usr/java
chmod +x jdk-6u21-linux-x64.bin
./jdk-6u21-linux-x64.bin
d) We must add the following lines to /etc/profile for both JDK 6 and Liferay Portal.
Open /etc/profile:
vi /etc/profile
Insert the following lines on a new line at the bottom of the file:
export JAVA_HOME=/usr/java/jdk1.6.0_21
export LIFERAY_HOME=/usr/liferay/liferay-portal-6.0.5/tomcat-6.0.26
e) Ensure that the JAVA_HOME and LIFERAY_HOME environment variables are correctly set. To do this, open a new terminal and check their values.
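A quick way to check (the expected values in the comments assume the exact paths exported above):

```shell
# Open a new terminal (so /etc/profile is re-read) and print both variables.
# Empty values mean the exports did not take effect.
echo "JAVA_HOME=${JAVA_HOME:-<not set>}"       # expect /usr/java/jdk1.6.0_21
echo "LIFERAY_HOME=${LIFERAY_HOME:-<not set>}" # expect /usr/liferay/liferay-portal-6.0.5/tomcat-6.0.26
```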

2. Install MySQL
If MySQL is not already installed, install and start it (the commands below are for a yum-based system; the service name may be mysql or mysqld depending on the distribution):
yum install mysql-server mysql
service mysqld start

3. Install Liferay
a) Create a directory for Liferay:
mkdir /usr/liferay
Download and extract the Liferay-with-Tomcat community edition bundle to /usr/liferay.
As I am primarily a Windows user, I download and extract Liferay within Windows and copy the extracted folder to /usr/liferay using WinSCP.
b) If you plan to use this installation in a production environment, remove the default bundled sample data from $LIFERAY_HOME/webapps by deleting the sample-data folders there.
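A sketch of the cleanup, assuming the sample data is the sevencogs plugin pair shipped with 6.0.x bundles (verify the exact folder names in your own webapps directory before deleting):

```shell
# LIFERAY_HOME as exported earlier in this tutorial
LIFERAY_HOME=/usr/liferay/liferay-portal-6.0.5/tomcat-6.0.26
# sevencogs-hook / sevencogs-theme are assumed names -- check webapps/ first
for app in sevencogs-hook sevencogs-theme; do
  rm -rf "$LIFERAY_HOME/webapps/$app"
  echo "removed $app"
done
```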
4. Create the portal-ext.properties File
cd $LIFERAY_HOME/webapps/ROOT/WEB-INF/classes
Create a file named portal-ext.properties containing your database connection settings, changing the username and password as desired.
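The properties themselves are not shown above; a typical MySQL configuration looks like the following (the database name lportal, the username and the password are placeholders; create the database in MySQL first and substitute your own credentials):

```
# portal-ext.properties (MySQL); database name, user and password are placeholders
jdbc.default.driverClassName=com.mysql.jdbc.Driver
jdbc.default.url=jdbc:mysql://localhost/lportal?useUnicode=true&characterEncoding=UTF-8
jdbc.default.username=liferay
jdbc.default.password=liferay
```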

Some of the shell scripts in the Tomcat bin directory need to be executable:
cd $LIFERAY_HOME/bin
chmod +x *.sh
5. Run Liferay
Start Liferay with Tomcat's startup script:
./startup.sh
Initial startup may take some time (10 to 15 minutes depending on hardware) as the database is created.
To access Liferay, navigate to http://<Liferay Server IP ADDRESS>:8080 (8080 is the default port).

Alfresco installation on the cloud

To install Alfresco, run the installer binary:
[root@ip-10-12-59-120:/opt] ./alfresco-community-4.0.d-installer-linux-x64.bin
Language Selection

Please select the installation language
[1] English - English
[2] French - Français
[3] Spanish - Español
[4] Italian - Italiano
[5] German - Deutsch
[6] Japanese - 日本語
[7] Dutch - Nederlands
Please choose an option [1] : 1 (Choose a language for the installation; once you select a language, the setup wizard starts)
Welcome to the Alfresco Community Setup Wizard.

Installation Type

[1] Easy - Installs servers with the default configuration
[2] Advanced - Configures server ports and service properties; also lets you choose optional components to install.
Please choose an option [1] : 2 (Here it asks for the installation type; we select Advanced)

Select the components you want to install; clear the components you do not want
to install. Click Next when you are ready to continue. (Here it asks which components to install)
Java [Y/n] :Y  (Alfresco installs its own JDK)

PostgreSQL [Y/n] :n (PostgreSQL is installed by default; since we are using MySQL, we select no)

Alfresco : Y (Cannot be edited; selected by default)

SharePoint [Y/n] :y (Select yes for SharePoint protocol support)

Web Quick Start [y/N] : y (Installs Web Quick Start)

OpenOffice [Y/n] :y (Installs OpenOffice)

Is the selection above correct? [Y/n]: Y (Asks for confirmation that the selection above is correct)

Installation folder (Here it asks for the installation folder for Alfresco; press Enter to accept the default)

Please choose a folder to install Alfresco Community

Select a folder [/opt/alfresco-4.0.d]:

Database Configuration

JDBC URL: [jdbc:mysql://localhost/alfresco]: jdbc:mysql://localhost/alfresco (Set the JDBC URL to point to MySQL)
JDBC Driver: []:  (Set the JDBC driver class; for MySQL this is typically com.mysql.jdbc.Driver)
Database name: [alfresco]: alfresco (Set the database name to alfresco)
Username: [root]: root (Username for MySQL access)
Password: : (Password for that MySQL user)
Verify: :
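Note that when MySQL is used instead of the bundled PostgreSQL, the MySQL JDBC driver (Connector/J) has to be placed on Tomcat's classpath yourself; the installer does not ship it. A sketch, with an example jar version and the install folder chosen above:

```shell
# Copy the MySQL Connector/J jar into Alfresco's Tomcat lib directory.
# The jar version below is an example; use whichever Connector/J you downloaded.
JAR=mysql-connector-java-5.1.18-bin.jar
DEST=/opt/alfresco-4.0.d/tomcat/lib
if [ -f "$JAR" ]; then
  cp "$JAR" "$DEST/"
else
  echo "download $JAR and place it in $DEST"
fi
```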
Tomcat Port Configuration (Configure Tomcat)
Please enter the Tomcat configuration parameters you wish to use.
Web Server domain: []: (Set the web server domain)
Tomcat Server Port: [8080]: (Accept the default)
Tomcat Shutdown Port: [8005]: (Accept the default)
Tomcat SSL Port [8443]: (Accept the default)
Tomcat AJP Port: [8009] (Accept the default)
Alfresco FTP Port (Set the FTP port)
Please choose a port number to use for the integrated Alfresco FTP server.
Port: [21]:
Alfresco RMI Port
Please choose a port number for Alfresco to use to execute remote commands.
Port: [50500]:

Admin Password (Set the admin password for Alfresco access)
Please give a password to use for the Alfresco administrator account.
Admin Password: :
Repeat Password: :
Alfresco SharePoint Port
Please choose a port number for the SharePoint protocol.
Port: [7070]:
Install as a service
You can optionally register Alfresco Community as a service. This way it will
automatically be started every time the machine is started.
Install Alfresco Community as a service? [Y/n]: y  (Registers Alfresco Community as a service)
Service script name
The alfresco service script file already exists, so the installer asks for a different name for the
service script.
Service script name: [alfresco]: alfresco (Provide a name for the service script)

OpenOffice Server Port (Accept the default OpenOffice port)
Please enter the port that the Openoffice Server will listen to by default.
OpenOffice Server port [8100]:
Setup is now ready to begin installing Alfresco Community on your computer.
Do you want to continue? [Y/n]: y (Confirms that installation should continue)
Please wait while Setup installs Alfresco Community on your computer.
 0% ______________ 50% ______________ 100%
Setup has finished installing Alfresco Community on your computer.
View Readme File [Y/n]: y
Launch Alfresco Community Share [Y/n]: y (Asks whether to launch Alfresco in the web browser)
Using CATALINA_BASE:   /opt/alfresco/tomcat
Using CATALINA_HOME:   /opt/alfresco/tomcat
Using CATALINA_TMPDIR: /opt/alfresco/tomcat/temp
Using JRE_HOME:        /opt/alfresco/java
Using CLASSPATH:       /opt/alfresco/tomcat/bin/bootstrap.jar
/opt/alfresco/tomcat/scripts/ : tomcat started
Alfresco Community 4.0
For Enterprise subscribers, refer to for release
notes and detailed information on this release.
For Community members, refer to the Alfresco wiki for more information on this release.
Press [Enter] to continue :
[root@ip-10-12-59-120:/opt] (Once the installation finishes we are returned to the shell prompt. To check whether Alfresco is installed and running, browse to the server's IP address on the port assigned to the Tomcat server.)
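From the command line, a quick liveness check once Tomcat reports it has started (the host, port and /share path below match this walkthrough's defaults; adjust them if you changed the ports during setup):

```shell
# Returns the HTTP status of Alfresco Share; 200 (or a redirect) means it is up.
code=$(curl -s -o /dev/null -w '%{http_code}' http://localhost:8080/share 2>/dev/null)
[ -n "$code" ] || code=000
echo "HTTP status: $code"
```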

Difference Between Cloud Computing and Utility Computing?

In many ways, cloud computing and utility computing are very similar. Both concepts revolve around the leasing of computing technology. In the past, companies were required to invest heavily in technology upfront, making it difficult for small and new companies to have the equipment needed to attain their business goals. Through services like utility computing and cloud computing, that upfront cost is largely offset, since companies lease what they need from month to month. As the need grows, so does the amount leased, therefore making it possible to customize computing costs at all points in time.

There are many parallels between these two computing styles. However, these two terms are not interchangeable, because there are key differences between them. Without understanding the depth of these differences, it is impossible to be fully aware of how either style can be most productive and beneficial in the computing realm, especially for specific computing tasks.

One of the fundamental differences between cloud computing and utility computing relates to the nature of the leasing. While both styles utilize a third party for their software and infrastructure, utility computing involves much more direct access to these services. It is a straightforward rental, where the business is fully aware of the source of the services they are leasing. In essence, this style of computing makes the technology involved like another utility, and at the end of each month, businesses are billed for their usage, just like water or electricity.

In this way, utility computing is relatively straightforward. Cloud computing, in contrast, is much less direct. While all the services are still being rented, the company knows far less about the source of the services. Users still pay for what they use, but the company providing the services utilizes a much more complex system of infrastructure and software, usually involving grid networks that support multiple tasks at once. In this way, cloud computing is actually more powerful, since it does not rely on any one source. By spreading out the task load, cloud computing can be a fast and effective means of computing, often with simplified troubleshooting and less maintenance overall.

This difference between cloud computing and utility computing is substantial, since it reflects a difference in the way computing is approached. Utility computing relies on standard computing practices, often utilizing traditional programming styles in a well-established business context. Cloud computing, on the other hand, involves creating an entirely distinctive virtual computing environment that empowers programmers and developers in new ways.

Even normal business computing tasks can look drastically different through these two computing styles. One such example is in customer relationship management (CRM). This routine task involves the storage and use of client information, including contact details, contract specifics, and other related content. Through utility computing, businesses can easily maintain a traditional approach to CRM, and even companies that lack resources to invest heavily in infrastructure and software can still have a booming CRM program. This can be especially powerful for up-and-coming businesses, which may lack the capital needed to develop their own infrastructure but still need a way to maintain their thriving clientele base.

Through cloud computing, CRM can look radically different. While there is still less upfront cost through cloud computing, the approach to maintaining CRM changes drastically. The way the information is filed and accessed is enhanced through cloud computing, making the process faster and more accessible overall. All of this can be accomplished without any specific understanding of the technology that supports this interface, which allows for all attention to be diverted to the CRM processes themselves.

Ultimately, while utility computing and cloud computing both rely on a third party for much of their computing infrastructure, they reflect very different approaches to computing overall. Both can be powerful methods, but every business should assess the details of these computing styles before deciding which method best suits its needs.

Why Cloud Computing Is Better Than Grid Computing

Several web developers, especially new ones, have continually mistaken grid computing and cloud computing for one and the same. Both concepts, when compared to other solutions, are relatively new in computing. Grid computing is one of the components cloud computing needs in order to work, along with thin clients and utility computing. It serves as a link among different computers so that they form a large infrastructure, thereby permitting sharing of resources. Utility computing, on the other hand, allows a user to pay for what he actually used. Cloud computing allows for on-demand resource provisioning and, when paired with utility computing, eliminates over-provisioning so that the demands of a multitude of users are met.

Cloud computing allows companies to scale instantaneously. These corporations do not need to buy infrastructure, software licenses, or train personnel. Cloud computing is of primary importance to small and medium sized enterprises because they can outsource the computing requirements to data centers. In some instances, large companies can also benefit from cloud computing when they desire peak load capacity without spending on enlarging their internal data centers. Cloud computing allows its users to access their data and applications through the internet. The users are also charged for what they actually use.

With cloud computing, users have the option to choose whatever device they want to access their data and applications. Aside from personal computers, they can now use PDAs, smart phones, and other devices. They don't own any platform, software, or infrastructure. They also pay lower upfront costs, operating expenses, and even capital expenses. With cloud computing, users need not know about network and server maintenance. They can access different servers around the world without necessarily knowing where those servers are located.

Grid computing, on the other hand, is a backbone of cloud computing. It allows on-demand resource provisioning. Grid computing can also be implemented outside of the cloud environment. System integrators and administrators are particularly interested in grid computing, while users need not know about it as long as they are assured their interfaces work when they need them. Grid computing is made possible through software which can control all the computers connected to a grid.

With grid computing, tasks are divided into smaller ones and sent to different servers connected to a main machine. Once a particular server is done with its computing task, its result is sent back to the main machine. As soon as the results of all the tasks are received, the main machine provides the combined result to the user. Through grid computing, there is effective use of all the processing power of the connected servers. Processing time is also greatly reduced when the tasks are divided and assigned to various servers.
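The divide-and-collect cycle described above can be sketched with ordinary shell job control, with background processes standing in for the grid's worker servers:

```shell
# Toy scatter-gather: split a job into chunks, run each "worker" in parallel,
# then have the main process collect and combine the partial results.
results=$(mktemp -d)
for chunk in 1 2 3 4; do
  ( echo $((chunk * chunk)) > "$results/$chunk" ) &  # worker computes its piece
done
wait            # the main machine waits until every worker has reported back
total=0
for f in "$results"/*; do
  total=$((total + $(cat "$f")))
done
echo "combined result: $total"    # 1 + 4 + 9 + 16 = 30
rm -rf "$results"
```

In a real grid the workers are remote servers and the middleware handles distribution and fault tolerance, but the scatter, wait, and gather phases are the same.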

Both grid computing and cloud computing are scalable due to application load balancing. Network bandwidth and CPU can be apportioned and un-apportioned. Storage capacity can grow or shrink depending on how large the customer base is and on the amount of data transfer and running instances. Both concepts can handle multitasking and multi-tenancy. They both guarantee uptime of as much as 99%.

Cloud computing and grid computing differ in the method they use for computing. In grid computing, a huge task is divided into smaller tasks and distributed to various servers. When the tasks are completed, the results are sent back to the main machine, which then provides a single output. Cloud computing offers various services to users that grid computing can't, such as web hosting.

Cloud computing is also environmentally friendly, as it reduces the need for additional hardware components to run applications. Because fewer servers are needed, energy use is also reduced. Cloud computing also offers telecommuting opportunities, which reduce the need for office space, disposal of old furniture, provisioning of new furniture, and cleaning and disposal of office trash.

Difference Between Cloud & Virtualization

Cloud computing and virtualization are two completely different technologies. Both are considered cost-saving technologies of the future, and that is the only similarity between them. In cloud computing, easily scalable and mostly virtualized data resources are provided to users over the internet. Using cloud computing, colocation or data-center providers offer managed IT services via a hosted, software-as-a-service model. In cloud computing, a database can be physically located in a secure remote location while the data is accessed from a client's computer, using the database server to analyze and retrieve it. This eliminates the need for costly in-house equipment and a department to run it. Cloud computing makes use of virtualized resources such as servers, computing devices and networks. The cloud computing provider owns the hardware and manages all the services for clients on a usage basis. Virtualization, on the other hand, is the creation of virtual versions of technologies such as servers, operating systems, network resources and storage devices. Virtualization basically enables a single user to access multiple physical devices: either one operating system using multiple computers to evaluate a database, or a single computer controlling several machines. Here we will learn how cloud computing and virtualization differ from each other.
Cloud Computing
The easy availability of computers has opened new doors in the field of information technology. Instead of creating its own data center, a company can acquire all data-center services from IT service providers with outstanding IT infrastructure that is strong, flexible and safe. The cost that would otherwise have been invested in building a data center can be avoided by acquiring the related services on a usage basis. Fees for the managed IT services are comparable to the operating costs of a data center. Cloud computing seems an attractive option, since it avoids a large upfront investment and also trims operating costs. In a traditional IT department, equipment and staff are occupied during peak hours but sit idle during off hours. Most servers are not operating in off hours, and even when they are operational they rarely deliver 100% of their potential. A data-center operator also needs to keep upgrading to the latest technologies in order to keep data safe and secure. With cloud computing, all of this is handled by the provider.

Virtualization
There are many types of virtualization, but all of them focus on usage patterns and control that improve efficiency. This efficiency means a single task running over multiple computers through otherwise-unoccupied machines, or a single terminal running multiple tasks. A more advanced benefit of virtualization is hosting an application for many users, which prevents the same software from being installed again and again. Data is consolidated into a central computer from databases, hard drives and USB drives, a process that increases security and accessibility through replication. In an IT company, physical resources can be divided into many virtual networks, allowing central IT resources to be available to all departments via local networks. Compared with devices allocated to individual staff members or dedicated to a single software application, virtualized resources are used far more efficiently and are much cheaper.

Top Cloud Computing Companies

Who are the Top 10 Cloud Computing Companies?
If you're a business owner and are thinking about joining the cloud revolution, you may be overwhelmed by the sheer number of cloud computing companies that offer services. In the past several years, this number has steadily multiplied, and it will continue to do so for years to come. So, which companies are considered the leaders in this market? The following list represents what most industry insiders consider to be the Top 10 cloud computing companies as of 2011.

Amazon Web Services
The cloud computing industry leader in the opinion of most experts, Amazon has consistently outranked the competition in both innovation and customer service over the past two years.

Verizon (Terremark)
With the acquisition of Terremark, Verizon was able to expand its cloud services portfolio into the enterprise market and transform itself into, potentially, the biggest provider in the industry.

IBM
Thanks to a focused commitment to expanding its cloud services menu, long-time IT behemoth IBM has been busy eating up a massive chunk of the enterprise market this year.

Salesforce.com
Already highly successful in the Software as a Service market, Salesforce.com guaranteed itself a continued place at the big boys' table for the foreseeable future when it acquired Platform as a Service giant Heroku.

CSC
Specializing in IT integrations, CSC recently launched BizCloud, a unique private cloud service that integrates Infrastructure as a Service into your legacy IT system and interlinks it with Software as a Service providers.

Rackspace
Ranking second only to the mighty Amazon in cloud-based revenue, Rackspace is poised to remain with the pack leaders thanks to its recent acquisition of cloud management technology specialist Cloudkick.

Google
The undisputed king of search engines spent the first half of 2011 adding attractive features to its Google App Engine service in a bid to win a greater share of the enterprise market.

BlueLock
When it pioneered a breakthrough improvement in vCloud resources for VMware, BlueLock immediately established itself as one of the leading VCE providers in the world.

Microsoft
Enjoying tremendous success with its Azure cloud service, Microsoft owns a significant share of the cloud services market among mobile companies, web companies and social networking firms.

Joyent
Teaming up with Dell to create pre-configured, ready-out-of-the-box cloud infrastructure packages, Joyent has positioned itself as a leader in private cloud technology.

Cloud Computing Companies – Which One is Right for You?
When choosing a cloud computing provider, it is important to realize that no company is considered to be “the best” at providing every type of cloud computing service. Instead, each firm tends to specialize in either Cloud Management, Infrastructure as a Service, or Platform as a Service. Being aware of this fact, and your business’ requirements, will help you to find cloud computing companies that are the best at providing the specific service your company needs.

Security Issues for cloud computing

SaaS Security Issues
As interest in software-as-a-service grows, so too do concerns about SaaS security.  Total cost of ownership used to be the most frequently cited roadblock among potential SaaS customers. But now, as cloud networks become more frequently used for strategic and mission-critical business applications, security tops the list.

1.  Cloud Identity Management is lacking
Companies that have existing identity services running behind their firewalls may not find SaaS integration an easy proposition. Compatibility in this regard is a little behind the curve. Some companies are working on this, developing third-party applications that allow IT departments to extend authentication into the cloud through a single log-on; Ping Identity and Symplified are two examples. This leads to another problem as well. The whole point of moving to SaaS is to reduce complexity, and buying more applications from more vendors only reintroduces the complexities you're probably hoping to avoid, not only for your infrastructure but for your users as well.

2. Industry Secrecy
While vendors for cloud software naturally argue that their systems are far more secure than traditional infrastructures, they are disquietingly secretive about their security procedures. When questioned about this, the common response is that this is – oddly enough – done to protect the security of their systems. This may sound innocent enough, but several analysts claim this is a bad sign.
Specifically, analysts from the Burton Group have criticized Amazon's Chief Technical Officer (CTO) for not being forthcoming enough about the company's security practices, stating that when customers don't know enough, they should assume the worst. Microsoft, on the other hand, has done a reasonable job of demonstrating its security, according to the analysts.

3. Open Access Increases Convenience but also Risk
One major benefit of software-as-a-service -- that business applications can be accessed wherever there is Internet connectivity -- also poses new risks. Coupled with the proliferation of laptops and smart phones, SaaS makes it even more important for IT shops to secure endpoints. "Because of the nature of SaaS, it's accessible anywhere," Senior Vice President Rowan Trollope of Symantec Hosted Services notes. "If I decide to put my e-mail on Gmail, an employee could log in from a coffee shop on an unsecured computer. It's one of the benefits of software-as-a-service, but it's also one of the downsides. That endpoint isn't necessarily secure. The data is no longer in your walls, in the physical sense and in the virtual sense."

4. Cloud standards are weak
When you’re shopping around for a software vendor, one of the first things you want to see is the vendor’s security qualifications. Passing various standards makes this process easy – if a company boasts certain credentials, you can immediately understand the measures that company has taken to secure your data. Unfortunately, there aren’t any standards built around cloud software just yet.
SAS 70 is an auditing standard designed to show that service providers have sufficient control over data. The standard wasn't crafted with cloud computing in mind, but it has become a stand-in benchmark in the absence of cloud-specific standards.
ISO 27001 "is not perfect but it's a step in the right direction," MacDonald says. "It's the best one out there, but that doesn't mean it's sufficient." There's no guarantee that your data will be safe with an ISO 27001-compliant vendor, however.

5. Your Data May Move Without Your Knowledge
A big perk of cloud computing is the lack of local storage. All of your files are stored on a remote, centralized server, which means you can access and modify your data from anywhere. However, one technical step that makes this possible is "load balancing." If you access your files from a geographically distant place (say you're on a business trip in Europe and trying to access your files from an American server), your cloud network will actually copy those files to a server closer to you to improve performance. This is a great feature, but it can run afoul of certain regulations, such as the Federal Information Security Management Act (FISMA), which can require sensitive data to be kept inside the US.
Some companies offer guarantees that they can lock your data down to a particular country, but this is still a rare feature among SaaS vendors. Until vendors can reliably guarantee the geographic location of your data, or until a third-party vendor can accurately track the migration of that data, companies with sensitive data will need to make extra preparations before jumping into SaaS.

PaaS Security Issues

SOA related security issues
The PaaS model is based on the Service-Oriented Architecture (SOA) model. It therefore inherits all the security issues that exist in the SOA domain, such as DoS attacks, man-in-the-middle attacks, XML-related attacks, replay attacks, dictionary attacks, injection attacks and input-validation-related attacks. Mutual authentication, authorization and the WS-Security standards are important for securing cloud-provided services. This security issue is a shared responsibility among cloud providers, service providers and consumers.

Vendor Lock In
Platform as a Service (PaaS) vendors tend to dictate the database, storage and application framework used, so what about those legacy applications? Enterprises will still require the skills and infrastructure to be able to run them.

Business Continuity Planning and Disaster Recovery with a PaaS vendor
For example: Windows Azure, Microsoft's cloud computing platform, suffered an outage one weekend in March 2009. Had your enterprise been using the service, how would the outage have affected the organization's ability to conduct business? On the other hand, it would have been Microsoft's responsibility to fix it, not your IT team's.

The mobility of an application deployed atop a PaaS is exceedingly limited, Lori says, first because "you are necessarily targeting a specific platform for applications: .NET, Java, PHP, Python, Ruby." The second limiting factor with PaaS "is the use of proprietary platform services such as those offered for data, queuing and e-mail." What happens, then, is that customers are locked into cloud platform service providers.

IaaS Security Issues

1. Trust and Transparency
One problem associated with cloud computing is that corporations are required to give up transparency into their IT resources. The promise of the cloud is that you pay for services and do not have to worry about the implementation behind the scenes. This "black box" concept creates issues that corporations will have to overcome if cloud computing is going to continue on its present course of growth. An organization's resources and data are essentially at the mercy of the cloud provider's employees and policies. Organizations must adhere to regulations on certain types of data, and with cloud computing it is almost impossible to tell whether these regulations are being satisfied by the cloud service provider. Monitoring the internal operations of the cloud is very difficult and reliant upon the cloud service provider.

2.  Data Location
Certain types of businesses may need to follow additional regulations with regard to the location of their data. The term "location" can apply in two different ways: not only the logical location on the storage device, but also the geographic location where the storage resides. When data is stored in the cloud, you only know what your CSP has promised contractually. There may be very little information available about how they actually go about meeting those agreements. You might know a size limit on your account or even the fileshare name, but you probably won't know who else is using that disk in the same storage device. Your data might very well be living next to your biggest competitor's. This might make a company think twice about migrating to the cloud. One study that looked for commonalities among operator-induced failures found that system configuration errors far outnumbered other types of errors (e.g., accidental deletion of data, moving users to the wrong fileserver, etc.). It also found that many operator-induced failures could have been prevented had the operator better understood the system's configuration, and in some cases how the system evolved to that configuration. This study put a dent in the concept of trust, and it reveals an availability-related weakness in cloud computing as it relates to errors that could be made by data center personnel, errors that could have serious operational impacts on a customer organization if there are no appropriate safeguards in place.
In addition to operator errors, Denial of Service (DoS) and Distributed Denial of Service (DDoS) attacks present a serious threat to system availability. According to Roland Dobbins, solutions architect for network security specialist Arbor Networks, distributed denial of service attacks are one of the most under-rated and ill-guarded-against security threats to corporate IT, and in particular the biggest threat facing cloud computing.
Now as IT organizations consolidate data centers, the problems to be addressed are getting bigger as well. One of the first things that many IT organizations will discover is that once you consolidate data centers or when you transfer these services to large third parties providing cloud services, data centers become bigger security targets.
While one of the cloud's biggest selling features is reliability, this raises the question of whether the replicated copies of data are given the same care with regard to security as the originals. DevCentral points out that knowing the location of the data center within a country is just as important, since certain locations can be more prone to natural disasters. This, combined with international concerns, makes it even more imperative to know the physical locations of all copies of company data.

3.  Availability and Denial of Service Attacks
There are many definitions of system availability; one of them defines availability as “the degree to which a system, subsystem, or equipment is in a specified operable and committable state at the start of a mission, when the mission is called for at an unknown, i.e., random, time.” Simply put, availability is the proportion of time a system is in a functioning condition. In a non-time-critical environment, availability may not be a significant factor for an organization, but what if continuous system availability is critical to the organization’s existence?
Companies whose entire existence depends on online sales, and government organizations that depend heavily on IT services for communications and mission-critical applications (such as the Department of Defense), are typical examples of entities that fully appreciate uninterrupted availability of IT services, and that can suffer serious financial and operational consequences from its abrupt, long-term unavailability.
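To make “the proportion of time a system is in a functioning condition” concrete, the short sketch below converts an availability figure into expected downtime per year. The SLA levels shown are illustrative, not quoted from any particular provider:

```python
# Convert an availability fraction into expected downtime per year.
# The "nines" levels below are illustrative examples, not real SLA quotes.

def downtime_hours_per_year(availability):
    """Expected downtime in hours/year for a given availability fraction."""
    return (1 - availability) * 365 * 24

for label, availability in [("two nines", 0.99),
                            ("three nines", 0.999),
                            ("four nines", 0.9999)]:
    print("%s (%.2f%%): %.2f hours of downtime/year"
          % (label, availability * 100, downtime_hours_per_year(availability)))
```

Even “three nines” still allows almost nine hours of downtime a year, which is why availability targets matter so much to organizations that live on online sales.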

4. Insecure APIs
There are many organizations that build upon cloud interfaces to offer new or enhanced services to customers. From the cloud provider’s side, it is important to monitor access to these interfaces, as a weakly secured interface introduces a potentially high-risk threat to the cloud and its customers. Because the IaaS model contains several components that are shared and sometimes outsourced, the entire system (the cloud) is exposed to greater risk: in the end, one component can impact another, and consequently the whole cloud can be threatened.

Tiered Security Issues

The biggest issue, in my opinion, with cloud computing is what would happen if we lose the service for any reason. I believe it’s important for any company making use of such a service to have an effective disaster recovery plan. Granted, this event is not very likely; however, one has to have a contingency plan nonetheless. Ask yourself what it would mean if your business were to lose the cloud service. Can the business carry on? What would the cost per day of downtime be? And most importantly, if the service is gone for good, how long would it take to bring the system back up to an operational level?
It is very important that, should the service close down or be temporarily disabled, the business has an effective disaster recovery plan that can bring it back up. While this can sound complicated and expensive to carry out, it doesn’t necessarily have to be.
When storing data in the cloud, one has to be aware that data travels through a number of network points before reaching its destination, and at any of these points (as well as in between) it is possible to spy on the data passing through. It is very important to protect this data against prying eyes, and this can easily be achieved by encrypting the link between your company and the cloud.
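As a sketch of what encrypting that link looks like in practice, the snippet below builds a client-side TLS context with Python’s standard `ssl` module; the endpoint name in the usage comment is hypothetical:

```python
# Sketch: client-side TLS for the link to a cloud endpoint, using Python's
# standard ssl module. The host name in the usage example is hypothetical.
import ssl

def make_client_context():
    """Build a TLS context that verifies the server certificate and hostname."""
    ctx = ssl.create_default_context()            # enables cert + hostname checks
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
    return ctx

# Usage (hypothetical host): wrap the socket before sending any data, so
# intermediate network points see only ciphertext.
#
# import socket
# with socket.create_connection(("cloud.example.com", 443)) as sock:
#     with make_client_context().wrap_socket(
#             sock, server_hostname="cloud.example.com") as tls:
#         tls.sendall(b"...")
```

The key point is that certificate verification stays on: encryption without verifying who you are talking to still leaves the link open to a man-in-the-middle at any of those intermediate points.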
Despite all the buzz the cloud is generating, the nasty issue that nimbus advocates are glossing over is what happens when a service provider goes down. It’s happened with Google. It’s happened with NetSuite. And you can be certain that it will happen again. Those failures by service providers may be small potatoes, however, compared to the really frightening prospect of an Internet failure. In that scenario, the service providers’ systems would be humming away, but it won’t do their clients any good because there will be no Internet through which to access those systems. Any enterprise thinking seriously about the cloud should make sure it has a Plan B that will keep it up and running should its cloud disappear behind the horizon.

Cloud services are attractive enough to draw more and more users, yet, like any other web server, they are prone to cyber attacks.
In cloud computing, anybody can register and place their data in the cloud for a small fee. Cyber criminals often plant malicious code in their own hosting accounts and carry out attacks on other accounts hosted on the same server.
Such attacks can only be avoided by repeated checks for loopholes and flaws in the security. Though each user’s cloud data is placed in a separate compartment, black-hats still find their way into other accounts. With new technologies like CAPTCHA-solving farms and automated brute-force attacks, infiltration into user accounts has become very easy. Since a cloud server, much like your laptop, contains your entire private and business details, losing such vital material is enough to take your sleep away. Most low-cost cloud services have a weak registration process: one can start using cloud hosting just by providing a name, an email ID, and payment, so there is no reliable way to trace wrongdoers.
Apart from this, cloud servers are limited in number, insufficient to meet the increasing requirements. Due to overflowing demand, many cloud providers deliver services with little concern for persisting flaws and vulnerabilities. This can turn into a big security issue, making the system more vulnerable to cyber attacks.
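One standard mitigation against the automated brute-force attacks mentioned above is a per-account lockout after repeated failed logins. The sketch below is illustrative: the thresholds and the in-memory store are made up for the example, and a real service would persist this state:

```python
# Illustrative per-account lockout counter to slow automated brute-force
# login attempts. Thresholds and the in-memory store are assumptions
# for the sketch, not taken from any real service.
import time

MAX_FAILURES = 5        # failures allowed inside the window
LOCKOUT_SECONDS = 300   # sliding window / lockout duration

_failures = {}  # account -> list of failure timestamps

def record_failure(account, now=None):
    """Remember one failed login attempt for this account."""
    _failures.setdefault(account, []).append(time.time() if now is None else now)

def allow_attempt(account, now=None):
    """Deny login attempts once an account has too many recent failures."""
    now = time.time() if now is None else now
    recent = [t for t in _failures.get(account, []) if now - t < LOCKOUT_SECONDS]
    _failures[account] = recent  # drop failures that aged out of the window
    return len(recent) < MAX_FAILURES
```

The sliding window means the lockout releases itself once the old failures age out, so a legitimate user who mistyped a password is only delayed, not locked out permanently.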


Data Loss or Destruction: Infrastructure Admin Deletes Data
A user with administrative privileges to the data storage infrastructure may maliciously or accidentally destroy the encrypted cardholder data.

One or All Data Centers Temporarily or Permanently Unavailable
A natural disaster, power failure, or other situation could render one or more data centers unavailable.

"Hacker" Destroys Data
A hacker could exploit a vulnerability within a system or network device that allows access to or control of data (remember, we are intentionally ignoring application-level issues in this particular article). The attacker may choose to delete, destroy, or render that data inaccessible.

Infrastructure Admin Accesses Data
A user with administrative privileges to the data storage infrastructure may attempt to access cardholder data for the purpose of financial gain.

Miscellaneous: Data Stored in Hostile Country
It is important for customers to ensure sensitive data or intellectual property is properly protected. Some countries do not have adequate laws or security standards in place to protect sensitive data or to restrict seizure of data by the government.

Availability Issues
When data or services are unavailable, companies may lose revenue during the downtime and may lose customers concerned with the reliability of their data or applications.

Other issues

Bandwidth and latency issues
Bandwidth and latency issues arise from the need to move data in and out of the cloud. We assume (for now) that the Cloud is located outside any campus (i.e. either "Public Clouds" or a "Private Cloud" located at a shared data center/colocation facility). We assume that researchers would move data in/out of the cloud from/to a campus-resident storage facility, but note that there will likely be cases where this is not true: where researchers will want to transfer data from storage outside the campus directly to the cloud. There are additional bandwidth and latency issues that arise from the possibility that applications may be "pulled apart" and run on different clouds.
A UC Berkeley EECS paper reported average bandwidths of 5 to 18 Mb/s writing to Amazon's S3 cloud. Based on those results, the authors postulated a "best case" scenario in which a researcher writes 10 TB of data to S3 at 20 Mb/s on average: it would take approximately 45 days to complete the data transfer. We consider this scenario unacceptable and identify the issues that need to be addressed in order to resolve this problem.
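Estimates like this are just data volume divided by throughput, so they are easy to sanity-check; at 20 Mb/s, 10 TB takes roughly 46 days, while 1 TB would take under five:

```python
# Sanity-check a bulk-transfer estimate: time = data volume / throughput.
def transfer_days(terabytes, megabits_per_second):
    bits = terabytes * 1e12 * 8                   # decimal TB -> bits
    seconds = bits / (megabits_per_second * 1e6)  # Mb/s -> bits per second
    return seconds / 86400.0                      # seconds -> days

print(transfer_days(10, 20))   # 10 TB at 20 Mb/s: roughly 46 days
print(transfer_days(1, 20))    # 1 TB at the same rate: under 5 days
```

Numbers like these are why physically shipping disks is sometimes the fastest way to move bulk data into or out of a cloud.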

Network capacity/capabilities
At present, most UC campuses have multiple Gigabit Ethernet (GE) connections to the CalREN networks, CalREN-HPR and CalREN-DC. A few campuses have upgraded one (or more) connections to CalREN-HPR to 10Gigabit Ethernet (10GE); no campus (to our knowledge) has plans to upgrade its CalREN-DC connections to 10GE. At the larger campuses, these GE connections are largely consumed with normal day-to-day campus traffic demands; a significant increase in bandwidth due to cloud computing would require funding additional connectivity between the campus and the CalREN backbone(s).
Many campuses operate border firewalls or packet filters that restrict various network protocols (SNMP, SMB, NFS, etc.).
It may not be possible to utilize or manage some cloud computing services, especially storage-based services, if these restrictions are not relaxed or removed. In the case of computing services that demand very high levels of performance, these devices might introduce performance penalties that compromise the cloud computing service.

The CENIC backbone is believed to have sufficient capacity to support the initial stages of a cloud computing roll-out, but would likely require augmentation to support large scale use of cloud computing (either internal or external). If increasing capacity can be done within the existing footprint of the [CalREN] networks (e.g. by adding line cards and transponders), this work could proceed relatively quickly given available funding. However, if increasing capacity to meet the needs of cloud computing were to require additional rack space and/or power at the CENIC POPs, significant delays could occur as space and power are not readily available at all colocation facilities.
UC campuses have connections to both the CalREN-DC and CalREN-HPR networks. In general, higher capacity and performance are available via the [CalREN]-HPR network, and it is assumed that most cloud computing connectivity should be provided via that network. In the case of internal clouds, this is relatively easy to ensure. However, in the case of externally provided clouds, it is most likely that the external provider will be connected to the CalREN-DC network and not the [CalREN]-HPR network. Heavy utilization of external cloud computing services might require significant re-engineering to match traffic patterns to the available network topology.

Exchange/peering points
If cloud computing services are provided by institutions not directly connected to [CalREN] networks, the traffic to and from those clouds will need to pass in and out of the [CalREN] networks through established peering and transit points. CENIC maintains a large number of peering relationships; however, these connections are sized to cover existing traffic loads. Assuming large-scale use of external cloud facilities, the bandwidth provided at these peering facilities will need to be increased to avoid negatively impacting existing use of the network. Additionally, the geographic/topological placement of these facilities might need to be reviewed to address latency or other network performance issues. Both increasing bandwidth and changing peering locations involve costs, often significant ones, to CENIC and, by extension, the CENIC membership. Settlement-free peering (via the [CalREN]-DC network; see above) is in place for connectivity to the largest cloud computing providers (Google, Amazon, Microsoft); the use of providers for which settlement-free peering is not available will require the payment of ISP bandwidth fees.

L2/L3 issues
The preceding sections address capacity on the campus and CENIC Layer 3 (routed IP) networks. Applications requiring very high bandwidth (i.e., approaching 10 Gb/s) or very low latency/jitter might be better served by a Layer 2 connection: either a dedicated wave on an optical network with Ethernet presented at both ends, or a VLAN configured on a switched Ethernet network running on top of an optical network. CENIC offers both types of connections to campuses connected to the HPR network. Thus, an L2 connection between any UC campus and a cloud that is directly connected to CENIC's optical network will be relatively easy and inexpensive to set up.
Additionally, L2 connection capabilities are present in both national R&E networks, the Internet2 Network and National Lambda Rail (NLR). An L2 connection between any UC campus and a cloud connected to either the Internet2 Network or NLR will be relatively easy to set up, but will involve additional costs that may be significant.
L2 connections to a cloud not directly connected to the CENIC, Internet2 or NLR optical networks will likely be challenging and very expensive.
However, campus security concerns arising from these kinds of connections remain largely unresolved, since in most cases they will bypass existing campus firewalls and intrusion detection/prevention systems. Addressing security concerns on such "bypass networks" will require additional campus resources, both human and machine. These concerns already exist in the larger context of research computing, of course; they are not confined to cloud computing.
It should be noted that we are unaware of any existing cloud provider having been asked whether it would support an L2 connection. The large public clouds have clearly made large investments in their L3 connectivity and might be understandably reluctant to consider alternatives. That seems likely to translate into a requirement that the organization requesting an L2 connection pay the entire cost of building and operating it. Even so, use of an L2 connection over the CENIC-operated portion(s) of the path, and possibly over any NLR or Internet2 portions, could provide sufficiently improved performance to make it worthwhile.

Cloud Deployment Implications

Irrespective of the deployment model, in general any organization opting for cloud must consider the following implications:

Network Dependency – Whether you choose on-site or outsourced deployment, a reliable and secure network is highly desirable for good performance.

Subscribers still need IT skills – You can’t just hand a pink slip to all your IT staff. Resources with traditional IT skills are still required to manage the various user devices that access the cloud, though in smaller numbers. Additionally, your existing staff may need to pick up new skills for working in the cloud.

Risk from multi-tenancy – An on-site private cloud mitigates this security risk by restricting the number of possible attackers, as all the clients are typically members of one subscriber organization. In a public cloud, a single machine may be shared by the workloads of any combination of subscribers, which raises the security risk because the number of potential attackers increases with the number of subscribers. Therefore we can safely conclude that the risk due to multi-tenancy increases in the order: private, community, hybrid, public cloud.

Data import/export and performance limitations – Generally, on-demand bulk data import/export is limited by the cloud’s network capacity. In the on-site private cloud scenario, however, these limits may be raised, although not eliminated, by provisioning high-performance and/or high-reliability networking within the subscriber’s infrastructure.

Workload Locations – Workload location refers to managing hardware resources efficiently. Generally, a cloud migrates workloads between machines without any inconvenience to the clients, i.e., the migration is hidden from the client. Cloud vendors usually take care of this, but you must explicitly check whether your vendor manages resources efficiently.
The implications described here are general in nature. Before making any decision in favor of a specific deployment model, study the detailed implications of that particular deployment model. For details, please check the reference section.

Deployment models of cloud

How are Cloud Computing Solutions deployed? 
What are the general implications for different deployment options? 

This post will cover another basic of Cloud Computing, popularly known as Cloud Deployment Models.
The content of this post is based on the recommendations of the National Institute of Standards and Technology (NIST) Special Publication 800-146. The credit for the images used in this article goes to NIST.

Following are the four types of Cloud Deployment Models identified by NIST.

Private cloud
Community cloud
Public cloud
Hybrid cloud
Private Cloud
"The cloud infrastructure is operated solely for an organization."
Contrary to popular belief, private cloud may exist off premises and can be managed by a third party. Thus, two private cloud scenarios exist, as follows:
On-site Private Cloud
Applies to private clouds implemented at a customer’s premises.
Outsourced Private Cloud
Applies to private clouds where the server side is outsourced to a hosting company.
Examples of Private Cloud:
Ubuntu Enterprise Cloud - UEC (powered by Eucalyptus)
Amazon VPC (Virtual Private Cloud)
VMware Cloud Infrastructure Suite
Microsoft ECI data center.

Community Cloud
The cloud infrastructure is shared by several organizations and supports a specific community that has shared concerns
(e.g., mission, security requirements, policy, and compliance considerations). Government departments, universities, central banks, etc., often find this type of cloud useful. Community cloud also has two possible scenarios:
On-site Community Cloud Scenario
Applies to community clouds implemented on the premises of the customers composing the community cloud.
Outsourced Community Cloud
Applies to community clouds where the server side is
outsourced to a hosting company.
Examples of Community Cloud:
Google Apps for Government
Microsoft Government Community Cloud

Public Cloud
The most ubiquitous model, and almost a synonym for cloud computing. The cloud infrastructure is made available to the general public or a large industry group and is owned by an organization selling cloud services.
Examples of Public Cloud:
Google App Engine
Microsoft Windows Azure
IBM Smart Cloud
Amazon EC2

Hybrid Cloud
The cloud infrastructure is a composition of two or more clouds (private, community, or public) that remain unique entities but are bound together by standardized or proprietary technology that enables data and application portability (e.g., cloud bursting for load-balancing between clouds).
Examples of Hybrid Cloud:
Windows Azure (capable of Hybrid Cloud)
VMware vCloud (Hybrid Cloud Services)
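The “cloud bursting” idea in the NIST definition above can be sketched as a simple placement rule: prefer private capacity, and overflow the remainder to a public cloud. The capacity figure below is a hypothetical number for illustration only:

```python
# Illustrative "cloud bursting" placement rule for a hybrid cloud:
# fill the private cloud first, burst the overflow to a public cloud.
# PRIVATE_CAPACITY is a made-up figure for the sketch.
PRIVATE_CAPACITY = 100  # concurrent workloads the private cloud can host

def place_workloads(requested, private_capacity=PRIVATE_CAPACITY):
    """Return (private, public) workload counts for a burst-capable hybrid."""
    private = min(requested, private_capacity)
    public = requested - private  # overflow bursts to the public cloud
    return private, public
```

For example, 80 requested workloads stay entirely private, while 130 place 100 privately and burst 30 to the public side; real schedulers add cost, latency, and data-locality factors on top of this rule.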

Your Reviews/Queries Are Accepted