Posts Tagged ‘cloud computing’

On CIOL News,

L&T Infotech, a global IT services provider, today announced a strategic partnership with SEEBURGER Inc., a provider of global business integration solutions, to increase U.S. implementation resources for the latter’s electronic data interchange (EDI) and business-to-business (B2B) integration software.

Under the partnership, L&T Infotech will provide both sales and deployment services for the SEEBURGER Business Integration Server and associated solutions, a press release said.

L&T Infotech has nine U.S. offices with dedicated teams in key industry sectors with EDI/B2B needs, including technology, manufacturing, finance, healthcare and energy/petrochemicals. The firm has extensive SAP and Oracle expertise as well as B2B systems integration experience, making it possible to support customers who are deploying the SEEBURGER platform in conjunction with an update, migration or implementation of a new ERP system, according to the release.

“Much of our systems integration business is ERP-focused, and many of our ERP customers need B2B integration as well,” said Sudip Banerjee, CEO, L&T Infotech. “Adding SEEBURGER technology to our portfolio will allow us to serve that need with what we consider as a robust, advanced and scalable EDI/B2B platform,” he added.

“Partnering with L&T Infotech expands our services capacity and provides an additional expert implementation resource for our U.S. customers, particularly for crossover deployments involving tandem ERP/EDI upgrades,” said Wesley Thompson, VP of Business Development, SEEBURGER Inc.

SEEBURGER’s EDI/B2B solution suite includes multiple B2B gateways and related products for disparate enterprise needs, including specialized solutions that automate document exchange with non-EDI-enabled trading partners via e-mail, spoke units and partner portals, the release added.


Synergetics has been awarded the “Best .NET Training Service Provider” by Microsoft.

“AT&T to introduce new cloud computing service” on Siliconindia News Bureau

Global telecom company AT&T has expanded its portfolio of cloud-based services to include on-demand compute capacity.

The addition of Synaptic Compute as a Service offering strengthens AT&T’s position in competing with other large cloud-based services providers like Amazon Web Services, Microsoft and Google. The telecom company already offers cloud-based storage and hosting services.

“As companies increasingly move to cloud-based environments, AT&T Synaptic Compute as a Service provides a much-needed choice for IT executives who worry about over-building or under-investing in the capacity needed to handle their users’ traffic demands,” said Roman Pacewicz, Vice President of Strategy and Application Services, AT&T.

The service, expected to launch in the fourth quarter of 2009, will feature a Web-based interface, pay-as-you-go billing structure and multiple storage options for use with the existing Synaptic Storage offering. AT&T said that there will be no up-front fees, long-term obligations or early-termination penalties.

The company partnered with the leading virtualization software developer VMware and multi-faceted technology company Sun to develop its newest offering. The product uses VMware’s vSphere hypervisor and vCloud API.

The company will deploy the service in the U.S., but it will be accessible from anywhere over the Internet. It plans to expand the offering globally in the future.

Synergetics has been awarded the “Best .NET Training Service Provider” by Microsoft.

On InfoWorld, Paul Krill Writes,

Windows Azure, Microsoft‘s fledgling cloud computing platform, is piquing the interest of IT specialists who see it as a potential solution for dealing with variable compute loads. But an uptick in deployments for Azure, which becomes a fee-based service early next year, could take a while, with customers still just evaluating the technology.

“We’d be targeting applications that have variable loads” for possible deployment on Azure, said David Collins, a system consultant at the Unum life insurance company. The company might find Azure useful for an enrollment application. “We have huge activity in November and December and then the rest of the year, it’s not so big,” Collins said. Unum, however, is not ready to use Azure, with Collins citing issues such as integrating Azure with IBM DB2 and Teradata systems.

“From a scale-out perspective and for the future, it’s kind of interesting to hear” about Azure, said Michael Tai, director of development at Classified Ventures. But his company is probably not looking to use Azure in the short term, he said.

Meanwhile, an advertising agency that has done ads for Windows 7 has already used Azure. An official of that company also cited the benefits of offloading compute cycles to the cloud. “We’ve used Azure on a couple of projects already and had great success with it,” said Matthew Ray, technical director at Crispin Porter + Bogusky. “I think what helps us is we don’t have all the time and money” to build huge server clusters for projects that get a lot of traffic but only live for a month, Ray said. Using traditional platforms, “you can spend inordinate amounts of money — hundreds of thousands of dollars — to support something like the Super Bowl, something like that, and you’re done in a day, basically,” he said.

Microsoft has improved Azure since Chevron last looked at it. “It wasn’t as rich as it looks now,” said Sean Gordon, an architect on the strategy architecture emerging technology team at Chevron. “We’re looking at offloading compute resources, potentially, into the cloud,” he noted.

A Microsoft SharePoint software vendor sees Azure‘s potential for purposes such as extranets. “A lot of applications I can see being extended to the cloud,” said Stephen Cawood, community director at Metalogix. “For big companies, they’re still going to want to have their own datacenters and host things like SharePoint, but I can see them using cloud computing possibly for extranet scenarios where they’re working with partners or even customers.”

David Nahooray, a software developer at the Organization for Economic Cooperation and Development (OECD), an international intergovernmental agency, said any decision to go to the cloud would be made at a higher level. “[Azure] looks interesting, but it’s probably up to my boss to decide if we can go and put stuff outside in the cloud,” Nahooray said. Data such as economic indicators could be deployed on Azure for access by other organizations, he said.

Cloud computing is here. Running applications on machines in an Internet-accessible data center can bring plenty of advantages. Yet wherever they run, applications are built on some kind of platform. For on-premises applications, this platform usually includes an operating system, some way to store data, and perhaps more. Applications running in the cloud need a similar foundation. The goal of Microsoft’s Windows Azure is to provide this. Part of the larger Azure Services Platform, Windows Azure is a platform for running Windows applications and storing data in the cloud.

Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. When deploying a new solution the traditional way, most of your time and energy goes into defining the right infrastructure, both hardware and software, to put together to create that solution; cloud computing instead lets people share resources to solve new problems. Cloud computing users can also avoid capital expenditure (CapEx) on hardware, software, and services, because they pay a provider only for what they use.
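
To make “storing data in the cloud” concrete, here is a minimal C# sketch of writing a blob to Windows Azure storage with the StorageClient library that shipped with the early Azure SDK; the account name, key, container and blob names are placeholders, and the exact API surface varies between SDK versions.

```csharp
// Minimal sketch: writing a blob to Windows Azure storage with the early
// Azure SDK's StorageClient library. Account name, key, container, and
// blob names are placeholders.
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class BlobUploadSketch
{
    static void Main()
    {
        // Credentials issued when the storage account is provisioned.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey");

        CloudBlobClient blobClient = account.CreateCloudBlobClient();

        // Containers group blobs, roughly like folders; create on demand.
        CloudBlobContainer container = blobClient.GetContainerReference("reports");
        container.CreateIfNotExist();

        // Upload a small text payload; billing is pay-as-you-go, per GB
        // stored and transferred, with no up-front hardware purchase.
        CloudBlob blob = container.GetBlobReference("enrollment-snapshot.txt");
        blob.UploadText("Enrollment snapshot for the seasonal peak.");
    }
}
```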

Synergetics is a premium brand in the Indian IT industry, with an experience base of over 15 years in people competency development, delivered through its training and consulting interventions and focused primarily on clients’ productivity on the projects and deliverables at hand. Its primary differentiator has been its solution-centric approach and its comprehensive, client-focused service portfolio.

It’s always interesting to me to see the job growth in emerging spaces, such as cloud computing. Typically, the hype is huge around a concept (such as SOA, client/server, or distributed objects) about 8 to 12 months before there is notable job growth. This is often due to companies not understanding the value of the new technology, as well as to the lag in allocating budget and creating job reqs.

Cloud computing seems to be a different beast. After no fewer than four calls last week from headhunters looking for cloud architects, cloud engineers, and cloud strategy consultants, I decided to look at the job growth around cloud computing, using my usual unscientific measurements. This included a visit to the cloud job postings at indeed.com, which provides search and alerts for job postings and tracks trends.

I figured that I would see a line that looks like the bunny slope, gradually sloping up from left to right. Instead, as you can see below, I saw extreme heli-skiing: Since January 2008, the growth in job postings that mention cloud computing has hit 350,000 percent. (Of course, those are all kinds of job postings that mention cloud computing, and some are perhaps not cloud computing jobs. But still.)

While I just have my personal experience to draw upon, this seems to be the largest inflection around a hyped space in IT that I’ve ever seen, especially considering we’ve been in a downturn in which many companies have reduced IT jobs.

There are only a handful of qualified people out there who actually understand the basics of cloud computing, much less the details behind cloud computing architecture, implementation, development, testing, and security. Thus, I suspect we’ll see many jobs filled by the wrong people — and the bad results that come from that. The larger issue is that the people doing the hiring also don’t understand cloud computing, so they don’t realize that the “expert in Amazon cloud service” claim on a résumé actually means the candidate can purchase books and shoes using the site’s “one click” feature.

So what does all this mean? First and foremost: I will stay fully employed. 🙂

Second, the salaries of cloud computing experts will be driven up significantly as too many jobs chase too few qualified candidates.

Third, the need for cloud computing training will explode, including architecture, planning, testing, security, and deployment. There’s lots to learn there, and it’s very different from on-premises systems, trust me.

Finally, we’ll have to deal with the many positions that will be taken by less than qualified staff. Thus, there will be some frustration around the productivity of cloud computing that in most cases will be traced back to a talent issue, not the technology itself. We saw the same thing with SOA.

Start that training and update your résumés, people.

I was taken aback a bit by this recent article talking about some big predictions from Gartner around the adoption of cloud computing:

Cloud Computing will become so pervasive that by 2012, one out of five businesses will own no IT assets at all, the analyst firm Gartner is predicting.

The shift toward cloud services hosted outside the enterprise’s firewall will necessitate a major shift in the IT hardware markets, and shrink IT staff, Gartner said.

This is very interesting to me, considering that many new and small businesses are finding a great deal of value in moving to cloud computing. However, I’m not sure I agree with Gartner about the amount of movement that will occur by 2012. Sorry to once again be the buzzkill, but a sure way to bury a space is to overhype and underdeliver.

Don’t get me wrong: Cloud Computing will have an impact. I suspect that most midsize and small businesses will use e-mail and document management systems that are outside their firewalls. We’ve seen a lot of movement in this direction in 2009, and with the rapid expansion of Google Enterprise services and the emerging online version of Microsoft Office, this trend will only accelerate.

At the same time, major enterprise systems are now SaaS-delivered, platform-as-a-service is giving open source platforms a run for their money, and infrastructure-as-a-service is becoming much more compelling when considering the technology, as well as the business case. Things are actually moving along nicely.

However, “no IT assets at all” by 2012 in one out of five businesses? That’s a huge shift in a short amount of time. While analysts and thought leaders love to make revolutionary statements such as this because they are provocative, in the real world most businesses, large and small, are still struggling with the place that cloud computing will have in their IT strategy, and they are far from moving major IT assets out of house entirely. In other words, I appreciate Gartner’s enthusiasm, but I don’t see it happening, based on what I’m seeing with my clients or in the industry in general.

By Kevin Fogarty, on 26th Jan 2010

CIO – IT people with skills and experience in server virtualization, cloud computing or both have a far greater chance of getting and keeping jobs than most other IT people now, according to recruiters and analysts. But what do you call these gurus? There’s no accepted standard for what to call either virtualization or cloud-computing specialists, so jobseekers will have to look for a range of keywords, and include those in their resumes, to find a match with particular employers, says Dice.com spokesperson Jennifer Bewley.

If you are searching for a virtualization or cloud role, watch your search terms, she says.

Just using “virtualization” as a keyword, for example, pulled up 880 jobs on Dice.com on one day last week, according to Bewley.

“However, there are another 900 jobs that include ‘VMware’ as a keyword with no mention of virtualization,” Bewley found. “That leads us to conclude that searching based on vendor is particularly important in virtualization jobs.”

Common terms for virtualization specialists include: Architect SAN/Virtualization; Citrix / VMware specialist or administrator; Data Center Virtualization Systems Analyst; and Product Manager for Large Scale Virtualization.

Cloud Job Searching Tricky

People looking for new jobs described using “cloud computing” may not be completely out of luck, but they’re not far off, according to M. Victor Janulaitis, CEO of IT job-market researcher Janco Associates, which this month published a survey of CIO hiring plans for 2010.

“I have seen some people just use ‘cloud specialist’ to describe themselves,” Janulaitis says. “There’s not really a set of terms yet that are common to refer to cloud computing skills-they just refer to them as architecture or infrastructure skill sets.”


Other people just add “cloud” or “virtualization” onto more common titles such as system administrator, systems engineer, architect, or network engineer, Bewley says.

Remember, it’s not unusual for employers or prospective employers to use the name of a particular vendor or new technology as a primary identifier in a job ad, especially if they’re looking for someone certified in that vendor’s technology, according to Tom Silver, senior vice president of tech-job ad site Dice North America.

Normally a company would look for technical skills in a particular technology, not just one vendor’s products, Janulaitis says.

“With the economy the way it is, and now people are talking about the possibility of a double-dip recession, a lot of companies are just looking for basic skills and experience,” Janulaitis says.

That means plugging holes where they have to, by hiring one person with experience managing VMware servers, for example, or a lot of junior-level generalists who can be trained in the skills that company needs, Janulaitis says.

Virtualization Salaries Flattening?

Salaries for virtualization specialists have also hit a plateau, though the ads for them increased 30 percent in 2009 compared to 2008, Silver says.

After surging 10 percent in 2008 and into 2009, salaries for virtualization experts were flat this year at an average of $84,777, Dice.com data shows.

That’s still a premium compared to the national average of $78,845 per year for other tech workers, however, Bewley says.

LOS ANGELES — Silverlight and Windows Azure headlined Microsoft’s Professional Developers Conference last week, but Microsoft also made a number of significant announcements concerning interoperability, identity management, Surface touch technology, and SharePoint.

Microsoft delivered a Java SDK for Windows Azure storage, with additional tools and guidance for deploying the Tomcat application server in Windows Azure. Microsoft has enabled external endpoints to allow applications that are not running on Internet Information Services to receive traffic, said open-source community manager Peter Galli in a blog post.

Microsoft will also allow Azure to be used as “Infrastructure as a Service” with Windows Server Virtual Machine support. SugarCRM, an open-source customer relationship management software maker, announced that it would offer its applications on Azure.

Azure is designed to be open and interoperable from its basic protocols, said Jean Paoli, general manager of interoperability strategy at Microsoft. When asked about portability, he said that Microsoft was participating in the Distributed Management Task Force’s Cloud Computing standards efforts, and it was “trying to make sense” of cloud scenarios. The Distributed Management Task Force is a standards body.

Microsoft also announced an identity management solution that works in the cloud as well as on premises.

Stuart Kwan, group program manager for Microsoft’s Federated Identity team, unveiled the final version of Windows Identity Foundation, a product for identity and access management for .NET applications. The technology is a core element of Microsoft’s “Geneva” platform, and it is designed to interact with outside identity systems, he said.

Geneva enables identities to be federated to new services in the cloud and in a service-oriented architecture.

Identity Foundation is designed to help developers write identity into an application without being identity experts, Kwan said. “Developers only think about claims and do not need to be concerned about how they get them, as long as they trust who they get them from.”
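
As a rough illustration of the claims-only view Kwan describes, here is a hedged C# sketch of application code enumerating the claims Windows Identity Foundation attaches to the current principal; the namespace and type names follow the WIF 1.0 (“Geneva”) bits and should be treated as assumptions rather than quoted documentation.

```csharp
// Sketch: inspecting the claims handed to an application by Windows
// Identity Foundation. Namespace and type names follow the WIF 1.0
// ("Geneva") release and are assumptions, not quoted documentation.
using System;
using System.Threading;
using Microsoft.IdentityModel.Claims;

class ClaimsSketch
{
    static void ShowCallerClaims()
    {
        // After WIF validates the incoming security token from a trusted
        // issuer, it replaces the thread principal with a claims-aware one.
        var principal = Thread.CurrentPrincipal as IClaimsPrincipal;
        if (principal == null)
        {
            Console.WriteLine("No claims-based identity on this thread.");
            return;
        }

        foreach (IClaimsIdentity identity in principal.Identities)
        {
            foreach (Claim claim in identity.Claims)
            {
                // The application reasons only about claim types and values;
                // token formats and issuers are handled by the framework.
                Console.WriteLine("{0} = {1} (issuer: {2})",
                    claim.ClaimType, claim.Value, claim.Issuer);
            }
        }
    }
}
```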

On the desktop, Microsoft’s development efforts are focused on touch-screen user interfaces. The company gave each PDC attendee a touch-screen-enabled laptop. Microsoft’s touch-screen technology was developed by its Surface team.

The Surface team also delivered a public SDK and technical documentation to the Microsoft Developer Network. The documentation focuses on how developers can design valuable applications with the multi-touch interaction paradigm, said Brad Carpenter, general manager of software and user experience for Surface.

The number of Surface development partners has increased from 60 last year to 250 today, Carpenter said. Technology from Surface is used in Windows 7, and the team produces Windows Presentation Foundation controls that OEMs can install on PCs, he added.

The company is also investing in broadening development tools for Office SharePoint Server.

Microsoft announced beta releases of Office 2010 for public download. It is also offering Web editions of Office applications, also in beta.

Developers can integrate social networking into Office Outlook 2010 with an SDK to connect with LinkedIn and other networks. There is out-of-the-box integration with Windows Live and SharePoint.

What’s more, Microsoft delivered new tooling for SharePoint 2010, as well as Business Connectivity Services, a set of features to connect SharePoint to Web services.

“Developers can write code and deploy on-premises, but also put partially trusted code into SharePoint online,” said SharePoint director Arpan Shah. He explained that applications cannot access resources outside of the site container, and Microsoft provides governance over application resources.
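
As a small illustration of code that stays inside the boundaries Shah describes, here is a hedged C# sketch of a SharePoint web part that reads only from its own site via the server object model; it is an assumption-laden sketch of a sandbox-friendly component, not Microsoft’s sample code.

```csharp
// Sketch: a sandbox-friendly SharePoint web part that touches only its
// own site via the server object model. Illustrative only; partially
// trusted (sandboxed) code cannot reach resources outside the site
// container it is deployed to.
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using Microsoft.SharePoint;

public class ListCountWebPart : WebPart
{
    protected override void CreateChildControls()
    {
        // SPContext scopes access to the current site collection.
        SPWeb web = SPContext.Current.Web;

        Controls.Add(new Label
        {
            Text = string.Format("{0} contains {1} lists.",
                                 web.Title, web.Lists.Count)
        });
    }
}
```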

A new version of Microsoft‘s “Duet” SAP integration software was re-architected to utilize Business Connectivity Services, Shah said.

He said that Microsoft has more than 4,000 SharePoint development partners, and he expects that number to double with the next release. Partners generate US$5.6 billion in services revenue, he added.

On SDTimes, Microsoft unveils a bevy of supplemental software – By David Worthington – November 24, 2009

About Azure Seminar

The seminar is designed for senior IT professionals who want to understand Microsoft’s cloud offering, Windows Azure. The information will help you prepare for the coming shift in the IT world toward cloud computing. The seminar will also compare the cloud platform offerings of various vendors.

 

By Eric Lai,

Microsoft said today that its upcoming Windows Azure cloud computing platform will come with marketplaces both for online apps built to run on Azure and for datasets that companies can use to build their own apps.

PinPoint.com will host business-oriented apps developed by Microsoft partners, chief software architect Ray Ozzie said during a keynote speech at Microsoft’s Professional Developers Conference 2009 (PDC09) in Los Angeles.


PinPoint will compete with Salesforce.com’s four-year-old AppExchange online marketplace and other more recently emerging app stores.

Azure will also host “an open catalog and marketplace for public and commercial data” code-named Dallas, Ozzie said. Developers can use the data to build their own services and mashups. Dallas is now in Community Technology Preview.
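
To give a sense of how a developer might pull a Dallas dataset into a service or mashup, here is a hedged C# sketch of a plain HTTPS request authenticated with an account key; the service URL, header name and dataset path are illustrative placeholders, not the documented Dallas conventions.

```csharp
// Sketch: fetching a dataset from a Dallas-style data service over HTTPS.
// The URL, header name, and dataset path are illustrative placeholders,
// not the documented Dallas endpoints.
using System;
using System.IO;
using System.Net;

class DallasFeedSketch
{
    static void Main()
    {
        // Catalog data is exposed as authenticated web feeds that any
        // HTTP-capable client can consume and remix into mashups.
        var request = (HttpWebRequest)WebRequest.Create(
            "https://example-dallas-service/catalog/mars-rover-imagery");

        // Account key issued when a developer subscribes to the dataset
        // (the header name here is a placeholder).
        request.Headers.Add("X-Account-Key", "YOUR-ACCOUNT-KEY");

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            // Reshape or join the feed with other data sources as needed.
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}
```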

Microsoft is also bolstering Azure with management tools, less sexy but arguably more essential, for developers running .NET apps on-premises and on Azure.

Windows Azure, meanwhile, will officially go into production on January 1, but customers won’t be billed until February 1, Ozzie said. Azure will be hosted at three pairs of data centers: Chicago and San Antonio for North America, Dublin and Amsterdam for Europe, and Singapore and Hong Kong for Asia.

Azure will compete with Salesforce, Amazon.com, and many other cloud platform providers. The key difference is that Azure, rather than dumping the desktop entirely for the Web, keeps the Windows operating system in the equation.

This vision of “three screens and the cloud” will allow developers to build apps that can be reused and delivered via the cloud (Windows Azure), on-premises server (Windows Server), or desktop (Windows 7), depending on what is most convenient or offers the best performance, Ozzie said.

To demonstrate how far Windows Azure has come, Microsoft enlisted the aid of some traditional antagonists: Silicon Valley startups and the federal government. San Francisco-based Automattic is using Azure to host parts of its popular WordPress blogging platform, said founder Matt Mullenweg. Another San Francisco startup, Seesmic, is building a Twitter app running on Windows using Microsoft’s Silverlight rich media player, said CEO Loic Le Meur.

NASA is releasing 3D imagery from the Mars rover vehicle for free to the general public via the Dallas data feed. Federal CIO Vivek Kundra said the government plans to accelerate the release of more data to the public. He likened the potential “explosion” of apps to the one that followed after the U.S. government liberalized the availability of GPS data.

To demonstrate that Azure can scale, Bob Muglia, president of Microsoft’s Server & Tools Division, cited its Bing search app, which runs on more than 100,000 servers. Muglia also announced Project Sydney, which will allow companies to connect their own servers to Azure-based services. Sydney will go into beta next year.

Finally, Muglia announced a beta of an application server for Windows Server called AppFabric. AppFabric will help developers manage both on-premises servers and Azure cloud-based services. It includes features from the Dublin app server and the Velocity caching technology. AppFabric will go into beta next year.
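
For readers curious what the Velocity-derived caching side of AppFabric looks like in code, here is a hedged sketch of putting and getting an object from a named distributed cache; the namespace and member names follow the pre-release “Velocity” CTPs and are assumptions on my part, and the cache host settings are taken from configuration.

```csharp
// Sketch: using an AppFabric/"Velocity"-style distributed cache.
// Namespace and member names follow the pre-release CTPs and are
// assumptions; cache host endpoints come from application configuration.
using System;
using Microsoft.ApplicationServer.Caching;

class CacheSketch
{
    static void Main()
    {
        // The factory reads cache host endpoints from configuration.
        var factory = new DataCacheFactory();
        DataCache catalogCache = factory.GetCache("catalog");

        // Put/Get move serialized objects in and out of the shared cache,
        // taking repeated reads off the backing database or service.
        catalogCache.Put("product:42", "Widget, large, blue");
        string cached = (string)catalogCache.Get("product:42");

        Console.WriteLine(cached);
    }
}
```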

One of the biggest technical obstacles in the world of cloud computing is integrating cloud applications with each other and with on-premises systems. Data integration software developer Informatica has released a package of software tools that the company says can help businesses overcome those hurdles.

 

For Informatica’s channel partners and systems integration allies, the new Cloud 9 toolset offers a means of building customized data integration software for customers and assembling data integration links that can be reused in multiple deployments, said Darren Cunningham, senior marketing director for Informatica’s on-demand products.

As more businesses adopt Software-as-a-Service applications and other cloud computing technologies, they find themselves wrestling with the problem of how to link those applications with their existing IT systems. Informatica, a longtime player in the data integration arena, is a natural to fill that role, company executives argue. A number of younger companies are also jumping into the on-demand data integration space, including Boomi and Cast Iron.

Informatica Cloud 9 includes a multitenant, Platform-as-a-Service (PaaS) data integration system that developers and systems integrator partners can use to build and reuse custom data integration services and run them in the cloud, according to the company. Business users can configure data rules or run data mappings built by IT using Informatica Cloud Services for data integration.

Cloud 9 incorporates Informatica’s Cloud Services Winter ’09 release of purpose-built Software-as-a-Service data integration applications for nontechnical users. It also provides Address Quality Cloud services based on technology Informatica acquired when it bought AddressDoctor in June. A new sandbox feature includes data synchronization and replication capabilities for software development and testing projects.

The package also includes new and expanded offerings through Amazon’s Elastic Compute Cloud (EC2) service. The new Informatica Data Quality Cloud Edition, which runs on Amazon EC2, offers data quality services such as profiling, cleansing and matching. Informatica Cloud 9 also supports the recently released Amazon Relational Database Service.

Earlier this month Informatica unveiled Informatica 9, a new release of its core data integration software that’s the foundation for the new cloud computing offering.

The Informatica Cloud Platform is currently available as a beta with the final release scheduled for December priced at $1,000 per month. The Data Quality Cloud Edition is available on Amazon EC2 as a beta with the production release set for next year’s first quarter. The Informatica Address Quality cloud services are available today with pricing based on transaction volumes.

On ChannelWeb, Rick Whiting Writes, Informatica Debuts Cloud Computing Integration Tools, November 26, 2009

A .NET Cloud Computing Applications Versatilist:

The .NET Cloud Computing Applications Versatilist candidate would be someone who:

  1. Has one or more technical specialties (e.g., application programming; designing, composing and consuming services from and in .NET applications).
  2. Has at least a general knowledge of the software development lifecycle.
  3. Has at least a general knowledge of the business domain in which they work.

 

The Versatilist program for .NET Cloud Computing Applications Professional will enhance and empower the candidate with the following skills:

  1. Knowledge of the different cloud computing platforms
  2. Understanding the concept of SaaS
  3. Identifying the benefits and scenarios where cloud computing will be applicable
  4. Detailed understanding of the Cloud Computing platform from Microsoft – Azure Services Platform
  5. Designing, implementing and deploying a solution in the cloud using the Azure platform (a minimal worker-role sketch follows this list)
  6. Creating Service Bus and workflow applications using .NET Services
  7. Using the SQL Database in the cloud with SQL Data Services
  8. Creating Live Mesh applications with Live Services
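
To give a flavor of item 5, here is a minimal sketch of a Windows Azure worker role entry point in C#; it assumes the RoleEntryPoint base class from the Azure SDK’s Microsoft.WindowsAzure.ServiceRuntime assembly and omits the service definition and configuration files a real deployment also needs.

```csharp
// Minimal sketch of a Windows Azure worker role (item 5 above).
// Assumes the RoleEntryPoint base class from the Azure SDK's
// Microsoft.WindowsAzure.ServiceRuntime assembly; the accompanying
// service definition and configuration files are omitted.
using System;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Open listeners, read configuration settings, etc.
        return base.OnStart();
    }

    public override void Run()
    {
        // The Azure fabric keeps this loop alive; you scale out by raising
        // the instance count in the service configuration, not by code.
        while (true)
        {
            Thread.Sleep(10000);
            Console.WriteLine("Processing work items...");
        }
    }
}
```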

 

Microsoft proposes ‘The Cloud Computing Advancement Act’

Today, Brad Smith, senior vice president and general counsel at Microsoft Corp., urged both Congress and the information technology industry to act now to ensure that the burgeoning era of cloud computing is guided by an international commitment to privacy, security and transparency for consumers, businesses and government.

During a keynote speech to the Brookings Institution policy forum, “Cloud Computing for Business and Society,” Smith also highlighted data from a survey commissioned by Microsoft measuring attitudes on cloud computing among business leaders and the general population.

The survey found that while 58 percent of the general population and 86 percent of senior business leaders are excited about the potential of Cloud Computing, more than 90 percent of these same people are concerned about the security, access and privacy of their own data in the cloud. In addition, the survey found that the majority of all audiences believe the U.S. government should establish laws, rules and policies for cloud computing.

At today’s event, Smith called for a national conversation about how to build confidence in the cloud and proposed the Cloud Computing Advancement Act to promote innovation, protect consumers and provide government with new tools to address the critical issues of data privacy and security. Smith also called for an international dialogue on data sovereignty to guarantee to users that their data is subject to the same rules and regulations, regardless of where the data resides.

“The PC revolution empowered individuals and democratized technology in new and profoundly important ways,” said Smith in his keynote address. “As we move to embrace the cloud, we should build on that success and preserve the personalization of technology by making sure privacy rights are preserved, data security is strengthened and an international understanding is developed about the governance of data when it crosses national borders.”

He continued, “Microsoft is committed to fostering the responsible development of cloud computing to ensure that data is accessible, safe and secure. We also need government to modernize the laws, adapt them to the cloud, and adopt new measures to protect privacy and promote security. There is no doubt that the future holds even more opportunities than the present, but it also contains critical challenges that we must address now if we want to take full advantage of the potential of cloud computing.”