The cloud is all about access: on-demand access to the resources your growing business demands, and control over who can access your information. Everybody is talking about it, but is the cloud the right answer for your business needs? And if so, which one?
What is the Cloud, Anyway?
Essentially, the cloud is a network of servers. However, it is important to understand what distinguishes clouds from virtualization, or you risk falling for "cloud washing": technology vendors labeling products as clouds when they are really just basic virtualization setups. Pat O’Day, CTO of Bluelock, a provider of public and private cloud services, says, “True cloud means the users, and not IT, get to control the performance of their applications based on the resources they allocate to them. That is where the cost savings is. Automating deployment and streamlining the human activity previously required to do daily tasks.”
Before deciding between a public or private cloud, it's important to understand what your business needs and what will successfully propel it into the future. ISACA recommends looking at cloud computing as a business strategy, not just as an IT project. Be sure to:
- Weigh both the value and opportunity costs
- Check potential conflicts with company culture and processes
- Confirm the right tools and skills are available to support it
- Put reporting mechanisms in place to measure the value and the risks
The more functions and applications you deem critical, the more your final plan will cost and the longer it will take to execute. A good way to identify what is truly critical is to consider the damage that would result if an application were unavailable for a period of time, in terms of lost revenue, brand damage, and regulatory compliance.
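To make that damage assessment concrete, here is a minimal back-of-the-envelope sketch. All figures and the function name are invented for illustration; brand damage is real but deliberately left out because it resists quantification.

```python
# Hypothetical sketch: rough downtime-impact estimate for triaging which
# applications are truly critical. All figures are illustrative assumptions.

def downtime_cost(revenue_per_hour, hours_down, compliance_penalty=0):
    """Estimate the direct cost of an outage: lost revenue plus any
    flat regulatory penalty. Ignores harder-to-quantify brand damage."""
    return revenue_per_hour * hours_down + compliance_penalty

# Example: an order-entry app earning $5,000/hour, down for 4 hours,
# with a flat $10,000 regulatory penalty.
print(downtime_cost(5000, 4, 10000))  # prints 30000
```

Ranking applications by a figure like this, even a crude one, makes the "what is truly critical" conversation far less subjective.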
Keep in mind, the risk of cloud adoption may be inconsequential when compared with the potential that effective and strategic use of cloud computing can bring.
There are several options available, depending on your needs. Essentially, the public cloud is external to the consumers’ organizations, while private clouds are operated exclusively for a single organization. A community cloud is similar to a private cloud, but the exclusivity of infrastructure and computational resources is for two or more organizations that have common privacy, security, and regulatory considerations. Hybrid clouds are more complex, since they involve a composition of two or more clouds (private, community, or public). Here we will discuss two options in more detail, the private and public cloud.
The Public Cloud
With the public cloud, a service provider makes resources, such as applications and storage, available to the general public over the Internet. Public cloud services may be free or offered on a pay-per-usage model. Examples of public clouds include Amazon Elastic Compute Cloud (EC2), IBM's Blue Cloud, Sun Cloud, Google App Engine, and Windows Azure Services Platform.
The main benefits of using a public cloud service are:
- Easy and inexpensive set-up
- Scalability to meet needs
- No wasted resources because you pay for what you use
The public cloud offers the highest level of scalability because companies can leverage the large pool of resources supported by shared environments. However, you don't get to determine your own security provisions, and a breach can still cause compliance violations. In theory, the public cloud could be more reliable because it leverages a larger pool of computing resources that eliminates single points of failure. In practice, though, it is often perceived as less secure because the infrastructure belongs to the provider.
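The pay-for-what-you-use model can be weighed against a private cloud's fixed costs with simple break-even arithmetic. A minimal sketch, with rates invented purely for illustration:

```python
# Hypothetical sketch: compare metered public cloud spend against a
# private cloud's fixed monthly cost. All rates are invented examples.

def monthly_public_cost(hours_used, rate_per_hour):
    """Metered cost: you pay only for what you use."""
    return hours_used * rate_per_hour

def breakeven_hours(private_monthly_cost, public_rate_per_hour):
    """Usage level at which a private cloud's fixed monthly cost
    equals the public cloud's pay-per-use cost."""
    return private_monthly_cost / public_rate_per_hour

# A hypothetical $7,300/month private setup vs. $0.10/hour public usage:
print(breakeven_hours(7300, 0.10))   # instance-hours per month to break even
print(monthly_public_cost(500, 0.10))  # light usage stays cheap on public
```

Below the break-even point, pay-per-use wins; above it, the fixed-cost private option starts to look better, which foreshadows the compute-intensive case discussed under the private cloud.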
The Private Cloud
In the private cloud, IT resources are dedicated to a specific company instead of being shared. According to the National Institute of Standards and Technology, private clouds have several differentiating attributes: a private cloud may be managed or hosted by the organization itself or by a third party, and it has the potential to give the organization greater control over the infrastructure, computational resources, and cloud consumers than a public cloud. Many private cloud technology vendors are starting to build high availability into their offerings by eliminating single points of failure and allowing hot-swapping functionality.
A private cloud will cost more to implement initially because companies need to make capital infrastructure investments. Additionally, management costs may outpace the subscription expenses of a public cloud because businesses must allocate internal resources to manage the hardware. Still, if you have an application that is very compute-intensive, or one that needs non-standard virtual machines with increased I/O or memory, it is often more economical to run it in a private cloud setting.
If your cloud will deal with a lot of sensitive information that is subject to compliance rules, a private deployment may be the safest option. Tom Roloff, senior vice president of EMC's Global Services business, predicts that this barrier will be eliminated within five years, because security is becoming a major competitive differentiator for cloud vendors.
The Cloud and Disaster Recovery
The traditional strategy for ensuring high availability was clustering: groups of nearly identical machines connected to a local area network, which is expensive to maintain and cost-prohibitive for many organizations. The cloud can achieve similar results using an easy-to-deploy and highly scalable virtualized configuration. Traditional clustering may still make more sense than leveraging a private cloud for applications that require immediate recovery, but more companies are using the two strategies together for a cost-effective disaster recovery and business continuity solution.
Some businesses also leverage the public cloud as a hot site for disaster recovery (DR). Using the technology in this way reduces reliability concerns, since the third-party infrastructure is not responsible for handling active environments. The Acronis Global Disaster Recovery Index 2012 revealed that the market for cloud-based DR is gaining momentum, with the top three benefits of using the cloud for backup and disaster recovery being lower IT operating costs (50%), additional or flexible storage space (20%), and improved compliance. Human error remains the biggest cause of system downtime, at 60%.
Evaluate and Conquer
Companies should approach the cloud as they do their other investments: with a business case and clearly defined expectations from the beginning. Stakeholders can start by outlining clear use cases for their potential cloud deployments. For example, organizations planning to move information that is subject to numerous compliance mandates will want to vet providers based on their understanding of those regulations, or go with a private cloud to adequately protect their data.
Once a clear plan is formed, it's important to ask cloud providers precise and thorough questions about their services. Do not hesitate to ask about security certifications or to question how uptime is measured: some providers have SLA loopholes that only count downtime lasting more than a specified time.
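To see why the measurement method matters, here is a hypothetical sketch (not any particular provider's formula) showing how a minimum-outage threshold changes reported availability:

```python
# Hypothetical sketch of an SLA "loophole": some providers only count
# outages longer than a threshold, which inflates reported availability.

def reported_availability(outage_minutes, period_minutes, min_counted=0):
    """Percent availability, counting only outages longer than
    min_counted minutes (the loophole)."""
    counted = sum(m for m in outage_minutes if m > min_counted)
    return 100.0 * (period_minutes - counted) / period_minutes

outages = [5, 8, 30]       # three outages during the month, in minutes
month = 30 * 24 * 60       # a 30-day month: 43,200 minutes

print(reported_availability(outages, month))      # counts all 43 minutes
print(reported_availability(outages, month, 10))  # counts only the 30-minute outage
```

Two short outages simply vanish from the second figure, so a provider can report higher availability than users actually experienced. Asking how downtime is counted closes that gap.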
Optimize the Cloud
Once you’ve chosen the cloud that works best for your organization, you will need the right tools to optimize its benefits. Automate Schedule is powerful yet affordable job scheduling software developed to make the cloud work for you. This reliable enterprise job scheduler is easily accessed through a web browser. No matter which cloud solution you choose, you can automate cross-platform scheduling easily with the user-friendly central monitoring system and event-driven scheduler. If there is a system error, you will be notified and can find and fix the problem quickly. Automate Schedule's role-based security ensures that your users can access only what they need to perform their jobs, and its audit history helps you meet compliance requirements for Sarbanes-Oxley (SOX), PCI, and HIPAA. Automate Schedule tracks who created a new job or event monitor, who changed a job's setup or commands, and who forced a job to run outside its scheduled time. It also creates easy-to-pull reports for yourself or your auditors.
Consider how to take advantage of what the cloud has to offer. Automate Schedule can work with the cloud to make your life easier and your business more efficient. Not every cloud has a silver lining, but yours can.
Making the Data Integration Process More Efficient
Estimates have “big data” doubling every two years until 2020, when it reaches 40 trillion gigabytes. Analysts say the number of different files containing information is growing faster than the digital ecosystem itself. This creates an enormous burden for IT staff to manage, particularly as their companies turn toward analytics initiatives and business intelligence solutions. This is a tremendous task, and to make integration lean, it is important to understand the challenges.
Data Integration (DI) Challenges
The U.S. Department of Transportation’s primer on data integration provides a comprehensive list of the DI challenges that have come from this explosion of information. The short list includes:
- Heterogeneous Data: In an Intel survey of 200 IT managers, 84 percent of respondents said they were analyzing unstructured data. Of the managers who weren’t, 44 percent said they expected to in the next 12 to 18 months.
- Bad Data: Low-quality data is likely to generate poor insight. The Department of Defense’s Guidelines for Data Quality Management outline several metrics for assessing data quality.
- Excessive Costs: Managing disparate systems is time-consuming, and over time this can quickly drive a DI project beyond its allocated budget. In 2008, Gartner estimated that some companies could save $250,000 or more by consolidating tools or replacing current software with lower-cost options.
If DI is taking an excessive amount of time, it is likely due to one or more of the following:
- Developers have to write custom code to integrate disparate platforms
- The business is using multiple diverse solutions
- IT must rely on manual processes either for DI or delivering business intelligence reports
- The existing DI tool is too complex to use efficiently
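As a small illustration of the first point, this is the kind of hand-written "glue" code that accumulates when platforms don't integrate natively. The formats and field names are invented for the example; real integrations multiply this across dozens of system pairs.

```python
# Hypothetical sketch of custom DI glue code: one system exports CSV,
# another expects JSON, so a developer writes (and maintains) a bridge.
import csv
import io
import json

def csv_to_json_records(csv_text):
    """Convert one system's CSV export into another's JSON import format."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows)

export = "id,amount\n1,9.99\n2,4.50\n"
print(csv_to_json_records(export))  # a JSON array of record objects
```

Each such bridge is trivial on its own, but every new platform pair adds another one to write, test, and maintain, which is exactly how custom-code DI eats developer time.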
An operational shift can necessitate new technology to improve visibility and increase efficiency. Being proactive with lean data management procedures can ease the burden created by bulky applications and tasks.
It is important to clearly understand the company’s existing IT ecosystem. Going through this process at the planning phase can help identify the must-haves, as DI tools come in all shapes and sizes. For example, most come with some form of job scheduler, but the devil is in the details. Knowing what you need in the beginning can save a significant amount of time and money.
In its analysis of the DI vendor landscape, InfoTech Research identified several key considerations for evaluating these tools:
- Real-time integration
- Data cleansing functionality
- Post-failure integration recovery
- Performance monitoring
- Middleware capability
- Data semantics
Because data relevance now depends so heavily on the speed at which it is delivered, it is increasingly important to have dynamic control over when DI tasks start and continue. Bringing DI tasks together with other IT jobs makes workflow planning easier and provides visibility into the jobs running on the system.
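Planning DI tasks alongside other IT jobs amounts to dependency-ordered execution. A minimal sketch using Python's standard-library graphlib (Python 3.9+); the job names and dependencies are invented for illustration:

```python
# Hypothetical sketch: a unified workflow plan where DI tasks and other
# IT jobs run in dependency order. Job names are illustrative only.
from graphlib import TopologicalSorter

jobs = {
    "extract_sales":  set(),                              # DI: pull from source
    "cleanse_sales":  {"extract_sales"},                  # DI: fix bad data
    "load_warehouse": {"cleanse_sales"},                  # DI: load target
    "nightly_backup": set(),                              # non-DI IT job
    "bi_report":      {"load_warehouse", "nightly_backup"},  # BI delivery
}

order = list(TopologicalSorter(jobs).static_order())
print(order)  # every job appears after all of its dependencies
```

With one shared plan like this, the BI report can never run against a half-loaded warehouse, and operators see DI and non-DI jobs in a single ordered view.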
Enterprise Job Scheduling Solutions
One helpful tool is enterprise job scheduling software. It offers:
- Centralized visibility and management over IT operations
- More flexible handling of exceptions to scheduled tasks
- Better workload management
- Exceptions based on dates, holidays, and time zones
- Event-based scheduling
- More visibility of the jobs being run
- Generation of audit reports on demand
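Event-based scheduling, in its simplest form, means triggering a job when something happens rather than at a fixed time. Here is a minimal sketch of the idea, a file-arrival trigger; it is a generic illustration, not how any particular scheduler is implemented:

```python
# Hypothetical sketch of event-based scheduling: run a job as soon as a
# matching file appears, instead of waiting for a fixed clock time.
import time
from pathlib import Path

def watch_and_run(directory, pattern, job, poll_seconds=1, timeout=10):
    """Watch a directory and run `job` once on the first matching file.
    Returns the file's path, or None if nothing arrived before timeout."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        for f in Path(directory).glob(pattern):
            job(f)       # e.g. kick off the DI load for this file
            return f
        time.sleep(poll_seconds)
    return None
```

A production scheduler adds durability, retries, and notifications around this core loop, but the payoff is the same: downstream jobs start the moment their input is ready instead of at the next scheduled slot.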
Enterprise job schedulers, such as Automate Schedule, are a strong component of the DI toolset. Centralizing and integrating workflow schedules into the rest of IT operations lessens the burden on IT staff, significantly reduces errors, and improves time-to-delivery. Automate Schedule's advanced and affordable job scheduler includes support for Informatica PowerCenter, SAP® NetWeaver, Oracle® E-Business Suite, and numerous Windows applications. This makes it easy to incorporate job scheduling software into your IT ecosystem, improve DI processes, save money, increase operational efficiency, and overcome the growing challenges of Big Data.