Tuesday, July 3, 2012

2nd Watch Named Amazon Web Services Advanced Consulting Partner

2nd Watch has been recognized by Amazon Web Services™ (AWS) as an AWS Advanced Consulting Partner. AWS reserves the Advanced Consulting Partner designation for companies like 2nd Watch that demonstrate a commitment to expert cloud solutions. We are honored to be one of the first AWS partners ever to be publicly recognized with the designation!

Friday, June 29, 2012

Backups: To the Cloud!

Amazon Web Services is adding new functionality on a weekly basis, and a growing list of 3rd party applications has emerged to leverage these new capabilities.  Many of these applications are forgettable, but one 3rd party utility has remained a constant around the 2nd Watch offices: Cloudberry Explorer.  Now anyone with an internet connection and a laptop can achieve a level of data durability that eluded most enterprises a decade ago.  Let's exploit this newfound power by creating an automated backup script!

There are three main components to this workflow:
  • Timer (CRON equivalent).  For this we will use Windows Task Scheduler.
  • Scripting shell.  Since we are on Windows, it is all PowerShell.
  • S3 interface.  Cloudberry and its awesome PowerShell snap-in.
First, we will install Cloudberry Explorer, available from CloudBerry Lab.  The PowerShell snap-in will install automatically on Windows 2008 or later.
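
If you want to verify that the snap-in is available before scripting against it, you can list the snap-ins registered on the machine and look for the CloudBerry entry in the output:

      Get-PSSnapin -Registered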

Next, we need to create a PS script which will execute our desired action.  We will start by changing PowerShell’s default execution policy to allow scripts.  Open up PowerShell and enter the command:
        Set-ExecutionPolicy RemoteSigned

Next, create a new file in c:\scripts called sync.ps1, right click on it and choose edit.  Now we will add the magic:
      $key = "YourAccessKeyIDGoesHere"
      $secret = "YourSecretAccessKeyGoesHere"
      $s3bucket = "YourS3Bucket"
      $localFolder = "c:\test\"

      # Load the Cloudberry cmdlets
      Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn

      # Connect to S3, then select the destination bucket and source folder
      $s3 = Get-CloudS3Connection -Key $key -Secret $secret
      $destination = $s3 | Select-CloudFolder -Path $s3bucket
      $src = Get-CloudFilesystemConnection | Select-CloudFolder -Path $localFolder

      # Sync local changes up to the S3 bucket
      $src | Copy-CloudSyncFolders $destination -IncludeSubfolders


As you can see in the code above, we start by defining variables for our two AWS keys, followed by variables for the S3 bucket and the local folder we want to synchronize.  The Add-PSSnapin line simply informs PowerShell that we're going to be invoking Cloudberry-specific commands.  Next you can see those Cloudberry-specific commands in action: Get-CloudS3Connection establishes the connection with our AWS account, the first Select-CloudFolder sets the S3 bucket as the destination, and the second sets the folder "c:\test" as the source.  Finally, Copy-CloudSyncFolders synchronizes the two folders; in this case we're telling Cloudberry to compare the S3 bucket with our local folder and, if there are changes in the local folder, copy them up to the S3 bucket.  We can reverse this behavior by inverting $src and $destination ($destination | Copy-CloudSyncFolders $src -IncludeSubfolders).  Note that there are numerous flags which change the way Copy-CloudSyncFolders runs, including two-way sync, MD5 hashing (paid version only), etc.  Check the Cloudberry documentation for a full list.

Create a batch script called BatchProcess.bat and place it in your c:\scripts folder.  Add this single line to the batch file:

      powershell.exe C:\scripts\sync.ps1
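
As an aside, if you would rather not change the machine-wide execution policy as we did earlier, the batch file can instead pass the policy and script path explicitly using standard powershell.exe switches:

      powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\scripts\sync.ps1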

Now that we have a sync script and a batch file to call it, we need a timer to automate everything.  Go to Control Panel -> Administrative Tools -> Task Scheduler and choose “Create Task”.


Make sure you choose “Run whether user is logged on or not” and check “Run with highest privileges”.  Also be sure to select the proper OS in the “Configure for” dropdown at the bottom; it doesn’t default to the OS you’re using.  Next, select the “Triggers” tab and create a new trigger, changing it to run every 5 minutes.  Note that you can also change “Begin the task” to run on start-up, which is useful for auto-scaling groups.  Finally, go to the “Actions” tab and create a new action, selecting the BatchProcess.bat file you created above.
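
If you prefer the command line, an equivalent task can be sketched with the built-in schtasks utility.  The task name below is just an example; run this from an elevated prompt and check schtasks /Create /? for the full syntax:

      schtasks /Create /TN "S3Sync" /TR "C:\scripts\BatchProcess.bat" /SC MINUTE /MO 5 /RU SYSTEM /RL HIGHEST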


Now we have a sync script which will automatically compare our local folder (c:\test) with a remote S3 bucket (YourS3Bucket) and copy any deltas up to S3.  This is a great way to automate backups.

Here is a helpful link on PowerShell scripting fundamentals:

http://www.techrepublic.com/blog/10things/10-fundamental-concepts-for-powershell-scripting/2146

Thursday, June 28, 2012

AWS Interviews 2nd Watch President, Jeff Aden

AWS recently interviewed Jeff Aden about how 2nd Watch is utilizing AWS to help businesses save money and be competitive on the cloud. Learn about 2nd Watch services and products, like 2W Insight, that could help your business.

Wednesday, June 27, 2012

What is the Real Value of Microsoft Office 365?

As a mid-market or enterprise business owner or IT manager, you might question whether or not your business really can benefit from using Microsoft Office 365. What is the business value you'll actually get out of the product? Will you see any real cost savings by using it? The short answer is yes.

Customers are moving to Office 365 because it offers more than just a way to work collaboratively from anywhere in the world. Users can manage projects, co-author documents and communicate in real time with the same Microsoft tools they already know, backed by 24/7 reliability and enterprise-class security.



Office 365 simplifies IT management while increasing productivity. Your IT team can save time by controlling service configuration and user access while Microsoft handles your routine server administration tasks. All of this is done securely, with server-level antivirus, enterprise-grade reliability, DR capabilities and geo-redundant datacenters, with a guaranteed 99.9% uptime, resulting in higher productivity and efficiency for your business.


Beyond business and productivity value, you will see significant cost savings with Office 365. By moving software and services off local machines and onto Microsoft-hosted servers, you can lower hardware overhead costs, decrease electricity costs, enable IT budget predictability, and simplify IT system management.

If you're interested in watching a live demo of Office 365 or learning how it can contribute to your business objectives, join 2nd Watch and Microsoft for a live seminar at the Davenport in downtown Spokane in July. Check back for dates and registration information.

Friday, May 4, 2012

AWS Tip of the Month - Free CloudWatch Monitoring

AWS has a great monitoring service called CloudWatch.  CloudWatch includes basic monitoring of your infrastructure at no additional charge.  Below you can see a graph of CPU utilization across a few of our running instances:

If you add name tags to your EC2 instances, CloudWatch can be a great tool for quickly understanding what your servers are doing and whether you need to address any load or application issues.

Here is a graph of our disk IO on our mounted server storage (EBS):

Another benefit of CloudWatch is that you can set alarms at specific thresholds (these do cost money) to trigger a notification or event based on something happening to your AWS infrastructure (e.g. CPU spiking to 100% for over 2 minutes).
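
As a sketch, that CPU example could be created with Amazon's CloudWatch command line tools.  The alarm name, instance ID, and SNS topic ARN below are placeholders, so check the mon-put-metric-alarm documentation for the exact syntax:

      mon-put-metric-alarm HighCPUAlarm --metric-name CPUUtilization --namespace "AWS/EC2" --statistic Average --period 60 --evaluation-periods 2 --threshold 100 --comparison-operator GreaterThanOrEqualToThreshold --dimensions "InstanceId=i-xxxxxxxx" --alarm-actions arn:aws:sns:us-east-1:123456789012:my-topic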

We find that most of our clients have an existing monitoring system in place that they want to continue to use, and we integrate these tools with AWS.  CloudWatch is helpful for a proof of concept, but it is also the only monitoring solution for AWS-specific infrastructure (ELB, SimpleDB, etc.) and can be a helpful tool in ensuring your application is running as desired.

-KB

Tuesday, May 1, 2012

Security On the Cloud

As more and more companies migrate their IT infrastructure to the cloud, the main cloud-related concerns for businesses continue to be security, data control, and reliability. There are several factors to consider with any technological advancement. Most of these cloud-related concerns are not new and, with well-planned risk management, can be avoided to ensure data is both available and protected.

An ISACA Emerging Technology White Paper notes some common risk factors and solutions businesses should consider when making the move to the cloud.
• Enterprises need to be particular in choosing a provider. Reputation, history and sustainability should all be factors to consider.
• The dynamic nature of cloud computing may result in confusion as to where information actually resides. When information retrieval is required, this may create delays.
• Public clouds allow high-availability systems to be developed at service levels often impossible to create in private networks. The downside to this availability is the potential for commingling of information assets with other cloud customers, including competitors.

Companies should have a risk management program that is able to deal with continuously evolving risks. An experienced provider can deliver useful strategies for mitigating these risks. For example, requirements for disaster recovery should be communicated between the business and the provider. Having a detailed Service Level Agreement will help the company manage its data once it migrates to the cloud as well as outline expectations regarding the handling, usage, storage and availability of information. Companies should also consider their security and management options when choosing a public, private or hybrid cloud. What are the pros and cons of each?

Public Cloud
  • Pros: Because infrastructure is maintained outside of the organization, public clouds offer the greatest level of cost savings and efficiency, and provide the ability to add capacity as needed.  The public cloud has commoditized traditional technology infrastructure.
  • Cons: You share this cloud infrastructure with other users, potentially including competitors.  Consider the sensitivity of the data to be stored on a public cloud and use encryption where required to protect corporate assets.

Private Cloud
  • Pros: Because infrastructure is maintained on a private network, private clouds offer the greatest level of security and control.  You own not only the data but also the cloud that houses it.
  • Cons: Provides lower cost savings than a public cloud, and the infrastructure lifecycle has to be managed.

Hybrid Cloud
  • Pros: Includes a mix of public and private storage and server infrastructure.  Different parts of your business data can be stored on different clouds, ensuring high security or efficiency where needed.
  • Cons: You have to keep track of multiple platforms and ensure all parts can communicate with each other.

By keeping these factors in mind you can ensure a smooth and successful transition to the cloud with secure and easy access to your data.

Monday, April 30, 2012

Recently I was in NY at the AWS Summit and heard about the company Kris described below.  You can view the slides from the presentation at the link below.  In short, a 50,000-core Cycle Computing cluster cost this company $4,828 per hour.  This is how AWS is changing the way we think about business and what is possible for us to accomplish.

http://www.slideshare.net/AmazonWebServices/aws-customer-presentation-cycle-computing-aws-summit-2012-nyc

Jeff