Tuesday, July 3, 2012

2nd Watch Named Amazon Web Services Advanced Consulting Partner

2nd Watch has been recognized by Amazon Web Services™ (AWS) as an AWS Advanced Consulting Partner. AWS reserves the Advanced Consulting Partner designation for companies like 2nd Watch that demonstrate a commitment to expert cloud solutions. We are honored to be one of the first AWS partners ever to be publicly recognized with the designation!

Friday, June 29, 2012

Backups: To the Cloud!

Amazon Web Services is adding new functionality on a weekly basis, and a growing list of third-party applications has emerged to leverage these new capabilities.  Many of these applications are forgettable, but one third-party utility has remained a constant around the 2nd Watch offices: Cloudberry Explorer.  Now anyone with an internet connection and a laptop can achieve data durability that eluded most enterprises a decade ago.  Let's exploit this newfound power by creating an automated backup script!

There are three main components to this workflow:
  • Timer (cron equivalent).  For this we will use Windows Task Scheduler.
  • Scripting shell.  Since we are on Windows, it is all PowerShell.
  • S3 interface.  Cloudberry and its awesome PowerShell snap-in.
First, install Cloudberry Explorer.  The PowerShell snap-in will install automatically on Windows 2008 or later.

Next, we need to create a PS script which will execute our desired action.  We will start by changing PowerShell’s default execution policy to allow scripts.  Open up PowerShell and enter the command:
        Set-ExecutionPolicy RemoteSigned

Next, create a new file in c:\scripts called sync.ps1, right-click on it and choose Edit.  Now we will add the magic:
      $key = "YouAccessKeyIDGoesHere"
      $secret = "YourSecretAccessKeyGoesHere"
      $s3bucket = "YourS3Bucket"
      $localFolder = "c:\test\"

      Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn

      $s3 = Get-CloudS3Connection -Key $key -Secret $secret
     $destination = $s3 | Select-CloudFolder -path $s3bucket
     $src = Get-CloudFilesystemConnection | Select-CloudFolder -path $localFolder
     $src | Copy-CloudSyncFolders $destination –IncludeSubfolders


As you can see in the code above, starting from the top, we define variables for our two AWS keys, followed by variables for the S3 bucket and the local folder we want to synchronize.  The Add-PSSnapin line simply tells PowerShell that we are going to invoke Cloudberry-specific commands.  Then those commands go to work: Get-CloudS3Connection establishes the connection to our AWS account, the first Select-CloudFolder sets the S3 bucket as the destination, the second sets the folder "c:\test" as the source, and Copy-CloudSyncFolders synchronizes the two.  In this case we are telling Cloudberry to compare the S3 bucket with our local folder and, if there are changes in the local folder, copy them up to the S3 bucket.  We can reverse this behavior by swapping $src and $destination in that last line.  Note that there are numerous flags that change the way Copy-CloudSyncFolders runs, including two-way sync, MD5 hashing (paid version only), etc.  Check out the Cloudberry documentation for a full list.

Create a batch script called BatchProcess.bat and place it in your c:\scripts folder.  Add this single line to the batch file:

      powershell.exe C:\scripts\sync.ps1

Now that we have a sync script and a batch file to call it, we need a timer to automate everything.   Go to Control Panel -> Administrative Tools -> Task Scheduler and choose "Create Task".


Make sure you choose "Run whether user is logged on or not" and check "Run with highest privileges".  Also be sure to select the proper "Configure for" dropdown at the bottom; it doesn't default to the OS you're using.  Next, select the "Triggers" tab and create a new trigger set to run every 5 minutes.  Note that you can also change "Begin the task" to run at startup, which is useful for auto-scaling groups.  Finally, go to the "Actions" tab and create a new action that points to the BatchProcess.bat file you created above.
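If you'd rather script the scheduler setup as well (handy if you are baking this into an AMI), the same task can be created from an elevated command prompt.  This is just a sketch - the task name is arbitrary and the schedule mirrors the 5-minute trigger described above:

      schtasks /Create /TN "S3SyncBackup" /TR "C:\scripts\BatchProcess.bat" /SC MINUTE /MO 5 /RU SYSTEM /RL HIGHEST

Running the task as SYSTEM covers the "run whether user is logged on or not" behavior without having to store a password.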


Now we have a sync script which will automatically compare our local folder (c:\test) with a remote S3 bucket (YourS3Bucket) and copy any deltas up to S3.  This is a great way to automate backups.

Here is a helpful link on PowerShell scripting fundamentals:

http://www.techrepublic.com/blog/10things/10-fundamental-concepts-for-powershell-scripting/2146

Thursday, June 28, 2012

AWS Interviews 2nd Watch President, Jeff Aden

AWS recently interviewed Jeff Aden about how 2nd Watch is utilizing AWS to help businesses save money and be competitive on the cloud. Learn about 2nd Watch services and products, like 2W Insight, that could help your business.

Wednesday, June 27, 2012

What is the Real Value of Microsoft Office 365?

As a mid-market or enterprise business owner or IT manager, you might question whether your business can really benefit from using Microsoft Office 365. What business value will you actually get out of the product? Will you see any real cost savings by using it? The short answer is yes.

Customers are moving to Office 365 because it offers more than just a way to work collaboratively from anywhere in the world. Users can manage projects, co-author documents and communicate in real time from virtually anywhere, all with the same Microsoft tools they already know, backed by 24/7 reliability and enterprise-class security.



Office 365 simplifies IT management while increasing productivity. Your IT team can save time by controlling service configuration and user access while Microsoft handles routine server administration tasks. All of this is done securely, with server-level antivirus, enterprise-grade reliability, DR capabilities and geo-redundant datacenters, backed by a guaranteed 99.9% uptime, resulting in higher productivity and efficiency for your business.


Beyond business and productivity value, you will see significant cost savings with Office 365. By moving software and services off local machines and onto Microsoft-hosted servers, you can lower hardware overhead costs, decrease electricity costs, enable IT budget predictability, and simplify IT system management.

If you're interested in watching a live demo of Office 365 or learning how it can contribute to your business objectives, join 2nd Watch and Microsoft for a live seminar at the Davenport in downtown Spokane in July. Check back for dates and registration information.

Friday, May 4, 2012

AWS Tip of the Month - Free CloudWatch Monitoring

AWS has a great monitoring service called CloudWatch.  CloudWatch includes basic monitoring of your infrastructure at no additional charge.  Below you can see a graph of CPU utilization across a few of our running instances:

If you add name tags to your EC2 instances, CloudWatch can be a great tool for quickly understanding what your servers are doing and whether you need to address any load or application issues.

Here is a graph of our disk IO on our mounted server storage (EBS):

Another benefit of CloudWatch is that you can set alarms at specific thresholds (these do cost money) to trigger a notification or action when something happens to your AWS infrastructure (e.g. CPU pegged at 100% for over 2 minutes).
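For reference, here is roughly what creating an alarm like that looks like outside the console, sketched with the AWS Tools for PowerShell.  The instance ID and SNS topic ARN are placeholders, and cmdlet and parameter names may differ slightly between module versions, so treat this as illustrative rather than copy-paste ready:

      # Point the alarm at a specific instance (placeholder ID)
      $dim = New-Object Amazon.CloudWatch.Model.Dimension
      $dim.Name = "InstanceId"
      $dim.Value = "i-12345678"

      # Alarm when average CPU sits at or above 99% for two consecutive 1-minute periods
      Write-CWMetricAlarm -AlarmName "cpu-pegged" -Namespace "AWS/EC2" -MetricName "CPUUtilization" `
          -Dimension $dim -Statistic Average -Period 60 -EvaluationPeriod 2 -Threshold 99 `
          -ComparisonOperator GreaterThanOrEqualToThreshold `
          -AlarmAction "arn:aws:sns:us-east-1:123456789012:ops-alerts"   # placeholder SNS topic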

We find that most of our clients have an existing monitoring system in place that they want to continue to use, and we integrate these with AWS.  CloudWatch is helpful for a proof of concept, but it is also the only monitoring solution for AWS-specific infrastructure (ELB, SimpleDB, etc.) and can be a helpful tool for ensuring your application is running as desired.

-KB

Tuesday, May 1, 2012

Security On the Cloud

As more and more companies migrate their IT infrastructure to the cloud, the main cloud-related concerns for businesses continue to be security, data control, and reliability. There are several factors to consider with any technological advancement, but most of these cloud-related concerns are not new and, with well-planned risk management, can be addressed to ensure data is both available and protected.

An ISACA Emerging Technology White Paper notes some common risk factors and solutions businesses should consider when making the move to the cloud.
• Enterprises need to be particular in choosing a provider. Reputation, history and sustainability should all be factors to consider.
• The dynamic nature of cloud computing may result in confusion as to where information actually resides. When information retrieval is required, this may create delays.
• Public clouds allow high-availability systems to be developed at service levels often impossible to create in private networks. The downside to this availability is the potential for commingling of information assets with other cloud customers, including competitors.

Companies should have a risk management program that is able to deal with continuously evolving risks. An experienced provider can deliver useful strategies for mitigating these risks. For example, requirements for disaster recovery should be communicated between the business and the provider. Having a detailed Service Level Agreement will help the company manage its data once it migrates to the cloud as well as outline expectations regarding the handling, usage, storage and availability of information. Companies should also consider their security and management options when choosing a public, private or hybrid cloud. What are the pros and cons of each?

Public Cloud
• Pros: Because infrastructure is maintained outside the organization, public clouds offer the greatest level of cost savings and efficiency and provide the ability to add capacity as needed. The public cloud has commoditized traditional technology infrastructure.
• Cons: You share this cloud infrastructure with other users, potentially including competitors. Consider the sensitivity of the data to be stored on a public cloud and use encryption where required to protect corporate assets.

Private Cloud
• Pros: Because infrastructure is maintained on a private network, private clouds offer the greatest level of security and control. You own not only the data but also the cloud that houses it.
• Cons: Provides lower cost savings than a public cloud, and the infrastructure lifecycle has to be managed.

Hybrid Cloud
• Pros: Includes a mix of public and private storage and server infrastructure. Different parts of your business data can be stored on different clouds, ensuring high security or efficiency where needed.
• Cons: You have to keep track of multiple platforms and ensure all parts can communicate with each other.

By keeping these factors in mind, you can ensure a smooth and successful transition to the cloud with secure and easy access to your data.

Monday, April 30, 2012

Recently I was in NY at the AWS Summit and heard about the company Kris describes below.  You can view the slides from the presentation at the link below. In short, a 50,000-core cluster from Cycle Computing cost this company $4,828 per hour.  This is how AWS is changing the way we think about business and what is possible for us to accomplish.

http://www.slideshare.net/AmazonWebServices/aws-customer-presentation-cycle-computing-aws-summit-2012-nyc

Jeff

Friday, April 27, 2012

How Cloud Computing is transforming scientific R&D!

Every once in a while you run across something to do with Cloud Computing that makes you stop and say, "Wow, that is cool!"

Last week we had the privilege of being at the AWS Summit in New York to launch the beta of 2W Insight, our new billing application for AWS.  While at the summit we had a chance to listen to how innovative companies are using AWS to solve real business problems.

The most interesting of these to me was a company called Cycle Computing.  Jason Stowe from Cycle Computing was on stage during the keynote to describe the AWESOME work he and his team are doing building supercomputers on AWS.  I've posted lots of blogs on how to run a server or utility for less than $1 a month on AWS, and while these are neat tricks, they pale in comparison to the social importance of Jason and his team's work.

Cycle Computing built a supercomputer on AWS with over 50,000 cores to do cancer research.  I will explain it terribly, so please go over to their blog - blog.cyclecomputing.com - and read about it yourself.  It is impressive, to say the least.

Jason - keep it up!!

-Kris

Tuesday, April 24, 2012

Corner Booth Media Features 2nd Watch Case Study

Corner Booth Media has featured our Gravity Jack case study video in their blog "Are Case Study Videos Right for You?"

Corner Booth Media has helped us create all of our case study and testimonial videos, and they have been a great resource for us in getting the word out about 2nd Watch. As the blog quotes Jeff saying, "It’s easy for us to tell people how great we think we are, but much more effective when our clients do it." They've provided us with a really effective medium to let our clients speak on our work.

Thursday, April 19, 2012

2nd Watch Unveils Beta Billing Application Today

Kris and Jeff are at a summit in New York today to unveil the beta version of 2nd Watch's new cloud billing application, 2W Insight. 2W Insight will allow AWS customers to manage billing and consolidated billing environments in an easy to understand, streamlined manner. It will give customers easy visibility into their AWS bills - no more confusion in deciphering your bills!

2W Insight updates billing costs on a regular schedule, giving accountants the most accurate billing information of any tool on the market - not just estimates. 2W Insight provides an organization's management with simple, clean and easy-to-understand summaries of AWS use. Services are organized into compute, storage and network categories, with charges summarized by region, size and VPC. 2W Insight also enables analysts to create more detailed drilldowns of AWS charges, giving them control over how they view and organize their bill. Users can sort and filter detailed cost and usage information, view instance-level details and see costs across multiple departments. The printing feature also allows users to print a PDF summary of bills, delivering easy reporting capabilities.

Are you an AWS Solution Provider? The application will also be available as a platform to manage customer billing. It will generate accurate bills across the consolidated billing setup, enabling solution providers to apply AWS pricing to end consumers in a consumer-friendly format.

We are very excited about the new application and plan to have a production version available in early June.

Monday, April 16, 2012

Launching this week....

2nd Watch is headed to NY this week. We will be unveiling something big at the Amazon Web Services Summit… stay tuned.

Friday, March 30, 2012

The Cloud creates jobs AND ideas!

Microsoft sponsored a study with IDC about job growth related to Cloud computing.  The headline - "Cloud Computing to create 14 Million New Jobs by 2015".

You can read the entire article here - http://www.microsoft.com/presspass/features/2012/mar12/03-05CloudComputingJobs.mspx

The article has a bit of a sales pitch in it - it quotes several partners and the VP of enterprise sales.  That aside, it's an interesting take on something I believe in strongly - the power of Cloud Computing to enable creators to create in a way not previously possible!

Here at 2nd Watch we are very proud to be a part of this new job creation!  Since 2011, 2nd Watch has created 15 new jobs in the Cloud Computing industry!

The Cloud is an industrialization of technology that will empower business owners in ways not possible before.  Let me give you a case study of something that was not feasible before Cloud Computing:

Here at 2nd Watch we are working on a number of utilities to make the Cloud easier to consume. One such utility requires very large datasets to be stored for each customer.  Our application stores 21 million data points per month, and that is for only 10 customers.  In the old world of technology, if I wanted to mass-market an application at this scale I would need to buy and provision several servers and massive disk storage to keep up with demand.

In this instance I estimate that I would have purchased $65,000-$75,000 worth of equipment.  Forget depreciation and useful life - for a startup this is just money I don't have.  Today my almost-terabyte of storage costs me less than $10 per month, and I only pay for what I use.  My servers are several hundred dollars a month, and I can scale them up or down based on usage.

Big data problems are no longer big dollar problems.  This is a huge shift in an industry built around selling and consuming hardware.  This new wave of technology will spur the thinkers and doers in our society to create new and compelling business ideas that could not have existed before.

Welcome to the industrial age of technology.  I for one am looking forward to the opportunities this new world will bring!

-Kris


AWS Tip of the Month: Backup Server for less than $1 a month

AWS has a very neat way to back up your server's attached storage (EBS) called EBS snapshots.  This technology starts with a full snapshot of the entire volume and then takes incremental snapshots based on the differences since the last snapshot.

Snapshotting is not a new technology, and many people use it in their on-premises environments to take snapshots of a virtual machine or a specific disk for backup or copy purposes.

AWS does not let you schedule these snapshots natively; you either have to call the snapshot API yourself or take them manually through the Management Console.

Below is the high-level process to automate these snapshot backups at a particular time each day:


  1. Build a t1.micro instance with a script that calls the EBS snapshot API for the volumes you want to back up, and make sure this script runs at machine start-up (a sketch of such a script appears after this list).
  2. Set up an Auto Scaling group to launch the server on a schedule (we start ours at 3:00 am and shut it off at 3:50 am).  This allows time to kick off all the snapshot jobs - note that the server does not have to run while the snapshots complete, only long enough to kick them off along with any pre-snapshot activities needed before taking the backup.
  3. Check your snapshots to ensure they are occurring regularly, and test them on a monthly or quarterly basis to ensure you can recover sufficiently in the case of an outage.
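Here is a minimal sketch of the kind of snapshot script step 1 describes, using the AWS Tools for PowerShell.  The volume IDs are placeholders, and you would substitute whatever snapshot tooling and credential setup you actually use:

      # Nightly EBS snapshot script (sketch) - assumes the AWS Tools for PowerShell
      # module is installed and credentials are already configured on the instance.
      Import-Module AWSPowerShell
      Set-DefaultAWSRegion -Region us-east-1

      # Placeholder IDs of the EBS volumes to back up
      $volumes = @("vol-11111111", "vol-22222222")

      foreach ($vol in $volumes) {
          New-EC2Snapshot -VolumeId $vol -Description ("Nightly backup of $vol " + (Get-Date -Format "yyyy-MM-dd"))
      }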


By doing this daily we incur only the cost of our snapshot storage plus the $0.02-an-hour charge for the t1.micro - roughly one billed instance-hour per day, or about $0.62 per month for a backup server.

-Kris

Sunday, March 18, 2012

Radio Interview Recording

On Thursday I was interviewed on local business radio KSBN on the Spokane Entrepreneur Show. My friends Bill Kalivas and Catherine Greer hosted a 30-minute spot on our business and what it took to get things going.

Click here to listen to the recording of the show.

-Kris

Wednesday, February 22, 2012

AWS Tip of the Month - Super fast Micro instance installs for less than a buck

Here is an easy but effective tip for managing micro instances in AWS.

Installing software or performing application upgrades on micro instances can be a challenge due to the limited amount of CPU allotted to this instance size. Micro instances can make fantastic web servers, but when you need to do work on the server it can take more time than the typical IT pro has patience for.

Here is a quick tip: Change the instance size for patching and installs.

1) Stop the micro instance for your maintenance or install work.
2) Change the instance size to Large (or X-Large if you need a huge box for the install).
3) Start the instance.
4) Run your install, upgrade, etc. I've found that I can usually do most of my maintenance or installs in less than an hour on a Large instance, which comes to $0.52.
5) Stop the instance.
6) Change the instance size back to Micro.
7) Start the instance again.

The starting and stopping take seconds, and the upgrade to Large is instantaneous. Following these steps will lead to pain-free patching, installs and upgrades for less than the price of a latte.
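If you find yourself doing this often, the whole cycle can be scripted. Here is a rough sketch using the AWS Tools for PowerShell - the instance ID is a placeholder and the cmdlet names reflect the current module, so adjust for whatever tooling you actually use:

      $instanceId = "i-12345678"   # placeholder instance ID

      # Steps 1-2: stop the instance, then bump it to Large for the maintenance window
      Stop-EC2Instance -InstanceId $instanceId
      Start-Sleep -Seconds 60   # give the instance time to reach the "stopped" state (or poll for it)
      Edit-EC2InstanceAttribute -InstanceId $instanceId -InstanceType "m1.large"

      # Steps 3-4: start it back up and run your installs, patches, upgrades...
      Start-EC2Instance -InstanceId $instanceId

      # Steps 5-7: when finished, stop it, size it back down, and start it again
      Stop-EC2Instance -InstanceId $instanceId
      Start-Sleep -Seconds 60
      Edit-EC2InstanceAttribute -InstanceId $instanceId -InstanceType "t1.micro"
      Start-EC2Instance -InstanceId $instanceId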

-Kris

Tuesday, February 7, 2012

Gartner Identifies Amazon Web Services as the Leader

In a recent Gartner article, Amazon Web Services has clearly been distinguished as the leader in cloud computing. Gartner did a great job of explaining why AWS is leading the way and why other companies may struggle to keep up with them.

2nd Watch has moved a great number of companies off "niche" and "challenger" providers and will continue to do so as companies realize the value of AWS. If you are currently hosted in a colo or with another provider like GoGrid, their costs and product set cannot match AWS, which is why we are seeing companies migrate to AWS - even from other cloud providers.



http://www.gartner.com/technology/reprints.do?id=1-18BC06X&ct=111213&st=sb

Thursday, February 2, 2012

Amazon Web Services Webinar

AWS is making it less complicated and less expensive for your company to recover in the event of a disaster. Check out a great webinar that will change your view of how you recover after a disaster! 2nd Watch is highlighted in the webinar.

Monday, January 30, 2012

Cloud Workload - Analytics

If you work at an organization that values data, you have likely been asked at one point or another to run a report.  The report you ran probably answered some key business question, like how many sales you processed last month, or pulled together data on your key customer accounts.

What about ad-hoc reporting?  Where did you go when you had a specific question and no pre-built report to answer it?  Some organizations build large data warehousing applications that churn data from multiple sources to form what's called a "cube," or aggregate data store.  This is simply a set of pre-computed aggregates (sums, averages, counts, etc.) across large data sets, so that querying is faster than searching the base data (often millions of rows).  The cube was typically built on a batch basis (weekly, nightly, hourly) depending on need and the technical limitations of the environment - the IT department could only afford servers so large to run the data import and analytic jobs against that massive data.

Sounds like heavy computing.  How could the Cloud help?

One of the unique and very valuable attributes of Cloud Computing is that it is elastic.  This means you can spin up resources when you need them and spin them down when you don't.  I've worked on many data warehouses in my career, and most of them are very cyclical in their computing needs.  For 6-12 hours a day (typically at night) the warehouse churns data as fast as it can to build the aggregate cube for end users to use when they arrive at work in the morning.  The rest of the day the load is light by comparison, as end users query the aggregate data.

What if you could increase your compute capacity 20-fold for those 6-12 hours at night when the data needs to be loaded?  Would that cut down on the load window?  Could you load more data?  Could you run more analytics?
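As one concrete illustration - purely a sketch with placeholder names, using the AWS Tools for PowerShell scheduled-action cmdlet (names may vary by module version) - a pair of scheduled Auto Scaling actions could grow a fleet of load-processing workers 20-fold each evening and shrink it back in the morning:

      # Scale the (hypothetical) warehouse-load worker group up to 20 instances at 8 pm UTC...
      Write-ASScheduledUpdateGroupAction -AutoScalingGroupName "dw-load-workers" `
          -ScheduledActionName "nightly-scale-up" -Recurrence "0 20 * * *" `
          -MinSize 20 -MaxSize 20 -DesiredCapacity 20

      # ...and back down to a single instance at 8 am UTC, once the cube is built
      Write-ASScheduledUpdateGroupAction -AutoScalingGroupName "dw-load-workers" `
          -ScheduledActionName "morning-scale-down" -Recurrence "0 8 * * *" `
          -MinSize 1 -MaxSize 1 -DesiredCapacity 1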

What would you do with the cost savings from shutting off the data load servers during the day?  Soon, your CFO will be asking.  What will your answer be?

Learn more @ www.2ndwatch.com

-KB

Workloads for the Cloud

We are going to start a series of posts about different workloads that are appropriate for the Cloud.

The Cloud is enabling new technology and business workloads that were not possible in the past.

Keep an eye on the blog for additional posts in the series.

-KB