Programming for the Amazon EC2 cloud

SimpleDB – Scalable DB

One final service to consider is SimpleDB – a hugely scalable database. Amazon provides a free pricing tier, which lets you run roughly two million queries against the service before you're charged. Although you might worry that you will quickly exceed two million queries as soon as your application becomes world famous, this threshold should be enough to get you started. A Ruby gem makes it easy to build SimpleDB into your web application, and bindings exist for many other languages, too. For Ruby, start by installing the gem:

gem install aws-sdb

For the details of wiring the gem into your Rails app, refer to the full documentation [5].

Setting up the model takes only a few lines:

class User < ActiveResource::Base
  self.site = "http://localhost:8888"
  self.prefix = "/our_website_users/"
end

The cool thing about the Rails bindings is that you hardly notice you are using SimpleDB.

The first parameter, site, is the proxy through which Rails accesses SimpleDB, while prefix is the SimpleDB domain in which the data is stored. If you decide to host a user model on SimpleDB, it will still look like any other model:

user = User.create(
  :username => 'dan@example.com',
  :favorite_products => [2341, 4251, 2567])
user.save

So you could easily move your user tables to SimpleDB but keep the product DB sitting in a relational database and then build most of your pages using preemptive caching. This solution puts all of the really hefty services at the front of your website: unlimited EC2 instances, S3 for static files and caching, and SimpleDB for massive tables.
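The caching half of that setup is straightforward with the aws-s3 gem, which the retrieval example below also relies on. The following is only a minimal sketch: the welcome_html string, the welcome-messages bucket, and the environment variables holding the credentials are illustrative assumptions.

require 'aws/s3'

# Connect to S3 once per process; credentials are assumed to live in the environment.
AWS::S3::Base.establish_connection!(
  :access_key_id     => ENV['AMAZON_ACCESS_KEY_ID'],
  :secret_access_key => ENV['AMAZON_SECRET_ACCESS_KEY'])

# A pre-rendered welcome fragment for one user (hypothetical content).
welcome_html = "<div class='welcome'>Welcome back, dan@example.com!</div>"

# Store the fragment under a per-user key in the 'welcome-messages' bucket,
# where the page-building code can fetch it later.
AWS::S3::S3Object.store(
  'Welcome-dan@example.com', welcome_html, 'welcome-messages',
  :content_type => 'text/html')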

Suppose most of the site is built from pre-cached fragments and you need to retrieve fragments based on the logged in user. If you still had databases running on EC2 instances (e.g., a MySQL cluster), you would still have to manage how this database scales.

Using SimpleDB, you can just throw all the data in and get the user record back:

user = User.find(9876)
cached_snippet = AWS::S3::S3Object.find 'Welcome-' + user.username, 'welcome-messages'

To validate a user's login credentials, you use the Rails API as usual:

user = User.find(:first, :params => { :username => 'dan@example.com', :password => 'secrets' })

SimpleDB is the place to store all those really terrifyingly big tables, instead of spending days optimizing relational structures and building clever caches.

This example illustrates the real benefit of cloud computing services – someone else can do the heavy lifting. SimpleDB, S3, EC2 and the many other services each provide an efficient way to perform an important task.

Scale Up; Scale Down

Once you've created the application that will be using cloud services to scale beautifully, how do you perform the actual scaling?

Part of the "ecosystem" building up around AWS (as well as many other web services) is a set of tools, such as RightScale and Scalr, that do the work of starting and stopping servers as they are needed (Figure 2). With both systems, you design the kinds of servers you need and then set scaling rules based on CPU load, a maximum number of machines, or any other relevant consideration. These services talk directly to AWS on your behalf, so you don't have to start and stop servers through the EC2 API yourself.

Figure 2: Amazon says its services do the "heavy lifting" so you don't have to. Apps like RightScale and Scalr help manage the details so you can concentrate on the app.

You can sign up for either of these services and deploy your application to as many servers as you like. If you really do like "getting under the hood," you can always roll your own scaling system that speaks directly to EC2, S3, and the other services. The API is based on SOAP, with bindings in most common languages.
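To give you a flavor of what rolling your own involves, the following sketch launches one extra application server with the amazon-ec2 gem. The AMI ID is a placeholder, and depending on the gem version the class is EC2::Base or AWS::EC2::Base, so treat this as an outline rather than a recipe.

require 'EC2'   # amazon-ec2 gem (newer versions: require 'AWS' and use AWS::EC2::Base)

ec2 = EC2::Base.new(
  :access_key_id     => ENV['AMAZON_ACCESS_KEY_ID'],
  :secret_access_key => ENV['AMAZON_SECRET_ACCESS_KEY'])

# Boot one more copy of the application image (placeholder AMI ID).
ec2.run_instances(
  :image_id  => 'ami-12345678',
  :min_count => 1,
  :max_count => 1)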

Although each system works in its own way, the principles are similar. Consider the earlier example of a system that generates recommendations for customers: you need at least one server running all the time, but as the number of active customers grows, you might want to increase the number of servers automatically.

Recommendation App

You can create a "recommendation app" that pulls items from the queue and generates recommendations. Along with this app, you can implement a set of rules that creates a new server instance if the CPU goes above a certain level (e.g., 70 percent).
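A minimal worker loop for such an app might look like the sketch below. It assumes the work items sit in an SQS queue named recommendation-jobs and that the right_aws gem is used to reach SQS; generate_recommendations_for is a hypothetical method standing in for your own recommendation logic.

require 'right_aws'

sqs   = RightAws::SqsGen2.new(ENV['AMAZON_ACCESS_KEY_ID'],
                              ENV['AMAZON_SECRET_ACCESS_KEY'])
queue = sqs.queue('recommendation-jobs')    # assumed queue name

loop do
  if msg = queue.pop                        # receive and delete one work item
    generate_recommendations_for(msg.to_s)  # your own code goes here (hypothetical)
  else
    sleep 5                                 # queue empty; wait before polling again
  end
end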

You could also include rules that start a new server based on the number of items in the queue (e.g., if the total number of items goes above 1000, start up another server). This rule set will keep the queue down to a minimum by throwing in additional processing power when the line gets big. Your application is truly and dynamically scalable.
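If you would rather enforce that rule yourself instead of leaving it to RightScale or Scalr, the check can be as simple as the sketch below. It reuses the same assumed queue and a hypothetical launch_recommendation_server helper that wraps the run_instances call shown earlier.

require 'right_aws'

QUEUE_LIMIT = 1000   # threshold from the rule described above

sqs   = RightAws::SqsGen2.new(ENV['AMAZON_ACCESS_KEY_ID'],
                              ENV['AMAZON_SECRET_ACCESS_KEY'])
queue = sqs.queue('recommendation-jobs')   # assumed queue name

# Run this from cron or a monitoring loop: if the backlog gets too big,
# throw another server at it.
if queue.size > QUEUE_LIMIT
  launch_recommendation_server   # hypothetical helper around ec2.run_instances
end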
