Wednesday, December 10, 2008

Final Post on the Databases in the Cloud Blog

Well, this will be the final post on this blog. I have been neglecting it anyway. The good news is that I want to pick back up on cloud blogging, but I wanted to tie it to my new site. That site has a blog, so I created a new blog with the same look and feel, covering cloud topics instead of database topics.

The new blog is cloud computing info (more generic than databases in the cloud). The URL is

A final note: in addition to the blog, I am writing a Cloud Computing With Amazon book. It will be available on Amazon, mobipocketbooks (an ebook site) and many other bookstores within the next month. I will be posting an HTML version, viewable for free, on

I will also post updates about the book at

Take care and I hope to see you on the new blog.



Friday, August 22, 2008

iCloud OS

I'm evaluating various web-based desktops. A web desktop is a desktop in the cloud. So far, I have found three that I like. Today's is iCloud by Xcerion. I'll have a lot more info after my beta login is approved, but from the videos, iCloud looks like the most advanced of all the cloud OSes I have looked at thus far.

Here is a quick 3 minute video showing how to use the iCloud day planner.

And here is a little bit longer demo (5 minutes) of the OS itself.



Friday, August 8, 2008

MS Live Mesh - Remote Desktop Meets the Cloud

You might not think of remote desktop as a cloud tool, but MS has added cloud storage to remote desktop and called it Live Mesh. I have been using it recently and it is pretty nice. I use VNC fairly extensively and, when I'm not using VNC, I tend to use SSH. Well, I heard about this Live Mesh thing and decided to download it and give it a try. It is currently a beta product, but I haven't had any issues.

My first thought on using it was that it was a clone of GoToMyPC. I'm not a GoToMyPC user so I can't say for sure, but it looks that way. The big difference is that GoToMyPC doesn't have a free version or online storage. I think the integration, storage and synchronization services are what make Mesh a unique tool.

Being an MS product, you might expect there to be no Linux support. You would be right. There is Mac OS X support though, and Windows Mobile support is on the way. I would be a lot more excited if they planned to support BlackBerry. Still, it is a great way to stay in touch with my desktop, laptop, work computer and the non-Linux database servers in my lab.

This is what my home desktop looks like from the Live Mesh Desktop:

You can access a Live Mesh remote desktop from any computer that can run IE. The computer you are connecting FROM does not need to be running Live Mesh at all. Just log in to your account at and you can then connect to any device in your mesh.

When you log in via a browser, you get a device screen where you see all of your devices and can connect to new ones. Your device names are not the actual hardware identification. You get to give them friendly text names.

If you open your Live Desktop, you can create folders to store data in the cloud. You currently get 5GB of storage for free. You can create multiple directories and automatically sync those directories to the devices of your choosing. As an example, I created a Documents directory. Anything I put in that directory is automatically propagated to my work computer, my laptop (BIGDOG) and to one of my database servers (which has partially become my son's computer).
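Mesh does all of that propagation for you through its cloud storage, but conceptually each sync pass just mirrors new or changed files to each device. Here is a one-way sketch of that idea in Python (purely illustrative of the concept; this is not how Live Mesh is actually implemented):

```python
import hashlib
import os
import shutil

def file_digest(path):
    """MD5 of a file's contents, used to detect changes."""
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

def sync_one_way(source, target):
    """Copy files from source to target when missing or changed.

    Returns the list of file names that were copied. Live Mesh itself
    syncs in both directions, via the cloud copy of the folder.
    """
    copied = []
    os.makedirs(target, exist_ok=True)
    for name in sorted(os.listdir(source)):
        src = os.path.join(source, name)
        dst = os.path.join(target, name)
        if not os.path.isfile(src):
            continue  # a real sync would recurse into subdirectories
        if not os.path.exists(dst) or file_digest(src) != file_digest(dst):
            shutil.copy2(src, dst)
            copied.append(name)
    return copied
```

Run it twice and the second pass copies nothing, which is the whole point: only deltas move after the first sync.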

This is my Live Desktop:

I haven't really noticed any speed differences between the IE connection and the Remote Desktop tool. I prefer to use the remote desktop but I can't really say why. Here is what my db server looks like, first in IE and then in the remote desktop tool:


Remote Desktop:

As you may notice in some of the above shots, Mesh windows have a side pane with tips and information. I snipped that pane out of some of these shots. It's kind of annoying but usually provides helpful information. If this ever becomes a pay service, I could see ads being placed there to support a free version.

This is the text of the help window when connected to a remote desktop:

So that's pretty much Live Mesh. It offers syncing and free storage (more than Mozy, even) as well as a remote desktop that is accessible from a browser. I think this just shows how much cloud computing will be integrated into everyone's life, and not just in business.



Thursday, August 7, 2008

Hot Cloud Computing Job

Have you ever wished for the perfect job? Something you're interested in and that will be fun? Aptana is hiring. It's bleeding edge and perfect for the uber-hardcore-geek 2.0.

I've used Aptana Studio, the web-language IDE. It is a free, open source IDE that can run standalone or as part of Eclipse. This isn't a lightweight IDE like so many; it is a full-featured web development environment.

Aptana also offers an Ajax server (which I haven't used) called Jaxer and a brand new cloud computing environment called, obviously enough, Aptana Cloud.

If you want to know a little more, from the Aptana site:

Aptana Cloud is a scalable Elastic Application Cloud™ featuring fully stacked and integrated PHP app engines, Ajax/Jaxer app engines, and soon Ruby on Rails app engines -- ready to use and ready to scale for your apps and sites as you need it. Aptana Cloud plugs right into Aptana Studio to provide instant deployment, smart synchronization, and seamless migration as you scale with those same app engines running locally on your desktop for development and testing before you deploy. Aptana Cloud's team management, integrated source control, application monitoring and other turnkey services make working with the Cloud a breeze. Aptana Cloud is ideal for developers who use scripting languages to create Ajax, Facebook, mySpace and all other sorts of Web applications.

I downloaded the cloud tools, signed up for the beta, and will be playing with it over the next few weeks. It's pretty cool. Expect some hands-on updates here.

The purpose of this entry though is to tell you that Aptana is hiring. Just to clarify, I don't work for them or get paid by them. This is just an update because the job sounded so cool. If it wasn't 3000 miles away, I would apply for it. Check it out:

Cloud Team

The Aptana Cloud team is chartered with defining, designing, implementing, and managing the Aptana Cloud offering, including APIs and infrastructure to support multiple cloud server and service providers, management applications and tooling, and integration facilities. Key positions are available on this team. If your skillset and passions fit some significant subset of the following list, please get in touch - we are looking for a few brilliant, versatile, driven individuals as we prepare to launch our 1.0 offering.

Do you:

  • Love to build, assemble and administer OS + software systems of many different sorts?
  • Understand virtualization and abstraction of APIs at many levels?
  • Want to understand and work closely with top-notch cloud server providers: Amazon, Google, and others?
  • Get excited about defining the future of cloud services and integration points?
  • Yearn to build very cool management and reporting applications on top of the services?
  • Feel confident in being the technical liaison with integration partners?
  • Excel at gluing systems together or - when you can't find the right pre-built system - excel at just building your own?
  • Wear many hats - and are you the person everyone always comes to for expertise in multiple areas?
  • Jump at the opportunity to own a scalable, fail-safe Java server to…
    • Automatically provision nodes in a cluster based on "smart" business rules
    • Define and implement APIs that talk to the cloud provider, the IDE, the billing system, the monitoring system, and even the cloud users' code
    • Offer both OLTP and OLAP reporting for monitoring, management, and trend analysis
    • (At least this part of the cloud infrastructure is in Java; the rest can be in various languages)

Areas and systems of interest:

  • The software stack of virtual servers (web, DB, app server, etc.)
  • Monitoring and instrumentation (e.g. Nagios, and/or roll-your-own scripts)
  • Versioning a cluster of machines
  • Optimizing performance and resource utilization (incl. kernel hacking)
  • Horizontal scalability, distributed systems, failsafe design
  • RESTful APIs
  • Java Tomcat
  • MySQL

Please submit resumes to: Thanks!

Like I said, if the job wasn't on the west coast, I would SOOOOO be applying for it.

If you do apply, let me know how it goes.



Wednesday, August 6, 2008

AT&T Plays in the Cloud

I have said quite a few times that cloud computing is more like a phone company than it is like electricity. A phone company can offer various services, and those services vary from provider to provider. Electricity needs to be the same for everyone (at least within a region).

Today, AT&T announced its move to the cloud: AT&T Launches Global Utility Computing Service. According to this article from GridToday, AT&T announced its new AT&T Synaptic Hosting, a managed network, security and storage service for businesses. They are building 5 "super data centers" to reach a total of 38 data centers worldwide.

A core feature of AT&T Synaptic Hosting is its next-generation utility computing platform. This enables the service to deliver a complete hosting solution with features that use the AT&T network to manage applications, compute resources on servers and store data. AT&T Synaptic Hosting also provides designated account support all backed by a single end-to-end, service-level agreement that is unique within the industry.

This looks like the beginning of something. Maybe telephony will morph to VOIP and the big bells will, with their tremendous computing power and dedicated networking, become the home of the cloud. Just because Amazon and Google started it, doesn't mean they will do it best or last the longest. This is the kind of thing the major communications companies need.

"Today's announcement is yet another example of AT&T's commitment to deliver next-generation services and solutions to companies worldwide," said Ron Spears, group president, AT&T Global Business Services. "The AT&T global network, combined with our powerful computing platform, is driving the convergence of networking and hosting services in ways that are allowing companies to deliver end-user applications whenever and wherever they are needed - while paying only for the capacity actually used."

The official Web site of the U.S. Olympic Committee (USOC) is powered by AT&T Synaptic Hosting. It is the USOC's new feature-rich Web site that connects fans of the U.S. Olympic and Paralympic teams with America's athletes on their journey to the Olympic Games. The site features stories on U.S. Olympians and Paralympians and Olympic and Paralympic hopefuls, athlete blogs and social networking tools.

And they wouldn't be the phone company if they didn't offer a menu of options and add-ons.

In addition to utility computing features, AT&T Synaptic Hosting offers the following:

  • A broad selection of dynamic storage and security features that enterprises have come to rely on to protect their data and assets.
  • The ability to use AT&T's BusinessDirect customer portal to easily manage capacity, complete maintenance and monitor network service and performance of their virtual IT environment.
  • Personalized support from teams of designated hosting and application specialists who are experienced in the business and technical needs of the clients.
  • Application monitoring and reporting capabilities that work with most client software available in the industry today.
  • One end-to-end service level agreement that covers the customer's entire environment.

You can bet there will be more to come from AT&T.



Sunday, August 3, 2008

IBM is Building $400 Million Data Centers for the Cloud

According to GridToday, IBM is spending $400 million on two new cloud data centers in Tokyo and North Carolina. The big news from this is that IBM is shooting for a very green data center.

In North Carolina, they are renovating an existing building and reusing 90% of the existing components. They plan on the center being 50% more energy efficient than the industry average. The site will initially have 60,000 square feet of raised flooring and will be modularly expandable. The center will open sometime in 2009. IBM received $750,000 in local and state economic incentives for building the center.

The Tokyo center will be a customer-facing center that will help companies and universities use and implement cloud computing. This is IBM's ninth cloud computing center worldwide. Japan is a very mature and complex technology corridor, and partly because of that maturity, existing systems may be old and inflexible. IBM will be helping Japanese companies adapt to cloud computing.

"To develop high skilled human resources in IT field, it is necessary to create latest IT environment in education place," said Hiroto Yasuura, Dean of Graduate School of Information Science and Electrical Engineering, Kyushu University. "Kyushu University is very interested in cloud computing technology, which can provide an on-demand IT environment to our students and teachers. We have been working with IBM, the pioneer of this field. Kyushu University will continue to take advantage of cloud computing technology more actively."

IBM, rather than being a cloud provider, is looking to be the cloud consultant. I imagine they will also eventually release a private cloud with their own software.

On the green front, IBM is investing $1 billion in R&D efforts to increase the efficiency of data centers. IBM has several huge data centers worldwide. I've used the Dallas center, remotely, to prototype a large BI system I developed several years ago. At their center, I had access to a 32-CPU p595 with 512GB of RAM and an EMC SAN. That was just one of the many systems available for customer prototypes.

The center will partially be powered by alternative power sources (although IBM hasn't yet said what kind). The center will use virtualization as much as possible which will further save power. In the winter months, the center will use a free cooling system to allow the center to cool itself naturally. Very cool.

This is just another sign of the growing maturity of cloud computing. I'm glad IBM is investing so heavily in the cloud.


Saturday, August 2, 2008

Dell Tries to Trademark "Cloud Computing"

Someone needs to smack Dell. I was reading Yahoo news and ran across an article about Dell. It says that Dell has applied to trademark the term "cloud computing". What a bunch of jerks.

They have made it past the notification of allowance phase and are at the opposition phase. That means that companies can complain and let the USPTO know of their objections. I hope plenty of companies do file an objection. This is really what is NOT needed right now in the cloud.

I don't see how the USPTO can even allow this claim of a trademark. I'm not a lawyer but I'm pretty sure cloud computing as a term was around long before Dell even thought of entering the cloud hardware market. This is a lot like non-open source companies trying to take over the open source moniker.


Monday, July 28, 2008

VMware Enters the Cloud

VMware is the virtualization king, and I have been wondering when they would choose to enter the cloud competition. Until now, it's pretty much been Amazon's game to win or lose. According to this article, VMware is opening a data center in Washington state. The new data center will be 189,000 square feet, of which VMware will use over 100,000. That's a nice-sized data center.

An interesting side note is that VMware is not the first to build a data center in the area. Microsoft, Yahoo, Intuit and others also have, or are building, data centers in the area. The article attributes that to cheaper electricity from the local dams generating hydro power.


Saturday, July 26, 2008

The Computers of Tomorrow

Is cloud computing a new idea? As a matter of fact, it is not. Is comparing cloud computing to the electric utilities a new concept? As a matter of fact, it is not. What does this sound like:


The computing machine is fundamentally an extremely useful device. The service it provides has a kind of universality and generality not unlike that afforded by electric power. Electricity can be harnessed for any of a wide variety of jobs: running machinery, exercising control, transmitting information, producing sound, heat, and light. Symbolic computation can be applied to an equally broad range of tasks: routine numerical calculations, manipulation of textual data, automatic control of instrumentation, simulation of dynamic processes, statistical analyses, problem solving, game playing, information storage, retrieval, and display.

Does that sound like Nick Carr's analogy with electricity? "Symbolic computation" - when's the last time you heard it said like that?

How about:


The concept of an information-processing utility poses many questions. Will the role of information utilities be sufficiently extensive and cohesive to create a whole new industry? If so, will this industry consist of a single integrated utility, like American Telephone and Telegraph, or will there be numerous individual utilities, like Consolidated Edison and the Boston Gas Company? Will the design and manufacture of computing components, terminal equipment, and programming systems be accomplished by subsidiaries of the information utility, as in the telephone industry, or will there be a separate industry of independent private manufacturers, like General Electric and Westinghouse in today's electrical equipment industry?

This sounds an awful lot like utility computing. Something isn't quite right though. An information-processing utility? American Telephone and Telegraph? Is that AT&T? GE and Westinghouse?

Perhaps the most important question of all concerns the legal matter of government regulation. Will the information utility be a public utility, or will it be privately owned and operated? Will some large companies have their own information utilities, just as some companies today have their own generating plants?

That also sounds like the cloud computing that has been growing at Amazon and Google. Those are the sorts of questions people are asking: who will own the cloud, and how homogeneous will it be? And re-read that last sentence.

The high cost of capital equipment is a major reason why producers of electricity are public utilities instead of unregulated companies. A second reason is the extensive distribution network they require to make their product generally available. This network, once established, is geographically fixed and immovable. Wasteful duplication and proliferation of lines could easily result if there were no public regulation.

The above paragraph is so true. Check out the following one:

Barring unforeseen obstacles, an on-line interactive computer service, provided commercially by an information utility, may be as commonplace by 2000 AD as telephone service is today. By 2000 AD man should have a much better comprehension of himself and his system, not because he will be innately any smarter than he is today, but because he will have learned to use imaginatively the most powerful amplifier of intelligence yet devised.

Did you read that? "by 2000 AD". The article I have been quoting from was written for the Atlantic Monthly in May 1964. This article was written two years before I was born.

This article just blows my mind. The companies are different, the primary industries have changed. The vision is amazing to me. Talk about an accurate extrapolation of the computer industry.

There are a couple of items here that I love.

Dr. Bush himself was only extrapolating from the technology of the time in these particular predictions.

That's the author of this article, Martin Greenberger, complimenting an earlier author on his foresight. Congratulations to you, Mr. Greenberger. You hit it dead on.

nor did he bank on the perfection of electronic logic, magnetic cores, and transistors.

Heh. That one makes me smile. "the perfection of electronic logic, magnetic cores, and transistors" - oh, if only he knew how "more perfect" it could get.

And finally, to put it in perspective, when this was written:

Tens of thousands of computers have been perfected and successfully applied in the past two decades

"Tens of thousands". How many is that now? Tens of billions? This article is 44 years old. I am just amazed.



Friday, July 25, 2008

Google App Engine Gets Perl, Sort Of

If you are not familiar with it, Google App Engine is Google's entry in the cloud, specifically a PaaS, or Platform as a Service. With Google App Engine, you get an SDK (Python) to code your applications, and then you deploy them to the Google cloud. You can integrate with other Google services (as well as other HTTP services) and use BigTable as a data store.
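The programming model is simple: you write Python request-handler classes and map URL patterns to them. Here is a minimal sketch of that routing idea with the framework stubbed out in plain Python (the real code imports its classes from the App Engine SDK; the names and classes below are illustrative, not the actual API):

```python
# Sketch of the request-handler model used by 2008-era App Engine apps.
# The base class and dispatcher stand in for what the SDK provides.

class RequestHandler(object):
    """Stub base class; subclasses implement get()/post() per HTTP verb."""
    def get(self):
        raise NotImplementedError

class MainPage(RequestHandler):
    """A handler bound to a URL pattern, like a real App Engine handler."""
    def get(self):
        return "Hello from the cloud"

# The SDK's application object maps URL patterns to handler classes:
ROUTES = [("/", MainPage)]

def dispatch(path):
    """Toy dispatcher: find the handler for a path and invoke its get()."""
    for pattern, handler_cls in ROUTES:
        if pattern == path:
            return handler_cls().get()
    return "404 Not Found"

print(dispatch("/"))  # → Hello from the cloud
```

The point of the sketch is that the platform, not your code, owns the server loop; you only supply handlers, which is exactly why the language of those handlers (Python today, maybe Perl tomorrow) matters so much.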

One of the limitations, IMHO, of Google App Engine is that it is limited to Python. While I can do a little bit of Python coding when I have to, I'm not a big fan of it. I don't see anything wrong with it; I just have only so many hours in the day, and getting deep into Python is not a priority for me. Having said that, I would still rather have Python than Java.

What I would love to see is a truly pluggable architecture where a developer can choose his own language to interface with the app engine. Obviously, core code would need to be in the language of Google's choice but everything else should be pluggable via services or APIs. That may be closer than I thought.

Brad Fitzpatrick announced on his blog that he is working on a 20% project to add Perl to Google App Engine. He makes sure to be very specific that he is not on the App Engine team and that this is not an App Engine effort:

To be clear: I'm not a member of the App Engine team and the App Engine team is not promising to add Perl support. They're just saying that I (along with other Perl hackers here at Google) are now allowed to work on this 20% project of ours out in the open where other Perl hackers can help us out, should you be so inclined.

This is also not quite what I would like, in that it is an effort to add Perl, not to open things up for pluggable languages. Of course, as Brad says, the need for a hardened interpreter does require internal Google effort. Still, this is a nice start. I prefer Perl to Python.



Wednesday, July 23, 2008

Voices in the Clouds

One of the big difficulties of the cloud is properly defining it. I don't think it will be completely defined for a while yet. Since that is the case, I think I would like to muddy the waters a little more.

Is VOIP a cloud service? It's a service, it runs on the internet, on someone else's servers, in a location I have no clue about. Isn't that SaaS? OK, maybe something like Vonage, cable phone service or Verizon is really a utility. Those actually have hardware in my house.

What about Skype, MSN Messenger or Yahoo Messenger? All of those can make phone calls to others using the same service. All the things I said about the VOIP providers above apply. So, is Skype a cloud service?

Actually, let's look at Yahoo Messenger. I can make video calls, do instant messaging and more. It really is a service and it gets more functionality all the time.

These ride a fine line. What do we call a service (SaaS) versus just a regular internet application? Or, is every internet application now a service?

It's a conundrum. ;-)


Monday, July 21, 2008

Thank You Larry Ellison

LewisC's An Expert's Guide To Oracle Technology

Larry Ellison is a technology leader. I think that's generally accepted. Some people might not like him, but you can't really deny what he has done with Oracle. Larry apparently has one giant weakness though. He's way ahead of his time. I ran across this news story from 1996.

New York -- Oracle Corp. CEO and Chairman Larry Ellison told a group of customers here that the first network computer conforming to the company's specifications will be launched in October, and priced at $299.

...a keyboardless, diskless NC with 8 megabytes of RAM. The configuration used a Zenith television as a monitor, a mouse, and ran Oracle's InterOffice groupware application.

What does that sound like?

It may not be much to look at, but CherryPal's new device - a $249 paperback-sized box containing an underpowered processor and a token amount of memory - is a forerunner of the oncoming revolution in "cloud" computing.

Sound a bit like that? And that is a "forerunner of the oncoming revolution in cloud computing"? That's from an article on venturebeat, from July 21, 2008.

Maybe Larry should have named the NC the CC, for cloud computer. Oh wait, a cloud was just a bunch of water back then.

It's not exactly comparable. The NC was supposed to be entirely diskless, not just cloud-based. If you want to get even closer to the NC, check out the Nimbus Cloud Computer.

This is the new NC. It's not really a computer at all but a network interface to the cloud. Best of all, it's free! Seriously, they'll send you a cloud computer for free. Well, the free version has ads, but for $19.00 per month you can get it ad-free. The free tier gets you access to some software and 2GB of storage.

A Cloud Computer is a re-imagination of the idea of a computer. We think that an ordinary computer is too expensive, too complicated, and too much for what most people want to use a computer for. What we did is put all of the costly and complicated pieces of hardware and software into our data centers. You then use a smaller, simpler, much less expensive device that's always connected to the internet to control your computer. We think this is a much better way for you to do just what you want with a computer.

In 2008:

Use your keyboard and mouse to control your nimbus cloud computer. We manage your computer & all web-based and desktop applications & access to the internet. We send your virtual computer desktop to your nimbus.

In 1996:

During the customer presentation, nearly a year to the day after Ellison first floated the NC concept at an industry forum in Europe, an executive demonstrated a keyboardless, diskless NC with 8 megabytes of RAM. The configuration used a Zenith television as a monitor, a mouse, and ran Oracle's InterOffice groupware application.

In 1996, the VP of Global Financial Development of Estee Lauder saw the vision of network computing:

"Not right now, but somewhere down the road," said Philip Theiss, vice president of global financial process development at Estee Lauder Companies in Melville, New York. He said that he could see a future application for the NC in field sales.

Throw in Google Docs, Zoho, Web mail (Yahoo or Google), calendars, etc., and cloud computing is here; 12 years after Ellison tried to sell the Network Computer. Larry has vision. I think he just sees too far sometimes. I wonder if he still has an NC sitting around. I wonder if he kicks it now and then and shouts, "See! I told you! Morons!" I bet he does.


Sunday, July 20, 2008

S3 was Down - Back up at 6PM

Bad news for Amazon, AWS and cloud computing in general. S3, Amazon's cloud storage, was down until just a few minutes ago. I was looking over a couple of my blog posts and noticed the images weren't displaying. I use and they use S3. Bummer.

I realize that this is a beta product for Amazon. I realize downtime happens. What is really bad is that there will now be a month of "cloud computing sucks" and "cloud computing is not ready for the enterprise" and others of that type. I won't argue the validity of those claims.

I just hate the set back this creates for cloud computing in general. Advancements come with adoption. People won't adopt something with mysterious outages. Hopefully, Amazon will be completely transparent on the cause of this outage and what they are doing to prevent it in the future.

I think this is, what? The third time in the last few months that either S3 or EC2 (or both) have gone down? Once a year is too much.

CenterNetworks has a nice blow-by-blow of the Amazon updates in this blog entry.


Thursday, July 17, 2008

Cloud Computing Journal

I stumbled across the Cloud Computing Journal, published by SYS-CON, today. There's not much there at the moment. It's almost like a parked domain with a bunch of ads. I guess that's not fair; I can find 3 stories already: Cloud Computing: Introducing the Pyramid, Cloud Computing - The Jargon is Back, and an advertorial for Cloud Computing Journal.

The first is another "what is cloud computing" article and the second is a "what does the term cloud computing mean" article. Well, they are actually blog entries, not articles per se. If you search the web, you will find that postings about "what is a cloud" outnumber "here's how to use the cloud" posts by about 100 to 1.

To break that trend, my next entry will be a useful entry about actually using the cloud. But first, I have to go read what the cloud is. ;-)



Tuesday, July 15, 2008

Will Cloud Computing be a $100 Billion Market?

Do you think cloud computing will hit $100 billion? When? Merrill Lynch thinks it will pass $100 billion, and that by 2011 it will be worth $160 billion in combined business value and advertising. They also expect that cloud computing will be the next wave of investment for venture capitalists.

Merrill Lynch figures that right now 10 companies are the big boys. It will be interesting to see how it plays out. They see VMware and Citrix as major players, which I don't quite see. They own the virtualization market, but cloud computing is a lot more than virtual servers. What they both need to do is develop the next generation private cloud offering. That's where I see the Fortune 1000 going.

The article has a quote from author Nick Carr. He wrote The Big Switch, which is about utility computing.

"I think for a lot of the traditional companies whether it is software companies like Microsoft, Oracle and SAP, or hardware companies like Dell or EMC, it's going to be a very tough transition to go from the old world to the new world."

I have to agree with that. I think Microsoft is more on the ball on this one than any of the others mentioned. They're working on Mesh, SSDS and the Office Live stuff. Oracle can't even figure out how to license for a VM, much less for the cloud. EMC started a move with Mozy. I can see EMC partnering with someone like Akamai and putting out a second generation cloud. Dell will be hurting.

All I know is that the next few years will be very interesting in this space. I'm just hoping I will be able to stay in it. Job changes might be moving me back to a pure Oracle world (which isn't a bad thing, it's just not the next big thing).


Saturday, July 12, 2008

Cloud Tools: Cloud Studio

Amazon ships a handful of tools for using EC2 and S3, and there are some freely downloadable scripts that make life a bit easier. Personally, I want to use a GUI. I use SSH enough when I connect remotely; if I'm in Windows, I want a Windows tool.

Today I downloaded Cloud Studio from Cloud Services, Ltd. Cloud Studio is a free S3 browser with a little bit of EC2 support. It's a version 1.0 product, so you can't expect too much. According to the site:

Cloud Studio is a visual tool designed to make the development of applications for Amazon Elastic Compute Cloud (EC2) more convenient. Developers (or someone responsible for applications deployment) can effortlessly create and destroy instances, manage security groups, keypairs, and allocate and assign IP addresses.

User can choose to run Cloud Studio as a traditional standalone application, or to use it as an extension to Eclipse IDE, which is currently one of the most widely used application development environments featuring support for Java, C/C++, PHP, and other programming languages.

The interface is extremely easy to use. It has a three-pane window: the upper left is an AMI browser, and the upper right is a pane with a set of tabs showing configuration information. The bottom pane is the most useful, with an instance monitor, an S3 browser and a progress tab (which shows outstanding tasks).

The instance monitor shows any currently running instances. If you right-click on an instance, you can terminate or reboot it. You can also associate an elastic IP with the instance.

The S3 browser is the most functional pane in the program. At the right are a drop-down list and a series of icons. The drop-down list lets you select a top-level bucket. The icons, in order, let you get file properties, create a new bucket, delete a bucket, upload a file and refresh the screen. The final two icons appear on all the panes and let you minimize or maximize a pane. If you right-click on a file, you can choose to download it.

That's about it. It's a very simple program, but it does exactly what it advertises. I like it. I'll be trying other tools as I find them, but for now, Cloud Studio is part of my cloud computing toolbox.



Friday, July 11, 2008

The Storage Cloud, Currently

InformationWeek has a good article, Behind The Storage Cloud, that covers some of the plumbing behind cloud storage. What it doesn't cover are the limitations I have been running into while using the cloud.

For providers like Google that offer a PaaS (Platform as a Service), the storage is built into the application. If you choose them to develop your application, that works out fine. However, if you are looking for archiving or storage scaling (growing storage as you need it), it's not so good.

Amazon offers a different kind of storage. S3 is a web-based storage system. It's like a bucket for data: the biggest bucket you'll probably ever see, but still just a big bucket. When you create what looks like a directory inside a bucket, you're really just using a prefix on the object names. The namespace for buckets themselves is global, which means your bucket can't have the same name as someone else's bucket. That's a huge limitation.

Another issue with S3 is that it is a web service, not a block device. That means you can't attach it directly and use it as a file system; you have to make API calls. Even from within EC2 (Amazon's cloud computing environment), S3 is only accessible through the API, using PUT- and GET-style requests. Amazon is working on allowing EC2 to attach storage directly, and there are other projects working on the same thing, PersistentFS being one of them.
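To make that distinction concrete, here's a minimal sketch of object-store semantics in Python. An in-memory dict stands in for S3 (the class and method names are my own illustration, not Amazon's API); a real client would issue signed HTTP PUT/GET requests to the S3 web service instead.

```python
# Sketch of object-store semantics: whole-object PUT/GET only.
# A dict stands in for S3; there is no mount point, no seek,
# and no partial write, which is what "not a block device" means.

class ObjectStore:
    def __init__(self):
        self._objects = {}  # key -> bytes

    def put(self, key, data):
        """Upload a complete object, overwriting any existing one."""
        self._objects[key] = bytes(data)

    def get(self, key):
        """Download the complete object. No partial reads."""
        return self._objects[key]

store = ObjectStore()
store.put("backups/notes.txt", b"hello")

# Unlike a file on a block device, you can't modify part of an
# object in place. To "append" one byte, re-upload the whole thing:
updated = store.get("backups/notes.txt") + b"!"
store.put("backups/notes.txt", updated)
```

That full re-upload on every change is exactly why attaching S3 as a file system needs a translation layer like the projects mentioned above.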

I haven't found good, cloud-based attachable storage yet. I think Amazon, when it makes S3 attachable, will be the first (although that might only be attachable from within EC2). What is currently available is a plethora of archival solutions. For home use, I don't think anything beats EMC's Mozy. For $4.95/month, you get unlimited storage for one PC. It's slow to add new files (at least for me), but overall I don't think any of its competitors really compete. I tried two others previously and went with Mozy for its price/feature ratio.

For business archival, I don't know that I would recommend Mozy. It's not that I would recommend against it; I just think there are better options. Offsite tape backups are still cheap and reliable, and for the SMB market, burning a DVD once a week might even be enough. It just depends on your workflow and volume of data.

As a side note, it would be fairly easy to write a custom application to automatically back up changed files to S3. At less than 20 cents per GB per month, that might be a fairly reasonable solution, especially if you frequently need to access the archived data. I might even write a free tool to do just that. Just a POC kind of thing.
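A sketch of what that POC could look like: walk a directory, compare file modification times against a saved manifest to find what changed, and estimate the monthly bill at roughly 20 cents per GB-month. The function names are mine, and the actual upload is left out; a real version would PUT each changed file through the S3 API.

```python
import os

def changed_files(root, manifest):
    """Return paths under `root` that are new or modified since the
    mtimes recorded in `manifest` (a dict of path -> mtime), and
    update the manifest so the next run sees them as unchanged."""
    changed = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            mtime = os.path.getmtime(path)
            if manifest.get(path) != mtime:
                changed.append(path)
                manifest[path] = mtime
    return changed

def monthly_cost_dollars(total_bytes, rate_per_gb=0.20):
    """Rough S3 storage bill at ~20 cents per GB-month."""
    return (total_bytes / 2**30) * rate_per_gb

# A real tool would loop over changed_files(...) on a schedule,
# PUT each file to S3, and persist the manifest between runs.
```

At that rate, 5 GB of archives works out to about a dollar a month, which is why the economics look reasonable for data you actually need to get back quickly.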


Thursday, July 10, 2008

5 Reasons to Embrace Cloud Computing

Yesterday, I wrote about 5 Reasons to avoid Cloud Computing. Today I am turning that around.

  1. Scalability - Scalability is *THE* marketing buzzword for cloud computing. Being able to dynamically add storage and computing power as needed is a huge benefit to any business that needs it. Take note, though: not every business needs that kind of scalability. If you are not running a web presence with a real chance of going viral, you probably don't.
  2. Time to market - This is goodness for any business. You have your plan, your software, your domain name, whatever, and you're ready to go. Now you need to buy hardware, configure it, find space for it, get power to it, staff up maintenance for it.... Or press a button on a web page, have your web guy load the application and be on your way. Obviously I exaggerate, but that's close.
  3. R&D - Have an idea? Fire up a cloud image and try it out. Doesn't pan out? Throw it away; you're only out a few dollars, and there's no pesky unused hardware cluttering up your data center. Need a development or test instance? Press a few buttons on a web page and, presto, pay as you go. Scheduling dev and test hardware becomes painless for large organizations. Data refreshes? A thing of the past. Boot up a couple of images, load them with data, and throw them away when finished.
  4. Upfront Costs - Pay as you go is a wonderful thing. For most pieces of the cloud you will either pay per use or sign up for a subscription. I prefer pay as you go because I am a techie type who only needs it for development or proofs of concept. Subscriptions work out well for businesses: the upfront cost is still very low and costs are very predictable in the long run. Like rent-to-own, both models usually cost more in the long run, but like leasing, they also provide some guarantees around the hardware.
  5. Buzzworthiness - Hey, if your widget doesn't run in the cloud, it is so 20th century. You thought Web 2.0 was big? Used to be. Naw, it's all about being SaaSy! Seriously, for at least the next few years, just being associated with cloud computing will get you additional bang for the buck. This applies to small/new players; if you already have some stability, don't trade it for cloud computing just yet.
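The pay-as-you-go versus subscription trade-off above comes down to simple arithmetic. Here's a back-of-the-envelope sketch; the rates are hypothetical round numbers I picked for illustration, not any provider's actual pricing.

```python
def pay_as_you_go(hours, rate_per_hour=0.10):
    """Usage-billed cost at a hypothetical 10 cents per instance-hour."""
    return hours * rate_per_hour

def subscription(months, monthly_fee=60.0):
    """Hypothetical flat monthly fee, regardless of usage."""
    return months * monthly_fee

# A dev box used 4 hours a day, 22 weekdays a month, is cheap by
# the hour (88 hours -> $8.80). A server running 24x7 (~730 hours)
# costs more by the hour than a flat fee would, so heavy, steady
# use favors the subscription.
dev_cost = pay_as_you_go(4 * 22)
full_time_cost = pay_as_you_go(730)
```

The crossover point is just monthly_fee / rate_per_hour; below that many hours, pay as you go wins.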



Wednesday, July 9, 2008

5 Reasons to avoid Cloud Computing

I will follow this post with a "5 Reasons to embrace Cloud Computing". I am actually very pro-cloud computing. I think it is the next big thing.

But, and there is always a but, there are reasons to avoid it right now.

  1. It's immature - We don't really know who the long-term players will be. We can guess at MS, IBM, Google and Amazon, but what about Mosso, GoGrid, Elastra, RightScale and any of a dozen others? The market needs to mature and play out for a while.
  2. Security - Is the cloud secure? I think it actually is. The problem is that I have yet to see any of the infrastructure providers offer any accounting or auditability of that security, and that is a barrier to compliance.
  3. No disconnected computing - What happens when the construction company fixing the road cuts your cable connection? OK, probably the same thing as if you used a data center. But what happens if your cloud is based in the Middle East or Africa and a ship's anchor cuts an undersea cable? It could happen, and most providers won't tell you where your data and applications are sitting.
  4. How much does it cost? We know how much it costs today, but how much will it cost tomorrow? No one is offering long-term contracts for infrastructure because no one knows what it will really cost. Third-party management and tools vendors offer subscription pricing, which gives you some protection, but if the CPU-per-hour cost at your provider goes up, what happens to those subscriptions?
  5. SLAs! This is my particular peeve of the moment. Amazon offers a future discount for not meeting its SLA, which is not adequate for most businesses. However, coughing up cash for a major outage could put a provider out of business. I wonder if cloud insurance might be on the way?

Those are my 5 reasons to avoid the cloud for now.