Wednesday, December 10, 2008

Final Post on the Databases in the Cloud Blog

Well, this will be the final post on this blog. I have been neglecting it anyway. The good news is that I want to pick back up on cloud blogging but I wanted to tie it to my new site, DatabaseWisdom.com. That site has a blog so I created a new blog with the same look and feel but covering cloud topics instead of database topics.

The new blog covers cloud computing in general (a more generic focus than databases in the cloud). The URL is http://clouddb.info/.

A final note: in addition to the blog, I am writing a book, Cloud Computing With Amazon. It will be available on Amazon, Mobipocket (an ebook site) and many other bookstores within the next month. I will be posting an HTML version, viewable for free, on http://databasewisdom.com/

I will also post updates about the book at http://clouddb.info/

Take care and I hope to see you on the new blog.

LewisC


Friday, August 22, 2008

iCloud OS

I'm evaluating various web-based desktops. A web desktop is a desktop in the cloud. So far, I have found three that I like. Today's is iCloud by Xcerion. I'll have a lot more info after my beta login is approved, but from the videos, iCloud looks like the most advanced of all the cloud OSes I have looked at thus far.

Here is a quick 3 minute video showing how to use the iCloud day planner.

And here is a little bit longer demo (5 minutes) of the OS itself.

LewisC


Friday, August 8, 2008

MS Live Mesh - Remote Desktop Meets the Cloud

You might not think of remote desktop as a cloud tool but MS has added cloud storage to remote desktop and called it Live Mesh. I have been using it recently and it is pretty nice. I use VNC fairly extensively and, when I'm not using VNC, I tend to use SSH. Well, I heard about this Live Mesh thing and decided to download it and give it a try. It is currently a beta product but I haven't had any issues.

My first thought on using it was that it was a clone of GoToMyPC. I'm not a GoToMyPC user so I can't say for sure, but it looks that way. The big difference is that GoToMyPC doesn't have a free version or online storage. I think the integration, storage and synchronization services are what make Mesh a unique tool.

Since it's an MS product, you might expect there to be no Linux support. You would be right. There is Mac OS X support, though, and Windows Mobile support is on the way. I would be a lot more excited if they planned to support BlackBerry. Still, it is a great way to stay in touch with my desktop, laptop, work computer and the non-Linux database servers in my lab.

This is what my home desktop looks like from the Live Mesh Desktop:

You can access a Live Mesh remote desktop from any computer that can run IE. The computer you are connecting FROM does not need to be running Live Mesh at all. Just log into your account at livemesh.com, and you can then connect to any device in your mesh.

When you log in via a browser, you get a device screen where you can see all of your devices and connect new ones. Device names are not the actual hardware identifiers; you get to give each device a friendly name.

If you open your live desktop, you can create folders to store data in the cloud. You currently get 5GB of storage for free. You can create multiple directories and automatically sync those directories to the devices of your choosing. As an example, I created a Documents directory. Anything I put in that directory is automatically propagated to my work computer, my laptop (BIGDOG) and to one of my database servers (which has partially become my son's computer).
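Conceptually, that propagation is just a one-way copy of new or changed files from a source folder to every subscribed device. This is not how Live Mesh is actually implemented (its internals aren't public), and the function name here is my own, but a minimal Python sketch of that kind of folder sync looks like this:

```python
import filecmp
import shutil
from pathlib import Path

def sync_folder(source, targets):
    """Copy any new or changed files from `source` into each target folder.

    A toy, one-way version of what a sync service does: walk the source
    tree and copy a file only when the target copy is missing or differs.
    """
    src = Path(source)
    for target in targets:
        dst = Path(target)
        dst.mkdir(parents=True, exist_ok=True)
        for item in src.rglob("*"):
            if not item.is_file():
                continue
            dest_file = dst / item.relative_to(src)
            dest_file.parent.mkdir(parents=True, exist_ok=True)
            # Copy only if the file is new or its contents changed
            if not dest_file.exists() or not filecmp.cmp(item, dest_file, shallow=False):
                shutil.copy2(item, dest_file)
```

A real service like Mesh adds the hard parts on top of this: change detection without full rescans, two-way conflict resolution, and pushing deltas over the network instead of whole files.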

This is my Live Desktop:

I haven't really noticed any speed differences between the IE connection and the Remote Desktop tool. I prefer to use the remote desktop but I can't really say why. Here is what my db server looks like, first in IE and then in the remote desktop tool:

IE:

Remote Desktop:

As you may notice in some of the above shots, Mesh windows have a side panel with tips and information. I snipped that panel out of some of these shots. It's kind of annoying but usually provides helpful information. I could see, if this becomes a pay service in the future, ads being placed there to support a free version.

This is the text of the help window when connected to a remote desktop:

So that's pretty much Live Mesh. It offers syncing and free storage (more than Mozy, even) as well as a remote desktop that is accessible from a browser. I think this just shows how much cloud computing will be integrated into everyone's life, not just into business.

LewisC


Thursday, August 7, 2008

Hot Cloud Computing Job

Have you ever wished for the perfect job? Something you're interested in and that will be fun? Aptana is hiring. It's bleeding edge and perfect for the uber-hardcore-geek 2.0.

I've used Aptana Studio, their web-language IDE. It is a free, open source IDE that can run standalone or as part of Eclipse. This isn't a lightweight IDE like so many others; it is a full-featured web development environment.

Aptana also offers an Ajax server (which I haven't used) called Jaxer, and a brand new cloud computing environment called, obviously enough, Aptana Cloud.

If you want to know a little more, here is the description from the Aptana site:

Aptana Cloud is a scalable Elastic Application Cloud™ featuring fully stacked and integrated PHP app engines, Ajax/Jaxer app engines, and soon Ruby on Rails app engines -- ready to use and ready to scale for your apps and sites as you need it. Aptana Cloud plugs right into Aptana Studio to provide instant deployment, smart synchronization, and seamless migration as you scale with those same app engines running locally on your desktop for development and testing before you deploy. Aptana Cloud's team management, integrated source control, application monitoring and other turnkey services make working with the Cloud a breeze. Aptana Cloud is ideal for developers who use scripting languages to create Ajax, Facebook, mySpace and all other sorts of Web applications.

I downloaded the cloud tools, signed up for the beta, and will be playing with it over the next few weeks. It's pretty cool. Expect some hands-on updates here.

The purpose of this entry, though, is to tell you that Aptana is hiring. Just to clarify: I don't work for them or get paid by them. This is just an update because the job sounded so cool. If it weren't 3,000 miles away, I would apply for it. Check it out:

Cloud Team

The Aptana Cloud team is chartered with defining, designing, implementing, and managing the Aptana Cloud offering, including APIs and infrastructure to support multiple cloud server and service providers, management applications and tooling, and integration facilities. Key positions are available on this team. If your skillset and passions fit some significant subset of the following list, please get in touch - we are looking for a few brilliant, versatile, driven individuals as we prepare to launch our 1.0 offering.

Do you:

  • Love to build, assemble and administer OS + software systems of many different sorts?
  • Understand virtualization and abstraction of APIs at many levels?
  • Want to understand and work closely with top-notch cloud server providers: Amazon, Google, and others?
  • Get excited about defining the future of cloud services and integration points?
  • Yearn to build very cool management and reporting applications on top of the services?
  • Feel confident in being the technical liaison with integration partners?
  • Excel at gluing systems together or - when you can't find the right pre-built system - excel at just building your own?
  • Wear many hats - and are you the person everyone always comes to for expertise in multiple areas?
  • Jump at the opportunity to own a scalable, fail-safe Java server to…
    • Automatically provision nodes in a cluster based on "smart" business rules
    • Define and implement APIs that talk to the cloud provider, the IDE, the billing system, the monitoring system, and even the cloud users' code
    • Offer both OLTP and OLAP reporting for monitoring, management, and trend analysis
    • (At least this part of the cloud infrastructure is in Java; the rest can be in various languages)


Areas and systems of interest:

  • The software stack of virtual servers (web, DB, app server, etc.)
  • Monitoring and instrumentation (e.g. Nagios, and/or roll-your-own scripts)
  • Versioning a cluster of machines
  • Optimizing performance and resource utilization (incl. kernel hacking)
  • Horizontal scalability, distributed systems, failsafe design
  • RESTful APIs
  • Java Tomcat
  • MySQL

Please submit resumes to: kathy@aptana.com. Thanks!

Like I said, if the job wasn't on the west coast, I would SOOOOO be applying for it.

If you do apply, let me know how it goes.

LewisC


Wednesday, August 6, 2008

AT&T Plays in the Cloud

I have said quite a few times that cloud computing is more like a phone company than like electricity. A phone company can offer various services, and those services vary from provider to provider. Electricity needs to be the same for everyone (at least within a region).

Today, AT&T announced its move to the cloud: AT&T Launches Global Utility Computing Service. According to this article from GridToday, AT&T announced its new AT&T Synaptic Hosting, a managed networking, security and storage service for businesses. They are building five "super data centers," bringing their total to 38 data centers worldwide.

A core feature of AT&T Synaptic Hosting is its next-generation utility computing platform. This enables the service to deliver a complete hosting solution with features that use the AT&T network to manage applications, compute resources on servers and store data. AT&T Synaptic Hosting also provides designated account support all backed by a single end-to-end, service-level agreement that is unique within the industry.

This looks like the beginning of something. Maybe telephony will morph to VoIP, and the big Bells, with their tremendous computing power and dedicated networking, will become the home of the cloud. Just because Amazon and Google started it doesn't mean they will do it best or last the longest. This is the kind of thing the major communications companies need.

"Today's announcement is yet another example of AT&T's commitment to deliver next-generation services and solutions to companies worldwide," said Ron Spears, group president, AT&T Global Business Services. "The AT&T global network, combined with our powerful computing platform, is driving the convergence of networking and hosting services in ways that are allowing companies to deliver end-user applications whenever and wherever they are needed - while paying only for the capacity actually used."

The official Web site of the U.S. Olympic Committee (USOC) is powered by AT&T Synaptic Hosting. Teamusa.org is the USOC's new feature-rich Web site that connects fans of the U.S. Olympic and Paralympic teams with America's athletes on their journey to the Olympic Games. The site features stories on U.S. Olympians and Paralympians and Olympic and Paralympic hopefuls, athlete blogs and social networking tools.

And they wouldn't be the phone company if they didn't offer a menu of options and add-ons.

In addition to utility computing features, AT&T Synaptic Hosting offers the following:

  • A broad selection of dynamic storage and security features that enterprises have come to rely on to protect their data and assets.
  • The ability to use AT&T's BusinessDirect customer portal to easily manage capacity, complete maintenance and monitor network service and performance of their virtual IT environment.
  • Personalized support from teams of designated hosting and application specialists who are experienced in the business and technical needs of the clients.
  • Application monitoring and reporting capabilities that work with most client software available in the industry today.
  • One end-to-end service level agreement that covers the customer's entire environment.

You can bet there will be more to come from AT&T.

LewisC


Sunday, August 3, 2008

IBM is Building $400 Million Data Centers for the Cloud

According to GridToday, IBM is spending $400 million on two new cloud data centers, in Tokyo and North Carolina. The big news is that IBM is shooting for a very green data center.

In North Carolina, they are renovating an existing building and reusing 90% of the existing components. They plan on the center being 50% more energy efficient than the industry average. The site will initially have 60,000 square feet of raised flooring and will be modularly expandable. The center will open sometime in 2009. IBM received $750,000 in local and state economic incentives for building the center.

The Tokyo center will be a customer-facing center that will help companies and universities use and implement cloud computing. This is IBM's ninth cloud computing center worldwide. Japan is a very mature and complex technology corridor, and partly because of that maturity, many existing systems are old and inflexible. IBM will be helping Japanese companies adapt to cloud computing.

"To develop high skilled human resources in IT field, it is necessary to create latest IT environment in education place," said Hiroto Yasuura, Dean of Graduate School of Information Science and Electrical Engineering, Kyushu University. "Kyushu University is very interested in cloud computing technology, which can provide an on-demand IT environment to our students and teachers. We have been working with IBM, the pioneer of this field. Kyushu University will continue to take advantage of cloud computing technology more actively."

IBM, rather than being a cloud provider, is looking to be the cloud consultant. I imagine they will also eventually release a private cloud with their own software.

On the green front, IBM is investing $1 billion in R&D efforts to increase the efficiency of data centers. IBM has several huge data centers worldwide. I've used the Dallas center, remotely, to prototype a large BI system I developed several years ago. At their center, I had access to a 32-CPU p595 with 512GB of RAM and an EMC SAN. That was just one of the many systems available for customer prototypes.

The center will be partially powered by alternative power sources (although IBM hasn't yet said what kind). The center will use virtualization as much as possible, which will further save power. In the winter months, the center will use a free-cooling system that allows it to cool itself naturally. Very cool.

This is just another sign of the growing maturity of cloud computing. I'm glad IBM is investing so heavily in the cloud.


Saturday, August 2, 2008

Dell Tries to Trademark "Cloud Computing"

Someone needs to smack Dell. I was reading Yahoo news and ran across an article about Dell. It says that Dell has applied to trademark the term "cloud computing". What a bunch of jerks.

They have made it past the notification-of-allowance phase and are now in the opposition phase. That means companies can file complaints and let the USPTO know of their objections. I hope plenty of companies do file an objection. This is really what is NOT needed right now in the cloud.

I don't see how the USPTO can even allow this claim of a trademark. I'm not a lawyer but I'm pretty sure cloud computing as a term was around long before Dell even thought of entering the cloud hardware market. This is a lot like non-open source companies trying to take over the open source moniker.
