Something weird and unexpected happened lately: the big Amazon cloud failed. Every media outlet is talking about it, and all of them convey the same sense of surprise.
Wait a second and let’s ask ourselves: why is it weird and unexpected that AWS failed? The cloud is something human, so it has failed, as expected, and it will fail again. For that matter, I am sure it failed a number of times in the past, but those failures weren’t big enough to be noticed like this last big event.
Just a quick note to let you know that Amazon is listening to us and has added a new feature to EC2: persistent storage.
As a subscriber to AWS services, yesterday I received an email in which Amazon announced that we “will be able to create volumes ranging in size from 1 GB to 1 TB, and will be able to attach multiple volumes to a single instance. Volumes are designed for high throughput, low latency access from Amazon EC2, and can be attached to any running EC2 instance where they will show up as a device inside of the instance…“.
The email ends by saying that the new functionality “will be publicly available later this year” and offers a link to request to join the private beta program; I signed up, and I will let you know as soon as I get my hands on it.
Recently I stumbled upon a couple of articles1,2 and, recalling my experience with EC2, I realized that utility computing was not what I was looking for. I wanted something that would help me without adding complexity, but I was not satisfied with simple web hosting offers either: I also wanted complete control over my infrastructure, both for the technical freedom I might need and because, when it comes to my customers’ data, I trust no one.