
January 2012

Your Data, Your Responsibility

I’ve been thinking recently about the post-PC era and what it really means; for some people it means the end of the desktop and the traditional PC, but I think that this is slightly wrong-headed. For me, the post-PC era is about my content, anywhere and at any time.

Access to data is more important than anything, but you might still use a traditional desktop for the heavy lifting and manipulation; tablets are great, for example, but for many tasks I would still want a traditional keyboard, mouse and big screen. When I’m away from base, though, I still want access to my data, and perhaps to do some lightweight manipulation.

So the post-PC world is moving us away from a single tethered end-point device to a multitude of devices, some mobile and some fixed. The applications we use on these devices may be different, in both scope and function, but the data will be common and accessible everywhere.

This will bring challenges to us as individuals and as businesses: where do we store that data, and how do we protect it, ensuring both that it is stored securely and that it remains available? The recent Megaupload closure has already led some people to question the long-term viability of cloud storage. What happens if the site you store your data on is suddenly shut down?

Question where you are putting your data; if it becomes obvious that a site has a slightly dubious reputation, then perhaps you should ask yourself whether you want to rely on its availability. But even if it is a site with the highest reputation, ‘Shit Does Happen’; so you probably want to ensure that you have multiple copies stored in multiple places.

But also be aware of the underlying service; if both of your cloud storage providers are reselling storage from the same underlying cloud provider, question again.
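As a personal example, here is a minimal sketch of that ‘multiple copies, multiple places’ habit; the paths are hypothetical stand-ins for a local archive disk and two different providers’ sync folders, it uses nothing beyond the Python standard library, and it verifies each copy with a checksum rather than assuming the copy worked.

```python
import hashlib
import shutil
from pathlib import Path

# Hypothetical locations: a local archive disk plus the sync folders of two
# independent providers (so one provider disappearing is an inconvenience,
# not a disaster).
SOURCE = Path("~/documents/important.tar.gz").expanduser()
DESTINATIONS = [
    Path("/mnt/archive-disk/backups"),
    Path("~/ProviderA-Sync/backups").expanduser(),
    Path("~/ProviderB-Sync/backups").expanduser(),
]


def sha256(path: Path) -> str:
    """Checksum a file so each copy can be verified, not just assumed."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def replicate(source: Path, destinations: list[Path]) -> None:
    """Copy the source file to every destination and verify each copy."""
    expected = sha256(source)
    for dest_dir in destinations:
        dest_dir.mkdir(parents=True, exist_ok=True)
        target = dest_dir / source.name
        shutil.copy2(source, target)
        if sha256(target) != expected:
            raise RuntimeError(f"Checksum mismatch for {target}")
        print(f"Verified copy at {target}")


if __name__ == "__main__":
    replicate(SOURCE, DESTINATIONS)
```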

Your data, your responsibility…


Meaningless Metrics!

Recently we’ve had a bit of a senior management reshuffle and as a result, people are in ‘prove that their teams are valuable’ mode again. Inevitably this means producing metrics to show why your team is brilliant, so my manager came along and asked me to produce some metrics about our storage estate.

So off I went and produced some pretty graphs showing storage growth, the increase in the number of technologies supported and some other things that I thought might show why my team does an excellent job. One of the problems with running a very good team is that they tend to have relatively low visibility; they don’t cause problems and things don’t often break. Most of the time, people don’t know that we are here.

Anyway, as is the way of these things, the usual comment comes back: how many terabytes per head do we manage, and what is the industry average? Now, with over two petabytes of active data per head under management, I could claim that my team is great, but to be honest no-one really knows what the industry average is, and would it be meaningful anyway? I’ve seen anything from 50TB to 1PB quoted, with a figure of 150-200TB the most oft quoted; so my team could be good, great or downright amazing (it’s somewhere between the last two).
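For what it’s worth, a quick back-of-the-envelope comparison using only the figures above (and assuming 1PB = 1024TB) shows just how elastic the answer is depending on which ‘industry average’ you pick:

```python
# Back-of-the-envelope comparison using only the figures quoted above,
# assuming 1PB = 1024TB; nothing more scientific than that.
TB_PER_PB = 1024

ours_tb = 2 * TB_PER_PB  # roughly two petabytes of active data per head

industry_quotes_tb = {
    "low quote (50TB)": 50,
    "oft-quoted (200TB)": 200,
    "high quote (1PB)": 1 * TB_PER_PB,
}

for label, quoted in industry_quotes_tb.items():
    print(f"{label}: about {ours_tb / quoted:.0f}x the quoted figure")

# low quote (50TB): about 41x the quoted figure
# oft-quoted (200TB): about 10x the quoted figure
# high quote (1PB): about 2x the quoted figure
```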

However, this is all meaningless and becomes more meaningless the more the industry changes. For example, we are managing what is closer to a big-data environment; big-data environments have large infrastructures but, if I am being honest, they are not especially hard to manage.

We rely on a fair amount of automation and standardisation; applications often do a lot of the storage management function, and although the storage infrastructure grows, it tends not to change massively. Allocations are large but relatively static, in that once storage is allocated, it does not move around a lot; we make a lot of use of clustered file-systems and most of the work we do is non-disruptive. We add nodes in, and even if a node fails, it tends not to take everything with it; we can live with a node down for weeks, because the applications are resilient and services can generally cope with failures.

We have our busy times but it does generally run pretty smoothly; most of our time is spent on working out how we can make it run even more smoothly and how we improve the service, which in my opinion is exactly how it should be. The best support teams look busy but not stressed; hero cultures are not where it’s at.

So I’ve given my boss a figure but I am really not sure that it has a lot of value. Lies, Damn Lies and Metrics!

Dell of Dilemma

I have just spent time at the Dell Storage Forum 2012 event (disclosure: Dell kindly paid for the hotel and subsistence) and although it is nowhere near the size of an EMC World-type event, everyone seemed to be having a good time and there were lots of smiling attendees.

Compellent customers seem to be almost uniquely happy with their supplier and this does not appear to have changed too much even after a year of Dell’s stewardship; so it appears that Dell are at least not messing this up.

But I think Dell have a dilemma: how do you meld a set of acquired IP and products into a strategy? Especially, with all due respect to Dell, in a company which does not have a massive history of product innovation.

Dell have been a great client-side supplier with strong supply-chain management and delivery, but they are not the company you immediately think about when talking about innovation.

Many of the announcements made at the Storage Forum are arguably just catch-up, and there are some interesting niche products; for example, SharePoint object storage, which shows Dell working in partnership with ISVs to address particular pain-points. But there is nothing which shows Dell leading the way.

I think Dell need to move beyond this and think larger; they need to build a strong narrative for the future and deliver on it.

1) Dell need a strong file story; they desperately need a credible NAS product. The Exanet IP gave them a file-system but, if you look at the Exanet product, that was about all it gave them.

2) Dell need an integrated management console for their products; their product range is still small enough that this is an achievable goal. If they leave it much longer, they will stand no chance.

3) Dell need to stop comparing themselves to HP and the like; they need to build their own narrative and focus on next-generation storage. This means looking at what some of the pure SSD players are doing and building a strategy around ‘What happens when SSD becomes free in the same way that spinning rust as a component is free?’; what can you do then that you cannot do now? Automated Storage Tiering is very clever, but will we still care about it in 10 years, or even 5? If all your primary storage is SSD, what changes?

4) Dell could also consider being Dell; what happens if you decide that you are not going to gouge your customer base and move away from the high margins which storage, and especially Enterprise storage, traditionally attracts?

5) Buy Brocade? Lots of smart people in Brocade and it would bring a number of innovators into the fold.

Whatever happened to Object Storage?

We have heard a lot about Object Storage, but really, how much impact has it had on the storage market so far? EMC make lots of noise about Atmos for sure, but I hear very conflicting stories on the take-up; NetApp bought Bycast and I hear a deafening silence; HDS have HCP and seem to be doing okay in some niche markets; Dell have their DX platform; and there are many smaller players.

But where is it being deployed? In niche markets like medical and legal, yes, but general deployment? I hear of people putting Object Storage behind NAS gateways and using it as a cheaper NAS, but is that not missing the point? If you are just using NAS to dump files as objects into an Object Store, you are not taking advantage of much of the metadata which is the real advantage of Object Storage, and you continue to build systems which are file-system centric. And if you really want a cheaper NAS, there might be better ways to do it.
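To make the metadata point concrete, here is a minimal sketch of writing an object directly rather than dumping a file through a gateway; it assumes the Python boto3 library talking to a hypothetical S3-compatible object store, and the endpoint, bucket, keys and metadata fields are all placeholders.

```python
import boto3

# Hypothetical S3-compatible object store; endpoint, bucket and credentials
# are placeholders for illustration only.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.com",
    aws_access_key_id="EXAMPLE_KEY",
    aws_secret_access_key="EXAMPLE_SECRET",
)

# A file dumped through a NAS gateway arrives with little more than a name
# and a timestamp; writing the object directly lets the application attach
# the metadata that later drives search, retention and workflow.
with open("scan-00042.tiff", "rb") as body:
    s3.put_object(
        Bucket="medical-imaging",
        Key="patients/12345/scan-00042.tiff",
        Body=body,
        ContentType="image/tiff",
        Metadata={
            "patient-ref": "12345",
            "modality": "ct",
            "retention-class": "7-years",
        },
    )

# The metadata travels with the object and comes back with it.
response = s3.get_object(
    Bucket="medical-imaging",
    Key="patients/12345/scan-00042.tiff",
)
print(response["Metadata"])
```

Dump the same file through a NAS gateway as a cheap file share and most of that context never makes it anywhere near the store.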

For Object Storage to take off, we need a suite of applications and APIs which are object-centric; we need a big education effort around Object Storage, aimed not at the storage community but at the development and data communities.

Object Storage is currently being sold to the wrong people; don’t sell it to Storage Managers. We’ll manage it when there is demand for it, but we are probably not the right people to go out and educate people about it. Yes, we are interested in it, but developers never listen to us anyway.

I hear Storage Managers saying ‘we’d be interested in implementing an Object Storage solution but we don’t know what we’d use it for’; this isn’t that surprising as most Storage Managers are not developers or that application-centric.

If you don’t change your approach, if you don’t educate users about the advantages, if you continue to focus on the infrastructure, then we’ll be asking this question again and again. Object Storage changes infrastructure, but it is probably more akin to a middleware sale than an infrastructure sale.


Happy New Year

Hope everyone had a nice break and is ready to get back into the swing of things; 2012 is upon us, and those of us living in London look forward to a summer of travel chaos and ever-increasing levels of hyperbole. It is the year of both the London Olympics and the Queen’s Diamond Jubilee, so a great time to visit London and probably a great time to be living elsewhere.

Next week sees the Dell Storage Forum in London and the first #storagebeers of the year. Dell have had a year now to get their storage portfolio in order and 2012 must be the year that they begin to see their acquisitions deliver; yet even that might not be enough, and we need to see some innovation and road-maps presented. From Exanet to Compellent via EqualLogic, there is enough product, and I am looking forward to seeing how it gets woven into a strategy.

Yet Dell are not the only company who need to start weaving a strategy; arguably, with the exception of EMC, this is the year when everyone needs to start drawing the weft and clothing their products with strategy and coherence.

And it is not just the vendors who need to get their strategies in order; this is very much the case for the end-user as well. Too much product and too much fluff still proliferates in many end-user organisations, often due to a confusion between flexibility and choice.

From Cloud to Data Analytics, there has been a lot of playing with these technologies, but many organisations need to move beyond this and into delivering on that investment with results. As in every year, there is lots to do and, as in every year, there might be too much to do. Start stripping away the fluff and delivering.