How quickly can data work for you?
13/07/18 | by: Jack Blogg
By now, most IT professionals have seen and understood the potential benefits of Big Data. The problem is often the time it takes to reach the value apparently on offer. This is possibly truer in the public sector than anywhere else, where strictures on how data may be used are rightly robust.
A lot of the difficulty experienced both in the private and public sectors, however, is technological rather than procedural or legal. Put simply, the time it takes to surface data from an underlying system is too long because one piece of data doesn’t marry up with another. The data has to be shared, even internally, as swiftly as possible.
There are two sides to making this happen as swiftly as public sector clients want. One is the analytics. The right software has to be asking the right questions of the data or it’s simply not going to deliver what’s required. The second is the underlying storage system.
Most organisations that have been around for any length of time have accumulated a hoard of legacy systems, particularly in how they manage their storage. Something that was on paper in the 1960s might have been archived to tape in the late 1970s and through the 1980s, with disk taking over in the 1990s and solid state or other modern technologies getting a look-in most recently. Inevitably, once an analytics programme has to sift through all of these media, it takes time.
This is why a partnership with a storage specialist such as S3 can pay early dividends. Using software-defined storage in partnership with IBM, we can present all of these media to your system as a single pool of storage, searchable in one pass. This delivers the sort of speed a modern IT executive needs.
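To make the idea concrete, here is a minimal sketch of what "persuading the system it's all storage" looks like in principle: several media-specific back ends sit behind one interface, so a single query runs across every tier. All class and method names here are illustrative assumptions, not part of any IBM or S3 product API.

```python
# Illustrative sketch only: a unified search layer over heterogeneous
# storage tiers. Names are hypothetical, not a real vendor API.
from abc import ABC, abstractmethod


class StorageBackend(ABC):
    """One physical tier: tape archive, spinning disk, SSD, etc."""

    @abstractmethod
    def search(self, term: str) -> list[str]:
        ...


class TapeArchive(StorageBackend):
    def __init__(self, records: list[str]):
        self.records = records

    def search(self, term: str) -> list[str]:
        return [r for r in self.records if term in r]


class DiskStore(StorageBackend):
    def __init__(self, records: list[str]):
        self.records = records

    def search(self, term: str) -> list[str]:
        return [r for r in self.records if term in r]


class UnifiedStorage:
    """Presents every tier as a single searchable pool."""

    def __init__(self, backends: list[StorageBackend]):
        self.backends = backends

    def search(self, term: str) -> list[str]:
        hits: list[str] = []
        for backend in self.backends:
            hits.extend(backend.search(term))
        return hits


# One query, answered by every tier at once.
pool = UnifiedStorage([
    TapeArchive(["1978 census record", "1984 payroll record"]),
    DiskStore(["1996 payroll record", "2001 census record"]),
])
print(pool.search("payroll"))
```

The point of the abstraction is that the analytics layer never needs to know which medium holds which record; adding a new tier means adding one back end, not rewriting the queries.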
It’s robust, designed to comply with existing legislation and with known forthcoming requirements – and it starts with a conversation with S3.