
The booming industry

Big data has been hailed as a new factor of production with the potential to unlock new sources of economic value and give fresh insights into human behaviour. But how can the sheer volume of data available today be mined to make a qualitative difference to our lives? Thomas Landolt, CEO of IBM Switzerland, shares his thoughts on the booming industry of «big data». James O.H. Nason

Thomas Landolt, at what point does data become big data?

We use a definition combining what we call the four «Vs»: Volume, Velocity, Variety and Veracity. Volume refers to the sheer magnitude of data available today, and you can imagine how much this has increased in one generation. Velocity has to do with the speed at which data is being generated in real time by electronic sensors and the like. Then we have Variety: besides so-called «structured» data, which is well defined and found in databases, you also have to deal with «unstructured» data – in fact, about 80% of data today is unstructured. And finally, Veracity: a lot of data is of questionable reliability – you simply don't know whether it's trustworthy or not – so how should you handle it? Any concept of big data must encompass these four characteristics.

Isn't big data also a relative concept, in that what qualifies as big data today might in five years' time be considered ridiculously small?

Absolutely. I remember the time when we were proud to have a computer that could process 32 kilobytes of data. At the time this was considered a sensational performance, but today we would laugh.

The big data industry – specifically data management and analytics – is reportedly growing twice as fast as the software industry in general. What's driving this growth?

Several factors. On the production side there's the Internet, for example, which is producing a huge volume of data – just think of all the statements posted on Twitter. Then you have sensors and many other devices being widely installed which measure and report data in their respective areas. Then there's the consumer side where, thanks to the availability of the corresponding technology and software, companies now have the technical means to gain insights into customer preferences, behaviour, risk profiles and so on. So it's primarily a combination of the producer and consumer sides that's driving this whole market and giving rise to the demand for big data processing solutions.

The volume of digitised data continues to grow exponentially, as do expectations about how to access and analyse it faster. At the same time, data is being produced at a rate which outstrips our physical ability to store it. Are we talking primarily about a hardware or a software challenge?

Ultimately it's both, and from our point of view of course it's also a market. But it starts with a question on the business side: what do you want to do with your data, and what new insights can you get from it? This leads to increasingly sophisticated algorithms to analyse the data, so from that point of view it's a software issue. But then you also have to consider storage requirements and the capacity to transfer enormous quantities of data over a network, and that's a hardware issue. So it's really a whole chain of components that's required.

How can big data help companies unlock new sources of economic value or maintain a competitive edge in their respective industries?

This can best be illustrated with examples. Imagine, for example, a retail organisation which, thanks to big data, can obtain a much better

Oec. Dezember 2014, Fokus