Big Data means huge amounts of data. Data storage has become very affordable, and new techniques make long-term storage of massive data sets easier, more reliable and safer than ever; Google and Facebook already offer services built on these techniques. The enormous growth of data brings a new challenge: how do you analyse all of it?
Big Data also has a high turnover rate: we can learn from data in real time and see the results of that analysis immediately. This is possible thanks to powerful processors that are far faster than they used to be.
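To make "learning from data in real time" a little more concrete, here is a minimal sketch in Python (the stream of readings and the window size are invented for illustration, not taken from this article): instead of waiting for a nightly batch job, a rolling average is updated the moment each new value arrives.

```python
from collections import deque

def rolling_average(stream, window=5):
    """Yield the average of the last `window` readings as each new one arrives."""
    recent = deque(maxlen=window)  # automatically drops the oldest reading
    for value in stream:
        recent.append(value)
        yield sum(recent) / len(recent)

# Simulated real-time feed: each reading is processed as soon as it comes in.
readings = [12, 15, 11, 30, 14, 13, 90, 12]
for avg in rolling_average(readings):
    print(f"current rolling average: {avg:.1f}")
```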

Big Data also covers a huge variety of structured and unstructured data. Nowadays it is possible to connect and process all kinds of unstructured data: we can look at internal data that is already available and combine it with the information that people leave behind online. The added value of this is high: 75% of all data is generated by users themselves.
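A small sketch of what "connecting" structured and unstructured data can look like in practice (the order records and user comments below are hypothetical examples, not data from the article): structured internal records are matched against free-text comments that users left behind online.

```python
import json
import re

# Structured internal data: records with fixed fields.
orders = [
    {"order_id": 1, "product": "laptop", "amount": 899.0},
    {"order_id": 2, "product": "phone", "amount": 499.0},
]

# Unstructured user-generated data: free-text comments left online.
comments = [
    "Love my new laptop, battery lasts forever!",
    "The phone arrived late but works fine.",
    "Thinking about buying a laptop next month.",
]

# Connect the two: count how often each purchased product is mentioned online.
mentions = {order["product"]: 0 for order in orders}
for text in comments:
    for product in mentions:
        if re.search(rf"\b{product}\b", text, re.IGNORECASE):
            mentions[product] += 1

print(json.dumps(mentions, indent=2))  # {"laptop": 2, "phone": 1}
```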
Finally, Big Data is about the veracity of information. Thanks to Big Data we can analyse everything, but a lot of data is just noise. How do you turn data into reliable insights and filter out that noise? That is an art. It is equally important to be able to place the information in context.
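As one illustrative sketch of noise filtering (the readings and the threshold are assumptions, and real pipelines use far more sophisticated methods): drop values that sit implausibly far from the median, using the robust "modified z-score" outlier test.

```python
from statistics import median

def filter_noise(values, threshold=3.5):
    """Drop values whose modified z-score (based on the median) exceeds threshold."""
    med = median(values)
    mad = median(abs(v - med) for v in values)  # median absolute deviation
    if mad == 0:
        return list(values)  # no spread at all: nothing to flag as noise
    return [v for v in values if abs(0.6745 * (v - med) / mad) <= threshold]

# Sensor-style readings with two obvious glitches mixed in.
raw = [21.3, 21.5, 21.4, 98.0, 21.6, 21.2, -40.0, 21.5]
print(filter_noise(raw))  # the 98.0 and -40.0 readings are dropped
```

The median-based test is used here rather than a plain mean-and-standard-deviation cutoff because extreme glitches distort the mean itself; whether a reading is a glitch or a genuine signal, of course, still depends on context.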