Big data and the illusions of certainty and meaning


There’s a lot of buzz around AI and big data at the moment, particularly about how AI is already, or soon will be, smarter and more capable than humans.

One recent commentary on this theme that I really liked suggested that while AI is super smart, it is not smart in a human way. The authors suggest that AI is more analogous to a kind of alien intelligence: it’s usually super-good at one thing, rather than being able to adapt itself (as humans can) to any random task you throw at it.

And I think this is one reason why big data on its own doesn’t always live up to its promise. Yes, it sees patterns, but what do those patterns mean, and particularly, what do they mean in human terms? In a TEDx talk I listened to, Tricia Wang suggested that over 70% of big data projects, in an industry worth around $100 billion, aren’t actually profitable. And I suspect much of the real-time personalisation that big data promises to deliver just ends up as creepily well-targeted banner advertising. (Have you had that spooky thing where one mention of an item on Messenger immediately spawns a month of unwanted Facebook ads?)

My contention is not that big data is bad. It’s amazing, and we will no doubt see much more of it. It’s just that big data on its own is often hard to read. There’s so much of it that we, as pattern-recognising creatures, see patterns in it which may not be particularly meaningful in human terms.

That’s why I continue to spruik more human tools like ethnography: tools that focus on ‘thick data’ and describe the detail of people’s experiences, situations, feelings and interactions. These tools can help make sense of big data. By telling granular stories, we can start to link the numbers to a human narrative and build a far better understanding of what’s going on.