Big data.
Two words spoken by many experts in many fields, from science to social networks, to economics and finance, to Earth observation. But while producing big data is one thing, securing it so that it remains useful, if not accessible, over time is quite another.
“‘Big science’ efforts led by international consortia typically have data-management and sharing plans built in. But many labs doing small- to medium-scale studies in more specialised areas – such as analysing the biological contents of a single lake, or tracking the physiology of specific animal models – have no such systems. Their data often remain siloed in the labs that generated them, fading from memory as project members leave. For the scientific community, that’s a tragedy of wasted effort, lost collaborative opportunities and irreproducibility”, says an interesting article in Nature (see below). But solutions are emerging. They are nicely described in this paper, which concludes: “Remedying this will require structural changes in the infrastructure for scientific funding and support. But the rising generation of scientists – born into an era of open-access, open-source and automated science – might be more amenable to the effort than their predecessors.”
But what if such big data cannot be disclosed unless encrypted, for privacy reasons? That is the case, among others, in the medical world. Here too a solution is emerging: “fully homomorphic encryption”, which allows analyses to be run on data without anyone ever seeing their contents. “It could help us reap the full benefits of big data, from fighting financial fraud to catching diseases early”, writes a New Scientist article. Even better: this method has just been tested on data from Swiss hospitals!
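To give a flavour of what “computing on encrypted data” means, here is a toy sketch in Python. It uses an additively homomorphic scheme in the style of Paillier – much simpler than full homomorphic encryption, which supports arbitrary computations – with deliberately tiny, insecure parameters chosen purely for illustration. The key point it demonstrates: two values can be summed while still encrypted, and only the key holder learns the result.

```python
# Toy additively homomorphic (Paillier-style) encryption sketch.
# Illustrative only: tiny primes and no hardening -- NOT secure.
import math
import random

p, q = 211, 223          # toy primes; real deployments use ~1024-bit primes
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
# mu = L(g^lam mod n^2)^-1 mod n, where L(x) = (x - 1) // n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m):
    """Encrypt integer m (0 <= m < n) under the public key (n, g)."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Recover the plaintext using the private values lam and mu."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts,
# so a server can compute the sum without ever decrypting the inputs.
a, b = encrypt(120), encrypt(35)
total = decrypt((a * b) % n2)
print(total)  # 155
```

In a hospital setting, this is the shape of the idea: encrypted patient values could be aggregated by an untrusted analytics server, with only the hospital holding the decryption key. Fully homomorphic schemes extend this from addition to arbitrary computation, at a significant performance cost.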
- Olivier Dessibourg, GESDA
(EN)