Big data is set to revolutionise public services, but projects should focus on those that the public will understand, such as improvements in patient care.
Eddie Copeland, head of technology policy at think tank Policy Exchange, said big data can be used to help public sector services run more efficiently.

Speaking at a roundtable event organised by WANdisco, Copeland said there would be a £12bn shortfall in the public sector budget by 2020 and austerity in the public sector will continue, but “with big data we can change the model – prevention is better than cure”.
He said the public sector should focus on outcomes that really matter to people, pointing out that in the private sector consumers get an immediate tangible benefit in return for their data, but that is not the case in the public sector.
Tim Kelsey, director for patients and information at NHS England, said the organisation was “serious about making the NHS a data-driven organisation” this year, and would be “starting up” its Care.data programme again, despite it being labelled a failure last year.
A report released in December 2014 by the All Party Parliamentary Group for Patient and Public Involvement in Health and Social Care and the Patients Association found there was still widespread concern about the NHS’s Care.data programme, with people worried about how their personal data would be used.

Beyond Care.data, Kelsey said the NHS would be looking at genomics.
“This will not only allow us to understand our genes, but also to relate that to the real-time movement of molecules in our bodies. The datasets will challenge how we store data and the volume of that data,” he said. 
As an example, Kelsey described how Imperial College had modified machines used to analyse athletes’ urine during the London 2012 Olympics for use as molecular diagnostics machines. “These machines are only looking at the molecular construction of urine, but are producing more data per day than the Large Hadron Collider,” he said.
According to David Richards, founder and CEO of WANdisco, the economics of big data is changing. “Digital storage is growing at 60%, while IT budgets are only growing at 5%,” he said.
As an example, he cited a US agency that wanted to build a disease database. Running and deploying the system on a commercial, enterprise-class data warehouse, he said, would have cost the agency more than the GDP of a top-five country.
Richards said Hadoop changes the economics, allowing governments and big businesses to run big data projects using commodity storage and servers.
