Much has been written about the accelerated digital transformation of the public sector brought on by the Covid-19 pandemic. Driven by the need to deliver a better online experience for citizens, public sector organisations are exploring how to create more efficient and personalised services.
Earlier this year, the Scottish Government published its new digital strategy, A Changing Nation: How Scotland will Thrive in a Digital World, to highlight its commitment to ensuring Scotland keeps pace with the challenges and opportunities the digital world brings.
But while digital initiatives accelerate across the Scottish public sector, data volumes are growing at an exponential rate, alongside a greater desire to actively share data between agencies and departments. That sharing has been vital during the pandemic, from evaluating virus symptoms to informing the vaccination rollout and keeping the public informed.
Citizens now accept that data is shaping our lives in this way, and that the challenges the UK faces – whether security, taxation or the movement of people – require better use and sharing of data in the public sector.
But what is stopping Scottish public sector organisations from capitalising on this new data era?
The biggest challenge with managing data today is not only the growth rate but the changing nature of the data itself. The majority of growth is coming from unstructured and semi-structured data such as images, video and machine-generated content from Internet of Things devices.
Much of this data rapidly becomes “stale”, that is, no longer actively accessed by users or applications, but that does not mean it can be deleted, especially in public services.
For example, medical X-rays of a broken arm that has since healed must be retained to inform future assessments of a patient’s health, and evidence from a criminal case that has since closed must be kept in case new information emerges.
Data that does not need to be accessed regularly but must still be retained adds to the cost and resources needed to manage it.
At the same time, cybersecurity threats such as ransomware are ever more real, prompting organisations to consider additional data protection and security measures.
The key is to look at technologies and architectures that can contain data storage costs without compromising performance or availability, while also strengthening security and disaster recovery capabilities.
With many public sector organisations still reliant on legacy technology, there are three key considerations to help them adopt a secure and sustainable data strategy:
Build a data fabric, not datacentres
As public sector organisations develop their approach to buying appropriate cloud services, they should consider how data will be stored and managed. Rather than building separate, disconnected silos, consider a data fabric that spans a hybrid cloud environment, with provision for centralised backup, disaster recovery and overarching data management and controls to reduce operational cost and complexity.
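As a purely illustrative sketch of what “centralised backup” across a hybrid environment can look like at its simplest, the snippet below copies files from an on-premises directory into an S3-compatible object store. The bucket name, prefix and local path are hypothetical, and a production data fabric would normally rely on the platform’s own replication and backup tooling rather than a custom script.

```python
# Illustrative sketch only: back up files from an on-premises directory
# into an S3-compatible object store. Bucket name, prefix and local path
# are hypothetical examples, not specific products or recommendations.
import os

import boto3  # AWS SDK for Python; also works with S3-compatible endpoints


def backup_directory(local_dir: str, bucket: str, prefix: str) -> int:
    """Upload every file under local_dir and return the number uploaded."""
    s3 = boto3.client("s3")
    uploaded = 0
    for root, _dirs, files in os.walk(local_dir):
        for name in files:
            path = os.path.join(root, name)
            # Preserve the directory structure in the object key.
            key = prefix + os.path.relpath(path, local_dir).replace(os.sep, "/")
            s3.upload_file(path, bucket, key)
            uploaded += 1
    return uploaded


if __name__ == "__main__":
    count = backup_directory("/data/records", "example-backup-bucket", "onprem-backup/")
    print(f"Backed up {count} files to central object storage")
```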
The value-cost equation
Exponential data growth will continue, so organisations need environments that help them better understand and categorise their data so it can be stored and optimised more appropriately. Storage efficiency, data tiering and cloning also help reduce the environmental impact of data storage.
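As a simple, hypothetical illustration of data tiering, the sketch below moves files that have not been accessed within a given window from a “hot” directory to a “cold” one. The paths and the 180-day threshold are assumptions for the example; in practice, tiering is usually policy-driven within the storage platform itself.

```python
# Illustrative sketch only: an age-based tiering pass that moves files
# not accessed within a threshold from a "hot" tier to a "cold" tier.
# Paths and the 180-day threshold are assumptions for this example.
import os
import shutil
import time

STALE_AFTER_DAYS = 180  # assumed definition of "stale" for this sketch


def tier_stale_files(hot_dir: str, cold_dir: str,
                     stale_after_days: int = STALE_AFTER_DAYS) -> int:
    """Move files whose last access time is older than the threshold."""
    cutoff = time.time() - stale_after_days * 24 * 60 * 60
    moved = 0
    for name in os.listdir(hot_dir):
        path = os.path.join(hot_dir, name)
        # Note: access times may not be tracked on all filesystems.
        if os.path.isfile(path) and os.path.getatime(path) < cutoff:
            shutil.move(path, os.path.join(cold_dir, name))
            moved += 1
    return moved


if __name__ == "__main__":
    moved = tier_stale_files("/data/hot", "/data/cold")
    print(f"Moved {moved} stale files to the cold tier")
```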
A shared responsibility model
Just because organisations are using the cloud does not mean they have no responsibility for their data. Organisations must put in place a framework that makes security obligations and accountability clear, so that data is protected and there is a plan if something goes wrong.
As we navigate this new data era, these three areas will be vital in helping Scottish public sector organisations become more efficient and sustainable with data.
Author: Adrian Cooper is UK CTO of NetApp, a leading cloud-led data management company
Partner Content in association with NetApp