Where’s my data gone?

Understanding how data flows within an organisation is key to ensuring that it can be managed and analysed effectively.

Photo credit: Silvia Paola Lai

By Nicola Brady, QA Compliance Specialist
Data is already everywhere, and as information technologies evolve and the world becomes more automated, the volume of available data keeps growing. Understanding how data flows within an organisation is key to ensuring that it can be managed and analysed effectively. The bigger data gets, the more complex it is to deal with, which is why understanding the supply chain of your data is so important. This is particularly true for the life science industry, where quality and GMP decisions are made every day based on data, and where data is itself a critical product because it underpins all products and processes. It is therefore imperative that we understand how and where data flows: do you know where your data is, who is accessing it at any given time during its lifecycle, and how you can assure the preservation of data integrity?

The data flow for a given process or system can be defined as the supply chain for all data, metadata, inputs and outputs for that process and system. All data goes through a lifecycle of creation, processing, review, reporting and use, retention and retrieval, and destruction. During that lifecycle, data may cross between different systems, move between manual (paper) processes and computerized systems, and pass into cloud-based applications and storage. Data may also move across organisational boundaries, e.g. internally between departments, or externally between regulated companies and third parties. Understanding and controlling these hand-offs between processes, systems and entities is already complex, and even more so where the data is moving in and out of cloud-based applications provisioned by a third party!
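To make the lifecycle concrete, here is a minimal sketch in Python (all names, such as DataFlowRecord and the example systems, are hypothetical illustrations, not any real product's API) that models the stages above and records each hand-off, so the question of where a piece of data sits, and who holds it, always has an auditable answer:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class Stage(Enum):
    CREATION = "creation"
    PROCESSING = "processing"
    REVIEW = "review"
    REPORTING_AND_USE = "reporting and use"
    RETENTION_AND_RETRIEVAL = "retention and retrieval"
    DESTRUCTION = "destruction"

@dataclass
class HandOff:
    timestamp: datetime
    stage: Stage
    system: str       # e.g. "LIMS", "paper batch record", "cloud archive"
    custodian: str    # department or third party currently holding the data

@dataclass
class DataFlowRecord:
    record_id: str
    history: list[HandOff] = field(default_factory=list)

    def hand_off(self, stage: Stage, system: str, custodian: str) -> None:
        # Log every movement, so "where is the data, and who holds it?" is always answerable.
        self.history.append(HandOff(datetime.now(timezone.utc), stage, system, custodian))

    def current_location(self) -> HandOff:
        return self.history[-1]

# Example: a result created in the lab, reviewed by QA, then archived with a cloud provider.
record = DataFlowRecord("BATCH-0042-ASSAY")
record.hand_off(Stage.CREATION, "LIMS", "QC Laboratory")
record.hand_off(Stage.REVIEW, "LIMS", "Quality Assurance")
record.hand_off(Stage.RETENTION_AND_RETRIEVAL, "cloud archive", "third-party cloud provider")
print(record.current_location().custodian)  # -> "third-party cloud provider"
```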

Has your organisation made a decision to outsource activities, such as data storage, to external cloud service providers? Are you taking a risk handing over your data to an unknown entity? Do you understand how your data will be protected and controlled by the external service provider? Does the external service provider fully understand what's expected from a life-science regulatory perspective? Are they willing and able to demonstrate this?

To mitigate the potential risk to your data when outsourcing to a third party, you must have a clear understanding of exactly where your data will reside, whether other third-party suppliers or subcontractors will have access to it, and what security control measures will be implemented to safeguard it. This can only be achieved through appropriate vetting of your potential third-party supplier. Once you are satisfied with the supplier, ensure that an agreement and contract containing explicit requirements and controls are established and approved before using the supplier for the outsourced activity. Once the supplier is in use, carry out periodic evaluations to confirm that the requirements and controls agreed in the contract are being adhered to.
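As one way to picture that vetting, contract and periodic-review cycle, the sketch below (illustrative Python; the SupplierAssessment fields are an assumed minimal checklist, not a regulatory standard) treats a supplier as approved only while every vetting check has passed and the periodic evaluation is still current:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class SupplierAssessment:
    supplier: str
    data_locations_documented: bool = False      # exactly where the data will reside
    subcontractor_access_reviewed: bool = False  # which other parties can touch the data
    security_controls_verified: bool = False     # safeguards confirmed during vetting
    contract_with_explicit_controls: bool = False
    last_periodic_review: Optional[date] = None

    def approved_for_use(self, review_valid_days: int = 365) -> bool:
        # Usable only if every vetting check passed AND the periodic review is current.
        vetting_complete = all((
            self.data_locations_documented,
            self.subcontractor_access_reviewed,
            self.security_controls_verified,
            self.contract_with_explicit_controls,
        ))
        if not (vetting_complete and self.last_periodic_review):
            return False
        return (date.today() - self.last_periodic_review).days <= review_valid_days
```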

So, irrespective of the process or system or its interfaces and boundaries, once an organisation can pinpoint where all associated data is at any given time during the data lifecycle, and understands the controls in place to protect that data, even when it is stored in a cloud-based application managed by a third-party supplier, it can be confident that data integrity can be assured.

Nicola Brady

Nicola Brady is a Senior Quality Assurance Specialist with Odyssey VC and Compliant Cloud. She specialises in articles about Quality and Data.
