[[File:Workshop GDPR compliance at the 2019 Global Entrepreneurship Summit.jpg|right|500px]]In Chapter 2, we examined standards and regulations influencing cloud computing. We also noted that there's an "elephant in the room" in the guise of data privacy and protection considerations in the cloud. Many of the regulations and data security considerations mentioned there apply not only to financial firms, manufacturers, and software developers, but also to laboratories of all shapes and sizes. At the heart of it all is keeping client, customer, and organizational data out of the hands of people who shouldn't have access to it, which remains a significant regulatory hurdle.
Most regulation of data and information in an enterprise, including laboratories, is based on several aspects of the data and information: its sensitivity, its location or geography, and its ownership.<ref name="SimorjayData14">{{cite web |url=https://download.microsoft.com/download/0/A/3/0A3BE969-85C5-4DD2-83B6-366AA71D1FE3/Data-Classification-for-Cloud-Readiness.pdf |format=PDF |title=Data classification for cloud readiness |author=Simorjay, F.; Chainier, K.A.; Dillard, K. et al. |publisher=Microsoft Corporation |date=2014 |accessdate=21 August 2021}}</ref><ref name="TolsmaGDPR18">{{cite web |url=https://www2.deloitte.com/nl/nl/pages/risk/articles/cyber-security-privacy-gdpr-update-the-impact-on-cloud-computing.html |title=GDPR and the impact on cloud computing: The effect on agreements between enterprises and cloud service providers |author=Tolsma, A. |publisher=Deloitte |date=2018 |accessdate=21 August 2021}}</ref><ref name="EusticeUnder18">{{cite web |url=https://legal.thomsonreuters.com/en/insights/articles/understanding-data-privacy-and-cloud-computing |title=Understand the intersection between data privacy laws and cloud computing |author=Eustice, J.C. |work=Legal Technology, Products, and Services |publisher=Thomson Reuters |date=2018 |accessdate=21 August 2021}}</ref> Not coincidentally, these same aspects are often applied to an organization's data classification efforts, which attempt to determine and assign relative values to the data and information the organization manages and communicates. This classification in turn allows the organization to better discover the risks associated with each class of data, and allows data owners to recognize that not all data should be treated the same way.<ref name="SimorjayData14" /> Well-researched regulatory efforts recognize this as well.
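To make those three aspects a bit more concrete, the following minimal Python sketch shows how a lab might tag a data asset with its sensitivity, location, and ownership before deciding whether it can be placed in a given cloud region. The classification levels, region codes, and the <code>cloud_storage_allowed</code> rule are illustrative assumptions for this example only, not requirements drawn from any particular regulation.

<syntaxhighlight lang="python">
from dataclasses import dataclass
from enum import Enum

class Sensitivity(Enum):
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3  # e.g., client results, personal data
    RESTRICTED = 4    # e.g., ePHI, trade secrets

@dataclass
class DataAsset:
    name: str
    sensitivity: Sensitivity
    storage_region: str  # location/geography aspect, e.g., "eu-west-1"
    controller: str      # ownership aspect: who determines the purposes and means

def cloud_storage_allowed(asset: DataAsset, approved_regions: set) -> bool:
    """Illustrative rule: higher-sensitivity data may only live in approved regions."""
    if asset.sensitivity in (Sensitivity.CONFIDENTIAL, Sensitivity.RESTRICTED):
        return asset.storage_region in approved_regions
    return True

# Example: a results dataset controlled (owned) by the laboratory itself
record = DataAsset("sample_results_2021_08", Sensitivity.CONFIDENTIAL,
                   "eu-west-1", controller="Example Laboratory Ltd.")
print(cloud_storage_allowed(record, approved_regions={"eu-west-1", "eu-central-1"}))  # True
</syntaxhighlight>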
Take for example the European Union's [[General Data Protection Regulation]] (GDPR). The GDPR stipulates how personal data is collected, used, and stored by organizations in the E.U., as well as by organizations providing services to individuals and organizations in the E.U.<ref name="GoogleGDPR">{{cite web |url=https://cloud.google.com/security/gdpr |title=Google Cloud & the General Data Protection Regulation (GDPR) |publisher=Google Cloud |accessdate=21 August 2021}}</ref> The GDPR appears to classify "personal data" as sensitive "in relation to fundamental rights and freedoms," formally defined as "any information relating to an identified or identifiable natural person."<ref name="EUR-LexGDPR16">{{cite web |url=https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32016R0679 |title=Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance) |work=EUR-Lex |publisher=European Union |date=27 April 2016 |accessdate=21 August 2021}}</ref> This is the sensitivity aspect of the regulation. GDPR also addresses location at many points, from data transfers outside the E.U. to the location of the "main establishment" of a data owner or "controller."<ref name="EUR-LexGDPR16" /> As for ownership, GDPR refers to this aspect of data as the "controller," defined as "the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data."<ref name="EUR-LexGDPR16" /> The word "controller" appears more than 500 times in the regulation, emphasizing the importance of ownership or control of data.<ref name="EUR-LexGDPR16" /> However, cloud providers using hybrid or multicloud approaches pose a challenge to labs, as verifying GDPR compliance in these deployments gets even more complicated. The lab would likely have to turn to key documents such as the CSP's SOC 2 audit report (discussed later) to get a fuller picture of GDPR compliance.<ref name="TellingHowCloud19">{{cite web |url=https://cloudcomputing-news.net/news/2019/jun/03/how-cloud-computing-changing-laboratory-ecosystem/ |title=How cloud computing is changing the laboratory ecosystem |author=Telling, C. |work=CloudTech |date=03 June 2019 |accessdate=21 August 2021}}</ref>
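One technical control that often comes up around the sensitivity aspect is pseudonymizing direct identifiers before records leave the controller's environment. The short sketch below illustrates the idea with a keyed hash; the field names and the notion of keeping the key on premises are assumptions for the example, and pseudonymized data can still count as personal data under the GDPR.

<syntaxhighlight lang="python">
import hashlib
import hmac

# Hypothetical secret held only by the controller (e.g., in an on-premises key store)
PSEUDONYM_KEY = b"replace-with-a-securely-stored-secret"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256)."""
    return hmac.new(PSEUDONYM_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"subject_name": "Jane Doe", "analyte": "Lead, blood", "result_ug_dL": 1.2}
cloud_copy = dict(record, subject_name=pseudonymize(record["subject_name"]))
print(cloud_copy)  # the name becomes an opaque token; the key never leaves the lab
</syntaxhighlight>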
While data privacy and protection regulations like the GDPR, Turkey's Personal Data Protection Law (KVKK), and the California Consumer Privacy Act (CCPA) take a broad approach, affecting almost any organization handling sensitive and protected data in cloud- or non-cloud-based business, some regulations are more focused. The United States' [[Health Insurance Portability and Accountability Act]] (HIPAA) is critical for any laboratory handling electronic protected health information (ePHI), including in the cloud. The regulation is so significant that the U.S. Department of Health & Human Services (HHS) has released its own guidance on HIPAA and cloud computing.<ref name="HHSGuidance20">{{cite web |url=https://www.hhs.gov/hipaa/for-professionals/special-topics/health-information-technology/cloud-computing/index.html |title=Guidance on HIPAA & Cloud Computing |author=Office for Civil Rights |work=Health Information Privacy |publisher=U.S. Department of Health & Human Services |date=24 November 2020 |accessdate=21 August 2021}}</ref> That guidance highlights the sensitivity (ePHI), location (whether inside or outside the U.S.), and ownership (HIPAA covered entities and business associates) of data. The ownership part is particularly important, as it addresses the role a CSP takes in this regard<ref name="HHSGuidance20" />:
<blockquote>When a covered entity engages the services of a CSP to create, receive, maintain, or transmit ePHI (such as to process and/or store ePHI), on its behalf, the CSP is a business associate under HIPAA. Further, when a business associate subcontracts with a CSP to create, receive, maintain, or transmit ePHI on its behalf, the CSP subcontractor itself is a business associate. This is true even if the CSP processes or stores only encrypted ePHI and lacks an encryption key for the data. Lacking an encryption key does not exempt a CSP from business associate status and obligations under the HIPAA Rules. As a result, the covered entity (or business associate) and the CSP must enter into a HIPAA-compliant business associate agreement (BAA), and the CSP is both contractually liable for meeting the terms of the BAA and directly liable for compliance with the applicable requirements of the HIPAA Rules.</blockquote>
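That last point is easy to miss: handing a CSP only encrypted ePHI, with the key kept in-house, reduces exposure but does not remove the need for a BAA. For labs that do want the provider to hold ciphertext only, a minimal client-side encryption sketch (assuming the third-party <code>cryptography</code> package and a hypothetical export file) might look like the following.

<syntaxhighlight lang="python">
# pip install cryptography
from cryptography.fernet import Fernet

# The key stays with the covered entity (e.g., in an on-premises key manager);
# only the ciphertext is handed to the CSP. A BAA is still required.
key = Fernet.generate_key()
cipher = Fernet(key)

with open("lab_results_ephi.csv", "rb") as f:       # hypothetical ePHI export
    ciphertext = cipher.encrypt(f.read())

with open("lab_results_ephi.csv.enc", "wb") as f:   # this file is what goes to cloud storage
    f.write(ciphertext)

# The covered entity can later recover the plaintext with the same key:
plaintext = cipher.decrypt(ciphertext)
</syntaxhighlight>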
Clinical and public health laboratories are already affected by HIPAA, but understanding how moving to the cloud affects those HIPAA requirements is vital and not always clear.<ref name="HHSGuidance20" /> In particular, the idea of a CSP as a business associate must be taken seriously, in conjunction with the CSP's shared responsibility policy and compliance offerings. Laboratories that need to be HIPAA-compliant should be prepared to do their research on HIPAA and the cloud by reading [https://www.hhs.gov/hipaa/for-professionals/special-topics/health-information-technology/cloud-computing/index.html the HHS guidance] and other reference material, as well as consulting with experts on the topic when in-house expertise isn't available.
Another area of concern for laboratories is GxP, or "good practice" quality guidelines and regulations. Take for example pharmaceutical manufacturers and their laboratories, which must take sufficient precautions to ensure that any manufacturing documentation (data, information, etc.) related to [[good manufacturing practice]] (GMP) requirements (e.g., via E.U. GMP Annex 11, U.S. 21 CFR Part 211, or Germany's AMWHV) is securely stored yet available for a specific retention period "in the event of closure of the manufacturing or testing site where the documentation is stored."<ref name="ECACloud20">{{cite web |url=https://www.gmp-compliance.org/gmp-news/cloud-computing-regulations-for-the-return-transmission-of-data-in-the-event-of-business-discontinuation |title=Cloud Computing: Regulations for the Return Transmission of Data in the Event of Business Discontinuation |work=ECA Academy |date=16 December 2020 |accessdate=21 August 2021}}</ref> As the ECA Academy notes, in the cloud computing realm, it would be up to the laboratory to get provisions added to the CSP's service-level agreement (SLA) to address the GMP requirement for data availability, and to decide whether maintaining a local backup of the data would be appropriate (as in, perhaps, a hybrid cloud scenario).
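If the lab does opt to keep a local copy for the retention period, alongside whatever return-of-data provisions get written into the SLA, the mechanics can stay simple. The sketch below assumes records have already been exported from the cloud system into a local folder and mirrors them into dated, checksummed archive directories; the paths, the export step, and the archive layout are placeholders rather than features of any particular LIMS or CSP.

<syntaxhighlight lang="python">
import hashlib
import shutil
from datetime import date
from pathlib import Path

EXPORT_DIR = Path("cloud_export")    # hypothetical: files already exported from the CSP
ARCHIVE_ROOT = Path("gmp_archive")   # local, backed-up storage kept for the retention period

def archive_export(export_dir: Path = EXPORT_DIR, archive_root: Path = ARCHIVE_ROOT) -> Path:
    """Copy today's export into a dated folder and record SHA-256 checksums for later verification."""
    target = archive_root / date.today().isoformat()
    target.mkdir(parents=True, exist_ok=True)
    manifest = []
    for src in sorted(export_dir.glob("*")):
        if src.is_file():
            dst = target / src.name
            shutil.copy2(src, dst)
            manifest.append(f"{hashlib.sha256(dst.read_bytes()).hexdigest()}  {dst.name}")
    (target / "SHA256SUMS.txt").write_text("\n".join(manifest))
    return target

# archive_export()  # e.g., run after each scheduled export from the cloud system
</syntaxhighlight>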
That part about the SLA is important. Although a formal contract will address the agreed-upon services, it usually does so in vague terms; the SLA, on the other hand, defines the responsibilities that both the cloud provider and your laboratory hold for the supply and use of the CSP's services.<ref name="CaldwellContracts19">{{cite web |url=https://www.asgnational.com/miscellaneous/contracts-vs-service-level-agreements/ |title=Contracts vs. Service Level Agreements |author=Caldwell, D. |work=ASG Blog |date=16 February 2019 |accessdate=21 August 2021}}</ref> These are your primary protections, along with vetting the CSP you use. This may be difficult, however, particularly in a public cloud, where your ability to audit the security controls and protections of the CSP will be limited at best. To ensure GxP compliance in the cloud, your lab will have to examine the various certifications and compliance offerings of the CSP, and hope that the CSP staff handling your GxP data are trained on the requirements of GxP in the cloud.<ref name="McDowallClouds20">{{cite web |url=https://www.csolsinc.com/blog/clouds-or-clods-software-as-a-service-in-gxp-regulated-laboratories/ |title=Clouds or Clods? Software as a Service in GxP Regulated Laboratories |author=McDowall, B. |work=CSols Blog |publisher=CSols, Inc |date=27 August 2020 |accessdate=21 August 2021}}</ref> However, public cloud providers like Microsoft<ref name="MazzoliGood20">{{cite web |url=https://docs.microsoft.com/en-us/compliance/regulatory/offering-gxp |title=Good Clinical, Laboratory, and Manufacturing Practices (GxP) |author=Mazzoli, R. |work=Microsoft Documentation |publisher=Microsoft, Inc |date=30 November 2020 |accessdate=21 August 2021}}</ref> and Google<ref name="GoogleUsing20">{{cite web |url=https://cloud.google.com/security/compliance/cloud-gxp-whitepaper |title=Using Google Cloud in GxP Systems |publisher=Google Cloud |date=May 2020 |accessdate=21 August 2021}}</ref> provide their own documentation and guidance on using their services in GxP environments. As Microsoft notes, however, "there is no GxP certification for cloud service providers."<ref name="MazzoliGood20" /> Instead, these CSPs focus on meeting [[Quality management system|quality management]] and information security standards and employing their own best practices that align with GxP requirements. They may also have an independent third party conduct GxP qualification reviews, with the resulting qualification guidelines detailing how GxP responsibility is divided between the CSP and the laboratory.<ref name="MazzoliGood20" /><ref name="GoogleUsing20" />
Ultimately, ensuring that your laboratory's move to the cloud complies with the necessary regulations is a tricky matter. Simply reviewing a CSP's white papers and other associated compliance documents doesn't guarantee that the provider actually meets HIPAA, GxP, and other requirements. However, experts such as Linford & Co.'s Nicole Hemmer and IDBS' Damien Tiller emphasize that the most comprehensive CSP documentation to examine toward gaining a more complete picture of the CSP's security is the SOC 2 (SOC for Service Organizations: Trust Services Criteria) report.<ref name="HemmerTrust19">{{cite web |url=https://linfordco.com/blog/trust-services-critieria-principles-soc-2/ |title=Trust Services Criteria (formerly Principles) for SOC 2 in 2019 |author=Hemmer, N. |work=Linford & Company IT Audit & Compliance Blog |publisher=Linford and Co. LLP |date=18 December 2019 |accessdate=21 August 2021}}</ref><ref name="TillerIsThe19">{{cite web |url=https://storage.pardot.com/468401/1614781936jHqdU6H6/Whitepaper_Is_the_cloud_a_safe_place_for_your_data.pdf |format=PDF |title=Is the Cloud a Safe Place for Your Data?: How Life Science Organizations Can Ensure Integrity and Security in a SaaS Environment |author=Tiller, D. |publisher=IDBS |date=2019 |accessdate=21 August 2021}}</ref> A CSP's SOC 2 audit results outline nearly 200 information security, data integrity, data availability, and data retention controls, along with any non-conformities with those controls (the CSP must show those controls have been in place and operating effectively over a six- to twelve-month period). The SOC 2 report has other useful aspects, including a full service description and audit observations, making it the best tool for a laboratory to judge a CSP's ability to assist with regulatory compliance.<ref name="TillerIsThe19" />
Chapter 2 also looked at deployment approaches in detail, so this section won't get too in-depth. However, the topic of choosing the right deployment approach for your laboratory is still an important one. As Agilent Technologies noted in its 2019 white paper ''Cloud Adoption for Lab Informatics'', a laboratory's "deployment approach is a key factor for laboratory leadership since the choice of private, public, or hybrid deployment will impact the viability of solutions in the other layers of informatics" applied by the lab.<ref name="AgilentCloud19" /> In some cases, as with most software, the deployment choice will be relatively straightforward, but other aspects of laboratory requirements may complicate that decision further.
Broadly speaking, any attempt to move your LIS, LIMS, or [[electronic laboratory notebook]] (ELN) to the cloud will almost certainly involve a [[software as a service]] (SaaS) approach.<ref name="AgilentCloud19" /> If the deployment is relatively simple, with well-structured data, this is relatively painless. However, in cases where your LIS, LIMS, or ELN needs to interface with additional business systems like an [[enterprise resource planning]] (ERP) system or other informatics software such as a [[picture archiving and communication system]] (PACS), an argument could be made that a [[platform as a service]] (PaaS) approach may be warranted, due to its greater flexibility.<ref name="AgilentCloud19" />
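As a toy illustration of that reasoning, the sketch below maps a lab's planned integrations to a suggested starting point for the service model. The categories and the rule itself are assumptions distilled from the paragraph above, not a formal decision framework from Agilent or any other vendor.

<syntaxhighlight lang="python">
def suggest_service_model(integrations: list) -> str:
    """Very rough heuristic: a standalone LIS/LIMS/ELN fits SaaS, while heavy
    cross-system integration may justify the added flexibility of PaaS."""
    heavy = {"erp", "pacs", "custom middleware"}
    if any(item.lower() in heavy for item in integrations):
        return "Consider PaaS (or SaaS with strong integration APIs)"
    return "SaaS is likely sufficient"

print(suggest_service_model([]))               # standalone LIMS or ELN
print(suggest_service_model(["ERP", "PACS"]))  # multi-system laboratory environment
</syntaxhighlight>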
But laboratories don't run on software alone; analytical instruments are also involved. Those instruments are usually interfaced with the lab's software systems to better capture raw and modified instrument data directly. However, addressing how your cloud-based informatics systems handle instrument data can be challenging, requiring careful attention to cloud service and deployment models. [[Infrastructure as a service]] (IaaS) may be a useful service model to look into due to its versatility in being deployed in private, hybrid, and public cloud models.<ref name="AgilentCloud19" /> As Agilent notes: "This model enables laboratories to minimize the IT footprint in the lab to only the resources needed for instrument control and acquisition. The rest of the data system components can be virtualized in the cloud and thus take advantage of the dynamic scaling and accessibility aspects of the cloud."<ref name="AgilentCloud19" /> IaaS also scales easily, enables remote access, and simplifies disaster recovery.<ref name="AgilentCloud19" />
Notice that Agilent still allows for the in-house IT resources necessary to handle instrument control and data acquisition. This brings up an important point: realistic decisions must be made concerning whether to take instrument data collection and management into the cloud, keep it local, or enact a hybrid workflow where instrument data is created locally but uploaded to the cloud later.<ref name="VyasCloud20">{{cite web |url=https://planetinnovation.com/perspectives/cloud-vs-on-premises-solutions/ |title=Cloud vs. on-premises solutions – 5 factors to consider when planning the connectivity strategy for your healthcare instrument |author=Vyas, K. |work=Planet Innovation: Perspectives |date=September 2020 |accessdate=21 August 2021}}</ref> In 2017, the Association of Public Health Laboratories (APHL) emphasized latency as a real concern for instrument control computing systems, finding that most shouldn't be in the cloud. The APHL went on to add that those systems can, however, "be designed to share data in network storage systems and/or be integrated with cloud-based LIMS through various communication paths."<ref name="APHLBreaking17" />
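A common middle ground is exactly the hybrid workflow described above: instrument control and acquisition stay on a local workstation, and completed run files are pushed to the cloud LIMS afterward. The sketch below assumes the third-party <code>requests</code> package, a hypothetical <code>/api/runs</code> upload endpoint, and a local results folder; none of these reflect a specific vendor's interface.

<syntaxhighlight lang="python">
# pip install requests
from pathlib import Path
import requests

RESULTS_DIR = Path("C:/instrument/completed_runs")     # local acquisition output (hypothetical)
LIMS_UPLOAD_URL = "https://lims.example.org/api/runs"  # hypothetical cloud LIMS endpoint
API_TOKEN = "replace-me"                               # issued by the LIMS, stored securely

def upload_completed_runs() -> None:
    """Push finished run files to the cloud LIMS; acquisition itself never waits on the network."""
    for run_file in sorted(RESULTS_DIR.glob("*.csv")):
        with run_file.open("rb") as f:
            resp = requests.post(
                LIMS_UPLOAD_URL,
                headers={"Authorization": f"Bearer {API_TOKEN}"},
                files={"run": (run_file.name, f)},
                timeout=30,
            )
        if resp.ok:
            run_file.rename(run_file.with_name(run_file.name + ".uploaded"))  # mark as sent

# upload_completed_runs()  # e.g., run from a scheduled task after each batch completes
</syntaxhighlight>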
So what should a laboratory do with instrument systems? PI Digital's general manager, Kaushal Vyas, recognized the difficulties in deciding what to do with instrument systems in September 2020, breaking the decision down into five key considerations. First, consider the potential futures for your laboratory instruments and how any developing industry trends may affect those instruments. Do you envision your instruments becoming less wired? Will mobile devices be able to interface with them? Are they integrated via a hub? And does a cloud-based solution that manages those integrations across multiple locations make sense at some future point? Second, talk to the people who actually use the instruments and understand the workflows those instruments participate in. Are analysts loading samples directly into the machine and largely getting results from the same location? Or will results need to be accessed from different locations? In the case of the latter, cloud management of instrument data may make more sense. Third, are there additional ways to digitize your workflows in the lab? Perhaps later down the road? Does enabling from-anywhere access to instruments with the help of the cloud make sense for the lab? This leads to the fourth consideration, which actually touches upon the prior three: location, location, location. Where are the instruments located and from where will results, maintenance logs, and error logs be read? If these tasks don't need to be completed in real time, it's possible some hybrid cloud solution would work for you. And finally—as has been emphasized throughout the guide—ask what regulations and security requirements drive instrument deployment and use. In highly regulated environments like pharmaceutical research and production, moving instrument data off-premises may not be an option.<ref name="VyasCloud20" />
One additional consideration that hasn't been fully discussed yet is whether to go beyond a hybrid cloud (which depends on a single cloud vendor integrating with on-premises systems) to a cloud approach that involves more than one cloud vendor. We examine this consideration, as well as how it relates to vendor lock-in, in the following subsection.
==References==
{{Reflist|colwidth=30em}}