Towards a risk catalog for data management plans
| Full article title | Towards a risk catalog for data management plans |
|---|---|
| Journal | International Journal of Digital Curation |
| Author(s) | Weng, Franziska; Thoben, Stella |
| Author affiliation(s) | Kiel University |
| Primary contact | Email: franziskaweng at web dot de |
| Year published | 2020 |
| Volume and issue | 15(1) |
| Page(s) | 18 |
| DOI | 10.2218/ijdc.v15i1.697 |
| ISSN | 1746-8256 |
| Distribution license | Creative Commons Attribution 4.0 International |
| Website | http://www.ijdc.net/article/view/697 |
| Download | http://www.ijdc.net/article/view/697/614 (PDF) |
Abstract
Although data management and its careful planning are not new topics, there is little published research on risk mitigation in data management plans (DMPs). We consider it a problem that DMPs do not include a structured approach for the identification or mitigation of risks, because such an approach would instill confidence and trust in the data and its stewards and foster the successful conduct of data-generating projects, which often are funded research projects. In this paper, we present a lightweight approach for identifying general risks in DMPs. We introduce an initial version of a generic risk catalog for funded research and similar projects. By analyzing a selection of 13 DMPs for projects from multiple disciplines published in the Research Ideas and Outcomes (RIO) journal, we demonstrate that our approach is applicable to DMPs and transferable to multiple institutional constellations. As a result, the effort for integrating risk management into data management planning can be reduced.
Keywords: data management plan, data management, risk management, risk assessment, information security
Introduction
University of New Mexico's William Michener describes a data management plan (DMP) as "a document that describes how you will treat your data during a project and what happens with the data after the project ends.”[1] The Digital Curation Centre's (DCC) Martin Donnelly notes that DMPs “serve to mitigate risks and help instill confidence and trust in the data and its stewards.”[2] Sarah Jones, also of the DCC, adds that “planning for the effective creation, management, and sharing of your data enables you to get the most out of your research.”[3] As such, the creation of a DMP should not only happen for obtaining a grant but also for successfully conducting the proposed project.
According to ISO 31000[4], a risk is “an effect of uncertainty on objectives.” Data management plans should help to decrease the effects of uncertainty on project objectives. We consider it a problem that neither DMPs nor funders’ DMP evaluation schemes include a structured approach for the identification or mitigation of risks, since such an approach would foster the successful conduct of data-generating projects, which often are funded research projects. We believe our approach will help funders evaluate the risks of proposed projects and hence the risks of their investment options.
Data management maturity models like the Data Management Maturity (DMM) model[5] or the Enterprise Information Management (EIM) maturity model[6] are primarily designed for enterprises and may not be feasible for higher education institutions (HEIs). A rigid model for HEIs to coordinate support of data management and sharing across a diverse range of actors and processes to deliver the necessary technological and human infrastructures “cannot be prescribed since individual organizations and cultures occupy a spectrum of differences.”[7] There is also a potential conflict between organizational demands and scientific freedom. The Charter of Fundamental Rights of the E.U. enshrines scientific freedom as a fundamental right, and researchers may view the imposition of specific data management processes as a restriction of their scientific freedom. At the international level, UNESCO recommends that “each Member State should institute procedures adapted to its needs for ensuring that, in the performance of research and development, scientific researchers respect public accountability while at the same time enjoying the degree of autonomy appropriate to their task and to the advancement of science and technology.”[8]
We consider it important that researchers commit themselves to risk management practices such as ISO 31000. However, ISO 31000 defines the risk management process as a feedback loop to be conducted in organizations.[4] Projects tend to have a much more limited scope with regard to funding and duration than organizations. Therefore, we regard the ISO 31000 risk management process as too time-consuming and of limited suitability for funded research and similar projects.
In this paper, we propose a lightweight approach for the identification of general risks in DMPs. We introduce an initial version of a generic risk catalog for funded research and similar projects. By analyzing a selection of 13 DMPs for projects from multiple disciplines published in the Research Ideas and Outcomes (RIO) journal[9][10][11][12][13][14][15][16][17][18][19][20][21], we demonstrate that our approach is applicable and transferable to multiple institutional constellations. As a result, the effort for integrating risk management in data management planning can be reduced.
Related work
Jones et al. developed a guide for HEIs “to help institutions understand the key aims and issues associated with planning and implementing research data management (RDM) services.”[7] In this guide, the authors mention data management risks for HEIs. They note that while the upfront costs for cheap storage of active data “may be only a fraction of those quoted by central services, the risks of data loss and security breaches are significantly higher, potentially leading to far greater costs in the long term.”[7] Additionally, there are “potential legal risks from using third-party services.”[7] However, data selection counters the risks of “reputational damage from exposing dirty, confidential, or undocumented data that has been retained long after the researchers who created it have left.”[7]
The OSCRP working group developed the OSCRP (Open Science Cyber Risk Profile), which “is designed to help principal investigators (PI) and their supporting information technology (IT) professionals assess cybersecurity risks related to open science projects.”[22] The OSCRP working group proposes that principal investigators examine risks, consequences, and avenues of attack for each mission-critical science asset on an inventory list, where assets include devices, systems, data, personnel, workflows, and other kinds of resources.[22] We regard this as a very detailed alternative to our approach, but the FAIR Guiding Principles[23] and long-term preservation need to be added.
In 2014, Ferreira et al.[24] “propose an analysis process for eScience projects using a data management plan and ISO 31000 in order to create a risk management plan that can complement the data management plan.” The authors describe an analytical process for creating a risk management plan and “present the previous process’ validation, based on the MetaGen-FRAME project.”[24] Within this validation, Ferreira et al. also identify a project’s task-specific risks, e.g., “R6: Loss of metadata, denying the representation of the output information to the user via Taverna.”[24] This risk is tailored to the use of Taverna and hence may not be relevant for the majority of funded research and similar projects. There may be projects for which analyzing specific risks for all resources is crucial. However, a detailed risk analysis may require a considerable amount of work.
Methods
We propose a lightweight approach that can serve as a starting point for including risk management in research data management planning. It does not preclude detailed approaches like OSCRP[22] or ISO 31000.[4] Instead, we propose an approach that tries to reduce, and possibly avoid, the burden of a full risk management process such as ISO 31000. Our approach is based on a pre-tailored and extensible general risk catalog (Table 1) to lessen the effort required for risk management. We derived part of this risk catalog from 29 interviews with researchers from multiple disciplines[a], which we conducted as part of project SynFo: Creating synergies on the operational level of research data management.[25] One goal of project SynFo was the development of a transferable approach to improve research data management in multiple organizational constellations. In the generalized content from the interviews, we identified risks entailed by interfaces of information, e.g., between researchers and data subjects or between researchers and external service providers. For the development of our approach, we also consulted the catalogs of threats and measures from the supplement of the “IT-Grundschutz” catalogs[26] by the German Federal Office for Information Security (BSI), the FAIR Guiding Principles[23], and the report and action plan of the European Commission expert group on FAIR data.[27]
Table 1. General risk catalog for funded research and similar projects, listing each risk together with its possible risk sources, mitigation approaches, and consequences.
Our risk identification includes risks, their possible risk sources, mitigation approaches, and consequences. By analyzing occurrences and mitigations of risks from our catalog within a selection of 13 DMPs from multiple disciplines[b], published in the RIO journal, we demonstrate that our lightweight approach is applicable to DMPs and transferable to multiple institutional constellations. We evaluate the occurrences of the 15 risks in our catalog by identifying possible risk sources in each of the selected DMPs, and we analyze the risk mitigations in accordance with what the authors wrote.
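To make the structure of the catalog and of the evaluation concrete, the following minimal sketch models a catalog entry and the tallying of risk occurrences and mitigations across DMPs. It is an illustration only; the Python names used here (Risk, DMPAssessment, tally) are hypothetical and do not refer to any tooling used in our analysis.

```python
from dataclasses import dataclass, field

@dataclass
class Risk:
    """One entry of the general risk catalog (illustrative structure)."""
    risk_id: str        # e.g., "RLEGU", "RPRIR", "RSCIF"
    category: str       # legal, privacy, technical, science, or preservation
    description: str
    risk_sources: list = field(default_factory=list)
    mitigations: list = field(default_factory=list)
    consequences: list = field(default_factory=list)

@dataclass
class DMPAssessment:
    """Which catalog risks occur in one DMP, and which of those are mitigated."""
    dmp_id: str
    occurring_risks: set = field(default_factory=set)
    mitigated_risks: set = field(default_factory=set)

def tally(assessments):
    """Per risk ID: (occurrences, occurrences with at least one mitigation)."""
    counts = {}
    for a in assessments:
        for rid in a.occurring_risks:
            occ, mit = counts.get(rid, (0, 0))
            counts[rid] = (occ + 1, mit + (1 if rid in a.mitigated_risks else 0))
    return counts

# Example: a DMP in which RLEGP occurs and is mitigated, and RPRIS occurs unmitigated.
example = DMPAssessment("dmp-01", {"RLEGP", "RPRIS"}, {"RLEGP"})
print(tally([example]))  # {'RLEGP': (1, 1), 'RPRIS': (1, 0)} (order may vary)
```

In this representation, the evaluation results summarized in Table 2 correspond to the output of such a tally over the 13 assessed DMPs.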
Risks
Legal risks
A breach of a regulation like the General Data Protection Regulation (GDPR) or the Nagoya Protocol can result in high fines. At worst, compliance breaches can lead to reputational damage, legal disputes, and enormous costs.
Penalty for conducting unreported notifiable practices [RLEGU]
Research may include reportable research practices like the collection of physical samples regulated by the Nagoya Protocol on Access to Genetic Resources and the Fair and Equitable Sharing of Benefits Arising from their Utilization, which was implemented in E.U. law by Regulation (EU) No 511/2014. Under this regulation, there is a reporting obligation if the research on genetic resources is financially supported (Art. 7, Sec. 1) and if the research is in the final stage of development of a product that is based on the utilization of genetic resources (Art. 7, Sec. 2).[28] Article 11 says that “Member States shall lay down the rules on penalties applicable to infringements of Articles 4 and 7 and shall take all the measures necessary to ensure that they are applied.”[28] The Nagoya Protocol “and EU documents themselves give no guidance on penalties, each country has the liberty to determine these.”[29] Consequences may be fines of up to EUR 810,000 or even imprisonment.[29] To avoid penalties, the parties should comply strictly with the rules. The Convention on Biological Diversity publishes a detailed list of parties to the Nagoya Protocol.[c]
Penalty for unpermitted usage of external data [RLEGE]
In many countries, data by themselves do not have inherent legal protection. License contracts can specify a wide range of terms of use. Free licenses make (data) objects available for everyone to use, but usage can be restricted or subject to conditions. Creative Commons (CC) licenses and the GNU General Public License (GPL), which is specialized for free software, are widely used. Nonetheless, using CC licenses can conflict with rights held by third parties. Publicity, personality, and privacy rights “not held by the licensor are not affected and may still affect your desired use of a licensed work.”[30] “If there are any third parties who may have publicity, privacy, or personality rights that apply, those rights are not affected by your application of a CC license, and a reuser must seek permission for relevant uses.”[30] This applies, for example, to pictures of persons. The GNU GPL also imposes transitive obligations, e.g., “derivative programs must also be subject to the same initial GPL conditions of ability to copy, modify, or redistribute.”[31] To mitigate the risk of unpermitted usage of external data, it is recommended to abide by the license terms. In general, an overview of the data and the related licenses can be developed in the DMP or within the framework of a data policy.
Penalty for unpermitted usage of personal data [RLEGP]
In the E.U., the GDPR governs the processing of personal data. Articles 6 and 7 of the GDPR regulate the lawfulness of processing and the conditions for consent. At the international level, the European Commission can conduct an assessment to “ensure that the level of data protection in a third country or international organization is essentially equivalent to the one established by the E.U. legislation.”[32] Canada (commercial organizations), Israel, Switzerland, Japan, and the U.S. (limited to the Privacy Shield Framework), among others, offer an adequate level of data protection.[33] To avoid penalties, it is recommended to obtain written consent from data subjects, including information about the purpose and procedures of data processing.
Penalty for conducting inadequate data protection practices [RLEGD]
Article 5 of the GDPR enumerates the principles relating to the processing of personal data: lawfulness, fairness, and transparency; purpose limitation; data minimization; accuracy; storage limitation; integrity and confidentiality; and accountability. According to Article 45 of the GDPR, “A transfer of personal data to a third country or an international organization may take place where the Commission has decided that the third country, a territory or one or more specified sectors within that third country, or the international organization in question ensures an adequate level of protection. Such a transfer shall not require any specific authorization.”[34] For transfers to countries without an adequacy decision, i.e., countries not classified as safe third countries, protection can be guaranteed in other ways, for example by using appropriate safeguards (Art. 46, GDPR) or binding corporate rules (Art. 47, GDPR). To avoid penalties, it is recommended to abide by the applicable laws. In case of doubt, researchers can contact the appropriate (data protection) authorities.
Privacy risks
A loss of confidentiality can have adverse effects on an organization, such as financial effects.[26] These effects may also apply to a researcher, who additionally may want to keep research data confidential before scientific output is published so that their work is not stolen by others.
Loss of confidentiality through sending data to an unintended recipient [RPRIR]
Correspondence carries the inherent risk that a researcher transmits data to an unintended recipient. This may happen accidentally or as the result of a fraudulent attack like social engineering, and it leads to a loss of confidentiality. “Social engineering is a method used to gain unauthorized access to information or IT systems by social action.”[26] Researchers should take extra care when sending confidential information and be aware of fraudulent attacks.
Loss of confidentiality through interception or eavesdropping of information [RPRII]
In the supplement of the IT-Grundschutz catalogs, the BSI specifies the threats of interception or eavesdropping of information, which entail the risk of loss of confidentiality.[26] “Since data is sent using unforeseeable routes and nodes on the internet, the sent data should only be transmitted in an encrypted form, as far as possible.”[26]
Loss of confidentiality through loss or theft of portable storage media or devices [RPRIS]
“Portable terminal devices and mobile data media in particular can be lost easily”[26] or even be stolen. “Whenever possible, mobile data media such as USB sticks and laptops should always be encrypted completely even if they are only occasionally used for confidential information.”[26]
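One possible technical safeguard along these lines is to encrypt confidential files before they are stored on portable media or transmitted. The sketch below uses the third-party Python package cryptography for symmetric file encryption; the library choice and file names are assumptions for illustration, and full-disk encryption of the device, as recommended above, remains the more comprehensive measure.

```python
from cryptography.fernet import Fernet  # third-party package: pip install cryptography

# Generate a key once and store it separately from the portable medium
# (e.g., in an institutional password manager); losing the key means losing the data.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a confidential file (placeholder name) before copying it to a USB stick or laptop.
with open("interview_transcripts.csv", "rb") as f:
    token = fernet.encrypt(f.read())
with open("interview_transcripts.csv.enc", "wb") as f:
    f.write(token)

# Decrypt later on a trusted machine with the same key.
with open("interview_transcripts.csv.enc", "rb") as f:
    restored = fernet.decrypt(f.read())
```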
Loss of confidentiality through careless data handling by an external party [RPRIE]
We regard the event that researchers share data with an external party without the purpose of publication as entailing the risk of a loss of confidentiality. The external party may handle confidential data carelessly. “It can frequently be observed that there are a number of organizational or technical security procedures available in organizations, but they are then undermined through careless handling of the specifications and the technology.”[26] We recommend that researchers who share their research data always grant specific usage rights to the external party in written form or check whether appropriate security measures are applied by the external party.
Technical risks
Data can lose their integrity or be lost[26], leading to the major risk of unavailability of data. Unavailability of the correct data through silent corruption can lead to the usage of incorrect data and hence to the production of incorrect results. If data are unavailable, either the project may fail or researchers must repeat their data collection, putting the project behind schedule.
“The integrity of information may be impaired due to different causes, e.g., manipulations, errors caused by people, incorrect use of applications, malfunctions of software, or transmission errors.”[26] “If only accidental changes need to be detected, then checksum procedures (e.g., cyclic redundancy checks) or error-correcting codes can be used.”[26] Nonetheless, there may be other scenarios where these verification techniques are insufficient.
Data may “be lost when devices or data media are damaged, lost, or stolen,”[26] hence becoming unavailable. Approaches to mitigate irretrievable data loss include regular backups[26] and keeping copies in multiple storage locations.[35]
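As a minimal illustration of these two mitigations, the sketch below computes a checksum for a file, copies it to several storage locations, and later verifies each copy against the recorded checksum so that silent corruption or missing copies are detected. It uses only the Python standard library; the file and directory paths are placeholders, and a production setup would rather rely on established backup or preservation tooling.

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Compute the SHA-256 checksum of a file to detect accidental changes."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def replicate(source: Path, target_dirs: list) -> str:
    """Copy the file to multiple storage locations and return its checksum."""
    checksum = sha256(source)
    for target_dir in target_dirs:
        target_dir.mkdir(parents=True, exist_ok=True)
        shutil.copy2(source, target_dir / source.name)
    return checksum

def verify(copies: list, expected: str) -> list:
    """Return the copies whose checksum no longer matches (corrupt or missing)."""
    return [c for c in copies if not c.exists() or sha256(c) != expected]

# Placeholder paths for illustration only.
checksum = replicate(Path("survey_data.csv"),
                     [Path("/mnt/backup1"), Path("/mnt/backup2")])
```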
Science risks
Consequences of poor discoverability and reusability of data are that researchers may unnecessarily repeat work and that scientific outputs derived from the data may fail to be comprehensible, reproducible, or traceable. Problems with reproducibility and replication “can cause permanent damage to the credibility of science,”[36] which is why this category is referred to as "science risks."
Poor knowledge discovery or reusability because stakeholders cannot find [RSCIF], access [RSCIA], integrate [RSCII], or reuse [RSCIR] the data
Making data findable, accessible, interoperable, and reusable (FAIR) to human and computational stakeholders is a best practice approach described in the FAIR Guiding Principles of Wilkinson et al.[23] Therefore, we include the risks that stakeholders cannot find, access, integrate, or reuse data in our risk catalog. Authors of DMPs can mitigate these risks as described by Wilkinson et al.[23] We abbreviated the risk names under this risk category using the term “poor knowledge discovery or reusability,” but refer to all the FAIR Guiding Principles for more information.[23]
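As a small illustration of how such mitigations can look in practice, a dataset can be accompanied by a machine-readable metadata record that provides a persistent identifier, access information, format, license, and links to related outputs. The record below is a schematic example; the field names loosely follow common DataCite/Dublin Core elements, and the identifier, names, and URLs are placeholders rather than real records.

```python
import json

# Illustrative metadata record; the DOI, ORCID, and URLs are placeholders.
metadata = {
    "identifier": {"type": "DOI", "value": "10.1234/example.dataset"},  # findable
    "title": "Interview transcripts on research data management",
    "creators": [{"name": "Doe, Jane", "orcid": "0000-0000-0000-0000"}],
    "publicationYear": 2020,
    "formats": ["text/csv"],                                            # interoperable
    "rights": "Creative Commons Attribution 4.0 International",         # reusable
    "access": {"protocol": "https",
               "landingPage": "https://repository.example.org/record/1"},  # accessible
    "relatedIdentifiers": [{"relationType": "IsSupplementTo",
                            "identifier": "10.1234/example.article"}],
}

# Depositing such a record alongside the data in a repository that indexes it
# addresses the RSCIF/RSCIA/RSCII/RSCIR risks at a basic level.
print(json.dumps(metadata, indent=2))
```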
Preservation risk
If data are not suitably preserved, scientific outputs derived from them may fail to be comprehensible, reproducible, or traceable in the long run. Data should be stored in a trusted and sustainable digital repository.[27]
A digital preservation location carries the intrinsic technical risk that data become unavailable through data loss or corruption. However, preservation locations also entail the risk of becoming unavailable when their funding ends. For example, Canhos states that discontinuity of financial support is a threat to Brazil’s Virtual Herbarium and its data sources.[9] Authors of DMPs should consider these risks when selecting a preservation location. They can mitigate the risk that data are not preserved long-term by reviewing the external preservation location’s longevity, certificates, and funding. We also suggest paying attention to possible migration and exit strategies, such as exporting and handing over data to a national data archive. This may be particularly important when the preservation location is not external.
Evaluation
When applying the risk catalog (Table 1) to the sample of 13 DMPs, we distinguish between risk occurrences themselves and risk occurrences with at least one mitigation, as shown in Table 2.
Table 2. Risk occurrences and risk occurrences with at least one mitigation across the sample of 13 DMPs.
Footnotes
- a. Geosciences (12), biology (5), humanities (5), social and behavioral sciences (4), computer science, systems engineering and electrical engineering (2), and medicine (1)
- b. Biology (4), geosciences (4), social and behavioral sciences (3), computer science, systems engineering and electrical engineering (1), and humanities (1)
- c. Parties to the Nagoya Protocol
References
1. Michener, W.K. (2015). "Ten Simple Rules for Creating a Good Data Management Plan". PLoS Computational Biology 11 (10): e1004525. doi:10.1371/journal.pcbi.1004525.
2. Donnelly, M. (2012). "Chapter 5: Data management plans and planning". In Pryor, G. Managing Research Data. Facet. pp. 83–104. doi:10.29085/9781856048910.006. ISBN 9781856048910.
3. Jones, S. (2011). "How to Develop a Data Management and Sharing Plan". Digital Curation Centre. https://www.dcc.ac.uk/guidance/how-guides/develop-data-plan. Retrieved 19 November 2019.
4. "ISO 31000:2018 Risk management — Guidelines". International Organization for Standardization. February 2018. https://www.iso.org/standard/65694.html.
5. "Data Management Maturity (DMM)". Information Systems Audit and Control Association, Inc. 2019. https://cmmiinstitute.com/data-management-maturity. Retrieved 22 November 2019.
6. Newman, D.; Logan, D. (23 December 2008). "Overview: Gartner Introduces the EIM Maturity Model". Gartner. https://www.gartner.com/en/documents/846312/overview-gartner-introduces-the-eim-maturity-model.
7. Jones, S.; Pryor, G.; Whyte, A. (25 March 2013). "How to Develop RDM Services - A guide for HEIs". Digital Curation Centre. https://www.dcc.ac.uk/guidance/how-guides/how-develop-rdm-services. Retrieved 19 November 2019.
8. UNESCO (2017). "Records of the General Conference, 39th session, Paris, 30 October-14 November 2017, v. 1: Resolutions". p. 116. https://unesdoc.unesco.org/ark:/48223/pf0000260889.page=116. Retrieved 30 November 2019.
9. Canhos, D.A.L. (2017). "Data Management Plan: Brazil's Virtual Herbarium". RIO 3: e14675. doi:10.3897/rio.3.e14675.
10. Fey, J.; Anderson, S. (2016). "Boulder Creek Critical Zone Observatory Data Management Plan". RIO 2: e9419. doi:10.3897/rio.2.e9419.
11. Fisher, J.; Nading, A.M. (2016). "A Political Ecology of Value: A Cohort-Based Ethnography of the Environmental Turn in Nicaraguan Urban Social Policy". RIO 2: e8720. doi:10.3897/rio.2.e8720.
12. Gatto, L. (2017). "Data Management Plan for a Biotechnology and Biological Sciences Research Council (BBSRC) Tools and Resources Development Fund (TRDF) Grant". RIO 3: e11624. doi:10.3897/rio.3.e11624.
13. McWhorter, J.; Wright, D.; Thomas, J. (2016). "Coastal Data Information Program (CDIP)". RIO 2: e8827. doi:10.3897/rio.2.e8827.
14. Neylon, C. (2017). "Data Management Plan: IDRC Data Sharing Pilot Project". RIO 3: e14672. doi:10.3897/rio.3.e14672.
15. Nichols, H.; Stolze, S. (2016). "Migration of legacy data to new media formats for long-time storage and maximum visibility: Modern pollen data from the Canadian Arctic (1972/1973)". RIO 2: e10269. doi:10.3897/rio.2.e10269.
16. Pannell, J.L. (2016). "Data Management Plan for PhD Thesis "Climatic Limitation of Alien Weeds in New Zealand: Enhancing Species Distribution Models with Field Data"". RIO 2: e10600. doi:10.3897/rio.2.e10600.
17. Traynor, C. (2017). "Data Management Plan: Empowering Indigenous Peoples and Knowledge Systems Related to Climate Change and Intellectual Property Rights". RIO 3: e15111. doi:10.3897/rio.3.e15111.
18. Wael, R. (2017). "Data Management Plan: HarassMap". RIO 3: e15133. doi:10.3897/rio.3.e15133.
19. White, E.P. (2016). "Data Management Plan for Moore Investigator in Data Driven Discovery Grant". RIO 2: e10708. doi:10.3897/rio.2.e10708.
20. Woolfrey, L. (2017). "Data Management Plan: Opening access to economic data to prevent tobacco related diseases in Africa". RIO 3: e14837. doi:10.3897/rio.3.e14837.
21. Xu, H.; Ishida, M.; Wang, M. (2016). "A Data Management Plan for Effects of particle size on physical and chemical properties of mine wastes". RIO 2: e11065. doi:10.3897/rio.2.e11065.
22. Peisert, S.; Welch, V.; Adams, A. et al. (2017). "Open Science Cyber Risk Profile (OSCRP)". IUScholarWorks. http://hdl.handle.net/2022/21259. Retrieved 19 November 2019.
23. Wilkinson, M.D.; Dumontier, M.; Aalbersberg, I.J. et al. (2016). "The FAIR Guiding Principles for scientific data management and stewardship". Scientific Data 3: 160018. doi:10.1038/sdata.2016.18. PMC 4792175. PMID 26978244. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4792175.
24. Ferreira, F.; Coimbra, M.E.; Bairrão, R. et al. (2014). "Data Management in Metagenomics: A Risk Management Approach". International Journal of Digital Curation 9 (1): 41–56. doi:10.2218/ijdc.v9i1.299.
25. University Computing Centre (Rechenzentrum) (2019). "SynFo - Creating synergies on the operational level of research data management". Kiel University. https://www.rz.uni-kiel.de/en/projects/synfo-creating-synergies-on-the-operational-level-of-research-data-management.
26. German Federal Office for Information Security (22 December 2016). "IT-Grundschutz-catalogues 15th version - 2015 (Draft)". Archived from the original on 28 January 2020. https://web.archive.org/web/20200128211607/https://www.bsi.bund.de/SharedDocs/Downloads/DE/BSI/Grundschutz/International/GSK_15_EL_EN_Draft.html. Retrieved 19 November 2019.
27. Collins, S.; Genova, F.; Harrower, N. (26 November 2018). "Turning FAIR into reality". European Commission. doi:10.2777/1524. https://op.europa.eu/en/publication-detail/-/publication/7769a148-f1f6-11e8-9982-01aa75ed71a1/language-en.
28. "Regulation (EU) No 511/2014 of the European Parliament and of the Council of 16 April 2014 on compliance measures for users from the Nagoya Protocol on Access to Genetic Resources and the Fair and Equitable Sharing of Benefits Arising from their Utilization in the Union Text with EEA relevance". EUR-Lex. European Union. 16 April 2014. http://data.europa.eu/eli/reg/2014/511/oj. Retrieved 1 December 2019.
29. "Implementation of Nagoya Protocol: A comparison between The Netherlands, Belgium and Germany". vo.eu. 18 June 2018. https://publications.vo.eu/implementation-of-nagoya-protocol/. Retrieved 30 November 2019.
30. "Frequently Asked Questions". Creative Commons. https://creativecommons.org/faq/. Retrieved 8 December 2019.
31. Lipinski, T.A. (2012). Librarian's Legal Companion for Licensing Information Resources and Legal Services. Neal-Schuman Publishers. p. 312. ISBN 9781555706104.
32. Article 29 Data Protection Working Party (6 February 2018). "Working document on Adequacy Referential (wp254rev.01)". European Commission. https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=614108. Retrieved 1 December 2019.
33. European Commission (14 January 2019). "Adequacy decisions". European Commission. https://ec.europa.eu/info/law/law-topic/data-protection/international-dimension-data-protection/adequacy-decisions_en. Retrieved 1 December 2019.
34. "Consolidated text: Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance)". EUR-Lex. European Union. 27 April 2016. http://data.europa.eu/eli/reg/2016/679/2016-05-04. Retrieved 30 November 2019.
35. Reich, V.; Rosenthal, D.S.H. (2009). "LOCKSS (Lots of Copies Keep Stuff Safe)". New Review of Academic Librarianship 6 (1): 155–61. doi:10.1080/13614530009516806.
36. Peng, R. (2015). "The reproducibility crisis in science: A statistical counterattack". Significance 12 (3): 30–32. doi:10.1111/j.1740-9713.2015.00827.x.