Difference between revisions of "Template:Article of the week"

From LIMSWiki
(Updated article of the week text)
<div style="float: left; margin: 0.5em 0.9em 0.4em 0em;">[[File:Fig1 Perez-Castillo Sensors2018 18-9.png|240px]]</div>
<div style="float: left; margin: 0.5em 0.9em 0.4em 0em;">[[File:Fig1 Wang BMCMedInfoDecMak2019 19-1.png|240px]]</div>
'''"[[Journal:DAQUA-MASS: An ISO 8000-61-based data quality management methodology for sensor data|DAQUA-MASS: An ISO 8000-61-based data quality management methodology for sensor data]]"'''
'''"[[Journal:Design and evaluation of a LIS-based autoverification system for coagulation assays in a core clinical laboratory|Design and evaluation of a LIS-based autoverification system for coagulation assays in a core clinical laboratory]]"'''


The [[internet of things]] (IoT) introduces several technical and managerial challenges concerning the use of data generated and exchanged by and between the various smart, connected products (SCPs) that make up an IoT system (i.e., physical, intelligent devices with sensors and actuators). Given the volume and the heterogeneous exchange and consumption of such data, it is paramount to [[Quality assurance|assure]] that data quality levels are maintained at every step of the data chain/lifecycle; otherwise, the system may fail to meet its expected function. While data quality (DQ) is a mature field, existing solutions are highly heterogeneous. We therefore propose that companies, developers, and vendors align their data quality management mechanisms and artifacts with well-known best practices and [[Specification (technical standard)|standards]], such as those provided by ISO 8000-61. This standard enables a process approach to data quality management, overcoming the difficulties of isolated data quality activities. This paper introduces DAQUA-MASS, a methodology based on ISO 8000-61 for data quality management in sensor networks. ('''[[Journal:DAQUA-MASS: An ISO 8000-61-based data quality management methodology for sensor data|Full article...]]''')<br />
An autoverification system for coagulation consists of a series of rules that allows normal data to be released without manual verification. With new advances in [[medical informatics]], the [[laboratory information system]] (LIS) has growing potential for the use of autoverification, allowing rapid and accurate verification of [[clinical laboratory]] tests. The purpose of this study was to develop a LIS-based autoverification system and evaluate its validity and efficiency.
 
Autoverification decision rules—including quality control, analytical error flag, critical value, limited range check, delta check, and logical check rules, as well as patients' historical information—were integrated into the LIS. Autoverification limit ranges were constructed based on the 5th and 95th percentiles. The four most commonly used coagulation assays—prothrombin time (PT), activated partial thromboplastin time (APTT), thrombin time (TT), and fibrinogen (FBG)—were subjected to the autoverification protocols. ('''[[Journal:Design and evaluation of a LIS-based autoverification system for coagulation assays in a core clinical laboratory|Full article...]]''')<br />
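The rule set described above can be pictured as a simple release gate: a result passes only if every check succeeds. The following is a minimal sketch of two of those checks (a limited range check and a delta check); the function name, limit values, and delta threshold are hypothetical illustrations, not the actual rules or cutoffs validated in the study.

```python
# Hypothetical sketch of rule-based autoverification for a single
# coagulation result (e.g., PT in seconds). Limits and thresholds
# are illustrative only.

def autoverify(result, previous=None, limits=(10.0, 16.0), delta_pct=30.0):
    """Return True if the result may be released without manual review."""
    low, high = limits                  # e.g., a 5th/95th-percentile range
    if not (low <= result <= high):    # limited range check
        return False
    if previous is not None:           # delta check vs. patient history
        change = abs(result - previous) / previous * 100
        if change > delta_pct:
            return False
    return True                        # all checks passed; auto-release

print(autoverify(12.5))                 # in range, no history -> True
print(autoverify(25.0))                 # outside limit range  -> False
print(autoverify(12.5, previous=20.0))  # delta too large      -> False
```

In a production LIS, each failed check would typically route the result to a technologist's manual review queue rather than simply returning False.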
<br />
<br />
''Recently featured'':
: ▪ [[Journal:Security architecture and protocol for trust verifications regarding the integrity of files stored in cloud services|Security architecture and protocol for trust verifications regarding the integrity of files stored in cloud services]]
: ▪ [[Journal:CyberMaster: An expert system to guide the development of cybersecurity curricula|CyberMaster: An expert system to guide the development of cybersecurity curricula]]
: ▪ [[Journal:What Is health information quality? Ethical dimension and perception by users|What Is health information quality? Ethical dimension and perception by users]]
: ▪ [[Journal:Costs of mandatory cannabis testing in California|Costs of mandatory cannabis testing in California]]
: ▪ [[Journal:SCADA system testbed for cybersecurity research using machine learning approach|SCADA system testbed for cybersecurity research using machine learning approach]]
: ▪ [[Journal:An integrated data analytics platform|An integrated data analytics platform]]

Revision as of 15:52, 11 November 2019
