'''Information''', in its most restricted technical sense, is a sequence of symbols that can be interpreted as a message, recorded as signs, or transmitted as signals.<ref name="VonBaeyer">{{cite book |url=https://books.google.com/books?id=QpuZgAR8DJwC |author=Von Baeyer, Hans Christian |title=Information: The New Language of Science |publisher=Harvard University Press |pages=258 |year=2004 |isbn=0674013875}}</ref> Conceptually, information is the message (utterance or expression) being conveyed. Therefore, in a general sense, information is "knowledge communicated or received concerning a particular fact or circumstance."<ref name="DComDef">{{cite web |url=https://www.dictionary.com/browse/information |title=information |work=Dictionary.com Unabridged |publisher=Dictionary.com, LLC |accessdate=05 January 2022}}</ref>

From the stance of information theory, information is taken as a sequence of symbols from an alphabet, say an input alphabet χ, and an output alphabet ϒ. Information processing consists of an input-output function that maps any input sequence from χ into an output sequence from ϒ. The mapping may be probabilistic or determinate. It may have memory or be memoryless.<ref name="Wicker">{{cite book |title=Fundamentals of Codes, Graphs, and Iterative Decoding |author=Wicker, Stephen B.; Kim, Saejoon |publisher=Springer |year=2003 |url=https://books.google.com/books?id=rMu-R3FFG54C&pg=PA1 |pages=1 |isbn=1402072643}}
</ref>
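
As a minimal illustrative sketch (the alphabets and mappings below are hypothetical, not taken from the cited sources), such an input-output function can be modeled in a few lines of Python, in both deterministic and probabilistic memoryless forms:

<syntaxhighlight lang="python">
import random

# Hypothetical alphabets: an input alphabet (chi) and an output alphabet (upsilon)
CHI = ["a", "b", "c"]
UPSILON = ["0", "1"]

# Deterministic, memoryless mapping: each input symbol always yields
# the same output symbol, regardless of what came before it.
DETERMINISTIC_MAP = {"a": "0", "b": "1", "c": "1"}

def process_deterministic(sequence):
    """Map an input sequence over CHI to an output sequence over UPSILON."""
    return [DETERMINISTIC_MAP[symbol] for symbol in sequence]

# Probabilistic, memoryless mapping: each input symbol yields an output
# symbol drawn from its own probability distribution over UPSILON.
OUTPUT_WEIGHTS = {"a": [0.9, 0.1], "b": [0.2, 0.8], "c": [0.5, 0.5]}

def process_probabilistic(sequence):
    """Map each input symbol to a randomly drawn output symbol."""
    return [random.choices(UPSILON, weights=OUTPUT_WEIGHTS[symbol])[0]
            for symbol in sequence]

print(process_deterministic(["a", "b", "c"]))  # always ['0', '1', '1']
print(process_probabilistic(["a", "b", "c"]))  # e.g., ['0', '1', '0']
</syntaxhighlight>

Because neither function consults earlier symbols, both mappings are memoryless; a mapping with memory would additionally carry state from one symbol to the next.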

Information cannot be predicted and resolves uncertainty. The uncertainty of an event is measured by its probability of occurrence: the less probable an event, the greater its uncertainty, and the more information is required to resolve the uncertainty of that event. The amount of information is measured in bits.<ref name="Floridi">{{cite book |url=https://books.google.com/books?id=VupFqa3IJiUC |author=Floridi, Luciano |title=Information - A Very Short Introduction |publisher=Oxford University Press |pages=130 |year=2010 |isbn=0199551375}}</ref> The concept that ''information is the message'' has different meanings in different contexts. Thus the concept of information becomes closely related to notions of constraint, communication, control, data, form, instruction, knowledge, meaning, understanding, stimulation, pattern, perception, representation, and entropy.<ref name="Floridi" />
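
This inverse relationship is commonly formalized, following Shannon, as the self-information of an event <math>x</math> with probability <math>p(x)</math>:

<math>I(x) = -\log_2 p(x)</math>

A fair coin toss (<math>p = 1/2</math>) thus resolves one bit of uncertainty, an event with probability <math>1/8</math> resolves three bits, and a certain event (<math>p = 1</math>) carries no information at all.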

==Variations of information==

===As sensory input===
Often information can be viewed as a type of input to an organism or system. Some inputs are important to the function of the organism (for example, food) or to the system itself (energy) and are called causal inputs. Other inputs (information) are important only because they are associated with causal inputs and can be used to predict the occurrence of a causal input at a later time (and perhaps another place). Some information is important because of its association with other information, but eventually there must be a connection to a causal input.<ref name="Dusenberry">{{cite book |title=Sensory Ecology: How Organisms Acquire and Respond to Information |author=Dusenbery, David B. |publisher=W H Freeman Limited |year=1992 |url=https://books.google.com/books?id=6S1mQgAACAAJ |pages=558 |isbn=0716723336}}</ref>  

In practice, information is usually carried by weak stimuli that must be detected by specialized sensory systems and amplified by energy inputs before they can be functional to the organism or system. For example, light is often a causal input to plants but provides information to animals. The colored light reflected from a flower is too weak to do much photosynthetic work. However, the visual system of the bee detects it, and the bee's nervous system uses the information to guide the bee to the flower, where the bee often finds nectar or pollen, causal inputs serving a nutritional function.<ref name="Dusenberry" />

===As representation and complexity===
One theory says information is a concept that involves at least two related entities in order to make quantitative sense: a dimensionally defined category of objects "S" and any of its subsets "R". In essence "R" is a representation of "S"; it conveys representational (and hence, conceptual) information about "S". The amount of information that "R" conveys about "S" is equivalent to the rate of change in the complexity of "S" whenever the objects in "R" are removed from "S". Under this theory, the universal scientific constructs of pattern, invariance, complexity, representation, and information are unified under a novel mathematical framework.<ref name="VigoRepInfo">{{cite journal |title=Representational information: A new general notion and measure of information |author=Vigo, R. |journal=Information Sciences |volume=181 |issue=21 |pages=4847–4859 |year=2011 |doi=10.1016/j.ins.2011.05.020}}</ref><ref name="VigoCompUnc">{{cite journal |title=Complexity over Uncertainty in Generalized Representational Information Theory (GRIT): A Structure-Sensitive General Theory of Information |author=Vigo, R. |journal=Information |volume=4 |issue=1 |pages=1–30 |year=2013 |doi=10.3390/info4010001}}</ref> Among other things, the framework aims to overcome the limitations of Shannon-Weaver information when attempting to characterize and measure subjective information.
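
Schematically (as one illustrative reading of the claim above, not Vigo's exact formalism), if <math>\psi(S)</math> denotes the structural complexity of the category <math>S</math>, then the representational information conveyed by a subset <math>R</math> tracks the relative change in complexity when the objects of <math>R</math> are removed:

<math>I(R \to S) \propto \frac{\psi(S) - \psi(S \setminus R)}{\psi(S)}</math>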

===As an influence which leads to a transformation===
[[File:Image 7 Information Relationship Model.jpg|thumb|300px|right|Visual representation of the relationship between language, data/facts, information, and knowledge]]
Information can also be defined as any type of pattern that influences the formation or transformation of other patterns. In this sense, there is no need for a conscious mind to perceive, much less appreciate, the pattern. Consider, for example, DNA. The sequence of nucleotides is a pattern that influences the formation and development of an organism without any need for a conscious mind.<ref name="ShannonMath">{{cite book |url=https://books.google.com/books?id=dk0n_eGcqsUC |title=The Mathematical Theory of Communication |author=Shannon, Claude E. |publisher=University of Illinois Press |pages=117 |year=1949 |isbn=0252725484}}</ref><ref name="Casagrande">{{cite journal |journal=Georgia Journal of Ecological Anthropology |title=Information as Verb: Re-conceptualizing Information for Cognitive and Ecological Models |author=Casagrande, D.G. |year=1999 |volume=3 |issue=1 |pages=4–13 |doi=10.5038/2162-4593.3.1.1}}</ref>  

Systems theory at times seems to refer to information in this sense, assuming information does not necessarily involve any conscious mind, and patterns circulating (due to feedback) in the system can be called information. In other words, it can be said that information in this sense is something potentially perceived as representation, though not created or presented for that purpose. For example, anthropologist and social scientist Gregory Bateson defined "information" as a "difference that makes a difference."<ref name="Bateson">{{cite book |url=https://books.google.com/books?id=HewJbnQmn1gC |title=Steps to an Ecology of Mind: Collected Essays in Anthropology, Psychiatry, Evolution, and Epistemology |author=Bateson, Gregory |publisher=University of Chicago Press |year=1972 |pages=448–466 |isbn=0226039056}}</ref>

If, however, the premise of "influence" implies that information has been perceived by a conscious mind and also interpreted by it, the specific context associated with this interpretation may cause the transformation of the information into knowledge. Complex definitions of both "information" and "knowledge" make such semantic and logical analysis difficult, but the condition of "transformation" is an important point in the study of information as it relates to knowledge, especially in the business discipline of knowledge management. In this practice, tools and processes are used to assist a knowledge worker in performing research and making decisions, including steps such as the following<ref name="KMSysAlavi">{{cite web |url=http://knowledge.emory.edu/papers/1005.pdf |archiveurl=https://web.archive.org/web/20140326193732/http://knowledge.emory.edu/papers/1005.pdf |format=PDF |title=Knowledge Management and Knowledge Management Systems: Conceptual Foundations and Research Issues |author=Alavi, M.; Leidner, D.E. |publisher=Emory University |date=June 1999 |archivedate=26 March 2014 |accessdate=05 January 2022}}</ref><ref>{{Cite book |last=Riley, J. |year=2017 |title=Understanding Metadata: What Is Metadata, and What Is It For? |url=https://groups.niso.org/apps/group_public/download.php/17446/Understanding%20Metadata.pdf |format=PDF |publisher=National Information Standards Organization |isbn=978-1-937522-72-8}}</ref> (a brief code sketch follows the list):

*reviewing information in order to effectively derive value and meaning
*referencing metadata if any is available
*establishing a relevant context, often selecting from many possible contexts
*deriving new knowledge from the information
*making decisions or recommendations from the resulting knowledge
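
The following minimal Python sketch shows how such steps might be chained together; all function and field names are hypothetical illustrations, not the interface of any particular knowledge management tool:

<syntaxhighlight lang="python">
# A toy pipeline mirroring the steps above; all names are illustrative.

def review(information):
    """Review raw information to derive preliminary value and meaning."""
    return {"text": information, "reviewed": True}

def attach_metadata(item, metadata=None):
    """Reference metadata if any is available."""
    item["metadata"] = metadata or {}
    return item

def set_context(item, context):
    """Establish one relevant context, selected from many possibilities."""
    item["context"] = context
    return item

def derive_knowledge(item):
    """Derive new knowledge by interpreting information in its context."""
    return f"In the context of {item['context']}: {item['text']}"

def recommend(knowledge):
    """Make a decision or recommendation from the resulting knowledge."""
    return f"Recommendation based on: {knowledge}"

item = review("Sample batch failed quality control twice this month")
item = attach_metadata(item, {"source": "LIMS export"})
item = set_context(item, "laboratory quality control")
print(recommend(derive_knowledge(item)))
</syntaxhighlight>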

The Danish Dictionary of Information Terms suggests, however, that information only provides an answer to a posed question. Whether the answer provides knowledge depends on the informed person.<ref name="DDIT">{{cite web |url=http://www.informationsordbogen.dk/concept.php?cid=902 |archiveurl=https://web.archive.org/web/20130824075255/http://www.informationsordbogen.dk/concept.php?cid=902 |title=information |work=Informationsordbogen |publisher=Informationsvidenskabelige Akademi |archivedate=24 August 2013 |accessdate=05 January 2022}}</ref> Thus a generalized definition of the transformation concept could be "information represents the answer to a specific question."

===As a property in physics===
Information has had a well-defined meaning in physics.<ref name="Zurek">{{cite book |url=https://books.google.com/books?id=RQpQDwAAQBAJ |title=Complexity, Entropy, and the Physics of Information: The Proceedings of the 1988 Workshop on Complexity, Entropy, and the Physics of Information |editor=Zurek, Wojciech Hubert |publisher=Westview Press |pages=530 |year=1990 |isbn=0201515067}}</ref> However, in 2003 theoretical physicist J. D. Bekenstein claimed that a growing trend in physics was to define the physical world as being made up of information itself.<ref name="JDHolo">{{cite journal |url=https://www.scientificamerican.com/article/information-in-the-holographic-univ/ |title=Information in the Holographic Universe |author=Bekenstein, J.D. |journal=Scientific American |volume=289 |issue=2 |pages=58–65 |date=August 2003 |accessdate=05 January 2022}}</ref> Examples of this include the phenomenon of quantum entanglement, where particles can interact without reference to their separation or the speed of light. Information itself cannot travel faster than light, even if the information is transmitted indirectly. This could lead to all attempts at physically observing a particle with an "entangled" relationship to another being slowed down, even though the particles are not connected in any way other than by the information they carry.<ref name="Zurek" />

Another link is demonstrated by the Maxwell's demon thought experiment, which establishes a direct relationship between information and another physical property, entropy. A consequence is that it is impossible to destroy information without increasing the entropy of a system; in practical terms, this often means generating heat.<ref name="PhysOrgMax">{{cite web |url=https://phys.org/news/2010-11-maxwell-demon-energy.html |title=Maxwell's demon demonstration turns information into energy |author=Edwards, L. |work=Phys.org |date=15 November 2010 |accessdate=05 January 2022}}</ref>
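
The quantitative version of this statement is usually given as Landauer's principle: erasing a single bit of information dissipates, at minimum, the heat

<math>E_{\min} = k_B T \ln 2</math>

where <math>k_B</math> is Boltzmann's constant and <math>T</math> is the absolute temperature of the environment, about <math>3 \times 10^{-21}</math> joules per bit at room temperature.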

===As records===
Records are specialized forms of information, produced consciously or as by-products of business activities or transactions and retained because of their value. Organizations value records as evidence of activity, but they may also be retained for their informational value. Sound records management ensures the integrity of records is preserved for as long as they are required.

The international standard on records management, ISO 15489, defines records as "information created, received, and maintained as evidence and information by an organization or person, in pursuance of legal obligations or in the transaction of business."<ref name="ISO15489">{{cite web |url=https://www.iso.org/standard/62542.html |title=ISO 15489-1:2016 Information and documentation — Records management — Part 1: Concepts and principles |publisher=International Organization for Standardization |date=April 2016 |accessdate=05 January 2022}}</ref>

The International Committee on Archives (ICA), Committee on Electronic Records defined a record as "recorded information produced or received in the initiation, conduct, or completion of an institutional or individual activity and that comprises content, context, and structure sufficient to provide evidence of the activity."<ref name="ICA/CER">{{cite web |url=https://www.ica.org/en/networked-electronic-information-internet-and-intranet-environments-2000 |format=PDF |title=Networked Electronic Information in the Internet and Intranet Environments |publisher=International Committee on Archives (ICA), Committee on Electronic Records |date=27 August 2000 |accessdate=05 January 2022}}</ref>

Records may be maintained to retain corporate memory of the organization or to meet legal, fiscal, or accountability requirements imposed on the organization. In 2005 legal expert Anthony Willis elaborated on this view, stating that the sound management of business records and information delivered "...six key requirements for good corporate governance ... transparency; accountability; due process; compliance; meeting statutory and common law requirements; and security of personal and corporate information."<ref name="Willis05">{{cite journal |title=Corporate governance and management of information and records |author=Willis, A. |journal=Records Management Journal |volume=15 |issue=2 |pages=86–97 |year=2005 |doi=10.1108/09565690510614238}}</ref>

==Technologically mediated information==
In 2011 scientists Martin Hilbert and Priscila López estimated that the world's technological capacity to store information grew from 2.6 (optimally compressed) exabytes in 1986 – the informational equivalent of less than one 730-MB CD-ROM per person (539 MB per person) – to 295 (optimally compressed) exabytes in 2007.<ref name="HilbertLopez2011">{{cite journal |title=The World’s Technological Capacity to Store, Communicate, and Compute Information |author=Hilbert, M.; López, P. |journal=Science |volume=332 |issue=6025 |pages=60–65 |year=2011 |doi=10.1126/science.1200970}}</ref> This is the informational equivalent of almost 61 CD-ROMs per person in 2007.<ref name="Hilbertvideo2011">{{cite video |url=https://www.youtube.com/watch?v=iIKPjOuwqHo |title=World_info_capacity_animation |author=Hilbert, M. |work=YouTube |date=11 June 2011 |accessdate=05 January 2022}}</ref>
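
As a rough arithmetic check (assuming a 2007 world population of roughly 6.6 billion, a figure not stated in the passage above), these numbers are consistent:

<math>\frac{295 \times 10^{9}\ \text{GB}}{6.6 \times 10^{9}\ \text{people}} \approx 44.7\ \text{GB per person}, \qquad \frac{44.7\ \text{GB}}{0.73\ \text{GB per CD-ROM}} \approx 61\ \text{CD-ROMs per person}</math>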

Hilbert and López also stated that the world's combined technological capacity to receive information through one-way broadcast networks was the informational equivalent of 174 newspapers per person per day in 2007,<ref name="HilbertLopez2011" /> while the world's combined effective capacity to exchange information through two-way telecommunication networks was the informational equivalent of six newspapers per person per day.<ref name="Hilbertvideo2011" />

==Information and semiotics==
Scientists can also explain information in terms of signs and signal-sign systems. Signs themselves can be considered in terms of four interdependent levels, layers, or branches of semiotics: pragmatics, semantics, syntax, and empirics. These four layers serve to connect the social world with the physical or technical world. The four branches of semiotics are described as follows<ref name="InfoSysBD">{{cite book |url=https://books.google.com/books?id=cCqdQAAACAAJ |title=Information Systems: An Introduction to Informatics in Organisations |author=Beynon-Davies, P. |publisher=Palgrave Macmillan Limited |year=2002 |pages=595 |isbn=0333963903}}</ref><ref name="BISBD">{{cite book |url=https://books.google.com/books?id=pdUROgAACAAJ |title=Business Information Systems |author=Beynon-Davies, P. |publisher=Palgrave Macmillan Limited |year=2009 |pages=512 |isbn=023020368X}}</ref>:

#''pragmatics'': the purpose of communication - Pragmatics links the issue of signs with the context within which signs are used. The focus of pragmatics is on the intentions of living agents underlying communicative behavior. In other words, pragmatics links language to action.
#''semantics'': the meaning of a message conveyed in a communicative act - Semantics considers the content of communication, the meaning of signs, and the association between signs and behavior. The study of semantics links symbols and their referents or concepts, particularly the way in which signs relate to human behavior.
#''syntax'': the formalism used to represent a message - Syntax considers the form of communication in terms of the logic and grammar of sign systems. Syntax focuses on form rather than the content of signs and sign systems.
#''empirics'': the signals used to carry a message - Empirics focuses on the physical characteristics of the medium of communication and is devoted to the study of communication channels and their characteristics, e.g., sound, light, and electronic transmission.

In 2008, lexicographer Sandro Nielsen discussed the relationship between semiotics and information in relation to dictionaries. He introduced the concept of lexicographic information costs, which refers to the efforts users of dictionaries must make to first find the data sought and then understand that data so they can generate information.<ref name="Nielsen08">{{cite journal |url=http://lexikos.journals.ac.za/pub/article/download/483/179 |archiveurl=https://web.archive.org/web/20190428163910/http://lexikos.journals.ac.za/pub/article/download/483/179 |format=PDF |title=The Effect of Lexicographical Information Costs on Dictionary Making and Use |author=Nielsen, S. |journal=Lexikos |year=2008 |volume=18 |pages=170–189 |issn=1684-4904 |archivedate=28 April 2019 |accessdate=05 January 2022}}</ref>

Communication normally exists within the context of some social situation. The social situation sets the context for the intentions conveyed (pragmatics) and the form in which communication takes place. We express our intentions through a mutually understood collection of inter-related signs. Mutual understanding implies that the agents involved understand the chosen language in terms of its agreed syntax (syntactics) and semantics. The sender codes the message in the language and sends the message as signals along some communication channel (empirics). The chosen communication channel will have inherent properties that determine outcomes, such as the speed at which communication can take place and over what distance.
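
The standard quantitative expression of how a channel's inherent properties bound communication speed is the Shannon-Hartley theorem (a general result of information theory, not one derived in the texts cited here): a channel of bandwidth <math>B</math> hertz with signal-to-noise power ratio <math>S/N</math> supports reliable communication at a rate of at most

<math>C = B \log_2 \left(1 + \frac{S}{N}\right)</math>

bits per second, regardless of the coding scheme used.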

==See also==
*[[Geographic information system]]
*[[Informatics]]
*[[Laboratory information management system]]
*[[Laboratory information system]]

==Further reading==
* {{cite book |url=https://books.google.com/books?id=VupFqa3IJiUC |author=Floridi, L. |title=Information - A Very Short Introduction |publisher=Oxford University Press |pages=130 |year=2010 |isbn=0199551375}}
* {{cite book |url=https://plato.stanford.edu/entries/information-semantic/ |title=Semantic Conceptions of Information |work=The Stanford Encyclopedia of Philosophy |publisher=Stanford University |author=Floridi, L. |editor=Zalta, Edward N |edition=Spring 2013}}
* {{cite web |url=http://fp.optics.arizona.edu/frieden/fisher_information.htm |archiveurl=https://web.archive.org/web/20121205005223/http://fp.optics.arizona.edu/frieden/fisher_information.htm |title=Fisher Information, a New Paradigm of Science |author=Frieden, B.R. |publisher=Optical Sciences Center, Univ. of Arizona |date=20 August 2012 |archivedate=05 December 2012}}
* {{cite book |url=https://books.google.com/books?id=QpuZgAR8DJwC |author=Von Baeyer, H.C. |title=Information: The New Language of Science |publisher=Harvard University Press |pages=258 |year=2004 |isbn=0674013875}}
* {{cite book |url=https://books.google.com/books?id=yX9QAAAAMAAJ |author=Young, P. |title=The Nature of Information |publisher=Praeger |pages=192 |year=1987 |isbn=0275926982}}
==External links==
*[http://www.informationsordbogen.dk/ Informationsordbogen.dk], the Danish Dictionary of Information Terms / Informationsordbogen

==Notes==
Some elements of this article are reused from [https://en.wikipedia.org/wiki/Information the Wikipedia article].

==References==
{{Reflist|colwidth=30em}}


<!---Place all category tags here-->
[[Category:Informatics]]
[[Category:Information science]]
