Information Theory Patterns

Per Wikipedia, informatics is used synonymously with information systems, information science, information theory, information engineering, information technology, information processing, computer science, information and computer science, computer science and engineering, and "modern computer science".

Huh.

Ok, let me look at information theory.  Information theory relates to the transmission, processing, extraction, understanding, and utilization of information.  In particular, it is concerned with reducing uncertainty in that endeavor: maximizing "signal" and reducing or removing "noise" which might cause the signal to be misinterpreted.

The following seem to be the logical patterns within information theory:

  • Logical statements: Logical statements are the building blocks of information.  Information is made up of some set of true logical statements. For example, "The sky is blue." is a logical statement. (Note that a statement is a type of sentence. Not all sentences are statements; for example, "What time is it?", which is a question, is not a statement.)
  • Logical connectors: Logical connectors enable compound or complex logical statements to be created. For example, connectors such as "AND", "OR", "NOT", and "IF...THEN" let you build arbitrarily compound/complex logical statements.
  • Logical associations: Logical statements can be associated with other logical statements, forming a relationship.  There are three broad categories of associations. (Note that a logical association is a type of logical statement.)
    • "Is-a" associations (a.k.a. generalization-specialization, type-subtype, class-subclass).  For example, "Cash and cash equivalents" is a type of Asset.
    • "Has-a" associations (a.k.a. whole-part, mereology). For example, "A balance sheet has part "Assets".
    • "Property-of" (a.k.a. trait, quality, attribute).  For example, "Assets has a property of balance type which is always "debit".
  • Logical assertions: A logical assertion (a.k.a. constraint, restriction, rule) is another category of logical statement.  For example, "Assets = Liabilities + Equity" is an assertion.
  • Logical structures: A logical structure is some set of logical statements (including logical associations) that belong together for some purpose.
  • Logical proof: A set of logical statements and logical associations (which are themselves a type of logical statement) forms a theory.  A logical proof establishes that there are no contradictions or inconsistencies within the set of logical statements that forms the logical theory; it demonstrates that all the logic within the system is working as would be expected.
  • Logical reasoning: Logical reasoning is used to process the logical statements and arrive at a conclusion, a logical proof, in a rigorous way.  There are two broad categories of logical reasoning: deductive and non-deductive.  Deductive reasoning yields conclusions that are certain (given true premises). Non-deductive reasoning can never be certain because it is based on probability, but it can still be helpful.
  • Machine-readable representation (physical format, technical format): Humans could read all of the logical statements and manually establish a logical proof of the information.  But to have a machine process the logical statements, you need to represent them in some form that the machine can understand (a sketch of one possible representation follows this list).
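
To make that last point concrete, here is a minimal sketch in Python. The structures and names (is_a, has_a, property_of, accounting_equation_holds, the example numbers) are purely illustrative assumptions, not any particular standard or tool; the point is simply that once statements, associations, and assertions are encoded in a form a machine can read, the machine can check them for consistency.

```python
# Minimal, illustrative representation of the logical patterns above.

# Logical associations (each tuple is itself a logical statement).
is_a = [("Cash and cash equivalents", "Asset")]              # "is-a" (type-subtype)
has_a = [("Balance sheet", "Assets")]                        # "has-a" (whole-part)
property_of = [("Assets", "balance type", "debit")]          # "property-of" (trait)

# Logical assertion: Assets = Liabilities + Equity.
def accounting_equation_holds(assets: float, liabilities: float, equity: float) -> bool:
    """Check the assertion 'Assets = Liabilities + Equity' against reported facts."""
    return assets == liabilities + equity

# Reported facts (made-up numbers for illustration).
facts = {"Assets": 100.0, "Liabilities": 60.0, "Equity": 40.0}

# A machine can now verify that the facts are consistent with the assertion,
# which is the essence of automating a (very small) logical proof.
consistent = accounting_equation_holds(facts["Assets"], facts["Liabilities"], facts["Equity"])
print("Facts consistent with assertion:", consistent)  # True
```
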
All of the above can be done by humans. Automation is the process of getting a smart machine, such as a computer, to perform that processing.  Informatics is a set of philosophies, principles, and techniques for performing that processing.  In summary, information theory combines logic, mathematics, and computation to understand how information flows and can be effectively communicated!
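
As one small illustration of automating the deductive reasoning described above, the following Python sketch applies IF...THEN rules by forward chaining (repeated modus ponens). The facts and rules are made-up examples; the point is that a machine can mechanically derive every conclusion that follows with certainty from the premises.

```python
# Known-true logical statements.
facts = {"X is a cash and cash equivalents item"}

# IF...THEN rules built with logical connectors (premise, conclusion).
rules = [
    ("X is a cash and cash equivalents item", "X is an Asset"),   # is-a generalization
    ("X is an Asset", "X has a balance type of debit"),           # property-of
]

# Forward chaining: keep applying rules until no new statements can be derived.
derived = set(facts)
changed = True
while changed:
    changed = False
    for premise, conclusion in rules:
        if premise in derived and conclusion not in derived:
            derived.add(conclusion)
            changed = True

# derived now contains the original fact plus both deduced conclusions.
print(sorted(derived))
```
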
