skmri.org – Directed by Henry Stevenson-Perez, MD
Data. Information. Knowledge. Knowledge-Physics.

The Power to Remember our Future
By removing "Probability" from Human & Machine reasoning





The Power of the SKMRI Knowledge-Physics Lab

What Makes Our SKMRI Knowledge-Physics Lab Discoveries So Valuable To America In 2024?

The SKMRI Knowledge-Physics Team is the first group of scientists in the world to recognize how Shannon’s Entropy Law equations precisely define the operation of data, information & knowledge in biological systems – in ALL biological systems, including humans!

According to Shannon’s Entropy Law (a rigorously tested scientific equation for measuring uncertainty), “data” are functionally silent with regard to uncertainty. In a very troubling fashion, “information” mainly just demands attention with regard to uncertainty. Ultimately, only “knowledge” is capable of actually predicting future events on a field (thus reliably lowering Shannon’s Entropy, i.e., uncertainty). Therefore, the easiest and most accurate way to describe the scientific difference between “data,” “information,” and “knowledge,” incorporating Shannon’s Entropy Law dynamics to illustrate their relationship with uncertainty, is as follows:
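Shannon’s entropy itself has a compact formula: H(X) = −Σ p(x) log₂ p(x), measured in bits. As a minimal sketch (not part of the SKMRI materials, just a standard illustration of the equation), the snippet below shows that a fair coin carries maximum uncertainty for two outcomes, while a predictable, biased coin carries less:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.

    Outcomes with zero probability contribute nothing to H."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin: maximum uncertainty for two outcomes -> 1 bit
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin: the outcome is more predictable -> lower entropy
print(shannon_entropy([0.9, 0.1]))   # about 0.469
```

Lower entropy means less uncertainty about what happens next, which is the sense in which “knowledge” is said here to reduce Shannon’s Entropy.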

Data (DATA): In science, “data” represent the raw scientific “facts” that underpin our human understanding of the structure and operations of the universe. In the context of Shannon’s Entropy Law, “data” alone are not capable of reducing Shannon’s Entropy (uncertainty) when dealing with complex problems.

Information (INFO): In science, “information” emerges as patterns or clusters within a field of data-sets that gain sufficient vibratory energy to attract the attention of an organism through sensory modalities (such as light and sound waves). From the perspective of Shannon’s Entropy Law, the receipt of “information” from a field of concern can either reduce or increase uncertainty. When accurate and useful information patterns from a field of concern are perceived, Shannon’s Entropy (uncertainty) decreases for the organism. In contrast, if the information about the field of concern is misleading or ambiguous, uncertainty will likely INCREASE. This potential danger highlights the importance of discerning accurate data-patterns within a field (“useful information”) from “noise,” “misinformation,” or “nonsense.”

Knowledge (KNOW): In science, “knowledge” is created within the executive structures of an organism whenever organized patterns or sequences of accurate and useful information empower the organism to predict and influence future outcomes (“events”) within the specific field of concern. In the context of Shannon’s Entropy Law, all “knowledge” is presumed (by definition) to reduce uncertainty for the organism in the field of concern. However, it is crucial to acknowledge the caveat that most human knowledge in science is associated with a “P-value,” which serves as an indicator of the statistical “probability of containing errors.” By leveraging knowledge with low P-values, humans can navigate uncertainty more effectively and thus shape future outcomes within the field of concern with greater precision (by properly employing the specific motions that are specified by the reliable knowledge).

Zero P-value Knowledge (ZPK): ZPK is knowledge about the total set of possible outcomes within a field of concern that has NO (zero) statistical probability of containing errors. This is the most reliable form of human knowledge. From the physics perspective of Shannon’s Entropy Law, ZPK is the only form of knowledge that is capable of reducing Shannon’s Entropy (uncertainty) to the lowest possible levels. This powerful 21st-century scientific realization, all based on Shannon’s Entropy Law, is sometimes called the “ZPK Advantage” – or “Data2Info2Know2ZPK.”
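In Shannon’s formula, the lowest possible entropy is exactly zero bits, reached when one outcome in the field of concern is certain. As a hedged illustration of that limiting case (standard information theory, not an SKMRI-specific result):

```python
import math

def shannon_entropy(probs):
    # H(X) in bits; zero-probability outcomes contribute nothing
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally likely outcomes: 2 bits of uncertainty remain
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0

# One outcome certain (probability 1): entropy drops to 0 bits,
# the lowest value Shannon's Entropy can take
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))       # 0.0
```

Zero entropy is the mathematical floor that the “ZPK Advantage” described above appeals to: with no residual uncertainty, the future outcome on the field is fully determined.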