by Impact Journals LLC
An editorial titled "Functional information offers individualized adaptive cancer therapies" was published in the journal Oncoscience on July 19, 2024.
As introduced in this editorial, the Oxford Computer Science Dictionary offers both general and technical definitions of information. Generally, information is anything that can cause a change in a human mind's opinion about the current state of the real world. Technically, information is anything that reduces the uncertainty of a system's state.
Claude Shannon provided an objective measure of information, known as entropy (H), by mathematically defining it in terms of the uncertainty associated with transmitting a message from a source, through a channel, to a receiver in a noisy environment. According to the technical definition, information increases as uncertainty in a data set is reduced; entropy quantifies the uncertainty that remains.
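The editorial does not include code, but Shannon's standard formula, H = -Σ p log2(p), makes the point concrete: a distribution with less uncertainty carries lower entropy, and the drop in entropy is the information gained. A minimal sketch:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain for two outcomes (1 bit);
# a heavily biased coin is far less uncertain, so its entropy is lower.
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits
```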
In the editorial, R. Craig Herndon from the Department of Radiation Oncology at Shannon Medical Center in San Angelo, Texas, explains that Shannon information entropy, usually based on outcome probabilities or uncertainties in a data stream, can also be defined in terms of signal data uncertainty, often quantified by the standard deviation. He suggests that precision medicine research efforts will benefit from integrating functional information.
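The editorial does not spell out the formula it uses; the textbook link between entropy and standard deviation, assumed here purely for illustration, is the differential entropy of a Gaussian signal, h = ½ ln(2πeσ²), where a wider spread (larger σ) means more uncertainty:

```python
import math

def gaussian_differential_entropy(sigma):
    """Differential entropy h = 0.5 * ln(2*pi*e*sigma^2), in nats, of a Gaussian
    signal with standard deviation sigma (illustrative textbook relation only)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

# Doubling the standard deviation raises the entropy by ln(2) nats.
print(gaussian_differential_entropy(1.0))  # ~1.42 nats
print(gaussian_differential_entropy(2.0))  # ~2.11 nats
```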
"This application of functional information for adapting radiation therapy to individual patients requires an understanding of the systems involved to ensure there is an information exchange between the bioresponse and biomarker model," said the researchers.
More information: R. Craig Herndon, Functional information offers individualized adaptive cancer therapies, Oncoscience (2024). DOI: 10.18632/oncoscience.607
Provided by Impact Journals LLC