Oct 19, 2020 -- In a recent study published in the international journal Nature, scientists from the City University of New York and other institutions challenged the use of artificial intelligence systems for breast cancer screening. Researcher Levi Waldron said, "We are concerned about the lack of transparency in artificial intelligence algorithms used in health applications."
In an earlier study, researchers from Google Health reported that an artificial intelligence system could diagnose breast cancer from mammography X-ray images more accurately than human radiologists.
The researchers behind the Nature comment argue that restrictive data-access programs, the lack of publicly available computer code, and unreported model parameters make it difficult for other researchers to confirm or extend that work.
In addition, their findings highlight how appropriate measures could be put in place to protect patient privacy while still allowing a broader research community to contribute and to correct potential errors.
Researcher Waldron says this back-and-forth is a striking example of the ongoing struggle over who controls data, a struggle that has been playing out for decades in biomedical and other fields of research.
Well-funded researchers who collect patient data rarely have an incentive to share it, yet they are the ones who write the informed-consent forms and set the terms of sharing; protecting patient privacy, even against hypothetical risks, can thus become a way to keep valuable data (and even the resulting parameters of a predictive model) away from other researchers.
The researchers believe that patients who volunteer for medical research may not be told about the trade-off between the privacy and the usefulness of their data, and certainly have little say in it; a broader conversation is therefore needed between researchers and patients, if not at the level of individuals then at least at the level of patient populations.
Original source: Haibe-Kains, B., Adam, G.A., Hosny, A. et al. Transparency and reproducibility in artificial intelligence. Nature 586, E14-E16 (2020). doi:10.1038/s41586-020-2766-y