08/02/2023
Consumer-level Generative AI in medicine should be regulated.
You probably expected me to be the last person to say this, since we started presenting on GANs in 2015 and publishing in generative AI in 2016. But generative AI built for internal use or B2B applications must go through rigorous validation before it reaches the clinic, and businesses will make every effort to validate.
Consumers, however, may self-medicate. To account for this, in the short term we need benchmarking tools, large GPT-friendly biomedical training sets, clear warnings, and some level of regulation.
Once consumers get addicted to conversational generative AI, they are very likely to trust these systems, and very likely to self-medicate on that basis.
Here is a brief article outlining these points.
Generative artificial intelligence tools such as ChatGPT have many uses in medicine, but their lack of accuracy poses problems.