
In a new study, researchers from the Lifespan Institute at the University of Kansas revealed that parents trust health advice from artificial intelligence (AI) tools like ChatGPT more than advice from healthcare professionals.

“Participants found minimal differences between vignettes written by experts and vignettes generated by the prompt-engineered ChatGPT,” says Kelisa Leslie-Miller, doctoral student in clinical child psychology at the university and lead author of the study. “When the vignettes were statistically significantly different, ChatGPT was rated as more valid and reliable.”

Parents trust ChatGPT’s advice over that of medical professionals. Maria Sabetova – stock.adobe.com

The team conducted a study with 116 parents aged 18 to 65 who were given health texts about children.

Each participant analyzed the content and determined whether they believed it was created by ChatGPT or by health care professionals without knowing the original author.

Although this study did not examine why parents trust ChatGPT more, it does detail factors that may contribute to their preference.

Jim Boswell, president and CEO of OnPoint Healthcare Partners, who has experience developing AI-based platforms, believes that ChatGPT’s direct way of presenting information makes it easier for people to digest.

“I can understand why [parents], without knowing the source, would prefer the words of the AI,” says Mordechai Raskas, MD, EdM, chief medical information officer and director of telemedicine at PM Pediatric Care. “Think of AI as the ultimate salesperson; it knows exactly what to say to win you over.”

Parents prefer to rely on AI as they can get instant answers to their problems without waiting for a doctor’s appointment.

Each participant analyzed the content and determined whether they believed it was created by ChatGPT or by health care professionals without knowing the original author. Kaspar Grinvalds – stock.adobe.com

However, while using ChatGPT may be a quick fix for many parents, it has some drawbacks.

“Information may be inaccurate or may not be appropriate for specific circumstances. For example, prescribing a drug for a child who is too young or suggesting the wrong treatment can lead to a wide range of dangerous consequences,” says Leslie Miller.

Experts recommend checking the source of AI-generated answers, or consulting a medical professional, before acting on them.

“Reputable health content usually credits qualified medical authors or health professionals and links to research-backed sources,” Boswell added.

Parents enjoy ChatGPT’s fast responses, with no waiting for an appointment. AnnaStills – stock.adobe.com

AI tools like ChatGPT collect information from various online sources and compile it into a single response. But when it comes to AI health information, the answers lack a medical expert’s opinion personalized to the patient.

“Relying on these tools for medical advice may miss symptoms, misinterpret serious conditions, or delay seeking appropriate care,” says Boswell. “For children, especially, small health problems can escalate quickly, so it’s important to have a qualified professional evaluate the condition.”

Leslie-Miller recommends that parents also use trusted online medical sources such as the American Academy of Pediatrics (AAP), the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO). Some hospitals also offer health information and advice from their healthcare providers.

“Reading and searching online can be very helpful,” says Dr. Raskas. “It just depends on the context, and having a trusted source or professional to help you digest what you’ve read is important.”
