Although many studies document the use of social media for sharing and requesting information about specific health conditions,1,2 whether individuals obtain diagnoses on social media platforms has not been investigated.3,4 This case study evaluated the occurrence of requests for a diagnosis on social media (crowd-diagnosis) and whether the requested diagnosis was sought as a second opinion after seeing a health care professional.

Methods

Reddit, a social media website with 330 million monthly active users that hosts more than 232 health forums,5 includes a large subreddit (r/STD) that allows users to publicly share “stories, concerns and questions” about “anything and everything STD [sexually transmitted disease]-related.” We selected r/STD because it focuses exclusively on a health topic of substantial public health concern.6 We first obtained all posts from inception of the subreddit in November 2010 through February 2019. r/STD metadata, including the number of posts and time stamps, were described for the full sample. To quantify the extent to which crowd-diagnosis occurred on r/STD, we drew a random sample of 500 posts. Three authors (A.L.N., E.C.L., and J.W.A.) independently coded whether each post requested a crowd-diagnosis and, if so, whether that request was made to obtain a second opinion after seeing a health care professional, using directed content analysis. R statistical software version 3.5.3 (R Foundation) was used to compute the percentage of posts (with 95% CIs) in the random sample that requested a primary or second-opinion crowd-diagnosis, contained an image of the physical signs, or received a reply, as well as the time to first reply. All analyses adhered to Reddit’s terms and conditions, relied on existing public data with nonidentifiable participants, and were exempted by the University of California, San Diego human research protections program.
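The proportion estimates described above can be sketched as follows. The study used R statistical software; this is an illustrative Python equivalent using a normal-approximation (Wald) 95% CI, with counts chosen only to mirror the reported magnitudes, not taken from the study data.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and normal-approximation 95% CI for a proportion."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)  # standard error of the proportion
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Illustrative counts: 290 of 500 sampled posts coded as requesting a
# crowd-diagnosis (the study reports 58% [95% CI, 54%-63%]; the exact
# CI method used in R may differ slightly from this approximation).
p, low, high = proportion_ci(290, 500)
print(f"{p:.0%} (95% CI, {low:.0%}-{high:.0%})")
```

The Wald interval is the simplest choice for a sketch; an exact binomial or Wilson interval (e.g., R's `prop.test` or `binom.test`) would give marginally different bounds at this sample size.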

Results

There were 16 979 total posts, 8 in November and December 2010, 2478 in 2017, 3375 in 2018, and 908 in January and February 2019 (Figure). Among an overlapping sample of 50 posts, there was 80% agreement among all coders on whether a post requested a crowd-diagnosis (Cohen κ = 0.73) and 88% agreement on whether the crowd-diagnosis was requested as a second opinion (Cohen κ = 0.53).
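The intercoder agreement statistic reported above can be computed as follows. This is a minimal sketch of Cohen κ for two raters on categorical labels (the study involved three coders, for whom κ is typically computed pairwise); the example labels are invented for illustration.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of items the two raters label identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical coding of 4 posts (1 = crowd-diagnosis request, 0 = not).
kappa = cohens_kappa([1, 1, 0, 0], [1, 0, 0, 0])
print(round(kappa, 2))  # 0.5: 75% raw agreement, 50% expected by chance
```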

Fifty-eight percent (95% CI, 54%-63%) of posts requested a crowd-diagnosis, of which 31% (95% CI, 26%-36%) included an image of the physical signs. Of those requesting a crowd-diagnosis, 20% (95% CI, 15%-24%) did so to obtain a second opinion after receiving a previous diagnosis by a health care professional. Examples of posts obtained from r/STD are in the Table.

Eighty-seven percent (95% CI, 83%-91%) of all posts requesting a crowd-diagnosis received a reply (mean responses, 1.7 [SD, 1.2]). The median time for the first response was 3.04 hours (range, 59 seconds to 8.8 weeks), and 79% of requests (95% CI, 74%-84%) were answered in less than 1 day.

Discussion

In the case of r/STD, requests for crowd-diagnoses were frequent, with most receiving a reply within hours, and many of these requests were for second opinions after obtaining an original diagnosis from a health care professional.

Limitations include that a single social media platform and a single medical condition were assessed. Crowd-diagnosis may be more or less common in other settings. Temporal trends were not analyzed. The characteristics of those posting and responding, the accuracy of the diagnoses, and whether individuals acted on the provided advice were not investigated.

Although crowd-diagnoses have the benefits of relative anonymity, rapid response, and multiple opinions, their underlying accuracy is unknown given that responders may operate with limited information about the patient and may lack medical training. Misdiagnosis could allow ongoing disease transmission, and others viewing a post may wrongly self-diagnose their own conditions.

Health care professionals could partner with social media outlets to promote the potential benefits of crowd-diagnosis while mitigating its potential harms, for example, by having trained professionals respond to posts to improve diagnoses and make referrals to health care centers.

Section Editor: Jody W. Zylke, MD, Deputy Editor.

Article Information

Accepted for Publication: August 20, 2019.

Corresponding Author: John W. Ayers, PhD, MA, Department of Medicine, University of California, San Diego, 9500 Gilman Dr, #333 CRSF, La Jolla, CA 92093 (ayers.john.w@gmail.com).

Author Contributions: Drs Nobles and Ayers had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Concept and design: Nobles, Leas, Dredze, Longhurst, Ayers.

Acquisition, analysis, or interpretation of data: Nobles, Leas, Althouse, Dredze, Smith, Ayers.

Drafting of the manuscript: Nobles, Leas, Longhurst, Smith, Ayers.

Critical revision of the manuscript for important intellectual content: All authors.

Statistical analysis: Nobles, Leas, Althouse.

Administrative, technical, or material support: Longhurst, Smith.

Supervision: Dredze, Longhurst, Smith.

Conflict of Interest Disclosures: Dr Dredze reported receipt of personal fees from Bloomberg LP and Good Analytics. Dr Smith reported receipt of grants from the National Institutes of Health. No other disclosures were reported.