Scrolling for answers: how reliable is mental health and neurodivergence-related information on social media?


Imagine the following scenario. You’re lying in bed, phone in hand, scrolling through TikTok. A video pops up on your For You feed: “5 signs you have ADHD.” Another video claims trauma rewires your brain in ways therapists won’t tell you about. It feels relatable, perhaps even reassuring – but is it accurate, and can it be trusted?

Social media platforms such as Facebook, Instagram, TikTok and YouTube have become a major source of mental health information, particularly for young people. They offer quick answers, shared experiences, and validation (Loades et al., 2025). But as previous Mental Elf blogs have highlighted, the same platforms can also amplify distress and spread false or misleading information – otherwise known as misinformation (read blogs by Margherita and Sarah).

Health-related misinformation is increasingly common on social media, with previous studies suggesting that up to 80% of health-related content is misinformation, and that it is more common than accurate health information (Suarez-Lledo & Alvarez-Galvez, 2021; Wang et al., 2019). But what about content that is specific to mental health and neurodivergence?

To address this gap in the literature, Carter and colleagues (2026) conducted a systematic review to understand how common mental health and neurodivergence-related misinformation is on social media, as well as assess the accuracy, quality and reliability of the information found.

Mental health and neurodivergence-related content on social media can be experienced as reassuring and validating, but is it always accurate, and can it be trusted?

Methods

The authors searched four databases for articles written in English that evaluated the quality and/or accuracy of mental health and neurodivergence-related information on social media. Studies were screened by one author at the title and abstract stage and the full-text stage, with 25% double-screened by another author; this process was repeated for data extraction. Google Scholar and the reference lists of included articles were also searched to identify any missing papers.

Study quality was assessed using a tool developed in a previous review on health misinformation on social media (Suarez-Lledo & Alvarez-Galvez, 2021). Again, one author critically appraised all studies, with 25% appraised by a second author. The mean quality rating for included studies was 65%, indicating good quality; however, studies ranged from 41% (poor quality) to 80% (high quality).

Results

Study characteristics

Twenty-seven studies were included in this systematic review, with the majority evaluating YouTube (n = 18) and TikTok (n = 5). Almost a third of studies focused on neurodivergence, specifically autism (n = 4) and attention-deficit hyperactivity disorder (ADHD; n = 4), with the rest exploring various mental health diagnoses, including anorexia (n = 3), bipolar (n = 2), and obsessive compulsive disorder (OCD; n = 2). A total of 5,057 social media posts were analysed across studies.

What is misinformation?

Thirteen studies provided clear definitions of misinformation, most of which defined it as, “content which contained factually inaccurate and/or scientifically unsubstantiated claims”.

How much misinformation is on social media?

Prevalence rates for misinformation were reported in 17 studies. Misinformation was highest on TikTok (35%), whereas misinformation on YouTube was generally lower (22%), although this did vary by topic (e.g., 6.7% for dissociative identity disorder, 57% for MRI claustrophobia). YouTube Kids had the lowest rate of misinformation, with 0% for anxiety and depression and 9% for ADHD. The mean prevalence of misinformation on Facebook was 15% (n = 2), and the reported prevalence of misinformation on X/Twitter was 19% (n = 1). Generally, misinformation was more common for neurodivergence than mental health conditions.

What is the reliability and quality of information on social media?

YouTube content was generally more reliable and of higher quality than other social media platforms. However, this was not consistent and does not necessarily mean that the content was good quality or reliable.

Content created by professionals was usually more reliable and higher quality than content by non-professionals; however, some studies did suggest that professional and patient-created content was equally reliable.

Mental health and neurodivergence-related misinformation appears to be highest on TikTok and generally lower on YouTube, with very few studies considering misinformation on Facebook, X/Twitter, or Instagram.

Conclusions

Carter et al. (2026) conclude that the reliability and quality of mental health and neurodivergence-related information on social media is highly variable, both between and within platforms, and that this could be due to a variety of reasons.

Interestingly, the authors highlight that, “this variability suggests that platform-specific factors, such as algorithmic systems and content moderation, may influence the spread of misinformation”. The search-based designs of YouTube and Facebook may be less problematic than algorithm-driven TikTok, but much more research is needed to properly understand what drives misinformation on social media, and what we can do to mitigate it.

The quality of mental health and neurodivergence-related information varies widely across platforms, and algorithm-driven feeds may play a key role in shaping what people see.

Strengths and limitations

This is a well-conducted systematic review that addresses a series of important and timely questions on mental health and neurodivergence-related misinformation on social media, providing a valuable contribution to the literature. The review was pre-registered on the Open Science Framework (but not PROSPERO, which is more commonly used) and adhered to appropriate guidelines, ensuring transparent and accurate reporting, which increases its reliability.

However, I can’t help but question why only 25% of articles were double screened at each stage, and why inter-rater reliability statistics were not calculated or reported. We’re increasingly seeing this in systematic reviews for pragmatic reasons – 100% double screening takes a considerable amount of time. But in a method that is known for its rigour precisely because of this thoroughness, I can’t help but wonder if potentially eligible and informative studies were missed, and if this review is as comprehensive as it could have been. Then again, the search strategy for the review was quite broad, with no restrictions on study type, population, or publication date – but this makes it even more imperative that 100% of articles were double screened, as the inclusion criteria were relatively open to interpretation.

The included studies were highly heterogeneous, making direct comparison difficult, and there was a clear platform imbalance, with 18 studies assessing YouTube and only two assessing Facebook, for example. This necessitates caution when interpreting the findings from this review; while we can be more confident of the findings in relation to YouTube, findings from other platforms seem tentative at best.

Finally, as is often the case with systematic reviews, there are methodological weaknesses associated with the included studies that have an impact on the review itself. While the mean rating of study quality was 65%, indicating good quality, it did dip as low as 41%. The authors provide a helpful table summarising the study quality ratings, and it seems that ratings were often lowered by issues with the search strategy, such as failing to mention search tools, use more than one search engine, or report initial hits, which limits replicability and overall transparency. More high-quality research is needed in this area.

While this is a well-conducted and timely systematic review, the lack of inter-rater reliability statistics and limited evidence across platforms means findings should be interpreted with caution.

Implications for practice

While the findings from this systematic review need to be treated with caution due to the overall lack of data that can be drawn on, there are interesting implications for anyone involved in public mental health. The important thing to remember is that people are already using these platforms for information on mental health and neurodivergence, regardless of reliability or quality. Now that we have some synthesised information about which platforms tend to be more or less reliable, we can think about how we might increase the reliability of this information, or how we can steer people towards more accurate, helpful content.

For clinicians, this study is an important reminder about the potential influence of social media on service users, and the kinds of conversations that may come up in practice. As a few of us at The Mental Elf wrote in a recent debate article, it is important to have open and honest conversations with individuals who are sharing or believe misinformation, ensuring that it is approached without judgement or dismissal (Higson-Sweeney et al., 2026). There are numerous ways for clinicians to support service users to think critically about the information they may see on social media, and to find content that is relatable and validating, as well as accurate and reliable.

For researchers, this reinforces the fact that sharing evidence-based findings with the general public is a vital part of the research process; if you’re not sharing your findings, then perhaps someone else will, and it may not be as accurate. There is also a need for further research in this area. Personally, I would love to see larger-scale research that focuses on specific mental health and neurodivergence-related misinformation on social media platforms. Research should directly compare misinformation across conditions, platforms and types of content (e.g., written posts, short-form videos, long-form videos), as well as consider how people interpret and use this information – just because it is available does not mean people are taking it at face value (Loades et al., 2025).

For policymakers, there is a need to develop clearer standards for social media platforms about the moderation of health-related information, including content on mental health and neurodivergence. Comprehensive definitions regarding what counts as misinformation and how it should be addressed (including discussions of algorithms) might help to reduce the amount of misinformation that is circulated or at least prompt those engaging with this content to be critical and to not necessarily believe everything they are reading.

Finally, I think it is important to caveat that misinformation should not be conflated with sharing lived experience. I wrote a blog last year about portrayals of CAMHS on TikTok, and while the videos analysed may not have presented a comprehensive picture of the service or accounted for different perspectives, they nevertheless reflected how these young people felt and what they had experienced. Again, I think there is a need to clarify what misinformation is and isn’t, to make sure that accurate information is shared without dismissal or disempowerment.

People are already using social media to seek out mental health and neurodivergence-related information, so what can we do to mitigate the impact of misinformation on these platforms?

Statement of interests

Nina Higson-Sweeney frequently collaborates with one of the authors of the current study, but had no knowledge of or involvement in this study. Beyond this, she has no other conflicts of interest to declare.

Links

Primary paper

Carter, A., Gracey, F., Moody, J., Ovens, A., & Chatburn, E. (2026). Quality, reliability and misinformation in mental health and neurodivergence content on social media: a systematic review. Journal of Social Media Research, 3(1), 30-47. https://doi.org/10.29329/jsomer.84

Other references

Hetrick, S. (2018). Social media: good and bad experiences and the impact on depression. The Mental Elf.

Higson-Sweeney, N. (2025). “I don’t need a cup of tea, I need some @#$%&! help”: #camhs through the lens of TikTok. The Mental Elf.

Higson‐Sweeney, N., Badenoch, D., & Tomlin, A. (2026). Debate: Standing up for science – how to combat misinformation in child mental health? Five recommendations for disentangling fact from fiction. Child and Adolescent Mental Health, 31(1), 74-76. https://doi.org/10.1111/camh.70055

Loades, M. E., Higson‐Sweeney, N., Teague, B., Leas, J., Payne‐Cook, C., Slastikova, A. V., … & Biddle, L. (2025). What do they look for and what do they find? A coproduced qualitative study on young people’s experiences of searching for mental health information online. Psychology and Psychotherapy: Theory, Research and Practice, 98(2), 373-395. https://doi.org/10.1111/papt.12550

Suarez-Lledo, V., & Alvarez-Galvez, J. (2021). Prevalence of health misinformation on social media: systematic review. Journal of Medical Internet Research, 23(1), e17187. https://doi.org/10.2196/17187

Wang, Y., McKee, M., Torbica, A., & Stuckler, D. (2019). Systematic literature review on the spread of health-related misinformation on social media. Social Science & Medicine, 240, 112552. https://doi.org/10.1016/j.socscimed.2019.112552

Zenoni, M. (2021). Social media peer support groups for OCD and related disorders: helpful or harmful? The Mental Elf.
