A Parent’s Guide to Understanding Research

with Dr. Eric Youngstrom and Melinda Wenner Moyer

We’re back! This season, we’re diving into the newly published research about the impacts of COVID-19 on children, parents, and families. Along the way, we’ll talk with the researchers who conducted these studies and identify key insights we can incorporate into our lives right now.

But before we review the research, we want to set the stage. How do we, as parents, understand the science behind articles, videos, and social media posts where we find this information?

Join Dr. Amanda Zelechoski as she discusses the scientific process, understanding research jargon, and how to vet news sources with Dr. Eric Youngstrom, Professor of Psychology and Neuroscience at UNC-Chapel Hill and Co-Founder of Helping Give Away Psychological Science (HGAPS), and Melinda Wenner Moyer, a science journalist, author, and faculty member at NYU’s Arthur L. Carter Journalism Institute.

Included in this episode about science literacy:

  • What does “good science” look like?

  • Why does it take so long for scientific findings to be published?

  • What does it mean when something is said to be “research-based” or “science-based”?

  • What tools can parents use to identify and access science reporting that they trust?


Additional Resources for Understanding Research

Resources from Our Guests


4 Considerations When Vetting Information Sources

Advice from Pandemic Parenting Guest Expert, Melinda Wenner Moyer

  1. Where does the information come from? Are you reading a press release or a published journal article? Was the article peer-reviewed?

    Be cautious of studies that haven’t been peer-reviewed because this means they haven’t been evaluated by other scientists. Melinda Wenner Moyer advises parents to find sources they can trust, such as science reporters at major newspapers, academic journals, or government agencies.

  2. Are there any limitations to the research?

    Make sure the study is open and transparent about what the limitations are. An example of a limitation could be that the study only involved people from a certain region in the U.S. or moms of a certain race or ethnicity.

  3. Does the study involve people?

    Although there is important research done using animals, those results don't necessarily extend to people. Melinda Wenner Moyer says, "Just because something works on mice doesn't mean it's going to work on people."

  4. How many people are in the study? How large is the sample size? Does it represent the larger population?

    Depending on the type of study, researchers should have a large enough sample size to draw conclusions that are credible. Often, the more people there are in a study sample, the better, in terms of being able to trust that the research findings apply to a larger population.


Bite-Sized Excerpts from This Episode

Talk to Your Kids About Where to Get Information


What does evidence-based mean?


Questions a Science Journalist Asks When Reading New Research


A Parent’s Guide to Understanding the Research Process


Meet Our Guests

Eric Youngstrom, PhD

Eric Youngstrom, PhD, is a professor of Psychology and Neuroscience and Psychiatry at the University of North Carolina at Chapel Hill, where he is also the Acting Director of the Center for Excellence in Research and Treatment of Bipolar Disorder. He earned his PhD in clinical psychology at the University of Delaware, and he completed his predoctoral internship training at Western Psychiatric Institute and Clinic before joining the faculty at Case Western Reserve University.

Dr. Youngstrom is a licensed psychologist, a teacher, President and Past-President of Divisions 5 & 53 of APA, and a parent of two daughters who are now thinking about "adulting" and careers. He married another psychologist (a romance from freshman year!) who has been his foil and "Reality Fairy" throughout his professional journey. He is passionate about bringing the best information to the people who would benefit. This focus has transformed his teaching and research.

He is the Co-Founder and Executive Director of Helping Give Away Psychological Science (HGAPS), a nonprofit, service-based education organization that brings the best psychological information to the people who will benefit most from it.

Melinda Wenner Moyer

Melinda Wenner Moyer is an award-winning science journalist, professional speaker, and author. Publishers Weekly described her 2021 book, How To Raise Kids Who Aren’t Assholes, as a “delightful mix of strategy and humor that shouldn’t be missed.” Her book has been excerpted in The New York Times, The Atlantic, and Parents magazine and is being translated into seven languages.

Moyer has authored close to 500 articles and is a weekly contributor to The New York Times, a contributing editor at Scientific American magazine, and a former Slate columnist. Her articles have been featured in The Washington Post, The Atlantic, Nature, and Real Simple, and she is a faculty member at NYU’s Arthur L. Carter Journalism Institute. 

Moyer has been the recipient of first place prizes from the Association of Health Care Journalists and the American Society of Journalists and Authors and was shortlisted for a James Beard Journalism Award and a National Magazine Award. Her work was featured in the 2020 Best American Science and Nature Writing anthology.


Full Audio Transcript

[THEME MUSIC UNDER INTRO] 

Dr. Amanda Zelechoski: Can you say... “This is Dr. Amanda Zelechoski.” 

Child 1: Why do we have to? 

Dr. Amanda Zelechoski: Just try it! 

Child 1: I can't. 

Dr. Amanda Zelechoski: Deep breath 

Child 2: [Laughter] This is Dr. Amanda Zelechoski. 

Child 3: [Yelling] Lindsay Malloy! Ah! 

Dr. Lindsay Malloy: Wait, say Doctor Lindsay Malloy. 

Child 3: [Yelling] Dr. Lindsay Malloy! 

Dr. Lindsay Malloy: [Laughing] No, come back! 

Child 4: This is Dr. Lindsay Malloy. 

Child 2: Welcome to the [unintelligible] Parenting Podcast [laughter].

Dr. Amanda Zelechoski: [Laughter]. 

Dr. Lindsay Malloy: One more time. 

Child 4: And then after that can I have a candy? 

Dr. Lindsay Malloy: [Laughing] No. 

Child 4: Please, Mommy!  

Dr. Lindsay Malloy: Okay, ready? 

Child 4: The Pandemic Parenting Podcast! 

Dr. Lindsay Malloy: Excellent! 

[MUSIC INTERLUDE] 

Dr. Amanda Zelechoski: Welcome to season two of the Pandemic Parenting Podcast! I'm Dr. Amanda Zelechoski. My co-host, Dr. Lindsay Malloy, is away on a much-deserved research sabbatical, so I'm going to do my best to hold things down for the both of us on this season of the podcast. Back in 2020, Lindsay and I founded Pandemic Parenting in order to share science-based research and to help all who care for kids navigate this challenging time together. Researchers around the world, including Lindsay and myself, have been hard at work researching the impact of COVID-19 on children and families. We've tracked more than two hundred of these studies being published, and this season we're going to discuss many of the direct implications for us parents. So let's dive in.

Please note that the information contained in this podcast and on the Pandemic Parenting website is intended for educational purposes only. Nothing discussed in this podcast or provided on the website is intended to be a substitute for professional psychological advice, diagnosis, or treatment. No doctor-patient relationship is formed between the hosts or guests of this podcast and listeners. If you need the qualified advice of a mental health or medical provider, we encourage you to seek one in your area.

[MUSIC ENDS]

Dr. Amanda Zelechoski: When Dr. Malloy and I started hosting webinars, and later recording this podcast over the past two years, we dove into specific questions that parents were asking as we started to navigate the challenges of the pandemic. With the help of over 50 different guest experts, we translated information about the science of child development, trauma, mental health, family relationships, and more. We used what the science community already knew about these topics to try and support parents in real time as the pandemic surged on. But now, in 2022, we have even more relevant research available. Researchers who launched studies right as COVID-19 was becoming a reality are gradually publishing their findings. These findings build on the knowledge we already had, like many of the conversations we facilitated last season, and they'll continue to inform us as parents and caregivers as we also continue to navigate the pandemic and its aftermath. So that's what we're going to do this season. We're going to dive into the newly published research about the impacts of COVID-19 on children and families. Along the way, we'll talk with the researchers who conducted these studies and identify practical insights and key takeaways we can incorporate into our lives right now. So before we dive into the research in future episodes, we wanted to set the stage. How do we as parents understand the science behind all these articles, videos, and social media posts where we're finding this information? What does it mean to develop the skills to read and think critically about this information, to build what is referred to as science literacy?

Dr. Eric Youngstrom: The science is slow. Especially science about human beings, and especially especially about kids. 

Dr. Amanda Zelechoski: That's Dr. Eric Youngstrom, one of our two guest experts. I sat down with him to talk about science literacy and delve into some specific concepts and terms we will be returning to time and time again throughout this season of the podcast.

Dr. Eric Youngstrom: Yeah, Amanda, I'm thrilled to be here. I'm a professor of Psychology and Neuroscience at the University of North Carolina at Chapel Hill, and I also am the co-founder of a nonprofit, Helping Give Away Psychological Science. But I also am a parent and a licensed clinical psychologist. And so I have all of those roles, and I also have a family history of mental illness and have had a lot of conversations with my daughters as they've been growing up, and so there are many, many different hats that I've been trying to figure out how to wear over the last two decades.

Dr. Amanda Zelechoski: I also spoke with science journalist Melinda Wenner Moyer. Melinda is also the author of the book “How to Raise Kids Who Aren't A**holes: Science-Based Strategies for Better Parenting, from Tots to Teens.”

Melinda Wenner Moyer: I'm a science journalist and a mom of two kids. I have a 10-year-old and a 7-year-old. I have been writing about science and medicine and science in parenting now for, like, 15 years. I guess the parenting part is more like 10 years. And I write for different publications, I freelance, but mostly for Scientific American and the New York Times. And I also teach now at NYU in their journalism program.

Dr. Amanda Zelechoski: You might have heard a scientist or reporter say something like, “well, there's very good research to suggest XYZ.” But what does good research look like? And why does it take so long? The pandemic started almost two years ago, and we're only now seeing some of the first publications about the impact on children and families. Why is that?

Dr. Eric Youngstrom: It's a very scripted thing. And one of the things, if you– if you're not inside that area, one of the things that is surprising is how formulaic it is. The best analogy that I could describe is like writing limericks. There are these rules, and a good limerick has to have this many words and then this many words and then this many. So for the first 80% of my career, the premium, if you wanted to get nerd famous, then you would do a lot of these nerd haikus. That sort of gradually extended and moved the ball. In the last five or six years, we've had a nerd crisis: science is supposed to be reproducible here. The analogy that I use is cooking. A good research article should describe the methods: here are the steps that we took to do the science, and the idea is that if anybody else picked up the same recipe and got the same ingredients, they should get the same result. And if you followed the same steps with the same ingredients and you don't get the same result, that's a problem for the idea. A big thing that's happened in the last 5-6 years is that people have looked at all of our classic nerd haikus, a hundred of the most-cited, influential articles, and tried to follow the same recipe, and– what? You're shaking your head? Do you already know what percentage of our recipes actually duplicated?

Dr. Amanda Zelechoski: I don't. 

Dr. Eric Youngstrom:  Less than half, around ⅓. So then there was a big food fight in the kitchen and —

Dr. Amanda Zelechoski: I love these analogies. 

Dr. Eric Youngstrom: — all the people that came up with the famous recipe– it was really awkward that their famous recipe didn't reproduce and all the people that did it were like, yeah, that is pretty awkward and so then in food fight mode they're like, “but you're a hack, you're someone that no one's ever heard of, and we're famous, important people.” And it turns out that that's not supposed to be relevant to science. It doesn't matter whether you're famous or not. The good science– anybody that follows the same recipe is supposed to get the same thing. So reproducibility is key. 

Dr. Amanda Zelechoski: Which, if people have heard that term, the replication crisis in some of these disciplines. That's, yep. That's what you're describing.

Dr. Eric Youngstrom: Right. 

Dr. Amanda Zelechoski: OK, so at this point, Eric and I talked for more than half an hour about the process of designing and conducting scientific research, but you are a busy parent and may not want to hear all of our nerdy academic tangents, so I'll give you a quick summary. To stick with Eric's recipe analogy, here's how it typically works. First, you have to come up with a hypothesis. This is like an educated guess about what's going to happen. It's like writing a recipe: if I use these ingredients and these appliances, this is what I anticipate will happen, based on my past experience with cooking and what I know about combining these ingredients in this way. More and more often, researchers are also now pre-registering their hypotheses. This is like giving the recipe to somebody else before you go into the kitchen. That way, I can't say that I'm going to bake a casserole and then, when it doesn't come out the way I wanted, mash it up and claim that I was trying to make beef stew all along. Then, when your research study involves animals or human participants, you also have to get special approval. Typically, an Institutional Review Board, or IRB, reviews your “recipe” to make sure that the benefits of the study will outweigh the risks and that you are taking proper precautions to prevent harm and deal with any unexpected adverse events. When you do research that involves children or other vulnerable populations, they review your study in even more depth to make sure that the procedures are safe and that you're minimizing any potential harm. Next, depending on the type of study, you have to make sure you have a large enough sample size. Melinda will talk more about this in a bit, but for now just know that you need enough people in your study to be able to draw conclusions that are credible.
Then you run the experiment or launch the survey, or whatever your particular “cooking method” might be. Once you're done collecting data, it's time to analyze it. So in our analogy, this would be like taking a bite and thinking, “did it come out the way I anticipated?” If so, why is that the case? If not, then you spend time analyzing and interpreting the results to try and understand and explain why things didn't come out the way you expected. And finally, you write up your results. And then after all that – which is typically months or even years from the time you first came up with your hypothesis to writing up your results – it's finally time to publish or share your “recipe” with others. But even that can take a while.

Dr. Eric Youngstrom: Second, you write it up. You pick a place that you want to send it and you send it off to them and they send it out for peer review. 

Dr. Amanda Zelechoski: So are we talking, just to clarify, like I could send it to People magazine, or I could send it to the Journal of Abnormal Child Psychology? Why?

Dr. Eric Youngstrom: And you, you could– how do I put it? And you also could publish it on your own website, but for it to be taken seriously as potential science, you need to send it to somewhere that's going to add this extra quality control layer of peer review. So we could take my dissertation as an example. So a dissertation is the project that you do and then write up and present to a committee for them to decide: did you do good enough work that we would give you your Ph.D.? So take that, cut it into about a third of the size, follow all of our rules about formatting, and then send it in. And we will then take your name off it and send it to other people, that's two to five other experts, for them to decide: is it good enough or not?

Dr. Amanda Zelechoski: They take your name off it because of what you were talking about before. Which is, we don't want there to be bias. If this is a really famous big name scientist, we don't want to just say “yes, this is a good study because so and so did it.”

Dr. Eric Youngstrom: Right. Right. No, it's fascinating. It really levels the playing field. So if you were some famous big dog, they'd take your name off and nobody would be like, “oh wow, this is by famous big dog.” And when you're a nobody starting out, like I was with my dissertation, you're not penalized for being a nobody just starting out. So that's, that's cool, right? I just got chills describing the level playing field that ideally science is supposed to be. You take the ingredients that we usually are working with, you are allowed to try your hand at this, that's incredible. So if everything goes right — you get in on the revise and resubmit — it's now six months since you already knew the answer, but the rest of the world doesn't know if they can trust it, yeah. Now, as soon as it's accepted, they turn it into a PDF and they put it up online, which sounds really good. But the problem for parents and the problem for clinicians is that, especially in psychology, 80-90% of what we're doing is behind a paywall, so it's really expensive and really frustrating for parents and clinicians. That is the part that's so dumb.

Dr. Amanda Zelechoski: When so much published research is behind a paywall, it can be hard for those who aren't researchers or students to access that information. And like we mentioned, all of these findings are written for an audience of other scientists, so there's a lot of jargon that the average parent won't have time to figure out, even if they did get past the paywall. This is where science journalists like Melinda Moyer can help. I asked Melinda to walk us through some key terms to understand when reading about research, whether you're reading the original publication or maybe a news article that's summarizing the findings.

Melinda Wenner Moyer: You can say “evidence-based” anything and it just sounds more credible, right? And there's really, you know, there's nothing limiting anybody from using that term. It's not like you have to apply for a license to be able to say something is evidence-based or science-based or research-based, so they get used a lot. I think it is okay to say a particular claim or method, or whatever it is, is evidence-based or science-based if it has published, peer-reviewed research backing it up, like backing up the claims, backing up the recommendations. It is rooted ultimately in science and in the scientific process. So that's how I think about it. But again, if you just see that term, it doesn't necessarily mean anything, you know? People can use it just to make something sound better and more credible. So you have to be a little careful when you come across that.

Dr. Amanda Zelechoski: Yeah. And do that further digging you described, like just because something says it's “evidence-based” or “science-based,” you know, go further and find out like “well what was the science they based it on?” or “what is the evidence backing up these claims?”

Melinda Wenner Moyer: Yeah, absolutely yeah. 

Dr. Amanda Zelechoski: So now what about samples, right? This is a term that's often used in these studies. So can you talk a little bit about that? Like, how do you tell the difference between what is a quality study, meaning how many people were in the sample, or group of people they evaluated or asked that research question about? And things like, what does randomized mean? What does representative mean? So talk about sort of samples and studies.

Melinda Wenner Moyer: Yeah. So this is something I didn't mention when I talked about how I look at a study, but certainly size is important. If this is a study of people, which hopefully it is, although there's very great, important research that's done in animals, I don't mean to denigrate it, but you know, the more people generally the better. If you have a study with just 10 people in it, then you can probably trust the findings less than you could if it was 1000 people. Just because, you know, with 10 people, there's more room for there to be bias, or, you know, it's just fewer people that you're studying, and so you can trust that a little bit less. But in addition to the size of a particular study, especially when you're looking at survey-based studies where researchers are asking people questions, you really want to check that the sample is a representative sample. That's one term, and that means that the sample is a subset of the population that accurately reflects the larger population. So let's say you're looking at a study on how mothers in the U.S. are coping with the pandemic. You want to be sure that that particular study wasn't just asking, you know, wealthy moms, or moms in Washington DC, or moms whose kids go to a particular school, you know? You really want to make sure that if you want to extrapolate from this, this is how moms everywhere feel, you better be pretty sure that the sample that they surveyed reflects this greater population of moms in the U.S. and not just some sort of biased, small sample that's different from the larger population. And randomized is similar in the sense that, you know, it's like a random sampling of people rather than a sampling from a particular subset, or, you know, subgroup of the population.
So those, yeah, those are terms that you hopefully see if you're reading a study or a press release about a survey that's being done. You want that sample to be reflective of the bigger population that you're extrapolating about, if that makes sense.

Dr. Amanda Zelechoski: Yeah. Or if it's not to, as you kind of said, you know, be skeptical about that, but also look for whoever is writing about that study or that sample to be forthright, you know, forthcoming, about here are the limitations. Or, yeah, you know, maybe this study ended up including only people from a certain region in the U.S. or moms of a certain race or ethnicity, and so there are limitations to what we can interpret from these results. It doesn't mean there might not be helpful findings that emerge, but we just want to make sure that that scientist, or whoever is writing about that study, is being, you know, open and transparent about what those limitations are.

What is your process when you're reading research, you know, and you're saying things like “how do I know if something's trustworthy?” or “where does this information come from?” Yeah. What are some of those questions you ask yourself when you read things? 

Melinda Wenner Moyer: Yeah. So the first big one is, where is the information coming from? Is this a press release that I'm reading? And so, you know, a press release is something that's written by people in a company or people at an academic institution. And really, the goal is to, you know, make the research sound really important and interesting, either to, you know, sell something, in the case of a company, or to, you know, just benefit the reputation of the scientists, if it's scientists at a university or at a hospital or medical center. So, you know, press releases are great and I read them all the time, and I get bazillions of them in my inbox, because a lot of people have figured out, you know, I cover science, so let's send Melinda press releases. They're great, but, you know, you have to read them with a big grain of salt, because the people who are writing press releases have really strong incentives to hype the claims that they're making in their press releases, to make the science seem more exciting, more interesting, more important than it really is sometimes, because again, they're trying to really sell something a lot of the time. So, you know, when I read a press release– there are also, like, different qualities of press releases. Some of them are actually really, really well done, especially at academic institutions. You know, they will include information like, you know, what are the limitations of this research? What are the weaknesses? What are the things that we don't know? And when I read a press release and I see information like that, that's when I know I can trust this press release a little more than I can trust one that's like a “new cure for cancer will change the world,” you know? So I'm looking at the language. I'm looking at, you know, is this couched in nuance? Is it not? Is it really hyperbolic? Is it not? You know, what kinds of things are included, and that helps me understand whether it's trustworthy.
That's if, like, it's a press release, right? Some science information I'm getting is directly from studies that are published, and so in that case I'm looking at it again. I am reading the language, like, you know, how are these scientists writing about their findings? Are they being careful? Are they, again, kind of, you know, seeming to be a little too excited about them? Where was the study published, you know? Was it published in a journal that is well respected, that has the peer review process built into it? Which, you know, in order for a study to get published, it has to be read by a panel of scientists in the same field who were not involved in the research, who really, you know, read it carefully, who understand what kinds of design aspects should be, you know, included in this kind of study, who really, you know, vet the study and make sure that it's good enough to be published in a journal. So, you know, is it published in a peer-reviewed journal? Is it not? Is it not yet published in a peer-reviewed journal? And during the pandemic there have been a lot more preprints, as they're called, which are studies, you know, and because we want the information as quickly as we can get it with COVID, a lot of studies are being published on servers before they have gone through that peer review process. And so if I'm looking at a study like that, I'm a little more cautious 'cause I know it hasn't really been vetted by other scientists. I'm also looking at the numbers, the statistics in it. I'm not going to get into the weeds here, but there's information you can look at in a study that reflects, essentially, like, the likelihood that the results were not just due to random chance, that you can actually trust that, you know, if there was an intervention being tested, that the benefits of this were real, you know, and were the result of the intervention.
So you can, you can look at study design. You can look at, you know, what kind of study it is. Is it a clinical trial? Is it just a sort of observational study where, let's say, you know, researchers are finding associations between things? Like, okay, a study finds that people who say they eat a lot of spinach are less likely to get into car accidents, and the connection that they make is, like, eating spinach prevents car accidents. And you have to think, well, wait a minute, this is just a correlation, an association between two things, and we have to be really careful about how we interpret that. You know, we can't really say eating spinach prevents car accidents if we're looking at just an association, but, you know, there could be some other reason. Maybe people who are very health conscious tend to eat spinach and also drive more carefully, and it's not that the spinach has anything to do with the car accidents. So you have to sort of think about how it was designed. What are the conclusions we can make based on how it's designed? There's a lot of stuff here. Oh! And another really big one: is this a study that actually involved people, or was it done in animals or just cells? So there's– I remember learning in graduate school, one of my professors used to say, “we've cured cancer in mice millions of times.” So if you're looking at a study and it says we've now cured, you know, breast cancer with this new drug in mice, you know, okay, that doesn't mean that it's going to work in people. So you really have to look at details like that too. But honestly, one of the best ways– 'cause it can be really hard when you're reading a study that's in a field that, you know– I'm not a scientist, and so I'm reading studies in so many different fields that do things so differently, and there's so many different considerations.
So what I find is, you know, I might be able to get some indication looking at a study or a press release of how well the study was done, but I also might not. And what I always do, though, if I'm going to cover a study or if I'm thinking about it, is I will reach out to other scientists in the field who were not involved in the study, you know, well-respected scientists, and say, “you know, can you take a look at the study? Does this look like a good study? Does this look like information that we can trust?” Because they are often the best people to ask, because they're in this field, they know how things work, they know what the important, you know, statistical controls need to be that I'm not necessarily going to know. And so that's something that I almost always do when I'm looking at covering a study, you know, talk to outside researchers. They can really, really be helpful.

Dr. Amanda Zelechoski: Oh, so many great examples there. Your example of studies in mice versus humans reminded me of when I was pregnant with my first son and I read a study that talked about how the brain shrinks, that you lose 7% of your cognitive capacity with each pregnancy, and I was losing my mind, like, “oh wow, that is–” and then realized it was a study on rats. Oh my gosh. So, you know, we're not quite ready to extrapolate that to humans, but that was reassuring to me once I was teasing through it and remembering what this study was actually about.

Melinda Wenner Moyer: Yes, so important. But, you know, some outlets that will cover science, like, they'll cover press releases or cover studies, they will not make that clear. Like, they will miss that, and they will, you know, cover it in such a breathtaking way, and that's really tricky. I mean, that's really hard for the average person to be able to tell, like, is this piece of journalism covering this study well or not? There's so many layers in which, and ways in which, things can sort of get misconstrued or overhyped. And it's very, very hard for the average person to be able to tell, like, “can I trust this?”

Dr. Amanda Zelechoski: Right. I'm thinking about, yeah, as you're describing your process, you know, the time it takes and the nuances that you described, and, you know, peeling away all these layers to truly understand what a study looked at. And, I mean, just thinking about, you know, your normal, everyday parent's capacity for that. And just, you know, I have 5 seconds, I'm scrolling social media, I see a headline or I see a statistic, and how do I, the typical consumer of news and media, tell the difference between this statistic in this headline that's very sensationalized versus an actual, you know, legitimate, credible reporting of findings? So, yeah, I'm wondering if you have any sort of thoughts or suggestions for that, because, you know, most of us are not able to do the thorough process you described.

Melinda Wenner Moyer: Yes, no. I know. We don't have time to do anything other than the things that we're doing to survive right now, let alone, you know, spend 20 minutes looking at a study. Yeah, it's really hard. I mean, what I would say is it can be helpful to find sources that you trust. So, you know, I know, for instance, that there are some really great science reporters at the New York Times, and there are great science reporters at different journalistic outlets, and I've gotten to the point where I could say, I know that if Apoorva Mandavilli is saying this, I can pretty much trust that. She's a reporter with the New York Times, for instance. Because she is a trained science journalist and she knows how to do this and she is paid to do this and she does this with every study. So I suggest finding journalistic sources, and of course also turning to sources of information that you can trust, like the American Academy of Pediatrics or the, you know, CDC, where you know that what they're putting out is being vetted by scientists or doctors and it's trustworthy. But it's hard. You have to settle on what those sources are going to be, and I would say some of my favorites are some of the places that I love to write for, because they do like to hire science journalists. So this is, you know, Scientific American. Popular Science does a great job. The New York Times. The Washington Post. A lot of the big newspapers have very good science journalists on staff, and so if you can, get your information from these sort of mainstream places where they're really doing their homework. 
They're trying to ensure that they're putting out, you know, vetted scientific information, then you're going to be better off than just clicking the first thing that comes up on Google or clicking a headline from a place you've never heard of where this could be a place that's sort of like a science information mill where people are just like regurgitating press releases and publishing them as if they're articles that you can trust. And I'll talk to you for a minute, uh, or a little bit later, I think about like, how do you figure out whether to trust a website or an organization or a journalistic source? 

Dr. Amanda Zelechoski: I was just thinking about that. Feel free to talk about it now 'cause I was thinking about like what is that healthy skepticism we should have, you know, when we're reading things? 

Melinda Wenner Moyer: Yeah, yeah so, again, this is, you know, you come up against the issue of time again. So I actually just wrote a piece for Scientific American about how do we teach news literacy to kids, which I know you mentioned that you saw, and there's a lot of disagreement in the field about how do we do this well. And we need more research. But there is one approach that was really, really beneficial in a study, and I'll describe the study. I mean, it takes a little time, but it really works. Researchers at Stanford, they had this study done I think in 2019. They asked a handful of history professors, a handful of journalism fact checkers, and a handful of undergraduate students at Stanford to try to figure out how trustworthy a particular website was. And they watched what these different groups of people did, how they tried to figure out whether this website was trustworthy, and they found that the history professors and the undergraduates tended to, when they were given the website, scroll around and click around on that one website, to see, you know, the "About Me" or "About Us" section or whatever, and read about the organization on the website itself, and look at, is this a ".com" or ".org" or ".edu"? And these can be useful things to look at, but ultimately, when they do this, you know, some of these websites that are maybe biased are very good and very slick, and they sometimes look like, oh, we are totally trustworthy, you can completely trust us, in the way they write stuff. They're very manipulative. And so these historians and the undergraduate students could not generally figure out if a website was actually credible or trustworthy. 
They often said "Yes, this is great, this is trustworthy," when actually it was a website that was tied to an industry-funded group, or, you know, a website that's really been known to have issues with bias. The journalism fact checkers, on the other hand, did not spend time on the website itself, looking through the different pages and reading about the website. They immediately opened a bunch of new tabs and they googled the organization and they started reading. They looked down at the results and said, oh, that's interesting, you know? There's a New York Times article that talks about this organization. Even just Wikipedia sometimes has really useful information saying, like, this particular organization is a conservative think tank, or a very liberal think tank, or this or that or the other. Essentially, they spent no time on that website itself, but they immediately went to look for other trustworthy sources of information that could give them an indication of this organization's merit and credibility. So this is something I think we can learn from. If we are on a website and it's saying, oh my gosh, there's this new crazy thing happening in the world, or new thing to worry about, or whatever it is, and it seems like maybe you want to do a little more research, the best thing to do is to Google that organization name, you know, whether it's a journalistic organization, a nonprofit, whatever it is. Look in the Google results for websites that you know and trust, that you know are good sources of information. See if they've had anything to say about this group, and with like 5 minutes or less of this kind of Googling, you can get a lot of information. 

Dr. Amanda Zelechoski: While you're doing your lateral research to understand more about something you might have found online, there's a website you're bound to find in your Google results: Wikipedia. Maybe you've been told to never use Wikipedia. After all, can't anyone edit it? I know I've told many students that they aren't allowed to cite Wikipedia as part of their academic research. So can Wikipedia be useful to us everyday parents trying to understand science? These were just some of the questions I was really eager to ask Dr. Eric. I want to talk about HGAPS, or Helping Give Away Psychological Science, because that's exactly what you're trying to solve. So how do people, especially parents or caregivers, find their way to psychological science? 

Dr. Eric Youngstrom: Yes please. 

Dr. Amanda Zelechoski: That is, in language I can understand, and immediately know how to implement it in my life. So first, what is it, HGAPS?

Dr. Eric Youngstrom: What is HGAPS? HGAPS is a charity founded in North Carolina and recognized by the IRS as a 501(c)(3), like you guys are. It is the product of a conversation between me and a graduate student, Yan Liang Hu, who is originally from Singapore and now is back in Singapore by way of the Mayo Clinic. He and I were talking about how do you make it easier for people to use what we talk about in class and what we do in our research when they're actually trying to help people. That was 2014, and it's been an incredible journey ever since then, 'cause, like, everything that I thought I knew about Wikipedia has turned out to be wrong. 

Dr. Amanda Zelechoski: Well, that's what–  I, yeah–  can you unpack that a little bit? Because you think about, you know, what our kids are taught and even what we teach our graduate students, undergraduate students, you know, be wary of sources like that, how do you know it's credible? So yeah, talk us through what– what did you learn about Wikipedia through this? 

Dr. Eric Youngstrom: From circa 2014, as an academic, my reaction was Wikipedia, eww. "Can't anybody write that?" Like, anybody could write that, there is no peer review? So anybody could write and put anything on it. It's a lot harder than that. As of right now, at the sound of the tone, here's what Wikipedia looks like: the largest encyclopedia in the world, more pages than Encyclopedia Britannica, more than 15,000 articles about psychology and topics related to psychology, 50,000 to 100,000 related to medicine. So this is a gargantuan amount of information available for free. Nothing like it has ever existed before in human history. At this exact second, there are hundreds of people editing it all around the world. We could hop on. We could try to edit a page. Most of the ones that we would try first are locked, so "anybody can edit it" is not true. Especially for political figures, celebrities, and medicine, anything that is tied to politics, popular media, or that affects health is very likely to be locked, and you have to level up, to have a history of reasonable edits, before they'll change your account so that you could actually edit the page. If any of our listeners have ever used Microsoft Word or Google Docs, if you've used an editor that lets you track changes, so you can see, here are the suggestions, or here are the words that got changed, Wikipedia automatically does that. It's got track changes forever, since the beginning of when that article got started, and it tells you, here is the account that made that change. So what HGAPS is doing is trying to figure out what's the good stuff, what's the good information about psychology. What I'm coaching the students and the members to do is look for those SparkNotes. Look for the executive summaries. 
And if you're a student or working at a university, you're on the inside of the paywall, so we have a free subscription to a whole bunch of stuff that a parent wouldn't be able to get access to. The hack, or the way to speed this up, because, like you just walked me through, science is slow, especially science about human beings, and especially about kids, is we're like, OK, so let's go straight to the CliffsNotes, the SparkNotes. Let's make sure that we're looking at the most recent, accurate ones of those. And then let's write a summary of it and put that out there somewhere a parent could find it for free. Over time, as we've gotten the hang of it, we've learned that the same people that pay for Wikipedia pay for several other projects that our listeners probably know as much about as I did a couple of years ago. So Wikipedia is the rock star that everyone has heard of. Wikiversity is a sister site that is much smaller. It's the same computers, the same web interface. The web pages look very, very similar, but it's much smaller, and that actually is where we're putting the information for clinicians and for students who are like, I think I want to become a psychologist. The newest thing that we're doing is using it for researchers, being like, here, put the detailed version of the recipe here. Put more of the tips about, if you're going to try to cook duck, or you really want to try to pluck a chicken, right? Here's the nitty gritty. So Wikiversity is like the chef school: here's the place to find more details about how to do the prep, how to gather the right ingredients. And Wikipedia is where we're writing more for parents and the general public. So let's sort of bring it back to the pandemic, when the United States entered lockdown back in March of 2020. Is that right? 

Dr. Amanda Zelechoski: Yep, in the “before times” right? We don't even talk about years anymore. It's just the before and the– yeah right? Yes, March of 2020. 

Dr. Eric Youngstrom: I know. God. There was a Wikipedia page about the coronavirus. Right as we entered lockdown, that page was edited hundreds of times a day, and that page was read millions of times a day. So any change that got made to that page, there were hundreds of people fact checking it and looking at it. But we talked about how slow science is. When the good stuff finally arrives, "hey, we got an updated CliffsNotes," you could hop on Wikipedia and say, we just got the newest edition of the CliffsNotes, and here's what changed based on that. And you could do that within seconds. So the speed is like nothing that we have ever had before. The fact checking, so, to make a change to that page is actually really, really difficult, and you'd better be right. And then the last part is just the reach. So, like, my dissertation. Super duper proud when I got that done. I think 6 human beings have actually read it. My committee and my wife and my mom, I think, finished it, and everyone else, their eyeballs glazed over. And then I got it into a top clinical journal, which helped me get a job, helped me get tenure, helped me a lot professionally, but that article has only been cited a couple hundred times. And then the page about cognitive behavioral therapy on Wikipedia? 2,000,000 views a year. 

Dr. Amanda Zelechoski: What is the difference? 

Dr. Eric Youngstrom: Yeah, so one of the things that HGAPS is trying to do is work with the professionals and the experts: hey, have you looked at your Wikipedia page? And this is happening right now! The Association for Behavioral and Cognitive Therapies, which is a society of experts on behavioral and cognitive therapies, they looked at the article that gets 2,000,000 readers a year, and they like it, it's good, but it could be better, and it could have more up-to-date information on it. And if you flip the page and look at how it's rated on Wikipedia, they're like, yeah, it used to be excellent, but now it's kind of– it's a B-class article. And so what we're trying to do is actually get the expert editors and the expert psychologists together to update the page and have everybody agree, this once again is a top-shelf article. 

Dr. Amanda Zelechoski: Accurate, updated yeah, oh that's amazing. 

Dr. Eric Youngstrom: Because as people are looking at like, hey what do we have as options for depression? 

Dr. Amanda Zelechoski: That's right, and if somebody recommends to me as a parent, you know you might want to think about cognitive behavioral therapy for your kid and I don't even know what that means, that's where I'm going, right? 

While I was talking with Eric and Melinda about science literacy for parents, I couldn't help but think about my own kids. They watch YouTube. They hear things on the playground. They eavesdrop on our conversations. So how do I help them develop their own sets of tools for understanding science and thinking critically about what they hear? How we teach children and teens media literacy could be an entire episode on its own, but I really appreciated this one piece of advice that Melinda shared. 

Melinda Wenner Moyer: Talk to your kids. Bring them into your process for how you go about understanding science, learning about it, and, you know, deciphering whether information is trustworthy. Bring your kids into that, have talks about it, even sharing with your kids that this is hard and it's complicated, but here's how I'm going to try to make sense of it, you know? We don't want to scare our kids, and for good reason, or we don't want to overwhelm them, but sometimes, I think a lot of the time actually, it can be very, very helpful to have the conversations, to lean into these types of topics instead of leaning away from them, even if they're complicated, even if they're tough. Because if we don't have these conversations with our kids, essentially, then where are they going to get this information, right? They're just going to start trusting everything they hear from their YouTube influencers or whatnot. So we really need to give them these tools– 

Dr. Amanda Zelechoski: Or on the playgrounds, 'cause let me tell you what they're learning out there isn't right either. 

Melinda Wenner Moyer: Absolutely, right! So I mean, we want to make sure that they're getting the nuances and the complexity from us, 'cause they're not going to get it from anywhere else. And we want to be giving them the tools to ask the right questions and to, you know, not take everything at face value, and to really pause and think, how do I make sure that I should trust this? And so, yeah, talk about it. 

Dr. Amanda Zelechoski: So I hope it's clear from these conversations that understanding science and the way the media talks about science is important. Science and media literacy help us identify what information is and is not credible as we make decisions for ourselves and our families. In the show notes we've included links to the resources mentioned in this episode, as well as additional resources to help you and your family continue developing science literacy. 

A bit of housekeeping. We are changing up our schedule a bit this season to release one episode per month. Each month you can expect a thoughtful, succinct, and accessible discussion of some of the most recent research being published on the impact of COVID-19 on children and families. 

This has been an episode of the Pandemic Parenting Podcast. We'd love to connect with you via email or on social media. Follow the links in our show notes, or look for our blue and yellow logo when you search Pandemic Parenting on Twitter, Facebook, Instagram, LinkedIn, YouTube, or TikTok. And this podcast isn't all we do. Pandemic Parenting is a 501(c)(3) nonprofit providing free science-based resources for parents and all who care for children while navigating the COVID-19 pandemic. To learn more about our organization and access our extensive library of webinars, videos, blogs, and more, visit www.pandemic-parent.org. This season of the podcast is produced by Victoria Bruick, Carmen Vincent, and myself, Dr. Amanda Zelechoski, with strategic support from Dr. Lindsay Malloy and Pandemic Parenting's Executive Director, Jennifer Valentine. Many thanks to the Pandemic Parenting team for their work in making this show happen, including Miranda Dauphinee, Lydia Lang, Julianne Matthews, Mark Snow, and Paula Sillers. If you want to support the work of Pandemic Parenting, you can do so in a few ways. You can leave us a five-star rating and review on Apple or Spotify. You can donate to our nonprofit at www.pandemic-parent.org/support. And lastly and most importantly, you can share this episode with parents and caregivers in your communities. Until next time, thanks for listening, and please give yourself some grace today.
