Parents in horror movies, and ESPECIALLY horror shows for kids, always do this to their kids. When a kid sees something, it's always just their imagination or them acting out for attention. The lesson I learned watching these shows growing up was: "Never tell your parents; always try to solve every problem by yourself, especially if it's super dangerous and scary."
But you know what really sucks? Women get the same treatment in horror movies. Every woman and child gets gaslit (a term that originated with Gaslight, a play and film about a man trying to convince his wife she has gone insane), because of course they're just acting up; they're not cool and logical like men. The implication is that women are just like children, unable to comprehend natural phenomena.
Of course, that scenario belongs in a horror movie, since it's so horrifying. But it happens in real life, too. Just think about all the times a woman has been sexually assaulted and her word was written off as one thing or another: just a woman trying to ruin a man's life, just a woman not understanding how she had given consent, just a woman "asking for it." Women aren't to be trusted; they let their emotions get the best of them; they can't tell the difference between a nightmare and reality. It scares and sickens me to think that if anything ever happened to me, my word wouldn't be trusted. When I was a kid, I made my parents PROMISE that if I ever told them something horrible was happening, they would believe me. I wish I could make society do the same.