Hello all. I know it's been a while since I've posted on here due to work, but I finally get to take a vacation since it's the holidays. Something's been on my mind ever since my friend mentioned adopting a child when he gets older and the plans he has in store for him or her. From what he told me, it made me wonder: have you ever thought about all the things parents told you when you were young?
Like, for example, the Easter Bunny, "where babies come from," Santa, the Tooth Fairy, etc. Well, it got me wondering... is it okay to tell children "little white lies" growing up and let them learn the truth on their own later, or should you be honest with a child from the beginning? I know there are pros and cons to both.
Some Pros include:
Children get to feel that magic is real, which broadens their imagination (it did wonders for Walt Disney); they have something to really look forward to as a child; and it can help keep them innocent before they learn the harder truths about the world.
Some Cons include:
When they do learn the truth, some children feel devastated that they were lied to by their own parents, which can lead to withdrawal from the parent and/or adults in general; they can then start to doubt what is and isn't true, which can lead to a "trust no one" mindset (even if it's subconscious); and other things of that nature.
I don't think I will ever have children, but in the rare event that I do, I don't know if I could teach them to believe in such things. I mean, when you get down to it, isn't it a lie that you're telling? On a larger scale, I see it like the government being the parent and the citizens being the children. They lie to us constantly, for reasons that can be considered similar to the ones parents give for lying to their children.
I do apologize for this being so long. I wanted it to be shorter, but when I started typing, more came out than I thought would. Anyway, thanks for reading, and any feedback would be welcome.