There's another thing I've been thinking about.
Telling your kids that you're always right is extremely bad.
There is a psychological problem with many people today: they believe what they're told, especially if it's about them. What I'm specifically thinking of here is insults and self-image.
Say someone walks up to Sue and says, "Sue, you're a moron." Rather than disagree with the person, Sue is likely to assume that the other person knows something she does not. "I guess it must be true," Sue thinks. Then Sue becomes depressed and starts to hate herself.
Now, this might not be you. But I can tell you that a great many people behave this way. If someone says something about them, good or bad, they generally just accept it as true. That's why we're always asking people for compliments. We're not content with our own opinions of ourselves -- it's not true until another person says so.
I've been thinking that maybe this insidious behavior is taught in childhood. It makes sense -- where else would we learn it?
Perhaps our kids should be allowed to tell another person that they're wrong. Perhaps we should back up our kids if they decide to have their own opinions.
What if you and some friends were in the living room talking about something, and your 6-year-old walked by and gave his opinion on the matter? In our current culture, the kid would most likely be laughed at and dismissed. He would be told that we are right and he is wrong.
I'm just saying -- maybe we could teach our kids to have their own minds for a change?