We’re taught from a young age to always tell the truth.
And I used to preach to my stepson about honesty: lies and cover-ups get found out eventually, and the consequences are always worse.
But is not revealing the truth the same as telling a lie?
To put it another way: is it always better to be completely honest and upfront, or are there times when withholding the truth is the kindest way to handle a situation and protect someone else?
What a dilemma.