

The “lesser of two evils” framing is rarely true.
You can aim to do something good while accepting the risk that something bad happens (e.g., as another poster said, rolling the dice on surgery to alleviate suffering at the risk of the patient dying)…
…or you can do evil.
“The lesser of two evils” is just a justification for something that can’t be morally justified otherwise.
…and unless we recover it all with robots, that’s exactly where it’s staying. It’s far too unstable to be moved by people.