Consequentialism is a blanket term for ethical systems under which whether an action is ethical is determined by its consequences
will this be on the test
There’s Virtue Ethics, which states that morality is gained through aspects cultivated within oneself; this one’s out of fashion, but you never know when Aristotelianism might come back.
There’s consequentialism, which Emi and Arete just explained.
There’s Deontological Ethics, which is where the morality of an action is based solely on whether it violates absolute moral principles derived universally through logic.
and then there’s nihilism
Is that a different name for teleology?
Teleology refers to the analysis of objects based on purpose rather than cause.
Wtf?
How did I only just now find out that Ici also has a special interest in ethical philosophy
bounces
The last one is me
Sorry, I’m not a native English speaker
Teleology and Deontology are the two extremes
Deontology refers to general rules
While telos (= goal) refers to the consequences you wanted to achieve
Virtue Ethics and Nihilism are generally not very credible at the moment, one being very weirdly old-fashioned and religious and the other being the official ethical system of nazbol, so moral philosophy is basically an argument between judging actions based on consequences and judging them based on rules. At this point, Hegel is weeing his pants with excitement, because two opposing forces have a synthesis: utilitarianism.
unfortunately utilitarianism is kind of terrible
My personal philosophy is a weird mix of Machiavelli and Peirce’s pragmaticism
if there’s a hypothetical afterlife remind me to go find out what hegel, nietzsche and marx thought about the 20th and 21st century. that should be fun.
Excuse me
Then again that implies somebody can understand Hegel’s prose. Here is a delightfully understandable excerpt from Hegel’s The Phenomenology of Spirit:
this is one of the understandable ones
Act utilitarianism is ok
Utilitarianism is kinda reasonable
it seems that way, but there are deep problems with it, most notably the fact that it’s hard to gauge happiness externally, and it necessarily cannot reward supererogatory acts; utilitarianism says that the most moral thing to do is always the only moral thing to do.
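fwiw the “most moral = only moral” point can be put as a toy argmax — the utility numbers here are completely made up, it’s just a sketch of the structure of the objection:

```python
# Toy sketch of act utilitarianism (hypothetical utility values).
# The theory picks the action(s) maximizing total utility, so any
# action short of the maximum, however good, is ruled out as wrong.

def most_moral(actions):
    """Return the action(s) with maximum utility from a dict of
    {action: utility}."""
    best = max(actions.values())
    return [a for a, u in actions.items() if u == best]

actions = {
    "donate everything to charity": 100,  # supererogatory
    "donate half": 60,                    # generous, but not maximal
    "do nothing": 0,
}

# Only the single maximizer counts as moral; "donate half" gets no
# credit at all, which is the supererogation problem in miniature.
print(most_moral(actions))
```

the point being that the ranking collapses to a single permitted action, with no category of “good but optional.”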
Hot take
This is correct and everyone just pretends it isn’t because the alternative is, like, constant paralyzing guilt
i think there are logical consequences of that which I will not elaborate on presently because I have maths homework to do