interventionism

noun

1. (politics) The political practice of intervening in a sovereign state's affairs.

2. (medicine) The medical practice of actively treating illness in an effort to prolong a patient's life, rather than allowing a condition to take its course.