Definition of Hollywood ending
1. Noun. An outcome considered typical of certain movies produced in Hollywood, California, in which all desirable results are achieved: protagonists are rewarded, antagonists are punished or destroyed, and positive sentiments (love, happiness, peace) prevail over negative ones. ¹
¹ Source: wiktionary.com