Despite the economic and geopolitical rise of non-western emerging powers, the global film industry remains, in many ways, western-centric. The West's hegemony in the field of cinema, among other implications, yields the U.S. and certain European countries substantial political dividends.