What are the Hollywood Film Awards?
The Hollywood Film Awards® were founded in 1997 by Carlos de Abreu and Janice Pennington to honor excellence in filmmaking. Honorees are selected by an advisory panel drawn from across the film industry.