OAAA Writes In “Human Approval” For Digital OOH Ads To Reduce Risk Of “Problematic Content”
August 23, 2024 by Dave Haynes
This may strike industry veterans as a forehead-slappingly obvious thing to do, but the Out of Home Advertising Association of America (OAAA) has written human approval of ads before they go up on screens into an amended version of its Digital Billboard Security Guidelines as a best practice.
Those guidelines now include a section called “Human Approval of Content for Digital Displays” that is seen as a measure to help reduce the risk of digital billboards and other DOOH formats being hacked and what it calls problematic content appearing on screens. That content is described as “material that is obscene, fraudulent, criminal, or hateful, as well as content that violates community standards or OAAA’s Code of Industry Principles. While Problematic Content can arise from owner-created material, the risk is significantly higher with programmatic or automated content, which may lack sufficient human oversight.”
There is already quite a bit of automation in the scheduling and delivery of ads, using programmatic and targeting based on data descriptions of such things as the location, audience and creative. But the rise of generative AI and content automation tools makes it possible for dozens, hundreds, even thousands of ads to be largely created, published and distributed by machines. That can save a huge amount of time and cost, but it can come with the risk of incorrect, inappropriate or just plain old crappy creative showing up on screens – hence the urging to get sets of eyes on each generated piece.
Despite advancements in software and artificial intelligence (AI) filters, these tools cannot fully replace human judgment. The subtle nuances and context required to accurately assess content appropriateness necessitate human review. Therefore, to mitigate risks and uphold content standards, it is recommended that all digital display content undergo human approval.
“Such human review should be performed,” the OAAA continues, “by as many people and departments within an out of home (OOH) company as possible, from sales and graphic design to IT and operations and all points in between.”
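The multi-department sign-off the OAAA describes can be pictured as a simple gate in a CMS publishing pipeline. The sketch below is purely illustrative – the class, function and department names are all hypothetical assumptions, not part of any OAAA guidance or real DOOH software:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a human-approval gate for DOOH creative.
# All names here are illustrative assumptions, not a real CMS API.

@dataclass
class Creative:
    creative_id: str
    source: str                          # e.g. "owner", "programmatic", "generated"
    approvals: set = field(default_factory=set)

# Departments whose sign-off is required before content is published,
# reflecting the guideline's suggestion that review span many teams.
REQUIRED_REVIEWERS = {"sales", "design", "operations"}

def approve(creative: Creative, department: str) -> None:
    """Record one department's human sign-off."""
    creative.approvals.add(department)

def can_publish(creative: Creative) -> bool:
    """The creative reaches remote media players only once every
    required department has approved it -- no automated bypass."""
    return REQUIRED_REVIEWERS.issubset(creative.approvals)

ad = Creative("promo-001", source="programmatic")
approve(ad, "sales")
print(can_publish(ad))   # False -- still missing design and operations
approve(ad, "design")
approve(ad, "operations")
print(can_publish(ad))   # True -- all required sign-offs recorded
```

The point of the gate is that publishing is blocked by default and only unblocked by explicit human actions, which is the opposite of a pipeline where automated content flows to screens unless someone intervenes.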
Talk to companies like Plainly that have tools that use software and AI for content generation, and they’ll tell you the idea is not to turn over creative production to computers, but to minimize the laborious grunt work that can be part of building ads, particularly slight variations of the same ad.
There’s also the simple best practice that was around long before AI became a content and scheduling tool – that a second set of eyes should have a look at pretty much everything new being made available on servers to remote media players. That catches typos, bad grammar and incorrect information, but a network should also have policies covering everything from the nature and tone of content to rules about what can’t be on screens, for a variety of reasons.