New research has found that 20% of all ad breaks during children’s programming contained at least one inappropriate ad, including ads for alcohol, casinos, gambling, adult hygiene products, pharmaceuticals, and foods high in sugar or fat.
The research was conducted by GumGum, a contextual-first global digital advertising platform, and found significant brand safety violations in advertising on kids’ connected TV (CTV) content.
GumGum conducted a human review of over 100 children’s shows that aired on a representative sample of leading video streaming apps, including both free and paid streaming services. The study examined what audiences in multiple states saw over a span of four months. The types of ads flagged as inappropriate for children were compiled according to the rules, regulations, and recommendations of the US Federal Trade Commission (FTC).
“We are living in a video-first world where basing insight on just a simple keyword or generic metadata description isn’t going to work – not only to avoid specific content, like children’s shows, but for targeting and placing relevant ads as well,” said GumGum CEO Phil Schraeder.
“There is a huge gap in the CTV ecosystem that most advertisers and publishers aren’t aware of and there is something we can do about it,” Schraeder said.
Most advertisers today still rely on basic ad verification and brand safety technologies that analyze only the generic metadata descriptions attached to videos. When content information is shared at all, it is self-declared and inconsistent across supply sources, leaving advertisers to make decisions based on limited metadata such as genre or channel name.
This results in ads intended for adults routinely appearing alongside children’s programming, violating strict regulations in the United States and other countries.
Advances in artificial intelligence mean advertisers no longer need to rely on patchy video-description data and can instead analyze video content at scale, at a much deeper and more forensic level.
GumGum’s accredited contextual intelligence platform, Verity, for example, can evaluate videos at the content level or frame by frame, without relying on the presence of metadata or video descriptions, giving a more precise reading of what the video content is actually about.
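The article does not describe Verity’s internals, but the difference between metadata-only verification and frame-level analysis can be illustrated with a rough sketch. The Python below is hypothetical and is not GumGum’s API: it aggregates labels from sampled frames into a content-level verdict instead of trusting a publisher-supplied genre tag, and the label sets stand in for whatever a real vision model would produce.

```python
from dataclasses import dataclass
from typing import List, Set

@dataclass
class Frame:
    timestamp: float   # seconds into the video
    labels: Set[str]   # labels a vision model assigned to this frame

# For this sketch, these labels mark a frame as kid-oriented.
KIDS_LABELS = {"animation", "cartoon_characters", "toys"}

def is_kids_content(frames: List[Frame], threshold: float = 0.5) -> bool:
    """Content-level decision: treat the video as children's programming if
    enough sampled frames carry kid-oriented labels, regardless of what the
    publisher-supplied metadata claims."""
    if not frames:
        return False
    hits = sum(1 for f in frames if f.labels & KIDS_LABELS)
    return hits / len(frames) >= threshold

def metadata_says_kids(genre: str, channel: str) -> bool:
    """Metadata-only check, roughly what many verification tools rely on."""
    return "kids" in genre.lower() or "kids" in channel.lower()

if __name__ == "__main__":
    # A cartoon mislabelled as generic "Entertainment" slips past the
    # metadata check but is caught by the frame-level one.
    sampled = [Frame(t, {"animation", "cartoon_characters"}) for t in range(0, 60, 10)]
    print(metadata_says_kids("Entertainment", "FunTime TV"))  # False
    print(is_kids_content(sampled))                           # True
```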
GumGum is developing a machine learning model for Verity trained specifically to identify made-for-kids content: a specialized classifier that predicts whether a web page or video is made for kids.
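GumGum has not published details of this model, but the general shape of such a classifier can be sketched. The toy example below uses scikit-learn with invented training strings and is purely illustrative of Verity’s approach, not a reproduction of it; a production system would combine visual, audio, and transcript signals rather than a handful of text snippets.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training set: text extracted from pages/videos, labelled 1 if made for kids.
texts = [
    "colorful cartoon animals sing the alphabet song",
    "learn to count with friendly puppets and nursery rhymes",
    "quarterly earnings call and market analysis for investors",
    "late night talk show monologue on politics",
]
labels = [1, 1, 0, 0]

# A linear classifier over TF-IDF text features.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Score an unseen page description: probability it is made for kids.
print(model.predict_proba(["animated dinosaurs teach shapes and colors"])[0][1])
```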
“There is a major gap in the CTV supply chain and it’s something we can’t ignore,” said Schraeder.
Schraeder said GumGum is also working with video-focused data platform IRIS.TV to create IRIS_ID, a content identifier that publishers can use to securely share their content’s video-level data.
“We are also encouraging advertisers to evaluate the tech they are using to support their growing video strategy. Having technology that can understand all the elements of a video is critical now, but it is also a key component of navigating video and future interactive environments like in-game and the metaverse,” he said.