How Moderation & Brand Safety Are Applied for Destination Brands (Publishers) & the Genuin Ecosystem

Introduction

Content moderation is the core feature that ensures content is safe, respectful, and engaging. Every video is moderated to verify that it adheres to all three levels of guidelines, i.e. platform, brand, and community guidelines.

How it works

The moderation process has three steps. It begins with setting up the platform guidelines, brand guidelines, and community guidelines; the system then checks each video against all three levels in turn.

This way, we make sure everything stays clear, consistent, and in line with what the platform, the brand, and the community stand for.
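The three-level flow above can be sketched as a simple pipeline. The function names, field names, and rule sets below are illustrative assumptions for demonstration, not the actual Genuin implementation or API.

```python
# Illustrative sketch of the three-level moderation pipeline.
# All check functions and data shapes here are hypothetical placeholders.

def passes_platform_guidelines(video: dict) -> bool:
    # Placeholder: reject videos flagged with prohibited categories.
    prohibited = {"hate", "violence", "nudity", "self-harm", "harassment"}
    return not (set(video.get("flags", [])) & prohibited)

def passes_brand_guidelines(video: dict, brand: dict) -> bool:
    # Placeholder: exclude content in categories the brand has blocked.
    return video.get("brand_category") not in brand.get("blocked_categories", [])

def passes_community_guidelines(video: dict, community: dict) -> bool:
    # Placeholder: the video's topic must match the community's focus.
    return video.get("topic") in community.get("allowed_topics", [])

def moderate(video: dict, brand: dict, community: dict) -> bool:
    """A video is publishable only if it clears all three levels in order."""
    return (passes_platform_guidelines(video)
            and passes_brand_guidelines(video, brand)
            and passes_community_guidelines(video, community))
```

A video that fails any one level is rejected; only content that clears all three proceeds to posting.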

Platform Guidelines

Platform guidelines are the foundation of content moderation. They aim to ensure a safe and equitable environment by prohibiting offensive content such as hate speech, violence, sexual content, graphic violence, self-harm, and harassment or threats. Content that may be considered NSFW (graphic violence, pornography, profanity, nudity, and slurs) is also moderated.

Moderation systems carefully scan both the audio and visual components of a video, along with its metadata, to detect possible violations. This ensures that the platform remains a safe place for users and brands to post, view, share, and engage with content.

Here, the aim is to preserve the integrity of the platform.
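The multimodal scan described above can be sketched as a pass over each component of the video. The modality field names and the term list are assumptions for illustration only, not the platform's real detection model.

```python
# Illustrative sketch: check each modality of a video (audio, visual,
# metadata) against a prohibited-term list. In practice this would be
# backed by ML classifiers, not simple label matching.

PROHIBITED_TERMS = {"hate speech", "graphic violence", "pornography", "slur"}

def scan_video(video: dict) -> list[str]:
    """Return the list of violations found across all modalities."""
    violations = []
    for modality in ("audio_labels", "visual_labels", "metadata_labels"):
        for label in video.get(modality, []):
            if label in PROHIBITED_TERMS:
                violations.append(f"{modality}: {label}")
    return violations
```

An empty result means the video cleared the platform-level scan.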

Brand Guidelines

Once content passes the platform’s safety checks, it is reviewed against brand-specific guidelines. These guidelines are tailored to reflect the identity, values, and messaging of each advertiser or sponsor.

In this stage, the content is verified against the brand persona: competitor content (as defined in the guidelines) and potentially harmful content that could tarnish the brand image are excluded.

For example, a health and wellness brand may prefer not to have its content displayed next to content promoting unhealthy or harmful habits. This stage not only safeguards the brand but also reinforces it: by maintaining consistency in messaging, content, and appearance, the brand stays relevant to the needs of its target market.
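The brand-level check can be sketched as a filter over competitor and unsafe-topic rules. The brand name, competitor name, and topic lists below are hypothetical examples, not real guideline data.

```python
# Illustrative brand-guideline check for the health-and-wellness example:
# exclude competitor content and content promoting harmful habits.
# "RivalFit" and the topic lists are made-up placeholders.

def passes_brand_check(video: dict, brand_rules: dict) -> bool:
    if video.get("advertiser") in brand_rules.get("competitors", []):
        return False
    if set(video.get("topics", [])) & set(brand_rules.get("unsafe_topics", [])):
        return False
    return True

wellness_rules = {
    "competitors": ["RivalFit"],               # hypothetical competitor
    "unsafe_topics": ["smoking", "crash diets"],
}
```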

Community Guidelines

The final layer of moderation focuses on the platform’s community guidelines, which are often shaped by the expectations of community admins. These rules ensure that content is relevant, respectful, and engaging for the community it serves.

For instance, a gaming community values gaming content rather than off-topic material such as movies, books, health, or education, whereas a health-education community places a premium on weight-loss tips, exercise guidance, healthy diets, and similar topics.

This step also fosters a feeling of belonging and mutual respect, which helps develop a positive and dynamic community.
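The community-relevance rule from the gaming vs. health-education example can be sketched as a topic lookup. The community names and topic sets are illustrative assumptions.

```python
# Illustrative community-relevance check: each community declares the
# topics it accepts, and a video is relevant only if its topic matches.
# The community names and topic sets are hypothetical.

COMMUNITY_TOPICS = {
    "gaming": {"gaming"},
    "health_education": {"weight loss", "exercise", "healthy diets"},
}

def is_relevant(video_topic: str, community: str) -> bool:
    return video_topic in COMMUNITY_TOPICS.get(community, set())
```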

Once the content has passed all the checks above, the video can be posted.

Brand Safety & Context

Following the moderation workflow described above, the system implements proactive measures to keep content on the platform safe and relevant. By removing harmful content (hate speech, violence, nudity, etc.), the moderation process creates a safe space for every user. This builds trust and confidence, encouraging users to interact freely while minimizing the potential for exploitative or damaging interactions and exposure to unsuitable content.

Information about each moderated video, including its community details, category, keywords, video details, and other relevant metadata, is shared with advertising partners. This information ensures that advertisements appearing alongside the content are considerate and appropriate for the target audience.
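The kind of context shared with advertising partners might look like the structure below. The field names and values are assumptions for illustration, not the actual Genuin payload schema.

```python
# Illustrative shape of the moderated-video context shared with
# advertising partners. All field names here are hypothetical.

import json

ad_context = {
    "video_id": "vid_123",
    "community": {"name": "Fitness Enthusiasts", "category": "health"},
    "category": "health",
    "keywords": ["exercise", "healthy diets"],
    "moderation_status": "approved",
}

# Serialized for delivery to an ad partner.
payload = json.dumps(ad_context)
```

Only videos with an approved moderation status would be surfaced to partners, so ads are always matched against vetted content.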

This continuous link between content moderation and ad targeting not only improves the viewing experience but also increases the value delivered to advertisers.

Ensuring Safe & Relevant Ad Experiences

Ads coming from ad exchanges are also moderated to make sure they align with the platform, brand, and community guidelines. This ensures that every ad is relevant and appropriate for the community's audience, and it maintains a brand-safe space for advertiser brands.

For example, if a brand community (e.g. an Apple enthusiast community) bans the brand's competitors (e.g. Samsung), then ad content falling under the competitor category (e.g. an ad for a Samsung mobile battery) is restricted. This creates a brand-safe environment where brands such as Apple can connect with their audience in a meaningful way.
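The competitor-ad restriction from this example reduces to a simple membership check against the community's banned-brand list; the sketch below uses the Apple/Samsung example from the text, with hypothetical data.

```python
# Illustrative sketch of the competitor-ad restriction: an ad is allowed
# only if its brand is not on the community's banned-brand list.

def is_ad_allowed(ad_brand: str, community_banned_brands: set[str]) -> bool:
    return ad_brand not in community_banned_brands

# Hypothetical ban list for an Apple enthusiast community.
apple_community_bans = {"Samsung"}
```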

It leads to a better experience for everyone, ensuring the ads are respectful and consistent.