Image Moderation

Learn how to filter unsafe images.

It's difficult to control the types of images shared on your platform. The Image Moderation extension analyzes every image and flags those that are unsafe.

Settings

  1. Log in to the CometChat Dashboard and select your app.
  2. On the Extensions page, add the Image Moderation extension.
  3. On the Installed page, open its Settings and optionally choose to Drop messages with NSFW images.

How does it work?

After analyzing an image, the extension classifies it into one of four categories:

  1. Explicit Nudity
  2. Suggestive Nudity
  3. Violence
  4. Visually Disturbing

Along with the category, you receive a confidence score on a scale of 0 to 100. The result is injected into the message metadata:

"@injected": {
  "extensions": {
    "image-moderation": {
      "unsafe": "yes/no",
      "confidence": "99",
      "category": "explicit_nudity/suggestive/violence/visually_disturbing"
    }
  }
}

A confidence value below 50 is likely a false positive, so we recommend moderating only when the confidence is higher than 50.

If the data is missing, it means that the extension has timed out.
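Putting these rules together, here is a minimal JavaScript sketch (the helper name is illustrative, not part of the SDK) that reads the `@injected` payload shown above, applies the confidence > 50 threshold, and treats missing data as a timeout:

```javascript
// Sketch: extract the Image Moderation verdict from a metadata object
// shaped like the "@injected" payload documented above.
function getModerationVerdict(metadata) {
  const moderation =
    metadata &&
    metadata["@injected"] &&
    metadata["@injected"].extensions &&
    metadata["@injected"].extensions["image-moderation"];

  // Missing data means the extension timed out; surface that explicitly
  // instead of silently treating the image as safe.
  if (!moderation) {
    return { status: "timeout" };
  }

  const confidence = Number(moderation.confidence);
  // Only treat the image as unsafe above the recommended threshold.
  const unsafe = moderation.unsafe === "yes" && confidence > 50;

  return {
    status: unsafe ? "unsafe" : "safe",
    confidence: confidence,
    category: moderation.category
  };
}
```

For example, a payload with `unsafe: "yes"` and `confidence: "99"` yields `status: "unsafe"`, while the same payload with `confidence: "40"` falls below the threshold and yields `status: "safe"`.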

Implementation

You can then either show a warning (much like Instagram does for sensitive content) or drop the image message entirely.

At the recipient's end, fetch the metadata by calling the getMetadata() method on the message object. This metadata tells you whether the image is safe or unsafe.

JavaScript

var metadata = message.getMetadata();
if (metadata != null) {
  var injectedObject = metadata["@injected"];
  if (injectedObject != null && injectedObject.hasOwnProperty("extensions")) {
    var extensionsObject = injectedObject["extensions"];
    if (
      extensionsObject != null &&
      extensionsObject.hasOwnProperty("image-moderation")
    ) {
      var imageModerationObject = extensionsObject["image-moderation"];
      var unsafe = imageModerationObject["unsafe"];
      var confidence = imageModerationObject["confidence"];
      var category = imageModerationObject["category"];
    }
  }
}
Java

JSONObject metadata = message.getMetadata();
// Check for the key before calling getJSONObject(), which throws if it is absent.
if (metadata != null && metadata.has("@injected")) {
  JSONObject injectedObject = metadata.getJSONObject("@injected");
  if (injectedObject.has("extensions")) {
    JSONObject extensionsObject = injectedObject.getJSONObject("extensions");
    if (extensionsObject.has("image-moderation")) {
      JSONObject imageModerationObject = extensionsObject.getJSONObject("image-moderation");
      String unsafe = imageModerationObject.getString("unsafe");
      String confidence = imageModerationObject.getString("confidence");
      String category = imageModerationObject.getString("category");
    }
  }
}
Swift

// Image messages arrive as MediaMessage; optional binding avoids force unwraps.
if let mediaMessage = message as? MediaMessage,
   let metadata = mediaMessage.metaData,
   let injectedObject = metadata["@injected"] as? [String: Any],
   let extensionsObject = injectedObject["extensions"] as? [String: Any],
   let imageModerationObject = extensionsObject["image-moderation"] as? [String: Any] {

  let unsafe = imageModerationObject["unsafe"] as? String
  let confidence = imageModerationObject["confidence"] as? String
  let category = imageModerationObject["category"] as? String
}
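Once you have the unsafe flag, confidence, and category, you can map them to a per-message action. A hedged JavaScript sketch follows (the policy and label strings are illustrative, not part of the SDK; the category values match those documented above):

```javascript
// Illustrative policy: drop explicit content outright, show a warning
// overlay for other unsafe categories, and show safe images normally.
const WARNING_LABELS = {
  explicit_nudity: "Explicit content",
  suggestive: "Suggestive content",
  violence: "Violent content",
  visually_disturbing: "Disturbing content"
};

function moderationAction(unsafe, confidence, category) {
  // Below the recommended threshold, treat the image as safe to display.
  if (unsafe !== "yes" || Number(confidence) <= 50) {
    return { action: "show" };
  }
  if (category === "explicit_nudity") {
    return { action: "drop" };
  }
  return {
    action: "warn",
    label: WARNING_LABELS[category] || "Sensitive content"
  };
}
```

You might render a "warn" result as a blurred thumbnail with the label overlaid, letting the recipient tap to reveal the image.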
