Outsourcing Content Moderation: An In-Depth Look

The internet is a vast ocean of content, ranging from harmless cat videos to more insidious and harmful material. With so much being posted every second, content moderation has become a monumental undertaking. Many companies have turned to outsourcing moderation work in order to keep up. But what exactly does this entail?

Let’s dive into the what, why, how, who, and when of outsourced content moderation.

What is Outsourced Content Moderation?

In a nutshell, outsourced moderation involves contracting outside companies to handle the review and removal of inappropriate, illegal or otherwise undesirable content. With social media and UGC (user-generated content) sites exploding in popularity, the amount of content needing moderation has grown exponentially. Companies have found it more efficient to outsource these moderation operations rather than handle them internally.

Moderators review all types of UGC including text posts, images, videos, and comments. Their goal is to identify and swiftly remove any content that violates the platform’s policies and terms of service. Some examples include:

  • Violent, gory, or sexually explicit material
  • Hate speech, bullying, or harassment
  • Spam, scams, and other fraud
  • Illegal content like child pornography
  • Copyright violations and intellectual property theft
  • Content that otherwise violates the company’s or organization’s policies

This keeps the online environment safer and more brand-friendly for users and companies alike.

Why Outsource Content Moderation?

Outsourcing moderation provides a few key advantages:

Cost Savings: Paying an outside company for moderation is often cheaper than hiring large internal teams, especially for smaller or growing platforms.

Flexibility: Moderation needs fluctuate, sometimes wildly. Outsourcing provides flexibility to scale operations up and down.

Focus: Handing moderation to a third party lets in-house teams concentrate on other priorities and innovations.

Expertise: Outsourcing firms specialize in moderation and invest in training and knowledge to maximize accuracy.

Geography: Many outsourcing companies operate overseas, allowing for 24/7 coverage across time zones.

Objectivity: Third-party moderators avoid biases that internal or community-based moderation may have.

Outsourcing is not a panacea, however. Ethical issues around working conditions and moderator well-being remain a serious concern, as discussed below.

How Does Outsourced Moderation Work?

The moderation process typically functions like this:

  1. The platform sends content flagged as problematic to the outsourcer.
  2. Content is assigned to moderators based on specialty (e.g. hate speech, nudity).
  3. Moderators review content quickly, making snap judgments on violations.
  4. They input decisions on whether to remove content or leave it be.
  5. The platform automatically applies determinations at scale.
  6. Appeals can be made to human supervisors if requested.
  7. Moderators are monitored for speed and accuracy with quotas.

The work is usually fast-paced, demanding, and repetitive, much like other forms of content gig work. Moderators must view thousands of potentially disturbing posts per shift, across multiple platforms. It is vital but taxing labor.
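To make the flow above concrete, here is a minimal, hypothetical sketch of such a routing-and-decision pipeline in Python. The categories, team names, and functions are illustrative assumptions for this article, not any platform’s or vendor’s actual API.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical categories and decisions, for illustration only.
class Category(Enum):
    HATE_SPEECH = "hate_speech"
    NUDITY = "nudity"
    SPAM = "spam"

class Decision(Enum):
    REMOVE = "remove"
    KEEP = "keep"
    ESCALATE = "escalate"  # appeal path to a human supervisor

@dataclass
class FlaggedItem:
    item_id: str
    category: Category
    body: str

# Step 2: route each flagged item to a moderator pool by specialty.
SPECIALIST_POOLS = {
    Category.HATE_SPEECH: "hate-speech-team",
    Category.NUDITY: "nudity-team",
    Category.SPAM: "spam-team",
}

def route(item: FlaggedItem) -> str:
    """Assign the item to the pool trained on its category."""
    return SPECIALIST_POOLS[item.category]

# Steps 4-5: record the moderator's decision, then apply it at scale.
def apply_decision(item: FlaggedItem, decision: Decision) -> None:
    if decision is Decision.REMOVE:
        print(f"{item.item_id}: removed ({item.category.value})")
    elif decision is Decision.ESCALATE:
        print(f"{item.item_id}: escalated to a supervisor")
    else:
        print(f"{item.item_id}: left in place")

# Example flow for a single flagged post.
post = FlaggedItem("post-123", Category.SPAM, "WIN A FREE PHONE!!!")
print(f"{post.item_id}: routed to {route(post)}")  # step 2
apply_decision(post, Decision.REMOVE)              # steps 4-5
```

In a real deployment, decisions would flow back to the platform through an API rather than print statements, and would be audited against the speed and accuracy quotas described in step 7.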

Who Performs Outsourced Moderation?

While some moderation is automated with AI, human moderators remain essential for accuracy. Most companies outsource to large call center-style operations in the Philippines, India, Mexico, and other lower-cost regions. Some key facts about human moderators:

  • Mostly young adults, often college educated
  • Frequently employed by staffing agencies rather than tech companies directly
  • Trained in regional cultural norms and platform rule enforcement
  • Work long hours under constant monitoring and quotas
  • Tend to burn out and turn over quickly due to mental health impacts

Many see it as a foot in the door to the tech industry, despite difficult conditions. Others view it as preferable to local opportunities. But moderator well-being remains a pressing moral issue.

When is Moderation Outsourced?

Moderation is outsourced in a few key situations:

Rapid Growth: When a platform experiences sudden viral growth, outsourcing lets it scale moderation operations quickly to match.

New Platforms: New social networks frequently rely on outsourcers to moderate until they can build internal capabilities.

Small Teams: Smaller platforms without deep resources utilize outsourcing for cost efficiency.

Peak Periods: Spikes in content around elections, protests, and crises may necessitate short-term outsourcing to handle surges.

New Users: When expanding into new markets, platforms sometimes use geo-specific moderation to educate incoming users on norms and policies.

Pilot Testing: Some companies outsource moderation of new experimental features to test the waters first.

Specialized Content: Outsourcers can provide expertise in moderating sensitive, high-risk content such as child exploitation and terrorism.

Backlogs: When content backlogs accumulate beyond internal capacity, outsourcing can help clear them.

Key Considerations Around Outsourcing

While outsourcing moderation has its purposes, companies need to weigh ethical factors as well:

  • Is mental health support provided for moderators? This is crucial.
  • Are unrealistic quotas driving errors and low morale? Speed vs. accuracy must be balanced.
  • Are cultural familiarity and empathy required for sound decisions? Local moderators may perform better.
  • How transparent are operations? Moderators deserve communication and agency.
  • Are moderators viewed as essential but expendable? This will stifle retention and growth.
  • Do appeals reach actual humans? Automation alone leads to mistakes.

There are no perfect solutions. But treating moderators as valued professionals, not just faceless contract labor, will go a long way.

Looking Ahead With Moderation

Content moderation remains crucial but flawed in its current outsourced state. There are opportunities to enhance processes through steps like:

  • Integrating automation more seamlessly to aid human moderators (see the triage sketch after this list)
  • Creating specialized auditor roles to assess tough judgment calls
  • Enabling platforms and outsourcers to share feedback on decisions
  • Establishing industry standards and certifications for moderators
  • Localizing operations closer to platform users to increase cultural fluency
  • Giving moderators more of a voice and role in policy shaping
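On the first point, one common pattern is confidence-based triage: an automated classifier handles clear-cut cases and routes only ambiguous content to humans, shrinking the volume of disturbing material moderators must view. Below is a minimal, hypothetical Python sketch of that idea; the scoring function and thresholds are illustrative assumptions, not a real model or service.

```python
# Confidence-based triage: automation handles clear-cut cases, humans
# handle the gray zone. score_toxicity and the thresholds below are
# illustrative assumptions, not a real model or vendor API.

AUTO_REMOVE_THRESHOLD = 0.95  # near-certain violations
AUTO_ALLOW_THRESHOLD = 0.05   # near-certain benign content

def score_toxicity(text: str) -> float:
    """Stand-in for a trained classifier returning P(violation)."""
    # Toy heuristic so the sketch runs; a real system would call a model.
    return 0.99 if "hate" in text.lower() else 0.01

def triage(text: str) -> str:
    score = score_toxicity(text)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "auto-removed"
    if score <= AUTO_ALLOW_THRESHOLD:
        return "auto-allowed"
    return "queued for human review"  # humans decide the ambiguous cases

print(triage("I hate this group"))  # auto-removed
print(triage("Nice cat video!"))    # auto-allowed
```

The design choice here is where to set the thresholds: tightening them sends more content to humans and raises costs, while loosening them risks automated mistakes of the kind the appeals question above warns about.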

By combining human compassion and technological advances, the field can continue evolving in healthier ways. The future remains challenging but ultimately bright.

Outsourced moderation keeps the wheels of the internet turning, though few may realize the difficult human work behind it. With more understanding and open communication between parties, we can build a system that works for all involved. The aim is an open internet – but not at any cost.
