More Than Just Deleting Nasty Comments
Now, you might think this is all about hiring more moderators to police content. But it’s far more sophisticated than that. The DSA demands robust systems for everything from verifying a user’s age to giving regulators a transparent audit trail of how harmful content is handled: platforms must issue a statement of reasons for each moderation decision and feed those statements into the Commission’s public transparency database. This requires seriously complex technology.
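To make “transparent audit trail” concrete, here’s a rough sketch of the sort of record-keeping involved. This isn’t any vendor’s real API and the field names are invented; the idea is simply that every moderation decision gets written to an append-only log, with each entry chained to the hash of the previous one so nobody can quietly rewrite history before the regulator comes knocking:

```python
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ModerationRecord:
    """One entry in a regulator-facing audit trail. All field names are illustrative."""
    content_id: str   # the platform's internal ID for the item
    action: str       # e.g. "removal", "demotion", "age_restriction"
    ground: str       # the legal or policy ground cited for the action
    automated: bool   # True if no human reviewed the decision
    decided_at: str   # UTC timestamp of the decision

def append_record(log_path: str, record: ModerationRecord, prev_hash: str) -> str:
    """Append the record as a JSON line, chained to the previous entry's hash
    so that any later tampering with the trail is detectable."""
    entry = asdict(record) | {"prev_hash": prev_hash}
    line = json.dumps(entry, sort_keys=True)
    with open(log_path, "a") as f:
        f.write(line + "\n")
    return hashlib.sha256(line.encode()).hexdigest()  # anchors the next entry

# Usage: each call returns the hash that the next entry must chain to.
h = append_record(
    "audit.jsonl",
    ModerationRecord(
        content_id="post-81422",
        action="removal",
        ground="illegal hate speech, platform policy 4.2",
        automated=True,
        decided_at=datetime.now(timezone.utc).isoformat(),
    ),
    prev_hash="genesis",
)
```

Multiply that by millions of decisions a day, across dozens of jurisdictions, and you start to see why this is an engineering problem rather than a staffing one.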
We’re talking about advanced identity verification platforms, AI-driven risk assessment tools, and intricate data analytics software. Frankly, this is the kind of dreary, specialised work that big tech companies are not particularly good at and would much rather outsource. Why would a company focused on building a metaverse want to become an expert in the minutiae of European compliance law? They wouldn’t. They’ll pay a specialist to make the problem go away, creating a huge opportunity for those specialists.
Of course, there’s always the risk that these giants could try to build these tools in-house. But given the complexity and the ever-shifting regulatory landscape, I suspect they’ll find it’s far more efficient to buy a best-in-class solution off the shelf. This isn't their core business, and trying to make it so would be a costly distraction.