When Algorithms are Too Big to Ignore

It feels like the debate is a generation behind practice here. The greatest danger isn't data collection, but algorithmic recommendation.
The data collection horse is out of the barn: it has legitimate uses, it's an incredibly thorny issue to legislate appropriately, and the rules will keep being adjusted for the next 20+ years.

What needs to be done now is to say: (1) above a certain user count ("too big to democratically ignore"), (2) production recommendation algorithms must be auditable.

Allow the details of audits to be kept secret, but the DoJ should be able to go to Google, Facebook, Netflix, or TikTok and say "Show me how this works, now."

It's burying your head in the sand to suggest this doesn't have a clear impact on democracy. That power is currently solely in the control of private companies and completely opaque, and the opaqueness is the biggest danger.

Fundamentally, modern media and social platforms differ from everything that came before because of the economic feasibility of microtargeting. Newspapers couldn't afford to track each customer individually and print a unique paper for each of them, which resulted in an auditable public record. All of the companies above can do exactly that, without leaving any public record.
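To make the contrast concrete, here is a minimal, purely illustrative sketch (toy scores and profiles, not any platform's actual algorithm): a broadcast edition is one ranking everyone sees, so the edition itself is a public record; a personalized feed is a different ranking per user, so only the platform ever sees all the "editions".

```python
# Toy illustration: broadcast vs. microtargeted ranking.
# All scores, topics, and profile weights below are invented for the example.

def broadcast_edition(stories):
    """One ranking for everyone -- the edition itself is an auditable record."""
    return sorted(stories, key=lambda s: s["newsworthiness"], reverse=True)

def personalized_feed(stories, user_profile):
    """A unique ranking per user; no two readers see the same front page."""
    return sorted(
        stories,
        key=lambda s: s["newsworthiness"] * user_profile.get(s["topic"], 0.1),
        reverse=True,
    )

stories = [
    {"title": "Election results", "topic": "politics", "newsworthiness": 0.9},
    {"title": "Stadium opens",    "topic": "sports",   "newsworthiness": 0.6},
    {"title": "New phone leaked", "topic": "tech",     "newsworthiness": 0.5},
]

# Everyone gets the same front page:
print([s["title"] for s in broadcast_edition(stories)])

# Each user gets a different, private front page:
alice = {"sports": 1.0}
bob = {"tech": 2.0}
print([s["title"] for s in personalized_feed(stories, alice)])
print([s["title"] for s in personalized_feed(stories, bob)])
```

The point of the sketch is the audit gap: the broadcast output can be checked by anyone after the fact, while the personalized outputs exist only per user, per moment, inside the platform.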

ethbr0, Hacker News

Karu

Devops guy, Docker fanboy, your average everyday opinionated nerd.
