"No one pretends that democracy is perfect or all-wise. Indeed, it has been said that democracy is the worst form of government except all those other forms that have been tried from time to time." [0]
-- Winston Churchill
For any platform that gives arbitrary users some level of control over the content it hosts, the options for ensuring that content is not harmful are:
* Community Moderation: users both contribute and moderate content; the platform either appoints moderators or lets users vote for them. This gives users (i.e. citizens) the most power, but it also has the most potential for abuse, since bad actors can exploit the system to host, spread, and share harmful content.
* Platform Moderation: content is moderated wholly by platform-chosen moderators. This is probably the most common system. The platform applies its own values and policies to decide what to moderate, and will likely target the most popular content it deems harmful. This gives each platform the most power over its own content, though platforms must still compete with one another.
* Government Moderation: moderation is likely still carried out by the platform, but with government oversight: policy and values may be defined by the government, and failure to moderate according to its legislation could result in penalties or termination of the platform. If the government has sufficient checks and balances and citizen influence, this may be a desirable system; but if the government is not "of and for the people", it can be used to silence the government's opponents, as decided by that government. Anyone opposing the absolute power of the government may find their content "moderated" away. This is the stuff of nightmares for the founding fathers.
"When government fears the people, there is liberty. When the people fear the government, there is tyranny." [1]
-- Winston Churchill
For any kind of platform that allows arbitrary users some level of control over the content that will be hosted, the options for ensuring that content is not harmful are:
* Community Moderation: users control contributing and moderating content. The platform chooses moderators, or enables voting for them. This gives users and all people (i.e. citizens) the most power, but has the most potential for abuse of the system to enable using the platform to host, spread and share harmful content.
* Platform Moderation: wholly moderated by platform chosen moderators. This is probably the most common system. The platform will use its own set of values and policies to decide what to moderate, and will likely target the most popular content deemed harmful. Per platform, this gives platforms the most power, but platforms much compete with each other
* Government Moderation: moderation likely by the platform, but with oversight from government - policy and values may be defined by the government; failure to moderate according to the government legislation could result in penalties or termination of the platform. If the government has sufficient checks and balances and citizen influence, this may be a desirable system, but if the government is not "of and for the people", it could also be used by the government to moderate opponents of the government as decided by that government. Anyone opposing the absolute power of the government may find their content "moderated" away. This is the stuff of nightmares for the founding fathers.
"When government fears the people, there is liberty. When the people fear the government, there is tyranny." [1]
-- Thomas Jefferson
[0] https://en.wikipedia.org/wiki/Criticism_of_democracy
[1] https://www.monticello.org/site/research-and-collections/whe...