Decentralised Moderation
Moderation is a major concern for all social platforms. Deciding whether content is illegal is usually straightforward, but content that goes against community guidelines, such as content that could contextually offend a specific community, is much harder to judge.
When it comes to spam and exploits, please refer to the Mechanism design page.
For content policy and moderation, three democratic and voluntary models can provide solutions to protect communities from undesired behaviour:

App level

While the blockchain layer is immutable and provides complete autonomy and ownership to account holders, apps can choose which accounts they accept or reject, just as account holders can opt in and out of each app.
This system of checks and balances enables a more fluid and community-driven approach to content moderation, with different approaches competing to achieve the most relevant and successful content policies.
Apps are therefore able to filter and organise content by providing different algorithms and user experiences, and to enforce blocking between users at the app level.
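A minimal sketch of what app-level moderation could look like is shown below. It assumes the on-chain data is read as-is and that each app applies its own policy before rendering a feed; all type and function names here are illustrative and not part of the Newcoin API.

```typescript
// Hypothetical app-level moderation: the chain data stays immutable,
// but each app decides which accounts and posts it surfaces to its users.

interface Post {
  author: string;      // account name on the network
  content: string;
  timestamp: number;
}

interface AppPolicy {
  blockedAccounts: Set<string>;            // accounts this app refuses to display
  blockedPairs: Map<string, Set<string>>;  // per-user blocks enforced at the app level
  rank: (post: Post) => number;            // app-specific ranking algorithm
}

// Filter and order the global feed for one viewer according to the app's own policy.
function buildFeed(posts: Post[], viewer: string, policy: AppPolicy): Post[] {
  return posts
    .filter((p) => !policy.blockedAccounts.has(p.author))
    .filter((p) => !policy.blockedPairs.get(viewer)?.has(p.author))
    .sort((a, b) => policy.rank(b) - policy.rank(a));
}
```

Because every app maintains its own policy object, two apps reading the same chain can present entirely different feeds, which is the competition between content policies described above.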

Decentralised reputation system

As presented in the UNS page, each account issuer has a score reflecting the reputation of the accounts it has issued. For instance, if too many accounts issued under .xyz receive complaints from members of the network, the issuer may be penalised by a decision of the DAO and be unable to issue more accounts for a defined period.
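The sketch below illustrates one way such an issuer reputation check could work, assuming the DAO tracks complaints per issued account and can suspend an issuer's right to create new accounts. The thresholds, field names and suspension period are assumptions, not protocol values.

```typescript
// Illustrative issuer reputation check; all constants are assumed values.

interface Issuer {
  suffix: string;           // e.g. ".xyz"
  issuedAccounts: number;
  complaints: number;
  suspendedUntil: number;   // unix timestamp; 0 if not suspended
}

const COMPLAINT_RATIO_LIMIT = 0.2;         // assumed complaint threshold
const SUSPENSION_PERIOD = 30 * 24 * 3600;  // assumed 30-day penalty

// Returns the issuer with an updated suspension if the complaint ratio is too high.
function reviewIssuer(issuer: Issuer, now: number): Issuer {
  const ratio =
    issuer.issuedAccounts > 0 ? issuer.complaints / issuer.issuedAccounts : 0;
  if (ratio > COMPLAINT_RATio_LIMIT) {
    return { ...issuer, suspendedUntil: now + SUSPENSION_PERIOD };
  }
  return issuer;
}

// An issuer can only create new accounts once any suspension has expired.
function canIssueAccount(issuer: Issuer, now: number): boolean {
  return now >= issuer.suspendedUntil;
}
```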

Decentralised blacklist

In some cases, an elected council within the Newcoin Governance portal will review and potentially decide to freeze, ban or even delete accounts and data from RAM by requesting intervention from the block producers through a multi-sig process.
This blacklist could be invoked in cases of clearly illegal activity. The blacklist would then be provided to all block producers running nodes on the network.
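A rough sketch of how a council-approved blacklist entry could be represented is given below. It assumes council members sign a proposal and block producers apply it once a signature threshold is met; the names and the quorum value are illustrative assumptions rather than the actual multi-sig mechanism.

```typescript
// Hypothetical blacklist proposal approved by an elected council.

type Action = "freeze" | "ban" | "delete";

interface BlacklistProposal {
  account: string;
  action: Action;
  reason: string;
  approvals: Set<string>;  // council members who signed the proposal
}

const COUNCIL_THRESHOLD = 3;  // assumed multi-sig quorum

// A proposal becomes effective only after enough council members approve it.
function isApproved(proposal: BlacklistProposal): boolean {
  return proposal.approvals.size >= COUNCIL_THRESHOLD;
}

// The resulting blacklist would then be shared with every block producer node.
function applyToBlacklist(blacklist: Set<string>, proposal: BlacklistProposal): void {
  if (isApproved(proposal)) {
    blacklist.add(proposal.account);
  }
}
```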