Yes, there are LLMs suited to this, and you can use them to make moderation decisions. Your mileage may vary depending on how "good" you need the moderation to be.
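A minimal sketch of what that can look like, assuming a chat-completion-style API. Everything here is illustrative: `call_llm` is a stub you would replace with your actual provider's client (OpenAI, Anthropic, a local model, etc.), and the policy text is just an example.

```python
# Hypothetical LLM-assisted moderation sketch. `call_llm` is a stand-in
# for a real model call; the stub below only exists so this runs end to end.

MODERATION_PROMPT = """You are a content moderator for a forum.
Policy: no harassment, no spam, no doxxing.
Answer with exactly one word, ALLOW or FLAG, for the post below.

Post: {post}
Answer:"""

def call_llm(prompt: str) -> str:
    # Placeholder: a real implementation would call a model here.
    # This stub flags an obvious spam marker so the example is runnable.
    return "FLAG" if "buy now" in prompt.lower() else "ALLOW"

def moderate(post: str) -> bool:
    """Return True if the post should be flagged for human review."""
    answer = call_llm(MODERATION_PROMPT.format(post=post)).strip().upper()
    # Fail safe: anything other than an explicit ALLOW gets flagged,
    # since model output is not guaranteed to follow the requested format.
    return answer != "ALLOW"

print(moderate("Check out my post on gardening."))  # False
print(moderate("BUY NOW!!! Cheap watches!!!"))      # True
```

Note the fail-safe parsing: treating any malformed answer as a flag (rather than an allow) is usually the right default, since models occasionally ignore output-format instructions.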