Guides

Safety & moderation

How readers report issues in the web app, how moderation fits with our policies, and what creators should keep in mind.

Last updated: 2026

How moderation works (overview)

When someone submits a report, it is available for staff review (see our FAQ). Depending on severity and our policies, outcomes may include content removal, warnings, or account actions. We can't guarantee a public outcome for every report, but misuse of the reporting system may itself violate our terms.
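
As a purely illustrative sketch (not our actual data model), a report's path through review could be pictured roughly like this. All names below are assumptions made for the example.

```ts
// Hypothetical sketch of how a report might move through review.
// These names are illustrative only; they are not Inkstra's actual schema.
type ReportStatus = "submitted" | "under_review" | "resolved" | "dismissed";

type ModerationOutcome =
  | { kind: "content_removed" }
  | { kind: "warning_issued" }
  | { kind: "account_action"; detail?: string }
  | { kind: "no_action" }; // not every report results in a visible change

interface ReportTicket {
  id: string;
  status: ReportStatus;
  outcome?: ModerationOutcome; // outcomes are not always shared publicly
}
```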

Reporting on the reader side

Reports are filed inside the app, so moderators can see exactly what you flagged. The UI confirms when your report has been submitted.

Series page

On a public series page (/series/[slug]), use the Report control (flag icon). You must be signed in; the button will say so if you aren't. Pick a reason, optionally add a note, and submit. The API prevents duplicate reports of the same series from the same account, so you'll see a message if you've already reported it.
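
For those curious how this might look under the hood, here is a minimal client-side sketch. The endpoint path, payload shape, and status codes are assumptions made for illustration; they are not a documented Inkstra API.

```ts
// Hypothetical submission of a series report from the browser.
// Endpoint, payload, and status codes are assumptions for illustration only.
async function reportSeries(slug: string, reason: string, note?: string) {
  const res = await fetch(`/api/series/${encodeURIComponent(slug)}/report`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ reason, note }),
  });

  if (res.status === 401) return "Please sign in to report this series.";
  if (res.status === 409) return "You already reported this series.";
  if (!res.ok) return "Something went wrong. Please try again.";
  return "Thanks, your report has been submitted.";
}
```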

Chapter reader

While reading a chapter (/reader/c/[id]), open Report from the chapter toolbar (near reactions). This opens a Report chapter dialog: pick a reason, add an optional note for moderators, and submit. You must be signed in — if you aren't, you'll be prompted to sign in. You can only submit one report per chapter per account; a second attempt will tell you that you already reported it.

Comments

On chapters where comments are enabled, you can report an individual comment from the comment thread (via the Report comment modal). Comment reports are rate-limited to protect the service: roughly one report per minute and five per hour per account, and you may see a retry message if you hit the limit. Only one report per comment per account is kept; a duplicate is acknowledged without creating a second ticket.
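
As a rough illustration of the limits described above, here is a minimal sliding-window sketch in TypeScript. The thresholds come from this guide; the names, storage, and wiring are assumptions for the example, not Inkstra's actual implementation.

```ts
// Illustrative per-account limit: roughly 1 comment report per minute
// and 5 per hour. Thresholds are from this guide; the code is a sketch.
const WINDOWS = [
  { ms: 60_000, max: 1 },      // ~1 per minute
  { ms: 3_600_000, max: 5 },   // ~5 per hour
];

const submissions = new Map<string, number[]>(); // accountId -> timestamps

function canReportComment(accountId: string, now = Date.now()): boolean {
  const times = submissions.get(accountId) ?? [];
  // Keep only timestamps within the largest window (one hour).
  const recent = times.filter((t) => now - t < 3_600_000);
  const allowed = WINDOWS.every(
    (w) => recent.filter((t) => now - t < w.ms).length < w.max
  );
  if (allowed) {
    recent.push(now);
    submissions.set(accountId, recent);
  }
  return allowed; // when false, the UI shows a "try again later" message
}
```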

Reasons you can select (chapter)

These match the chapter reader and server validation. Use the option that best fits; add a note when context helps.

  • Copyright/IP
  • Plagiarism/Impersonation
  • Sexual content w/ minors
  • Unmarked age-restricted labeling
  • AI-generated content
  • Graphic gore (unmarked)
  • Hate/Extremism
  • Doxxing / personal data
  • Defamation
  • Malware / scam
  • Spam / ads
  • Wrong series / duplicate
  • Broken/blank pages
  • Pages out of order
  • Misleading/mislabeled
  • Other
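
For illustration, here is one way the list above could be represented and checked server-side. The machine-readable identifiers are assumptions; only the visible labels come from this guide.

```ts
// Hypothetical representation of the chapter report reasons. The identifier
// strings are assumptions; they are not Inkstra's actual values.
const CHAPTER_REPORT_REASONS = [
  "copyright_ip",
  "plagiarism_impersonation",
  "sexual_content_minors",
  "unmarked_age_restricted",
  "ai_generated_content",
  "graphic_gore_unmarked",
  "hate_extremism",
  "doxxing_personal_data",
  "defamation",
  "malware_scam",
  "spam_ads",
  "wrong_series_duplicate",
  "broken_blank_pages",
  "pages_out_of_order",
  "misleading_mislabeled",
  "other",
] as const;

type ChapterReportReason = (typeof CHAPTER_REPORT_REASONS)[number];

// Server-side check: reject anything outside the list above.
function isValidChapterReason(value: string): value is ChapterReportReason {
  return (CHAPTER_REPORT_REASONS as readonly string[]).includes(value);
}
```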

Reasons you can select (series)

Series reports use a slightly shorter list (for example, there is no “broken pages” option; those issues are usually chapter-level).

  • Copyright / IP
  • Plagiarism
  • Sexual content involving minors
  • Unmarked age-restricted labeling
  • AI-generated content
  • Graphic gore (unmarked)
  • Hate / extremism
  • Doxxing / PII
  • Defamation
  • Malware / scam
  • Spam / ads
  • Mislabeled / wrong rating
  • Other

Reasons you can select (comments)

  • Spam
  • Harassment
  • Hate
  • Sexual content
  • Self-harm
  • Other

Copyright & DMCA

In-app reports help our team triage issues, but formal copyright disputes may also follow the process described in our Terms and on our DMCA page (including submitting a notice).

What creators should avoid

Align your work with our terms, and keep in mind the same categories readers can report. In particular:

  • AI-generated visuals — Generative AI art/panels are not allowed on Inkstra. See the FAQ and Terms.
  • Ratings & age restrictions — Use accurate series/chapter labeling so mature content is properly flagged (readers report “unmarked age-restricted labeling” or “mislabeled” when it isn’t).
  • Illegal or exploitative material — Including sexual content involving minors; such reports are treated seriously.
  • Harassment, hate, doxxing, scams — In pages or comments; keep community interactions respectful.
  • Rights & originality — Don't post work you don't have rights to publish; plagiarism and impersonation are reportable.

Most creator-side controls live in your Publishing tools and on your public series and chapter pages; enforcement may also affect monetization eligibility where applicable.

Need help or not signed in?

For general questions, use the Contact page or the options described in the FAQ. Signed-in users with account-specific issues may use Support where available.