Content Monitoring Policy
1. Purpose and Scope
This Content Monitoring Policy defines the principles, responsibilities, and processes used by mobileTREND GmbH to review, manage, and moderate digital content on its platforms, in particular the dating application yoomee.
The purpose of this policy is to:
- Maintain user safety and a respectful environment
- Comply with applicable laws and regulations, including the EU Digital Services Act (DSA) and GDPR
- Prevent illegal, harmful, or misleading content
- Protect the rights, privacy, and dignity of users
This policy applies to user-generated content within yoomee, including profile information, images, messages, and reported interactions.
2. Legal and Regulatory Framework
Content monitoring at mobileTREND GmbH is conducted in accordance with:
- EU Digital Services Act (DSA)
- General Data Protection Regulation (GDPR)
- Applicable national laws (Germany / EU)
- yoomee Privacy Policy and Terms of Service
Monitoring is performed proportionately, transparently, and with respect for users’ fundamental rights, including freedom of expression and data protection.
3. Types of Content Subject to Monitoring
The following categories of content may be reviewed or moderated:
- User profile text, usernames, and bios
- Uploaded photos and media
- In-app messages and interactions (only when reported or flagged as described below)
- Reported or otherwise flagged content
mobileTREND GmbH does not perform continuous manual review of private communications. Review occurs only when:
- Content is reported by users
- Automated systems flag content based on defined indicators
- Legal obligations require review
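The conditions above act as a simple gate: absent one of the three triggers, private communications are not reviewed at all. A minimal sketch of that gate (function and parameter names are illustrative, not the actual system):

```python
def may_review(reported: bool, auto_flagged: bool, legal_obligation: bool) -> bool:
    """Gate for reviewing private communications.

    There is no continuous manual review: a private message enters the
    review queue only if at least one of the defined triggers applies.
    """
    return reported or auto_flagged or legal_obligation
```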
4. Monitoring Methods and Technologies
4.1 Automated Monitoring
Automated tools are used to support moderation at scale, including:
- Image detection to identify nudity, violence, or illegal material
- Keyword/pattern detection to identify hate speech, threats, scams, or prohibited content
- Fraud/spam detection and behaviour-based safety signals
Automated systems are designed to minimise false positives and are periodically reviewed for accuracy and fairness.
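As an illustration, keyword/pattern detection of the kind described above can be sketched as a simple rule matcher. The rule names and patterns below are hypothetical examples only; the actual production rule sets are maintained by the moderation team and are considerably more nuanced (multi-language, context-aware).

```python
import re

# Hypothetical example rules; not the actual detection patterns.
FLAG_PATTERNS = {
    "scam": re.compile(r"\b(wire money|gift card|crypto wallet)\b", re.IGNORECASE),
    "contact_offsite": re.compile(r"\b(telegram|whatsapp)\b", re.IGNORECASE),
}

def flag_text(text: str) -> list[str]:
    """Return the names of all rules whose pattern matches the text.

    A non-empty result only *flags* content for human review; it never
    triggers removal on its own, since final decisions on removals and
    account actions require human review (see section 4.2).
    """
    return [name for name, pattern in FLAG_PATTERNS.items() if pattern.search(text)]
```

In this sketch, a flagged message is queued for human review rather than acted on automatically, mirroring the human-oversight requirement in section 4.2.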
4.2 Human Review
Human moderation is performed by trained internal staff or authorised service providers and is required for:
- Final decisions on content removals and account actions
- Escalated, sensitive, or ambiguous cases
- Appeals submitted by users
Where decisions have a significant impact on users, mobileTREND GmbH ensures appropriate human oversight and an appeal path.
5. Roles and Responsibilities
- Management: Defines moderation strategy, risk tolerance, and ensures resources and compliance.
- Moderation Team: Reviews reported/flagged content, applies rules consistently, documents decisions.
- Data Protection Officer (DPO): Oversees GDPR compliance and reviews monitoring involving personal data.
- External Service Providers: May support moderation under strict contractual confidentiality and data protection obligations.
6. User Reporting and Complaint Handling
Users can report content or behaviour in yoomee. Reports are logged and reviewed without undue delay. Outcomes may include content removal, warnings, or account restrictions.
Users are informed of significant moderation decisions where applicable and can request a review or appeal, as required by the DSA.
7. Content Moderation Actions
Possible actions include:
- No action (content compliant)
- Content removal or restriction
- Warning or temporary limitation of account features
- Temporary suspension or permanent termination
Actions are proportionate to severity, context, and repeat violations.
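The proportionality principle above can be illustrated as an escalation rule mapping severity and prior violations to an action. The severity scale, thresholds, and action names here are hypothetical and for illustration only; real decisions also weigh context and are made or confirmed by human moderators (section 4.2).

```python
from enum import Enum

class Action(Enum):
    NO_ACTION = "no_action"
    REMOVE_CONTENT = "content_removal"
    WARN = "warning"
    TEMP_SUSPEND = "temporary_suspension"
    TERMINATE = "permanent_termination"

def decide_action(severity: int, prior_violations: int) -> Action:
    """Map a confirmed violation to a proportionate action.

    severity: 0 = compliant, 1 = minor, 2 = serious, 3 = severe/illegal.
    Thresholds are illustrative, not the actual escalation policy.
    """
    if severity == 0:
        return Action.NO_ACTION
    if severity >= 3:
        return Action.TERMINATE          # severe or illegal content
    if severity == 2 or prior_violations >= 2:
        return Action.TEMP_SUSPEND       # serious or repeated violations
    if prior_violations == 1:
        return Action.REMOVE_CONTENT     # repeat minor violation
    return Action.WARN                   # first minor violation
```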
8. Data Protection and Privacy
Content monitoring follows the principles of data minimisation, purpose limitation, storage limitation, confidentiality, and integrity. Access to personal data for moderation is restricted to authorised personnel.
Personal data processed during monitoring is handled in line with the yoomee Privacy Policy.
9. Transparency and Documentation
mobileTREND GmbH maintains internal documentation on:
- Moderation rules and decision criteria
- Use of automated tools and key parameters (where appropriate)
- Training and quality assurance for moderators
Where required by law, mobileTREND GmbH fulfils its transparency reporting obligations under the DSA.
10. Review and Updates
This policy is reviewed regularly and updated as necessary to reflect changes in legal requirements, platform features, and identified risks or incidents.