Accenture is a trusted, innovative, comprehensive, and experienced partner to leading platform companies. The Trust and Safety offering within Accenture Operations helps keep the internet safe and helps platform companies accelerate, scale, and improve their businesses.

You will be responsible for quality assurance of Content Moderation. The role includes analyzing and reviewing user profiles, audio, videos, and text-based content, and/or investigating, escalating, and resolving issues that are reported by users or flagged by the system. Due to the nature of the role, you may be exposed to flashing lights or contrasting light and dark patterns.

Content moderation is meaningful work that helps keep the internet safe. It may also be challenging at times. In this role, individuals may be directly or inadvertently exposed to potentially objectionable and sensitive content (e.g., graphic, violent, sexual, or egregious material). Therefore, any role supporting content moderation requires strong resilience and coping skills. We care for the health and well-being of our people and provide the support and resources they need to perform their role responsibilities. Active participation in Accenture's well-being support program, designed specifically for the Trust & Safety community, provides valuable skills to promote individual and collective well-being.
What are we looking for?
Roles and responsibilities: