Large enterprise networks can contain thousands of endpoints. Identifying inappropriate, NSFW images and videos stored on these endpoints can be like looking for a needle in a haystack. Most corporate, academic and public sector organisations have no visibility of the types of images and videos stored on their file shares and workstations. This critical lack of knowledge does not exempt them from their duty of care to protect employees from a hostile working environment, and it may therefore expose them to legal liability.
Pornographic and Not Safe for Work images in the workplace can degrade company culture, damage brand reputation and contribute to the creation of a hostile working environment. In most countries, employers have a duty to protect their employees from sexual harassment. They can be held vicariously liable for an employee's actions unless they can demonstrate that they have taken all reasonable steps to prevent them.
“The proliferation of pornography and demeaning comments, if sufficiently continuous and pervasive, may be found to create a hostile working environment.”
US Equal Employment Opportunity Commission
Image Analyzer allows software vendors to integrate AI-powered visual threat intelligence into their endpoint solutions. Offered as an embedded feature or add-on module, the technology provides a competitive differentiator while generating incremental revenue from the existing customer base.
- Advanced AI delivers high detection rates and near-zero false positives
- Identifies sexually explicit and NSFW image and video files
- Investigate computer misuse – provides visibility of inappropriate stored content
- Verify employee misconduct
- Perform internal audits
- Helps organisations demonstrate they have taken reasonable steps
- Provides incremental revenue from existing customer base
- Embedded SDK available – no content sent to the cloud