‘Everyone’s doing it’ was the justification offered by one Qantas Airlines employee for viewing pornographic images on his work computer. Another claimed he had meant to send the material to the recipient’s private computer, and that it had only been viewed on the corporate network inadvertently.
Qantas has acted highly professionally in suspending 10 employees for alleged breaches of its acceptable use policy, involving the viewing and distribution of pornographic material, but the reputational damage has already been done: a negative article in the Sydney Morning Herald is available for the world to read online.
With the volume of video and image content uploaded to the Internet growing daily, it is increasingly important for organisations to review their internet security practices and ensure they are using technology, such as Image Analyzer, that can block inappropriate image and video content before it can be viewed or disseminated via a corporate network.
For most organisations, deploying more effective image detection software is easier than attempting to change the online behaviour of every employee, however much training, awareness-raising and reminding about the acceptable use policy is done.
Image Analyzer is a world leader in automated content moderation for image, video and streaming media, including user-generated content. Our AI content moderation delivers unparalleled accuracy, producing near-zero false positives in milliseconds.
If you have any questions about our image moderation or content moderation software, please get in touch today.