By Wendy Shore on August 05, 2019

4 percent of government agency’s email classified as inappropriate

Irish State agency Fas and several government departments
have been implicated in the widespread distribution of “pornographic”
or highly inappropriate material by staff in their emails, the Sunday
Independent can reveal.

The shocking findings are disclosed in an audit conducted by
the tainted State job training agency, which uncovered thousands of
pornographic and “highly inappropriate” images in a trawl of its
email system.

Thousands of images were sent and received by staff at Fas,
including some containing “graphic nudity, focus on genitalia or
suggestive poses” as well as “sexual acts including foreplay and
self-sex”.

Almost four per cent of all reported content in Fas
mailboxes was classified as “highly inappropriate”, the audit found.
The images were described as depicting “adult pornography, graphic
violence, extreme prejudice or severe injury”. No illegal or child
pornography was discovered.

The problem was not confined to within Fas, as the audit
discovered that 24 per cent of the images were “identified as being sent
from the organisation to external recipients, indicating the area of highest
potential reputational risk”.

The audit revealed that Fas staff sent dozens of
inappropriate images to fellow civil servants in the HSE and the Department of
Justice, with many others sent to private Gmail or Hotmail accounts.

But the audit also found Fas staff received inappropriate
images contained in emails from the Revenue Commissioners, the Department of
Justice, the HSE, South Dublin County Council, Fingal Council, Sligo County
Council and Kerry County Council.

Fas told the Sunday Independent that “any necessary
corrective action was taken and the recommendations of the report were
addressed”.

At Image Analyzer, we are disappointed at the regularity with
which stories of corporate email systems and networks being used to
disseminate pornographic and other inappropriate content continue to
appear in the media.

It reinforces our view that all organisations need to be vigilant
and apply three simple steps to avoid this happening to them:

1. Constantly review and communicate your organisation’s acceptable use
policy (AUP) for corporate email and internet access. For example, publish
the key points prominently on the home page of your intranet.
2. Implement technology with image-scanning capabilities, such as those
offered by Image Analyzer, to identify and block the sending or viewing of
inappropriate images and video content online.
3. Apply the sanctions in your HR policy to anyone found responsible for
misusing corporate email and internet resources, and let everyone in the
organisation know how serious the consequences can be.
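To make the second step concrete, the gating logic around an image scanner can be sketched as follows. This is a minimal illustration only: `moderation_score` is a hypothetical stand-in for a real vendor scanning API (it is stubbed here with made-up logic), and the threshold value is an arbitrary assumption, not a recommended setting.

```python
# Sketch: gating outbound email attachments on an image-moderation score.
# `moderation_score` is a hypothetical placeholder for a vendor API call;
# its stub logic below is purely for illustration.

RISK_THRESHOLD = 0.8  # assumed cutoff: block anything scored above this


def moderation_score(image_bytes: bytes) -> float:
    """Stub: a real deployment would call the scanning vendor's API here."""
    # Illustrative rule only: treat all-zero payloads as benign, others as risky.
    return 0.0 if not any(image_bytes) else 0.95


def filter_attachments(attachments: dict) -> tuple:
    """Split attachment names into (allowed, blocked) by moderation score."""
    allowed, blocked = [], []
    for name, data in attachments.items():
        if moderation_score(data) >= RISK_THRESHOLD:
            blocked.append(name)
        else:
            allowed.append(name)
    return allowed, blocked


allowed, blocked = filter_attachments({
    "report.png": bytes(4),          # all-zero payload -> benign in this stub
    "holiday.jpg": b"\x01\x02\x03",  # non-zero payload -> flagged in this stub
})
print(allowed, blocked)  # -> ['report.png'] ['holiday.jpg']
```

In practice the same pattern applies at the mail gateway: score each attachment before delivery, quarantine anything over the threshold, and log the event for the HR process described in step three.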


Image Analyzer is a world leader in the provision of automated content moderation and user generated content moderation for image, video and streaming media.
Our AI content moderation delivers unparalleled levels of accuracy, producing near-zero false positive results in milliseconds.
If you have any questions about our image moderation or content moderation software, please get in contact today.
