As a Johnny-come-lately to the Apple Fan Club I am both pleased and disappointed by one of the company's recent announcements. To be clear: it has nothing to do with Apple's plans for addressing child sexual abuse material (CSAM), all of which is illegal. On that we still await the final outcome. Fingers crossed.
The announcement I have in mind concerns how the company proposes to treat “sexually explicit” content that turns up on their devices courtesy of their messaging service.
Where the sexually explicit content is of the type it would be legal for adults to own, buy, consume or exchange, and it appears on a device ordinarily used by an adult, nothing at all changes. The new policy addresses sexually explicit content sent or received on an Apple device that is ordinarily used by a child.
How Apple will know if a device is ordinarily used by an adult or a child is an important but separate question which we can discuss some other time. For these purposes let's just assume they already know, or could easily find out, in ways which we would be happy with (or at any rate the great majority of us would be). Whether or not Apple knows the actual age of the child is unclear, but it seems they will, or at least could.
Stories with a happy ending are few and far between
Stories about children exchanging any kind of sexually explicit content with adults never end well. In the UK and other jurisdictions they are almost always illegal exchanges anyway (s.67, Serious Crime Act 2015).
Where children have been exchanging sexually explicit material with other children, I found innumerable accounts, particularly but not exclusively in relation to self-generated material, where it ended in betrayal and hurt, in "shaming", or in acute embarrassment, sometimes in tragedy up to and including suicide, with long-term and large consequences for one or more of the children involved.
Then there are also lots of stories with headlines like "Sexual harassment in school: Children pressured for nudes", suggesting that what we think of as self-generated images often really aren't. They are the result of varying degrees of coercion, manipulation or grooming of some kind or another.
But even where there is little or no evidence of coercion or manipulation by an adult or other children, there's still the challenge of children creating images which are then classified as CSAM.
Once the images are hashed and in a database a whole other regime kicks in, so, leaving those aside, here we are probably talking about images at or near the moment of first creation. That is a crucial time, when rapid intervention could avoid disaster later.
I would hazard a guess that 99.9% of the children, when they were taking or making the pictures or videos, had no idea they were committing a serious crime, yet in several countries this has led to them being placed on sex offenders' registers or similar, causing likely irreparable lifetime harm. Among other things, certain occupations will be forever closed to them.
Even if we think, as I do, these are largely stupid and reactionary laws because they punish a child simply for engaging in childish behaviour, we cannot ignore the fact that they exist.
For all these reasons and more Apple is definitely on to something. Bravo.
What has Apple decided?
Back in August Apple said they were introducing a suite of new measures designed to improve child safety. One of these measures utilises an image scanner which has been trained on sexually explicit content. As noted, it is designed to be used on Apple devices which are ordinarily used by children.
If a sexually explicit image is detected on such a device Apple will blur it and ask if the device holder really wants to send or receive it.
Apparently, if someone taps in the affirmative, i.e. they say they want to send or receive the image, and if the user is known to be under 13, the system has the capacity to notify the child's parents, but the parents won't see the actual message (which I think here means the image which prompted the alert). Note that it's a "capacity": this doesn't necessarily mean it will happen. Read on.
If the capacity is being utilised, children will get a message telling them their parents will receive the notification, but the system doesn't report anything to Apple moderators or indeed to any other parties.
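To make that flow concrete, here is a minimal sketch in Swift of the decision logic as I understand it from Apple's public description. Every name and type in it is my own illustrative invention rather than anything from Apple's actual software, and the on-device classifier itself is treated as a black box.

```swift
// A sketch, under my own assumptions, of the described flow:
// detect -> blur and warn -> (only if under 13 AND opted in) notify parent.
// None of these names are Apple's APIs.

import Foundation

struct ChildAccount {
    let age: Int
    let parentalNotificationEnabled: Bool   // the opt-in "capacity"
}

enum MessageDecision {
    case deliverNormally
    case blurAndWarn(notifyParentIfConfirmed: Bool)
}

/// Decide what happens when a sexually explicit image is detected on a child's device.
func handleImage(isSexuallyExplicit: Bool, account: ChildAccount) -> MessageDecision {
    guard isSexuallyExplicit else {
        return .deliverNormally
    }
    // The image is blurred and the child is asked whether they really want to view or send it.
    // Only if the child is under 13 AND the capacity has been switched on does confirming
    // trigger a notification to the parent.
    let willNotify = account.age < 13 && account.parentalNotificationEnabled
    return .blurAndWarn(notifyParentIfConfirmed: willNotify)
}

// Example: a 12-year-old whose parent has opted in.
let decision = handleImage(isSexuallyExplicit: true,
                           account: ChildAccount(age: 12, parentalNotificationEnabled: true))
switch decision {
case .deliverNormally:
    print("Image delivered as normal.")
case .blurAndWarn(let notify):
    print("Image blurred; child warned. Parent notified on confirmation: \(notify)")
    // Note: in every case, nothing is reported to Apple moderators or any other party.
}
```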
Here’s where it starts to get tricky
Where the capacity is not being utilised, how could it still be OK, for example, for Apple to know an adult is sending sexually explicit content to a child but then do nothing, particularly in those jurisdictions where, prima facie, that is a crime?
But jurisdictions aside, when it comes to pornography (which I'm guessing a lot of sexually explicit content will be), of what relevance is 13 as an age limit anyway? And is Apple happy to be passing on porn from adults to minors who they know are 14 or 15?
And trickier still
Bizarrely, having built these capacities, Apple seems to have decided to make all of the above an opt-in feature. This makes no sense at all.
Safety by design. Security by default
Devices and services directed at or known to be used by children should be as safe as they possibly can be at the get-go. Safety by design. Security by default.
Parents must retain a discretionary power to change the default settings on any device which their children use, and we must trust, or insist, that they do not use that discretion in ways which restrict a child's rights; but if any hoops have to be jumped through, they should be hoops which liberalise the settings or lower the barriers to risk.
I am trying to imagine the circumstances in which Apple knows that nude pictures or any kind of sexually explicit content has been received, or is about to be sent, by a nine-year-old to an adult, or another child for that matter, but they do nothing because whoever was responsible had not opted in to their service. Nope. My imagination has failed me.
Then last week
It appears, however, that Apple has relented on one small but important detail, or rather has changed it.
I may not have all the information, but it seems that if the child views the image (what happens at the sending end?), Apple will now no longer send a notification to the parent, or indeed anyone; instead the child will be offered an option to have another trusted person put in the place of their parent. An option, mark you, not a requirement. So there could be no adult on the scene at all?
This is presuming a lot about the child’s level of understanding or else it is putting a lot of responsibility on the child’s shoulders. That doesn’t feel right, particularly where a very young child is involved.
Apparently the justification for this is that it could otherwise out a queer or transgender child to their parents, or the parent could be violent or abusive. If either or both of those things were true then I completely get why an amendment of that nature would commend itself to Apple, as it does to me. But suppose, for example, the device has been given to the child by someone who isn't one of their parents, as part of a pattern of abuse and exploitation? This has happened a lot.
There are probably endless scenarios we could conjure up where tech represents either a threat or an opportunity, but if we start from the proposition that every adult in a child's life is an actual or potential abuser or oppressor I am afraid we are lost, and so is the child. Most parents and adults in a child's life are a force for good, even if many of us mess up from time to time.
The exchange of any kind of sexually explicit material by very young children has to be something someone somewhere should know about. That someone should truly have the child’s best interests at heart and they need to be informed so they can check things out and support the child who may be in any number of different kinds of trouble or turmoil. Designing a system intentionally to keep this kind of information away from a child’s parents or a responsible adult does not sit well with me, particularly in instances where the child is very young.
Back to the drawing board
At the very least Apple should abandon 13 as any kind of reference point and think again about how they will treat children across the full spectrum of ages below 18.