Context always matters. I always check whether adult material has actually been made by consenting adults. I would feel sick if there wasn't enough information to tell, but fortunately I at least have never encountered CSAM.
I had classmates in high school with balding or even graying hair and full beards. Some adults older than me look younger than my nephews. Revenge porn and creepshots are common (or at least were; I'm not on platforms where they're popular).
Without context, porn will always be a morally grey area. Even commercialized, hyper-capitalist porn is still an intimate affair.
That's why I didn't use Pornhub, for example, until every user had to verify themselves before posting. Before that I only read erotica or looked at suggestive drawings.
I understand your perspective, though. You hardly get paid to keep this instance running, and looking at pictures that, without context, could be CSAM would make this volunteer work very mentally taxing. That's just how NSFW works, though.
Without context, any pornographic material featuring real humans could in truth be evidence of a horrible crime.
I've thought about this some more, and I feel a lot more sympathy for your decision now.
It must be horrible to get a user report about CSAM and then see a picture that, at first glance, really could be CSAM.
Even if every user report were wrong from now until infinity, the initial suspicion of CSAM triggered by each false report probably makes moderating a soul-crushing activity.
It's great when admins from other instances are willing to deal with these horror reports just to give their users a bigger platform, but that service is not something that can be taken for granted.
I'm sorry for coming across as ignorant; I just really hadn't considered your perspective that much.