Per one tech forum this week
Stop spreading misinformation.
To quote the most salient post:
The app doesn't provide client-side scanning used to report things to Google or anyone else. It provides on-device machine learning models usable by applications to classify content as being spam, scams, malware, etc. This allows apps to check content locally without sharing it with a service and mark it with warnings for users.
Which is a sorely needed feature for tackling problems like SMS scams.
If the app did what the OP is claiming, the EU would have a field day fining Google.
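The quoted post describes on-device classification: an app asks a local model "is this spam/a scam/malware?" and gets an answer without the content leaving the phone. A minimal sketch of that idea, using an invented keyword-scoring stand-in (SafetyCore's real models are ML-based and not public; every name and threshold here is illustrative only):

```python
# Hypothetical sketch of on-device message classification. The real
# SafetyCore models are neural classifiers; this keyword scorer is an
# invented stand-in to show the "check locally, warn locally" flow.

SCAM_SIGNALS = {
    "verify your account": 0.6,
    "you have won": 0.7,
    "wire transfer": 0.5,
    "click this link": 0.4,
}

def classify_locally(message: str) -> str:
    """Score a message against local signals; nothing leaves the device."""
    text = message.lower()
    score = sum(w for phrase, w in SCAM_SIGNALS.items() if phrase in text)
    return "likely scam" if score >= 0.5 else "ok"

print(classify_locally("Congratulations, you have won! Click this link."))
# -> "likely scam" (0.7 + 0.4 = 1.1, above the 0.5 threshold)
print(classify_locally("See you at lunch"))
# -> "ok"
```

The key property the post is pointing at: the messaging app can attach a warning based on the return value alone, with no network call involved in the classification.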
Google says that SafetyCore “provides on-device infrastructure for securely and privately performing classification to help users detect unwanted content.”
Cheers, Google, but I'm a capable adult and able to do this myself.
People don't seem to understand the risks of normalizing client-side scanning on closed-source devices. Think about how image recognition works: it scans image content locally and matches it to keywords or tags describing the people, objects, emotions, and other characteristics. Even the rudimentary open-source model in an Immich deployment on a Raspberry Pi can process thousands of images and make all their contents searchable with alarming speed and accuracy.
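The "tag locally, search everything" pipeline the comment describes can be sketched in a few lines. The classifier below is a stub returning canned tags (real systems like Immich's ML service use neural models), but the indexing principle — run a model per image, build an inverted index from tag to images — is the same and runs entirely on-device:

```python
# Toy illustration of how local image tagging makes a photo library
# searchable. classify_image is a stand-in for an on-device model;
# the filenames and tags are invented for the example.

from collections import defaultdict

def classify_image(path: str) -> list[str]:
    canned = {
        "beach.jpg": ["person", "ocean", "smiling"],
        "receipt.jpg": ["document", "text"],
        "dog.jpg": ["dog", "grass"],
    }
    return canned.get(path, [])

def build_index(paths):
    index = defaultdict(set)      # tag -> set of image paths
    for p in paths:
        for tag in classify_image(p):
            index[tag].add(p)
    return index

index = build_index(["beach.jpg", "receipt.jpg", "dog.jpg"])
print(sorted(index["person"]))    # every photo tagged "person"
```

Once such an index exists on the device, any query ("documents", "faces", "guns") is a constant-time set lookup — which is exactly why the comment treats local tagging as the dangerous primitive, not the phoning home.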
So once similar image analysis is done on a phone locally, pre-encryption, it is trivial for Apple or Google to use it for whatever purposes their terms of use allow. Forget the iCloud encryption backdoor: the big tech players can already scan content on your device pre-encryption.
And just because someone does a traffic analysis of the process itself (SafetyCore, mediaanalysisd, or whatever) and shows it doesn't directly phone home doesn't mean it is safe. The entire OS is closed source, and it only needs to backchannel small amounts of data in order to fuck you over.
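To make the "small amounts of data" point concrete: a scanning verdict doesn't need to upload the content itself, only a flag. A hedged sketch, with an entirely invented payload layout (no real protocol is implied), shows how little room that needs:

```python
# Illustration of why "it barely phones home" is weak assurance: a
# scanning verdict can be packed into a handful of bytes. The struct
# layout below is invented purely for illustration.

import hashlib
import struct

def flag_payload(device_id: int, content: bytes, category: int) -> bytes:
    digest = hashlib.sha256(content).digest()[:8]   # truncated content hash
    # 4-byte device id + 1-byte category + 8-byte hash = 13 bytes total
    return struct.pack(">IB8s", device_id, category, digest)

payload = flag_payload(42, b"some scanned photo bytes", category=3)
print(len(payload))   # 13 bytes, easy to hide in ordinary telemetry
```

Thirteen bytes is far below the noise floor of routine OS telemetry, which is the commenter's point about traffic analysis proving little.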
Remember, the original justification for client-side scanning from Apple was "detecting CSAM". Well, they backed away from that line of thinking, but they kept all the client-side scanning in iOS and macOS. It would be trivial for them to flag many other types of content and furnish that data to governments or third parties.
I didn't have it in my app drawer, but once I went to this link, it showed as installed. I uninstalled it ASAP.
https://play.google.com/store/apps/details?id=com.google.android.safetycore&hl=en-US
I also reported it as hostile and inappropriate. I am sure Google will do fuck all with that report but I enjoy being petty sometimes
Thanks for this, just uninstalled it. Google are arseholes.
I switched over to GrapheneOS a couple of months ago and couldn't be happier. If you have a Pixel, the switch is really easy. The biggest obstacle was exporting my contacts from my Google account.
Thanks for bringing this up, first I've heard of it. Not present on my GrapheneOS pixel, present on stock.
I suppose I should encourage Pixel owners to switch from stock to GrapheneOS. I know which device I'd rather spend time using: GrapheneOS, of course.
I just uninstalled it.
Anyone know what Android System Intelligence does? Should that be uninstalled as well?
Well then I hope they like seeing my butthole.
My older brother swipes through your phone's photos without asking, so I put some colonoscopy pictures in there. He hasn't tried to look at photos on my phone since.
Oh Google what have you done to yourself.
laughs in GrapheneOS
Even with the latest update from Samsung, I am not seeing this app. My OnePlus did get it with the February update and I had to remove it.
Is there any indication that Apple is truly more secure and privacy-conscious than Android? I'm kinda tired of Google and their oversteps.
For true privacy you'll want something like GrapheneOS on a Pixel, with no Google apps or anything. Some other ROM with no gApps as a second choice.
Other than that, Apple SEEMS to be mildly better. I'll give you an example: Apple pulls encryption feature from UK over government spying demands
While it's a bad thing that they pulled the encryption feature, it's a good sign: they are either unwilling or unable to add a backdoor for the UK security services. Then there was this case. If the article is to be believed, they started working on security as of iOS 8 so they could no longer comply with government requests. Today we're on iOS 18.
Apple claims their advertising ID is anonymized so third party apps don't know who you are. That said, they still have the advertising ID service so Apple themselves do know a whoooooole lot about you - but this is the same with Google.
Then regarding photo scanning: Apple received a LOT of backlash for their proposed photo-scanning feature. But it was only going to run on-device, on photos that were about to be uploaded to iCloud (so disabling iCloud would disable it too), and it was only going to report you if you had a LOT of child pornography on your phone; otherwise it was, supposedly, going to do absolutely nothing about the photos. It wasn't even supposed to be a categorization model, just a "Does this match known CSAM?" filter. Google and Microsoft had already implemented something similar, except they didn't scan your shit on-device.
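The distinction the comment draws — a match-against-known-hashes filter versus a categorization model — can be sketched directly. Real systems use perceptual hashes (NeuralHash, PhotoDNA) that survive resizing and recompression; SHA-256 below is an exact-match stand-in, and the database contents are invented:

```python
# Minimal sketch of a "does this match known content?" filter of the
# kind Apple proposed. It never describes or categorizes a photo; it
# only checks membership in a fixed set of known fingerprints.
# SHA-256 is an exact-match stand-in for a real perceptual hash.

import hashlib

KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),  # invented entry
}

def matches_known(image_bytes: bytes) -> bool:
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(matches_known(b"known-bad-image-bytes"))   # True: in the database
print(matches_known(b"family vacation photo"))   # False: no match, no report
```

Apple's proposal additionally required a threshold of multiple matches before anything was reported, which is the "only if you had a LOT" caveat above.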
At the end of the day, Apple might be a bit more private, but it's a wash: Apple isn't transparent and neither is Google. I like using their devices. Sometimes I miss the freedom of custom ROMs, but my damn banking apps stopped working on LineageOS and I couldn't be arsed to go back to using the banks' mobile websites like I had in the past. So I moved to iOS, since OnePlus had completely botched their Android experience while I'd been on LineageOS, and I was kinda pissed at what I had considered one of the last remaining decent Android manufacturers (Sonys are overpriced, I will never own a Samsung, and I didn't like my Huawei or Xiaomi much either).
So if you want to run custom ROMs, get a Pixel or something. If not, Apple is as good a choice as Android. A couple of years ago it was even the better choice, as you'd get longer software support, but now the others have started catching up due to all the consumer outrage.
The short answer is: Apple collects much of the same data as any other modern tech giant, but their "walled garden" strategy means that, for the most part, only THEY have access to that info.
It's technically lower risk, since fewer parties have access to the data, but philosophically just about equally bad, because they aren't doing this out of any real love for privacy (despite what their marketing department might claim).
Fuck these cunts.
Thanks. Just uninstalled. What cunts.
I uninstalled it, and a couple of days later, it reappeared on my phone.
Do we have any proof of it doing anything bad?
Taking Google's description of what it is at face value, it seems like a good thing. Of course we should absolutely assume Google is lying and it actually does something nefarious, but we should get some proof before picking up the pitchforks.
It didn't appear in my apps list, so I thought it wasn't installed. But when I searched for the app name, it appeared. So be aware.
More information: it's been rolling out to Android 9+ users since November 2024 as a high-priority update. Some users report that it installs while on battery and off Wi-Fi, unlike most apps.
App description on Play store: SafetyCore is a Google system service for Android 9+ devices. It provides the underlying technology for features like the upcoming Sensitive Content Warnings feature in Google Messages that helps users protect themselves when receiving potentially unwanted content. While SafetyCore started rolling out last year, the Sensitive Content Warnings feature in Google Messages is a separate, optional feature and will begin its gradual rollout in 2025. The processing for the Sensitive Content Warnings feature is done on-device and all of the images or specific results and warnings are private to the user.
Description by Google: Sensitive Content Warnings is an optional feature that blurs images that may contain nudity before viewing, and then prompts with a “speed bump” that contains help-finding resources and options, including the option to view the content. When the feature is enabled and an image that may contain nudity is about to be sent or forwarded, it also provides a speed bump to remind users of the risks of sending nude imagery and to prevent accidental shares. - https://9to5google.com/android-safetycore-app-what-is-it/
So it looks like something that sends pictures from your messages (at least initially) to Google for an AI to check whether they're "sensitive". The app is 44 MB, which seems too small to contain a useful AI, and I don't think this could happen on-phone, so wouldn't it require sending your on-phone data to Google?