Is there a human behind such decisions, or is it just an automated algorithm?
Maybe they could allow some of the text to show so that spiders can index the tweets correctly; something like the first 280 characters could work.
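One hypothetical way that could look server-side: render a crawler-visible preview capped at 280 characters, e.g. as a meta description. A minimal sketch, where the function name and the cap are my own illustrative assumptions, not anything Twitter actually does:

```python
# Hypothetical sketch: serve a short, crawler-visible preview of a tweet
# so search spiders still have something meaningful to index. The function
# name and the 280-character cap are illustrative assumptions.
from html import escape


def crawler_preview(tweet_text: str, limit: int = 280) -> str:
    """Return an HTML meta-description tag truncated to `limit` characters."""
    preview = tweet_text[:limit]
    if len(tweet_text) > limit:
        preview = preview.rstrip() + "…"
    return f'<meta name="description" content="{escape(preview, quote=True)}">'


if __name__ == "__main__":
    print(crawler_preview("Some long tweet text " * 30))
```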
Not entirely related, but I wonder how things like Lemmy/Mastodon/other fediverse platforms compare to Reddit/Twitter in terms of search engine indexing. Would posts like this even be indexed? Since posts are accessible through many instances, would they be indexed multiple times? Would this affect ranking?
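For what it's worth, search engines generally collapse mirrored pages using a `<link rel="canonical">` tag, so the duplicate/ranking question largely comes down to whether each instance's copy declares one. A rough sketch for checking that by hand (the instance URLs are placeholders, and `requests`/`beautifulsoup4` are assumed installed; this is not a claim about what Lemmy or Mastodon actually emit):

```python
# Rough sketch: check whether copies of the same fediverse post on
# different instances declare a <link rel="canonical"> pointing at one
# original URL, which is how search engines typically deduplicate.
# The URLs below are placeholders.
import requests
from bs4 import BeautifulSoup

POST_COPIES = [
    "https://lemmy.world/post/123456",          # placeholder URL
    "https://other-instance.example/post/789",  # placeholder URL
]

for url in POST_COPIES:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    target = canonical["href"] if canonical else "(none declared)"
    print(f"{url} -> canonical: {target}")
```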
Doesn’t sound like retaliation to me; it sounds like their scheduled web crawlers are finding that content they used to index is no longer viewable, and it is thus removed from search results. Pretty standard. My guess is that there were 400 million URLs listed, and as the crawler discovers they are no longer available, that number will keep dropping to reflect only publicly viewable content. If only 500 URLs are now publicly viewable (without logins), then that’s what they will index. Google isn’t a search engine for private companies (unless you pay for the service); they are a public search engine, so they make an effort to ensure that only public information is indexed. Some folks game the system (like the old expertsexchange.com), but sooner or later Google drops the hammer.
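That recrawl-and-drop behavior can be mimicked with a simple pass over an index: anything that now answers with an error, or redirects to a login page, gets removed. A minimal sketch, assuming placeholder URLs and a crude login-detection heuristic; this is an illustration of the idea, not how Google actually operates:

```python
# Minimal sketch of the recrawl behavior described above: revisit indexed
# URLs and drop any that are no longer publicly viewable (HTTP errors or
# redirects to a login page). URLs and the "login" heuristic are
# illustrative assumptions.
import requests

indexed_urls = [
    "https://example.com/public-post",   # placeholder
    "https://example.com/members-only",  # placeholder
]


def still_public(url: str) -> bool:
    resp = requests.get(url, timeout=10, allow_redirects=True)
    if resp.status_code != 200:
        return False
    # Crude heuristic: treat a redirect to a login page as a login wall.
    return "login" not in resp.url.lower()


indexed_urls = [u for u in indexed_urls if still_public(u)]
print(f"{len(indexed_urls)} URLs remain publicly viewable")
```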