this post was submitted on 30 Jan 2024
24 points (90.0% liked)

Lemmy Integrations


A community about all integrations with the lemmy API. Bots, Scripts, New Apps, etc.

submitted 9 months ago* (last edited 9 months ago) by [email protected] to c/[email protected]
 

Inspired by the bots on Reddit that respond to certain words, I've thrown together this code, which allows anyone to set up their own response bot.

There is a bit more detail on GitHub, but in summary: you set your own trigger word and responses, and there are two modes of operation. "Exclude", the default, covers every community you're federated with (and allows moderators of a community to PM the bot to exclude it); "Include" lets you pick a single community for the bot to be active in.
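For illustration only, the trigger/response configuration for a bot like this might look roughly as follows; the names here are hypothetical and not necessarily what the code on GitHub uses:

```python
# Hypothetical config sketch -- the actual variable names live in the GitHub repo.
CONFIG = {
    "trigger_word": "legolas",          # word that makes the bot reply
    "responses": [                      # one of these is used for each reply
        "They're taking the hobbits to Isengard!",
        "A red sun rises.",
    ],
    "mode": "exclude",                  # "exclude" (default): active in every federated
                                        # community; mods can PM the bot to opt out
    "include_community": None,          # set a single community name for "include" mode
}
```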

This is really early days and rough, but it should work at the most basic level. If anyone can provide ideas/feedback/improvements, I'm totally open to them.

And to prove it works, I'm running Legolas Bot: any comment you make below containing the word "legolas" will (probably) get a response.

Small update to reduce spamminess: the bot will only reply to top-level comments now.

Edit: Further small updates include customisable polling rates and the ability to tag the comment creator's name in a response.

[–] [email protected] 1 points 9 months ago* (last edited 9 months ago) (1 children)

Thanks, but this doesn't pull only unread comments. If I pull the latest 5 comments and then mark their parent posts as read, I get this:

2024-02-02 09:52:11,278 - INFO - Requesting API Request.GET /comment/list
2024-02-02 09:52:11,507 - INFO - Requesting API Request.POST /post/mark_as_read
Post ID = 9335073
Comment ID = 6915381
2024-02-02 09:52:11,629 - INFO - Requesting API Request.POST /post/mark_as_read
Post ID = 9007864
Comment ID = 6915380
2024-02-02 09:52:11,742 - INFO - Requesting API Request.POST /post/mark_as_read
Post ID = 9319139
Comment ID = 6915382
2024-02-02 09:52:11,916 - INFO - Requesting API Request.POST /post/mark_as_read
Post ID = 9334778
Comment ID = 6915379
2024-02-02 09:52:12,100 - INFO - Requesting API Request.POST /post/mark_as_read
Post ID = 9283396
Comment ID = 6915378

If I then pull the 5 latest comments again:

2024-02-02 09:52:12,238 - INFO - Requesting API Request.GET /comment/list
2024-02-02 09:52:12,380 - INFO - Requesting API Request.POST /post/mark_as_read
Post ID = 9335073
Comment ID = 6915381
2024-02-02 09:52:12,521 - INFO - Requesting API Request.POST /post/mark_as_read
Post ID = 9007864
Comment ID = 6915380
2024-02-02 09:52:12,673 - INFO - Requesting API Request.POST /post/mark_as_read
Post ID = 9319139
Comment ID = 6915382
2024-02-02 09:52:12,835 - INFO - Requesting API Request.POST /post/mark_as_read
Post ID = 9334778
Comment ID = 6915379
2024-02-02 09:52:12,977 - INFO - Requesting API Request.POST /post/mark_as_read
Post ID = 9283396
Comment ID = 6915378

Which is the same 5 comments. So what I'm looking for is a way to pull only previously "unseen" comments; that would reduce the amount of data returned from the API each time I check the list when there are only 1 or 2 new comments, rather than returning all 25.

Apps can indicate that there are new unread comments on a post, but I assume they're not doing this via the API and it's a UI thing to do with caching?

I may not have explained myself clearly here, though!
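For reference, the calls behind the log output above are roughly the following. This is a rough sketch against the v3 HTTP API; the instance URL and token handling are placeholders, and the auth style differs between Lemmy versions (0.19+ takes the JWT in an Authorization header, older versions used an auth parameter):

```python
import requests

INSTANCE = "https://lemmy.example"            # placeholder instance URL
HEADERS = {"Authorization": "Bearer <jwt>"}   # placeholder login token

# Pull the 5 newest comments across the instance.
comments = requests.get(
    f"{INSTANCE}/api/v3/comment/list",
    params={"sort": "New", "limit": 5},
    headers=HEADERS,
).json()["comments"]

# Mark each comment's parent post as read. As the logs show, this does not
# stop the same comments from coming back on the next /comment/list call.
# (Depending on the Lemmy version, the body field may differ from "post_id".)
for c in comments:
    requests.post(
        f"{INSTANCE}/api/v3/post/mark_as_read",
        json={"post_id": c["post"]["id"], "read": True},
        headers=HEADERS,
    )
```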

[–] [email protected] 2 points 9 months ago (1 children)

On GET /api/v3/post/list there is a field posts[0].unread_comments which the UI uses, probably based on the mark-as-read endpoint. But that doesn't give you the comments themselves. So I think it's better to call /api/v3/comment/list, say, once a minute; the amount of data returned is nothing to worry about. Still, if you want to minimize it, call with limit=1 and compare the comment to see how many you missed in between, then make additional requests for those comments you don't have yet.
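A sketch of that "peek, then backfill" idea might look something like this (untested; only the endpoint path and response field names come from the v3 API, everything else here is a placeholder):

```python
import requests

INSTANCE = "https://lemmy.example"            # placeholder
HEADERS = {"Authorization": "Bearer <jwt>"}   # placeholder token

last_seen_id = 0  # highest comment id already processed

def fetch_new_comments():
    """Peek at the single newest comment, and only pull a bigger page
    when its id shows we have missed something since the last poll."""
    global last_seen_id
    newest = requests.get(
        f"{INSTANCE}/api/v3/comment/list",
        params={"sort": "New", "limit": 1},
        headers=HEADERS,
    ).json()["comments"]
    if not newest or newest[0]["comment"]["id"] <= last_seen_id:
        return []  # nothing new since the last poll

    # We missed some comments: fetch a larger page and keep only the unseen ones.
    page = requests.get(
        f"{INSTANCE}/api/v3/comment/list",
        params={"sort": "New", "limit": 25},
        headers=HEADERS,
    ).json()["comments"]
    fresh = [c for c in page if c["comment"]["id"] > last_seen_id]
    last_seen_id = newest[0]["comment"]["id"]
    return fresh
```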

[–] [email protected] 2 points 9 months ago

Nice solution, thank you :)