Endless Danbooru Pages Greasemonkey Script

Posted under General

葉月 said:
It'd work better if you hadn't mistyped , for . in the domain names.

Doh. Yes, spelling the domain names correctly would help. A lot. Corrected now, thanks.

LaC said:
Bug 189308 has been fixed since 1.8.

A remnant of the old script I started off with. I guess I should go back and update them as well.

ttfn said:
I've updated the script to work with the new changes.

Let me know if I've made any more hideously embarrassing typos.

Thank you, sir, for the prompt response to my comment. It works better than ever now.

Shinjidude said:
I've noticed that this script doesn't do post highlighting properly

I don't think that information's accessible through the API, so I'll have to ask Albert if it's possible to add it.

Thanks a lot, ttfn!

A note on installation:
If you're logged into miezaru.donmai.us instead of danbooru.donmai.us, you'll have to change the "included pages" preference accordingly. Obvious to me in hindsight, but I looked around for a bit before noticing that.

Would it be possible to set the default to include both?
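
For what it's worth, the script's metadata block (or the "included pages" list in Greasemonkey's manage dialog) can simply list both domains; something like the following, where the name and the exact path patterns are just placeholders:

// ==UserScript==
// @name       Endless Danbooru Pages
// @include    http://danbooru.donmai.us/post*
// @include    http://miezaru.donmai.us/post*
// ==/UserScript==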

FWIW: I played around with doing something similar (in site scripts, not GM). Not to get rid of pagination, but rather to keep pagination and have indexes rendered client-side, in order to fix the "incomplete pages due to blacklists/deleted posts/priv posts" problem.

I was aiming to have rendering and pagination look and work roughly the same, but to push a larger set of post data (say, 10 pages of posts) in javascript, and to have a script render the page with as much of that data as necessary. (The extra data shouldn't be a performance problem; it can be cached.)

A tricky part is pagination. Since the script is rendering an arbitrary number of posts, the "next page" isn't a multiple of a page size; it's a fractional number. It would be strange to navigate to page 3, then navigate to it a different way, and get a different set of posts even though no posts have actually changed. Displaying "3.2" as the current page would be terrible. I tried making the paginator a percentage, but I didn't like the look of that.
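
To make the fractional-page problem concrete with made-up numbers: say the server pages by 20 posts, but 7 of the first 20 are blacklisted or deleted, so the script pulls 7 more to fill the first screen. The second screen then starts partway into server page 2:

var perPage  = 20;
var consumed = 27;                              // raw posts used up by the first screen
var nextScreenStart = 1 + consumed / perPage;   // 2.35 -- not something you'd put in a paginator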

(Of course, it's impossible to actually know in advance how many pages to display in the paginator, since the script doesn't have access to all posts at once.)

All in all, while I don't think this concept is unsalvageable, I don't like the way it was turning out. I'm open to ideas, though, either to make this work or to fix the index-holes problem some other way, and more generally to deal with the inability to render indexes dynamically in any way without breaking caching.

I don't know about here, but a fairly big problem on moe is that when people P/C posts, nobody ever sees the child posts. I'd like to have a way to let users "minimize" a pooled set that they're not interested in. This is the problem I was ultimately aiming to solve with this.

That is a tricky one - like you say, once you start getting an arbitrary number of posts you lose the relationship between the pages of images you're displaying and the pages in the API. You'd end up displaying half a page of images and then repeating them when the user clicks next.

I think the only way you could make it work is to drop the page numbers entirely and limit the navigation to previous and next, but that's not very satisfactory and you'd lose the bookmarkability of the pages.

If you just want to fix the index holes problem, you could write a script to count the number of non-blacklisted images displayed and, if it's less than say 10, use the API to load the next page and append it to the current one, then rewrite the page numbering and next links to reflect that. (And if it needs to, it can keep requesting pages until it has enough non-blacklisted images to pass the threshold.)
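
A very rough sketch of that idea, for illustration only; the selector, the blacklist check, and the appendThumbnails/fixUpPaginator helpers are all guesses, but /post/index.json is the normal JSON API:

var MIN_VISIBLE = 10;   // arbitrary threshold

function visibleThumbCount() {
    // assumes blacklisted thumbnails are simply hidden; the real markup may differ
    var thumbs = document.getElementsByClassName('thumb');
    var count = 0;
    for (var i = 0; i < thumbs.length; i++)
        if (thumbs[i].style.display != 'none') count++;
    return count;
}

function fillHoles(tags, page) {
    if (visibleThumbCount() >= MIN_VISIBLE) return;
    GM_xmlhttpRequest({
        method: 'GET',
        url: 'http://danbooru.donmai.us/post/index.json?tags=' +
             encodeURIComponent(tags) + '&page=' + (page + 1),
        onload: function (response) {
            var posts = JSON.parse(response.responseText);
            if (posts.length == 0) return;   // nothing left to pull from
            appendThumbnails(posts);         // hypothetical: build and append thumbnail nodes
            fixUpPaginator(page + 1);        // hypothetical: rewrite page numbers / next link
            fillHoles(tags, page + 1);       // keep going until the threshold is met
        }
    });
}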

I think the only thing you'd need to add to the API for a pure JS interface is a way to get the number of posts for a given tag (or combination of tags), so you could handle the pagination yourself.
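
Given such a count (the endpoint itself is hypothetical at this point), sizing the paginator client-side would be trivial:

function totalPages(postCount, postsPerPage) {
    // e.g. totalPages(12345, 20) == 618
    return Math.ceil(postCount / postsPerPage);
}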

I do like the idea of minimizing posts and making child posts more accessible. I'll have to have a think about that.

When navigating to the next page, it passes the last post ID that was displayed; the rendering on the next page then starts working at that point, so it doesn't normally repeat posts (other than when new posts arrive and shift things, of course). There's some serious ugliness here, too: when navigating backwards, we have to send the *first* post displayed, and a flag indicating "backwards", so the script starts from there and works backwards. (I really don't like that; it's weird and ugly.)
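
Roughly speaking, the links end up carrying a boundary post ID plus a direction; the parameter names below are made up, just to show the shape of it:

// forward:  carry the last post that was shown, render onward from there
// backward: carry the first post that was shown, plus a "backwards" flag
function nextPageUrl(tags, lastShownId) {
    return '/post?tags=' + encodeURIComponent(tags) + '&after_id=' + lastShownId;
}
function prevPageUrl(tags, firstShownId) {
    return '/post?tags=' + encodeURIComponent(tags) +
           '&before_id=' + firstShownId + '&dir=back';
}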

I have it send the JSON data in the page, because I don't want to have extra latency (and server load) by requesting more pages. Also, you need to know the state of the page in order to render a paginator; I don't want it changing the paginator dynamically (clicking a link and having it change right before you click it is evil, etc). That's why I had it send a fair bit of data. I did have it requesting data dynamically earlier on; removing it and doing it all statically is a major simplification.
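
In other words, something along these lines gets embedded in the served page; the variable and field names here are invented, but it's the general shape of the data the client script renders from:

var preloadedPosts = [
    { id: 123456, preview_url: "/data/preview/aaa.jpg", tags: "touhou solo" },
    { id: 123455, preview_url: "/data/preview/bbb.jpg", tags: "touhou 2girls" }
    // ...roughly ten pages' worth, so several screens can be rendered (and the
    // paginator computed) without another round-trip to the server
];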

I'm leery about doing any kind of dynamic requests in the normal case for server load reasons, too. (It probably doesn't matter if a Greasemonkey script does a lot of JSON requests, since ultimately a tiny minority of people will use it; it's another matter for a script internal to the site that's used by everyone.)

Another issue (that's actually newer than my working on this): you can unhide blacklisted posts. If the script has filled in the holes, unhiding may cause lots of posts to be displayed. (If all of the other issues can be figured out, this is probably an acceptable oddity, though.)
