One of my feeds typically produces some 150 to 300 entries over the course of the night. I went to bed late last night, so there were only 144 unread entries in the morning. NewsBlur, however, only showed exactly 100, and I couldn't scroll further into the past. Is that a temporary limitation that will be lifted again when it’s migrated?
I can’t lose any of those entries, and I need to be able to see all past read entries as well.
My account name is Ponyfeeder (premium account). Below a screenshot.
- 18 Posts
- 19 Reply Likes
Posted 2 years ago
- 15 Posts
- 4 Reply Likes
If this is another hard limitation I'm just going to cancel my premium and switch to TinyRSS.
- 18 Posts
- 19 Reply Likes
Oh c’mon, it’s not even a bug!
It’s Feed.trim_feed that cuts off the feeds and discards all my unread entries before I can read them in the morning.
A feed with only one premium subscriber—me—is trimmed to 100 entries.
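To illustrate the behavior being described, here is a minimal sketch of subscriber-scaled trimming. The function names echo `Feed.trim_feed`, but the tier thresholds and signatures are assumptions based on the numbers quoted in this thread, not NewsBlur's actual code.

```python
# Hypothetical sketch of subscriber-based feed trimming as described above.
# Tier thresholds are taken from figures quoted in this thread and are
# assumptions, not NewsBlur's real implementation.

def trim_cutoff(active_subscribers, premium_subscribers):
    """Return how many stories a feed keeps, scaled by its subscriber count."""
    if premium_subscribers >= 4 or active_subscribers >= 31:
        return 300          # popular feeds keep more history
    if premium_subscribers >= 2 or active_subscribers >= 10:
        return 200
    return 100              # single-subscriber feeds are trimmed hardest

def trim_feed(stories, active_subscribers, premium_subscribers):
    """Keep only the newest N stories; anything older is deleted outright."""
    cutoff = trim_cutoff(active_subscribers, premium_subscribers)
    return stories[:cutoff]  # stories assumed sorted newest-first
```

The key point is that the cutoff depends on subscriber counts, so a feed with a single premium subscriber falls into the lowest tier.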
And I was worried about how to import my backlog of tens of thousands of entries I need to keep a record of.
What is this trimming good for? You have a cluster of 4+ servers running MongoDB, which has “humongous” in its name. I’m sure you can store more than a few hundred entries per feed.
This is a complete showstopper for me. A backlog of 1,000 entries would be barely acceptable, since we would at least get to read them, but Google Reader has always functioned as an archive that we could use in our research. The lack of a search function is already a problem in that regard.
@happy.cerberus Yep, I’ve used TinyTinyRSS for a while, and our server doesn’t have much load anyway. It works great, I just don’t like the frontend so much, but that should be the least of my worries.
I would like to see a response to this too - I've been coming back to this topic hoping for one, and given the developer's responsiveness, I'm surprised not to see one yet.
- 18 Posts
- 19 Reply Likes
tonyduckles has already posted this issue on GitHub, including code for an alternative date-based approach. Also no response.
- 18 Posts
- 19 Reply Likes
Setting up Tiny Tiny RSS was a breeze. Now I can compare better. (Careful, version 1.7.5 is buggy; I used the dev version from the repo where it is already fixed.)
Major bummer, yeah. Hopefully such issues will be fixed soon, after scaling work is done.
- 1 Post
- 1 Reply Like
I inquired via Twitter about the maximum number of unread items that NewsBlur will keep track of. The developer was good enough to respond: he stated that the current cap is 300 items, though he said this limit only applies to popular feeds. He also said he'd be raising it in the future.
I'm in the same situation. I follow several photoblogs. Some of them produce 500 posts per day easily.
- 18 Posts
- 19 Reply Likes
Let’s hope so.
I linked the exact numbers above. 300 is the limit for feeds with 31 to 50 subscribers in total, or 4 to 5 premium subscribers.
I don't think RSS is the right use case for those kinds of high-volume feeds...
Maybe it should work as you expect and not just cut you off, but I believe your approach to the problem is flawed. In my opinion, RSS is best used for feeds with relatively low output, not 300-posts-per-night "monster feeds".
- 15 Posts
- 4 Reply Likes
I love how people try to impose their own habits on other users.
RSS is an absolute necessity for sites with huge traffic and sites that don't have a regular update schedule.
You can't manually track such sites.
What do you suggest as an alternative to RSS in these 'high-volume feeds'?
1) Your use case is not the only true use case.
2) The cutoff can actually be as low as 100 posts.
3) Google reader works fine here.
The issue is not only about 300-posts-per-night feeds, but also about 20+ posts/day news feeds that I read once a week, and similar cases.
- 2 Posts
- 0 Reply Likes
Samuel Clay, Official Rep
- 5251 Posts
- 1172 Reply Likes
The limit is 500 stories for most feeds, but single subscriber feeds can be as low as 100. That's the limit to the server and I won't be able to raise it until I can handle the current load. You're free to run your own hosted instance, but the limits are in place for a reason.
- 11 Posts
- 3 Reply Likes
You should publish these limits in a clearly visible place. That would be much more helpful than people running into them and taking months to discover the cause of the problem.
I wish this were more visible; I'm not sure I would have purchased if I had known the limits were along these lines. It's not uncommon for me to end up with thousands of things unread that I'll go through in batches when I have time.
For reference, I pay $16/year for 10GB of email storage (that's years of not deleting anything for me). $24 for 100 messages per feed? I understand that there are growing pains at present, so I don't feel too bad about this for now, but do understand that it seems unreasonable in the long term.
I mean this in the most positive and supportive way possible. I look forward to when you have made it through the big transition. Have a great weekend!
Samuel Clay, Official Rep
- 5251 Posts
- 1172 Reply Likes
Headsean, if you were on Reader, then I doubt you had more than a thousand unread items, as Reader itself had a 1,000-item max. In fact, I cut Reader's limits in half for both stories and unread states, which is why I have the defaults I have.
The only limit that I know of is marking items as read after one month, but they are still accessible after that. There are no quantity limits.
- 36 Posts
- 3 Reply Likes
This is not only about unread items, these stories are *completely* deleted from the feed. Google Reader never did that.
- 7 Posts
- 3 Reply Likes
Reader easily handled thousands of unread items. It just didn't show the correct unread count. It has *never* marked anything as read that you haven't actually read yet.
^^ This. I would actually be happy if the counts just stopped after ~1000 (or even 500), but keeping the read state of *everything* is the whole point of using a reader in the first place.
I'm sorry Sam, but that simply isn't even close to true.
http://imgur.com/Qz4u99M
It would take an incredible amount of time, but I can guarantee that each of those entries would be counted down until 0 if I went through the standard view pane, story by story.
That screenshot is showing folders, which is slightly different.
There's a picture of a single feed with 1k+. I can browse through that feed for quite some time before the plus gets dropped and it starts counting down. Only the display gets capped at 1000.
- 18 Posts
- 19 Reply Likes
As far as I’m aware, there is a 1,000-entry limit on the number of entries (or their statuses) that can be retrieved by a client in a single request. Maybe Sam mixed that up. (I’ve never worked with the API myself, though, and only skimmed through the Lightread sources to find some (perceived) bugs, so I’m hazy on the details.)
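If the limit really is per-request, a client can page around it rather than lose data. Here is a hedged sketch of that idea; the `fetch_page` callable is a stand-in for whatever the real API call would be, not the actual NewsBlur or Reader endpoint.

```python
# Sketch of paging around a per-request cap like the one described above:
# if one request returns at most `per_page` entries, keep requesting pages
# until a short batch signals the end. `fetch_page` is a hypothetical
# stand-in, not a real NewsBlur/Reader API call.

def fetch_all(fetch_page, per_page=1000):
    """fetch_page(page) -> list of at most per_page entries (1-indexed)."""
    entries, page = [], 1
    while True:
        batch = fetch_page(page)
        entries.extend(batch)
        if len(batch) < per_page:   # a short batch means we reached the end
            return entries
        page += 1
```

This is how most Reader-compatible clients handled the cap in practice: the limit bounded a single response, not the total a client could eventually retrieve.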
- 18 Posts
- 19 Reply Likes
Welp, sorry such a detail had to spoil the otherwise fluffy service. I canceled the PayPal subscription and deleted my two feeds to save you the traffic.
Please feel free to downgrade the account (Ponyfeeder) and donate the difference to Your Siblings. :-)
And please update this thread when you have lifted the limits. I get email notifications of new posts here.
Samuel, it'd be useful – at least – to have this limit shown on the feed's statistics page ("This feed is limited to X stories").
That shouldn't be that hard, IMHO. Feed.trim_feed() needs to return trim_cutoff, which is then saved and shown in /rss_feeds/statistics. Still better than nothing :-)
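A minimal sketch of that suggestion, assuming a simplified `Feed` model; the attribute and method shapes here are hypothetical, chosen only to show where the cutoff would be saved and surfaced.

```python
# Sketch of the suggestion above: have Feed.trim_feed() return its cutoff so
# the statistics page can display it. This simplified class is hypothetical,
# not NewsBlur's actual model.

class Feed:
    def __init__(self, stories):
        self.stories = stories       # assumed sorted newest-first
        self.trim_cutoff = None      # last cutoff used, shown in statistics

    def trim_feed(self, cutoff=100):
        """Trim to the newest `cutoff` stories and remember the cutoff."""
        self.stories = self.stories[:cutoff]
        self.trim_cutoff = cutoff    # save it instead of discarding it
        return cutoff

    def statistics(self):
        # Enough data to render "This feed is limited to X stories."
        return {"story_count": len(self.stories),
                "trim_cutoff": self.trim_cutoff}
```

The only real change over silently trimming is that the cutoff is returned and stored, so `/rss_feeds/statistics` could show it.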
Wow. @Samuel, you never mentioned any of these limits and now I'm just disappointed. I'll be switching over to something else and canceling NewsBlur, won't even recommend it to anybody else.
To add to the reasons why this is a horrible design choice: it totally screws up personalized feeds. Examples would be feeds that have a hash in them because they were paid for (e.g., Ars Technica), or feeds that are specific in some other way (e.g., a Flickr friends feed, a Yahoo Pipes feed, etc.).
- 1 Post
- 1 Reply Like
That's a serious bummer. I subscribe to a feed that generates about 50 entries per hour...
- 1 Post
- 2 Reply Likes
Sorry, but this is awful. I bought NewsBlur to replace Google Reader, but it wasn't a good idea. Digg Reader, here I come.