I have a site that relies on the aggregator plugin. This site has aggregated a LOT of content. The site used to reside on its own physical server, but that machine was dying and I was tired of maintaining it, so I moved the site to Rackspace.
Everything about the site works just fine with 16 MB allocated to PHP. However, the aggregator plugin requires 128 MB or it crashes! Since this is a server-wide setting, it limits the number of concurrent visitors a 2 GB server can handle. Why does the aggregator plugin require such a huge amount of RAM, and can anything be done to reduce the requirement?
EDIT: I am using version 0.25 of the plugin. I upgraded to the latest (0.29?). This latest version re-aggregates EVERYTHING, even entries that have already been aggregated and should be skipped.
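On whether the limit really has to be server-wide: depending on the hosting setup, PHP's memory_limit can often be raised for just the directory running the aggregator. A sketch, assuming Apache with mod_php (under CGI/FastCGI this directive is ignored and a per-pool or per-directory php.ini override is needed instead):

```apache
# .htaccess in the s9y installation directory (Apache + mod_php assumed):
php_value memory_limit 128M
```

Alternatively, a PHP script may raise its own limit at runtime with ini_set('memory_limit', '128M'), provided the host does not forbid changing it.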
Aggregator plugin requires huge amount of memory
-
Don Chambers
- Regular
- Posts: 3657
- Joined: Mon Feb 13, 2006 2:40 am
- Location: Chicago, IL, USA
- Contact:
-
garvinhicking
- Core Developer
- Posts: 30022
- Joined: Tue Sep 16, 2003 9:45 pm
- Location: Cologne, Germany
- Contact:
Re: Aggregator plugin requires huge amount of memory
XML parsing of multiple sites creates a huge memory demand. You can only reduce it by subscribing to smaller sites. You might want to subscribe to Yahoo Pipes! feeds instead, where you reduce the number of items, for example...
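To illustrate why parsing whole feeds drives memory up: a DOM-style parser holds the entire document tree in RAM at once, while a streaming parser can discard each item as soon as it has been handled. A minimal Python sketch of the two approaches (illustrative only; the plugin's Onyx/Magpie parsers are PHP, and the feed below is synthetic):

```python
import io
import xml.etree.ElementTree as ET

# A small synthetic RSS feed with 24 items, mimicking one s9y feed.
rss = io.BytesIO(
    b"<rss><channel>"
    + b"".join(b"<item><title>Entry %d</title></item>" % i for i in range(24))
    + b"</channel></rss>"
)

# DOM-style: the whole tree lives in memory at once (what a large feed forces).
tree = ET.parse(rss)
titles_dom = [t.text for t in tree.iter("title")]

# Streaming: process one <item> at a time and free it immediately.
rss.seek(0)
titles_stream = []
for event, elem in ET.iterparse(rss, events=("end",)):
    if elem.tag == "item":
        titles_stream.append(elem[0].text)
        elem.clear()  # drop the item's children -> memory stays bounded

assert titles_dom == titles_stream
```

The streaming approach's peak memory is roughly one item, not the whole feed, which is why a parser change (or smaller feeds) reduces the footprint.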
As for the 0.29 problem - what do you mean? That every run duplicates your feed entries? What are your settings exactly (most importantly, the parsing engine used)?
Regards,
Garvin
# Garvin Hicking (s9y Developer)
# Did I help you? Consider making me happy: http://wishes.garv.in/
# or use my PayPal account "paypal {at} supergarv (dot) de"
# My "other" hobby: http://flickr.garv.in/
-
Don Chambers
- Regular
- Posts: 3657
- Joined: Mon Feb 13, 2006 2:40 am
- Location: Chicago, IL, USA
- Contact:
Re: Aggregator plugin requires huge amount of memory
The site in question is aggregating specific categories from 8 other s9y sites. s9y is configured to show 24 entries per feed. Would reducing the number of entries in the feeds reduce the memory requirements? Does the number of entries per feed even apply if I am fetching the feed this way:
Code: Select all
http://www.mysite.com/rss.php?serendipity[category]=1;2;4;5;9;13&version=1.0
In other words, although I have s9y configured to display 24 entries per feed, does the method above result in many more than 24 entries? EDIT: I tried it myself; there are exactly 24 entries in the above example feed.
EDIT #2: In the above example, I am getting the most recent 24 entries drawn from 6 different categories. Would it be more efficient from a RAM perspective to instead get only 4 entries from each of these categories, one category at a time? I.e.:
Code: Select all
http://www.mysite.com/rss.php?serendipity[category]=1&version=1.0
http://www.mysite.com/rss.php?serendipity[category]=2&version=1.0
http://www.mysite.com/rss.php?serendipity[category]=4&version=1.0
http://www.mysite.com/rss.php?serendipity[category]=5&version=1.0
http://www.mysite.com/rss.php?serendipity[category]=9&version=1.0
http://www.mysite.com/rss.php?serendipity[category]=13&version=1.0
Plugin Configuration:
Parser: Onyx
Save aggregated entries as: publish
Expire content: 0
Expire checksums: 0
Ignore updates?: No
Remove dependent entries?: No
Allow comments to this entry: No
Disable markup plugins for aggregated entries: NL2BR
Debug Output: No
I reverted from the latest version back to 0.25. I am nearly certain the latest version was creating duplicate entries. There is a table that tracks what has already been aggregated, and it was as if that table was being ignored, or perhaps entries were not being written to it. The site is live, and the problem was enormous when it happened, so I really do not want to risk upgrading again. Setting this up in a test environment would be time-consuming, but I will consider doing so if absolutely necessary.
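The one-category-at-a-time approach above can be sketched as follows (Python for illustration; the base URL and category IDs are taken from the post, and www.mysite.com is a placeholder). Fetching and parsing six small feeds sequentially bounds peak memory at the size of the largest single feed rather than the combined one:

```python
# Build the six per-category feed URLs described in the post.
categories = [1, 2, 4, 5, 9, 13]
base = "http://www.mysite.com/rss.php?serendipity[category]={}&version=1.0"
urls = [base.format(c) for c in categories]

# Each URL would then be fetched and parsed one at a time,
# so only one feed's XML is in memory at any moment.
for url in urls:
    print(url)
```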
=Don=
-
garvinhicking
- Core Developer
- Posts: 30022
- Joined: Tue Sep 16, 2003 9:45 pm
- Location: Cologne, Germany
- Contact:
Re: Aggregator plugin requires huge amount of memory
Hi!
Reducing the number of entries per feed on the s9y sites would help, yes. The fewer items, the less XML to parse. And the RSS Item limit applies to every RSS feed, yes.
Reducing the number of entries per feed to 4 would also help in terms of memory, but in terms of script running time it would take longer...
I'd suggest switching to the Magpie parser as a test; it should consume much less memory.
About the duplication of items, I'm afraid I don't really know where the problem is unless someone like you could set up a test environment where it is reproducible...
HTH,
Garvin
# Garvin Hicking (s9y Developer)
# Did I help you? Consider making me happy: http://wishes.garv.in/
# or use my PayPal account "paypal {at} supergarv (dot) de"
# My "other" hobby: http://flickr.garv.in/