External feed has duplicate rows causing problems
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
Mahara | Fix Released | High | Unassigned | 1.9.0
Bug Description
Older sites can end up with multiple rows in the blocktype_externalfeed_data table for the same feed URL.
This causes problems when one tries to add a new RSS feed block to a page if there is more than one row in the database for the same feed URL.
[WAR] bc (lib/errors.php) [truncated]
Call stack (most recent first):
  log_...
  log_...
  SQLException
  get_...
  get_...
  ensure_...
  PluginBlock...
To sort this out we are going to need a script in lib/db/upgrade.php to handle clearing out the duplicate rows and making sure each block instance's configdata now points to the one row left in the db.
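As a rough sketch, the upgrade step could take the following shape (the version number and log message here are placeholders, not the real values from the eventual fix):

    if ($oldversion < 2013120200) {
        log_debug('Clean up duplicate rows in blocktype_externalfeed_data');
        // 1. find feed urls that have more than one row
        // 2. pick one 'true' row per feed and repoint block instances at it
        // 3. delete the now-orphaned duplicate rows
        // (see the step-by-step outline further down)
    }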
Robert Lyon (robertl-9) wrote: #1
Robert Lyon (robertl-9) wrote: #2
Changed in mahara:
status: New → In Progress
Changed in mahara:
milestone: 1.8.1 → 1.8.2
Reviewed: https:/
Committed: http://
Submitter: Robert Lyon (<email address hidden>)
Branch: master
commit 887d5deb3cde9fb
Author: Robert Lyon <email address hidden>
Date: Mon Dec 2 13:53:18 2013 +1300
Removing duplicate feed url rows (bug #1239461)
Older sites can end up with duplicate rows in the
blocktype_externalfeed_data table. This causes problems when
the site is using latest mahara as it stops one being able to add
/edit a feed if there are duplicate rows for that feed's url.
Change-Id: I47ee3952dbe75b
Signed-off-by: Robert Lyon <email address hidden>
Signed-off-by: Aaron Wells <email address hidden>
Changed in mahara:
status: In Progress → Fix Committed
Changed in mahara:
status: Fix Committed → Fix Released
Changed in mahara:
status: Fix Released → Fix Committed
milestone: 1.8.2 → 1.9.0
Changed in mahara:
status: Fix Committed → Fix Released
Patch for "master" branch: https:/
Aaron Wells (u-aaronw) wrote: #4
Another patch for this one which accidentally got pushed directly to the 1.9_STABLE branch (back in April) during the 1.9.0 release process. Consequently it didn't get into gerrit, and didn't get forward-ported to master.
Aaron Wells (u-aaronw) wrote: #5
Commit: 4b39ed5d22df0c4
Reviewed: https:/
Committed: http://
Submitter: Aaron Wells (<email address hidden>)
Branch: master
commit 4adb421b011ee4b
Author: Aaron Wells <email address hidden>
Date: Tue Apr 15 16:50:23 2014 +1200
External feed block upgrade should be in the block's upgrade script
Change-Id: Ib574072c06aec2
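In other words, the dedupe logic belongs in the blocktype's own upgrade file rather than core's lib/db/upgrade.php. A sketch of where that would live, assuming Mahara's usual xmldb_blocktype_<name>_upgrade convention (the version number is a placeholder):

    // blocktype/externalfeed/db/upgrade.php
    function xmldb_blocktype_externalfeed_upgrade($oldversion = 0) {
        if ($oldversion < 2014041500) {
            // same dedupe steps as outlined below
        }
        return true;
    }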
Thoughts on sorting this out - we will need a script that does the following:
Step 1: Find the duplicate feed urls (1 db call):

SELECT COUNT(url), url, authuser, authpassword FROM {blocktype_externalfeed_data} GROUP BY url, authuser, authpassword HAVING COUNT(url) > 1 ORDER BY COUNT(url);
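A sketch of running that through Mahara's DML layer (get_records_sql_array is assumed here; an alias is added so the count survives into each returned row):

    $duplicates = get_records_sql_array("
        SELECT COUNT(url) AS dupcount, url, authuser, authpassword
        FROM {blocktype_externalfeed_data}
        GROUP BY url, authuser, authpassword
        HAVING COUNT(url) > 1
        ORDER BY COUNT(url)", array());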
Step 2: Get the feed ids and use the first one as the feed id for the block instances (many db calls, depending on how many rows from step 1 have duplicates):

SELECT id FROM {blocktype_externalfeed_data} WHERE url = ? AND authuser = ? AND authpassword = ?;

- add the ids to $array[$duplicatefeed][] and use $array[$duplicatefeed][0] as the 'true' id, as in the sketch below
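A sketch of that collection step (get_column_sql returning a flat array of ids is assumed; ORDER BY id makes the choice of 'true' id deterministic):

    $array = array();
    foreach ($duplicates as $dup) {
        // all row ids sharing this url/auth combination, oldest first
        $array[$dup->url] = get_column_sql("
            SELECT id FROM {blocktype_externalfeed_data}
            WHERE url = ? AND authuser = ? AND authpassword = ?
            ORDER BY id",
            array($dup->url, $dup->authuser, $dup->authpassword));
    }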
Step 3: Find the block instances using external feeds (1 db call). Check whether each one is using the 'true' id and update it accordingly:

SELECT id FROM {block_instance} WHERE blocktype = 'externalfeed';

- get the BlockInstance object for each returned $blockid, read its feedid from configdata, and repoint it if needed:

foreach ($blockids as $blockid) {
    $blockinstance = new BlockInstance($blockid);
    $configdata = $blockinstance->get('configdata');
    if (!empty($configdata['feedid'])) {
        foreach ($array as $duplicatefeed => $ids) {
            foreach ($ids as $key => $id) {
                if ($id == $configdata['feedid'] && $key != '0') {
                    // not using the first instance id of this feed
                    $configdata['feedid'] = $ids[0];
                    $blockinstance->set('configdata', $configdata); // marks the instance dirty
                    $blockinstance->commit(); // (possibly very many db calls)
                    break;
                }
            }
        }
    }
}
Step 4: Delete the now-orphaned feed row(s) (possibly very many db calls):

foreach ($array as $duplicatefeed => $ids) {
    foreach ($ids as $key => $id) {
        if ($key != '0') {
            // DELETE FROM {blocktype_externalfeed_data} WHERE id = ?
            delete_records('blocktype_externalfeed_data', 'id', $id);
        }
    }
}
Currently I'm thinking that this approach may use too many db calls, but I'm not sure how to get the feedid out of the configdata column in an easier way. Maybe a preg_match on the serialized data?
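For what it's worth, a sketch of that preg_match idea, assuming configdata is a PHP-serialized array in which feedid is stored as an integer (so it appears as "feedid";i:<n>; in the blob):

    $records = get_records_sql_array("
        SELECT id, configdata FROM {block_instance}
        WHERE blocktype = 'externalfeed'", array());
    foreach ($records as $rec) {
        // pull the feedid straight out of the serialized blob,
        // avoiding a BlockInstance instantiation per block
        if (preg_match('/"feedid";i:(\d+);/', $rec->configdata, $matches)) {
            $feedid = (int) $matches[1];
            // ... compare $feedid against the duplicate map as in step 3
        }
    }

That would cut step 3 down to one db call plus one update per stale block, at the cost of depending on the serialization format.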