
Search fails after running compressOld.php
Closed, ResolvedPublic

Description

Author: dharmaweb

Description:
This is the error I get:

Fatal error: Call to undefined function: uncompress()
in /public/vhost/d/dharmaweb/html/includes/HistoryBlob.php on line 256

I entered the value "test" and it failed; it works with some other values such as "van".

Here is the failed URL:
http://www.dharmaweb.org/index.php/Special:Search?search=test&go=Go

Here is the normal URL:
http://www.dharmaweb.org/index.php/Special:Search?search=van&go=Go

The bug appeared right after I ran the compressOld.php script.


Version: 1.5.x
Severity: normal
OS: Linux
URL: http://www.dharmaweb.org

Details

Reference
bz4268

Event Timeline

bzimport raised the priority of this task to Medium. Nov 21 2014, 9:00 PM
bzimport set Reference to bz4268.
bzimport added a subscriber: Unknown Object (MLST).

robchur wrote:

At first glance, without thinking about it too much, it looks like you're missing a
library or extension needed to handle decompression. Off the top of my head, I
don't know whether it's a built-in one (which would indicate a dodgy/broken PHP
config) or something else, but that's what it appears to be.

What version of PHP?
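One quick way to check the diagnosis above is to probe for the compression functions directly. This is a hedged sketch, not part of the original thread: MediaWiki's history compression relies on zlib's gzdeflate()/gzinflate(), and the bz2 extension provides bzcompress(); the exact set a given MediaWiki version needs may differ.

```php
<?php
// Sketch: report whether the compression functions that MediaWiki's
// history compression depends on exist in this PHP build.
// gzdeflate/gzinflate come from the zlib extension, bzcompress from bz2.
$needed = array( 'gzdeflate', 'gzinflate', 'bzcompress' );
foreach ( $needed as $fn ) {
	echo $fn, ': ', function_exists( $fn ) ? "available\n" : "MISSING\n";
}
```

A missing function here would point at a PHP build compiled without the corresponding extension rather than at a MediaWiki bug.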

dharmaweb wrote:

We're running PHP version 4.3.11. The BZip2 module, version 1.0.2, is enabled, as
well as ZLib version 1.1.4.

This call failing probably indicates that unserialize() failed in some way. Stick
something like var_dump($obj) in on the prior lines to see what you've got.
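The "double-serialization bug" that HistoryBlob.php corrects for can be illustrated with a small standalone sketch (using a plain stdClass stand-in, no MediaWiki classes): a value serialized twice comes back from the first unserialize() as a string rather than an object, which is exactly what the is_object() guard detects.

```php
<?php
// A value that was accidentally serialized twice: the first
// unserialize() yields a string, not an object.
$obj = new stdClass();
$obj->text = 'hello';
$doubly = serialize( serialize( $obj ) );

$decoded = unserialize( $doubly );
var_dump( is_object( $decoded ) );   // false: still a serialized string

// Correct for the double-serialization bug, as HistoryBlob.php does:
if ( !is_object( $decoded ) ) {
	$decoded = unserialize( $decoded );
}
var_dump( is_object( $decoded ) );   // true
```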

dharmaweb wrote:

This is what I get when I added the var_dump($obj) line
to the HistoryBlob.php include file:

if( !is_object( $obj ) ) {
	// Correct for old double-serialization bug.
	$obj = unserialize( $obj );
}

var_dump($obj)

// Save this item for reference; if pulling many
// items in a row we'll likely use it again.
$obj->uncompress();
$wgBlobCache = array( $this->mOldId => $obj );

Parse error: parse error, unexpected T_VARIABLE
in /public/vhost/d/dharmaweb/html/includes/HistoryBlob.php on line 258

You need a semicolon at the end of the line.

dharmaweb wrote:

object(concatenatedgziphistoryblob)(6) {
  ["mVersion"]=> int(0)
  ["mCompressed"]=> bool(false)
  ["mItems"]=> array(3) {
    ["ccd26dfde854fdb73601545c58811298"]=> string(73370) "

Long text of the article here...

    "
  }
  ["mDefaultHash"]=> string(32) "10dea8ff26d89a7e83531f694ad7b535"
  ["mFast"]=> int(0)
  ["mSize"]=> int(0)
}
object(historyblobstub)(3) {
  ["mOldId"]=> string(4) "3029"
  ["mHash"]=> string(32) "1b661dabd3208de1c359fd47713ad77a"
  ["mRef"]=> NULL
}

dharmaweb wrote:

Do you know how to undo the CompressOld process?

dharmaweb wrote:

I just found out that viewing some articles hits the same problem.

Here is the error, along with what the var_dump($obj); statement produces:

object(historyblobstub)(3) { ["mOldId"]=> string(4) "1791" ["mHash"]=> string
(32) "795e63467eeabd9e8f5548fedd0a0794" ["mRef"]=> NULL }
Fatal error: Call to undefined function: uncompress()
in /public/vhost/d/dharmaweb/html/includes/HistoryBlob.php on line 258

Is there a way to uncompress? Can I just take the current articles, reinstall
the database, and import them back in later?

tderouin wrote:

It turns out that when you run compressOld.php on your table, you can lose some
data.

For articles whose second revision is a page move, data will be lost.

For the revisions, compressOld.php will first set the initial text entry to a
ConcatenatedGzipHistoryBlob object.

It will then look at the second revision entry, which represents the move:

*************************** 1. row ***************************
        rev_id: 265949
      rev_page: 58415
   rev_comment:
      rev_user: 0
 rev_user_text: 68.77.43.0
 rev_timestamp: 20060825022115
rev_minor_edit: 0
   rev_deleted: 0
   rev_text_id: 265193
*************************** 2. row ***************************
        rev_id: 265984
      rev_page: 58415
   rev_comment: [[Make You Paper or Essay Longer Than It Is]] moved to [[Make an Essay Appear Longer Than It Is]]: correct spelling, make title more exact
      rev_user: 1254433
 rev_user_text: Ladanea
 rev_timestamp: 20060825024904
rev_minor_edit: 1
   rev_deleted: 0
   rev_text_id: 265193

Unfortunately this rev_text_id is exactly the same as the first one, so a
HistoryBlobStub gets stored there, overwriting the ConcatenatedGzipHistoryBlob
object, and data is lost.
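A minimal sketch of that failure mode, using hypothetical stand-in classes (the real ones live in includes/HistoryBlob.php): because both revisions share rev_text_id 265193, the second pass writes a stub into the same text row the first pass already filled with the blob.

```php
<?php
// Hypothetical stand-ins just to show the shape of the bug.
class FakeBlob {
	public $items = array();
	function uncompress() { /* decompress items */ }
}
class FakeStub {
	public $mOldId; // no uncompress() method
}

$textTable = array();

// Pass 1: compressOld.php replaces the first revision's text row
// with the concatenated blob.
$blob = new FakeBlob();
$blob->items[] = 'revision text...';
$textTable[265193] = serialize( $blob );

// Pass 2: the move revision has the SAME rev_text_id, so a stub is
// written to the same row, clobbering the blob. The data is gone.
$stub = new FakeStub();
$stub->mOldId = 265193;
$textTable[265193] = serialize( $stub );

// Later, fetching that text row unserializes a stub, which has no
// uncompress() method -> "Call to undefined function: uncompress()".
$obj = unserialize( $textTable[265193] );
var_dump( method_exists( $obj, 'uncompress' ) ); // false
```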

The way to prevent this from occurring again is a change like this in
maintenance/storage/compressOld.inc:

287,288c287,288
< 		# Skip if not compressing and don't overwrite the first revision
< 		if ( $stubs[$j] !== false && $revs[$i + $j]->rev_text_id != $primaryOldid ) {
---
> 		# Skip if not compressing
> 		if ( $stubs[$j] !== false ) {

Unfortunately there is no way to undo this process and get back your data. Your
only alternative is to somehow import the lost data from a backup, which you
hopefully have, into the text table.

Thanks, Travis!

Applied fix on trunk in r19726 and rel1.9 in r19727.

asmarin wrote:

I have the same problem, but I don't know how to fix it. What should I do to solve it?