
PostData() data is not gzip compressed
Closed, DeclinedPublic

Description

Originally from: http://sourceforge.net/p/pywikipediabot/bugs/1447/
Reported by: Anonymous user
Created on: 2012-05-19 16:38:59
Subject: PostData() data is not gzip compressed
Assigned to: xqt
Original description:
Using the 'pagegenerators', the data transfer is not compressed.
That data can amount to several megabytes, so the transfer is very time consuming.

Therefore, I think it would be a good idea to remove that line.

Thank you for considering this.


Version: compat-(1.0)
Severity: major
See Also:
https://sourceforge.net/p/pywikipediabot/bugs/1447

Details

Reference
bz55197

Event Timeline

bzimport raised the priority of this task from to High.Nov 22 2014, 2:26 AM
bzimport set Reference to bz55197.
bzimport added a subscriber: Unknown Object (????).

Preloading is less time consuming than loading each page separately, e.g. 60 times. What is the intention of this request?

I also think that not spending more time is good.

gzip compression compresses text data to roughly one-third of its size.
Therefore, the transfer time should also be roughly one-third.
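As an illustration of the ratio claimed above (not part of the original report), gzip's effect on repetitive wiki-style text can be checked with Python's standard library; the sample text here is made up:

```python
import gzip

# Hypothetical sample: wiki markup is repetitive, so it compresses well.
data = ("== Section ==\nArticle text with [[links]] and {{templates}}.\n" * 200).encode("utf-8")

compressed = gzip.compress(data)
print(f"original:   {len(data)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"ratio:      {len(compressed) / len(data):.2f}")

# Round-trip sanity check: decompression restores the original bytes.
assert gzip.decompress(compressed) == data
```

Real article text compresses less dramatically than this repetitive sample, but a ratio around one-third for typical wikitext is plausible.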

However, the data transfer in the generator is not compressed.
This proposal is one way to solve that problem.

This has nothing to do with the preloading generator. HTTP requests are done by Site.PostData() or Site.getUrl() with gzip compression enabled by default.
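For context, here is a minimal sketch of the mechanism such methods rely on (this is not the pywikibot-compat implementation, and the function names are made up): the client advertises gzip in `Accept-Encoding`, and decompresses the body only if the server answered with `Content-Encoding: gzip`.

```python
import gzip
import urllib.request


def decode_body(body: bytes, content_encoding: str) -> bytes:
    """Undo gzip transfer compression if the server applied it."""
    if content_encoding == "gzip":
        return gzip.decompress(body)
    return body


def fetch(url: str) -> bytes:
    # Advertise gzip support; the server may still reply uncompressed.
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        return decode_body(resp.read(), resp.headers.get("Content-Encoding", ""))
```

Note that advertising gzip only permits compression; whether the response is actually compressed is the server's choice, which is what this bug report is about.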

  • assigned_to: nobody --> xqt

I understand. The compression is declared in the request headers.
However, the response data is not compressed.

This may be a bug in MediaWiki.

I don't know whether they are compressed or not. How did you see it?

I confirmed it with Wireshark, a packet analysis tool.

Maybe the header is wrong. I found 'Accept-encoding' as a header parameter, but at https://www.mediawiki.org/wiki/API:Client_Code the header is described as 'Accept-Encoding' (capital Encoding).
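For what it's worth, HTTP header field names are case-insensitive per the spec (RFC 7230 §3.2), so the casing alone should not matter to a compliant server. Python's `email.message.Message`, which `http.client` uses to represent headers, models this case-insensitive lookup:

```python
from email.message import Message

# Header field names are case-insensitive; lookups ignore case.
headers = Message()
headers["Accept-encoding"] = "gzip"

print(headers["Accept-Encoding"])  # gzip
print(headers["ACCEPT-ENCODING"])  # gzip
```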

  • summary: Do not use preloading in featured.py --> PostData() datas are not gzip compressed
Aklapper lowered the priority of this task from High to Lowest.Jun 5 2015, 1:41 PM
Aklapper subscribed.

Pywikibot has two versions: Compat and Core. This task was filed about the older version, called Pywikibot-compat, which is not under active development anymore. Hence I'm lowering the priority of this task to reflect the reality.

Unfortunately, the Pywikibot team does not have the manpower to retest every single bug report / feature request against the (maintained) Pywikibot code base. Furthermore, the code base of Pywikibot-Compat has changed a lot compared to the code base of Pywikibot-Core, so there is a chance that the problem described in this task might not exist anymore.

Please help: if you have time and interest in Pywikibot, please upgrade to Pywikibot-Core and add a comment to this task if the problem still happens in Pywikibot-Core (or directly edit the task by removing the Pywikibot-compat project and adding the Pywikibot project to this task).

To learn more about Pywikibot and to get involved in its development, please check out https://www.mediawiki.org/wiki/Manual:Pywikibot/Development

Thank you for your understanding.

jayvdb claimed this task.

It appears the status of this bug is unknown. It is unlikely anyone cares enough to test it either.

However, even if it is still a bug, compat is in 'decommissioning' mode, or at least 'maintenance-only' mode, so this feature is not likely to be implemented/merged as it risks introducing bugs.