https://bugzilla.wikimedia.org/show_bug.cgi?id=65136
Bug ID: 65136
Summary: implement -wait option for checkimages.py
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: ladsgroup(a)gmail.com
Web browser: ---
Mobile Platform: ---
It can be added this way (needs some work). This is the code for core:
# if len(arg) == 5:
#     waitTime = int(pywikibot.input(
#         u'How long do you want to wait before checking the '
#         u'files?'))
# elif len(arg) > 5:
#     waitTime = int(arg[6:])
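A self-contained sketch of the proposed argument parsing (the function name and prompt wording are hypothetical; the real script would use pywikibot.input for the prompt, and the checking loop would time.sleep(waitTime) between passes):

```python
def parse_wait(arg, prompt=input):
    """Parse a '-wait' or '-wait:<seconds>' command-line argument.

    Returns the number of seconds to wait, or None when the argument
    is not the -wait option at all.
    """
    if not arg.startswith('-wait'):
        return None
    if len(arg) == 5:  # bare '-wait': ask interactively
        return int(prompt('How long (in seconds) do you want to wait '
                          'before checking the files? '))
    return int(arg[6:])  # '-wait:30' -> 30

print(parse_wait('-wait:30'))  # → 30
```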
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=65135
Bug ID: 65135
Summary: Fix imageuncat.py -yesterday
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: maarten(a)mdammers.nl
Web browser: ---
Mobile Platform: ---
tools.catbot@tools-login:~$ python /shared/pywikipedia/core/scripts/version.py
Pywikibot: [https] r-p-pywikibot-core.git (05e55f3, g3080, 2014/05/10,
07:16:30, OUTDATED)
Release version: 2.0b1
Python: 2.7.3 (default, Feb 27 2014, 19:58:35)
[GCC 4.6.3]
unicode test: ok
tools.catbot@tools-login:~$ python
/shared/pywikipedia/core/scripts/imageuncat.py -yesterday
Traceback (most recent call last):
File "/shared/pywikipedia/core/scripts/imageuncat.py", line 21, in <module>
import query
ImportError: No module named query
<type 'exceptions.ImportError'>
CRITICAL: Waiting for 1 network thread(s) to finish. Press ctrl-c to abort
tools.catbot@tools-login:~$
The query module doesn't exist in core and I don't want to add it. Probably best
to modify the function to use logevents() in site.
def logevents(self, logtype=None, user=None, page=None,
start=None, end=None, reverse=False, step=None, total=None):
"""Iterate all log entries.
@param logtype: only iterate entries of this type (see wiki
documentation for available types, which will include "block",
"protect", "rights", "delete", "upload", "move", "import",
"patrol", "merge")
@param user: only iterate entries that match this user name
@param page: only iterate entries affecting this page
@param start: only iterate entries from and after this Timestamp
@param end: only iterate entries up to and through this Timestamp
@param reverse: if True, iterate oldest entries first (default: newest)
"""
https://bugzilla.wikimedia.org/show_bug.cgi?id=65115
Bug ID: 65115
Summary: Update category.py to use action=move to move category
description pages
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: category.py
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Web browser: ---
Mobile Platform: ---
With the resolution of bug 5451, we can now use ?action=move to move the
category description page rather than copying it manually and attributing the
editors in the history.
We can check if the user has the 'move-categorypages' right, and if not
fallback on the old functionality (which will work for older wikis).
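A minimal sketch of the proposed dispatch, with callables standing in for the real move and copy code paths:

```python
def move_category_page(user_rights, do_move, do_copy):
    """Use action=move when the 'move-categorypages' right is present;
    otherwise fall back to copy-and-attribute, which also works on
    older wikis without the right."""
    if 'move-categorypages' in user_rights:
        return do_move()
    return do_copy()

print(move_category_page({'edit', 'move-categorypages'},
                         lambda: 'moved', lambda: 'copied'))  # → moved
```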
https://bugzilla.wikimedia.org/show_bug.cgi?id=65013
Bug ID: 65013
Summary: Test QueryGenerator.limit
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: valhallasw(a)arctus.nl
Blocks: 58941
Web browser: ---
Mobile Platform: ---
Test various values, including:
0, 1, many,
-1, -somethingelse,
None (should probably raise a warning, given it's not actually an allowed value)
See https://gerrit.wikimedia.org/r/#/c/131991/ for more details.
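A self-contained reference for one set of semantics the tests could assert. The treatment of None and negative values below is an assumption to be confirmed against QueryGenerator itself (and the report suggests None should instead raise a warning):

```python
def apply_limit(items, limit):
    """Yield at most `limit` items; None or a negative limit yields
    everything (assumed semantics, not verified against QueryGenerator)."""
    count = 0
    for item in items:
        if limit is not None and limit >= 0 and count >= limit:
            return
        yield item
        count += 1

print([list(apply_limit(range(5), n)) for n in (0, 1, -1, None)])
```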
https://bugzilla.wikimedia.org/show_bug.cgi?id=64997
Bug ID: 64997
Summary: Site.preloadpages yields only one item per data
request
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: info(a)gno.de
Web browser: ---
Mobile Platform: ---
(Maybe new with MW 1.24.) Site.preloadpages only yields one item. Sample as
follows:
>>> import pwb
>>> import pywikibot as py
>>> s = py.Site('bar')
>>> pagenames =['Haiku', 'Homer', 'Ilias', 'Watzmo', 'Wean']
>>> pagelist = [py.Page(s, name) for name in pagenames]
>>> gen = s.preloadpages(pagelist)
>>> for p in gen:
print p.title()
Retrieving 5 pages from wikipedia:bar.
Haiku
the request was
/w/api.php?maxlag=5&format=json&rvprop=ids|flags|timestamp|user|comment|content&prop=revisions|info|categoryinfo&titles=Homer|Ilias|Watzmo|Wean|Haiku&meta=userinfo&indexpageids=&action=query&uiprop=blockinfo|hasmsg
Inserting a direct request call into preloadpages with
data = rvgen.request.submit()
print '#### ####', len(data)
gives me
#### #### 1
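For reference, a query response with indexpageids set (as in the URL above) carries every page under query.pages, keyed by page id, with query.pageids giving the order; a sketch of consuming all of them, since the bug behaves as if only the first entry were used:

```python
def pages_from_response(data):
    """Yield every page object from a MediaWiki action=query response
    (assumes indexpageids was requested)."""
    query = data.get('query', {})
    pages = query.get('pages', {})
    for pageid in query.get('pageids', list(pages)):
        yield pages[str(pageid)]

sample = {'query': {
    'pageids': ['1', '2', '3'],
    'pages': {'1': {'title': 'Haiku'},
              '2': {'title': 'Homer'},
              '3': {'title': 'Ilias'}}}}
print([p['title'] for p in pages_from_response(sample)])
```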
https://bugzilla.wikimedia.org/show_bug.cgi?id=64855
Bug ID: 64855
Summary: Port get.py to core
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Keywords: easy
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: valhallasw(a)arctus.nl
Blocks: 55880
Web browser: ---
Mobile Platform: ---
Very simple script which gets a page and writes its contents to
standard output. This makes it possible to pipe the text to another
process.
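A sketch of the whole script, with the page fetch abstracted behind a callable; in the real port that callable would be something like pywikibot.Page(pywikibot.Site(), title).get:

```python
import sys

def run(title, fetch, out=sys.stdout):
    """Fetch a page's text and write it to `out` so it can be piped
    to another process."""
    text = fetch(title)
    out.write(text)
    return text

run('Example', lambda title: 'wikitext of %s\n' % title)
```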
https://bugzilla.wikimedia.org/show_bug.cgi?id=64899
Bug ID: 64899
Summary: object has no attribute 'argvu'
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: critical
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: jan.dudik(a)gmail.com
Web browser: ---
Mobile Platform: ---
I:\py\rewrite>pwb.py harvest_template -ref:template:Infobox_-_sídlo
-template:"Infobox - sídlo" -namespace:0 obrázek P18 obec P131 město P131 obvod
P131 psč P281 web P856 commons P373 vlajka P41 znak P94 pečeť P158
Configuration variable 'use_api' is defined but unknown. Misspelled?
Traceback (most recent call last):
File "I:\py\rewrite\pwb.py", line 126, in <module>
argvu = pwb.argvu[1:]
AttributeError: 'module' object has no attribute 'argvu'
<type 'exceptions.AttributeError'>
CRITICAL: Waiting for 1 network thread(s) to finish. Press ctrl-c to abort
https://bugzilla.wikimedia.org/show_bug.cgi?id=55146
Web browser: ---
Bug ID: 55146
Summary: interwiki() fails for closed wikipedia
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: ASSIGNED
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/bugs/1608/
Reported by: cdpark
Created on: 2013-04-04 01:32:44
Subject: interwiki() fails for closed wikipedia
Assigned to: amird
Original description:
Page.interwiki() and PageData.interwiki() fail when wikidata contains
sitelinks of closed wikipedias.
For example, the following code is broken now:
---------------------------------------------------------------------
#!/usr/bin/python
# -*- coding: utf-8 -*-
import pywikibot
en = pywikibot.getSite('en', 'wikipedia')
mainpage = pywikibot.Page(en, u'Main Page')
interwiki = mainpage.interwiki()
print interwiki
---------------------------------------------------------------------
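One possible shape for the fix, sketched over plain dicts (the helper and its arguments are hypothetical; aawiki is a real example of a closed wiki):

```python
def usable_sitelinks(sitelinks, closed_wikis):
    """Drop sitelinks that point at closed wikis, so interwiki() can
    skip them instead of failing."""
    return {dbname: title for dbname, title in sitelinks.items()
            if dbname not in closed_wikis}

links = {'enwiki': 'Main Page', 'aawiki': 'Main Page'}
print(usable_sitelinks(links, {'aawiki'}))  # → {'enwiki': 'Main Page'}
```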
https://bugzilla.wikimedia.org/show_bug.cgi?id=61120
Web browser: ---
Bug ID: 61120
Summary: pywikibot.Link.langlinkUnsafe can't deal with obsolete
sites
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: UNCONFIRMED
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: nullzero.free(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
print list(pywikibot.Page(pywikibot.getSite("af"), "Maan").iterlanglinks())
raises NoSuchSite exception
I actually encountered this problem when I ran featured.py. It seems that there
is an attempt (gerrit: 112322) to fix this problem, but the problem still
persists.
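A sketch of the guard the iteration needs, over plain tuples (the helper shape is hypothetical; 'mo' is a real example of an obsolete language code):

```python
def safe_langlinks(links, known_codes):
    """Filter out language links whose code no longer maps to a live
    site, instead of raising a NoSuchSite-style error mid-iteration."""
    for code, title in links:
        if code in known_codes:
            yield (code, title)

links = [('en', 'Moon'), ('mo', 'Luna'), ('de', 'Mond')]
print(list(safe_langlinks(links, {'en', 'de'})))
```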
https://bugzilla.wikimedia.org/show_bug.cgi?id=63387
Bug ID: 63387
Summary: Category.py + Template.py
Product: Pywikibot
Version: compat (1.0)
Hardware: PC
URL: https://so.wikipedia.org/w/index.php?title=Template%3A
Sawirka_Maanta_1_Abriil&diff=128335&oldid=128321
OS: Windows XP
Status: UNCONFIRMED
Severity: normal
Priority: Unprioritized
Component: copyright.py
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: abshirdheere(a)hotmail.com
Web browser: ---
Mobile Platform: ---
I have a problem: I can't remove one category and replace it with another.
Example: remove the category [[Category:Image of the day]] from the page
<so.wikipedia.org/w/Template:Image of the day 1 April> and add another one,
[[Category:Image of the day April]].
There are over 360 pages; I would like to do them all in one run rather than
one page at a time, like this page:
https://so.wikipedia.org/w/index.php?title=Template%3ASawirka_Maanta_1_Abri…
Thanks
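If this is Pywikibot's compat category.py, the bulk rename the reporter wants is what the script's move action is for; a hedged example (check python category.py -help for the exact flag spellings on your version):

```shell
python category.py move -from:"Image of the day" -to:"Image of the day April"
```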