jenkins-bot has submitted this change and it was merged.
Change subject: Define config.db_port as local port for mysql server
......................................................................
Define config.db_port as local port for mysql server
Define config.db_port to specify the local port for a MySQL server.
This allows running SQL queries on the Tool Labs database from a local
computer by setting up an SSH tunnel.
(Note: the SSH tunnel must be opened separately; it is not opened by
pywikibot.)
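The change threads config.db_port into both MySQLdb.connect() calls. A minimal sketch of the resulting connection parameters, assuming an SSH tunnel on a non-default local port; the values (4711, u1234, the '{0}_p' name format) are purely illustrative, not pywikibot defaults:

```python
class Config:
    """Stand-in for pywikibot.config with the new db_port setting."""
    db_hostname = 'localhost'   # the tunnel endpoint on this machine
    db_port = 4711              # local end of: ssh -L 4711:enwiki.labsdb:3306 ...
    db_username = 'u1234'
    db_password = 'secret'
    db_name_format = '{0}_p'


def connect_kwargs(config, dbname):
    """Assemble the keyword arguments MySQLdb.connect() would receive."""
    return {'host': config.db_hostname,
            'db': config.db_name_format.format(dbname),
            'user': config.db_username,
            'passwd': config.db_password,
            'port': config.db_port}


kwargs = connect_kwargs(Config, 'enwiki')
print(kwargs['port'], kwargs['db'])
```

With db_hostname pointed at localhost and db_port at the forwarded port, the existing connect path needs no other change.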
Change-Id: I451a8ed338445b38806b20d3eaed63afc3b5de09
---
M pywikibot/config2.py
M pywikibot/pagegenerators.py
2 files changed, 13 insertions(+), 4 deletions(-)
Approvals:
John Vandenberg: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/config2.py b/pywikibot/config2.py
index 7fc9d7a..4d065a8 100644
--- a/pywikibot/config2.py
+++ b/pywikibot/config2.py
@@ -658,6 +658,9 @@
db_password = ''
db_name_format = '{0}'
db_connect_file = user_home_path('.my.cnf')
+# local port for mysql server
+# ssh -L 4711:enwiki.labsdb:3306 user@tools-login.wmflabs.org
+db_port = 3306
# ############# SEARCH ENGINE SETTINGS ##############
diff --git a/pywikibot/pagegenerators.py b/pywikibot/pagegenerators.py
index 2f80a3b..4633b5c 100644
--- a/pywikibot/pagegenerators.py
+++ b/pywikibot/pagegenerators.py
@@ -2408,12 +2408,18 @@
import MySQLdb as mysqldb
if site is None:
site = pywikibot.Site()
+
if config.db_connect_file is None:
- conn = mysqldb.connect(config.db_hostname, db=config.db_name_format.format(site.dbName()),
- user=config.db_username, passwd=config.db_password)
+ conn = mysqldb.connect(config.db_hostname,
+ db=config.db_name_format.format(site.dbName()),
+ user=config.db_username,
+ passwd=config.db_password,
+ port=config.db_port)
else:
- conn = mysqldb.connect(config.db_hostname, db=config.db_name_format.format(site.dbName()),
- read_default_file=config.db_connect_file)
+ conn = mysqldb.connect(config.db_hostname,
+ db=config.db_name_format.format(site.dbName()),
+ read_default_file=config.db_connect_file,
+ port=config.db_port)
cursor = conn.cursor()
pywikibot.output(u'Executing query:\n%s' % query)
--
To view, visit https://gerrit.wikimedia.org/r/275159
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I451a8ed338445b38806b20d3eaed63afc3b5de09
Gerrit-PatchSet: 3
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: jenkins-bot <>
Build Update for wikimedia/pywikibot-core
-------------------------------------
Build: #3285
Status: Errored
Duration: 11 minutes and 17 seconds
Commit: 2abff2f (special_page_limit)
Author: xqt
Message: [Cleanup] Remove special_page_limit from config2.py
special_page_limit was introduced in compat for retrieving pages from
MediaWiki special lists via screen scraping. The use of this parameter in
compat is a mess, and in core it is not used: the API reads all items
unless a generator's total parameter or a treat counter restricts it.
The only use of that parameter is inside site.watched_page, and it makes
no sense to have that default value there.
Change-Id: Idd0afa55cec93b92abbd5f0a9d2d0b9317a9a42a
View the changeset: https://github.com/wikimedia/pywikibot-core/commit/2abff2f39149
View the full build log and details: https://travis-ci.org/wikimedia/pywikibot-core/builds/113718376
--
You can configure recipients for build notifications in your .travis.yml file. See https://docs.travis-ci.com/user/notifications
Build Update for wikimedia/pywikibot-core
-------------------------------------
Build: #3284
Status: Errored
Duration: 13 minutes and 10 seconds
Commit: ce01161 (global_step)
Author: xqt
Message: [IMPR] Provide global step parameter via config2.py
The step parameter is reimplemented as a config setting used by the API.
It can be set either in user_config.py or given as the global
option -step:<value>.
As before, the default is 50, and the setting can be overridden
by set_query_increment() at runtime.
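The precedence described above can be sketched as follows; the class and attribute names are simplified stand-ins for the real api.QueryGenerator, not its actual implementation:

```python
class QueryGenerator:
    """Toy model: config supplies the default query increment."""

    def __init__(self, config_step=50):
        # default taken from the config setting (50 unless overridden
        # in user_config.py or via the global -step option)
        self.query_increment = config_step

    def set_query_increment(self, value):
        """Runtime override, taking precedence over the config default."""
        self.query_increment = value


gen = QueryGenerator()
assert gen.query_increment == 50   # config default
gen.set_query_increment(25)        # runtime override wins
assert gen.query_increment == 25
```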
Bug: T109208
Change-Id: I02c6ea9c1a8fc6c3cd4fdf1e39f7317b671ffa10
View the changeset: https://github.com/wikimedia/pywikibot-core/compare/3476c40d9cfc^...ce01161…
View the full build log and details: https://travis-ci.org/wikimedia/pywikibot-core/builds/113715984
jenkins-bot has submitted this change and it was merged.
Change subject: site.loadrevisions: Do not compare step with 0 as it may be None
......................................................................
site.loadrevisions: Do not compare step with 0 as it may be None
Comparing None with other types raises a TypeError in Python 3.
Add a test to the related test module.
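The failure mode and the fix can be demonstrated in a few lines: ordering comparisons with None raise in Python 3, while truth-testing is always safe (None and 0 are both falsy, so `if step:` behaves like the old `if step > 0:` for valid values):

```python
step = None  # loadrevisions may be called with step=None

# Python 3: ordering comparisons with None raise TypeError ...
try:
    step > 0
    comparable = True
except TypeError:
    comparable = False

# ... while truth-testing never does: None and 0 are both falsy.
assert comparable is False
assert not step          # `if step:` safely skips None
assert not 0             # ... and 0 alike
assert 7                 # any positive step still enters the branch
```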
Bug: T128849
Change-Id: If6e99c3743c4168c268c5106fc64fdcd62cd1ea7
---
M pywikibot/site.py
M tests/site_tests.py
2 files changed, 2 insertions(+), 1 deletion(-)
Approvals:
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/site.py b/pywikibot/site.py
index aac59f5..4304a67 100644
--- a/pywikibot/site.py
+++ b/pywikibot/site.py
@@ -3799,7 +3799,7 @@
# assemble API request
rvgen = self._generator(api.PropertyGenerator, total=total, **rvargs)
- if step > 0:
+ if step:
rvgen.set_query_increment = step
if latest or "revids" in rvgen.request:
diff --git a/tests/site_tests.py b/tests/site_tests.py
index a5c1dea..a54b9ab 100644
--- a/tests/site_tests.py
+++ b/tests/site_tests.py
@@ -2221,6 +2221,7 @@
"""Test the site.loadrevisions() method."""
# Load revisions without content
self.mysite.loadrevisions(self.mainpage, total=15)
+ self.mysite.loadrevisions(self.mainpage)
self.assertFalse(hasattr(self.mainpage, '_text'))
self.assertEqual(len(self.mainpage._revisions), 15)
self.assertIn(self.mainpage._revid, self.mainpage._revisions)
--
To view, visit https://gerrit.wikimedia.org/r/274933
Gerrit-MessageType: merged
Gerrit-Change-Id: If6e99c3743c4168c268c5106fc64fdcd62cd1ea7
Gerrit-PatchSet: 1
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: Dalba <dalba.wiki(a)gmail.com>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: Add extra_headers config option
......................................................................
Add extra_headers config option
Also, avoid pywikibot.comms.http._enqueue mutating the headers
parameter passed to it.
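The merge behaviour introduced in _enqueue can be sketched in isolation: config.extra_headers is copied first so per-request headers win on conflict, and the caller's dict is never modified. The X-Wikimedia-Debug value below is illustrative only:

```python
# Stand-in for config.extra_headers (illustrative header and value).
extra_headers = {'X-Wikimedia-Debug': 'backend=mw1099'}


def merge_headers(extra, headers=None):
    """Copy-then-update, as done in comms/http._enqueue after this change."""
    all_headers = extra.copy()        # start from the configured extras
    all_headers.update(headers or {})  # per-request headers take precedence
    return all_headers


request_headers = {'user-agent': 'my-bot/1.0'}
merged = merge_headers(extra_headers, request_headers)

assert 'X-Wikimedia-Debug' in merged
assert merged['user-agent'] == 'my-bot/1.0'
# the caller's dict was not mutated:
assert request_headers == {'user-agent': 'my-bot/1.0'}
```

Because the extras are sent with every request, not just API calls, a debug header set here reaches all HTTP traffic.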
Change-Id: Ifc57b746bcb40b6bb4de752c202768ccd8af9792
---
M pywikibot/comms/http.py
M pywikibot/config2.py
2 files changed, 15 insertions(+), 5 deletions(-)
Approvals:
John Vandenberg: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/comms/http.py b/pywikibot/comms/http.py
index 0d20e5b..260f486 100644
--- a/pywikibot/comms/http.py
+++ b/pywikibot/comms/http.py
@@ -399,15 +399,15 @@
callbacks += kwargs.pop('callbacks', [])
- if not headers:
- headers = {}
+ all_headers = config.extra_headers.copy()
+ all_headers.update(headers or {})
- user_agent_format_string = headers.get("user-agent", None)
+ user_agent_format_string = all_headers.get('user-agent')
if not user_agent_format_string or '{' in user_agent_format_string:
- headers["user-agent"] = user_agent(None, user_agent_format_string)
+ all_headers['user-agent'] = user_agent(None, user_agent_format_string)
request = threadedhttp.HttpRequest(
- uri, method, body, headers, callbacks, **kwargs)
+ uri, method, body, all_headers, callbacks, **kwargs)
_http_process(session, request)
return request
diff --git a/pywikibot/config2.py b/pywikibot/config2.py
index 0c7cd50..7fc9d7a 100644
--- a/pywikibot/config2.py
+++ b/pywikibot/config2.py
@@ -241,6 +241,16 @@
# by setting this to true.
ignore_file_security_warnings = False
+# Custom headers to send on all requests.
+# This is mainly intended to support setting the
+# X-Wikimedia-Debug header, which is sometimes
+# needed to debug issues with Wikimedia sites:
+# https://wikitech.wikimedia.org/wiki/Debugging_in_production
+#
+# Note that these headers will be sent with all requests,
+# not just MediaWiki API calls.
+extra_headers = {}
+
def user_home_path(path):
"""Return a file path to a file in the user home."""
--
To view, visit https://gerrit.wikimedia.org/r/274616
Gerrit-MessageType: merged
Gerrit-Change-Id: Ifc57b746bcb40b6bb4de752c202768ccd8af9792
Gerrit-PatchSet: 5
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: Gergő Tisza <gtisza(a)wikimedia.org>
Gerrit-Reviewer: Gergő Tisza <gtisza(a)wikimedia.org>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: Remove "step" parameter from site, page and pagegenerators methods.
......................................................................
Remove "step" parameter from site, page and pagegenerators methods.
- deprecate the formal "step" parameter
- rename the "step" parameter of preloading generators to "groupsize"
- keep it for site.loadrevisions
- for generators, set_query_increment() may be used when needed
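The diff below leans on @deprecated_args(step=None) throughout. A minimal sketch of that mechanism (the real pywikibot.tools.deprecated_args is richer): mapping an old keyword to None drops it with a warning instead of letting it reach the function as an unexpected keyword, while mapping it to a string renames it:

```python
import warnings


def deprecated_args(**arg_map):
    """Drop (new name None) or rename deprecated keyword arguments."""
    def decorator(func):
        def wrapper(*args, **kwargs):
            for old, new in arg_map.items():
                if old in kwargs:
                    value = kwargs.pop(old)
                    warnings.warn('%r is deprecated' % old,
                                  DeprecationWarning, stacklevel=2)
                    if new is not None:
                        kwargs[new] = value   # renamed, e.g. step -> groupsize
            return func(*args, **kwargs)
        return wrapper
    return decorator


@deprecated_args(step=None)
def revisions(total=None):
    return total


with warnings.catch_warnings():
    warnings.simplefilter('ignore')
    # old callers passing step= keep working; the value is discarded
    assert revisions(step=50, total=10) == 10
```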
Bug: T109208
Bug: T100629
Change-Id: Iac0afb6479f2c73e08c39412903d1a8d6d5092b9
---
M pywikibot/bot.py
M pywikibot/page.py
M pywikibot/pagegenerators.py
M pywikibot/site.py
M scripts/blockpageschecker.py
M scripts/category_redirect.py
M scripts/patrol.py
M scripts/redirect.py
M scripts/reflinks.py
M scripts/weblinkchecker.py
M tests/isbn_tests.py
M tests/pagegenerators_tests.py
M tests/site_tests.py
13 files changed, 338 insertions(+), 372 deletions(-)
Approvals:
Mpaa: Looks good to me, but someone else must approve
Ladsgroup: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/bot.py b/pywikibot/bot.py
index cc4e1e4..6990c79 100644
--- a/pywikibot/bot.py
+++ b/pywikibot/bot.py
@@ -48,7 +48,7 @@
used.
"""
#
-# (C) Pywikibot team, 2008-2015
+# (C) Pywikibot team, 2008-2016
#
# Distributed under the terms of the MIT license.
#
@@ -1817,8 +1817,7 @@
@type property_name: str
"""
ns = self.site.data_repository().property_namespace
- for page in self.site.search(property_name, step=1, total=1,
- namespaces=ns):
+ for page in self.site.search(property_name, total=1, namespaces=ns):
page = pywikibot.PropertyPage(self.site.data_repository(),
page.title())
pywikibot.output(u"Assuming that %s property is %s." %
diff --git a/pywikibot/page.py b/pywikibot/page.py
index 8e6f4c2..b1a731c 100644
--- a/pywikibot/page.py
+++ b/pywikibot/page.py
@@ -902,9 +902,10 @@
disambigInPage = disambigs.intersection(templates)
return self.namespace() != 10 and len(disambigInPage) > 0
+ @deprecated_args(step=None)
def getReferences(self, follow_redirects=True, withTemplateInclusion=True,
onlyTemplateInclusion=False, redirectsOnly=False,
- namespaces=None, step=None, total=None, content=False):
+ namespaces=None, total=None, content=False):
"""
Return an iterator all pages that refer to or embed the page.
@@ -919,7 +920,6 @@
is used as a template.
@param redirectsOnly: if True, only iterate redirects to self.
@param namespaces: only iterate pages in these namespaces
- @param step: limit each API call to this number of pages
@param total: iterate no more than this number of pages in total
@param content: if True, retrieve the content of the current version
of each referring page (default False)
@@ -936,13 +936,13 @@
withTemplateInclusion=withTemplateInclusion,
onlyTemplateInclusion=onlyTemplateInclusion,
namespaces=namespaces,
- step=step,
total=total,
content=content
)
+ @deprecated_args(step=None)
def backlinks(self, followRedirects=True, filterRedirects=None,
- namespaces=None, step=None, total=None, content=False):
+ namespaces=None, total=None, content=False):
"""
Return an iterator for pages that link to this page.
@@ -951,7 +951,6 @@
@param filterRedirects: if True, only iterate redirects; if False,
omit redirects; if None, do not filter
@param namespaces: only iterate pages in these namespaces
- @param step: limit each API call to this number of pages
@param total: iterate no more than this number of pages in total
@param content: if True, retrieve the content of the current version
of each referring page (default False)
@@ -961,12 +960,12 @@
followRedirects=followRedirects,
filterRedirects=filterRedirects,
namespaces=namespaces,
- step=step,
total=total,
content=content
)
- def embeddedin(self, filter_redirects=None, namespaces=None, step=None,
+ @deprecated_args(step=None)
+ def embeddedin(self, filter_redirects=None, namespaces=None,
total=None, content=False):
"""
Return an iterator for pages that embed this page as a template.
@@ -974,7 +973,6 @@
@param filter_redirects: if True, only iterate redirects; if False,
omit redirects; if None, do not filter
@param namespaces: only iterate pages in these namespaces
- @param step: limit each API call to this number of pages
@param total: iterate no more than this number of pages in total
@param content: if True, retrieve the content of the current version
of each embedding page (default False)
@@ -983,7 +981,6 @@
self,
filterRedirects=filter_redirects,
namespaces=namespaces,
- step=step,
total=total,
content=content
)
@@ -1304,7 +1301,8 @@
else:
raise pywikibot.NoPage(self)
- def linkedPages(self, namespaces=None, step=None, total=None,
+ @deprecated_args(step=None)
+ def linkedPages(self, namespaces=None, total=None,
content=False):
"""
Iterate Pages that this Page links to.
@@ -1316,8 +1314,6 @@
@param namespaces: only iterate links in these namespaces
@param namespaces: int, or list of ints
- @param step: limit each API call to this number of pages
- @type step: int
@param total: iterate no more than this number of pages in total
@type total: int
@param content: if True, retrieve the content of the current version
@@ -1327,7 +1323,7 @@
@return: a generator that yields Page objects.
@rtype: generator
"""
- return self.site.pagelinks(self, namespaces=namespaces, step=step,
+ return self.site.pagelinks(self, namespaces=namespaces,
total=total, content=content)
def interwiki(self, expand=True):
@@ -1389,11 +1385,11 @@
else:
return [i for i in self._langlinks if not i.site.obsolete]
- def iterlanglinks(self, step=None, total=None, include_obsolete=False):
+ @deprecated_args(step=None)
+ def iterlanglinks(self, total=None, include_obsolete=False):
"""
Iterate all inter-language links on this page.
- @param step: limit each API call to this number of pages
@param total: iterate no more than this number of pages in total
@param include_obsolete: if true, yield even Link object whose site
is obsolete
@@ -1408,7 +1404,7 @@
# method is called. If we do this, we'll have to think
# about what will happen if the generator is not completely
# iterated upon.
- return self.site.pagelanglinks(self, step=step, total=total,
+ return self.site.pagelanglinks(self, total=total,
include_obsolete=include_obsolete)
def data_item(self):
@@ -1443,7 +1439,8 @@
return self._templates
- def itertemplates(self, step=None, total=None, content=False):
+ @deprecated_args(step=None)
+ def itertemplates(self, total=None, content=False):
"""
Iterate Page objects for templates used on this Page.
@@ -1451,7 +1448,6 @@
templates, not template pages that happen to be referenced through
a normal link.
- @param step: limit each API call to this number of pages
@param total: iterate no more than this number of pages in total
@param content: if True, retrieve the content of the current version
of each template (default False)
@@ -1459,31 +1455,26 @@
"""
if hasattr(self, '_templates'):
return iter(self._templates)
- return self.site.pagetemplates(self, step=step, total=total,
- content=content)
+ return self.site.pagetemplates(self, total=total, content=content)
- @deprecated_args(followRedirects=None, loose=None)
- def imagelinks(self, step=None, total=None, content=False):
+ @deprecated_args(followRedirects=None, loose=None, step=None)
+ def imagelinks(self, total=None, content=False):
"""
Iterate FilePage objects for images displayed on this Page.
- @param step: limit each API call to this number of pages
@param total: iterate no more than this number of pages in total
@param content: if True, retrieve the content of the current version
of each image description page (default False)
@return: a generator that yields FilePage objects.
"""
- return self.site.pageimages(self, step=step, total=total,
- content=content)
+ return self.site.pageimages(self, total=total, content=content)
- @deprecated_args(nofollow_redirects=None, get_redirect=None)
- def categories(self, withSortKey=False, step=None, total=None,
- content=False):
+ @deprecated_args(nofollow_redirects=None, get_redirect=None, step=None)
+ def categories(self, withSortKey=False, total=None, content=False):
"""
Iterate categories that the article is in.
@param withSortKey: if True, include the sort key in each Category.
- @param step: limit each API call to this number of pages
@param total: iterate no more than this number of pages in total
@param content: if True, retrieve the content of the current version
of each category description page (default False)
@@ -1494,19 +1485,18 @@
if withSortKey:
raise NotImplementedError('withSortKey is not implemented')
- return self.site.pagecategories(self, step=step, total=total,
- content=content)
+ return self.site.pagecategories(self, total=total, content=content)
- def extlinks(self, step=None, total=None):
+ @deprecated_args(step=None)
+ def extlinks(self, total=None):
"""
Iterate all external URLs (not interwiki links) from this page.
- @param step: limit each API call to this number of pages
@param total: iterate no more than this number of pages in total
@return: a generator that yields unicode objects containing URLs.
@rtype: generator
"""
- return self.site.page_extlinks(self, step=step, total=total)
+ return self.site.page_extlinks(self, total=total)
def coordinates(self, primary_only=False):
"""
@@ -1573,14 +1563,14 @@
else:
return lastmove.target_page
- @deprecated_args(getText='content', reverseOrder='reverse')
- def revisions(self, reverse=False, step=None, total=None, content=False,
+ @deprecated_args(getText='content', reverseOrder='reverse', step=None)
+ def revisions(self, reverse=False, total=None, content=False,
rollback=False, starttime=None, endtime=None):
"""Generator which loads the version history as Revision instances."""
# TODO: Only request uncached revisions
self.site.loadrevisions(self, getText=content, rvdir=reverse,
starttime=starttime, endtime=endtime,
- step=step, total=total, rollback=rollback)
+ total=total, rollback=rollback)
return (self._revisions[rev] for rev in
sorted(self._revisions, reverse=not reverse)[:total])
@@ -1588,9 +1578,9 @@
# returned no more than 500 revisions; now, it iterates
# all revisions unless 'total' argument is used
@deprecated('Page.revisions()')
- @deprecated_args(forceReload=None, revCount='total', getAll=None,
- reverseOrder='reverse')
- def getVersionHistory(self, reverse=False, step=None, total=None):
+ @deprecated_args(forceReload=None, revCount='total', step=None,
+ getAll=None, reverseOrder='reverse')
+ def getVersionHistory(self, reverse=False, total=None):
"""
Load the version history page and return history information.
@@ -1599,20 +1589,18 @@
edit summary. Starts with the most current revision, unless
reverse is True.
- @param step: limit each API call to this number of revisions
@param total: iterate no more than this number of revisions in total
"""
return [rev.hist_entry()
- for rev in self.revisions(reverse=reverse,
- step=step, total=total)
+ for rev in self.revisions(reverse=reverse, total=total)
]
- @deprecated_args(forceReload=None, reverseOrder='reverse')
- def getVersionHistoryTable(self, reverse=False, step=None, total=None):
+ @deprecated_args(forceReload=None, reverseOrder='reverse', step=None)
+ def getVersionHistoryTable(self, reverse=False, total=None):
"""Return the version history as a wiki table."""
result = '{| class="wikitable"\n'
result += '! oldid || date/time || username || edit summary\n'
- for entry in self.revisions(reverse=reverse, step=step, total=total):
+ for entry in self.revisions(reverse=reverse, total=total):
result += '|----\n'
result += ('| {r.revid} || {r.timestamp} || {r.user} || '
'<nowiki>{r.comment}</nowiki>\n'.format(r=entry))
@@ -1620,23 +1608,22 @@
return result
@deprecated("Page.revisions(content=True)")
- @deprecated_args(reverseOrder='reverse', rollback=None)
- def fullVersionHistory(self, reverse=False, step=None, total=None):
+ @deprecated_args(reverseOrder='reverse', rollback=None, step=None)
+ def fullVersionHistory(self, reverse=False, total=None):
"""Iterate previous versions including wikitext.
Takes same arguments as getVersionHistory.
"""
return [rev.full_hist_entry()
for rev in self.revisions(content=True, reverse=reverse,
- step=step, total=total)
+ total=total)
]
- def contributors(self, step=None, total=None,
- starttime=None, endtime=None):
+ @deprecated_args(step=None)
+ def contributors(self, total=None, starttime=None, endtime=None):
"""
Compile contributors of this page with edit counts.
- @param step: limit each API call to this number of revisions
@param total: iterate no more than this number of revisions in total
@param starttime: retrieve revisions starting at this Timestamp
@param endtime: retrieve revisions ending at this Timestamp
@@ -1645,20 +1632,20 @@
@rtype: L{collections.Counter}
"""
return Counter(rev.user for rev in
- self.revisions(step=step, total=total,
+ self.revisions(total=total,
starttime=starttime, endtime=endtime))
@deprecated('contributors()')
- def contributingUsers(self, step=None, total=None):
+ @deprecated_args(step=None)
+ def contributingUsers(self, total=None):
"""
Return a set of usernames (or IPs) of users who edited this page.
- @param step: limit each API call to this number of revisions
@param total: iterate no more than this number of revisions in total
@rtype: set
"""
- return self.contributors(step=step, total=total).keys()
+ return self.contributors(total=total).keys()
def revision_count(self, contributors=None):
"""
@@ -1776,7 +1763,8 @@
self.text = template + self.text
return self.save(summary=reason)
- def loadDeletedRevisions(self, step=None, total=None):
+ @deprecated_args(step=None)
+ def loadDeletedRevisions(self, total=None):
"""
Retrieve deleted revisions for this Page.
@@ -1789,7 +1777,7 @@
"""
if not hasattr(self, "_deletedRevs"):
self._deletedRevs = {}
- for item in self.site.deletedrevs(self, step=step, total=total):
+ for item in self.site.deletedrevs(self, total=total):
for rev in item.get("revisions", []):
self._deletedRevs[rev['timestamp']] = rev
yield rev['timestamp']
@@ -2393,17 +2381,16 @@
return ('{| border="1"\n! date/time || username || resolution || size '
'|| edit summary\n|----\n\n|----\n'.join(lines) + '\n|}')
- def usingPages(self, step=None, total=None, content=False):
+ @deprecated_args(step=None)
+ def usingPages(self, total=None, content=False):
"""
Yield Pages on which the file is displayed.
- @param step: limit each API call to this number of pages
@param total: iterate no more than this number of pages in total
@param content: if True, load the current content of each iterated page
(default False)
"""
- return self.site.imageusage(
- self, step=step, total=total, content=content)
+ return self.site.imageusage(self, total=total, content=content)
wrapper = _ModuleDeprecationWrapper(__name__)
@@ -2447,9 +2434,8 @@
titleWithSortKey = self.title(withSection=False)
return '[[%s]]' % titleWithSortKey
- @deprecated_args(startFrom=None, cacheResults=None)
- def subcategories(self, recurse=False, step=None, total=None,
- content=False):
+ @deprecated_args(startFrom=None, cacheResults=None, step=None)
+ def subcategories(self, recurse=False, total=None, content=False):
"""
Iterate all subcategories of the current category.
@@ -2458,7 +2444,6 @@
levels. (Example: recurse=1 will iterate direct subcats and
first-level sub-sub-cats, but no deeper.)
@type recurse: int or bool
- @param step: limit each API call to this number of categories
@param total: iterate no more than this number of
subcategories in total (at all levels)
@param content: if True, retrieve the content of the current version
@@ -2469,8 +2454,7 @@
if not hasattr(self, "_subcats"):
self._subcats = []
for member in self.site.categorymembers(
- self, member_type='subcat', step=step,
- total=total, content=content):
+ self, member_type='subcat', total=total, content=content):
subcat = Category(member)
self._subcats.append(subcat)
yield subcat
@@ -2480,7 +2464,7 @@
return
if recurse:
for item in subcat.subcategories(
- recurse, step=step, total=total, content=content):
+ recurse, total=total, content=content):
yield item
if total is not None:
total -= 1
@@ -2495,15 +2479,15 @@
return
if recurse:
for item in subcat.subcategories(
- recurse, step=step, total=total, content=content):
+ recurse, total=total, content=content):
yield item
if total is not None:
total -= 1
if total == 0:
return
- @deprecate_arg("startFrom", "startsort")
- def articles(self, recurse=False, step=None, total=None,
+ @deprecated_args(startFrom='startsort', step=None)
+ def articles(self, recurse=False, total=None,
content=False, namespaces=None, sortby=None,
reverse=False, starttime=None, endtime=None,
startsort=None, endsort=None):
@@ -2518,7 +2502,6 @@
levels. (Example: recurse=1 will iterate articles in first-level
subcats, but no deeper.)
@type recurse: int or bool
- @param step: limit each API call to this number of pages
@param total: iterate no more than this number of pages in
total (at all levels)
@param namespaces: only yield pages in the specified namespaces
@@ -2547,7 +2530,7 @@
"""
for member in self.site.categorymembers(self,
namespaces=namespaces,
- step=step, total=total,
+ total=total,
content=content, sortby=sortby,
reverse=reverse,
starttime=starttime,
@@ -2564,8 +2547,8 @@
if recurse:
if not isinstance(recurse, bool) and recurse:
recurse = recurse - 1
- for subcat in self.subcategories(step=step):
- for article in subcat.articles(recurse, step=step, total=total,
+ for subcat in self.subcategories():
+ for article in subcat.articles(recurse, total=total,
content=content,
namespaces=namespaces,
sortby=sortby,
@@ -2581,11 +2564,12 @@
if total == 0:
return
- def members(self, recurse=False, namespaces=None, step=None, total=None,
+ @deprecated_args(step=None)
+ def members(self, recurse=False, namespaces=None, total=None,
content=False):
"""Yield all category contents (subcats, pages, and files)."""
for member in self.site.categorymembers(
- self, namespaces, step=step, total=total, content=content):
+ self, namespaces, total=total, content=content):
yield member
if total is not None:
total -= 1
@@ -2594,10 +2578,9 @@
if recurse:
if not isinstance(recurse, bool) and recurse:
recurse = recurse - 1
- for subcat in self.subcategories(step=step):
+ for subcat in self.subcategories():
for article in subcat.members(
- recurse, namespaces, step=step,
- total=total, content=content):
+ recurse, namespaces, total=total, content=content):
yield article
if total is not None:
total -= 1
diff --git a/pywikibot/pagegenerators.py b/pywikibot/pagegenerators.py
index 2a22663..ce44f2b 100644
--- a/pywikibot/pagegenerators.py
+++ b/pywikibot/pagegenerators.py
@@ -14,7 +14,7 @@
¶ms;
"""
#
-# (C) Pywikibot team, 2008-2015
+# (C) Pywikibot team, 2008-2016
#
# Distributed under the terms of the MIT license.
#
@@ -191,10 +191,6 @@
-prefixindex Work on pages commencing with a common prefix.
--step:n When used with any other argument that specifies a set
- of pages, only retrieve n pages at a time from the wiki
- server.
-
-subpage:n Filters pages to only those that have depth n
i.e. a depth of 0 filters out all pages that are subpages, and
a depth of 1 filters out all pages that are subpages of subpages.
@@ -351,7 +347,6 @@
"""
self.gens = []
self._namespaces = []
- self.step = None
self.limit = None
self.qualityfilter_list = []
self.articlefilter_list = []
@@ -416,8 +411,6 @@
if isinstance(self.gens[i], pywikibot.data.api.QueryGenerator):
if self.namespaces:
self.gens[i].set_namespace(self.namespaces)
- if self.step:
- self.gens[i].set_query_increment(self.step)
if self.limit:
self.gens[i].set_maximum_items(self.limit)
else:
@@ -662,10 +655,9 @@
self._namespaces += value.split(",")
return True
elif arg == '-step':
- if not value:
- value = pywikibot.input('What is the step value?')
- self.step = int(value)
- return True
+ issue_deprecation_warning(
+ 'The usage of "{0}"'.format(arg), 2, ArgumentDeprecationWarning)
+ return False
elif arg == '-limit':
if not value:
value = pywikibot.input('What is the limit value?')
@@ -868,16 +860,15 @@
return False
+@deprecated_args(step=None)
def AllpagesPageGenerator(start='!', namespace=0, includeredirects=True,
- site=None, step=None, total=None, content=False):
+ site=None, total=None, content=False):
"""
Iterate Page objects for all titles in a single namespace.
If includeredirects is False, redirects are not included. If
includeredirects equals the string 'only', only redirects are added.
- @param step: Maximum number of pages to retrieve per API query
- @type step: int
@param total: Maxmum number of pages to retrieve in total
@type total: int
@param content: If True, load current version of each page (default False)
@@ -895,18 +886,16 @@
else:
filterredir = False
return site.allpages(start=start, namespace=namespace,
- filterredir=filterredir, step=step, total=total,
- content=content)
+ filterredir=filterredir, total=total, content=content)
+@deprecated_args(step=None)
def PrefixingPageGenerator(prefix, namespace=None, includeredirects=True,
- site=None, step=None, total=None, content=False):
+ site=None, total=None, content=False):
"""
Prefixed Page generator.
- @param step: Maximum number of pages to retrieve per API query
- @type step: int
- @param total: Maxmum number of pages to retrieve in total
+ @param total: Maximum number of pages to retrieve in total
@type total: int
@param content: If True, load current version of each page (default False)
@param site: Site for generator results.
@@ -926,8 +915,7 @@
else:
filterredir = False
return site.allpages(prefix=title, namespace=namespace,
- filterredir=filterredir, step=step, total=total,
- content=content)
+ filterredir=filterredir, total=total, content=content)
@deprecated_args(number="total", mode="logtype", repeat=None)
@@ -985,14 +973,12 @@
site=site, namespace=namespace)
-@deprecated_args(number='total', namespace='namespaces', repeat=None,
- get_redirect=None)
-def NewpagesPageGenerator(site=None, namespaces=[0], step=None, total=None):
+@deprecated_args(number='total', step=None, namespace='namespaces',
+ repeat=None, get_redirect=None)
+def NewpagesPageGenerator(site=None, namespaces=[0], total=None):
"""
Iterate Page objects for all new titles in a single namespace.
- @param step: Maximum number of pages to retrieve per API query
- @type step: int
@param total: Maxmium number of pages to retrieve in total
@type total: int
@param site: Site for generator results.
@@ -1004,17 +990,17 @@
if site is None:
site = pywikibot.Site()
for item in site.recentchanges(changetype='new', namespaces=namespaces,
- step=step, total=total):
+ total=total):
yield pywikibot.Page(pywikibot.Link(item["title"], site))
-@deprecated_args(nobots=None)
+@deprecated_args(nobots=None, step=None)
def RecentChangesPageGenerator(start=None, end=None, reverse=False,
namespaces=None, pagelist=None,
changetype=None, showMinor=None,
showBot=None, showAnon=None,
showRedirects=None, showPatrolled=None,
- topOnly=False, step=None, total=None,
+ topOnly=False, total=None,
user=None, excludeuser=None, site=None,
_filter_unique=None):
"""
@@ -1067,7 +1053,7 @@
showBot=showBot, showAnon=showAnon,
showRedirects=showRedirects,
showPatrolled=showPatrolled,
- topOnly=topOnly, step=step, total=total,
+ topOnly=topOnly, total=total,
user=user, excludeuser=excludeuser)
gen = (pywikibot.Page(site, x['title'])
@@ -1078,12 +1064,11 @@
return gen
-def UnconnectedPageGenerator(site=None, step=None, total=None):
+@deprecated_args(step=None)
+def UnconnectedPageGenerator(site=None, total=None):
"""
Iterate Page objects for all unconnected pages to a Wikibase repository.
- @param step: Maximum number of pages to retrieve per API query
- @type step: int
@param total: Maximum number of pages to retrieve in total
@type total: int
@param site: Site for generator results.
@@ -1093,19 +1078,20 @@
site = pywikibot.Site()
if not site.data_repository():
raise ValueError('The given site does not have Wikibase repository.')
- for page in site.unconnected_pages(step=step, total=total):
+ for page in site.unconnected_pages(total=total):
yield page
-@deprecated_args(referredImagePage='referredFilePage')
-def FileLinksGenerator(referredFilePage, step=None, total=None, content=False):
+@deprecated_args(referredImagePage='referredFilePage', step=None)
+def FileLinksGenerator(referredFilePage, total=None, content=False):
"""Yield Pages on which the file referredFilePage is displayed."""
- return referredFilePage.usingPages(step=step, total=total, content=content)
+ return referredFilePage.usingPages(total=total, content=content)
-def ImagesPageGenerator(pageWithImages, step=None, total=None, content=False):
+@deprecated_args(step=None)
+def ImagesPageGenerator(pageWithImages, total=None, content=False):
"""Yield FilePages displayed on pageWithImages."""
- return pageWithImages.imagelinks(step=step, total=total, content=content)
+ return pageWithImages.imagelinks(total=total, content=content)
def InterwikiPageGenerator(page):
@@ -1114,26 +1100,29 @@
yield pywikibot.Page(link)
-def LanguageLinksPageGenerator(page, step=None, total=None):
+@deprecated_args(step=None)
+def LanguageLinksPageGenerator(page, total=None):
"""Iterate over all interwiki language links on a page."""
- for link in page.iterlanglinks(step=step, total=total):
+ for link in page.iterlanglinks(total=total):
yield pywikibot.Page(link)
+@deprecated_args(step=None)
def ReferringPageGenerator(referredPage, followRedirects=False,
withTemplateInclusion=True,
onlyTemplateInclusion=False,
- step=None, total=None, content=False):
+ total=None, content=False):
"""Yield all pages referring to a specific page."""
return referredPage.getReferences(
follow_redirects=followRedirects,
withTemplateInclusion=withTemplateInclusion,
onlyTemplateInclusion=onlyTemplateInclusion,
- step=step, total=total, content=content)
+ total=total, content=content)
+@deprecated_args(step=None)
def CategorizedPageGenerator(category, recurse=False, start=None,
- step=None, total=None, content=False,
+ total=None, content=False,
namespaces=None):
"""Yield all pages in a specific category.
@@ -1149,7 +1138,7 @@
retrieved page will be downloaded.
"""
- kwargs = dict(recurse=recurse, step=step, total=total,
+ kwargs = dict(recurse=recurse, total=total,
content=content, namespaces=namespaces)
if start:
kwargs['sortby'] = 'sortkey'
@@ -1158,8 +1147,9 @@
yield a
+@deprecated_args(step=None)
def SubCategoriesPageGenerator(category, recurse=False, start=None,
- step=None, total=None, content=False):
+ total=None, content=False):
"""Yield all subcategories in a specific category.
If recurse is True, pages in subcategories are included as well; if
@@ -1175,21 +1165,20 @@
"""
# TODO: page generator could be modified to use cmstartsortkey ...
- for s in category.subcategories(recurse=recurse, step=step,
+ for s in category.subcategories(recurse=recurse,
total=total, content=content):
if start is None or s.title(withNamespace=False) >= start:
yield s
-def LinkedPageGenerator(linkingPage, step=None, total=None, content=False):
+@deprecated_args(step=None)
+def LinkedPageGenerator(linkingPage, total=None, content=False):
"""Yield all pages linked from a specific page.
See L{pywikibot.page.BasePage.linkedPages} for details.
@param linkingPage: the page that links to the pages we want
@type linkingPage: L{pywikibot.Page}
- @param step: the limit number of pages to retrieve per API call
- @type step: int
@param total: the total number of pages to iterate
@type total: int
@param content: if True, retrieve the current content of each linked page
@@ -1197,7 +1186,7 @@
@return: a generator that yields Page objects of pages linked to linkingPage
@rtype: generator
"""
- return linkingPage.linkedPages(step=step, total=total, content=content)
+ return linkingPage.linkedPages(total=total, content=content)
def TextfilePageGenerator(filename=None, site=None):
@@ -1253,14 +1242,11 @@
yield pywikibot.Page(pywikibot.Link(title, site))
-@deprecated_args(number="total")
+@deprecated_args(number='total', step=None)
def UserContributionsGenerator(username, namespaces=None, site=None,
- step=None, total=None,
- _filter_unique=filter_unique):
+ total=None, _filter_unique=filter_unique):
"""Yield unique pages edited by user:username.
- @param step: Maximum number of pages to retrieve per API query
- @type step: int
@param total: Maximum number of pages to retrieve in total
@type total: int
@param namespaces: list of namespace numbers to fetch contribs from
@@ -1274,7 +1260,7 @@
return _filter_unique(
pywikibot.Page(pywikibot.Link(contrib["title"], source=site))
for contrib in site.usercontribs(user=username, namespaces=namespaces,
- step=step, total=total)
+ total=total)
)
@@ -1768,14 +1754,14 @@
yield item
-@deprecated_args(pageNumber="step", lookahead=None)
-def PreloadingGenerator(generator, step=50):
+@deprecated_args(pageNumber='groupsize', step='groupsize', lookahead=None)
+def PreloadingGenerator(generator, groupsize=50):
"""
Yield preloaded pages taken from another generator.
@param generator: pages to iterate over
- @param step: how many pages to preload at once
- @type step: int
+ @param groupsize: how many pages to preload at once
+ @type groupsize: int
"""
# pages may be on more than one site, for example if an interwiki
# generator is used, so use a separate preloader for each site
@@ -1784,26 +1770,27 @@
for page in generator:
site = page.site
sites.setdefault(site, []).append(page)
- if len(sites[site]) >= step:
- # if this site is at the step, process it
+ if len(sites[site]) >= groupsize:
+ # if this site is at the groupsize, process it
group = sites[site]
sites[site] = []
- for i in site.preloadpages(group, step):
+ for i in site.preloadpages(group, groupsize):
yield i
for site in sites:
if sites[site]:
- # process any leftover sites that never reached the step
- for i in site.preloadpages(sites[site], step):
+ # process any leftover sites that never reached the groupsize
+ for i in site.preloadpages(sites[site], groupsize):
yield i
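The per-site batching that PreloadingGenerator performs, accumulating pages per site, flushing a batch whenever one site reaches `groupsize`, then flushing any leftovers at the end, can be sketched generically. `grouped` and the parity example below are hypothetical illustrations, not pywikibot API:

```python
def grouped(items, key, groupsize=50):
    """Sketch of the batching pattern used by PreloadingGenerator:
    collect items per key, yield a full batch whenever one key
    reaches groupsize, and flush the remainders afterwards."""
    pending = {}
    for item in items:
        k = key(item)
        pending.setdefault(k, []).append(item)
        if len(pending[k]) >= groupsize:
            # This key is at the groupsize: emit and reset its batch.
            batch, pending[k] = pending[k], []
            yield k, batch
    # Process any leftover keys that never reached the groupsize.
    for k, rest in pending.items():
        if rest:
            yield k, rest


# e.g. batching the numbers 0..6 by parity, three at a time
batches = list(grouped(range(7), key=lambda x: x % 2, groupsize=3))
```

Here `batches` contains `(0, [0, 2, 4])` and `(1, [1, 3, 5])` as full groups, followed by the leftover `(0, [6])`.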
-def DequePreloadingGenerator(generator, step=50):
+@deprecated_args(step='groupsize')
+def DequePreloadingGenerator(generator, groupsize=50):
"""Preload generator of type DequeGenerator."""
assert isinstance(generator, DequeGenerator), \
'generator must be a DequeGenerator object'
while True:
- page_count = min(len(generator), step)
+ page_count = min(len(generator), groupsize)
if not page_count:
return
@@ -1811,15 +1798,16 @@
yield page
-def PreloadingItemGenerator(generator, step=50):
+@deprecated_args(step='groupsize')
+def PreloadingItemGenerator(generator, groupsize=50):
"""
Yield preloaded pages taken from another generator.
Function is basically copied from above, but for ItemPages.
@param generator: pages to iterate over
- @param step: how many pages to preload at once
- @type step: int
+ @param groupsize: how many pages to preload at once
+ @type groupsize: int
"""
sites = {}
for page in generator:
@@ -1835,26 +1823,24 @@
site = page.site
sites.setdefault(site, []).append(page)
- if len(sites[site]) >= step:
- # if this site is at the step, process it
+ if len(sites[site]) >= groupsize:
+ # if this site is at the groupsize, process it
group = sites[site]
sites[site] = []
- for i in site.preloaditempages(group, step):
+ for i in site.preloaditempages(group, groupsize):
yield i
for site in sites:
if sites[site]:
- # process any leftover sites that never reached the step
- for i in site.preloaditempages(sites[site], step):
+ # process any leftover sites that never reached the groupsize
+ for i in site.preloaditempages(sites[site], groupsize):
yield i
-@deprecated_args(number='total', repeat=None)
-def NewimagesPageGenerator(step=None, total=None, site=None):
+@deprecated_args(number='total', step=None, repeat=None)
+def NewimagesPageGenerator(total=None, site=None):
"""
New file generator.
- @param step: Maximum number of pages to retrieve per API query
- @type step: int
@param total: Maximum number of pages to retrieve in total
@type total: int
@param site: Site for generator results.
@@ -1862,7 +1848,7 @@
"""
if site is None:
site = pywikibot.Site()
- for entry in site.logevents(logtype="upload", step=step, total=total):
+ for entry in site.logevents(logtype='upload', total=total):
# entry is an UploadEntry object
# entry.page() returns a Page object
yield entry.page()
@@ -2165,8 +2151,8 @@
yield page
-@deprecated_args(link='url', euprotocol='protocol')
-def LinksearchPageGenerator(url, namespaces=None, step=None, total=None,
+@deprecated_args(link='url', euprotocol='protocol', step=None)
+def LinksearchPageGenerator(url, namespaces=None, total=None,
site=None, protocol='http'):
"""Yield all pages that link to a certain URL, like Special:Linksearch.
@@ -2176,8 +2162,6 @@
@type url: str
@param namespaces: list of namespace numbers to fetch contribs from
@type namespaces: list of int
- @param step: Maximum number of pages to retrieve per API query
- @type step: int
@param total: Maximum number of pages to retrieve in total
@type total: int
@param site: Site for generator results.
@@ -2186,17 +2170,14 @@
if site is None:
site = pywikibot.Site()
return site.exturlusage(url, namespaces=namespaces, protocol=protocol,
- step=step, total=total, content=False)
+ total=total, content=False)
-@deprecated_args(number='total')
-def SearchPageGenerator(query, step=None, total=None, namespaces=None,
- site=None):
+@deprecated_args(number='total', step=None)
+def SearchPageGenerator(query, total=None, namespaces=None, site=None):
"""
Yield pages from the MediaWiki internal search engine.
- @param step: Maximum number of pages to retrieve per API query
- @type step: int
@param total: Maximum number of pages to retrieve in total
@type total: int
@param site: Site for generator results.
@@ -2204,8 +2185,7 @@
"""
if site is None:
site = pywikibot.Site()
- for page in site.search(query, step=step, total=total,
- namespaces=namespaces):
+ for page in site.search(query, total=total, namespaces=namespaces):
yield page
diff --git a/pywikibot/site.py b/pywikibot/site.py
index 39ac5cb..aac59f5 100644
--- a/pywikibot/site.py
+++ b/pywikibot/site.py
@@ -1886,8 +1886,9 @@
"""Return whether this site has an API."""
return True
+ @deprecated_args(step=None)
def _generator(self, gen_class, type_arg=None, namespaces=None,
- step=None, total=None, **args):
+ total=None, **args):
"""Convenience method that returns an API generator.
All generic keyword arguments are passed as MW API parameter except for
@@ -1904,8 +1905,6 @@
@type namespaces: iterable of basestring or Namespace key,
or a single instance of those types. May be a '|' separated
list of namespace identifiers.
- @param step: if not None, limit each API call to this many items
- @type step: int
@param total: if not None, limit the generator to yielding this many
items in total
@type total: int
@@ -1925,8 +1924,6 @@
gen = gen_class(**req_args)
if namespaces is not None:
gen.set_namespace(namespaces)
- if step is not None and int(step) > 0:
- gen.set_query_increment(int(step))
if total is not None and int(total) > 0:
gen.set_maximum_items(int(total))
return gen
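With `step` removed, `_generator` limits its output only through `set_maximum_items`, i.e. a cap on the total number of yielded items rather than a per-request batch size. The total-only capping behaves like `itertools.islice`; a minimal sketch (`capped` is a hypothetical helper, not part of the API):

```python
import itertools


def capped(iterable, total=None):
    """Sketch of the total-only limiting now done in _generator:
    if total is set and positive, stop after that many items;
    otherwise pass the iterable through unchanged."""
    if total is not None and int(total) > 0:
        return itertools.islice(iterable, int(total))
    return iter(iterable)
```

Note that, mirroring the replaced code, a non-positive `total` is treated the same as no limit.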
@@ -3338,7 +3335,7 @@
# following group of methods map more-or-less directly to API queries
def pagebacklinks(self, page, followRedirects=False, filterRedirects=None,
- namespaces=None, step=None, total=None, content=False):
+ namespaces=None, total=None, content=False):
"""Iterate all pages that link to the given page.
@param page: The Page to get links to.
@@ -3352,7 +3349,6 @@
@type namespaces: iterable of basestring or Namespace key,
or a single instance of those types. May be a '|' separated
list of namespace identifiers.
- @param step: Limit on number of pages to retrieve per API query.
@param total: Maximum number of pages to retrieve in total.
@param content: if True, load the current content of each iterated page
(default False)
@@ -3366,7 +3362,7 @@
blargs["gblfilterredir"] = (filterRedirects and "redirects" or
"nonredirects")
blgen = self._generator(api.PageGenerator, type_arg="backlinks",
- namespaces=namespaces, step=step, total=total,
+ namespaces=namespaces, total=total,
g_content=content, **blargs)
if followRedirects:
# links identified by MediaWiki as redirects may not really be,
@@ -3397,8 +3393,9 @@
return itertools.chain(*list(genlist.values()))
return blgen
+ @deprecated_args(step=None)
def page_embeddedin(self, page, filterRedirects=None, namespaces=None,
- step=None, total=None, content=False):
+ total=None, content=False):
"""Iterate all pages that embedded the given page as a template.
@param page: The Page to get inclusions for.
@@ -3422,13 +3419,14 @@
eiargs["geifilterredir"] = (filterRedirects and "redirects" or
"nonredirects")
eigen = self._generator(api.PageGenerator, type_arg="embeddedin",
- namespaces=namespaces, step=step, total=total,
+ namespaces=namespaces, total=total,
g_content=content, **eiargs)
return eigen
+ @deprecated_args(step=None)
def pagereferences(self, page, followRedirects=False, filterRedirects=None,
withTemplateInclusion=True, onlyTemplateInclusion=False,
- namespaces=None, step=None, total=None, content=False):
+ namespaces=None, total=None, content=False):
"""
Convenience method combining pagebacklinks and page_embeddedin.
@@ -3444,24 +3442,25 @@
if onlyTemplateInclusion:
return self.page_embeddedin(page, namespaces=namespaces,
filterRedirects=filterRedirects,
- step=step, total=total, content=content)
+ total=total, content=content)
if not withTemplateInclusion:
return self.pagebacklinks(page, followRedirects=followRedirects,
filterRedirects=filterRedirects,
namespaces=namespaces,
- step=step, total=total, content=content)
+ total=total, content=content)
return itertools.islice(
itertools.chain(
self.pagebacklinks(
page, followRedirects, filterRedirects,
- namespaces=namespaces, step=step, content=content),
+ namespaces=namespaces, content=content),
self.page_embeddedin(
page, filterRedirects, namespaces=namespaces,
- step=step, content=content)
+ content=content)
), total)
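In the combined branch, pagereferences chains both iterators and applies `total` once to the merged stream instead of passing it to each source, which is why the inner calls above no longer carry a limit. The chain-then-islice idiom can be sketched in isolation (`references` here is a hypothetical stand-in for the real method):

```python
import itertools


def references(backlinks, embeddedin, total=None):
    """Sketch of combining two page sources the way pagereferences
    does: chain the backlink and transclusion iterators, then cap
    the combined stream with islice. islice's stop argument of
    None means no limit."""
    return itertools.islice(itertools.chain(backlinks, embeddedin), total)
```

A cap of 3 over a 2+2 stream therefore yields both backlinks and one embedded page.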
+ @deprecated_args(step=None)
def pagelinks(self, page, namespaces=None, follow_redirects=False,
- step=None, total=None, content=False):
+ total=None, content=False):
"""Iterate internal wikilinks contained (or transcluded) on page.
@param namespaces: Only iterate pages in these namespaces (default: all)
@@ -3483,13 +3482,14 @@
pltitle = page.title(withSection=False).encode(self.encoding())
plargs['titles'] = pltitle
plgen = self._generator(api.PageGenerator, type_arg="links",
- namespaces=namespaces, step=step, total=total,
+ namespaces=namespaces, total=total,
g_content=content, redirects=follow_redirects,
**plargs)
return plgen
- @deprecate_arg("withSortKey", None) # Sortkey doesn't work with generator
- def pagecategories(self, page, step=None, total=None, content=False):
+ # Sortkey doesn't work with generator
+ @deprecated_args(withSortKey=None, step=None)
+ def pagecategories(self, page, total=None, content=False):
"""Iterate categories to which page belongs.
@param content: if True, load the current content of each iterated page
@@ -3503,11 +3503,12 @@
clargs['titles'] = page.title(
withSection=False).encode(self.encoding())
clgen = self._generator(api.PageGenerator,
- type_arg="categories", step=step, total=total,
+ type_arg='categories', total=total,
g_content=content, **clargs)
return clgen
- def pageimages(self, page, step=None, total=None, content=False):
+ @deprecated_args(step=None)
+ def pageimages(self, page, total=None, content=False):
"""Iterate images used (not just linked) on the page.
@param content: if True, load the current content of each iterated page
@@ -3517,12 +3518,12 @@
"""
imtitle = page.title(withSection=False).encode(self.encoding())
imgen = self._generator(api.PageGenerator, type_arg="images",
- titles=imtitle, step=step, total=total,
+ titles=imtitle, total=total,
g_content=content)
return imgen
- def pagetemplates(self, page, namespaces=None, step=None, total=None,
- content=False):
+ @deprecated_args(step=None)
+ def pagetemplates(self, page, namespaces=None, total=None, content=False):
"""Iterate templates transcluded (not just linked) on the page.
@param namespaces: Only iterate pages in these namespaces
@@ -3539,12 +3540,13 @@
tltitle = page.title(withSection=False).encode(self.encoding())
tlgen = self._generator(api.PageGenerator, type_arg="templates",
titles=tltitle, namespaces=namespaces,
- step=step, total=total, g_content=content)
+ total=total, g_content=content)
return tlgen
+ @deprecated_args(step=None)
def categorymembers(self, category, namespaces=None, sortby=None,
reverse=False, starttime=None, endtime=None,
- startsort=None, endsort=None, step=None, total=None,
+ startsort=None, endsort=None, total=None,
content=False, member_type=None):
"""Iterate members of specified category.
@@ -3680,14 +3682,14 @@
"invalid combination of 'sortby' and 'endsort'")
cmgen = self._generator(api.PageGenerator, namespaces=namespaces,
- step=step, total=total, g_content=content,
- **cmargs)
+ total=total, g_content=content, **cmargs)
return cmgen
def loadrevisions(self, page, getText=False, revids=None,
startid=None, endid=None, starttime=None,
endtime=None, rvdir=None, user=None, excludeuser=None,
- section=None, sysop=False, step=None, total=None, rollback=False):
+ section=None, sysop=False, step=None, total=None,
+ rollback=False):
"""Retrieve and store revision information.
By default, retrieves the last (current) revision of the page,
@@ -3796,8 +3798,9 @@
# TODO if sysop: something
# assemble API request
- rvgen = self._generator(api.PropertyGenerator,
- step=step, total=total, **rvargs)
+ rvgen = self._generator(api.PropertyGenerator, total=total, **rvargs)
+ if step is not None and int(step) > 0:
+ rvgen.set_query_increment(int(step))
if latest or "revids" in rvgen.request:
rvgen.set_maximum_items(-1) # suppress use of rvlimit parameter
@@ -3822,8 +3825,8 @@
parsed_text = data['parse']['text']['*']
return parsed_text
- def pagelanglinks(self, page, step=None, total=None,
- include_obsolete=False):
+ @deprecated_args(step=None)
+ def pagelanglinks(self, page, total=None, include_obsolete=False):
"""Iterate all interlanguage links on page, yielding Link objects.
@param include_obsolete: if true, yield even Link objects whose
@@ -3833,7 +3836,7 @@
llquery = self._generator(api.PropertyGenerator,
type_arg="langlinks",
titles=lltitle.encode(self.encoding()),
- step=step, total=total)
+ total=total)
for pageitem in llquery:
if not self.sametitle(pageitem['title'], lltitle):
raise Error(
@@ -3850,12 +3853,13 @@
else:
yield link
- def page_extlinks(self, page, step=None, total=None):
+ @deprecated_args(step=None)
+ def page_extlinks(self, page, total=None):
"""Iterate all external links on page, yielding URL strings."""
eltitle = page.title(withSection=False)
elquery = self._generator(api.PropertyGenerator, type_arg="extlinks",
titles=eltitle.encode(self.encoding()),
- step=step, total=total)
+ total=total)
for pageitem in elquery:
if not self.sametitle(pageitem['title'], eltitle):
raise RuntimeError(
@@ -3884,12 +3888,12 @@
'subcats': 0}
return category._catinfo
- @deprecated_args(throttle=None, limit='total',
+ @deprecated_args(throttle=None, limit='total', step=None,
includeredirects='filterredir')
def allpages(self, start="!", prefix="", namespace=0, filterredir=None,
filterlanglinks=None, minsize=None, maxsize=None,
protect_type=None, protect_level=None, reverse=False,
- step=None, total=None, content=False):
+ total=None, content=False):
"""Iterate pages in a single namespace.
@param start: Start at this title (page need not exist).
@@ -3933,7 +3937,7 @@
apgen = self._generator(api.PageGenerator, type_arg="allpages",
namespaces=namespace,
- gapfrom=start, step=step, total=total,
+ gapfrom=start, total=total,
g_content=content)
if prefix:
apgen.request["gapprefix"] = prefix
@@ -3965,8 +3969,9 @@
return self.allpages(prefix=prefix, namespace=namespace,
filterredir=includeredirects)
+ @deprecated_args(step=None)
def alllinks(self, start="!", prefix="", namespace=0, unique=False,
- fromids=False, step=None, total=None):
+ fromids=False, total=None):
"""Iterate all links to pages (which need not exist) in one namespace.
Note that, in practice, links that were found on pages that have
@@ -3990,7 +3995,7 @@
raise Error("alllinks: unique and fromids cannot both be True.")
algen = self._generator(api.ListGenerator, type_arg="alllinks",
namespaces=namespace, alfrom=start,
- step=step, total=total, alunique=unique)
+ total=total, alunique=unique)
if prefix:
algen.request["alprefix"] = prefix
if fromids:
@@ -4001,7 +4006,8 @@
p._fromid = link['fromid']
yield p
- def allcategories(self, start="!", prefix="", step=None, total=None,
+ @deprecated_args(step=None)
+ def allcategories(self, start='!', prefix='', total=None,
reverse=False, content=False):
"""Iterate categories used (which need not have a Category page).
@@ -4019,7 +4025,7 @@
"""
acgen = self._generator(api.PageGenerator,
type_arg="allcategories", gacfrom=start,
- step=step, total=total, g_content=content)
+ total=total, g_content=content)
if prefix:
acgen.request["gacprefix"] = prefix
if reverse:
@@ -4039,7 +4045,8 @@
"""Return True is username is a bot user."""
return username in [userdata['name'] for userdata in self.botusers()]
- def botusers(self, step=None, total=None):
+ @deprecated_args(step=None)
+ def botusers(self, total=None):
"""Iterate bot users.
Iterated values are dicts containing 'name', 'userid', 'editcount',
@@ -4052,14 +4059,14 @@
self._bots = {}
if not self._bots:
- for item in self.allusers(group='bot', step=step, total=total):
+ for item in self.allusers(group='bot', total=total):
self._bots.setdefault(item['name'], item)
for value in self._bots.values():
yield value
- def allusers(self, start="!", prefix="", group=None, step=None,
- total=None):
+ @deprecated_args(step=None)
+ def allusers(self, start='!', prefix='', group=None, total=None):
"""Iterate registered users, ordered by username.
Iterated values are dicts containing 'name', 'editcount',
@@ -4075,15 +4082,16 @@
"""
augen = self._generator(api.ListGenerator, type_arg="allusers",
auprop="editcount|groups|registration",
- aufrom=start, step=step, total=total)
+ aufrom=start, total=total)
if prefix:
augen.request["auprefix"] = prefix
if group:
augen.request["augroup"] = group
return augen
+ @deprecated_args(step=None)
def allimages(self, start="!", prefix="", minsize=None, maxsize=None,
- reverse=False, sha1=None, sha1base36=None, step=None,
+ reverse=False, sha1=None, sha1base36=None,
total=None, content=False):
"""Iterate all images, ordered by image title.
@@ -4103,7 +4111,7 @@
"""
aigen = self._generator(api.PageGenerator,
type_arg="allimages", gaifrom=start,
- step=step, total=total, g_content=content)
+ total=total, g_content=content)
if prefix:
aigen.request["gaiprefix"] = prefix
if isinstance(minsize, int):
@@ -4118,8 +4126,9 @@
aigen.request["gaisha1base36"] = sha1base36
return aigen
+ @deprecated_args(step=None)
def blocks(self, starttime=None, endtime=None, reverse=False,
- blockids=None, users=None, step=None, total=None):
+ blockids=None, users=None, total=None):
"""Iterate all current blocks, in order of creation.
Note that logevents only logs user blocks, while this method
@@ -4145,7 +4154,7 @@
"blocks: "
"endtime must be before starttime with reverse=False")
bkgen = self._generator(api.ListGenerator, type_arg="blocks",
- step=step, total=total)
+ total=total)
bkgen.request["bkprop"] = "id|user|by|timestamp|expiry|reason|range|flags"
if starttime:
bkgen.request["bkstart"] = starttime
@@ -4165,8 +4174,9 @@
bkgen.request["bkusers"] = users
return bkgen
+ @deprecated_args(step=None)
def exturlusage(self, url=None, protocol="http", namespaces=None,
- step=None, total=None, content=False):
+ total=None, content=False):
"""Iterate Pages that contain links to the given URL.
@param url: The URL to search for (without the protocol prefix);
@@ -4177,12 +4187,13 @@
"""
eugen = self._generator(api.PageGenerator, type_arg="exturlusage",
geuquery=url, geuprotocol=protocol,
- namespaces=namespaces, step=step,
+ namespaces=namespaces,
total=total, g_content=content)
return eugen
+ @deprecated_args(step=None)
def imageusage(self, image, namespaces=None, filterredir=None,
- step=None, total=None, content=False):
+ total=None, content=False):
"""Iterate Pages that contain links to the given FilePage.
@param image: the image to search for (FilePage need not exist on
@@ -4205,13 +4216,13 @@
iuargs['giufilterredir'] = ('redirects' if filterredir else
'nonredirects')
iugen = self._generator(api.PageGenerator, type_arg="imageusage",
- namespaces=namespaces, step=step,
+ namespaces=namespaces,
total=total, g_content=content, **iuargs)
return iugen
+ @deprecated_args(step=None)
def logevents(self, logtype=None, user=None, page=None, namespace=None,
- start=None, end=None, reverse=False, tag=None,
- step=None, total=None):
+ start=None, end=None, reverse=False, tag=None, total=None):
"""Iterate all log entries.
@param logtype: only iterate entries of this type (see wiki
@@ -4233,8 +4244,6 @@
@type reverse: bool
@param tag: only iterate entries tagged with this tag
@type tag: basestring
- @param step: request batch size
- @type step: int
@param total: maximum number of events to iterate
@type total: int
@rtype: iterable
@@ -4247,7 +4256,7 @@
self.assert_valid_iter_params('logevents', start, end, reverse)
legen = self._generator(api.LogEntryListGenerator, type_arg=logtype,
- step=step, total=total)
+ total=total)
if logtype is not None:
legen.request["letype"] = logtype
if user is not None:
@@ -4302,14 +4311,14 @@
@deprecated_args(returndict=None, nobots=None, rcshow=None, rcprop=None,
rctype='changetype', revision=None, repeat=None,
- rcstart='start', rcend='end', rcdir=None,
+ rcstart='start', rcend='end', rcdir=None, step=None,
includeredirects='showRedirects', namespace='namespaces',
rcnamespace='namespaces', number='total', rclimit='total')
def recentchanges(self, start=None, end=None, reverse=False,
namespaces=None, pagelist=None, changetype=None,
showMinor=None, showBot=None, showAnon=None,
showRedirects=None, showPatrolled=None, topOnly=False,
- step=None, total=None, user=None, excludeuser=None):
+ total=None, user=None, excludeuser=None):
"""Iterate recent changes.
@param start: Timestamp to start listing from
@@ -4360,7 +4369,7 @@
rcgen = self._generator(api.ListGenerator, type_arg="recentchanges",
rcprop="user|comment|timestamp|title|ids"
"|sizes|redirect|loginfo|flags",
- namespaces=namespaces, step=step,
+ namespaces=namespaces,
total=total, rctoponly=topOnly)
if start is not None:
rcgen.request["rcstart"] = start
@@ -4397,10 +4406,10 @@
return rcgen
- @deprecated_args(number='total', key='searchstring',
+ @deprecated_args(number='total', step=None, key='searchstring',
getredirects='get_redirects')
def search(self, searchstring, namespaces=None, where="text",
- get_redirects=False, step=None, total=None, content=False):
+ get_redirects=False, total=None, content=False):
"""Iterate Pages that contain the searchstring.
Note that this may include non-existing Pages if the wiki's database
@@ -4433,15 +4442,16 @@
namespaces = [0]
srgen = self._generator(api.PageGenerator, type_arg="search",
gsrsearch=searchstring, gsrwhat=where,
- namespaces=namespaces, step=step,
+ namespaces=namespaces,
total=total, g_content=content)
if MediaWikiVersion(self.version()) < MediaWikiVersion('1.23'):
srgen.request['gsrredirects'] = get_redirects
return srgen
+ @deprecated_args(step=None)
def usercontribs(self, user=None, userprefix=None, start=None, end=None,
reverse=False, namespaces=None, showMinor=None,
- step=None, total=None, top_only=False):
+ total=None, top_only=False):
"""Iterate contributions by a particular user.
Iterated values are in the same format as recentchanges.
@@ -4473,7 +4483,7 @@
ucgen = self._generator(api.ListGenerator, type_arg="usercontribs",
ucprop="ids|title|timestamp|comment|flags",
- namespaces=namespaces, step=step,
+ namespaces=namespaces,
total=total, uctoponly=top_only)
if user:
ucgen.request["ucuser"] = user
@@ -4490,9 +4500,10 @@
ucgen.request['ucshow'] = option_set
return ucgen
+ @deprecated_args(step=None)
def watchlist_revs(self, start=None, end=None, reverse=False,
namespaces=None, showMinor=None, showBot=None,
- showAnon=None, step=None, total=None):
+ showAnon=None, total=None):
"""Iterate revisions to pages on the bot user's watchlist.
Iterated values will be in same format as recentchanges.
@@ -4520,7 +4531,7 @@
wlgen = self._generator(api.ListGenerator, type_arg="watchlist",
wlprop="user|comment|timestamp|title|ids|flags",
wlallrev="", namespaces=namespaces,
- step=step, total=total)
+ total=total)
# TODO: allow users to ask for "patrol" as well?
if start is not None:
wlgen.request["wlstart"] = start
@@ -4536,8 +4547,9 @@
return wlgen
# TODO: T75370
+ @deprecated_args(step=None)
def deletedrevs(self, page, start=None, end=None, reverse=None,
- get_text=False, step=None, total=None):
+ get_text=False, total=None):
"""Iterate deleted revisions.
Each value returned by the iterator will be a dict containing the
@@ -4583,7 +4595,7 @@
drgen = self._generator(api.ListGenerator, type_arg="deletedrevs",
titles=page.title(withSection=False),
drprop="revid|user|comment|minor",
- step=step, total=total)
+ total=total)
if get_text:
drgen.request['drprop'] = (drgen.request['drprop'] +
['content', 'token'])
@@ -4628,7 +4640,8 @@
"""
return self.randompages(total=1, redirects=True)
- def randompages(self, step=None, total=10, namespaces=None,
+ @deprecated_args(step=None)
+ def randompages(self, total=10, namespaces=None,
redirects=False, content=False):
"""Iterate a number of random pages.
@@ -4649,7 +4662,7 @@
type such as NoneType or bool
"""
rngen = self._generator(api.PageGenerator, type_arg="random",
- namespaces=namespaces, step=step, total=total,
+ namespaces=namespaces, total=total,
g_content=content, grnredirect=redirects)
return rngen
@@ -5916,10 +5929,11 @@
rcshow=None,
rc_show=None,
get_redirect=None)
+ @deprecated_args(step=None)
def newpages(self, user=None, returndict=False,
start=None, end=None, reverse=False, showBot=False,
showRedirects=False, excludeuser=None,
- showPatrolled=None, namespaces=None, step=None, total=None):
+ showPatrolled=None, namespaces=None, total=None):
"""Yield new articles (as Page objects) from recent changes.
Starts with the newest article and fetches the number of articles
@@ -5950,7 +5964,7 @@
namespaces=namespaces, changetype="new", user=user,
excludeuser=excludeuser, showBot=showBot,
showRedirects=showRedirects, showPatrolled=showPatrolled,
- step=step, total=total
+ total=total
)
for pageitem in gen:
newpage = pywikibot.Page(self, pageitem['title'])
@@ -5961,9 +5975,9 @@
u'', pageitem['user'], pageitem['comment'])
@deprecated_args(lestart='start', leend='end', leuser='user', letitle=None,
- repeat=None, number='total')
+ repeat=None, number='total', step=None)
def newfiles(self, user=None, start=None, end=None, reverse=False,
- step=None, total=None):
+ total=None):
"""Yield information about newly uploaded files.
Yields a tuple of FilePage, Timestamp, user(unicode), comment(unicode).
@@ -5975,7 +5989,7 @@
# TODO: update docstring
for event in self.logevents(logtype="upload", user=user,
start=start, end=end, reverse=reverse,
- step=step, total=total):
+ total=total):
filepage = event.page()
date = event.timestamp()
user = event.user()
@@ -5992,264 +6006,250 @@
"""
return self.newfiles(*args, **kwargs)
- @deprecated_args(number='total', repeat=None)
- def longpages(self, step=None, total=None):
+ @deprecated_args(number='total', step=None, repeat=None)
+ def longpages(self, total=None):
"""Yield Pages and lengths from Special:Longpages.
Yields a tuple of Page object, length(int).
- @param step: request batch size
@param total: number of pages to return
"""
lpgen = self._generator(api.ListGenerator,
type_arg="querypage", qppage="Longpages",
- step=step, total=total)
+ total=total)
for pageitem in lpgen:
yield (pywikibot.Page(self, pageitem['title']),
int(pageitem['value']))
- @deprecated_args(number="total", repeat=None)
- def shortpages(self, step=None, total=None):
+ @deprecated_args(number='total', step=None, repeat=None)
+ def shortpages(self, total=None):
"""Yield Pages and lengths from Special:Shortpages.
Yields a tuple of Page object, length(int).
- @param step: request batch size
@param total: number of pages to return
"""
spgen = self._generator(api.ListGenerator,
type_arg="querypage", qppage="Shortpages",
- step=step, total=total)
+ total=total)
for pageitem in spgen:
yield (pywikibot.Page(self, pageitem['title']),
int(pageitem['value']))
- @deprecated_args(number='total', repeat=None)
- def deadendpages(self, step=None, total=None):
+ @deprecated_args(number='total', step=None, repeat=None)
+ def deadendpages(self, total=None):
"""Yield Page objects retrieved from Special:Deadendpages.
- @param step: request batch size
@param total: number of pages to return
"""
degen = self._generator(api.PageGenerator,
type_arg="querypage", gqppage="Deadendpages",
- step=step, total=total)
+ total=total)
return degen
- @deprecated_args(number='total', repeat=None)
- def ancientpages(self, step=None, total=None):
+ @deprecated_args(number='total', step=None, repeat=None)
+ def ancientpages(self, total=None):
"""Yield Pages, datestamps from Special:Ancientpages.
- @param step: request batch size
@param total: number of pages to return
"""
apgen = self._generator(api.ListGenerator,
type_arg="querypage", qppage="Ancientpages",
- step=step, total=total)
+ total=total)
for pageitem in apgen:
yield (pywikibot.Page(self, pageitem['title']),
pywikibot.Timestamp.fromISOformat(pageitem['timestamp']))
- @deprecated_args(number='total', repeat=None)
- def lonelypages(self, step=None, total=None):
+ @deprecated_args(number='total', step=None, repeat=None)
+ def lonelypages(self, total=None):
"""Yield Pages retrieved from Special:Lonelypages.
- @param step: request batch size
@param total: number of pages to return
"""
lpgen = self._generator(api.PageGenerator,
type_arg="querypage", gqppage="Lonelypages",
- step=step, total=total)
+ total=total)
return lpgen
- @deprecated_args(number='total', repeat=None)
- def unwatchedpages(self, step=None, total=None):
+ @deprecated_args(number='total', step=None, repeat=None)
+ def unwatchedpages(self, total=None):
"""Yield Pages from Special:Unwatchedpages (requires Admin privileges).
- @param step: request batch size
@param total: number of pages to return
"""
uwgen = self._generator(api.PageGenerator,
type_arg="querypage", gqppage="Unwatchedpages",
- step=step, total=total)
+ total=total)
return uwgen
- def wantedpages(self, step=None, total=None):
+ @deprecated_args(step=None)
+ def wantedpages(self, total=None):
"""Yield Pages from Special:Wantedpages.
- @param step: request batch size
@param total: number of pages to return
"""
wpgen = self._generator(api.PageGenerator,
type_arg="querypage", gqppage="Wantedpages",
- step=step, total=total)
+ total=total)
return wpgen
- @deprecated_args(number='total', repeat=None)
- def wantedcategories(self, step=None, total=None):
+ @deprecated_args(number='total', step=None, repeat=None)
+ def wantedcategories(self, total=None):
"""Yield Pages from Special:Wantedcategories.
- @param step: request batch size
@param total: number of pages to return
"""
wcgen = self._generator(api.PageGenerator,
type_arg="querypage", gqppage="Wantedcategories",
- step=step, total=total)
+ total=total)
return wcgen
- @deprecated_args(number='total', repeat=None)
- def uncategorizedcategories(self, step=None, total=None):
+ @deprecated_args(number='total', step=None, repeat=None)
+ def uncategorizedcategories(self, total=None):
"""Yield Categories from Special:Uncategorizedcategories.
- @param step: request batch size
@param total: number of pages to return
"""
ucgen = self._generator(api.PageGenerator,
type_arg="querypage",
gqppage="Uncategorizedcategories",
- step=step, total=total)
+ total=total)
return ucgen
- @deprecated_args(number='total', repeat=None)
- def uncategorizedimages(self, step=None, total=None):
+ @deprecated_args(number='total', step=None, repeat=None)
+ def uncategorizedimages(self, total=None):
"""Yield FilePages from Special:Uncategorizedimages.
- @param step: request batch size
@param total: number of pages to return
"""
uigen = self._generator(api.PageGenerator,
type_arg="querypage",
gqppage="Uncategorizedimages",
- step=step, total=total)
+ total=total)
return uigen
# synonym
uncategorizedfiles = uncategorizedimages
- @deprecated_args(number='total', repeat=None)
- def uncategorizedpages(self, step=None, total=None):
+ @deprecated_args(number='total', step=None, repeat=None)
+ def uncategorizedpages(self, total=None):
"""Yield Pages from Special:Uncategorizedpages.
- @param step: request batch size
@param total: number of pages to return
"""
upgen = self._generator(api.PageGenerator,
type_arg="querypage",
gqppage="Uncategorizedpages",
- step=step, total=total)
+ total=total)
return upgen
- @deprecated_args(number='total', repeat=None)
- def uncategorizedtemplates(self, step=None, total=None):
+ @deprecated_args(number='total', step=None, repeat=None)
+ def uncategorizedtemplates(self, total=None):
"""Yield Pages from Special:Uncategorizedtemplates.
- @param step: request batch size
@param total: number of pages to return
"""
utgen = self._generator(api.PageGenerator,
type_arg="querypage",
gqppage="Uncategorizedtemplates",
- step=step, total=total)
+ total=total)
return utgen
- @deprecated_args(number='total', repeat=None)
- def unusedcategories(self, step=None, total=None):
+ @deprecated_args(number='total', step=None, repeat=None)
+ def unusedcategories(self, total=None):
"""Yield Category objects from Special:Unusedcategories.
- @param step: request batch size
@param total: number of pages to return
"""
ucgen = self._generator(api.PageGenerator,
type_arg="querypage",
gqppage="Unusedcategories",
- step=step, total=total)
+ total=total)
return ucgen
- @deprecated_args(extension=None, number='total', repeat=None)
- def unusedfiles(self, step=None, total=None):
+ @deprecated_args(extension=None, number='total', step=None, repeat=None)
+ def unusedfiles(self, total=None):
"""Yield FilePage objects from Special:Unusedimages.
- @param step: request batch size
@param total: number of pages to return
"""
uigen = self._generator(api.PageGenerator,
type_arg="querypage",
gqppage="Unusedimages",
- step=step, total=total)
+ total=total)
return uigen
@deprecated("Site().unusedfiles()")
- @deprecated_args(extension=None, number='total', repeat=None)
- def unusedimages(self, step=None, total=None):
+ @deprecated_args(extension=None, number='total', step=None, repeat=None)
+ def unusedimages(self, total=None):
"""Yield FilePage objects from Special:Unusedimages.
DEPRECATED: Use L{APISite.unusedfiles} instead.
"""
- return self.unusedfiles(step, total)
+ return self.unusedfiles(total)
- @deprecated_args(number='total', repeat=None)
- def withoutinterwiki(self, step=None, total=None):
+ @deprecated_args(number='total', step=None, repeat=None)
+ def withoutinterwiki(self, total=None):
"""Yield Pages without language links from Special:Withoutinterwiki.
- @param step: request batch size
@param total: number of pages to return
"""
wigen = self._generator(api.PageGenerator,
type_arg="querypage",
gqppage="Withoutinterwiki",
- step=step, total=total)
+ total=total)
return wigen
@need_version("1.18")
- def broken_redirects(self, step=None, total=None):
+ @deprecated_args(step=None)
+ def broken_redirects(self, total=None):
"""Yield Pages without language links from Special:BrokenRedirects.
- @param step: request batch size
@param total: number of pages to return
"""
brgen = self._generator(api.PageGenerator,
type_arg="querypage",
gqppage="BrokenRedirects",
- step=step, total=total)
+ total=total)
return brgen
@need_version("1.18")
- def double_redirects(self, step=None, total=None):
+ @deprecated_args(step=None)
+ def double_redirects(self, total=None):
"""Yield Pages without language links from Special:BrokenRedirects.
- @param step: request batch size
@param total: number of pages to return
"""
drgen = self._generator(api.PageGenerator,
type_arg="querypage",
gqppage="DoubleRedirects",
- step=step, total=total)
+ total=total)
return drgen
@need_version("1.18")
- def redirectpages(self, step=None, total=None):
+ @deprecated_args(step=None)
+ def redirectpages(self, total=None):
"""Yield redirect pages from Special:ListRedirects.
- @param step: request batch size
@param total: number of pages to return
"""
lrgen = self._generator(api.PageGenerator,
type_arg="querypage",
gqppage="Listredirects",
- step=step, total=total)
+ total=total)
return lrgen
- def unconnected_pages(self, step=None, total=None):
+ @deprecated_args(step=None)
+ def unconnected_pages(self, total=None):
"""Yield Page objects from Special:UnconnectedPages.
- @param step: request batch size
@param total: number of pages to return
"""
upgen = self._generator(api.PageGenerator,
type_arg='querypage',
gqppage='UnconnectedPages',
- step=step, total=total)
+ total=total)
return upgen
@deprecated_args(lvl='level')
@@ -6666,7 +6666,8 @@
"""
return self.moderate_post(post, 'restore', reason)
- def watched_pages(self, sysop=False, force=False, step=None, total=None):
+ @deprecated_args(step=None)
+ def watched_pages(self, sysop=False, force=False, total=None):
"""
Return watchlist.
@@ -6682,8 +6683,7 @@
total = pywikibot.config.special_page_limit
expiry = None if force else pywikibot.config.API_config_expiry
gen = api.PageGenerator(site=self, generator='watchlistraw',
- expiry=expiry,
- step=step, gwrlimit=total)
+ expiry=expiry, gwrlimit=total)
return gen
# aliases for backwards compatibility
diff --git a/scripts/blockpageschecker.py b/scripts/blockpageschecker.py
index 3cb004b..807707e 100755
--- a/scripts/blockpageschecker.py
+++ b/scripts/blockpageschecker.py
@@ -277,7 +277,7 @@
generator.append(pageCat)
pywikibot.output(u'Categories loaded, start!')
# Main Loop
- preloadingGen = pagegenerators.PreloadingGenerator(generator, step=60)
+ preloadingGen = pagegenerators.PreloadingGenerator(generator, groupsize=60)
for page in preloadingGen:
pagename = page.title(asLink=True)
pywikibot.output('Loading %s...' % pagename)
diff --git a/scripts/category_redirect.py b/scripts/category_redirect.py
index 51e3159..7c96f67 100755
--- a/scripts/category_redirect.py
+++ b/scripts/category_redirect.py
@@ -23,7 +23,7 @@
"""
#
-# (C) Pywikibot team, 2008-2014
+# (C) Pywikibot team, 2008-2016
#
# Distributed under the terms of the MIT license.
#
@@ -225,7 +225,8 @@
# generator yields all hard redirect pages in namespace 14
for page in pagegenerators.PreloadingGenerator(
- self.site.allpages(namespace=14, filterredir=True), step=250):
+ self.site.allpages(namespace=14, filterredir=True),
+ groupsize=250):
if page.isCategoryRedirect():
# this is already a soft-redirect, so skip it (for now)
continue
diff --git a/scripts/patrol.py b/scripts/patrol.py
index 9c57bf2..f5bc371 100755
--- a/scripts/patrol.py
+++ b/scripts/patrol.py
@@ -39,7 +39,7 @@
"""
#
-# (C) Pywikibot team, 2011-2015
+# (C) Pywikibot team, 2011-2016
#
# Distributed under the terms of the MIT license.
#
@@ -444,15 +444,15 @@
return p
-def api_feed_repeater(gen, delay=0, repeat=False, number=1000, namespaces=None,
+def api_feed_repeater(gen, delay=0, repeat=False, namespaces=None,
user=None, recent_new_gen=True):
"""Generator which loads pages details to be processed."""
while True:
if recent_new_gen:
- generator = gen(step=number, namespaces=namespaces, user=user,
+ generator = gen(namespaces=namespaces, user=user,
showPatrolled=False)
else:
- generator = gen(step=number, namespaces=namespaces, user=user,
+ generator = gen(namespaces=namespaces, user=user,
returndict=True, showPatrolled=False)
for page in generator:
if recent_new_gen:
@@ -511,11 +511,9 @@
if usercontribs:
pywikibot.output(u'Processing user: %s' % usercontribs)
- newpage_count = 300
if not newpages and not recentchanges and not usercontribs:
if site.family.name == 'wikipedia':
newpages = True
- newpage_count = 5000
else:
recentchanges = True
@@ -525,7 +523,7 @@
pywikibot.output(u'Newpages:')
gen = site.newpages
feed = api_feed_repeater(gen, delay=60, repeat=repeat,
- number=newpage_count, user=usercontribs,
+ user=usercontribs,
namespaces=genFactory.namespaces,
recent_new_gen=False)
bot.run(feed)
@@ -533,7 +531,7 @@
if recentchanges or usercontribs:
pywikibot.output(u'Recentchanges:')
gen = site.recentchanges
- feed = api_feed_repeater(gen, delay=60, repeat=repeat, number=1000,
+ feed = api_feed_repeater(gen, delay=60, repeat=repeat,
namespaces=genFactory.namespaces,
user=usercontribs)
bot.run(feed)
diff --git a/scripts/redirect.py b/scripts/redirect.py
index 6691a49..a15546e 100755
--- a/scripts/redirect.py
+++ b/scripts/redirect.py
@@ -56,8 +56,6 @@
-until:title The possible last page title in each namespace. Page needs not
exist.
--step:n The number of entries retrieved at once via API
-
-total:n The maximum count of redirects to work upon. If omitted, there
is no limit.
@@ -87,7 +85,9 @@
import pywikibot
from pywikibot import i18n, xmlreader, Bot
+from pywikibot.exceptions import ArgumentDeprecationWarning
from pywikibot.tools.formatter import color_format
+from pywikibot.tools import issue_deprecation_warning
if sys.version_info[0] > 2:
basestring = (str, )
@@ -105,7 +105,7 @@
def __init__(self, xmlFilename=None, namespaces=[], offset=-1,
use_move_log=False, use_api=False, start=None, until=None,
- number=None, step=None, page_title=None):
+ number=None, page_title=None):
"""Constructor."""
self.site = pywikibot.Site()
self.xmlFilename = xmlFilename
@@ -118,7 +118,6 @@
self.api_start = start
self.api_until = until
self.api_number = number
- self.api_step = step
self.page_title = page_title
def get_redirects_from_dump(self, alsoGetPageTitles=False):
@@ -192,8 +191,6 @@
filterredir=True)
if self.api_number:
gen.set_maximum_items(self.api_number)
- if self.api_step:
- gen.set_query_increment(self.api_step)
for p in gen:
done = (self.api_until and
p.title(withNamespace=False) >= self.api_until)
@@ -749,7 +746,6 @@
start = ''
until = ''
number = None
- step = None
pagename = None
for arg in pywikibot.handle_args(args):
@@ -794,7 +790,8 @@
elif arg.startswith('-total:'):
number = int(arg[7:])
elif arg.startswith('-step:'):
- step = int(arg[6:])
+ issue_deprecation_warning(
+ 'The usage of "{0}"'.format(arg), 2, ArgumentDeprecationWarning)
elif arg.startswith('-page:'):
pagename = arg[6:]
elif arg == '-always':
@@ -819,7 +816,7 @@
else:
pywikibot.Site().login()
gen = RedirectGenerator(xmlFilename, namespaces, offset, moved_pages,
- fullscan, start, until, number, step, pagename)
+ fullscan, start, until, number, pagename)
bot = RedirectRobot(action, gen, number=number, **options)
bot.run()
diff --git a/scripts/reflinks.py b/scripts/reflinks.py
index 4fe758a..5e9b913 100755
--- a/scripts/reflinks.py
+++ b/scripts/reflinks.py
@@ -795,7 +795,7 @@
pywikibot.bot.suggest_help(missing_generator=True)
return False
- generator = pagegenerators.PreloadingGenerator(generator, step=50)
+ generator = pagegenerators.PreloadingGenerator(generator)
generator = pagegenerators.RedirectFilterPageGenerator(generator)
bot = ReferencesRobot(generator, **options)
bot.run()
diff --git a/scripts/weblinkchecker.py b/scripts/weblinkchecker.py
index 62e2b09..9f5abb5 100755
--- a/scripts/weblinkchecker.py
+++ b/scripts/weblinkchecker.py
@@ -95,7 +95,7 @@
"""
#
# (C) Daniel Herding, 2005
-# (C) Pywikibot team, 2005-2015
+# (C) Pywikibot team, 2005-2016
#
# Distributed under the terms of the MIT license.
#
@@ -962,7 +962,7 @@
# fetch at least 240 pages simultaneously from the wiki, but more if
# a high thread number is set.
pageNumber = max(240, config.max_external_links * 2)
- gen = pagegenerators.PreloadingGenerator(gen, step=pageNumber)
+ gen = pagegenerators.PreloadingGenerator(gen, groupsize=pageNumber)
gen = pagegenerators.RedirectFilterPageGenerator(gen)
bot = WeblinkCheckerRobot(gen, HTTPignore, config.weblink_dead_days)
try:
diff --git a/tests/isbn_tests.py b/tests/isbn_tests.py
index c437a03..7122aaf 100644
--- a/tests/isbn_tests.py
+++ b/tests/isbn_tests.py
@@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
"""Tests for isbn script."""
#
-# (C) Pywikibot team, 2014-2015
+# (C) Pywikibot team, 2014-2016
#
# Distributed under the terms of the MIT license.
#
@@ -202,7 +202,7 @@
# Check if the unit test item page and the property both exist
item_ns = cls.get_repo().item_namespace
- for page in cls.get_site().search('IsbnWikibaseBotUnitTest', step=1,
+ for page in cls.get_site().search('IsbnWikibaseBotUnitTest',
total=1, namespaces=item_ns):
cls.test_page_qid = page.title()
item_page = ItemPage(cls.get_repo(), page.title())
diff --git a/tests/pagegenerators_tests.py b/tests/pagegenerators_tests.py
index 9d8397c..91519c2 100755
--- a/tests/pagegenerators_tests.py
+++ b/tests/pagegenerators_tests.py
@@ -2,7 +2,7 @@
# -*- coding: utf-8 -*-
"""Test pagegenerators module."""
#
-# (C) Pywikibot team, 2009-2015
+# (C) Pywikibot team, 2009-2016
#
# Distributed under the terms of the MIT license.
from __future__ import absolute_import, unicode_literals
@@ -482,7 +482,7 @@
mainpage = self.get_mainpage()
links = list(self.site.pagelinks(mainpage, total=10))
count = 0
- for page in PreloadingGenerator(links, step=20):
+ for page in PreloadingGenerator(links, groupsize=20):
self.assertIsInstance(page, pywikibot.Page)
self.assertIsInstance(page.exists(), bool)
if page.exists():
@@ -497,7 +497,7 @@
mainpage = self.get_mainpage()
links = list(self.site.pagelinks(mainpage, total=20))
count = 0
- for page in PreloadingGenerator(links, step=10):
+ for page in PreloadingGenerator(links, groupsize=10):
self.assertIsInstance(page, pywikibot.Page)
self.assertIsInstance(page.exists(), bool)
if page.exists():
@@ -681,7 +681,6 @@
gf = pagegenerators.GeneratorFactory()
self.assertTrue(gf.handleArg('-start:!'))
gf.handleArg('-limit:10')
- gf.handleArg('-step:5')
gen = gf.getCombinedGenerator()
self.assertIsNotNone(gen)
pages = set(gen)
@@ -759,7 +758,6 @@
gf = pagegenerators.GeneratorFactory()
self.assertTrue(gf.handleArg('-prefixindex:a'))
gf.handleArg('-limit:10')
- gf.handleArg('-step:5')
gen = gf.getCombinedGenerator()
self.assertIsNotNone(gen)
pages = set(gen)
diff --git a/tests/site_tests.py b/tests/site_tests.py
index 327509f..a5c1dea 100644
--- a/tests/site_tests.py
+++ b/tests/site_tests.py
@@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
"""Tests for the site module."""
#
-# (C) Pywikibot team, 2008-2015
+# (C) Pywikibot team, 2008-2016
#
# Distributed under the terms of the MIT license.
#
@@ -1869,7 +1869,9 @@
"""
mysite = self.get_site()
pages = []
- for rndpage in mysite.randompages(step=5, total=None):
+ rngen = mysite.randompages(total=None)
+ rngen.set_query_increment = 5
+ for rndpage in rngen:
self.assertIsInstance(rndpage, pywikibot.Page)
pages.append(rndpage)
if len(pages) == 11:
@@ -2071,16 +2073,24 @@
mypage = pywikibot.Page(mysite, 'Albert Einstein')
mycat = pywikibot.Page(mysite, 'Category:1879 births')
- cats = [c for c in mysite.pagecategories(mypage, step=5, total=12)]
+ gen = mysite.pagecategories(mypage, total=12)
+ gen.set_query_increment = 5
+ cats = [c for c in gen]
self.assertEqual(len(cats), 12)
- cat_members = [cm for cm in mysite.categorymembers(mycat, step=5, total=12)]
+ gen = mysite.categorymembers(mycat, total=12)
+ gen.set_query_increment = 5
+ cat_members = [cm for cm in gen]
self.assertEqual(len(cat_members), 12)
- images = [im for im in mysite.pageimages(mypage, step=3, total=5)]
+ gen = mysite.pageimages(mypage, total=5)
+ gen.set_query_increment = 3
+ images = [im for im in gen]
self.assertEqual(len(images), 5)
- templates = [tl for tl in mysite.pagetemplates(mypage, step=3, total=5)]
+ gen = mysite.pagetemplates(mypage, total=5)
+ gen.set_query_increment = 3
+ templates = [tl for tl in gen]
self.assertEqual(len(templates), 5)
mysite.loadrevisions(mypage, step=5, total=12)
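
The change above replaces the per-request `step` argument with a `groupsize` parameter on `PreloadingGenerator` and a `set_query_increment()` call on API generators. As a rough illustration of the batching idea only (a hypothetical stand-alone sketch, not pywikibot's actual implementation):

```python
from itertools import islice

def preloading_generator(pages, groupsize=50):
    """Yield pages in batches of at most ``groupsize`` items.

    Hypothetical sketch of batched preloading: a real implementation
    would fetch each batch's content in a single API request before
    yielding the pages one by one.
    """
    iterator = iter(pages)
    while True:
        batch = list(islice(iterator, groupsize))
        if not batch:
            return
        # ... fetch content for `batch` in one request here ...
        for page in batch:
            yield page

titles = ['Page%d' % i for i in range(7)]
print(list(preloading_generator(titles, groupsize=3)))
```

The caller controls the batch size, while the total number of items yielded is unchanged, which is why the scripts above could simply drop `step` or rename it to `groupsize`.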
--
To view, visit https://gerrit.wikimedia.org/r/266476
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: Iac0afb6479f2c73e08c39412903d1a8d6d5092b9
Gerrit-PatchSet: 7
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: Russell Blau <russblau(a)imapmail.org>
Gerrit-Reviewer: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: Change flickrripper: remove default values for parameters wherever necessary
......................................................................
Change flickrripper: remove default values for parameters wherever necessary
Remove the default values from required parameters so that calling a
function without them fails immediately rather than causing errors
inside the function.
Bug: T93098
Change-Id: I9519d7d1dc24640920a50dffe6d326b153459721
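The failure mode this change avoids can be sketched as follows (a hypothetical stand-alone example; `photos_getInfo` here is an invented stand-in for the Flickr API call, not the real signature):

```python
def get_photo_bad(flickr=None, photo_id=''):
    # With a None default, a forgotten argument only surfaces later as
    # an AttributeError deep inside the function body.
    return flickr.photos_getInfo(photo_id=photo_id)

def get_photo_good(flickr, photo_id):
    # A required parameter fails fast with a TypeError at the call site.
    return flickr.photos_getInfo(photo_id=photo_id)

try:
    get_photo_bad()  # no error until the None object is touched
except AttributeError as exc:
    print(type(exc).__name__)  # AttributeError

try:
    get_photo_good()  # fails immediately: missing required arguments
except TypeError as exc:
    print(type(exc).__name__)  # TypeError
```

Failing at the call site makes the mistake (a missing `flickr` handle or photo id) obvious, which is the rationale for dropping the defaults in the patch below.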
---
M scripts/flickrripper.py
1 file changed, 9 insertions(+), 9 deletions(-)
Approvals:
John Vandenberg: Looks good to me, approved
Xqt: Looks good to me, but someone else must approve
jenkins-bot: Verified
diff --git a/scripts/flickrripper.py b/scripts/flickrripper.py
index cbbe7d7..7fa9bce 100755
--- a/scripts/flickrripper.py
+++ b/scripts/flickrripper.py
@@ -80,7 +80,7 @@
}
-def getPhoto(flickr=None, photo_id=''):
+def getPhoto(flickr, photo_id):
"""
Get the photo info and the photo sizes so we can use these later on.
@@ -99,7 +99,7 @@
time.sleep(30)
-def isAllowedLicense(photoInfo=None):
+def isAllowedLicense(photoInfo):
"""
Check if the image contains the right license.
@@ -112,7 +112,7 @@
return False
-def getPhotoUrl(photoSizes=None):
+def getPhotoUrl(photoSizes):
"""Get the url of the jpg file with the highest resolution."""
url = ''
# The assumption is that the largest image is last
@@ -121,7 +121,7 @@
return url
-def downloadPhoto(photoUrl=''):
+def downloadPhoto(photoUrl):
"""
Download the photo and store it in a io.BytesIO object.
@@ -153,7 +153,7 @@
return site.getFilesFromAnHash(base64.b16encode(hashObject.digest()))
-def getTags(photoInfo=None):
+def getTags(photoInfo):
"""Get all the tags on a photo."""
result = []
for tag in photoInfo.find('photo').find('tags').findall('tag'):
@@ -162,7 +162,7 @@
return result
-def getFlinfoDescription(photo_id=0):
+def getFlinfoDescription(photo_id):
"""
Get the description from http://wikipedia.ramselehof.de/flinfo.php.
@@ -174,7 +174,7 @@
'http://wikipedia.ramselehof.de/flinfo.php?%s' % parameters).content
-def getFilename(photoInfo=None, site=None, project=u'Flickr'):
+def getFilename(photoInfo, site=None, project=u'Flickr'):
"""Build a good filename for the upload based on the username and title.
Prevents naming collisions.
@@ -279,7 +279,7 @@
return description
-def processPhoto(flickr=None, photo_id=u'', flickrreview=False, reviewer=u'',
+def processPhoto(flickr, photo_id=u'', flickrreview=False, reviewer=u'',
override=u'', addCategory=u'', removeCategories=False,
autonomous=False):
"""Process a single Flickr photo."""
@@ -342,7 +342,7 @@
return 0
-def getPhotos(flickr=None, user_id=u'', group_id=u'', photoset_id=u'',
+def getPhotos(flickr, user_id=u'', group_id=u'', photoset_id=u'',
start_id='', end_id='', tags=u''):
"""Loop over a set of Flickr photos."""
found_start_id = not start_id
--
To view, visit https://gerrit.wikimedia.org/r/273842
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I9519d7d1dc24640920a50dffe6d326b153459721
Gerrit-PatchSet: 5
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: Darthbhyrava <hbhyrava(a)gmail.com>
Gerrit-Reviewer: Dalba <dalba.wiki(a)gmail.com>
Gerrit-Reviewer: Darthbhyrava <hbhyrava(a)gmail.com>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Multichill <maarten(a)mdammers.nl>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: hroest <hannesroest(a)gmx.ch>
Gerrit-Reviewer: jenkins-bot <>
Xqt has submitted this change and it was merged.
Change subject: [bugfix] use string.splitlines() instead of string.split('\n')
......................................................................
[bugfix] use string.splitlines() instead of string.split('\n')
split('\n') splits on line feeds only; in some circumstances a trailing
CR (\r) remained in each line, so the comparison always failed in that
case. Using splitlines() solves this problem.
Change-Id: I30fd90d6c040dd0d25d7f328db0eee3ce45577d7
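A minimal demonstration of the difference (plain Python, independent of the test suite below):

```python
# With CRLF line endings, split('\n') leaves a trailing '\r' on each
# line, so string comparisons fail; splitlines() handles \n, \r\n and
# \r alike.
stderr = 'Sleeping for 5 seconds\r\nDone'
assert stderr.split('\n') == ['Sleeping for 5 seconds\r', 'Done']
assert stderr.splitlines() == ['Sleeping for 5 seconds', 'Done']

# The empty-string case also differs, hence the changed comparison in
# the patch: split('\n') yields [''] while splitlines() yields [].
assert ''.split('\n') == ['']
assert ''.splitlines() == []
```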
---
M tests/script_tests.py
1 file changed, 2 insertions(+), 2 deletions(-)
Approvals:
Mpaa: Looks good to me, approved
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/tests/script_tests.py b/tests/script_tests.py
index 5f9fb85..cc4348c 100644
--- a/tests/script_tests.py
+++ b/tests/script_tests.py
@@ -272,7 +272,7 @@
result = execute_pwb(cmd, data_in, timeout=timeout, error=error,
overrides=test_overrides)
- stderr = result['stderr'].split('\n')
+ stderr = result['stderr'].splitlines()
stderr_sleep = [l for l in stderr
if l.startswith('Sleeping for ')]
stderr_other = [l for l in stderr
@@ -288,7 +288,7 @@
exit_codes = [0, 1, 2, -9]
elif not is_autorun:
- if stderr_other == ['']:
+ if stderr_other == []:
stderr_other = None
if stderr_other is not None:
self.assertIn('Use -help for further information.',
--
To view, visit https://gerrit.wikimedia.org/r/273908
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I30fd90d6c040dd0d25d7f328db0eee3ce45577d7
Gerrit-PatchSet: 3
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot <>