Hi All, welcome to the monthly MediaWiki Insights email!
First of all, congratulations to Ederporto, SD hehua and Labster who got
their first patch in MW core or Wikimedia deployed extensions and services
merged during the month of April! Many thanks also to the reviewers for
your support and thoughtfulness! It's this collaborative effort that allows
MediaWiki to scale and do what it's built for.
10 months ago
<https://www.mediawiki.org/wiki/MediaWiki_Product_Insights/Reports/August_20…>
we started the journey of ramping up MediaWiki as a product and thinking
about ways to systemically tackle the challenges around the MediaWiki
software platform. The initial announcement
<https://phabricator.wikimedia.org/T336990> and first conversations already
happened before we officially kicked things off at the Wikimedia Hackathon
2023 in Athens. Many interviews, conversations, explorations, investments
and decisions later we’re coming back to the Wikimedia Hackathon
<https://www.mediawiki.org/wiki/Wikimedia_Hackathon_2024> this year to
connect the dots.
MediaWiki, the software platform and interfaces that allow Wikipedia and
other projects to function, needs ongoing support for the next decade in
order to provide creation, moderation, storage, discovery, and consumption
of open, multilingual content at scale.
What decisions and platform improvements can we make to ensure that
MediaWiki is sustainable? This has been a key guiding question over the
past months. Platform improvements have been the subject of many
initiatives mentioned in the monthly MediaWiki Insights emails
<https://www.mediawiki.org/wiki/MediaWiki_Product_Insights/Reports>, and
the focus on sustainability is also reflected in the draft for the
Wikimedia Foundation’s annual plan
<https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2024-2025/…>
for the upcoming fiscal year (July 2024 – June 2025).
We plan to present an overview on the state of things at the Hackathon, and
hope to use the “hallway track” to discuss questions, plans and ideas with
some of you! A related presentation has already been given at the recent
MediaWiki users and developers conference.
We will follow up with links to presentations, a summary and reflections in
the next MediaWiki insights email. Stay tuned!
MediaWiki track at the Wikimedia Hackathon in Tallinn
The Wikimedia Hackathon starts this week! Beyond wanting to talk about
MediaWiki, we’ve also put a MediaWiki track
<https://www.mediawiki.org/wiki/Wikimedia_Hackathon_2024/MediaWiki_Track>
together to help people get started (or continue) working on
MediaWiki (core), and to discuss and collaborate with each other (workboard
<https://phabricator.wikimedia.org/project/board/7117/>). Whether you want
to get started, join as a mentor, or talk about MediaWiki - if you’re
attending the Hackathon, please come find us at the MediaWiki track table!
Project snapshots: ObjectCacheFactory introduced to MediaWiki core, SUL3
and more work on REST API
Caching: We introduced ObjectCacheFactory to MediaWiki core (T358346
<https://phabricator.wikimedia.org/T358346>). ObjectCache is responsible
for creating and fetching various cache instances for MediaWiki, making use
of BagOStuff, which is the mechanism for caching in the software's
ecosystem. The introduction of the Cache Factory aims to reduce
inconsistencies in our codebase and improve stability. Many thanks to
Derick for leading on this work and to Timo, Piotr and Gergö for their
support!
Help with replacing the deprecated static factory methods is much
appreciated! Please see this tracking ticket for more information on which
repositories need updates: T363770
<https://phabricator.wikimedia.org/T363770>.
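To illustrate the pattern behind this change (in Python rather than MediaWiki's PHP, and with simplified, hypothetical names, not the actual ObjectCacheFactory API): a factory owns the construction of cache instances, so callers no longer invoke scattered static methods and every consumer gets consistent instances.

```python
# Conceptual sketch only - hypothetical names, Python instead of MediaWiki's
# PHP. A factory centralizes creation of cache backends so the same logical
# cache is returned consistently, instead of ad-hoc static constructor calls.

class InMemoryCache:
    """Stand-in for a BagOStuff-style cache backend."""
    def __init__(self):
        self._data = {}
    def set(self, key, value):
        self._data[key] = value
    def get(self, key, default=None):
        return self._data.get(key, default)

class ObjectCacheFactory:
    """Creates and memoizes named cache instances from configuration."""
    def __init__(self, config):
        self._config = config          # name -> backend class
        self._instances = {}
    def get_instance(self, name):
        if name not in self._instances:
            # In MediaWiki, configuration would select the backend
            # (e.g. Memcached, Redis, database); here we only have one.
            backend = self._config.get(name, InMemoryCache)
            self._instances[name] = backend()
        return self._instances[name]

factory = ObjectCacheFactory(config={"local-cluster": InMemoryCache})
cache = factory.get_instance("local-cluster")
cache.set("page:42", "rendered html")
# Every caller asking for the same name gets the same instance:
assert factory.get_instance("local-cluster") is cache
```

The point of the factory is that the wiring lives in one place, which is what reduces the inconsistencies the paragraph above mentions.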
MediaWiki REST API: We completed T359652
<https://phabricator.wikimedia.org/T359652>, which disallows optional path
parameters <https://gerrit.wikimedia.org/r/c/mediawiki/core/+/1016820> in
the MediaWiki REST API <https://www.mediawiki.org/wiki/API:REST_API> (aka
rest.php) at the PathMatcher
<https://gerrit.wikimedia.org/g/mediawiki/core/+/eab7628c59be589c8bba2b29dab…>
level. This is important for compatibility with API modules that we are
moving from RESTBase <https://www.mediawiki.org/wiki/RESTBase> into MW Core
<https://www.mediawiki.org/wiki/Core> as part of RESTBase sunset
<https://phabricator.wikimedia.org/T262315> (such as Reading Lists
<https://phabricator.wikimedia.org/T336693>), and avoids issues with our
generated OpenAPI/swagger specs
<https://meta.wikimedia.beta.wmflabs.org/w/rest.php/> (swagger does not
allow optional path parameters
<https://swagger.io/docs/specification/describing-parameters/#path-parameters>).
A big thanks to Bill for his work on this, and to all the people who
provided support!
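The reason optional path parameters are problematic can be sketched with a tiny, hypothetical route matcher (this is not MediaWiki's actual PathMatcher code): instead of one template with an optional trailing segment such as /page/{title}/{revision?}, two explicit templates are registered, which is also the shape the OpenAPI/Swagger format requires.

```python
# Hypothetical illustration of explicit routes replacing an optional path
# parameter. Each URL shape gets its own template and handler, so the route
# table maps cleanly onto an OpenAPI spec (which forbids optional segments).
import re

ROUTES = {
    r"^/page/(?P<title>[^/]+)$": "get_latest_revision",
    r"^/page/(?P<title>[^/]+)/(?P<revision>\d+)$": "get_specific_revision",
}

def match(path):
    """Return (handler_name, params) for the first matching route."""
    for pattern, handler in ROUTES.items():
        m = re.match(pattern, path)
        if m:
            return handler, m.groupdict()
    return None, {}

handler, params = match("/page/MediaWiki")
# -> ("get_latest_revision", {"title": "MediaWiki"})
handler2, params2 = match("/page/MediaWiki/12345")
# -> ("get_specific_revision", {"title": "MediaWiki", "revision": "12345"})
```

With explicit routes, each template corresponds to exactly one documented endpoint, so generated specs stay valid.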
Database: SelectQueryBuilder and Expression Builder: In an earlier email
<https://www.mediawiki.org/wiki/MediaWiki_Product_Insights/Reports/October_2…>,
we shared the work that the Data Persistence team has been doing on the
MediaWiki Rdbms library's interface to improve the consistency, security and
ease of getting database connections and performing common queries.
A big thanks to Amir for all his work on this to date! Since then, a lot of
progress has been made on updating MediaWiki repositories in Wikimedia
production to use the new SelectQuery and Expression Builders: Many thanks
to all the people who helped with this! A special thanks to Umherirrender
for migrating an impressive number of extensions to use SelectQuery
<https://phabricator.wikimedia.org/T311866> and Expression
<https://phabricator.wikimedia.org/T350075> builders over the past month,
and to DannyS712 for many reviews!
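For readers unfamiliar with the fluent-builder style, here is a minimal Python stand-in for the idea (MediaWiki's real SelectQueryBuilder is PHP and escapes values through the Rdbms layer rather than plain string formatting; the names below are simplified):

```python
# Minimal sketch of the fluent-builder idea behind SelectQueryBuilder.
# Each method records part of the query and returns self, so calls chain;
# the real MediaWiki builder also handles quoting, joins, options, etc.
class SelectQueryBuilder:
    def __init__(self):
        self._fields, self._table, self._conds = [], None, []
    def select(self, *fields):
        self._fields.extend(fields)
        return self
    def from_table(self, table):
        self._table = table
        return self
    def where(self, condition):
        self._conds.append(condition)
        return self
    def get_sql(self):
        sql = f"SELECT {', '.join(self._fields)} FROM {self._table}"
        if self._conds:
            sql += " WHERE " + " AND ".join(self._conds)
        return sql

sql = (SelectQueryBuilder()
       .select("page_id", "page_title")
       .from_table("page")
       .where("page_namespace = 0")
       .get_sql())
# -> "SELECT page_id, page_title FROM page WHERE page_namespace = 0"
```

The chained style keeps each clause of the query in one readable place, which is much of the consistency benefit the migration brings.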
Another highlight is the work done by Taavi over the past year: Allowing
multiple different 2FA devices (T242031
<https://phabricator.wikimedia.org/T242031>) is about to wrap up, which
should be a nice improvement to make 2FA easier to use! Many thanks to
Taavi and everyone involved for this work!
SUL3: Browsers are increasingly rolling out anti-tracking measures and
limitations on third-party cookie use. A side effect of this is that it
also impacts CentralAuth autologin. We aim to transition to a single
sign-on domain to minimize the number of times users need to enter their
credentials when changing wikis, among other benefits, and are about to
move from the research to the coding phase. The implementation plan (with
some question marks) is in T348388
<https://phabricator.wikimedia.org/T348388> and its subtasks - feedback is
very welcome!
The Language team just released MediaWiki Language Extension Bundle 2024.04
(announcement
<https://lists.wikimedia.org/hyperkitty/list/mediawiki-l@lists.wikimedia.org…>).
They are also looking into changing how and when we release MLEB. Please
see T356847 <https://phabricator.wikimedia.org/T356847> for more
information and to share feedback.
MediaWiki Release: The 1.42.0-rc.0 announcement will be out soon. MW
1.42-alpha has been branched since April 9th and added to the on-wiki
documentation as the development snapshot. If you have changes that need to
go to 1.42, they should be backported. New tasks with commits since April
16th have been targeted to the 1.43 unstable branch
<https://phabricator.wikimedia.org/project/profile/7083/>.
See some of you in Tallinn!
Thanks all for reading,
Birgit
--
Birgit Müller (she/her)
Director of Product, MediaWiki and Developer Experiences
Wikimedia Foundation <https://wikimediafoundation.org/>
Hey all,
This is a quick note to highlight that in five weeks' time, the REL1_42
branch will be created for MediaWiki core and each of the extensions and
skins in Wikimedia git, with some (the 'tarball') included as sub-modules
of MediaWiki itself[0]. This is the first step in the release process for
MediaWiki 1.42, which should be out in May 2024, approximately six months
after MediaWiki 1.41.
The branches will reflect the code as of the last 'alpha' branch for the
release, 1.42.0-wmf.26, which will be deployed to Wikimedia wikis in the
week beginning 8 April 2024 for MediaWiki itself and those extensions
and skins available there.
After that point, patches that land in the main development branch of
MediaWiki and its bundled extensions and skins will instead be slated
for the MediaWiki 1.43 release unless specifically backported[1].
If you are working on a new feature that you wish to land for the release,
you now have a few days to finish your work and land it in the development
branch; feature changes should not be backported except in an urgent case.
If your work might not be complete in time but should nevertheless block
the release for everyone else, please file a task against the
`mw-1.42-release` project on Phabricator.[2]
If you have tickets that are already tagged for `mw-1.42-release`, please
finish them, untag them, or reach out to get them resolved in the next few
weeks.
We hope to issue the first release candidate, 1.42.0-rc.0, two weeks after
the branch point, and if all goes well, to release MediaWiki 1.42.0 a few
weeks after that.
[0]: https://www.mediawiki.org/wiki/Bundled_extensions_and_skins
[1]: https://www.mediawiki.org/wiki/Backporting_fixes
[2]: https://phabricator.wikimedia.org/tag/mw-1.42-release/
Yours,
--
James D. Forrester (he/him or they/themself)
Wikimedia Foundation
Hi All, welcome to the monthly MediaWiki Insights email!
We’re starting this email again by celebrating volunteer contributors who
got their first patch merged in MW core, WMF deployed extensions or
services over the past month:
Many thanks and congrats to Nemoralis, RockingPenny4, Theprotonade, S8321414,
Philip and Aram!
Enable more people to know MediaWiki and contribute effectively
We’ve officially hit the baseline
<https://www.mediawiki.org/wiki/MediaWiki_Product_Insights/Contributor_reten…>
we set to measure growth in the number of people who have submitted patches
to MediaWiki core: During the last WMF fiscal year (July 2022 - June 2023),
70 people submitted more than 5 patches to MW core. This year, we reached
that number already in March, and we are currently observing a 14.5%
increase in the number of contributors who submitted more than 5 patches
between July 2023 and March 2024, compared to July 2022 - March 2023.
To achieve the goal of a 20% increase, we will continue with consultancy
and code review for teams and volunteers as part of our regular work. We
are also planning a dedicated MediaWiki focus area for the upcoming Wikimedia
Hackathon <https://www.mediawiki.org/wiki/Wikimedia_Hackathon_2024> to help
people contribute to MediaWiki. If you are attending the Hackathon and are
interested in joining the fun to help others onboard in MediaWiki, want to
get started with MediaWiki development or already have a project you’d like
to get help on, please monitor the Hackathon channels and workboard over
the coming weeks for more communication around this!
Project Snapshots: Sustainability and evolution of the platform require
the efforts of many: Deprecated wfGetDB(), migration to Prometheus and
support for page content in the core REST API
The theme of many initiatives that ensure sustainability and evolution of
the MediaWiki platform is that it requires the collaborative effort of
many. In some cases this may be work done primarily by specific teams, in
other cases this is a collective effort of staff across teams and
volunteers.
One great example of the collective effort of volunteers and staff is that
wfGetDB() - the old global function to get database connections - is now
hard-deprecated <https://phabricator.wikimedia.org/T273239>, thanks to the
many people who did the migration across many extensions!
With wfGetDB() out of the picture, we no longer rely on global state when
accessing database connections. This means that code that accesses another
wiki’s database can no longer accidentally mix information from two wikis.
In addition, the new IConnectionProvider interface’s getReplicaDatabase()
is hopefully more readable than wfGetDB(DB_REPLICA). This is part of our
wider work to modernize MediaWiki's platform so that we can inject services
for scale and testing.
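The shift away from global state can be sketched in a few lines of Python (the names below are hypothetical analogues, not MediaWiki's actual PHP interfaces): a connection provider is handed to the code that needs it, so the target wiki is always explicit.

```python
# Python analogy (hypothetical names) for replacing a global wfGetDB()-style
# function with an injected provider: the caller states which wiki it wants,
# and no ambient global decides where queries go.

class ConnectionProvider:
    """Hands out per-wiki connections; a stand-in for IConnectionProvider."""
    def __init__(self, connections):
        self._connections = connections  # wiki name -> connection object
    def get_replica_database(self, wiki):
        return self._connections[wiki]

class PageLookup:
    """Service code receives the provider instead of reaching for a global."""
    def __init__(self, provider, wiki):
        self._provider = provider
        self._wiki = wiki
    def database_for_queries(self):
        # Always scoped to the wiki this service was constructed for:
        return self._provider.get_replica_database(self._wiki)

provider = ConnectionProvider({"enwiki": "conn-enwiki", "dewiki": "conn-dewiki"})
lookup = PageLookup(provider, "dewiki")
assert lookup.database_for_queries() == "conn-dewiki"
```

Because the provider is injected, tests can pass in a fake provider, which is exactly the "inject services for scale and testing" benefit described above.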
Supporting page content in core REST API for fetching HTML
We already talked about RESTBase deprecation and Parser Unification
milestones in the last edition
<https://www.mediawiki.org/wiki/MediaWiki_Product_Insights/Reports/February_…>
of the MediaWiki Insights email. Oftentimes, work on one initiative
benefits other initiatives and vice versa. This is very true for the
following achievement, which is closely related to Parser Unification and
RESTBase deprecation:
The MediaWiki REST endpoints for fetching page HTML
<https://www.mediawiki.org/wiki/API:REST_API/Reference#Get_HTML> now
support all kinds of content. This provides an easy way for bots and
scripts to get the content of any wiki page just like it would be shown to
users when they view an article in the browser (*).
This may seem like a simple and obvious thing, but we did not have the
capability until recently: From the introduction of these endpoints in 2020
up until the completion of the work on T359426
<https://phabricator.wikimedia.org/T359426>, they were limited to wikitext
pages and would fail on pages containing JavaScript, Lua or Wikidata
items.
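As a small sketch of how a bot or script can use the documented endpoint (GET /w/rest.php/v1/page/{title}/html), here is a Python helper; the host and titles are examples, and the network call is left in a function you can invoke yourself:

```python
# Build (and optionally fetch) the REST URL for a page's rendered HTML,
# per the MediaWiki REST API reference. Titles are percent-encoded.
from urllib.parse import quote
from urllib.request import urlopen

def page_html_url(host, title):
    """Return the rest.php URL that serves a page's rendered HTML."""
    return f"https://{host}/w/rest.php/v1/page/{quote(title, safe='')}/html"

def fetch_page_html(host, title):
    """Perform the actual request (requires network access)."""
    with urlopen(page_html_url(host, title)) as resp:
        return resp.read().decode("utf-8")

url = page_html_url("en.wikipedia.org", "MediaWiki")
# -> "https://en.wikipedia.org/w/rest.php/v1/page/MediaWiki/html"
```

With the change described above, this same call now works for Lua modules, JavaScript pages and other content models, not just wikitext articles.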
A lot of changes were necessary “under the hood” to get to a point where
the REST API could provide rendered HTML for all kinds of content, while
using the Parsoid rendering of wiki pages. Much of the necessary work
overlaps with the efforts of sunsetting RESTbase
<https://www.mediawiki.org/wiki/RESTBase/deprecation> and implementing
support for Parsoid page views <https://phabricator.wikimedia.org/T55784>.
Some key aspects were:
- Integrating Parsoid with ContentHandler (T311648
<https://phabricator.wikimedia.org/T311648>)
- Populating the cache with Parsoid output when pages are edited (T320534
<https://phabricator.wikimedia.org/T320534>)
- Removing the caching layer for Parsoid output in RESTbase (T344945
<https://phabricator.wikimedia.org/T344945>)
- Implementing variant conversion for the API endpoint (T317019
<https://phabricator.wikimedia.org/T317019>)
Having support for all kinds of page content in the REST API while using
Parsoid for wikitext pages is a major milestone towards supporting Parsoid
page views. It demonstrates that Parsoid has been fully integrated with the
content rendering and caching infrastructure in MediaWiki. This is the
culmination of the efforts of several teams over multiple years. Many
thanks to everyone involved!
Note that, while the REST endpoints now support all kinds of content, they
are not quite yet ready for prime time: they still lack proper integration
with the edge caches <https://www.mediawiki.org/wiki/Manual:Varnish_caching>
that will ensure good performance when a lot of clients start using these
APIs. To address this issue and to improve version management for API
endpoints, we are considering changing the canonical URLs of the endpoints.
(*) We still use the old parser for page views though. See the last
MediaWiki Insights email
<https://www.mediawiki.org/wiki/MediaWiki_Product_Insights/Reports/February_…>
for where we’re at on the roadmap.
MediaWiki metrics to Prometheus migration: Status and call to action
The Observability team is currently working on migrating MediaWiki metrics
to Prometheus <https://prometheus.io/docs/introduction/overview/>,
utilizing StatsLib <https://www.mediawiki.org/wiki/Manual:Stats>, an
internally developed, Prometheus-capable metrics interface. We have been
using Prometheus in Wikimedia production for several years as it offers
several benefits over Graphite
<https://prometheus.io/docs/introduction/comparison/>. Migrating ensures we
stay ahead with a supported, scalable metrics platform for a more
effective, multidimensional metrics analysis and storage engine.
About 8% of the total metrics emitted to Graphite have now been migrated
over to Prometheus
<https://grafana.wikimedia.org/d/nCxX65cSk/mediawiki-statslib-migration?orgI…>,
and we are now ready to invite more people to help contribute to this
effort! Your expertise can help drive the success of this migration; please
support it by migrating your component's metrics to StatsLib (T350592)
<https://phabricator.wikimedia.org/T350592>:
- Look up your component, extension, or module and follow the
examples/docs in the task to migrate your metrics to the new metrics
interface.
- Help deprecate and clean up/remove outdated metrics that are not in use
(or graphed in dashboards).
- Collaborate in testing and provide feedback for a seamless transition.
A more detailed announcement and call to action will follow.
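To illustrate why labeled, Prometheus-style metrics are preferable to flat Graphite-style dotted names (this is a plain-Python illustration of the data model, not the actual StatsLib API):

```python
# One labeled metric replaces a combinatorial explosion of dotted names.
# With labels, aggregating across any dimension is a query, not name parsing.
from collections import defaultdict

# Graphite style: each combination is its own opaque metric name.
graphite = defaultdict(int)
graphite["MediaWiki.api.enwiki.200.requests"] += 1
graphite["MediaWiki.api.dewiki.200.requests"] += 1

# Prometheus style: one metric name plus label dimensions.
prometheus = defaultdict(int)
def inc(metric, **labels):
    prometheus[(metric, tuple(sorted(labels.items())))] += 1

inc("api_requests_total", wiki="enwiki", status="200")
inc("api_requests_total", wiki="dewiki", status="200")

# Summing all 200 responses across wikis is a simple filter:
total_200 = sum(v for (name, labels), v in prometheus.items()
                if name == "api_requests_total"
                and ("status", "200") in labels)
# -> 2
```

This multidimensional model is what makes the "more effective metrics analysis" mentioned above possible once a component's metrics are migrated.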
Many thanks to the Observability team for their leadership on this
initiative, Derrick, Timo and Larissa from the MediaWiki Platform team for
their consultancy and help in converting MediaWiki metrics so far; Kavitha,
Giuseppe, Janis, and Clement from Service Ops for infrastructure support;
and the Search Platform team for their recent involvement!
Annual Plan 2024/25: Key Result drafts published
The Wikimedia Foundation has recently published the draft “key results”
<https://en.wikipedia.org/wiki/Objectives_and_key_results> by the Product &
Technology department for the upcoming annual plan. While this is not yet
at the project/initiatives level (“hypotheses”), the draft KRs give
insight into the focus areas for next year. The most relevant objectives for
MediaWiki platform work and services for developers are WE5 (“Knowledge
Platform I”) and WE6 (“Knowledge Platform II”). Input and questions on
these drafts
<https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2024-2025/…>
are welcome!
Upcoming: MW 1.42 release
MW 1.42 <https://mediawiki.org/wiki/MediaWiki_1.42> release is coming! The
tentative target was set as May 2024
<https://phabricator.wikimedia.org/T359833>, and in the next few weeks it
will be time to polish and prepare for the release. Stay tuned for
updates and use Phabricator
<https://phabricator.wikimedia.org/project/view/6601/> to engage with us
and raise potential blockers if you haven't done so yet.
Thanks all for reading,
Birgit
--
Birgit Müller (she/her)
Director of Product, MediaWiki and Developer Experiences
Wikimedia Foundation <https://wikimediafoundation.org/>
On T190369 <https://phabricator.wikimedia.org/T190369> we're trying to
complete our collection of old MediaWiki release tarballs.
MediaWiki 1.20.7 was released on September 3, 2013. On December 8,
2013, it was reported that something happened to our download server,
and many files went missing.
Most of the missing files turn out to have been mirrored in one place
or another, but 1.20.7, which was only available for three months at
most, hasn't shown up yet.
Has anyone got a copy? The filename would have been
mediawiki-1.20.7.tar.gz
-- Tim Starling
Hi all,
On Thursday we will be issuing a security and maintenance release to all
supported branches of MediaWiki.
The new releases will be:
- 1.39.7
- 1.40.3
- 1.41.1
This will resolve two security issues in MediaWiki core, along with bug
fixes included for maintenance reasons. This includes various patches for
PHP 8.0, 8.1, 8.2 and 8.3 support.
This release may or may not be made with a CVE number formally attached,
due to the recent delays in receiving them from MITRE.
We will make the fixes available in the respective release branches and
master in git. Tarballs will be available for the above mentioned point
releases as well.
A summary of some of the security fixes that have gone into non-bundled
MediaWiki extensions will also follow later.
As a reminder, MediaWiki 1.35 became end of life (EOL) in December 2023.
It is strongly recommended to upgrade to either 1.39 (the next LTS after
1.35), which will be supported until November 2025, 1.40, which will be
supported until June 2024, or 1.41, which will be supported until December
2024.
[1] https://www.mediawiki.org/wiki/Version_lifecycle
Dear Wikimedians,
We hope this message finds you well and that you are in good spirits. We
are the Let’s Connect working group
<https://meta.wikimedia.org/wiki/Grants:Knowledge_Sharing/Connect/Team>- a
team of movement contributors/organizers who are liaison representatives of
7 of the 8 regions <https://meta.wikimedia.org/wiki/Wikimedia_regions>. We are
connecting with you to see if you are interested in and/or know about the
peer-to-peer program, Let’s Connect
<https://meta.wikimedia.org/wiki/Grants:Knowledge_Sharing/Connect>!
The program creates an open and safe learning space for any Wikimedian who is
part of an organized group to share and learn different skills
(organizational, interpersonal, grant-related, learning & evaluation, ...)
with other peers to add value and contribute collectively to the community.
The purpose is to further develop skills, share knowledge and promote human
connections and mutual support between different groups and communities, in
alignment with the Movement Strategy
<https://meta.wikimedia.org/wiki/Movement_Strategy>.
Every month, we host 2-3 live 2-hour learning clinics
<https://meta.wikimedia.org/wiki/Grants:Knowledge_Sharing/Connect#Live_learn…>
on interesting topics selected by our team and our interested sharers. Our
live learning sessions have up to 4 interpreters translating the clinic for
our participants. Our main languages are Spanish | Arabic | French |
Portuguese. If there is a specific language you would like to see in the
calls, we are happy to see how we can accommodate it.
Let’s Connect is directed at Wikimedians in all regions who are part of
organized groups (ranging from groups of individuals that are not formally
organized to user groups, chapters and mission-aligned organizations).
Please see our Meta page for more criteria
<https://meta.wikimedia.org/wiki/Grants:Knowledge_Sharing/Connect#Who_is_Let…>.
To participate as a sharer, you can sign up via this initial registration
form
<https://docs.google.com/forms/d/e/1FAIpQLSdiea87tSYmB2-1XHn_u8RLe7efMJifJBz…>,
where you can record your learning and sharing interests and state whether
you want to share your knowledge through Learning Clinics.
Below, you will find our team of 8 who are excited to meet with you if you
are interested. Please email our team at letsconnect@wikimedia.org if you
have any questions :)
We look forward to hearing from you.
Best,
The Let’s Connect Working Group
Hi All, welcome to the monthly MediaWiki Insights email!
Like last time
<https://www.mediawiki.org/wiki/MediaWiki_Product_Insights/Reports/January_2…>,
we’re starting this email by celebrating volunteer contributors who got
their first patch merged in MW core, WMF deployed extensions or services
over the past month:
A big thanks to GergesShamon and Agent Isai for their contributions!
Welcome :-)
Many thanks also to the reviewers - volunteers and staff - of patches by
new contributors. Your support is invaluable for helping people contribute
to MediaWiki effectively and make it a fun first experience!
The focus of this edition of the monthly MW Insights email lies on our
multi-year initiatives: Migration of MediaWiki from bare metal to Kubernetes
<https://wikitech.wikimedia.org/wiki/MediaWiki_On_Kubernetes> in Wikimedia
Production, Parser Unification
<https://www.mediawiki.org/wiki/Parsoid/Parser_Unification> and RESTBase
deprecation <https://phabricator.wikimedia.org/project/profile/6289/>.
These initiatives involve many people and projects, require orchestrated
efforts and coordination, and will help us in many different ways. Another
thing that these initiatives have in common is that we’re getting closer to
being able to benefit from these efforts :-).
Project Snapshots: Milestones reached on 3 multi-year projects
MediaWiki on Kubernetes migration is 75% complete: all internal traffic and
all scheduled jobs have been migrated, and the platform has reached the 50%
global traffic milestone!
Often shortened as mw-on-k8s, MediaWiki on Kubernetes is a multi-year
effort to move all of the MediaWiki deployments running on WMF production
infrastructure to the new WikiKube platform.
The migration is underway and very recently a new milestone was hit where
the WikiKube platform now serves 50% of end user requests
<https://phabricator.wikimedia.org/T290536>. Now that half of the traffic
is on WikiKube, Service Operations is working with Release Engineering to
adapt monitoring tools to surface any issues during deployments, so that
the ramp-up can continue. In addition, almost the entirety of what is
called internal traffic <https://phabricator.wikimedia.org/T333120> - that
is, traffic generated by applications running in the infrastructure and
reaching out to MediaWiki for various purposes - has been migrated to
WikiKube. A special mention goes to MediaWiki Jobs, which were fully
migrated just a week ago
<https://phabricator.wikimedia.org/T349796>. Feel free to follow the overall
umbrella task <https://phabricator.wikimedia.org/T290536> and/or
MediaWiki_On_Kubernetes
on Wikitech <https://wikitech.wikimedia.org/wiki/MediaWiki_On_Kubernetes>.
This migration will unlock the ability to deploy multiple versions of code
simultaneously. It will also enhance the platform's capabilities for
building isolated, dockerized environments for coding, testing and even
production debugging.
MediaWiki on Kubernetes will allow us to deprecate and eventually remove a
lot of our in-house developed code. Another benefit is that we will be able
to react better to sudden traffic spikes, such as those caused by
newsworthy events, since scaling up and down is just a configuration
change. This enables efficient placement of workloads, packing them in a
more environmentally friendly way and increasing hardware utilization.
It takes a village to get this far: A big thanks to the Service Operations
team (Clément Goubert, Giuseppe Lavagetto, Alexandros Kosiaris, Kamila
Součková, Hugh Nowlan, Effie Mouzeli, Reuven Lazarus, Jannis Mayboom and
Kavitha Appakayala) for their leadership on this project, the Release
Engineering team (specifically: Dan Duvall, Jeena Huneidi, Tyler Cipriani
and Ahmon Dancy) for their work on Blubber
<https://wikitech.wikimedia.org/wiki/Blubber>, the Deployment Pipeline
<https://wikitech.wikimedia.org/wiki/Deployment_pipeline> and Scap
<https://wikitech.wikimedia.org/wiki/Scap>, Dom Walden from Quality and
Test Engineering for crafting and executing a plan to test the first
deployment of MediaWiki on Kubernetes, and everyone else who has
contributed in one way or another! <3
More milestones:
We’re seeing the first light at the end of the tunnel of another
multi-year initiative: The *parser unification*
<https://www.mediawiki.org/wiki/Parsoid/Parser_Unification>. One week ago,
Parsoid Read Views was rolled out to the first wikis: Parsoid is now the
default read views renderer on the Foundation’s Office Wiki and Wikitech
DiscussionTools. This early experimentation allows us to find issues in a
limited space, which will help us evaluate the readiness of the feature and
increase our confidence for future rollouts
<http://mediawiki.org/wiki/Parsoid/Parser_Unification/Confidence_Framework>.
A huge thanks to Subbu Sastry, Mateus Santos, CScott, Isabelle
Hubert-Pallatin, Arlo Breault, Shannon Bailey, Yiannis Giannelos and Sérgio
Lopes for making this work! Many thanks also to Daniel Kinzler: His work on
the RESTBase deprecation directly helped us get to this milestone :-)
*RESTBase deprecation*: We have been continuously working on decoupling
services from RESTBase, aiming for the modernisation and sustainability of
Wikimedia products in our services platform. The MediaWiki Interfaces team
finished up reimplementation of Reading Lists endpoints in MW REST API
<https://phabricator.wikimedia.org/T348491> and are now confirming with
affected callers <https://phabricator.wikimedia.org/T357478> that the new
endpoints meet their needs before rerouting calls
<https://phabricator.wikimedia.org/T348493> and retiring old code
<https://phabricator.wikimedia.org/T348494>. The overall effort
<https://phabricator.wikimedia.org/T336693> will not only move us forward
on RESTBase retirement, but also reduce the total amount of code we have to
maintain. Many thanks to Bill Pirkle, Atieno Njira, Wendy Quarshie, and
Daniel Kinzler for making this work! We also fully turned off Parsoid cache
storage in RESTBase - clients will get outputs directly from MediaWiki and
the cache will be handled by ParserCache. Next, we will re-route clients
directly to MediaWiki and fully remove Parsoid from RESTBase (T344944
<https://phabricator.wikimedia.org/T344944>). The Page Content Service
(PCS) will also handle its own cache
<https://phabricator.wikimedia.org/T348995> and we are ready to test the
new capabilities in staging soon. Many thanks to Yiannis Giannelos and the
Content Transform team and to Daniel Kinzler for his efforts and support on
decoupling Parsoid from RESTBase!
All of these multi-year initiatives help us increase sustainability and
maintainability of the platform, streamline engineering and developer
workflows, and unlock the path for new and improved platform capabilities
and product opportunities.
Outlook: Knowledge Platform in the annual plan 2024/2025
The Wikimedia Foundation has recently published the draft objectives by the
Product & Technology department
<https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2024-2025/…>
for the next annual plan on Meta, alongside an introduction by Selena
Deckelmann and a few questions that we’re exploring
<https://meta.wikimedia.org/wiki/Talk:Wikimedia_Foundation_Annual_Plan/2024-…>.
Your input on these questions is very welcome!
The draft objectives include “Knowledge Platform I” - centered around
MediaWiki platform evolution and “Knowledge Platform II” - centered around
developer/engineering services and workflows. The objectives show only the
high-level direction for next year. The draft “key results” (currently work
in progress) will give a better idea of what areas of work we’re thinking
about. We’ll publish these in March and share the link and an invitation
for feedback with this list again.
Thanks all for reading,
Birgit
--
Birgit Müller (she/her)
Director of Product, MediaWiki and Developer Experiences
Wikimedia Foundation <https://wikimediafoundation.org/>