Help talk:Template limits


Analyzing a page's PEIS

If you need to reduce a page's post-expand include size, it helps to understand where that PEIS comes from. Is there a tool that, given a particular page, tells you how much its different parts contribute to the PEIS? One possibility would be to break it down by section (like the "Section sizes" template, but for PEIS instead of source size); another would be to break it down by template name (e.g., all of this page's calls to the "cite" template contribute X to the page's PEIS). Jmdyck (talk) 18:12, 15 June 2020 (UTC)[reply]
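(A possible starting point: the parser's limit report, which includes the page's total PEIS, can be fetched through the Action API's parse module with prop=limitreportdata. A minimal sketch in Python; the exact JSON shape may vary with the API's formatversion:)

# Minimal sketch: fetch a page's parser limit report (including
# post-expand include size) via the MediaWiki Action API.
import requests

API = "https://en.wikipedia.org/w/api.php"
data = requests.get(API, params={
    "action": "parse",
    "page": "Help:Template limits",
    "prop": "limitreportdata",
    "format": "json",
}).json()
for entry in data["parse"]["limitreportdata"]:
    # Each entry has a limit name plus positional values, e.g.
    # limitreport-postexpandincludesize -> [bytes used, byte limit].
    print(entry["name"], entry.get("0"), entry.get("1"))

(This only gives the page total, not a per-section or per-template breakdown, so it does not fully answer the question above.)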

There’s already a report in the HTML source, which shows template expansion times rather than PEIS, like this:
<!--
Transclusion expansion time report (%,ms,calls,template)
100.00%   13.596      1 -total
 47.24%    6.423      1 Template:Resolved
 25.82%    3.511      1 Template:Hmbox
 24.92%    3.389      9 Template:Tl
 18.80%    2.556      1 Template:Z48
-->
I can imagine something like this for PEIS as well. (I don’t know how technically feasible it is, though.) —Tacsipacsi (talk) 00:50, 17 June 2020 (UTC)[reply]
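(For anyone who wants to work with that report programmatically, a minimal sketch that pulls the comment out of a rendered page's HTML and parses it into rows, assuming the report keeps the (%,ms,calls,template) format shown above:)

# Minimal sketch: extract and parse the "Transclusion expansion time
# report" comment from a rendered page's HTML.
import re
import requests

html = requests.get("https://en.wikipedia.org/wiki/Help:Template_limits").text
match = re.search(
    r"Transclusion expansion time report \(%,ms,calls,template\)\s*\n(.*?)-->",
    html, re.S)
if match:
    for line in match.group(1).strip().splitlines():
        percent, ms, calls, template = line.split(None, 3)
        print(f"{template:<40} {ms:>9} ms  x{calls}")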

Necessity of the "post-expand include size"-limit, identifying the core problem and possible alternative solutions

I ran into a problem with the post-expand include size limit when trying to transclude multiple sections in something related to the article 2020 in science.

  • The problem seems to be that each of the transcluded sections has many references, which causes a long transclusion expansion time (unbelievably, over 7 seconds for content that is only text and images, which are supposed to lazy-load in 2020)
    • The transclusion expansion time can be checked in the source code of the article. For a tested version of the article it was as follows (the listed percentages summing to more than 100% seems to be an additional problem):
Transclusion expansion time report (%,ms,calls,template)
100.00% 7733.218      1 -total
 61.75% 4774.901      1 Template:Reflist
 23.74% 1836.005    293 Template:Cite_journal
 22.62% 1749.421    502 Template:Cite_news
  5.67%  438.425      4 Template:Fix
  5.50%  425.630      1 Template:Overly_detailed_inline
  5.47%  422.810     12 Template:Category_handler
  4.52%  349.686      1 Template:CVE
  3.32%  257.085     80 Template:Cite_web
  1.89%  145.880      7 Template:Convert
  • Due to these problems I created an issue on Phabricator here.
  • However, my recent questions there have not been answered, which may be because the readers of the task don't know the answers either. This is why I'm asking you, dear watchers/readers of this talk page.
  • The problem I'm having seems to be shared by a substantial number of other editors and articles, including many COVID-19-related ones.
  • The questions:
    • Is there, or can there be, any near-term solution for including more text via transclusions (i.e. up to 12 section transclusions, each with many references)?
      • One alternative or near-term solution I could imagine would be a software change that allows for collapsed sections which only load their content after a [show] button is clicked and/or when a section is expanded in the mobile view. What do you think of this, and is there already an issue for it? It could also detect whether the client device is a mobile phone (and/or has a slow Internet connection) and, if not, preload the content of these collapsed sections.
    • According to #Why are there limits?, the rationale for this artificial limit seems to be that a) the content is considered a "large quantity of data", b) that amount is slow to load, and c) it can be "used to mount a denial of service (DoS) attack on the servers".
      • However, 2 MB is not a large quantity of data in 2020: it's about the size of images commonly shared on image-sharing websites, and it should shrink to a fraction of that size when compressed (see the compression sketch after this list). Is compression used as much, and as efficiently, as it could be? There are many websites that a) serve a lot more data, b) load a lot quicker, and c) often deploy modern Web techniques like lazy-loading content, infinite scrolling, etc.
      • If the Wikimedia servers can't handle requests for 2 MB of uncompressed data (mostly text), I think there's something really wrong with the hardware or the software, especially considering the amount of donated money theoretically available to upgrade either or both.
      • I don't think more content in pages would enable worse DoS attacks than are otherwise possible. And if it did, that should probably be mitigated, like any other DoS attack, with adequate technical measures (such as limiting the number of requests for a large page from a single IP).
    • Are templates and transclusions not prerendered/preparsed after a change to the template or to the transcluded target section? Does including templates and transclusions really require the servers to process every single request? There would be something seriously wrong with the templating architecture if that were the case: no wonder there's a high load on the servers if templates trigger computing-intensive parsing at every page load. Is there a task and/or a suggested solution for this somewhere? From this it seems that transcluded templates are updated after they're changed, and are not somehow dynamically loaded at page load.
    • Why do the reference templates take so long to expand? Shouldn't they be just text (partly hyperlinks)? There seems to be something really wrong with the reference template design, or with the template mechanism in general, if the transclusion expansion times (see the example above) reflect the real load times.
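(A quick way to sanity-check the compression point above is to fetch a rendered article and compare its raw versus gzip-compressed size. A minimal sketch; requests transparently decompresses the HTTP response, so we re-compress locally:)

# Minimal sketch: raw vs. gzip-compressed size of a rendered article.
import gzip
import requests

html = requests.get("https://en.wikipedia.org/wiki/2020_in_science").text
raw = html.encode("utf-8")
packed = gzip.compress(raw)
print(f"raw: {len(raw)/1e6:.2f} MB, gzipped: {len(packed)/1e6:.2f} MB "
      f"({len(packed)/len(raw):.0%} of original)")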

I'd appreciate it if somebody could answer these questions here or at the Phabricator issue.

tl;dr: just read the text highlighted in bold.

Thank you.

--Prototyperspective (talk) 21:18, 26 September 2020 (UTC)[reply]

The limits really do need to be raised; it is 2020. There should also be higher limits on certain "whitelisted" pages, with the understanding that purging of these pages' caches may be de-prioritized if the work queue gets long. I know you were writing about limits on time, but there are whole classes of articles, mostly sports-related, that run up against other limits, including the post-expand include size and expensive parser function count limits. Wikipedia needs a better way to handle these. davidwr/(talk)/(contribs) 23:43, 26 September 2020 (UTC)[reply]
Workaround to the partial-transclusion issue: instead of marking off sections in Article A to be transcluded by Article B, SPLIT Article A into parts, and have both Article A and Article B transclude the sub-parts. It's not the normal way of doing things, but sometimes when you hit hard limits you have to WP:Ignore all rules to get the job done. Just be sure to document what you did and why, so it can be undone when the limits are eventually raised. davidwr/(talk)/(contribs) 23:54, 26 September 2020 (UTC)[reply]
Thanks for the helpful comments. I also think they should be raised, at least for whitelisted pages. However, I propose solving the problems that caused people to create these artificial limits (or to set them as low as they currently are).
Afaik the post-expand include size limit is directly related to the transclusion expansion time, which is why the latter is so long when a page hits the former's limit. I think solving whatever causes long transclusion expansion times would also allow for higher post-expand include size limits. From the research, it looks like the reference templates, or rather the way templating works in Wikipedia, are what cause long transclusion expansion times.
I'm not sure I understood your suggested workaround: what I tried was having Article A transclude sections from Article B as well as sections from Article C. So far I haven't seen any temporary workaround in articles with similar problems, such as COVID-19 pandemic in Japan.
--Prototyperspective (talk) 09:37, 27 September 2020 (UTC)[reply]
I don't remember exactly what they are, but I think there are two very large Middle-East-related templates that are each transcluded into more than one article. If limits and performance were not an issue, one or both would be better done as a "transcluded section" in one of the articles they are currently transcluded into. I do NOT know whether they are set up as templates for technical reasons or whether that was just an editorial decision by whoever made them.
Here is something you can try as a test:
1. Take several science articles and copy them to your personal userspace. Be sure to remove categories and the like.
2. Carve off sections into stand-alone pages in your userspace.
3. Copy the "big" article that transcludes from several science articles into your userspace, again removing categories and the like. Have it transclude these carved-off pages instead.
4. Compare the parser limit report data against that of the "live" version and see if there is any significant difference (a sketch automating this comparison follows below).
davidwr/(talk)/(contribs) 15:55, 27 September 2020 (UTC)[reply]
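(For the comparison step, a minimal sketch that fetches the parser limit report for both versions via the Action API and prints the entries that differ. The userspace page name is hypothetical; substitute your own copy:)

# Minimal sketch: diff the parser limit reports of two pages.
import requests

API = "https://en.wikipedia.org/w/api.php"

def limit_report(page):
    data = requests.get(API, params={
        "action": "parse", "page": page,
        "prop": "limitreportdata", "format": "json",
    }).json()
    return {e["name"]: e.get("0") for e in data["parse"]["limitreportdata"]}

live = limit_report("2020 in science")
test = limit_report("User:Example/2020 in science")  # hypothetical copy
for name in sorted(set(live) | set(test)):
    if live.get(name) != test.get(name):
        print(name, live.get(name), "->", test.get(name))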

Move to help namespace?

Looks like a help page to me — Martin (MSGJ · talk) 20:20, 21 October 2020 (UTC)[reply]

Constant expressions

Will constant expressions only be expanded once (until the template code is modified)? Trigenibinion (talk) 18:22, 22 January 2021 (UTC)[reply]

Whitespace

It is very bad that whitespace is being counted. Trigenibinion (talk) 14:05, 27 January 2021 (UTC)[reply]
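(One way to see how much whitespace contributes: expand a template call through the Action API's expandtemplates module and count the whitespace characters in the result. A minimal sketch; PEIS is measured in bytes of expanded wikitext, so this character count is an approximation:)

# Minimal sketch: how much of a template's expansion is whitespace?
import requests

API = "https://en.wikipedia.org/w/api.php"
r = requests.get(API, params={
    "action": "expandtemplates",
    "text": "{{Resolved}}",
    "prop": "wikitext",
    "format": "json",
}).json()
expanded = r["expandtemplates"]["wikitext"]
ws = sum(c.isspace() for c in expanded)
print(f"{len(expanded)} characters expanded, {ws} of them whitespace")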

You are invited to join the discussion at Wikipedia:Categories for discussion/Log/2021 May 7 § Category:Pages where template include size is exceeded. * Pppery * it has begun... 17:12, 7 May 2021 (UTC)[reply]