<feed xmlns="http://www.w3.org/2005/Atom"><title>Leave it to the prose</title><id>https://phabricator.wikimedia.org/phame/blog/feed/9/</id><link rel="self" type="application/atom+xml" href="https://phabricator.wikimedia.org/phame/blog/feed/9/" /><updated>2026-01-18T08:04:51+00:00</updated><entry><title>How and why we moved our skins to Mustache</title><link href="/phame/live/9/post/290/how_and_why_we_moved_our_skins_to_mustache/" /><id>https://phabricator.wikimedia.org/phame/post/view/290/</id><author><name>Jdlrobson (Jon Robson)</name></author><published>2022-07-15T20:24:12+00:00</published><updated>2026-01-18T08:04:51+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p>As part of the <a href="https://www.mediawiki.org/wiki/Reading/Web/Desktop_Improvements" class="remarkup-link remarkup-link-ext" rel="noreferrer">desktop improvements project</a> we spent time investing in the core code that powers skins. With support from volunteers (the majority of this support coming from the prolific <a href="https://phabricator.wikimedia.org/p/Ammarpad/" class="phui-tag-view phui-tag-type-person " data-sigil="hovercard" data-meta="0_10"><span class="phui-tag-core phui-tag-color-person">@Ammarpad</span></a>), we identified code patterns and made changes to the <a href="/tag/mediawiki-core-skin-architecture/" class="phui-tag-view phui-tag-type-shade phui-tag-blue phui-tag-shade phui-tag-icon-view " data-sigil="hovercard" data-meta="0_9"><span class="phui-tag-core "><span class="visual-only phui-icon-view phui-font-fa fa-briefcase" data-meta="0_8" aria-hidden="true"></span>MediaWiki-Core-Skin-Architecture</span></a> to retroactively define a data layer API for generating a skin.</p>

<p>Once this was in place, we updated the legacy MediaWiki skins Monobook, Modern, and CologneBlue to use Mustache, bringing them in line with how Vector and Minerva were built.</p>

<p>The rationale for doing this was as follows:</p>

<ol class="remarkup-list">
<li class="remarkup-list-item">We wanted to centralize code in core and standardize markup to make it easier to roll out changes to all skins. We often found ourselves updating every skin for every small change, or forced to use specific classes to mark up elements (e.g. <a href="https://phabricator.wikimedia.org/T248137" class="phui-tag-view phui-tag-type-object " data-sigil="hovercard" data-meta="0_0"><span class="phui-tag-core-closed"><span class="phui-tag-core phui-tag-color-object">T248137</span></span></a>, <a href="https://phabricator.wikimedia.org/T253938" class="phui-tag-view phui-tag-type-object " data-sigil="hovercard" data-meta="0_1"><span class="phui-tag-core-closed"><span class="phui-tag-core phui-tag-color-object">T253938</span></span></a>).</li>
<li class="remarkup-list-item">We wanted to move away from server-side technologies to client-side technologies to play better to the strengths of frontend engineers and designers who worked on skins.</li>
<li class="remarkup-list-item">Since many of these skins do not see active development, we wanted to support them better by reducing lines of code.</li>
<li class="remarkup-list-item">Many of the skins didn&#039;t support certain extensions because they used different code paths (for example, some skins didn&#039;t run hooks that features relied on), e.g. <a href="https://phabricator.wikimedia.org/rSMNB6ce3ce1acb68f0a3fdf1bd8824f6d0717bffa320" class="phui-tag-view phui-tag-type-object " data-sigil="hovercard" data-meta="0_4"><span class="phui-tag-core phui-tag-color-object">6ce3ce1acb68f0a3fdf1bd8824f6d0717bffa320</span></a> <a href="https://phabricator.wikimedia.org/T259400" class="phui-tag-view phui-tag-type-object " data-sigil="hovercard" data-meta="0_2"><span class="phui-tag-core-closed"><span class="phui-tag-core phui-tag-color-object">T259400</span></span></a>.</li>
<li class="remarkup-list-item">We wanted to stop supporting features in core that were never widely adopted, e.g. <a href="https://phabricator.wikimedia.org/T97892" class="phui-tag-view phui-tag-type-object " data-sigil="hovercard" data-meta="0_3"><span class="phui-tag-core-closed"><span class="phui-tag-core phui-tag-color-object">T97892</span></span></a>.</li>
</ol>

<p>This process reduced 106,078 lines of code to 85,310 lines of code - a 20% decrease.<br />
Before the change, around 45% of skin code was PHP. After the change, PHP accounted for only 15% of the code.</p>

<p>It would be great to migrate Timeless in the future too, but keeping Timeless on the legacy skin platform does help hold us accountable for continuing to support skins built on it.</p>

<h2 class="remarkup-header">Methodology for results</h2>

<p>To measure the code makeup, we can run <a href="https://github.com/github/linguist" class="remarkup-link remarkup-link-ext" rel="noreferrer">github-linguist</a> before and after the change.</p>

<h3 class="remarkup-header">Monobook</h3>

<p>Before:</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">46.53%  22713      Less
36.83%  17981      PHP
16.53%  8071       JavaScript
0.10%   50         CSS
Lines of code: 48815</pre></div>

<p>After the change (<a href="https://phabricator.wikimedia.org/rSMNBabe94aa4082dbc4f8b9060528a1b4fea2d0af0f1" class="phui-tag-view phui-tag-type-object " data-sigil="hovercard" data-meta="0_5"><span class="phui-tag-core phui-tag-color-object">abe94aa4082dbc4f8b9060528a1b4fea2d0af0f1</span></a>):</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">59.28%  22831      Less
20.96%  8071       JavaScript
11.67%  4496       Mustache
7.96%   3066       PHP
0.13%   50         CSS
Lines of code: 38514</pre></div>



<h3 class="remarkup-header">Modern</h3>

<p>Before:</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">52.25%  13752      CSS
40.99%  10790      PHP
4.16%   1094       Less
2.61%   686        JavaScript
Lines of code: 26322</pre></div>

<p>After the change (<a href="https://phabricator.wikimedia.org/rSMODc74d67950b6de2bafd9e3b1e05e601caaa7d9452" class="phui-tag-view phui-tag-type-object " data-sigil="hovercard" data-meta="0_6"><span class="phui-tag-core phui-tag-color-object">c74d67950b6de2bafd9e3b1e05e601caaa7d9452</span></a>):</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">68.87%  13877      CSS
18.22%  3672       Mustache
5.43%   1094       Less
4.07%   821        PHP
3.40%   686        JavaScript
Lines of code: 20150</pre></div>



<h3 class="remarkup-header">Cologne Blue</h3>

<p>Before:</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">62.00%  19183      PHP
34.82%  10773      CSS
2.22%   686        JavaScript
0.97%   299        Less
Lines of code: 30941</pre></div>

<p>After the change (<a href="https://phabricator.wikimedia.org/rSCBLbf06742467f6c6c2bb42367f2e073eb26ed5d495" class="phui-tag-view phui-tag-type-object " data-sigil="hovercard" data-meta="0_7"><span class="phui-tag-core phui-tag-color-object">bf06742467f6c6c2bb42367f2e073eb26ed5d495</span></a>):</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">40.40%  10765      CSS
31.87%  8491       PHP
24.04%  6405       Mustache
2.57%   686        JavaScript
1.12%   299        Less
Lines of code: 26646</pre></div>



<h3 class="remarkup-header">PHP</h3>

<p>The total number of lines of PHP before the change: 47954<br />
After the change:  12378 lines of PHP<br />
(This is a 74% decrease in lines of code)</p></div></content></entry><entry><title>Should Vector be responsive?</title><link href="/phame/live/9/post/286/should_vector_be_responsive/" /><id>https://phabricator.wikimedia.org/phame/post/view/286/</id><author><name>Jdlrobson (Jon Robson)</name></author><published>2022-05-09T22:31:45+00:00</published><updated>2022-06-23T20:35:10+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p>Here I share some thoughts around the history of &quot;responsive&quot; MediaWiki skins and how we might want to think about it for Vector.</p>

<p>The buzzword &quot;responsive&quot; is thrown around a lot in Wikimedia-land, but essentially what we are talking about is whether to include a single tag in the page. Adding a meta tag with the name viewport tells the page how to adapt to a mobile device.</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">&lt;meta name=&quot;viewport&quot; content=&quot;width=device-width, initial-scale=1&quot;&gt;</pre></div>

<p>More information: <a href="https://css-tricks.com/snippets/html/responsive-meta-tag/" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://css-tricks.com/snippets/html/responsive-meta-tag/</a></p>

<p>Because the viewport tag must be added explicitly, websites are not mobile-friendly by default. Since the traditional Wikimedia skins were built before mobile sites and this tag existed, CologneBlue, Modern, and Vector did not add it.</p>

<p>When viewing these skins on mobile, the content will not adapt to the device and will instead appear zoomed out. One benefit of this is that the reader sees a design consistent with the one they see on desktop. The interface is familiar and easy enough to navigate, as the user can pinch and zoom to parts of the UI. The downside is that reading is very difficult and requires far more hand manipulation to move between sentences and paragraphs, and for this reason many search engines will penalize such pages, reducing traffic.</p>

<h2 class="remarkup-header">Enter Minerva</h2>

<p>The Minerva skin (and MobileFrontend before it) was introduced to allow us to start adapting our content for mobile. This turned out to be a good decision, as it prevented our projects from being penalized in search rankings. However, building Minerva showed that making content mobile-friendly takes more than adding a meta tag. For example, many templates used HTML elements with fixed widths that were bigger than the available space. This was notably a problem with large tables. Minerva swept many of these issues under the rug with generic fixes (for example, enforcing horizontal scrolling on tables). Minerva took a bottom-up approach, adding features only after they were mobile-friendly. The result was a minimal experience that was not popular with editors.</p>

<h2 class="remarkup-header">Timeless</h2>

<p>Timeless was the 2nd responsive skin added to Wikimedia wikis. It was popular with editors because it took the opposite approach to Minerva: a top-down approach, adding features despite their shortcomings on a mobile screen. It ran into many of the same issues that Minerva had (e.g. large tables) and copied many of Minerva&#039;s solutions.</p>

<h2 class="remarkup-header">MonoBook</h2>

<p>During the building of Timeless, the Monobook skin was made responsive (<a href="https://phabricator.wikimedia.org/T195625" class="phui-tag-view phui-tag-type-object " data-sigil="hovercard" data-meta="0_15"><span class="phui-tag-core-closed"><span class="phui-tag-core phui-tag-color-object">T195625</span></span></a>). Interestingly, this led <a href="https://de.wikipedia.org/w/index.php?title=Wikipedia:Fragen_zur_Wikipedia&amp;oldid=178125403#Merkw%C3%BCrdige_Seitendarstellung" class="remarkup-link remarkup-link-ext" rel="noreferrer">to a lot of backlash from users (particularly on German Wikipedia)</a>, revealing that many users did not want a skin that adapted to the screen (presumably for the reasons I outlined earlier: while reading is harder, it&#039;s easier to get around a complex site). Because of this, a preference was added to allow editors to disable responsive mode (the viewport tag). This preference was later generalized to apply to all skins:<br />
<div class="phabricator-remarkup-embed-layout-left"><a href="https://phab.wmfusercontent.org/file/data/gqpz6xyllkte5b75askd/PHID-FILE-djfe423t22fujnpj5mip/Screen_Shot_2022-05-09_at_2.47.31_PM.png" class="phabricator-remarkup-embed-image" data-sigil="lightboxable" data-meta="0_11"><img src="https://phab.wmfusercontent.org/file/data/7ypcemkvwkcaqva4r7sh/PHID-FILE-ty6hzzep7ifqkannjibv/preview-Screen_Shot_2022-05-09_at_2.47.31_PM.png" width="220" height="149.06896551724" alt="Screen Shot 2022-05-09 at 2.47.31 PM.png (786×1 px, 182 KB)" /></a></div></p>

<h2 class="remarkup-header">Responsive Vector</h2>

<p>Around the same time, several attempts were made by volunteers to force Vector to work as a responsive skin. This was feature-flagged, given the backlash to MonoBook&#039;s responsive mode. The feature saw little development, presumably because many gadgets popped up providing the same service.</p>

<h2 class="remarkup-header">Vector 2022</h2>

<p>The feature flag for responsive Vector was removed for legacy Vector in <a href="https://phabricator.wikimedia.org/T242772" class="phui-tag-view phui-tag-type-object " data-sigil="hovercard" data-meta="0_16"><span class="phui-tag-core-closed"><span class="phui-tag-core phui-tag-color-object">T242772</span></span></a> and efforts were redirected into making the new Vector responsive. Currently, the Vector skin can be resized comfortably down to 500px. It does not yet add a viewport tag, so it does not adapt to a mobile screen.</p>

<p>However, during the building of the table of contents, many mobile users started complaining (<a href="https://phabricator.wikimedia.org/T306910" class="phui-tag-view phui-tag-type-object " data-sigil="hovercard" data-meta="0_17"><span class="phui-tag-core-closed"><span class="phui-tag-core phui-tag-color-object">T306910</span></span></a>). The reason for this was that when you don&#039;t define a viewport tag, the browser makes decisions for you. To avoid these kinds of issues popping up, it might make sense for us to define an explicit viewport to request content that appears scaled out at a width of our choosing. For example, we could explicitly set a width of 1200px with a zoom level of 0.25, and users would see:<br />
<div class="phabricator-remarkup-embed-layout-left"><a href="https://phab.wmfusercontent.org/file/data/mwoufc5jtl3nwqewjdff/PHID-FILE-s73przw3knp2gqznxoiv/Screenshot_20220509-151012_Chrome.jpg" class="phabricator-remarkup-embed-image" data-sigil="lightboxable" data-meta="0_12"><img src="https://phab.wmfusercontent.org/file/data/ln25ut3xht3st4q7hedr/PHID-FILE-ps2cxp2c6nxzvmoy6rqu/preview-Screenshot_20220509-151012_Chrome.jpg" width="107.02702702703" height="220" alt="Screenshot_20220509-151012_Chrome.jpg (2×1 px, 651 KB)" /></a></div></p>

<p>If Vector were responsive, it would encourage people to think about mobile-friendly content as they edit on mobile. If editors insist on using the desktop skin on their mobile phones rather than Minerva, they have their reasons, but by not serving them a responsive skin, we are encouraging them to create content that does not work in Minerva and other skins that adapt to the mobile device.</p>

<p>There is a little more work needed on our part to deal with content that cannot fit into narrow viewports, e.g. below 500px down to 320px. Currently, if the viewport tag is set, a horizontal scrollbar will be shown; for example, the header does not adapt to that breakpoint:<br />
<div class="phabricator-remarkup-embed-layout-left"><a href="https://phab.wmfusercontent.org/file/data/57ehgmxhb6h6ipu2vldu/PHID-FILE-jiactrqxeuidwpoc2b4j/Screenshot_20220509-152651_Chrome.jpg" class="phabricator-remarkup-embed-image" data-sigil="lightboxable" data-meta="0_13"><img src="https://phab.wmfusercontent.org/file/data/ia37hgb2mdp5x5bw3x3s/PHID-FILE-5tlqqcuyuxlyqbxs26ok/preview-Screenshot_20220509-152651_Chrome.jpg" width="107.02702702703" height="220" alt="Screenshot_20220509-152651_Chrome.jpg (2×1 px, 571 KB)" /></a></div><br />
<div class="phabricator-remarkup-embed-layout-left"><a href="https://phab.wmfusercontent.org/file/data/zz3eqnlqzdhqilj2tubp/PHID-FILE-xf5b2xxtk6u3ic6tfl3h/Screenshot_20220509-152646_Chrome.jpg" class="phabricator-remarkup-embed-image" data-sigil="lightboxable" data-meta="0_14"><img src="https://phab.wmfusercontent.org/file/data/m2os5eenqfswmvlxnpc6/PHID-FILE-g2r3b6vdfrcd6myoq3pi/preview-Screenshot_20220509-152646_Chrome.jpg" width="107.02702702703" height="220" alt="Screenshot_20220509-152646_Chrome.jpg (2×1 px, 579 KB)" /></a></div></p>

<h2 class="remarkup-header">Decisions to be made</h2>

<ol class="remarkup-list">
<li class="remarkup-list-item">Should we enable Vector 2022&#039;s responsive mode? The only downside of doing this is that some users may dislike it and need to visit preferences to opt out.</li>
<li class="remarkup-list-item">When a user doesn&#039;t want responsive mode, should we be more explicit about what we serve them? For example, should we tell a mobile device to render at a width of 1000px with a scale of 0.25 (1/4 of the normal size)? This would avoid issues like <a href="https://phabricator.wikimedia.org/T306910" class="phui-tag-view phui-tag-type-object " data-sigil="hovercard" data-meta="0_18"><span class="phui-tag-core-closed"><span class="phui-tag-core phui-tag-color-object">T306910</span></span></a>. Example code [1], <a href="https://patchdemo.wmflabs.org/wikis/d3c5f6174c/wiki/Spain" class="remarkup-link remarkup-link-ext" rel="noreferrer">demo</a>.</li>
<li class="remarkup-list-item">Should we apply the responsive mode to legacy Vector too? This would fix <a href="https://phabricator.wikimedia.org/T291656" class="phui-tag-view phui-tag-type-object " data-sigil="hovercard" data-meta="0_19"><span class="phui-tag-core-closed"><span class="phui-tag-core phui-tag-color-object">T291656</span></span></a> as it would mean the option applies to all skins.</li>
</ol>

<p>[1]</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">&lt;meta name=&quot;viewport&quot; content=&quot;width=1400px, initial-scale=0.22&quot;&gt;</pre></div>

</div></content></entry><entry><title>Creating a vue.js based skin with server side rendering</title><link href="/phame/live/9/post/243/creating_a_vue.js_based_skin_with_server_side_rendering/" /><id>https://phabricator.wikimedia.org/phame/post/view/243/</id><author><name>Jdlrobson (Jon Robson)</name></author><published>2021-05-27T21:54:25+00:00</published><updated>2021-06-13T21:21:44+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p>For WMF staff&#039;s inspiration week, I decided to take a step back from my work building out a new skin architecture and a redesign of Vector and put myself into the shoes of a skin developer to see whether the changes my team had made actually made life easier. As a secondary objective, I was interested in how a MediaWiki skin could be written in Vue.js and what the challenges were to get there.</p>

<h2 class="remarkup-header">What I built</h2>

<p>I decided to build a new skin called <a href="https://www.mediawiki.org/wiki/Skin:Alexandria#/" class="remarkup-link remarkup-link-ext" rel="noreferrer">Alexandria</a> named after <a href="https://en.wikipedia.org/wiki/Library_of_Alexandria" class="remarkup-link remarkup-link-ext" rel="noreferrer">the Great Library</a>. The design was modeled on an open-source project and website I volunteer for called OpenLibrary.org.</p>

<p>I began by creating a skin that was JavaScript only.</p>

<p>I generated most of my boilerplate using the skins.wmflabs.org tool. I tweaked it so it gave me all the client-side tooling I needed.</p>

<p>Once I had that, I added a few more advanced features to my skin. In particular, I created a PHP class that extended the core SkinMustache class to allow me to extend the data given by core.</p>

<p>I wanted to be able to render my skin in JavaScript, so I needed to pass the data in PHP to the client. To do this, I added a template data value that represented a stringified JSON of the entire data that would be passed to the template like so:<br />
<a href="https://github.com/jdlrobson/Alexandria/blob/master/SkinAlexandria.php#L29" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://github.com/jdlrobson/Alexandria/blob/master/SkinAlexandria.php#L29</a></p>

<p>The skin template rendered this JSON into a data attribute.</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">&lt;div id=&quot;ol-app&quot; data-json=&quot;{{data-json}}&quot;&gt;&lt;/div&gt;</pre></div>

<p>This rendered a blank screen whose data was readable by JavaScript. While not yet a useful skin, from here I was able to begin using Vue.js to parse that data attribute and pass it through Vue components to render the skin. <a href="https://github.com/jdlrobson/Alexandria/blob/master/resources/skin.js#L20" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://github.com/jdlrobson/Alexandria/blob/master/resources/skin.js#L20</a></p>
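<p>For illustration, the client-side half of this approach can be sketched as follows. The element shape mirrors what a browser exposes via <tt class="remarkup-monospaced">dataset</tt>, and the key names in the sample payload are illustrative, not the skin&#039;s actual data:</p>

```javascript
// Sketch: read the JSON payload the PHP side embedded in the data-json
// attribute and use it as the app's initial data.
function readSkinData(el) {
  // el.dataset.json corresponds to the data-json="{{data-json}}" attribute
  // in the template above.
  return JSON.parse(el.dataset.json);
}

// Simulated element, as the browser would expose it (illustrative keys):
const appRoot = { dataset: { json: '{"msg-sitetitle":"My wiki","data-portlets":{}}' } };
const skinData = readSkinData(appRoot);
// skinData can now be handed to the Vue app when it is mounted.
```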

<p>From here, I was up and running. I had a skin made from a Vue.js component that said hello world!</p>

<h2 class="remarkup-header">Building a skin with Vue</h2>

<p>This really was a breeze.</p>

<p>I made use of the <a href="http://github.com/wikimedia/wvui" class="remarkup-link remarkup-link-ext" rel="noreferrer">wvui library</a> for existing standard components, such as TypeaheadSearch and Button, by including the wvui and vue libraries in my main skin module.</p>

<p>Keen to test out the work Roan and others did to support requesting ES6-only modules, I decided to allow myself the luxury of writing code in ES6 and excluding older browsers.</p>

<p><a href="https://github.com/jdlrobson/Alexandria/blob/master/skin.json#L86" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://github.com/jdlrobson/Alexandria/blob/master/skin.json#L86</a></p>

<p>When components didn’t exist in the wvui library, I made them, and thinking in terms of components led to well-scoped CSS. Aside from components inside the wvui library, I ended up creating components such as <a href="https://github.com/jdlrobson/Alexandria/blob/master/resources/App.vue" class="remarkup-link remarkup-link-ext" rel="noreferrer">App.vue</a>, AppArticle.vue, AppBanner.vue, AppFooter.vue, AppHeader.vue, DropdownMenu.vue, FooterMenu.vue, Portlet.vue, and TypeaheadSearch.vue.</p>

<p>One thing that was frustrating as I created/renamed these components is I had to update these in the skin.json manifest. It was unintuitive and I often forgot to do it, which made development a little more tedious. I captured this in a Phabricator ticket, as I think it’s something that could be much better in the developer experience: <a href="https://phabricator.wikimedia.org/T283388" class="phui-tag-view phui-tag-type-shade phui-tag-blue phui-tag-shade phui-tag-icon-view " data-sigil="hovercard" data-meta="0_21"><span class="phui-tag-core "><span class="visual-only phui-icon-view phui-font-fa fa-anchor" data-meta="0_20" aria-hidden="true"></span>https://phabricator.wikimedia.org/T283388</span></a>.</p>

<p>Although wvui and Vue.js were on npm, I was relying on some styles inside MediaWiki core, so I couldn’t use Vite or Parcel.js without lots of scaffolding. I decided to develop without hot reloading, which slowed me down a lot, as I was doing a lot of page refreshing.</p>

<p>Using the OpenLibrary project I copied across the CSS I needed. The resulting CSS was much better organized and scoped than the original project as I was constantly thinking about reuse.</p>

<h2 class="remarkup-header">Styling article content</h2>

<p>To generate test articles I used the MobileFrontend content provider (<a href="https://www.mediawiki.org/wiki/Extension:MobileFrontend#Testing_with_articles_on_a_foreign_wiki_(live_data))" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://www.mediawiki.org/wiki/Extension:MobileFrontend#Testing_with_articles_on_a_foreign_wiki_(live_data))</a>. I ran into a few issues with it, and found a few CSS rules for toggling that were in Minerva but should have been in MobileFrontend, so I ended up submitting patches to deal with that: <a href="https://gerrit.wikimedia.org/r/c/mediawiki/extensions/MobileFrontend/+/693503" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://gerrit.wikimedia.org/r/c/mediawiki/extensions/MobileFrontend/+/693503</a>, <a href="https://gerrit.wikimedia.org/r/c/mediawiki/extensions/MobileFrontend/+/696644" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://gerrit.wikimedia.org/r/c/mediawiki/extensions/MobileFrontend/+/696644</a></p>

<p>Styles for thumbnails and the table of contents are provided by core. Neither of these styles fit the aesthetics of my design, so I ended up adding override CSS. I would have preferred not to spend any time styling these elements, so I raised Phabricator tickets lest I forget to revisit our defaults: <a href="https://phabricator.wikimedia.org/T283836" class="phui-tag-view phui-tag-type-shade phui-tag-blue phui-tag-shade phui-tag-icon-view " data-sigil="hovercard" data-meta="0_23"><span class="phui-tag-core "><span class="visual-only phui-icon-view phui-font-fa fa-anchor" data-meta="0_22" aria-hidden="true"></span>https://phabricator.wikimedia.org/T283836</span></a> and <a href="https://phabricator.wikimedia.org/T283396" class="phui-tag-view phui-tag-type-shade phui-tag-blue phui-tag-shade phui-tag-icon-view " data-sigil="hovercard" data-meta="0_25"><span class="phui-tag-core "><span class="visual-only phui-icon-view phui-font-fa fa-anchor" data-meta="0_24" aria-hidden="true"></span>https://phabricator.wikimedia.org/T283396</span></a>.</p>

<p>I wanted to do a lot with the content, such as moving all images and infoboxes to the left, but after wrestling with inline styles and CSS grid, I gave up. It would be great if the parser marked up articles in a way that lent itself better to a grid system, but sadly it doesn’t. I didn’t know what tasks to raise here, as I hadn’t thought about it too deeply, but I want to acknowledge that this was a point of pain.</p>

<h2 class="remarkup-header">Server side rendering</h2>

<p>I then turned my attention to server-side rendering. MediaWiki currently doesn’t support server-side rendering of Vue components.  (<a href="https://phabricator.wikimedia.org/T272878" class="phui-tag-view phui-tag-type-shade phui-tag-blue phui-tag-shade phui-tag-icon-view " data-sigil="hovercard" data-meta="0_27"><span class="phui-tag-core "><span class="visual-only phui-icon-view phui-font-fa fa-anchor" data-meta="0_26" aria-hidden="true"></span>https://phabricator.wikimedia.org/T272878</span></a>)</p>

<p>To server-side render Vue components, the advised approach is a Node.js service from which PHP can request HTML. Because I’m a little crazy and didn’t want to set up such a service for now, I instead explored the differences between Mustache and Vue.js templates. I built a Node script that imported Vue components, found the template tag, and then traversed the DOM of that template, rewriting it node by node recursively into a Mustache equivalent. Constraining myself to the minimal work possible, I managed to create <a href="https://github.com/jdlrobson/Alexandria/blob/master/ssr.js" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://github.com/jdlrobson/Alexandria/blob/master/ssr.js</a></p>

<p>This mostly worked, but of course I ran into a few problems.</p>

<p>The script couldn’t load the general components in the wvui library. For these, I had to define fallback templates such as <a href="https://github.com/jdlrobson/Alexandria/blob/master/resources/TypeaheadSearch.vue" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://github.com/jdlrobson/Alexandria/blob/master/resources/TypeaheadSearch.vue</a>. For the search widget, I ended up rendering a form fallback that looked nothing like the JavaScript Vue version, but that was fine. I made sure the script threw an error so that I’d never load it accidentally in my JavaScript application.</p>

<p>I have a bad habit of giving Vue component props the same name as attributes, and I had to stop this: a parameter <tt class="remarkup-monospaced">id</tt> became <tt class="remarkup-monospaced">menuId</tt>, for example. This kept my Mustache template generator simple, since it could tell whether it was dealing with an attribute or a template property.</p>

<p>With <tt class="remarkup-monospaced">for</tt> loops, it was easier to parse <tt class="remarkup-monospaced">v-for=&quot;a in list&quot;</tt> than it was to parse something like <tt class="remarkup-monospaced">v-for=&quot;a in data.list&quot;</tt>, so I made sure that was a constraint in the Vue templates I was writing.</p>
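<p>As a toy sketch of one such rewrite rule (not the actual ssr.js script), a <tt class="remarkup-monospaced">v-for</tt> in the constrained form maps to a Mustache section like this:</p>

```javascript
// Toy version of a single rewrite rule: v-for="item in list" becomes a
// Mustache section. Only the simple `item in list` form is handled,
// matching the constraint described above; anything else is rejected.
function vForToMustacheSection(expression, innerTemplate) {
  const match = /^(\w+) in (\w+)$/.exec(expression);
  if (match === null) {
    throw new Error('Unsupported v-for expression: ' + expression);
  }
  const listName = match[2];
  return '{{#' + listName + '}}' + innerTemplate + '{{/' + listName + '}}';
}

// vForToMustacheSection('item in list', '{{text}} ') yields
// '{{#list}}{{text}} {{/list}}'
```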

<p>I decided against computed properties as these involved JavaScript so those were not supported in my proof of concept.</p>

<p>Now that I was generating a template via a build step, my skin was working even while JavaScript was loading. However, loading styles for the new experience became my next problem. I had included all my styles in Vue templates, so I now needed to get them out. I extended my build script to generate a stylesheet as well, but later backtracked on that and pulled the styles out of the Vue components. I opened <a href="https://phabricator.wikimedia.org/T283882" class="phui-tag-view phui-tag-type-shade phui-tag-blue phui-tag-shade phui-tag-icon-view " data-sigil="hovercard" data-meta="0_29"><span class="phui-tag-core "><span class="visual-only phui-icon-view phui-font-fa fa-anchor" data-meta="0_28" aria-hidden="true"></span>https://phabricator.wikimedia.org/T283882</span></a> to discuss best practices for that.</p>

<h2 class="remarkup-header">API-driven frontend!</h2>

<p>Now I had a skin rendering in Vue via JavaScript. It would be silly not to play to its strengths and make it a single-page application, loading content from JS. Unfortunately, there was no Skin API, only APIs for generating content, and various things can vary per page, such as the JavaScript/CSS loaded, mw.config values, and even items in menus... eek!</p>

<p>A while back I made <a href="https://github.com/jdlrobson/mediawiki-skins-skinjson" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://github.com/jdlrobson/mediawiki-skins-skinjson</a> to help with skin development. It allows you to see a JSON representation of the data that a skin template can render. I repurposed this as an API in my app. Pages rendered via API calls and JavaScript with ease, and the wiring up was very small: <a href="https://github.com/jdlrobson/Alexandria/blob/master/resources/App.vue#L252" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://github.com/jdlrobson/Alexandria/blob/master/resources/App.vue#L252</a>. Maybe something for the #product-infrastructure-team-backlog?</p>

<p>This allowed me to load articles; however, I ran into technical debt. Most MediaWiki extensions expect to be run on page load. Some work anyway because of their use of <tt class="remarkup-monospaced">mw.hook</tt>. Event handlers bound to the body tag using a proxy pattern also just worked. We clearly need to use that pattern more if we ever want to go down this route.</p>

<p>E.g.:</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">$(&#039;body&#039;).on(&#039;click&#039;, &#039;.uls-button&#039;, loadULS );</pre></div>

<p><a href="https://skins-demo.wmflabs.org/wiki/Alexandria?useskin=alexandria" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://skins-demo.wmflabs.org/wiki/Alexandria?useskin=alexandria</a> demonstrates article loading using the Parsoid API.</p>

<h2 class="remarkup-header">Reflection</h2>

<ul class="remarkup-list">
<li class="remarkup-list-item">Converting Vue templates to Mustache is possible with constraints</li>
<li class="remarkup-list-item">There are a few kinks in ResourceLoader that need to be worked out</li>
<li class="remarkup-list-item">API-driven skins are possible if we&#039;re willing to put in the effort across our codebases to build the APIs and rethink how our existing features load.</li>
</ul></div></content></entry><entry><title>All code is built</title><link href="/phame/live/9/post/206/all_code_is_built/" /><id>https://phabricator.wikimedia.org/phame/post/view/206/</id><author><name>Niedzielski (Stephen Niedzielski)</name></author><published>2020-07-28T13:27:02+00:00</published><updated>2025-10-28T21:14:09+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p><em>HEADER CAPTION: The head of the Statue of Liberty on exhibit at the Paris World&#039;s Fair, 1878. The statue was built in France ahead of time, shipped overseas in crates, and then assembled in New York. Image by <a href="https://commons.wikimedia.org/wiki/File:Head_of_the_Statue_of_Liberty_on_display_in_a_park_in_Paris.jpg" class="remarkup-link remarkup-link-ext" rel="noreferrer">Albert Fernique / public domain</a>.</em></p>

<p>The process of mapping human-readable source code inputs to optimized, machine-readable outputs is called compiling or, more generally, building. It&#039;s been a necessary part of software development since computers evolved past machine code. Even to serve the most abstract, high-level languages such as HTML and CSS, this build process is essential.</p>

<h3 class="remarkup-header">Just-in-time build steps</h3>

<p>We build code all the time at Wikimedia. Every page request benefits from Less compilation, CSS and JavaScript minification, internationalization, URL mapping, and bundling build steps. All of this occurs at runtime through the ResourceLoader pipeline.</p>

<p>ResourceLoader&#039;s <a href="https://wikipedia.org/wiki/Just-in-time_compilation" class="remarkup-link remarkup-link-ext" rel="noreferrer">just-in-time build process</a> is critical when key parameters vary on request. However, it has some notable limitations including:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">Every just-in-time build step must be extremely performant, so fast that it can run on-the-fly, or our pages will load slowly. Additionally, sequential steps cannot be appended ad infinitum.</li>
<li class="remarkup-list-item">Effectively, ResourceLoader&#039;s just-in-time build steps can only use tools written in PHP. JavaScript execution is not possible.</li>
<li class="remarkup-list-item">Just-in-time build steps are less secure. They execute on production servers and serve content directly to the user. This eliminates the separation between development and runtime-only dependency trees, which can dramatically increase the attack surface, sometimes by orders of magnitude. Additionally, build outputs are shipped directly to the user without any opportunity for security review. When it comes to security, a just-in-time build step always strives to be as secure as an ahead-of-time build step that produces static outputs.</li>
<li class="remarkup-list-item">Just-in-time build steps are custom and complex. An ahead-of-time build step can easily be a one-liner that invokes standard tooling, but the equivalent just-in-time build step, if one exists, is just as likely to be hundreds of lines of custom code. Historically, these custom steps have suffered from <a href="https://wikipedia.org/wiki/Bus_factor" class="remarkup-link remarkup-link-ext" rel="noreferrer">bus factor</a> and received little attention beyond basic life support. Few engineers possess the ability to write code of the caliber needed to add new build steps or change existing ones, which means the rest of Wikimedia and WMDE is blocked on their evolution. For example, we have been unable to keep pace with fundamental features like source map support (<a href="https://phabricator.wikimedia.org/T47514" class="remarkup-link" rel="noreferrer">a formal request since 2013</a>) or ES6 transpilation. In fact, there are laundry lists of missing features now standard elsewhere. The lack of standard functionality means that developing any code at Wikimedia is a completely different and far slower experience than the rest of the industry.</li>
<li class="remarkup-list-item">Just-in-time build step outputs have worse caching. The most advanced build step executed at runtime endeavors to have the same caching that comes out-of-the-box with an ahead-of-time build step: a plain file on disk.</li>
</ul>
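<p>To make the contrast concrete, here is what the "one-liner that invokes standard tooling" typically looks like as an ahead-of-time step in a project&#039;s package.json (the tool choices are illustrative):</p>

```json
{
	"scripts": {
		"build": "webpack --mode=production",
		"test": "eslint . && npm run build"
	}
}
```

<p>A single <tt class="remarkup-monospaced">npm run build</tt> before deployment replaces what would otherwise be custom runtime code on production servers.</p>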

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/ue5mymljpf7usiz344qq/PHID-FILE-g2zcptbyf77phef2xse2/assembly-line.jpeg" class="phabricator-remarkup-embed-image-full" data-sigil="lightboxable" data-meta="0_30"><img src="https://phab.wmfusercontent.org/file/data/ue5mymljpf7usiz344qq/PHID-FILE-g2zcptbyf77phef2xse2/assembly-line.jpeg" height="874" width="990" loading="lazy" alt="An old photograph of the Ford assembly line." /></a></div><em>CAPTION: No step in the pipeline can be delayed, and the longer the pipeline, the longer it takes to go from a nut to a new car. Image by <a href="https://commons.wikimedia.org/wiki/File:Ford_assembly_line_-_1913.jpg" class="remarkup-link remarkup-link-ext" rel="noreferrer">unknown author / public domain</a>.</em></p>

<h3 class="remarkup-header">Solving problems too big for just-in-time</h3>

<p>Some problems are only solvable by just-in-time build steps. However, many solutions <em>cannot</em> meet the constraints of just-in-time build steps, so only a subset of all problems can be solved. This is a more general limitation of just-in-time build steps, not the ResourceLoader implementation. In practice, this means that developers cannot add a build step to the pipeline and are left with their problem unsolved.</p>

<p>There must be an alternative. Our options include:</p>

<ol class="remarkup-list">
<li class="remarkup-list-item">Double down on building new features in ResourceLoader. This approach fails to address the fundamental limitations of all just-in-time build steps and may require reimplementing existing open-source solutions.</li>
<li class="remarkup-list-item">Ship extra tooling to every user&#039;s browser and let them process it. Besides significantly increased bandwidth and computation costs that go against our mission to serve everyone, this isn&#039;t very eco-friendly, fails to solve many problems, leads to the laggy browsing experiences users so loathe on JavaScript-heavy pages, and doesn&#039;t scale far past polyfills.</li>
<li class="remarkup-list-item">Replace ResourceLoader with industry standard tooling that has fewer constraints. This will require exploration, be expensive, and may have the same outcome as #1.</li>
<li class="remarkup-list-item">Enhance ResourceLoader by building what we can <a href="https://en.wikipedia.org/wiki/Ahead-of-time_compilation" class="remarkup-link remarkup-link-ext" rel="noreferrer">ahead-of-time</a>.</li>
</ol>

<p>The first two options don&#039;t work. The third option doesn&#039;t sound like a good first choice. The fourth is the most conventional and proven solution.</p>

<h3 class="remarkup-header">Ahead-of-time build steps</h3>

<p>Ahead-of-time build steps are usually what people think of when they refer to &quot;building code.&quot; Most build problems that remain to be solved in Wikimedia <em>only</em> fit in the ahead-of-time space. As you might expect, we&#039;re using these enhancements all over the place already and can&#039;t live without them. Some examples include:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item"><strong>OOUI</strong>: Portions of this library are built with Grunt and a suite of packages from NPM for minification, uglification, and additional processing. The results are dozens of build products that are file-copied into Core manually.</li>
<li class="remarkup-list-item"><strong>Page Previews</strong>: This gem of a codebase is fully compiled from the latest JavaScript with Webpack. It serves about two billion virtual pageviews a month.</li>
<li class="remarkup-list-item"><strong>Wikibase</strong>: Ahead-of-time build tools are used by Wikibase, including Webpack, TypeScript, and a plethora of other standard tools, to serve the Wikidata communities.</li>
<li class="remarkup-list-item"><strong>MultimediaViewer</strong>: Commits to MultimediaViewer use ahead-of-time build steps to replace any human readable source SVGs with optimized, machine-readable outputs.</li>
<li class="remarkup-list-item"><strong>MediaWiki</strong>: Core uses a build step on every deployment. The process is called &quot;a full scap.&quot; When the process fails, it&#039;s called &quot;a full scapadapadoo.&quot;</li>
<li class="remarkup-list-item"><strong>MobileFrontend</strong>: All JavaScript in MobileFrontend, the heart of the mobile site, is built by Webpack. That&#039;s over 50% of <em>all</em> pageviews benefiting from an ahead-of-time build step using industry standard tooling.</li>
<li class="remarkup-list-item"><strong>Wikipedia for KaiOS</strong>: This Webpack-powered project uses a build step to serve a highly performant web app.</li>
<li class="remarkup-list-item"><strong>ContentTranslation</strong>: The glittering new ContentTranslation app uses the Vue CLI and standard tooling to generate the next-generation interfaces essential to serving contributors around the world. Put plainly, this is the kind of modern experience that would be impossible to build without modern tooling that leverages ahead-of-time build steps.</li>
<li class="remarkup-list-item"><strong>Wikipedia.org</strong>: Portals uses a build step to synchronize sister project statistics. I know someone who has a recurring task each week reminding him &quot;it&#039;s build time.&quot; Although triggering the build step is person-powered, the outputs are what you would expect of an ahead-of-time build step: practical and project specific.</li>
<li class="remarkup-list-item"><strong>VisualEditor</strong>: VE is a sophisticated application that requires a build step. I don&#039;t know <a href="https://gerrit.wikimedia.org/r/plugins/gitiles/VisualEditor/VisualEditor/+/refs/heads/master/Gruntfile.js" class="remarkup-link remarkup-link-ext" rel="noreferrer">what this does exactly</a> but I would guess it&#039;s solving the same kinds of problems everyone else has ahead-of-time.</li>
<li class="remarkup-list-item">And <strong>many</strong> more.</li>
</ul>

<p>These ahead-of-time build steps are <em>everywhere</em> in Gruntfiles, Gulpfiles, Webpack configs, NPM package.json files, and shell scripts. Even if the Foundation mandated it today, we could never get rid of them.</p>

<h3 class="remarkup-header">Evolving the ResourceLoader pipeline with a new stage</h3>

<p>Ahead-of-time build steps are the only solution for many problems, so it&#039;s fortunate they have such a proven track record of success both within and beyond the MediaWiki ecosystem. As everyone who is already using ahead-of-time build steps has discovered, they&#039;re the perfect complement to ResourceLoader&#039;s just-in-time build steps.</p>

<p>However, this is a problem at scale and it needs to be solved at scale. Informal developer builds work surprisingly well but aren&#039;t as efficient for developers as they could be. We need to extend the pipeline to include a pre-ResourceLoader stage. This stage is an ahead-of-time build step.</p>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/ksphx6adxohtgsjzaglg/PHID-FILE-5d4nepalyeru5zpcfqjk/international-space-station.jpeg" class="phabricator-remarkup-embed-image-full" data-sigil="lightboxable" data-meta="0_31"><img src="https://phab.wmfusercontent.org/file/data/ksphx6adxohtgsjzaglg/PHID-FILE-5d4nepalyeru5zpcfqjk/international-space-station.jpeg" height="816" width="1280" loading="lazy" alt="Photograph of the International Space Station in Earth&#039;s orbit." /></a></div><em>CAPTION: The International Space Station was built on Earth in modules that were optimized for assembly and constructed in orbit. Similarly, ResourceLoader modules can be built before deployment and finally assembled in the user&#039;s browser. Image by <a href="https://commons.wikimedia.org/wiki/File:International_Space_Station_after_undocking_of_STS-132.jpg" class="remarkup-link remarkup-link-ext" rel="noreferrer">NASA/Crew of STS-132 / public domain</a>.</em></p>

<p>In conclusion:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">ResourceLoader provides useful just-in-time build steps.</li>
<li class="remarkup-list-item">Many projects have requirements that cannot be solved at runtime. These real problems are only solvable by traditional ahead-of-time build steps.</li>
<li class="remarkup-list-item">Just-in-time and ahead-of-time build steps are already in use by, and for, everyone, and we can&#039;t change that.</li>
<li class="remarkup-list-item">Ahead-of-time build steps often use standard tools but are highly project specific. These should not be centralized nor should they be constrained by artificial limitations. Per-project solution autonomy must be preserved.</li>
<li class="remarkup-list-item">Adding a pre-ResourceLoader stage can integrate neatly with the current ResourceLoader system by extending the pipeline to include these existing ahead-of-time workflows.</li>
</ul>

<p>Above all, a build step means <strong>freedom</strong>. The freedom to succeed and the freedom to use the tool that&#039;s right for the job, not the rare tool that fits into a runtime-only pipeline.</p>

<p><em>Thanks to <a href="https://meta.wikimedia.org/wiki/User:JDrewniak_(WMF)" class="remarkup-link remarkup-link-ext" rel="noreferrer">Jan Drewniak</a>, <a href="https://meta.wikimedia.org/wiki/User:Sthottingal_(WMF)" class="remarkup-link remarkup-link-ext" rel="noreferrer">Santhosh Thottingal</a>, <a href="https://meta.wikimedia.org/wiki/User:DCipoletti_(WMF)" class="remarkup-link remarkup-link-ext" rel="noreferrer">Daniel Cipoletti</a>, <a href="https://meta.wikimedia.org/wiki/User:JoeWalsh_(WMF)" class="remarkup-link remarkup-link-ext" rel="noreferrer">Joe Walsh</a>, <a href="https://meta.wikimedia.org/wiki/User:BSitzmann_(WMF)" class="remarkup-link remarkup-link-ext" rel="noreferrer">Bernd Sitzmann</a>, and <a href="https://meta.wikimedia.org/wiki/User:Monica_Pinedo_Bajo_(WMDE)" class="remarkup-link remarkup-link-ext" rel="noreferrer">Mónica Pinedo Bajo</a> for reviewing and providing detailed feedback.</em></p>

<p>This post is also available on the <a href="https://techblog.wikimedia.org/2020/07/28/all-code-is-built-build-what-you-can-before-you-ship/" class="remarkup-link remarkup-link-ext" rel="noreferrer">Wikimedia Tech Blog</a>.</p></div></content></entry><entry><title>The best documentation automation can buy</title><link href="/phame/live/9/post/194/the_best_documentation_automation_can_buy/" /><id>https://phabricator.wikimedia.org/phame/post/view/194/</id><author><name>Niedzielski (Stephen Niedzielski)</name></author><published>2020-03-22T14:24:43+00:00</published><updated>2020-06-01T15:58:41+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p><em>HEADER CAPTION: Screenshot from Wikimedia&#039;s famous Visual Editor. The typo &quot;documenation&quot; has a red squiggly line under it indicating the spell checker has automatically detected a spelling error by the author.</em></p>

<p>Tools for validating that JavaScript documentation is current and error-free have advanced significantly over the last several years. It is now possible to detect mismatches between a program&#039;s documentation and its source code automatically using a free and open-source, industry-standard type checker. This goes <em>way</em> beyond typos.</p>

<h3 class="remarkup-header">JavaScript typing is loose</h3>

<p>JavaScript is an <a href="https://stackoverflow.com/questions/964910/is-javascript-an-untyped-language" class="remarkup-link remarkup-link-ext" rel="noreferrer">untyped language</a>. Unlike in a typed language, a JavaScript program <em>always</em> runs regardless of whether the types in it are valid. Some consider JavaScript&#039;s fast-and-loose style a feature, not a bug. Notable proponents of that viewpoint include <a href="https://www.oreilly.com/library/view/javascript-the-good/9780596517748/ch01s02.html" class="remarkup-link remarkup-link-ext" rel="noreferrer">Douglas Crockford</a> and <a href="http://www.paulgraham.com/hp.html" class="remarkup-link remarkup-link-ext" rel="noreferrer">Paul Graham</a>.</p>

<p>There have been numerous articles written on the subject, but I suspect that most readers already understand the value of clear typing. For any nontrivial program with multiple authors and any longevity, especially those likely to be found among the <a href="https://meta.wikimedia.org/wiki/List_of_Wikipedias" class="remarkup-link remarkup-link-ext" rel="noreferrer">sprawling</a> <a href="https://wikimediafoundation.org/our-work/wikimedia-projects/" class="remarkup-link remarkup-link-ext" rel="noreferrer">wikis</a>, strong typing is much more practical and sustainable than the alternative. With good typing, one can quickly grasp the structure of a program. That is, you can conceptualize and interface with any well-typed API whether you understand how it works internally or not. Refactors are a lot easier too and, while not fearless, a typed codebase is far more malleable than an untyped one. Type checks are also a great way to verify your work, just like in grade school.</p>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/7mz2g2tpnjnzzp7k7s3s/PHID-FILE-b4kasbynqcbizilpqhu3/unit-conversions-whiteboard.jpg" class="phabricator-remarkup-embed-image-full" data-sigil="lightboxable" data-meta="0_32"><img src="https://phab.wmfusercontent.org/file/data/7mz2g2tpnjnzzp7k7s3s/PHID-FILE-b4kasbynqcbizilpqhu3/unit-conversions-whiteboard.jpg" height="676" width="3818" loading="lazy" alt="Picture of a whiteboard showing the complete mathematical steps for unit conversion from 65 miles per hour to kilometers per hour." /></a></div><em>CAPTION: 65 miles per hour is how many kilometers per hour? So long as the fractions are correct, we can <a href="https://wikipedia.org/wiki/Conversion_of_units#Conversion_factors" class="remarkup-link remarkup-link-ext" rel="noreferrer">validate the conversion</a> by checking that the units cancel each other out. In type checking, our function parameters, function return types, and object properties must <strong>align</strong> in a similar way but the process is automated.</em></p>

<p>Many bugs could be caught before arriving in production if every patch had its typing validated—but don&#039;t take my word for it. <a href="https://www.mediawiki.org/wiki/Continuous_integration/Phan" class="remarkup-link remarkup-link-ext" rel="noreferrer">Phan, the PHP type checker</a>, is now a required validation test for any change to MediaWiki Core as well as many extensions. It&#039;s like a bunch of built-in unit tests specifically for types. Without automation, these tests can require thousands of lines of hand-written code that are tedious and time consuming to author, read, and maintain (e.g., see the otherwise excellent <a href="https://phabricator.wikimedia.org/diffusion/EPOP/" class="remarkup-link" rel="noreferrer">Popups extension</a>). In the worst cases, no tests are written at all.</p>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/brjmjspp6ho67fczzy3e/PHID-FILE-y3lsnsm62zouy2i2abyy/clockwork.jpg" class="phabricator-remarkup-embed-image" data-sigil="lightboxable" data-meta="0_33"><img src="https://phab.wmfusercontent.org/file/data/brjmjspp6ho67fczzy3e/PHID-FILE-y3lsnsm62zouy2i2abyy/clockwork.jpg" width="640" alt="Photograph of the interior of a pocket watch showing intricate gears and fine craftsmanship." /></a></div><em>CAPTION: Types must align like clockwork or the machine stops running. Image by <a href="https://commons.wikimedia.org/wiki/File:Uhrwerk_einer_Taschenuhr.jpg" class="remarkup-link remarkup-link-ext" rel="noreferrer">ElooKoN</a> / <a href="https://creativecommons.org/licenses/by-sa/4.0" class="remarkup-link remarkup-link-ext" rel="noreferrer">CC BY-SA</a>.</em></p>

<h3 class="remarkup-header">Documentation should be correct</h3>

<p>Good typing is just as important in documentation. For JavaScript, documentation is largely written in <a href="https://wikipedia.org/wiki/JSDoc" class="remarkup-link remarkup-link-ext" rel="noreferrer">JSDoc</a> (or its <a href="https://phabricator.wikimedia.org/T138401" class="remarkup-link" rel="noreferrer">deprecated competitor, JSDuck</a>). Wikipedians seem to agree that documentation is a <em>very</em> <a href="https://doc.wikimedia.org/" class="remarkup-link remarkup-link-ext" rel="noreferrer">good</a> <a href="https://www.mediawiki.org/" class="remarkup-link remarkup-link-ext" rel="noreferrer">idea</a>. If documentation is a good idea, correct and up-to-date documentation is an even better one. There&#039;s a tool for that: it&#039;s called TypeScript.</p>

<p>If you haven&#039;t heard of TypeScript yet, it may be because <a href="https://codesearch.wmflabs.org/search/?q=.&amp;i=nope&amp;files=tsconfig.json&amp;repos=" class="remarkup-link remarkup-link-ext" rel="noreferrer">it&#039;s not very common at Wikimedia</a> except for the uber-amazing work by the <a href="https://meta.wikimedia.org/wiki/Wikimedia_Deutschland" class="remarkup-link remarkup-link-ext" rel="noreferrer">WMDE</a> and <a href="https://www.wikidata.org/" class="remarkup-link remarkup-link-ext" rel="noreferrer">Wikidata</a> communities (e.g., see <a href="https://phabricator.wikimedia.org/source/wikibase-termbox/" class="remarkup-link" rel="noreferrer">wikibase-termbox</a> which is over 80% TypeScript) as well as explorations several years back by <a href="https://www.mediawiki.org/wiki/User:JHernandez_(WMF)" class="remarkup-link remarkup-link-ext" rel="noreferrer">Joaquín Oltra Hernández</a>. However, it is now <em><a href="https://www.npmtrends.com/typescript" class="remarkup-link remarkup-link-ext" rel="noreferrer">immensely</a></em> <a href="https://insights.stackoverflow.com/survey/2019#technology-_-most-loved-dreaded-and-wanted-languages" class="remarkup-link remarkup-link-ext" rel="noreferrer">popular</a> <a href="https://github.com/microsoft/TypeScript/" class="remarkup-link remarkup-link-ext" rel="noreferrer">globally</a> and has proven itself capable of being far more than just a fashionable trend from 2012.</p>

<p>So what is TypeScript exactly? TypeScript is JavaScript with types. Whether you choose to use it for functional code like WMDE or not, <strong>TypeScript features the ability to <a href="https://github.com/joakin/check-js-with-ts" class="remarkup-link remarkup-link-ext" rel="noreferrer">lint plain JavaScript files</a> for the type correctness of their JSDocs.</strong> You don&#039;t need Webpack and you don&#039;t need to make any functional changes to your code (unless it&#039;s incorrect and out-of-sync from the documentation—i.e., bug fixes). Your JavaScript is the same as it ever was but now, if your documentation and program don&#039;t match, TypeScript will report an error.</p>
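<p>As a small, hypothetical illustration: with JSDoc checking enabled, TypeScript reads the comment below as a type signature, so the mismatched call is reported at lint time even though plain JavaScript would happily run it.</p>

```javascript
/**
 * @param {number} ms Duration in milliseconds.
 * @return {string} Human-readable duration.
 */
function formatDuration( ms ) {
	return ( ms / 1000 ).toFixed( 1 ) + 's';
}

formatDuration( 500 ); // OK: returns '0.5s'.

// With JSDoc checking on, tsc flags the next call: argument of type
// 'string' is not assignable to parameter of type 'number'. Plain
// JavaScript runs it anyway via implicit coercion, so without the
// checker this kind of bug ships silently.
formatDuration( '500' );
```

<p>No functional change was needed; the existing documentation <em>is</em> the type annotation.</p>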

<p>This isn&#039;t just better documentation; it&#039;s documentation as accurate as we can write in an automated way. Who doesn&#039;t want better documentation?</p>

<h3 class="remarkup-header">What changes are needed?</h3>

<p>Typing at the seams. In practice, this usually means documenting function inputs and outputs, and user-defined types, using <a href="https://jsdoc.app/#block-tags" class="remarkup-link remarkup-link-ext" rel="noreferrer">JSDoc syntax</a>. E.g.:</p>

<div class="remarkup-code-block" data-code-lang="js" data-sigil="remarkup-code-block"><div class="remarkup-code-header">JSDocs</div><pre class="remarkup-code"><span></span><span class="cm">/**</span>
<span class="cm"> * Template properties for a portlet.</span>
<span class="cm"> * @typedef {Object} PortletContext</span>
<span class="cm"> * @prop {string} portal-id Identifier for wrapper.</span>
<span class="cm"> * @prop {string} html-tooltip Tooltip message.</span>
<span class="cm"> * @prop {string} msg-label-id Aria identifier for label text.</span>
<span class="cm"> * @prop {string} [html-userlangattributes] Additional Element attributes.</span>
<span class="cm"> * @prop {string} msg-label Aria label text.</span>
<span class="cm"> * @prop {string} html-portal-content</span>
<span class="cm"> * @prop {string} [html-after-portal] HTML to inject after the portal.</span>
<span class="cm"> * @prop {string} [html-hook-vector-after-toolbox] Deprecated and used by the toolbox portal.</span>
<span class="cm"> */</span>

<span class="cm">/**</span>
<span class="cm"> * @param {PortletContext} data The properties to render the portlet template with.</span>
<span class="cm"> * @return {HTMLElement} The rendered portlet.</span>
<span class="cm"> */</span>
<span class="kd">function</span> <span class="nx">wrapPortlet</span><span class="p">(</span> <span class="nx">data</span> <span class="p">)</span> <span class="p">{</span>
  <span class="kr">const</span> <span class="nx">node</span> <span class="o">=</span> <span class="nb">document</span><span class="p">.</span><span class="nx">createElement</span><span class="p">(</span> <span class="s1">&#39;div&#39;</span> <span class="p">);</span>
  <span class="nx">node</span><span class="p">.</span><span class="nx">setAttribute</span><span class="p">(</span> <span class="s1">&#39;id&#39;</span><span class="p">,</span> <span class="s1">&#39;mw-panel&#39;</span> <span class="p">);</span>
  <span class="nx">node</span><span class="p">.</span><span class="nx">innerHTML</span> <span class="o">=</span> <span class="nx">mustache</span><span class="p">.</span><span class="nx">render</span><span class="p">(</span> <span class="nx">portalTemplate</span><span class="p">,</span> <span class="nx">data</span> <span class="p">);</span>
  <span class="k">return</span> <span class="nx">node</span><span class="p">;</span>
<span class="p">}</span></pre></div>

<p><em>CAPTION: If this code was undocumented or the types inaccurate, would you always get the data properties right? Maybe you would, but what about everyone else?</em></p>

<p>Most programmers are already typing their JavaScript to some extent with JSDocs, so often only refinements are needed. In other cases, TypeScript&#039;s excellent type inference abilities can be leveraged so that no changes are required.</p>

<p>Type definitions are a useful supplement to JSDocs. Definitions are non-functional documentation that support type annotations inline. For example, <a href="https://github.com/DefinitelyTyped/DefinitelyTyped/blob/77aa4ad/types/jquery/v2/index.d.ts" class="remarkup-link remarkup-link-ext" rel="noreferrer">the definition of the powerful but fantastically loose jQuery API</a> could find marvelous utility in many Wikimedia codebases for at-your-fingertips documentation needs. Another very relevant example that ships with TypeScript itself is <a href="https://github.com/Microsoft/TypeScript/blob/28319a5/src/lib/dom.generated.d.ts#L5234" class="remarkup-link remarkup-link-ext" rel="noreferrer">the DOM definition</a>, which will alert you to misalignments such as attempting to access a <tt class="remarkup-monospaced">classList</tt> on a <a href="https://developer.mozilla.org/en-US/docs/Web/API/Node" class="remarkup-link remarkup-link-ext" rel="noreferrer">Node</a> instead of an <a href="https://developer.mozilla.org/en-US/docs/Web/API/Element/classList" class="remarkup-link remarkup-link-ext" rel="noreferrer">Element</a>. Thorough type checking is similar and the perfect complement to <a href="https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Popups/+/76e02fa/.eslintrc.es5.json" class="remarkup-link remarkup-link-ext" rel="noreferrer">ESLint checks for ES5-only sources</a> or more broadly <a href="https://github.com/wikimedia/eslint-config-wikimedia" class="remarkup-link remarkup-link-ext" rel="noreferrer">ESLint&#039;s safety checks</a>.</p>

<p>Type definitions are also a convenient way to describe globals and, more generally, share types. Definitions are either shipped with the NPM package itself or <a href="https://definitelytyped.org/" class="remarkup-link remarkup-link-ext" rel="noreferrer">DefinitelyTyped</a> (e.g., <tt class="remarkup-monospaced">npm i -D @types/jquery</tt>) and are now standard practice for most noteworthy JavaScript libraries. Imagine if this degree of accuracy could be achieved in some of our most well-used codebases. Integrations between skins, extensions, Core, and peripheral libraries would be validated for alignment. It would be harder to break things and a much more welcoming experience for newcomers.</p>

<p><span class="remarkup-nav-sequence"><span class="phui-tag-view phui-tag-type-shade phui-tag-grey phui-tag-shade phui-tag-icon-view "><span class="phui-tag-core "><span class="visual-only phui-icon-view phui-font-fa fa-keyboard-o" data-meta="0_0" aria-hidden="true"></span>npm install jsdoc typescript</span></span><span class="remarkup-nav-sequence-arrow"> → </span><span class="phui-tag-view phui-tag-type-shade phui-tag-grey phui-tag-shade phui-tag-icon-view "><span class="phui-tag-core "><span class="visual-only phui-icon-view phui-font-fa fa-cog" data-meta="0_1" aria-hidden="true"></span>tsconfig.json</span></span><span class="remarkup-nav-sequence-arrow"> → </span><span class="phui-tag-view phui-tag-type-shade phui-tag-grey phui-tag-shade phui-tag-icon-view "><span class="phui-tag-core "><span class="visual-only phui-icon-view phui-font-fa fa-plus" data-meta="0_2" aria-hidden="true"></span>tsc</span></span><span class="remarkup-nav-sequence-arrow"> → </span><span class="phui-tag-view phui-tag-type-shade phui-tag-indigo phui-tag-shade phui-tag-icon-view "><span class="phui-tag-core "><span class="visual-only phui-icon-view phui-font-fa fa-file-text-o" data-meta="0_3" aria-hidden="true"></span><em>Document!</em></span></span></span></p>

<p>The actual project setup for adding documentation checks to an existing repository is minimal and requires no functional changes:</p>

<ol class="remarkup-list">
<li class="remarkup-list-item">Add JSDoc and TypeScript as NPM development dependencies. Optionally: add any missing types for third-party libraries used.</li>
<li class="remarkup-list-item">Add a tsconfig.json to tell TypeScript to lint JavaScript documentation you wish to validate.</li>
<li class="remarkup-list-item">Add <tt class="remarkup-monospaced">tsc</tt> to the project&#039;s NPM test script.</li>
</ol>
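<p>Step 2 can be as small as the following tsconfig.json (the <tt class="remarkup-monospaced">include</tt> path is illustrative; the compiler options are the ones that turn on JSDoc checking of plain JavaScript without emitting any output):</p>

```json
{
	"compilerOptions": {
		"allowJs": true,
		"checkJs": true,
		"noEmit": true,
		"strict": true
	},
	"include": [ "resources/**/*.js" ]
}
```

<p>With that in place, <tt class="remarkup-monospaced">tsc</tt> lints the documentation of every matched file and exits nonzero on a mismatch, which is exactly what a CI test script wants.</p>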

<p>The real work is in fleshing out the missing documentation with JSDocs. However, TypeScript is quite flexible about how one chooses to opt in or out of documentation validation. If code isn&#039;t worth documenting, it&#039;s probably not worth keeping, but typing can be consciously deferred in a number of ways. The most straightforward is probably with a <tt class="remarkup-monospaced">// @ts-ignore</tt> comment. Think of it as progressively enhanced code.</p>

<p>An <a href="https://gerrit.wikimedia.org/r/#/c/mediawiki/skins/Vector/+/575081/" class="remarkup-link remarkup-link-ext" rel="noreferrer">example project setup for Vector is here</a> which shows how typing and documentation can be retrofitted nicely even on codebases that predate TypeScript and make heavy use of <a href="https://api.jquery.com/" class="remarkup-link remarkup-link-ext" rel="noreferrer">sophisticated APIs like jQuery</a>.</p>

<h4 class="remarkup-header">Editor support</h4>

<p>It&#039;s unnecessary for setting up a project, but worth mentioning: ensuring that even a machine can <a href="https://www.typescriptlang.org/play/?strictNullChecks=true&amp;noUnusedLocals=true&amp;noUnusedParameters=true&amp;removeComments=true&amp;ssl=1&amp;ssc=1&amp;pln=24&amp;pc=1&amp;useJavaScript=true#code/PQKhFgCgAIWgVApgWwA4BsCGAXR1UBOA9qogdgJaIDO0AZkQdJvo9uotgHRSzQAC2AJ6kAJojrQA3gHkARgCtEAY2wBfaAAU2HbAGEiAO1wAPbLzj9CJadWwEKhgOYbUbTOgC0FUdACS4sYUdFRMDEwA7gSYqKQEPDCW1qi29o4u0AAW2Mhe2ERE6JQp8AVFFCnINNSYTogJfFbEKVJ2Ds4ayNROnlhyiF4+0ACCDiw+iEEhZPSM0H0D0KbcFgLJqe0ZANrZuZ4ArtRkWM44aXL7uNQAuiOiohSURh7QAKIcVcbM2OeXNA1JZobdKdbq9TD9dAjMbzCGLZYAtZA1ppDpZHJeNzkDyeZRGXDGVZNGwozYaHYYzyYOi4AieLHYDy3AAS8AAsgAZJZEaCOJSqZg0mbYTJ4BkeRHElptEHQCl7TIFADWngAbip8nTqbTPPlCnIiCZbgARRCEFQ4RC+TCGXyHK3QORCJai7n6w2sbHoAHAKBQUAQRJrTDRZDSbTkXQGYyIMwaUQ4FjwV3JMiUGjc6AESbiJgisU6ThLFAYS3QCKPTKS7PYfYEQzSVmc94oSbqBCu7O2sgOhm6H1QOj7QyqChGcvRVAR9icAAU0ATjOgAEppLxoBu8YY7NBDERxNAALwLojKfafbjKbOWlsX+cAcgeqvvK4A3OuN3vxFwjthhj8HAuXAHx8e8ABpoHvZAInpG0BhfZd3xgDdd83epHEMMgmy5Y9kEORllFFLgu1zedxXQJA0CwXAIMXFhEI-LNODrBsv0QJC1CgIA" class="remarkup-link remarkup-link-ext" rel="noreferrer">model your documentation</a> means that your code editor can understand it too. For most editors, this means you&#039;ll get accurate, split-second documentation lookup and documentation type checking. <a href="https://code.visualstudio.com/" class="remarkup-link remarkup-link-ext" rel="noreferrer">Visual Studio Code</a> has a superb out-of-box experience for JavaScript including documentation awareness and code completion, but <a href="https://github.com/Microsoft/TypeScript/wiki/TypeScript-Editor-Support" class="remarkup-link remarkup-link-ext" rel="noreferrer">other editors are supported too</a>.</p>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/nyiilq6qhhjk3dbkemut/PHID-FILE-jj4o66yvbdz4limrjdrg/visual-studio-code-error-tip.png" class="phabricator-remarkup-embed-image" data-sigil="lightboxable" data-meta="0_34"><img src="https://phab.wmfusercontent.org/file/data/nyiilq6qhhjk3dbkemut/PHID-FILE-jj4o66yvbdz4limrjdrg/visual-studio-code-error-tip.png" width="640" alt="Screenshot from Visual Studio Code integrated development environment showing the same red squigglies we saw before but this time its detecting a fatal error instead of a typo." /></a></div><em>CAPTION: Errors are identified as you write. There&#039;s that typo again but this time it could be your next unbreak now or your next type checker error.</em></p>

<p>You would see similar output from a continuous integration job or the command line:</p>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/xrnvxepmpytwemxjddy4/PHID-FILE-gh7q36grb4fia22ujjaz/executing-typescript-from-the-command-line.gif" class="phabricator-remarkup-embed-image-full" data-sigil="lightboxable" data-meta="0_35"><img src="https://phab.wmfusercontent.org/file/data/xrnvxepmpytwemxjddy4/PHID-FILE-gh7q36grb4fia22ujjaz/executing-typescript-from-the-command-line.gif" height="918" width="1868" loading="lazy" alt="Animation of the same flawed code presenting an error on the command line when checked." /></a></div><em>CAPTION: The command line output is just as informative.</em></p>

<p>And here are those excellent docs:</p>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/jtjewgqleoujrmvtzwbf/PHID-FILE-52swi4busqwaxlf272ya/visual-studio-code-documentation-tip.png" class="phabricator-remarkup-embed-image" data-sigil="lightboxable" data-meta="0_36"><img src="https://phab.wmfusercontent.org/file/data/jtjewgqleoujrmvtzwbf/PHID-FILE-52swi4busqwaxlf272ya/visual-studio-code-documentation-tip.png" width="640" alt="Screenshot of the same code editor presenting documentation in a highly integrated and useful manner." /></a></div><em>CAPTION: Documentation is a mouse hover away. Coding with documentation at hand is a breeze and the expectation for many modern developers writing their first MediaWiki patches.</em></p>

<h3 class="remarkup-header">Conclusion</h3>

<p>65 miles per hour is 104.60736 kilometers per hour. Language changes the way we think, and documentation is <a href="https://www.wikipedia.org/" class="remarkup-link remarkup-link-ext" rel="noreferrer">the encyclopedia</a> of code. Tooling that improves our abilities to understand, reason, and express ourselves through language improves our ability to engineer.</p>

<p>In my own personal and professional development, I&#039;ve found accurate documentation to be a great treasure that gives me confidence and efficiency in the code that I read and write. Maybe we should have the same hopes and expectations of our documentation that newcomers do. Maybe with better documentation—documentation that is as accurate as we can automate—some of Wikipedia&#039;s many JavaScript errors could be identified and eliminated as easily as changing units from mph to kph. <em>Maybe</em> with better documentation, we could write better software, faster. Software that users love using and developers enjoy writing. Let&#039;s get to work!</p>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/6sjbce24zdzkvw6733ww/PHID-FILE-z67et67bnavz5toymdzp/puzzle.jpg" class="phabricator-remarkup-embed-image" data-sigil="lightboxable" data-meta="0_37"><img src="https://phab.wmfusercontent.org/file/data/6sjbce24zdzkvw6733ww/PHID-FILE-z67et67bnavz5toymdzp/puzzle.jpg" width="640" alt="Photograph of a blank jigsaw puzzle where pieces differ only by shape." /></a></div><em>CAPTION: Programs are like jigsaw puzzles where types are the shapes. Check assembly before shipping. Image by <a href="https://commons.wikimedia.org/wiki/File:Puzzle_Krypt-2.jpg" class="remarkup-link remarkup-link-ext" rel="noreferrer">Muns and Schlurcher</a> / <a href="https://creativecommons.org/licenses/by-sa/2.0" class="remarkup-link remarkup-link-ext" rel="noreferrer">CC BY-SA</a>.</em></p>

<p><em>Thanks to <a href="https://en.wikipedia.org/wiki/User:Phuedx" class="remarkup-link remarkup-link-ext" rel="noreferrer">Sam Smith</a>, <a href="https://www.mediawiki.org/wiki/User:JHernandez_(WMF)" class="remarkup-link remarkup-link-ext" rel="noreferrer">Joaquín Oltra Hernández</a>, and <a href="https://meta.wikimedia.org/wiki/User:Leszek_Manicki_(WMDE)" class="remarkup-link remarkup-link-ext" rel="noreferrer">Leszek Manicki</a> for reviewing and providing feedback.</em></p>

<div class="remarkup-note"><span class="remarkup-note-word">NOTE:</span> <em>Documentation on building better documentation is being written <a href="https://www.mediawiki.org/wiki/TypeScript" class="remarkup-link remarkup-link-ext" rel="noreferrer">on wiki</a> with the help of editors like you!</em></div></div></content></entry><entry><title>Why does building a skin require PHP knowledge?</title><link href="/phame/live/9/post/188/why_does_building_a_skin_require_php_knowledge/" /><id>https://phabricator.wikimedia.org/phame/post/view/188/</id><author><name>Jdlrobson (Jon Robson)</name></author><published>2020-02-03T06:37:29+00:00</published><updated>2020-06-10T23:34:21+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p>One of my longstanding pet peeves is that skin development for MediaWiki is so hard. I propose a radical change to how skins are installed and ask for feedback.</p>

<p>Having watched teenagers use Myspace.com and then tumblr.com, and having watched Wikimedians build all sorts of things using wikitext templating, it&#039;s clear that skinning anything should be possible with a mixture of basic knowledge of web technology (HTML, CSS, maybe JSON) and/or <a href="https://en.wikipedia.org/wiki/Cargo_cult_programming" class="remarkup-link remarkup-link-ext" rel="noreferrer">cargo cult programming</a>. The MediaWiki skin ecosystem is pretty sparse, and when skins are created they tend not to be published for wider consumption, or are lost in GitHub repos that are never linked to. Some never even get built. After almost 10 years in this movement, it&#039;s easy to see why.</p>

<p>At a recent offsite I got all my team to stand up in a room and asked them to sit down if they didn&#039;t feel comfortable with HTML. A few sat down, and I told them that unfortunately they couldn&#039;t build a skin. When I asked who didn&#039;t feel comfortable editing CSS, a few more sat down and I told them the same thing. Eventually everyone sat down. What was interesting was who sat down and when. The designers sat down at the mention of PHP (while comfortable with CSS and JS), as did many frontend engineers. Meanwhile, backend engineers sat down at the mention of CSS.</p>

<p>Our skin code is pretty complicated. We currently encourage skin development by guiding users to the <a href="https://www.mediawiki.org/wiki/Skin:Example" class="remarkup-link remarkup-link-ext" rel="noreferrer">ExampleSkin</a>. This skin is pretty scary to many developers not already in our ecosystem, and to many designers who are in it. An extreme amount of PHP, plus knowledge of folder structure and MediaWiki concepts such as ResourceLoader, is needed before someone can even start.</p>

<p>Currently, to create a skin you must at minimum:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">Download and setup MediaWiki</li>
<li class="remarkup-list-item">Learn git and clone the ExampleSkin repo</li>
<li class="remarkup-list-item">Understand ResourceLoader</li>
<li class="remarkup-list-item">Understand our i18n system</li>
<li class="remarkup-list-item">Understand how skin.json works</li>
<li class="remarkup-list-item">Edit PHP to generate HTML</li>
<li class="remarkup-list-item">Edit CSS</li>
</ul>

<p>To encourage a healthy skin system we need to lower many of the barriers to implementing a skin. It should be as simple as:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">Clone a repo</li>
<li class="remarkup-list-item">Edit some CSS and HTML</li>
<li class="remarkup-list-item">Run some npm commands</li>
</ul>

<p>During the implementation of <a href="/tag/mobilefrontend/" class="phui-tag-view phui-tag-type-shade phui-tag-blue phui-tag-shade phui-tag-icon-view " data-sigil="hovercard" data-meta="0_39"><span class="phui-tag-core "><span class="visual-only phui-icon-view phui-font-fa fa-briefcase" data-meta="0_38" aria-hidden="true"></span>MobileFrontend</span></a> and <a href="/tag/minervaneue/" class="phui-tag-view phui-tag-type-shade phui-tag-blue phui-tag-shade phui-tag-icon-view " data-sigil="hovercard" data-meta="0_41"><span class="phui-tag-core "><span class="visual-only phui-icon-view phui-font-fa fa-briefcase" data-meta="0_40" aria-hidden="true"></span>MinervaNeue</span></a>, many changes were made to the skin system to help build the new skin whilst maintaining the old one. It also intentionally made some breaking changes from traditional skins - for example, languages and categories were no longer part of the HTML of an article, and JavaScript and CSS shipped by all skins were turned off in preference for its own versions. In my opinion this was the skin&#039;s right: a good skinning system allows you to create radically different skins and innovate. If our skin system were healthy we&#039;d likely have skins of all sorts of shapes and sizes. A good skin system also makes maintenance easier. Right now, because of class inheritance, it&#039;s very difficult to make changes to our existing skins or our core skin PHP without worrying about breaking something elsewhere. Similar changes and challenges happened with <a href="/tag/timeless/" class="phui-tag-view phui-tag-type-shade phui-tag-blue phui-tag-shade phui-tag-icon-view " data-sigil="hovercard" data-meta="0_43"><span class="phui-tag-core "><span class="visual-only phui-icon-view phui-font-fa fa-briefcase" data-meta="0_42" aria-hidden="true"></span>Timeless</span></a>, as I&#039;m sure Isarra can attest!</p>

<h2 class="remarkup-header">Exploring different approaches</h2>

<p>I&#039;ve been lamenting this situation for some time. A while back I created an extension called SimpleSkins that reduced the <a href="https://github.com/jdlrobson/SimpleSkins/tree/master/skins/SimpleSkinMinervaNeue" class="remarkup-link remarkup-link-ext" rel="noreferrer">Minerva skin to 2 files</a> with some ambitious and breaking changes to the PHP skin code that I dreamed of one day upstreaming.</p>

<p>At a recent hackathon, with this idea still in mind, I took a slightly different approach. Instead of trying to make a skin a folder of several files, I thought: what if the skin folder was the output of a build step? Similar to the SimpleSkin approach, I again focused on a folder of frontend-friendly technologies and <a href="https://github.com/jdlrobson/mediawiki-dali-skin-builder/tree/master/src/vectorlike" class="remarkup-link remarkup-link-ext" rel="noreferrer">reduced Vector to 3 files - Mustache (template), JS (with require support), and LESS (CSS)</a>; the generation of skin.json and PHP <a href="https://github.com/jdlrobson/mediawiki-dali-skin-builder/blob/master/dev-scripts/build.js#L58" class="remarkup-link remarkup-link-ext" rel="noreferrer">was left to a build script</a>. Remarkably this worked and was relatively straightforward. One interesting insight I had this time, however, was that no skin developer should require a MediaWiki install to build a skin - with templates, a lot of data can be stubbed or come from an API endpoint. Not having to set up a MediaWiki install is a big deal!</p>

<p>With a good architecture, a lot of our skin system can be abstracted away. A skin without qqq message documentation is still useful provided en.json has been generated. ResourceLoader module config is much easier to auto-generate now that we have packageFiles, provided we enforce a common entry point, e.g. index.js and index.css/less. The PHP skin class should and can do more. Instead of having skins that extend SkinTemplate, e.g. SkinVector, we should have a skin rendering class that renders folders containing skin metadata.</p>

<h2 class="remarkup-header">How we do it</h2>

<p>Setting aside existing technology choices and working with what we&#039;ve got, I&#039;d propose automating most of the skin registration process - to the point that PHP is irrelevant and JS and JSON files are optional.</p>

<p>I strongly believe the following is possible:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">An online skin builder that saves and exports skin folders to a download folder or GitHub, similar to JSFiddle</li>
<li class="remarkup-list-item">A valid skin is a folder with at minimum 2 files - index.mustache and index.(css/less)</li>
<li class="remarkup-list-item">You should be able to copy and paste an existing skin and get a new skin without any modification except the folder name.</li>
</ul>

<p>To achieve such a goal we would need a SkinRenderer class that locates a skin directory and renders the template inside it (Mustache is currently the template language we support in core). When passed the skin key <tt class="remarkup-monospaced">skinnew</tt>, for example, SkinRenderer would find the folder <tt class="remarkup-monospaced">skinnew</tt> in the skins folder and the files index.less, index.js and skin.mustache. It would pass skin.mustache data (which is subject to deprecation policy and well documented) and register a ResourceLoader module using index.less and index.js as packageFiles. qqq.json and en.json, if needed, could live in the i18n folder as they currently do, but their absence would not cause any problems.</p>

<p>A developer would fork a new version of the ExampleSkin, which provides the basic file components, then run <tt class="remarkup-monospaced">npm install</tt> and <tt class="remarkup-monospaced">npm start</tt>. This would pull our core frontend technologies - Mustache and LESS - from npm and then pass the skin through a tool such as Parcel that allows live updating, <a href="https://youtu.be/OyBzMVNdg2I" class="remarkup-link remarkup-link-ext" rel="noreferrer">the workflow of which is demonstrated in this video</a>. However, unlike in my hack experiment, installing the skin would be as simple as copying that folder into MediaWiki&#039;s skins folder rather than running a build step :)</p>

<h2 class="remarkup-header">What do I do next?</h2>

<p>Am I alone in thinking it should be possible to build skins without PHP? Do you have different opinions on what the skin system should be? Do you have concerns about how such a system would scale or whether it would get adoption? What do you think of <a href="https://github.com/jdlrobson/mediawiki-dali-skin-builder" class="remarkup-link remarkup-link-ext" rel="noreferrer">my skin builder tool</a>, and should I work on it more? If so, I&#039;d love to hear from you. Any feedback you can provide would help me decide whether I should prepare and push an RFC.</p>

<p>Thank you for your time!</p></div></content></entry><entry><title>Migrating code from MediaWiki&#039;s ResourceLoader to Webpack</title><link href="/phame/live/9/post/146/migrating_code_from_mediawiki_s_resourceloader_to_webpack/" /><id>https://phabricator.wikimedia.org/phame/post/view/146/</id><author><name>Jdlrobson (Jon Robson)</name></author><published>2019-03-13T15:15:19+00:00</published><updated>2019-03-19T10:44:49+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p>The lack of tooling or support for tooling has been causing problems in complicated code bases like the codebase for our mobile site, so we carved out a proposal to create a bridge from our existing codebase to a more modern one using Webpack. I&#039;ll talk about what we did and why.</p>

<p>The majority of Wikipedia&#039;s front-end assets are served by a system called <a href="https://www.mediawiki.org/wiki/ResourceLoader" class="remarkup-link remarkup-link-ext" rel="noreferrer">ResourceLoader</a>, which has been part of the MediaWiki software since 2010. Of particular interest to us is how it packs JavaScript assets by concatenating and compressing them. This capability predates <a href="https://en.wikipedia.org/wiki/Webpack" class="remarkup-link remarkup-link-ext" rel="noreferrer">Webpack</a> (2012) and similar tools, so this should not be considered a case of <a href="https://en.wikipedia.org/wiki/Not_invented_here" class="remarkup-link remarkup-link-ext" rel="noreferrer">Not invented here</a>.</p>

<p>ResourceLoader has allowed developers to build front-end code without any additional build tooling. Back in 2010, this was actually the norm. If you look at similar projects that have been around for as long as ours in the open source ecosystem, you&#039;ll find <a href="https://github.com/internetarchive/openlibrary/blob/master/Makefile" class="remarkup-link remarkup-link-ext" rel="noreferrer">Makefiles concatenating JavaScript files</a> - artifacts of this era.</p>

<p>As the JavaScript ecosystem has flourished, it&#039;s become almost impossible to build JavaScript without some reliance on tools. Needless to say, most front-end developers are now accustomed to working <em>with</em> tooling. Being unable to use tooling to build our JavaScript has arguably handicapped us a little, especially as we turn our attention to more complicated, ambitious projects such as the page previews feature.</p>

<p>Unlike many JavaScript module systems, the ResourceLoader system we use focuses on the collection and delivery of various loosely coupled assets which it discovers from a manifest rather than a file. Currently, when writing JavaScript we need to define a module inside a special file called <a href="https://github.com/wikimedia/mediawiki-extensions-MobileFrontend/blob/master/extension.json#L387" class="remarkup-link remarkup-link-ext" rel="noreferrer">extension.json</a>, which is then interpreted in PHP and sent to the user. A module definition looks like this:</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">&quot;mobile.toc&quot;: {
        // define which environments to run this module in
        &quot;targets&quot;: [
                &quot;mobile&quot;,
                &quot;desktop&quot;
        ],
        // manifest of dependencies (think require/import)
        &quot;dependencies&quot;: [
                &quot;mobile.startup&quot;,
                &quot;mobile.toc.images&quot;
        ],
        // script files in the order they need to be concatenated
        &quot;scripts&quot;: [
                &quot;resources/mobile.toc/TableOfContents.js&quot;
        ],
        // styles that should be loaded via JavaScript
        &quot;styles&quot;: [
                &quot;resources/mobile.toc/toc.less&quot;
        ],
        // manifest of templates to load to support JavaScript
        &quot;templates&quot;: {
                &quot;toc.hogan&quot;: &quot;resources/mobile.toc/toc.hogan&quot;,
                &quot;heading.hogan&quot;: &quot;resources/mobile.toc/tocHeading.hogan&quot;
        },
        // message keys needed to support internationalization
        &quot;messages&quot;: [
                &quot;toc&quot;
        ]
},</pre></div>

<p>Without a Node.js module system, our developers have had to work with a client-side library similar to <a href="https://requirejs.org/" class="remarkup-link remarkup-link-ext" rel="noreferrer">Require.js</a> and manually edit what&#039;s essentially a manifest to discover JavaScript, whilst keeping a mental picture of how it all fits together and how the code is split. We had to manage the JavaScript dependency trees ourselves. As my work colleague Stephen put it so nicely:</p>

<blockquote><p>we essentially &quot;had to fill out paperwork to create a file&quot; and now &quot;adding a new file is as easy as right click, new file&quot;.</p></blockquote>

<p>Using Webpack to handle our JavaScript rather than our own in-house ResourceLoader has achieved several things for us:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">Webpack manages complicated dependency trees for us which avoids loading errors (e.g. code loading in the wrong order)</li>
<li class="remarkup-list-item">gives us more control over public and private interfaces to community gadgets</li>
<li class="remarkup-list-item">allows us to expose interfaces for testing</li>
<li class="remarkup-list-item">encourages separation of logic into reusable modules (files) without any mental strain</li>
<li class="remarkup-list-item">allows delegation of problems such as code splitting to tooling rather than the human mind</li>
</ul>

<h2 class="remarkup-header">Making code easier to work with</h2>

<p>Previously, adding or even renaming any source file in our repository required not only creating the file but registering it by listing it in an array in a JSON file (see developer notes at the end of this article for more). We had <strong>86 JavaScript files</strong>, which we reduced to 19 files built via Webpack from 101 source files. The increase in source files reflects the team&#039;s ability to embrace Webpack and separate code responsibilities more effortlessly.</p>

<p>Now that we make use of Webpack, to use a file we only have to load it via a require statement. This might seem like basic stuff, but it has made a big difference.</p>
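<p>To illustrate (this module is invented for the example, not actual MobileFrontend code), a file now declares its own dependencies and exports, and nothing else:</p>

```javascript
// TableOfContents.js: dependencies become ordinary requires instead of
// entries in extension.json, e.g.:
// const mobile = require( './mobile.startup' );

function TableOfContents( sections ) {
	this.sections = sections || [];
}

TableOfContents.prototype.render = function () {
	return this.sections.map( function ( section ) {
		return section.title;
	} ).join( '\n' );
};

module.exports = TableOfContents;
```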

<p>Similarly, we&#039;ve not had to worry about polluting the global namespace. Previously all our files were wrapped in an IIFE, but now we&#039;ve been able to remove that wrapper and the indenting associated with it.</p>

<h2 class="remarkup-header">A familiar stack</h2>

<p>While we didn&#039;t measure it, and it could just be because we hire awesome people, we&#039;ve all noticed that our new hire got up and running with our code much quicker than previous hires.</p>

<p>We&#039;re using tooling that is well supported, and we&#039;ve adopted new tools to help us write better code. We&#039;re making use of bundlesize to track the size of our JavaScript assets in the repository itself (and to prevent unexpected increases with our continuous integration stack). We&#039;re also making use of nyc to track code coverage (see below).</p>

<h2 class="remarkup-header">Faster unit tests</h2>

<p>Previously, our unit tests couldn&#039;t be run cheaply on the command line in Node.js. Any npm script wanting to run them would have to boot up a browser, e.g. PhantomJS, and have a working MediaWiki instance. This was slow, manual, and error-prone, as tests from other unrelated projects could break our own. It was more pseudo-integration testing than unit testing. Now, with a few minor changes (mock libraries that emulate &quot;MediaWiki&quot;), we can run these tests from the command line in headless mode using the qunit node library. We use Webpack to build a file that can be run in the old method, to avoid confusing developers in other teams who are used to running tests this way. During our refactor, 44 QUnit tests were slowly migrated to run in Node.js.</p>

<p>This is obviously much faster as it doesn&#039;t require a MediaWiki install and doesn&#039;t require booting up a browser. As a result, there is more motivation for engineers to write tests in the first place; in fact, we&#039;ve already added 15 new test files.</p>

<h2 class="remarkup-header">Code coverage</h2>

<p>Previously, our bespoke tooling made measuring test coverage very tricky and manual. As a result, we didn&#039;t measure it. Now that we are using Webpack and headless Node.js tests, we are able to work this out. After porting a file to the new module loading system, we made sure to document the code coverage of the file. It has shown us that roughly 50% of the JavaScript code we run is covered by tests. More alarmingly, 45 of our 81 files had 0% coverage. In particular, it became clear that we had been avoiding writing tests where doing so meant exposing private interfaces on a global JavaScript variable.</p>

<p>As we refactor with our remaining project time, we aim to get closer to 100% test coverage. Our tooling now makes it easier to write tests, and we are able to enforce coverage in the repository via the nyc library, meaning that code coverage can only get better.</p>

<h2 class="remarkup-header">More future-proof code</h2>

<p>While we&#039;ve been forced to look at the code, we&#039;ve been noticing ways to improve it and prepare it for a modern future. We&#039;ve been replacing calls to jQuery with calls to a wrapper for jQuery, meaning it&#039;s becoming clearer about what we use jQuery for, and when and how we might not need it. By keeping the definition of our project around problems rather than solutions, it&#039;s been easy to justify and prioritize this work.</p>

<p>Similarly, we&#039;ve been migrating to use ES5 functions where possible instead of jQuery.</p>

<p>While we&#039;re not removing jQuery from our stack just yet, <a href="https://medium.com/r/?url=https%3A%2F%2Fphabricator.wikimedia.org%2FT200868" class="remarkup-link remarkup-link-ext" rel="noreferrer">we&#039;ve found inspiration</a> in other efforts to do this such as <a href="https://githubengineering.com/removing-jquery-from-github-frontend/" class="remarkup-link remarkup-link-ext" rel="noreferrer">Github</a> to at least make this a real possibility.</p>

<h2 class="remarkup-header">Versioning</h2>

<p>Our mobile front-end relies heavily on the Hogan template library. Previously, any vendor JavaScript had to be copied and pasted into the repository itself. When we started, we didn&#039;t actually know what version of Hogan we were using and had to diff it against several production versions! Now that we use Webpack, we can pull Hogan directly from the npm registry, so we know exactly what we are shipping and can upgrade easily if necessary. We are exploring leaning more on our tooling, with ideas to use transpiling and include template source code in JavaScript files.</p>

<h2 class="remarkup-header">Webpack didn&#039;t solve everything</h2>

<p>While Webpack has helped us organize our JavaScript files better, it doesn&#039;t seem to solve all our problems (yet). For example. since Wikipedia supports over 200 languages, we haven&#039;t found a way Webpack can ship message strings in the scaleable way that ResourceLoader does. I&#039;m excited about the prospect of identifying these problems and filling in those blanks where necessary, but right now we have a nice balance of the best of ResourceLoader and Webpack in our codebase.</p></div></content></entry><entry><title> Minimal MediaWiki for frontend engineers</title><link href="/phame/live/9/post/144/minimal_mediawiki_for_frontend_engineers/" /><id>https://phabricator.wikimedia.org/phame/post/view/144/</id><author><name>Jdlrobson (Jon Robson)</name></author><published>2019-02-21T19:07:02+00:00</published><updated>2019-02-26T21:28:17+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p>I use OSX. Vagrant has not been kind to me, but I&#039;m hopeful that Docker will make development a lot easier for me in future.<br />
Until then, I use <a href="https://www.mamp.info/en/" class="remarkup-link remarkup-link-ext" rel="noreferrer">MAMP</a>, which provides a pretty easy LAMP setup. I wanted to share this minimal setup with other frontend engineers as it works well for me - it&#039;s fast, it minimises the extensions I need to update and, most importantly, it brings me closer to the front-end problems end users are experiencing.</p>

<h2 class="remarkup-header">MAMP replicating Wikimedia paths</h2>

<p>In MAMP I have the web server set to Apache and use a symlink to point the wiki folder to a git clone of <a href="https://github.com/wikimedia/mediawiki" class="remarkup-link remarkup-link-ext" rel="noreferrer">mediawiki/core</a>.</p>

<p>I set up the wiki via the web installer - which, if you&#039;ve never tried it yourself, I urge you to give a go! It&#039;s not that complicated really, and that&#039;s quite impressive!</p>

<p>I have the following defined in LocalSettings.php to match production wikis.</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code"># This is important, as otherwise things like page previews will not work correctly when using the proxying tips below
# With this setup articles can be viewed on the path `/wiki/Main_Page`
$wgArticlePath = &quot;/wiki/$1&quot;;</pre></div>



<h2 class="remarkup-header">Extensions and proxying content</h2>

<p>I am lucky to work with extensions that require minimal setup - most are simply git clone, configure, and play, e.g.</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">wfLoadExtension( &#039;Popups&#039; );
wfLoadExtension( &#039;MobileFrontend&#039; );
wfLoadSkin( &#039;MinervaNeue&#039; );</pre></div>

<p>I have no working instance of Wikidata or VisualEditor - these are black boxes to me. As a general rule I only install what I need.  When I do need to test integrations with them, I seek configurations that can point to production.</p>

<p>For instance, the following config allows me to load production content into VisualEditor without setting up a RESTBase server:</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">wfLoadExtension( &#039;VisualEditor&#039; );
// Enable by default for everybody
$wgDefaultUserOptions[&#039;visualeditor-enable&#039;] = 1;
$wgVirtualRestConfig[&#039;modules&#039;][&#039;parsoid&#039;] = array(
    // URL to the Parsoid instance
    // Use port 8142 if you use the Debian package
    &#039;url&#039; =&gt; &#039;https://en.wikipedia.org&#039;,
    // Parsoid &quot;domain&quot;, see below (optional)
    &#039;domain&#039; =&gt; &#039;en.wikipedia.org&#039;,
);
$wgVisualEditorFullRestbaseURL = &#039;https://en.wikipedia.org/api/rest_&#039;;</pre></div>

<p>Note that this will trigger CORS problems on your localhost, but MediaWiki&#039;s API supports CORS for localhost for read-only requests provided you pass the query string parameter &quot;origin=*&quot;.</p>

<p>The following code, when placed in the right place (I usually throw this line into core, or above the API request I want to proxy), gives me content to feed from production into VisualEditor:</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">$.ajaxSetup({ data: {  origin: &#039;*&#039;  }  } );</pre></div>
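<p>The same read-only trick works without jQuery too; this hypothetical snippet just builds the request URL by hand to show where the parameter goes:</p>

```javascript
// Build a read-only API request against production; origin=* opts in
// to anonymous CORS, which MediaWiki allows for read requests only.
const params = new URLSearchParams( {
	action: 'query',
	prop: 'extracts',
	titles: 'Singapore',
	format: 'json',
	origin: '*'
} );
const url = 'https://en.wikipedia.org/w/api.php?' + params.toString();

// In a browser (or Node 18+), you would then fetch( url ) and
// parse the JSON response.
console.log( url );
```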

<p>Whenever my team needs to work with any kind of service or API, I&#039;ve always found it much more useful to proxy content from production, as otherwise I miss bugs and find replication difficult. Using Special:Import is slow and broken. In particular, if you import pages linked to Wikidata, you also need to clone the Wikidata page for that article to be replicated locally.</p>

<p>User generated content is particularly important when working with content on mobile. We added support to MobileFrontend to proxy content and it can easily be enabled with the following config in LocalSettings.php:</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">$wgMFContentProviderClass = &#039;MobileFrontend\ContentProviders\MwApiContentProvider&#039;;
$wgMFMwApiContentProviderBaseUri = &quot;https://en.wikipedia.org/w/api.php&quot;;</pre></div>

<p>With these two changes, any pages I view in mobile will be proxied from production. This is currently available in the beta cluster - check out <a href="https://en.m.wikipedia.beta.wmflabs.org/wiki/Singapore" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://en.m.wikipedia.beta.wmflabs.org/wiki/Singapore</a> for example. The pages 404 to avoid indexing, but will be live copies of the production equivalent.</p>

<h3 class="remarkup-header">Proxying content for desktop too!</h3>

<p>In Popups (the <a href="https://medium.com/freely-sharing-the-sum-of-all-knowledge/why-it-took-a-long-time-to-build-that-tiny-link-preview-on-wikipedia-d5bd734df8fe" class="remarkup-link remarkup-link-ext" rel="noreferrer">page previews feature</a>), this is pretty easy too. To avoid installing the Popups dependencies PageImages and TextExtracts, I use two lines of config:</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">$wgPopupsGateway = &quot;restbaseHTML&quot;;
$wgPopupsRestGatewayEndpoint = &#039;https://en.m.wikipedia.org/api/rest_v1/page/summary/&#039;;</pre></div>

<p>This configures my localhost to use the production REST endpoint to source summaries. If I create a page &quot;Popups test page&quot; with a link to &quot;Dog&quot;, and a local &quot;Dog&quot; page with two lines of text, previewing the Dog link on &quot;Popups test page&quot; will show me the production preview for the Dog article. However, this doesn&#039;t fully work on its own, as page previews needs links to those pages to exist. Since page previews also runs on desktop, I can reuse the same proxying I use for mobile:</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">// Enable MobileFrontend&#039;s &quot;content provider&quot; for desktop too!
$wgMFAlwaysUseContentProvider = true;</pre></div>

<p>Sometimes I need to make articles, so I provide an override to allow me to edit and view local pages that don&#039;t live on my production wiki:</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">// This will ensure that any local pages are served instead of production copies where available.
$wgMFContentProviderTryLocalContentFirst = true;</pre></div>



<h3 class="remarkup-header">Proxying read-only APIs in mobile</h3>

<p>Sometimes I want to proxy JavaScript as well, which MobileFrontend also allows me to do. If I&#039;m testing read-only workflows (i.e. no POSTs or authentication), I can proxy APIs. This is useful for testing pages like <a href="https://en.m.wikipedia.org/wiki/Special:Nearby" class="remarkup-link remarkup-link-ext" rel="noreferrer">Special:Nearby</a> without having to create lots of articles with coordinates near your current location.</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">// Redirect API requests to English Wikipedia
$wgMFContentProviderScriptPath = &quot;https://en.wikipedia.org/w&quot;;</pre></div>
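<p>For instance, Special:Nearby is powered by the geosearch list module from the GeoData extension. With the script path above pointed at production, a request like the following returns real articles with coordinates (a sketch; the helper name and coordinates are illustrative):</p>

```javascript
// Sketch: the kind of read-only geosearch query Special:Nearby issues,
// here pointed at the production script path. nearbyUrl is a
// hypothetical helper; the coordinates are illustrative.
function nearbyUrl( scriptPath, lat, lon ) {
    const params = new URLSearchParams( {
        action: 'query',
        list: 'geosearch',  // provided by the GeoData extension
        gscoord: lat + '|' + lon,
        gsradius: '10000',  // metres
        gslimit: '10',
        format: 'json',
        origin: '*'         // anonymous CORS for read-only requests
    } );
    return scriptPath + '/api.php?' + params.toString();
}

const geoUrl = nearbyUrl( 'https://en.wikipedia.org/w', 51.5, -0.12 );
```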



<h2 class="remarkup-header">Cache!</h2>

<p>I use memcached for caching. Without it, your wiki might be a little on the slow side:</p>

<div class="remarkup-code-block" data-code-lang="text" data-sigil="remarkup-code-block"><pre class="remarkup-code">$wgMainCacheType = CACHE_MEMCACHED;
$wgMemCachedServers = array( &#039;127.0.0.1:11211&#039; );
$wgParserCacheType = CACHE_MEMCACHED; # optional
$wgMessageCacheType = CACHE_MEMCACHED; # optional</pre></div>



<h2 class="remarkup-header">Conclusions</h2>

<p>That&#039;s it: that&#039;s my setup. The extensions I work on tend to mirror production config as closely as they possibly can, so they shouldn&#039;t require any additional setup. If you haven&#039;t already, I urge you to try setting up a minimal MediaWiki and see how little you can get away with while staying productive.</p>

<p>Let me know in the comments if you run into any problems, or if I&#039;ve converted you into a more effective engineer :-).</p>

<p>These changes and workflows have empowered the team to evolve and work effectively on the project, and we think they could benefit other projects that are heavy on client side UIs. Despite a few rough corners, and conversations to be had to streamline this kind of tooling into our ecosystem, we believe it is a win.</p>

<p>We are very happy to discuss, to evolve the current setup, and to help other teams or projects adopt the same kind of tooling. We would love to create opportunities for shared, unified tools around these workflows for the ecosystem.</p>

<p>Please reach out in the comments, in phab tasks, or by email to <tt class="remarkup-monospaced">reading-web-team@wikimedia.org</tt>.</p>

<hr class="remarkup-hr" />

<p>If you want to read more about page previews in general, here are some of the recent blog posts:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item"><a href="https://medium.com/freely-sharing-the-sum-of-all-knowledge/how-we-designed-page-previews-for-wikipedia-and-what-could-be-done-with-them-in-the-future-7a5fa6b07b96" class="remarkup-link remarkup-link-ext" rel="noreferrer">How we designed page previews for Wikipedia — and what could be done with them in the future</a></li>
<li class="remarkup-list-item"><a href="https://medium.com/freely-sharing-the-sum-of-all-knowledge/why-it-took-a-long-time-to-build-that-tiny-link-preview-on-wikipedia-d5bd734df8fe" class="remarkup-link remarkup-link-ext" rel="noreferrer">Why it took a long time to build that tiny link preview on Wikipedia</a></li>
</ul></div></content></entry><entry><title>Fast and isolated JS unit tests</title><link href="/phame/live/9/post/96/fast_and_isolated_js_unit_tests/" /><id>https://phabricator.wikimedia.org/phame/post/view/96/</id><author><name>Jhernandez (Joaquin Oltra Hernandez)</name></author><published>2018-04-19T18:11:52+00:00</published><updated>2018-04-19T18:11:52+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p><a href="https://phabricator.wikimedia.org/phame/post/view/93/extension_popups_page_previews_front-end_tooling/" class="remarkup-link" rel="noreferrer">Table of contents</a> - Versions: <a href="https://chimeces.com/popups-tech-posts-series/01-tooling/03-fast-and-isolated-JS-unit-tests.html" class="remarkup-link remarkup-link-ext" rel="noreferrer">HTML</a>, <a href="https://github.com/joakin/popups-tech-posts-series/blob/master/docs/01-tooling/03-fast-and-isolated-JS-unit-tests.md" class="remarkup-link remarkup-link-ext" rel="noreferrer">Markdown</a></p>

<h3 class="remarkup-header">Intro</h3>

<p>For testing frontend code, MediaWiki provides a browser-based QUnit setup. To run the tests, you have to spin up MediaWiki, usually through vagrant, and load Special:JavaScriptTest in your browser, which runs all the QUnit tests for all the extensions and MediaWiki itself. From there, you can filter by module or string, hide passed tests, and do a few other things, like choosing to load the assets in production mode or development mode (<tt class="remarkup-monospaced">debug=true</tt>).</p>

<p>Like any testing strategy, this setup comes with tradeoffs. Specifically, there are a couple of big problems that we have had when working on the frontend code, which we set out to address when working in <tt class="remarkup-monospaced">Extension:Popups</tt>:</p>

<ol class="remarkup-list">
<li class="remarkup-list-item">Tests take a long time to run</li>
<li class="remarkup-list-item">It is very hard to write isolated unit tests</li>
</ol>

<h4 class="remarkup-header">Tests usually take a long time to run</h4>

<p>With this setup, tests have to go through the MediaWiki server and ResourceLoader, and then run in the browser. This is especially noticeable in development mode, which we often enable to get readable stack traces and error messages on test failures, but which makes the test run take a lot longer.</p>

<p>There are also big startup costs, for powering up vagrant and the whole system.</p>

<p>As a result, writing tests is costlier than it should be, which discourages developers from writing and running them, and over time ends up affecting the quality of our code and QUnit test suites. It also puts up significant barriers to TDD-style workflows, which rely on constantly running tests and require a fast feedback cycle for the developer.</p>

<ul class="remarkup-list">
<li class="remarkup-list-item"><a href="https://i.imgur.com/Dbu2XTM.gif" class="remarkup-link remarkup-link-ext" rel="noreferrer">Example test run in local vagrant on a laptop</a> (Warning: slow and long gif)</li>
</ul>

<h4 class="remarkup-header">It is very hard to write isolated unit tests</h4>

<p>In this environment, the real MediaWiki, global state, and the browser environment are all available. As a result, tests are often written as integration tests, relying on implicit behavior and on modules and state being available to perform the testing.</p>

<p>This is not a problem in itself: integration tests are important, have valid use cases, and deserve a place in the testing process. But this environment makes it extremely complicated to write isolated unit tests, because of all the global state, variables, and already-loaded code.</p>

<p>As a result, the tests we write end up coupled to implicit behavior and state, which makes them very brittle or overly verbose because of the extensive mocking, and most of them are big integration tests, which makes them slower to run. All of this compounds the previous point, making the setup even slower for writing tests, with the same outcomes.</p>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/r6axb2axbyzulksse7fk/PHID-FILE-twotyql5nwqte7gz6jms/03-stubbing.png" class="phabricator-remarkup-embed-image-full" data-sigil="lightboxable" data-meta="0_44"><img src="https://phab.wmfusercontent.org/file/data/r6axb2axbyzulksse7fk/PHID-FILE-twotyql5nwqte7gz6jms/03-stubbing.png" height="708" width="784" loading="lazy" alt="Over stubbing and mocking and integration tests are common" /></a></div></p>

<h3 class="remarkup-header">Requirements</h3>

<p>Given that:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">Untested code is unmaintained</li>
<li class="remarkup-list-item">Tests that run slow are never run</li>
<li class="remarkup-list-item">Monolithic integration tests are slow to write, read, modify, debug, and execute; isolated unit tests are the opposite</li>
<li class="remarkup-list-item">Code that is difficult to test in isolation may be indicative of functional concerns</li>
<li class="remarkup-list-item">Efficient tests greatly contribute to efficient development</li>
</ul>

<p>We need a way to write tests for our frontend JavaScript that:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">Encourage and enforce isolation
<ul class="remarkup-list">
<li class="remarkup-list-item">Without global state or a full MediaWiki environment running</li>
</ul></li>
<li class="remarkup-list-item">Start up and run the tests very fast</li>
<li class="remarkup-list-item">Re-run tests automatically when our source files change, without the developer having to go to the browser and run them</li>
<li class="remarkup-list-item">Indicate clearly when a failure occurs and where it is</li>
</ul>

<p>Additional considerations:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">We should rely on familiar tools, at least initially to ease transition and migration of existing tests to the new setup</li>
</ul>

<h3 class="remarkup-header">Solution</h3>

<p>We discussed our options, and the solution we settled on was:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">Running the test files in Node.js
<ul class="remarkup-list">
<li class="remarkup-list-item">For speed, ease of setup and running in CI, and the isolated environment</li>
<li class="remarkup-list-item">With <a href="https://www.npmjs.com/package/qunitjs" class="remarkup-link remarkup-link-ext" rel="noreferrer">QUnit</a>, <a href="https://www.npmjs.com/package/jquery" class="remarkup-link remarkup-link-ext" rel="noreferrer">jQuery</a>, <a href="https://www.npmjs.com/package/sinon" class="remarkup-link remarkup-link-ext" rel="noreferrer">Sinon</a> and <a href="https://www.npmjs.com/package/jsdom" class="remarkup-link remarkup-link-ext" rel="noreferrer">jsdom</a>
<ul class="remarkup-list">
<li class="remarkup-list-item">To ease migration of existing unit tests to this setup</li>
<li class="remarkup-list-item">Because of familiarity with the tools (like <em>Special:JavascriptTest</em>)</li>
</ul></li>
</ul></li>
</ul>

<p>You can read some more details in the architecture design record <a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/doc/adr/0007-prefer-running-qunit-tests-in-node-js.md" class="remarkup-link remarkup-link-ext" rel="noreferrer">7. Prefer running QUnit tests in Node.js</a>.</p>

<h3 class="remarkup-header">Results</h3>

<p>We implemented a new npm script, <tt class="remarkup-monospaced">test:node</tt>, which runs in CI as part of the <tt class="remarkup-monospaced">test</tt> script in the extension&#039;s npm job.</p>

<p>Tests were gradually migrated from <tt class="remarkup-monospaced">tests/qunit</tt> to <tt class="remarkup-monospaced">tests/node-qunit</tt> where it was possible to make them <em>isolated</em>. <em>Integration</em> tests that used the MediaWiki environment more heavily were kept in <tt class="remarkup-monospaced">tests/qunit</tt>, where that made sense.</p>

<p>We created a small wrapper around QUnit, <a href="https://github.com/joakin/mw-node-qunit/" class="remarkup-link remarkup-link-ext" rel="noreferrer">mw-node-qunit</a>, which essentially gives us a CLI tool that sets up QUnit with jsdom, jQuery, and Sinon, making it easier to migrate the existing QUnit tests.</p>

<p>Migration was quite straightforward, especially since most of the Extension:Popups tests from the refactor had been written in a TDD fashion and were already mostly isolated.</p>

<p>There was a bit of figuring out to do, since some pieces of code use <tt class="remarkup-monospaced">mw.*</tt> functions or helpers, so we created a <a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/tests/node-qunit/stubs.js" class="remarkup-link remarkup-link-ext" rel="noreferrer"><tt class="remarkup-monospaced">stubs.js</tt></a> file with a few stub helpers for the tests.</p>
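<p>The underlying pattern is simple: the unit under test receives a stubbed mw object instead of reaching for the real global. A minimal sketch (the names here are illustrative, not the actual stubs.js API):</p>

```javascript
// Sketch of stubbing the mw.* surface for an isolated test
// (illustrative names, not the actual stubs.js helpers).
function createStubMw( config ) {
    return {
        // Fake message lookup: just bracket the key.
        msg: ( key ) => '[' + key + ']',
        // Fake mw.config exposing only the get() method the code uses.
        config: { get: ( key ) => config[ key ] }
    };
}

// Unit under test: it depends only on the mw object it is handed, so
// it runs in plain Node.js with no MediaWiki environment or global state.
function pageLabel( mw ) {
    return mw.msg( 'page-label' ) + ': ' + mw.config.get( 'wgTitle' );
}
```

<p>Because the dependency is passed in explicitly, the test needs neither ResourceLoader nor a browser, which is what makes the fast watch-mode feedback loop possible.</p>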

<p>We also kept a couple of tests as integration tests in <tt class="remarkup-monospaced">tests/qunit</tt>, but eventually we did some work to refactor the code and made unit tests for the new code, so we got rid of the integration tests in MediaWiki entirely.</p>

<p>With this setup, tests run quite fast, and it is feasible (and we do it) to run them in watch mode while developing, giving you fast feedback by running your code on save, all from the terminal/editor.</p>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/vanihr5z4b5zcezpiobh/PHID-FILE-lkijkhk7etcldgztbado/03-node-qunit.gif" class="phabricator-remarkup-embed-image-full" data-sigil="lightboxable" data-meta="0_45"><img src="https://phab.wmfusercontent.org/file/data/vanihr5z4b5zcezpiobh/PHID-FILE-lkijkhk7etcldgztbado/03-node-qunit.gif" height="577" width="717" loading="lazy" alt="Test run in the CLI" /></a></div></p>

<p>The environment doesn&#039;t have any global state, or implicit knowledge of MediaWiki, which forces us to write properly isolated tests that don&#039;t rely on implicit behavior or state.</p>

<p>Finally, the move to a Node.js-based toolchain means we can easily leverage other great open source tools without much fuss, for example, for code coverage. We added another script, <a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/package.json#L12" class="remarkup-link remarkup-link-ext" rel="noreferrer"><tt class="remarkup-monospaced">coverage</tt></a>, which just runs the test command under the code coverage tool <a href="https://istanbul.js.org/" class="remarkup-link remarkup-link-ext" rel="noreferrer"><tt class="remarkup-monospaced">istanbul</tt></a>, and just like that we got coverage reports for the frontend code.</p>

<p>We recommend this approach for others wanting to improve how they test, and would be happy to help you figure out whether it would work for you. For example, you can use this CLI runner even if your JS sources just use globals instead of CommonJS or ES modules.</p>

<h3 class="remarkup-header">Problems</h3>

<p>Overall, the move has gone great and we don&#039;t have many issues to report.</p>

<p>When migrating existing tests, it is sometimes a bit tricky to figure out how to move them to the isolated environment, since most of the <a href="https://doc.wikimedia.org/mediawiki-core/master/js/" class="remarkup-link remarkup-link-ext" rel="noreferrer">MediaWiki JS library</a> is not available as an npm package. On some occasions we had to restructure the code a bit so it did not implicitly assume so much of MediaWiki being available, and at other times we had to set up stubs for the tests to run well. This had the added benefit that the dependency on MediaWiki core libraries is now explicit in the tests, so failing tests should alert us when we add or change dependencies, keeping behavior and dependencies explicit.</p>

<p>Another extra thing we have been doing is maintaining <a href="https://github.com/joakin/mw-node-qunit/" class="remarkup-link remarkup-link-ext" rel="noreferrer"><tt class="remarkup-monospaced">mw-node-qunit</tt></a>, which has taken a bit of additional time: making sure our wrapper works well with qunitjs, and updating the dependency versions so we don&#039;t fall behind and can leverage improvements in the libraries.</p>

<p>We will also be looking into moving the repository to the Wikimedia organization on GitHub if other teams or projects adopt this testing strategy.</p>

<h3 class="remarkup-header">Conclusions</h3>

<p>This change has worked really well for us. We can run our tests very fast, even without vagrant running. The environment is isolated and well suited to unit testing. The CLI wrapper had specific helpers to ease migration from the existing tests, so switching was fairly painless.</p>

<p>Because of all of this, the extension has excellent code coverage, developers have an easier time contributing tests, and doing TDD is feasible. There is less uncertainty when refactoring and adding features, and the codebase is easy to work with; a big part of that is the unit testing story.</p>

<p>We&#039;re looking forward to adopting the same approach in other repositories and helping others do the same.</p>

<hr class="remarkup-hr" />

<ul class="remarkup-list">
<li class="remarkup-list-item"><a href="https://phabricator.wikimedia.org/phame/post/view/93/extension_popups_page_previews_front-end_tooling/" class="remarkup-link" rel="noreferrer">Table of contents</a></li>
<li class="remarkup-list-item">Previous post: <a href="https://phabricator.wikimedia.org/phame/post/view/95/better_minification_for_the_frontend_sources/" class="remarkup-link" rel="noreferrer">Better minification for the frontend sources</a></li>
<li class="remarkup-list-item">Next post: <a href="https://phabricator.wikimedia.org/phame/post/view/97/extension_popups_page_previews_front-end_tooling_conclusions/" class="remarkup-link" rel="noreferrer">Conclusions</a></li>
</ul></div></content></entry><entry><title>Better minification for the frontend sources</title><link href="/phame/live/9/post/95/better_minification_for_the_frontend_sources/" /><id>https://phabricator.wikimedia.org/phame/post/view/95/</id><author><name>Jhernandez (Joaquin Oltra Hernandez)</name></author><published>2018-04-19T18:11:45+00:00</published><updated>2018-04-19T18:11:45+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p><a href="https://phabricator.wikimedia.org/phame/post/view/93/extension_popups_page_previews_front-end_tooling/" class="remarkup-link" rel="noreferrer">Table of contents</a> - Versions: <a href="https://chimeces.com/popups-tech-posts-series/01-tooling/02-better-minification-for-frontend-sources.html" class="remarkup-link remarkup-link-ext" rel="noreferrer">HTML</a>, <a href="https://github.com/joakin/popups-tech-posts-series/blob/master/docs/01-tooling/02-better-minification-for-frontend-sources.md" class="remarkup-link remarkup-link-ext" rel="noreferrer">Markdown</a></p>

<h3 class="remarkup-header">Intro</h3>

<p>MediaWiki via <a href="https://www.mediawiki.org/wiki/ResourceLoader/Features#Minification" class="remarkup-link remarkup-link-ext" rel="noreferrer">ResourceLoader</a> uses <a href="https://www.mediawiki.org/wiki/ResourceLoader/Features#JavaScriptMinifier" class="remarkup-link remarkup-link-ext" rel="noreferrer">JavaScriptMinifier</a> to minimize JavaScript files so that their size is as small as possible.</p>

<p>Since the minification happens in the PHP server at runtime, tradeoffs were made (even with a read-through cache) so that the server could minify in a performant way, giving up size gains for speed.</p>

<p>The JavaScript ecosystem has continued to evolve minifiers based on Node.js tooling that can&#039;t easily be used on a PHP server.</p>

<p>There are gains to be had if we use Node-based minifiers and hand ResourceLoader the smallest possible bundle to serve.</p>

<h3 class="remarkup-header">Requirements</h3>

<p>The payload served to users (specifically, JavaScript files) should be as small as possible.</p>

<h3 class="remarkup-header">Solution</h3>

<p>Given that we had previously introduced a build step, the solution seemed pretty straightforward.</p>

<p>We discussed introducing a minification step as part of the build process, so that the committed assets in <tt class="remarkup-monospaced">/resources/dist</tt> would be minified and as small as possible.</p>

<p>We first did research and gathered numbers to check whether there would actually be any gains and whether it would be worth it. You can read about it here:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">Architecture design record <a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/doc/adr/0008-enable-minification-of-bundle-with-uglifyjs.md" class="remarkup-link remarkup-link-ext" rel="noreferrer">8. Enable minification of bundle with UglifyJS</a></li>
<li class="remarkup-list-item"><a href="https://www.mediawiki.org/wiki/Extension:Popups/Minifying_assets_with_uglifyjs" class="remarkup-link remarkup-link-ext" rel="noreferrer">mediawiki.org: Extension:Popups/Minifying assets with uglifyjs</a></li>
</ul>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/ftzaxct5jlxl7pkxxyj2/PHID-FILE-r5gse3zaywlhum2cvf2c/02-comparison.png" class="phabricator-remarkup-embed-image-full" data-sigil="lightboxable" data-meta="0_46"><img src="https://phab.wmfusercontent.org/file/data/ftzaxct5jlxl7pkxxyj2/PHID-FILE-r5gse3zaywlhum2cvf2c/02-comparison.png" height="110" width="780" loading="lazy" alt="Table with comparison of sizes" /></a></div></p>

<p>Then we researched how best to introduce the minifier. As a standalone CLI we could run it on our now-single <tt class="remarkup-monospaced">index.js</tt> file, but if we later leveraged other bundler features, like multiple entry points or code splitting, the minification commands could become very complex.</p>

<p>As such, we chose to integrate the minifier as a plugin to the webpack bundler. See <a href="https://webpack.js.org/plugins/uglifyjs-webpack-plugin/" class="remarkup-link remarkup-link-ext" rel="noreferrer">uglifyjs-webpack-plugin</a>.</p>
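<p>Hooked in as a plugin, the minifier runs on every build and stays in sync with whatever the bundler produces. A rough, simplified sketch of what such a webpack configuration looks like (illustrative paths and options; see the linked Popups config for the real settings):</p>

```javascript
// Sketch of enabling UglifyJS as a webpack plugin (simplified and
// illustrative; assumes webpack with uglifyjs-webpack-plugin installed).
const path = require( 'path' );
const UglifyJsPlugin = require( 'uglifyjs-webpack-plugin' );

module.exports = {
    entry: './src/index.js',
    output: {
        filename: 'index.js',
        // Minified bundle committed for ResourceLoader to serve.
        path: path.resolve( __dirname, 'resources/dist' )
    },
    plugins: [
        // Minify the bundle; keep source maps for development.
        new UglifyJsPlugin( { sourceMap: true } )
    ]
};
```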

<h3 class="remarkup-header">Results</h3>

<p>We added UglifyJS minification via Webpack (<a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/webpack.config.js#L50-L71" class="remarkup-link remarkup-link-ext" rel="noreferrer">config</a>).</p>

<p>Minifying via UglifyJS brought the bundle size down ~40% from the previous version (~25% gzipped).</p>

<p>Also, in theory, with ECMAScript modules webpack and uglify can perform tree shaking of unused code: webpack marks unused exports, which uglify then removes from the production bundle. See the guide <a href="https://webpack.js.org/guides/tree-shaking/#minify-the-output" class="remarkup-link remarkup-link-ext" rel="noreferrer">Tree shaking</a>.</p>

<h3 class="remarkup-header">Problems</h3>

<p>Initially we had to do some research into instructing ResourceLoader not to apply minification to these already-minified files. We wanted to skip it so that the source map comment would be preserved and we would have source maps for the production version of the code.</p>

<p>In the end, we had to give up and ended up <a href="https://github.com/wikimedia/mediawiki-extensions-Popups/commit/14e78466b234f0cfc700e415b65ffa3a2ca05ac7" class="remarkup-link remarkup-link-ext" rel="noreferrer">removing the banner</a>, as it interfered with the minification of other modules in production; we still have source maps in development.</p>

<h3 class="remarkup-header">Conclusions</h3>

<p>This was a pretty straightforward addition that brought us benefits at little cost. It was enabled by the earlier change to introduce a build step.</p>

<p>The gain in size is not hugely significant given how small the JS code base is, but applied to bigger code bases this could yield great improvements essentially for free.</p>

<hr class="remarkup-hr" />

<ul class="remarkup-list">
<li class="remarkup-list-item"><a href="https://phabricator.wikimedia.org/phame/post/view/93/extension_popups_page_previews_front-end_tooling/" class="remarkup-link" rel="noreferrer">Table of contents</a></li>
<li class="remarkup-list-item">Previous post: <a href="https://phabricator.wikimedia.org/phame/post/view/94/automatic_javascript_file_bundling_and_library_consumption/" class="remarkup-link" rel="noreferrer">Automatic JavaScript file bundling and library consumption</a></li>
<li class="remarkup-list-item">Next post: <a href="https://phabricator.wikimedia.org/phame/post/view/96/fast_and_isolated_js_unit_tests/" class="remarkup-link" rel="noreferrer">Fast and isolated JS unit tests</a></li>
</ul></div></content></entry><entry><title>Automatic JavaScript file bundling and library consumption</title><link href="/phame/live/9/post/94/automatic_javascript_file_bundling_and_library_consumption/" /><id>https://phabricator.wikimedia.org/phame/post/view/94/</id><author><name>Jhernandez (Joaquin Oltra Hernandez)</name></author><published>2018-04-19T18:11:38+00:00</published><updated>2018-04-19T18:11:38+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p><a href="https://phabricator.wikimedia.org/phame/post/view/93/extension_popups_page_previews_front-end_tooling/" class="remarkup-link" rel="noreferrer">Table of contents</a> - Versions: <a href="https://chimeces.com/popups-tech-posts-series/01-tooling/01-automatic-file-bundling.html" class="remarkup-link remarkup-link-ext" rel="noreferrer">HTML</a> -  <a href="https://github.com/joakin/popups-tech-posts-series/blob/master/docs/01-tooling/01-automatic-file-bundling.md" class="remarkup-link remarkup-link-ext" rel="noreferrer">Markdown</a></p>

<h3 class="remarkup-header">Intro</h3>

<p>With MediaWiki&#039;s <a href="https://www.mediawiki.org/wiki/ResourceLoader" class="remarkup-link remarkup-link-ext" rel="noreferrer">ResourceLoader</a>, JavaScript files run without a module system, so each file has to export properties to, and consume properties from, the global scope defined by other files.</p>

<p>This has caused issues in development workflows related to the project&#039;s JavaScript sources and the use of JavaScript libraries from the OSS ecosystem. Organizing the code, trying to keep it maintainable, and understanding how it is structured impose a big cost on the project&#039;s quality, development, and maintenance.</p>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/arntprli3y6hdbyynwng/PHID-FILE-q6tkwnqvn2rkptjtnsch/01-globals.png" class="phabricator-remarkup-embed-image-full" data-sigil="lightboxable" data-meta="0_47"><img src="https://phab.wmfusercontent.org/file/data/arntprli3y6hdbyynwng/PHID-FILE-q6tkwnqvn2rkptjtnsch/01-globals.png" height="484" width="665" loading="lazy" alt="JS code defining global variables all around" /></a></div></p>

<h4 class="remarkup-header">Application JavaScript sources</h4>

<p>The order in which these files need to be loaded, so that all dependencies are set up properly, then needs to be <a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/398ffb0e435f61133f6478f306ef266e147c9dea/extension.json#L75-L112" class="remarkup-link remarkup-link-ext" rel="noreferrer">manually specified in <tt class="remarkup-monospaced">extension.json</tt></a> by file name. As such, there are two sources of truth:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">The configuration that specifies some order of files,</li>
<li class="remarkup-list-item">And the source code that implicitly uses files in some order.</li>
</ul>

<p>The order in which the parts of your program are interpreted is defined outside of the program itself. At the file level, each file must be written with foreknowledge of which files have already been loaded, and that knowledge lives elsewhere.</p>

<p>If you are doing anything mildly complex, you will end up with a big list of JavaScript files that depend on each other. Having to manually understand that dependency graph from a list of file names in a JSON file, from how the files use, define, or re-define global variables for other files, and from each module&#039;s internal state, can be quite a headache.</p>
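<p>By contrast, with a module system each file declares its own dependencies, and the bundler derives the load order from those declarations. A minimal CommonJS sketch (file names and code are illustrative, not from the Popups codebase):</p>

```javascript
// Sketch: the require() call itself declares the dependency, so a
// bundler can derive load order automatically (illustrative names).

// renderer.js would export its API instead of writing to window.popups:
//     module.exports = { render: function ( title ) { /* ... */ } };
// index.js would then declare its dependency explicitly:
//     const renderer = require( './renderer' );

// Inlined here so the sketch is self-contained and runnable:
const renderer = {
    render: ( title ) => 'Preview of ' + title
};
console.log( renderer.render( 'Dog' ) ); // prints "Preview of Dog"
```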

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/kbqjkzb3eij7dnxlckyl/PHID-FILE-suqmk5wigsg2nlkwh4bo/01-resourcemodule-config.png" class="phabricator-remarkup-embed-image-full" data-sigil="lightboxable" data-meta="0_48"><img src="https://phab.wmfusercontent.org/file/data/kbqjkzb3eij7dnxlckyl/PHID-FILE-suqmk5wigsg2nlkwh4bo/01-resourcemodule-config.png" height="441" width="838" loading="lazy" alt="JSON configuration skyrockets as soon as you start doing anything interesting in the frontend" /></a></div></p>

<p>Changes to the source code: moving lines within the same file, refactoring code into other or new files, adding new code that uses other files, removing code...</p>

<p>All of these become really hard really quickly. In the end, <strong>organizing your code, trying to keep it maintainable, and understanding how it is structured have a very big cost on the project&#039;s quality, development and maintenance</strong>.</p>

<p>Figuring out whether the files are specified in the proper order in the configuration, during and after making changes, carries a high potential for obscure runtime bugs: it is easy to forget or overlook implicit dependencies in the manual file order.</p>

<h4 class="remarkup-header">Libraries</h4>

<p>To consume libraries in our front-end code, we currently rely on manually (or via scripts) pulling down npm dependencies or files from a website and including them in the repository. This is hard to verify, hard to keep up to date, and cumbersome. It should instead be possible to specify dependencies with their versions and have them automatically included in the assets we serve, from a trustworthy source. It should also be easy to check whether the libraries are outdated, and to update them.</p>
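<p>With a package manager, the wish above becomes a declaration (a hypothetical <tt class="remarkup-monospaced">package.json</tt> fragment; the version numbers are examples, not necessarily the ones Popups pins):</p>

```json
{
  "dependencies": {
    "redux": "3.6.0",
    "redux-thunk": "2.2.0"
  }
}
```

<p>Pinned versions make the install reproducible, and <tt class="remarkup-monospaced">npm outdated</tt> can report when newer releases exist.</p>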

<h3 class="remarkup-header">Requirements</h3>

<p>For developing JavaScript files for the front-end:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">There must be a single source of truth for the dependency resolution</li>
<li class="remarkup-list-item">Files should be authored in a standard module system instead of relying on the global namespace</li>
<li class="remarkup-list-item">Moving code and files should be low cost and not trigger runtime errors in production if the dependencies are not properly specified</li>
<li class="remarkup-list-item">If possible, it should be easier to tap into npm libraries for our front-end needs, and check for updates (see <a href="https://phabricator.wikimedia.org/T107561" class="remarkup-link" rel="noreferrer">related discussion</a>)</li>
</ul>

<h3 class="remarkup-header">Solution</h3>

<p>We considered our options, and after discussion among the engineers, we decided that:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">We would use a Node.js based file bundler (<a href="https://webpack.js.org/" class="remarkup-link remarkup-link-ext" rel="noreferrer">webpack</a>), in order to:
<ul class="remarkup-list">
<li class="remarkup-list-item">automatically bundle our JS sources with a single source of truth for intra-module dependencies and asset load order (the source code itself via import/export statements)</li>
<li class="remarkup-list-item">use a standard module system (we started with CommonJS and later migrated to ES modules once they were finalized)</li>
<li class="remarkup-list-item">validate the dependency graph and asset building before going to production (webpack statically analyzes the asset tree and fails on build)</li>
</ul></li>
<li class="remarkup-list-item">We would introduce a build step to compile sources into production-worthy assets</li>
<li class="remarkup-list-item">We would use ResourceLoader as the way to <strong>serve</strong> the JS bundle(s) and to specify other dependencies (images, i18n messages, styles)</li>
</ul>

<p>You can read some more details in the architecture design record <a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/master/doc/adr/0004-use-webpack.md" class="remarkup-link remarkup-link-ext" rel="noreferrer">4. Use webpack</a> that we wrote when we discussed this in the team.</p>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/rnombyzkcw5uqpupeqma/PHID-FILE-iaiovgt77wsk2kuy3ybd/01-sources-with-esmodules.png" class="phabricator-remarkup-embed-image-full" data-sigil="lightboxable" data-meta="0_49"><img src="https://phab.wmfusercontent.org/file/data/rnombyzkcw5uqpupeqma/PHID-FILE-iaiovgt77wsk2kuy3ybd/01-sources-with-esmodules.png" height="579" width="1142" loading="lazy" alt="Sources with ES modules" /></a></div></p>

<h4 class="remarkup-header">Why webpack and not <tt class="remarkup-monospaced">&lt;my-favorite-tool&gt;</tt>?</h4>

<p>We looked at the ecosystem of bundlers at the time, and this is a summary of our evaluation:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">Browserify: Great tool, shrinking community and support in favor of others</li>
<li class="remarkup-list-item">Rollup: Great tool, specialized in bundling code for producing libraries, not applications</li>
<li class="remarkup-list-item">Webpack: Great tool, community, maintainers, financial support, mindshare and ecosystem of tools</li>
</ul>

<p>We are not married to webpack and are not using any webpack-specific features, so we can migrate away from this tool in the future if it becomes a problem.</p>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/qrzo5bxswbzdogfjrnzi/PHID-FILE-yevenhpgbk6zffwqf62c/01-webpack-build.gif" class="phabricator-remarkup-embed-image-full" data-sigil="lightboxable" data-meta="0_50"><img src="https://phab.wmfusercontent.org/file/data/qrzo5bxswbzdogfjrnzi/PHID-FILE-yevenhpgbk6zffwqf62c/01-webpack-build.gif" height="818" width="815" loading="lazy" alt="Webpack production build" /></a></div></p>

<h3 class="remarkup-header">Results</h3>

<ul class="remarkup-list">
<li class="remarkup-list-item">We introduced <tt class="remarkup-monospaced">webpack</tt> (<a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/webpack.config.js" class="remarkup-link remarkup-link-ext" rel="noreferrer">config</a>) and <a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/package.json#L4-L5" class="remarkup-link remarkup-link-ext" rel="noreferrer">build scripts</a></li>
<li class="remarkup-list-item">The JS files are located in <a href="https://github.com/wikimedia/mediawiki-extensions-Popups/tree/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/src" class="remarkup-link remarkup-link-ext" rel="noreferrer">src/</a> and use EcmaScript modules (<a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/src/actions.js#L5-L7" class="remarkup-link remarkup-link-ext" rel="noreferrer">example</a>)</li>
<li class="remarkup-list-item">The production assets get built into <a href="https://github.com/wikimedia/mediawiki-extensions-Popups/tree/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/resources/dist" class="remarkup-link remarkup-link-ext" rel="noreferrer">/resources/dist</a> to be served by ResourceLoader</li>
<li class="remarkup-list-item">We added a CI step in the <tt class="remarkup-monospaced">npm-test</tt> job that checks that the built assets have been committed and are up to date (<a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/package.json#L11" class="remarkup-link remarkup-link-ext" rel="noreferrer">check-built-assets</a>)</li>
<li class="remarkup-list-item">We added a precommit hook that automatically runs the build step and stages the built assets (<a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/package.json#L13" class="remarkup-link remarkup-link-ext" rel="noreferrer">precommit</a>)</li>
</ul>
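<p>The build and verification steps wire together in <tt class="remarkup-monospaced">package.json</tt> scripts roughly like this (a hypothetical sketch; the actual scripts are linked above):</p>

```json
{
  "scripts": {
    "build": "webpack -p",
    "check-built-assets": "npm run build && git diff --quiet --exit-code resources/dist",
    "precommit": "npm run build && git add resources/dist"
  }
}
```

<p>CI then fails whenever a patch touches <tt class="remarkup-monospaced">src/</tt> without committing the regenerated <tt class="remarkup-monospaced">resources/dist</tt> output.</p>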

<p>These are some of the benefits we have seen after the introduction of these changes:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item">The ResourceModules configuration for JS files shrank: file order is now automatic, and the source code is the single source of truth for dependency order
<ul class="remarkup-list">
<li class="remarkup-list-item"><a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/398ffb0e435f61133f6478f306ef266e147c9dea/extension.json#L75-L112" class="remarkup-link remarkup-link-ext" rel="noreferrer">Before</a>, and <a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/extension.json#L89-L90" class="remarkup-link remarkup-link-ext" rel="noreferrer">after</a> (no manual ordering of files)</li>
</ul></li>
<li class="remarkup-list-item">Decreased cost and burden of splitting modules, reusing code, and adding new code, since file order is sorted out automatically</li>
<li class="remarkup-list-item">Source maps in development that point to the original modules in <tt class="remarkup-monospaced">src/</tt>, thanks to webpack</li>
<li class="remarkup-list-item">Anecdotally, faster ResourceLoader performance, by offloading file parsing and bundling to the Node.js CLI watcher (<tt class="remarkup-monospaced">npm start</tt>)
<ul class="remarkup-list">
<li class="remarkup-list-item">The simpler the dependency tree, the easier it is for ResourceLoader to decide whether a module has been invalidated, and the less processing it needs to do on the server and client</li>
</ul></li>
<li class="remarkup-list-item">We bundle <tt class="remarkup-monospaced">redux</tt> and <tt class="remarkup-monospaced">redux-thunk</tt> from npm, based on the <a href="https://github.com/wikimedia/mediawiki-extensions-Popups/blob/2ddf8a96d8df27d6b5e8b4dd8ef33581951db9fe/package.json#L31-L32" class="remarkup-link remarkup-link-ext" rel="noreferrer">pinned versions</a> in <tt class="remarkup-monospaced">package.json</tt>
<ul class="remarkup-list">
<li class="remarkup-list-item"><tt class="remarkup-monospaced">npm outdated</tt> will show whether there are new versions of the libraries</li>
<li class="remarkup-list-item"><tt class="remarkup-monospaced">npm update redux</tt> will update the library, which is then bundled into our sources when we <tt class="remarkup-monospaced">npm run build</tt></li>
</ul></li>
<li class="remarkup-list-item">The libraries passed <a href="https://phabricator.wikimedia.org/T151902" class="remarkup-link" rel="noreferrer">security review</a>, the hash is recorded in <tt class="remarkup-monospaced">package-lock.json</tt>, and the build is verified independently in CI by a Jenkins job against the npm version</li>
<li class="remarkup-list-item">Using a standard module system lets us run our sources in Node.js, which unlocks running unit tests for the front-end code in Node.js, for faster test runs and a faster feedback loop for developers</li>
<li class="remarkup-list-item">Easy introspection of the front-end code with tools like <a href="https://www.npmjs.com/package/source-map-explorer" class="remarkup-link remarkup-link-ext" rel="noreferrer">source-map-explorer</a> and <a href="https://www.npmjs.com/package/webpack-bundle-analyzer" class="remarkup-link remarkup-link-ext" rel="noreferrer">webpack-bundle-analyzer</a></li>
</ul>

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/eduiovfnfqi7pxrjtbpn/PHID-FILE-rtjejw4tdj2y2djepqgy/01-source-map-explorer.png" class="phabricator-remarkup-embed-image-full" data-sigil="lightboxable" data-meta="0_51"><img src="https://phab.wmfusercontent.org/file/data/eduiovfnfqi7pxrjtbpn/PHID-FILE-rtjejw4tdj2y2djepqgy/01-source-map-explorer.png" height="658" width="671" loading="lazy" alt="source-map-explorer view" /></a></div></p>

<h3 class="remarkup-header">Problems</h3>

<p>The approach is a bit controversial, since MediaWiki extensions usually don&#039;t have build steps, so there was no easy way to set up CI and the deployment pipeline to run a build step before running jobs or deploying.</p>

<p>As such, we ended up committing the built sources to the repository. This works with the CI step and the pre-commit hook mentioned before, but has an annoying inconvenience: every time a patch is merged along with its generated asset (<tt class="remarkup-monospaced">resources/dist/*</tt>), any pending patches on Gerrit that also need to regenerate the asset (because they touch sources in <tt class="remarkup-monospaced">src/</tt>) will now be in merge conflict with master.</p>

<p>We have discussed this in Phabricator and on wikitech-l:</p>

<ul class="remarkup-list">
<li class="remarkup-list-item"><a href="https://lists.wikimedia.org/pipermail/wikitech-l/2017-June/088264.html" class="remarkup-link remarkup-link-ext" rel="noreferrer">Wikitech-l: How does a build process look like for a mediawiki extension repository?</a></li>
<li class="remarkup-list-item"><a href="https://phabricator.wikimedia.org/T158980" class="remarkup-link" rel="noreferrer">T158980: Generate compiled assets from continuous integration</a></li>
</ul>

<p>But sadly we didn&#039;t get to any concrete steps. If you think you can help with this issue, we would really appreciate it, as we would like to help other projects, people, and teams use build steps in their extensions.</p>

<p>Right now, we sidestep the issue with a bot we have configured for the repository, which responds to the command <tt class="remarkup-monospaced">rebase</tt> (parallel to the <tt class="remarkup-monospaced">recheck</tt> command that jenkins-bot responds to). When the command is issued, the bot will download the patch, rebase it onto master, run the build step, and submit a new patch without the conflict.</p>

<p>As an interim solution it works and allows us to move along, but if we want to adopt this process for other projects we would really like to have a more streamlined solution.</p>

<h3 class="remarkup-header">Conclusions</h3>

<p>This change has worked very well for us, decreasing the cognitive load and allowing us to work more effectively on our JS files. We recommend it if you have many JS files or ResourceLoader modules, and the order and dependencies are causing you headaches.</p>

<p>We hope to work together on standardizing some sort of CI and deploy process so that projects in the MediaWiki ecosystem can adopt build steps to improve their workflows and leverage powerful tools.</p>

<hr class="remarkup-hr" />

<ul class="remarkup-list">
<li class="remarkup-list-item"><a href="https://phabricator.wikimedia.org/phame/post/view/93/extension_popups_page_previews_front-end_tooling/" class="remarkup-link" rel="noreferrer">Table of contents</a></li>
<li class="remarkup-list-item">Next post: <a href="https://phabricator.wikimedia.org/phame/post/view/95/better_minification_for_the_frontend_sources/" class="remarkup-link" rel="noreferrer">Better minification for the frontend sources</a></li>
</ul></div></content></entry><entry><title>Extension:Popups (Page Previews) front-end tooling</title><link href="/phame/live/9/post/93/extension_popups_page_previews_front-end_tooling/" /><id>https://phabricator.wikimedia.org/phame/post/view/93/</id><author><name>Jhernandez (Joaquin Oltra Hernandez)</name></author><published>2018-04-19T18:11:29+00:00</published><updated>2018-04-19T19:08:04+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p><a href="https://www.mediawiki.org/wiki/Extension:Popups" class="remarkup-link remarkup-link-ext" rel="noreferrer">Extension:Popups</a> is a <a href="https://www.mediawiki.org/wiki/MediaWiki" class="remarkup-link remarkup-link-ext" rel="noreferrer">MediaWiki</a> extension that shows a preview in a popup when the user hovers over a link.</p>

<p>Extra requirements and a desire to find better ways to code for the frontend stack led to a series of interesting decisions in terms of tooling that we think could benefit other projects in the MediaWiki ecosystem.</p>

<p>In this series of posts we will explain the different technical decisions and choices in technology and tooling for the front-end part of the extension. We will provide reasoning, explanations, pros and cons, and our conclusions.</p>

<h3 class="remarkup-header">Table of contents:</h3>

<ol class="remarkup-list">
<li class="remarkup-list-item"><a href="https://phabricator.wikimedia.org/phame/post/view/94/automatic_javascript_file_bundling_and_library_consumption/" class="remarkup-link" rel="noreferrer">Automatic JavaScript file bundling and library consumption</a> (<a href="https://chimeces.com/popups-tech-posts-series/01-tooling/01-automatic-file-bundling.html" class="remarkup-link remarkup-link-ext" rel="noreferrer">HTML</a>, <a href="https://github.com/joakin/popups-tech-posts-series/blob/master/docs/01-tooling/01-automatic-file-bundling.md" class="remarkup-link remarkup-link-ext" rel="noreferrer">Markdown</a>)</li>
<li class="remarkup-list-item"><a href="https://phabricator.wikimedia.org/phame/post/view/95/better_minification_for_the_frontend_sources/" class="remarkup-link" rel="noreferrer">Better minification for frontend sources</a> (<a href="https://chimeces.com/popups-tech-posts-series/01-tooling/02-better-minification-for-frontend-sources.html" class="remarkup-link remarkup-link-ext" rel="noreferrer">HTML</a>, <a href="https://github.com/joakin/popups-tech-posts-series/blob/master/docs/01-tooling/02-better-minification-for-frontend-sources.md" class="remarkup-link remarkup-link-ext" rel="noreferrer">Markdown</a>)</li>
<li class="remarkup-list-item"><a href="https://phabricator.wikimedia.org/phame/post/view/96/fast_and_isolated_js_unit_tests/" class="remarkup-link" rel="noreferrer">Fast and isolated JS unit tests</a> (<a href="https://chimeces.com/popups-tech-posts-series/01-tooling/03-fast-and-isolated-JS-unit-tests.html" class="remarkup-link remarkup-link-ext" rel="noreferrer">HTML</a>, <a href="https://github.com/joakin/popups-tech-posts-series/blob/master/docs/01-tooling/03-fast-and-isolated-JS-unit-tests.md" class="remarkup-link remarkup-link-ext" rel="noreferrer">Markdown</a>)</li>
<li class="remarkup-list-item"><a href="https://phabricator.wikimedia.org/phame/post/view/97/extension_popups_page_previews_front-end_tooling_conclusions/" class="remarkup-link" rel="noreferrer">Conclusions</a></li>
</ol></div></content></entry><entry><title>mustache.js replaced with JavaScript template literals in Extension:Popups</title><link href="/phame/live/9/post/90/mustache.js_replaced_with_javascript_template_literals_in_extension_popups/" /><id>https://phabricator.wikimedia.org/phame/post/view/90/</id><author><name>Niedzielski (Stephen Niedzielski)</name></author><published>2018-04-03T17:21:54+00:00</published><updated>2019-04-20T00:59:36+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p>The <a href="https://www.mediawiki.org/wiki/Extension:Popups" class="remarkup-link remarkup-link-ext" rel="noreferrer">Popups MediaWiki extension</a> previously used HTML UI templates inflated by the <a href="https://github.com/janl/mustache.js" class="remarkup-link remarkup-link-ext" rel="noreferrer">mustache.js template system</a>. This provided good readability but added an 8.1 KiB dependency* for functionality that was only used in a few places. We replaced Mustache with ES6 syntax without changing existing device support or readability and now ship 7.8 KiB less of minified uncompressed assets to desktop views where Popups was the only consumer.</p>

<h3 class="remarkup-header">Background</h3>

<p>Given that <a href="https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Template_literals" class="remarkup-link remarkup-link-ext" rel="noreferrer">ES6 template literals</a> provided similar readability** and are part of JavaScript itself, we considered this to be a favorable and sustainable alternative to Mustache templates. Additionally, although the usage of template strings requires transpilation, adding support for it enabled <a href="http://es6-features.org/" class="remarkup-link remarkup-link-ext" rel="noreferrer">other ES6 syntaxes</a> to be used, such as let / const, arrow functions, and destructuring, all of which Extension:Popups now leverages in many areas.</p>

<p>We compared the sizes <a href="https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Popups/+/d35286a" class="remarkup-link remarkup-link-ext" rel="noreferrer">before</a> and <a href="https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Popups/+/4281670" class="remarkup-link remarkup-link-ext" rel="noreferrer">after</a> transpiling templates and they proved favorable:</p>

<div class="remarkup-table-wrap"><table class="remarkup-table">
<tr><td></td><td>index.js (gzip)</td><td>index.js</td><td>ext.popups</td><td>ext.popups.main</td><td>ext.popups.images</td><td>mediawiki.template.mustache</td><td>Total</td></tr>
<tr><td>Before</td><td>10.84 KiB</td><td>32.88 KiB</td><td>96 B</td><td>52.5 KiB</td><td>3.1 KiB</td><td>8.1 KiB</td><td><strong>65224 B</strong></td></tr>
<tr><td>After</td><td>11.46 KiB</td><td>35.15 KiB</td><td>96 B</td><td>52.7 KiB</td><td>3.1 KiB</td><td>0.0 KiB</td><td><strong>57193 B</strong></td></tr>
</table></div>

<p>Where “index.js (gzip)” is the minified gzipped size of the resources/dist/index.js Webpack build product as reported by <a href="https://github.com/siddharthkp/bundlesize" class="remarkup-link remarkup-link-ext" rel="noreferrer">bundlesize</a>, “index.js” is the minified <em>un</em>compressed size of the same bundle as reported by <a href="https://github.com/danvk/source-map-explorer" class="remarkup-link remarkup-link-ext" rel="noreferrer">source-map-explorer</a> and <a href="https://webpack.js.org/configuration/performance" class="remarkup-link remarkup-link-ext" rel="noreferrer">Webpack performance hints</a>, and the remaining columns are the sums of minified uncompressed assets for each relevant module as reported by <a href="https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+/master/resources/src/mediawiki/mediawiki.inspect.js" class="remarkup-link remarkup-link-ext" rel="noreferrer"><tt class="remarkup-monospaced">mw.loader.inspect()</tt></a>, with the last column being a total of these inspect() modules.</p>

<p>The conclusions to draw from this table are that transpiling templates does minimally increase the size of the Webpack bundle but that the overhead is less than that of the mustache.js dependency so the overall effect is a size improvement. Additionally, note that the transpiled bundle now encompasses the HTML templates which source-map-explorer reports as contributing a 2.53 KiB minified uncompressed portion of the 35.15 KiB bundle. (Previously, templates were part of ext.popups.main but only via ResourceLoader aggregation; now templates are part of index.js.) Allowing for rounding errors and inlining, this brings the approximate overhead of transpilation itself to nearly zero, 35.15 KiB - 32.88 KiB - 2.53 KiB ≈ 0, which suggests transpiling as a viable solution for improving code elsewhere that must be written in modern form without compromising on compatibility or performance.</p>

<p>We used the <a href="https://babeljs.io/repl" class="remarkup-link remarkup-link-ext" rel="noreferrer">Babel transpiler</a> with <a href="https://babeljs.io/docs/plugins/preset-env" class="remarkup-link remarkup-link-ext" rel="noreferrer">babel-preset-env</a> to translate only the necessary JavaScript from ES6 to ES5 for <a href="https://www.mediawiki.org/wiki/Compatibility#Browsers" class="remarkup-link remarkup-link-ext" rel="noreferrer">grade A browsers</a>. The overhead for this functionality may be nonzero in some cases but is expected to diminish in time and always be less than the size of the mustache.js dependency. Please note that while most ES6 syntaxes are supported, the transpiler does not provide polyfills for new APIs (e.g., <tt class="remarkup-monospaced">Array.prototype.includes()</tt>) unless configured to do so via <a href="https://babeljs.io/docs/usage/polyfill" class="remarkup-link remarkup-link-ext" rel="noreferrer">babel-polyfill</a>. As polyfills add more overhead and are related but independent of syntax, API changes were not considered in this refactoring.</p>
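<p>Configuring this amounts to a small <tt class="remarkup-monospaced">.babelrc</tt> (a hypothetical sketch; the browser list below is an invented example, not MediaWiki's actual compatibility matrix):</p>

```json
{
  "presets": [
    ["env", {
      "targets": {
        "browsers": ["last 2 versions", "ie 11"]
      }
    }]
  ]
}
```

<p>babel-preset-env then transpiles only the syntax those targets lack, which is why the overhead shrinks over time as browsers catch up.</p>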

<p>Manual HTML escaping of template parameters was a necessary part of this change. This functionality is built into the <a href="https://github.com/janl/mustache.js/#variables" class="remarkup-link remarkup-link-ext" rel="noreferrer">double-curly brace syntax of mustache.js</a> but is now performed using <a href="https://www.mediawiki.org/wiki/ResourceLoader/Core_modules#mediaWiki.html" class="remarkup-link remarkup-link-ext" rel="noreferrer"><tt class="remarkup-monospaced">mw.html.escape()</tt></a>. These calls are a blemish on the code but appear only in the templates themselves and would be replaced transparently in a UI library with declarative rendering (such as Preact). We also anticipate that the template literal syntax would transition neatly to such a library. We don&#039;t know whether Extension:Popups will ever want to use a UI library, and we accept that these shortcomings may always exist.</p>
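<p>The manual escaping pattern can be sketched as follows (self-contained; <tt class="remarkup-monospaced">escapeHTML</tt> below is a stand-in reimplementation so the example runs anywhere, where the real code calls <tt class="remarkup-monospaced">mw.html.escape()</tt>):</p>

```javascript
// Stand-in for mw.html.escape(): replace HTML-significant characters
// with entities so interpolated values cannot break out of the markup.
// The ampersand must be replaced first to avoid double-escaping.
function escapeHTML(value) {
  return String(value)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#039;');
}

// Every interpolation must be escaped by hand; Mustache's double-curly
// syntax did this implicitly.
function renderExtract(url, text) {
  return `<a class="mwe-popups-extract" href="${escapeHTML(url)}">${escapeHTML(text)}</a>`;
}

const html = renderExtract('/wiki/A"B', '<i>& more</i>');
```

<p>Any untrusted value that reaches the markup goes through the escape call, and forgetting one is exactly the kind of bug Mustache prevented by default.</p>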

<p>*As reported by [[ <a href="https://www.mediawiki.org/wiki/ResourceLoader/Core_modules#mw.loader.inspect" class="remarkup-link remarkup-link-ext" rel="noreferrer">https://www.mediawiki.org/wiki/ResourceLoader/Core_modules#mw.loader.inspect</a> | <tt class="remarkup-monospaced">mw.loader.inspect()</tt> ]] on March 22nd, 2018.<br />
**<a href="https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Popups/+/d35286a/resources/ext.popups.main/templates/preview.mustache" class="remarkup-link remarkup-link-ext" rel="noreferrer">The Mustache version of previews</a>:</p>

<div class="remarkup-code-block" data-code-lang="handlebars" data-sigil="remarkup-code-block"><pre class="remarkup-code"><span></span><span class="x">&lt;div class=&quot;mwe-popups&quot; role=&quot;tooltip&quot; aria-hidden&gt;</span>
<span class="x">  &lt;div class=&quot;mwe-popups-container&quot;&gt;</span>
<span class="x">    </span><span class="cp">{{</span><span class="m m-Attribute">#hasThumbnail</span><span class="cp">}}</span><span class="x"></span>
<span class="x">    &lt;a href=&quot;</span><span class="cp">{{</span><span class="nv">url</span><span class="cp">}}</span><span class="x">&quot; class=&quot;mwe-popups-discreet&quot;&gt;&lt;/a&gt;</span>
<span class="x">    </span><span class="cp">{{</span><span class="m m-Attribute">/hasThumbnail</span><span class="cp">}}</span><span class="x"></span>
<span class="x">    &lt;a dir=&quot;</span><span class="cp">{{</span><span class="nv">languageDirection</span><span class="cp">}}</span><span class="x">&quot; lang=&quot;</span><span class="cp">{{</span><span class="nv">languageCode</span><span class="cp">}}</span><span class="x">&quot; class=&quot;mwe-popups-extract&quot; href=&quot;</span><span class="cp">{{</span><span class="nv">url</span><span class="cp">}}</span><span class="x">&quot;&gt;&lt;/a&gt;</span>
<span class="x">    &lt;footer&gt;</span>
<span class="x">      &lt;a class=&quot;mwe-popups-settings-icon mw-ui-icon mw-ui-icon-element mw-ui-icon-popups-settings&quot;&gt;&lt;/a&gt;</span>
<span class="x">    &lt;/footer&gt;</span>
<span class="x">  &lt;/div&gt;</span>
<span class="x">&lt;/div&gt;</span></pre></div>

<p><a href="https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Popups/+/4281670/src/ui/templates/pagePreview.js" class="remarkup-link remarkup-link-ext" rel="noreferrer">The ES6 version of the same template</a> explicates its dependencies but must manually escape them. The HTML snippet is quite similar, but a call to <tt class="remarkup-monospaced">trim()</tt> is made so that parsing the result creates only a single text node.</p>

<div class="remarkup-code-block" data-code-lang="js" data-sigil="remarkup-code-block"><pre class="remarkup-code"><span></span><span class="cm">/**</span>
<span class="cm"> * @param {ext.popups.PreviewModel} model</span>
<span class="cm"> * @param {boolean} hasThumbnail</span>
<span class="cm"> * @return {string} HTML string.</span>
<span class="cm"> */</span>
<span class="kr">export</span> <span class="kd">function</span> <span class="nx">renderPagePreview</span><span class="p">(</span>
	<span class="p">{</span> <span class="nx">url</span><span class="p">,</span> <span class="nx">languageCode</span><span class="p">,</span> <span class="nx">languageDirection</span> <span class="p">},</span> <span class="nx">hasThumbnail</span>
<span class="p">)</span> <span class="p">{</span>
	<span class="k">return</span> <span class="sb">`</span>
<span class="sb">		&lt;div class=&#39;mwe-popups&#39; role=&#39;tooltip&#39; aria-hidden&gt;</span>
<span class="sb">			&lt;div class=&#39;mwe-popups-container&#39;&gt;</span>
<span class="sb">				</span><span class="si">${</span><span class="nx">hasThumbnail</span> <span class="o">?</span> <span class="sb">`&lt;a href=&#39;</span><span class="si">${</span><span class="nx">url</span><span class="si">}</span><span class="sb">&#39; class=&#39;mwe-popups-discreet&#39;&gt;&lt;/a&gt;`</span> <span class="o">:</span> <span class="s1">&#39;&#39;</span><span class="si">}</span><span class="sb"></span>
<span class="sb">				&lt;a dir=&#39;</span><span class="si">${</span><span class="nx">languageDirection</span><span class="si">}</span><span class="sb">&#39; lang=&#39;</span><span class="si">${</span><span class="nx">languageCode</span><span class="si">}</span><span class="sb">&#39; class=&#39;mwe-popups-extract&#39; href=&#39;</span><span class="si">${</span><span class="nx">url</span><span class="si">}</span><span class="sb">&#39;&gt;&lt;/a&gt;</span>
<span class="sb">				&lt;footer&gt;</span>
<span class="sb">					&lt;a class=&#39;mwe-popups-settings-icon mw-ui-icon mw-ui-icon-element mw-ui-icon-popups-settings&#39;&gt;&lt;/a&gt;</span>
<span class="sb">				&lt;/footer&gt;</span>
<span class="sb">			&lt;/div&gt;</span>
<span class="sb">		&lt;/div&gt;</span>
<span class="sb">	`</span><span class="p">.</span><span class="nx">trim</span><span class="p">();</span>
<span class="p">}</span></pre></div></div></content></entry><entry><title>Beacons</title><link href="/phame/live/9/post/61/beacons/" /><id>https://phabricator.wikimedia.org/phame/post/view/61/</id><author><name>phuedx (Sam Smith)</name></author><published>2017-08-16T12:58:16+00:00</published><updated>2023-10-05T17:42:24+00:00</updated><content type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml"><p>The Reading Web team recently discovered <a href="https://bugzilla.mozilla.org/show_bug.cgi?id=1379762" class="remarkup-link remarkup-link-ext" rel="noreferrer">a bug in Firefox</a> wherein a load event is fired when Firefox loads certain pages from <a href="https://developer.mozilla.org/en-US/Firefox/Releases/1.5/Using_Firefox_1.5_caching" class="remarkup-link remarkup-link-ext" rel="noreferrer">its Back-Forward Cache (BFCache)</a>. To JavaScript on those pages, this event is a second load event (the first having been fired before the user navigated away from the page). This proved to be problematic for the cornerstone of our instrumentation, <a href="https://www.mediawiki.org/wiki/Extension:EventLogging" class="remarkup-link remarkup-link-ext" rel="noreferrer">the EventLogging extension</a>, and delayed the deployment of Page Previews by approximately three months.</p>

<h3 class="remarkup-header">Background</h3>

<p>The Page Previews instrumentation revolves around the notion of a link interaction. Every time the user hovers over a link with their mouse or focuses on a link with their keyboard, a new interaction begins. Every link interaction has a unique identifier (herein, a “token”).</p>

<p>The token is a 64-bit integer in hexadecimal format that’s generated using <a href="https://www.w3.org/TR/WebCryptoAPI/#Crypto-method-getRandomValues" class="remarkup-link remarkup-link-ext" rel="noreferrer">crypto#getRandomValues()</a>, which <em>should</em> use a <a href="https://www.w3.org/TR/WebCryptoAPI/#Crypto-description" class="remarkup-link remarkup-link-ext" rel="noreferrer">“well-established cryptographic PRNG seeded with high-quality entropy”</a>. If this is the case, then the probability of a token collision should only approach 50% once over 4 billion tokens have been generated (the birthday bound for 64-bit values).</p>

<p>On Monday, 27th March 2017, <a href="https://phabricator.wikimedia.org/p/Tbayer/" class="phui-tag-view phui-tag-type-person " data-sigil="hovercard" data-meta="0_53"><span class="phui-tag-core phui-tag-color-person"><span class="phui-tag-dot phui-tag-color-grey"></span>@Tbayer</span></a> <a href="https://phabricator.wikimedia.org/T161769" class="remarkup-link" rel="noreferrer">reported an unusually high number of events with duplicate tokens (&quot;duplicate events&quot;) being inserted into the EventLogging MySQL table</a>. Naturally, we assumed that this was being caused by one or more bugs in the instrumentation. During the following two sprints we tracked down and fixed what we thought was all of them. We even went so far as to <a href="https://phabricator.wikimedia.org/T163198" class="remarkup-link" rel="noreferrer">instrument the instrumentation</a> so that we could be confident that the fixes that were deployed weren’t causing more duplicate events to be logged. However, while our instrumentation was assuring us that Page Previews wasn&#039;t generating duplicate events, the number of events with duplicate tokens wasn&#039;t affected.</p>

<p>While we were tracking down and fixing bugs, <a href="https://phabricator.wikimedia.org/T161769#3195922" class="remarkup-link" rel="noreferrer">@Tbayer investigated further and reported that Firefox v51 and v52 were sending circa 98% of the duplicate events</a>. He also noted that the distribution of operating systems sending duplicate events didn&#039;t deviate from the general distribution for pageviews. The Readers Web engineers tried to reproduce the issue consistently in Firefox, but to no avail.</p>

<h3 class="remarkup-header">The Discovery</h3>

<p><a href="https://phabricator.wikimedia.org/p/Jdlrobson/" class="phui-tag-view phui-tag-type-person " data-sigil="hovercard" data-meta="0_54"><span class="phui-tag-core phui-tag-color-person">@Jdlrobson</span></a>&#039;s aha! moment was when he noticed that a lot of the duplicate events were being logged on pages with dense clusters of links, which the user might be clicking accidentally and immediately navigating back. Immediately after trying this, he saw duplicate events being logged by Firefox v54 and raised <a href="https://phabricator.wikimedia.org/T170018" class="remarkup-link" rel="noreferrer">a thorough – and startling – bug report</a>.</p>

<p>Under certain conditions, Firefox will serialize an entire page, including the state of the JavaScript VM, to memory in order to make navigating backward and forward between pages very fast. This feature is often referred to as the &quot;Back-Forward Cache&quot; (the BFCache). <a href="https://phabricator.wikimedia.org/p/Jdlrobson/" class="phui-tag-view phui-tag-type-person " data-sigil="hovercard" data-meta="0_55"><span class="phui-tag-core phui-tag-color-person">@Jdlrobson</span></a> discovered that when Firefox loads a page from the BFCache, it fires the <tt class="remarkup-monospaced">load</tt> event again. From the point of view of the JavaScript on that page, this is a second <tt class="remarkup-monospaced">load</tt> event, the first having fired before the page was serialized prior to navigation.</p>
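<p>Browsers expose a way to tell a fresh load apart from a BFCache restore: the <tt class="remarkup-monospaced">pageshow</tt> event fires in both cases, but its <tt class="remarkup-monospaced">persisted</tt> flag is true only on a restore. A minimal sketch, with illustrative names:</p>

```javascript
// Sketch: distinguishing a fresh page load from a BFCache restore.
// `event.persisted` is true only when the page was resumed from the
// back-forward cache. Names are illustrative.
function classifyPageShow(event) {
  return event.persisted ? 'bfcache-restore' : 'initial-load';
}

// Browser wiring (not runnable outside a browser):
// window.addEventListener('pageshow', (e) => {
//   if (classifyPageShow(e) === 'bfcache-restore') {
//     // Skip one-time setup such as re-registering analytics subscribers.
//   }
// });
```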

<p><a href="https://github.com/wikimedia/mediawiki-extensions-EventLogging/blob/28240462/modules/ext.eventLogging.subscriber.js" class="remarkup-link remarkup-link-ext" rel="noreferrer">The EventLogging protocol</a> subscribes to the <tt class="remarkup-monospaced">event.*</tt> topic after the document and its sub-resources have loaded so that logging events doesn&#039;t consume resources before or during page rendering. So, when Firefox resumed a page from the BFCache, the second <tt class="remarkup-monospaced">load</tt> event caused the subscriber to be registered again, and exact duplicates of previously logged events were sent.</p>

<p>The strict conditions that the BFCache requires and the user having to navigate backward or forward to a serialized page to trigger the issue very neatly explain the wild peaks and troughs in the daily % of Popups events with duplicate tokens that we were seeing.</p>

<h3 class="remarkup-header">Workaround</h3>

<p>Both <a href="https://phabricator.wikimedia.org/p/Jdlrobson/" class="phui-tag-view phui-tag-type-person " data-sigil="hovercard" data-meta="0_56"><span class="phui-tag-core phui-tag-color-person">@Jdlrobson</span></a> and I suggested a workaround.</p>

<p>While I was trying to reproduce the bug and understand how it affected the EventLogging codebase, I noticed that the <tt class="remarkup-monospaced">DOMContentLoaded</tt> event was behaving correctly and only firing once. I suggested that EventLogging register the subscriber on that event instead. However, the subscriber is meant to be registered as late as possible so as not to impact page load time on resource-constrained devices.</p>

<p><a href="https://phabricator.wikimedia.org/p/Jdlrobson/" class="phui-tag-view phui-tag-type-person " data-sigil="hovercard" data-meta="0_57"><span class="phui-tag-core phui-tag-color-person">@Jdlrobson</span></a>&#039;s suggestion, on the other hand, was delightfully simple: make EventLogging register the subscriber exactly once. <a href="https://gerrit.wikimedia.org/r/#/c/363957/" class="remarkup-link remarkup-link-ext" rel="noreferrer">The workaround itself</a> was a one-character change, which, as they often do, required a much larger comment to provide much-needed context.</p>
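<p>The patch itself isn't reproduced here; as a sketch only, the same exactly-once behaviour can be expressed with an explicit guard flag (names are illustrative, and this is not the actual one-character change):</p>

```javascript
// Sketch: registering the EventLogging subscriber exactly once, even if the
// `load` event fires a second time after a BFCache restore. This flag-based
// version illustrates the idea; it is not the actual one-character patch.
let subscribed = false;

function subscribeOnce(register) {
  if (subscribed) {
    // A second `load` (e.g. after a BFCache restore) must not re-register.
    return false;
  }
  subscribed = true;
  register();
  return true;
}
```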

<p><div class="phabricator-remarkup-embed-layout-center"><a href="https://phab.wmfusercontent.org/file/data/gbi7ez66mkmayuz4czjl/PHID-FILE-lws6ttxyxyotaoua574f/Screen_Shot_2017-08-11_at_10.47.32.png" class="phabricator-remarkup-embed-image-full" data-sigil="lightboxable" data-meta="0_52"><img src="https://phab.wmfusercontent.org/file/data/gbi7ez66mkmayuz4czjl/PHID-FILE-lws6ttxyxyotaoua574f/Screen_Shot_2017-08-11_at_10.47.32.png" height="375" width="604" loading="lazy" alt="Screen Shot 2017-08-11 at 10.47.32.png (375×604 px, 30 KB)" /></a></div></p>

<p>Just under an hour after the change was deployed, <a href="https://phabricator.wikimedia.org/T170018#3429243" class="remarkup-link" rel="noreferrer">the number of duplicate events per day dropped from between 10-30% to roughly 0.08%</a>. This new rate is consistent with the background noise levels of duplication in our other much simpler instrumentation, e.g. <a href="https://meta.wikimedia.org/wiki/Schema:ReadingDepth" class="remarkup-link remarkup-link-ext" rel="noreferrer">ReadingDepth</a> and <a href="https://meta.wikimedia.org/wiki/Schema:RelatedArticles" class="remarkup-link remarkup-link-ext" rel="noreferrer">RelatedArticles</a>.</p>

<h3 class="remarkup-header">What We Learned</h3>

<p>This issue took us a little over three months to track down and fix, so we had <em>a lot</em> of time to reflect on what we could&#039;ve done better.</p>

<h4 class="remarkup-header">How Are We Doing?</h4>

<p>The Page Previews instrumentation is complex. The complexity of the instrumentation is proportional to the number of general questions we&#039;re asking about the feature. This complexity meant that implementing, testing, and QA&#039;ing the instrumentation all took considerable time. Since our initial hypothesis was that the issue(s) were in the instrumentation itself, we also spent comparable amounts of time trying to re-verify that the instrumentation was working correctly.</p>

<p>Had we answered only one or two questions at a time – i.e. collected less data – we might have saved ourselves some time, as it would&#039;ve been simpler to test our hypotheses. As software engineers, we regularly sacrifice velocity for confidence in our implementation; I don&#039;t see how this is any different.</p>

<h4 class="remarkup-header">QA Without A Test Plan</h4>

<p><a href="https://phabricator.wikimedia.org/project/view/2739/" class="remarkup-link" rel="noreferrer">The Readers Web kanban board</a> has a <strong>Needs QA</strong> column. For better or worse, technical tasks, like implementing the instrumentation, tend to skip this column, as QA tends to be done at the browser level. Moreover, the Readers Web engineers hadn&#039;t set an expectation that QA would be done as part of code review, and when it was, no test plan was created before or after merging the code.</p>

<p>This situation has since greatly improved as a response to a variety of problems. We&#039;ve agreed that <em>all</em> tasks should move from the <strong>Needs Code Review</strong> column to <strong>Needs QA</strong> after all of the associated changes have been merged into the codebase – if a task is exceptional, then it must be documented. We&#039;ve also agreed that before a task can be moved into the <strong>Needs QA</strong> column it must have a test plan in its description.</p>

<p>We&#039;ve yet to talk about setting expectations around planning, executing, and documenting QA as part of code review as a team. When we do, I&#039;m sure that we&#039;ll write about it.</p>

<h4 class="remarkup-header">Integration Testing</h4>

<p>One of the goals of the Page Previews architecture and a driving force behind the design of new changes is testability. The extension is remarkably well covered by its unit tests.</p>

<p>The instrumentation is no exception and is 100% covered by unit tests. However, the focus of the unit tests was and still is the correctness of the properties of the events produced by the system, which is a result of misrepresenting events as POJOs rather than <a href="https://martinfowler.com/bliki/ValueObject.html" class="remarkup-link remarkup-link-ext" rel="noreferrer">value objects</a> or types. So while the vanity metric of coverage is maximized, we lack proof that key invariants hold.</p>
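<p>To make the distinction concrete, here is a minimal sketch, with illustrative names, of an event token modeled as a value object whose invariant is enforced at construction time, rather than as a bare string field on a POJO:</p>

```javascript
// Sketch: a token as a value object. The invariant (16 lowercase hex
// characters, i.e. a 64-bit value) is checked once, at construction, so no
// downstream code can observe a malformed token. Names are illustrative.
function createToken(hex) {
  if (!/^[0-9a-f]{16}$/.test(hex)) {
    throw new TypeError('Token must be a 64-bit value in hexadecimal');
  }
  // Freeze the object so the token is immutable, like a true value object.
  return Object.freeze({
    toString: () => hex,
    // Value objects compare by value, not by reference.
    equals: (other) => String(other) === hex
  });
}
```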

<p>Focusing on higher-level integration tests, <a href="https://en.wikipedia.org/wiki/Fuzzing" class="remarkup-link remarkup-link-ext" rel="noreferrer">fuzzing</a>, and <a href="https://en.wikipedia.org/wiki/Mutation_testing" class="remarkup-link remarkup-link-ext" rel="noreferrer">mutation testing</a> to prove the instrumentation correct would have allowed us to reject our initial hypothesis – that the bug was in the instrumentation – immediately. On the other hand, the suite of unit tests will give us confidence when refactoring the system to allow for these changes.</p>

<h3 class="remarkup-header">Notes</h3>

<ol class="remarkup-list">
<li class="remarkup-list-item">The title of this post comes from both <a href="https://www.w3.org/TR/beacon/" class="remarkup-link remarkup-link-ext" rel="noreferrer">the Beacon API</a>, which is used to log events for the Page Previews instrumentation, and <a href="http://cloudkickermusic.com/album/beacons" class="remarkup-link remarkup-link-ext" rel="noreferrer">Cloudkicker’s Beacons album</a>, which is pretty darn rad.</li>
<li class="remarkup-list-item"><a href="http://cloudkickermusic.com/album/the-discovery" class="remarkup-link remarkup-link-ext" rel="noreferrer">The Discovery</a>, another Cloudkicker album, is also equally rad 🤘</li>
</ol></div></content></entry></feed>