An average user stays on your site only if the page loads in under 1000ms. As of 2014, the average page size is 1.9MB. See the chart below for more stats.

(Source: httparchive)
The core content of a site should render within 1000ms. If it doesn't, the user may never come back to your site again. By improving page load time, many popular companies increased their revenue and downloads tremendously.
- Walmart grew revenue for every 100ms of improvement to their site.
- Yahoo's traffic increased after they sped up their pages.
- Mozilla sped up their site and got 160 million more Firefox downloads per year.
- Set a performance budget.
- Measure the current performance.
- Find the problem causing the performance issue.
- And finally, yay, optimize it.
There are several ways to optimize your site. Let's walk through them.
Speed Index is the average time at which the visible parts of a page are painted in the browser. It is expressed in milliseconds and depends on the viewport size. See the image below (video frames showing the page loading second by second).

The lower the Speed Index score, the better.
Speed Index can be measured using WebPageTest (maintained by Google).

WebPageTest has lots of features, like running multiple tests from different locations using different browsers. It can also measure other metrics like load time, number of DOM elements, time to first byte, etc.
E.g.: check out Amazon's measured result here using WebPageTest.
Watch the video below by Patrick Meenan to learn more about WebPageTest.
If you know how a browser works, then you know how HTML, CSS, and JS are parsed by the browser and which of them block the rendering of the page. If you don't, see the simple diagram below.
- First, the browser parses the HTML markup to construct the DOM tree (DOM = Document Object Model).
- Then it parses the CSS to construct the CSSOM tree (CSSOM = CSS Object Model).
- Before the DOM and CSSOM trees are combined to construct the render tree, JS files are parsed and executed.
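The steps above map directly onto the pieces of a minimal page. A sketch (`styles.css` and `app.js` are placeholder names, not from the post):

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- parsed into the CSSOM tree -->
    <link rel="stylesheet" href="styles.css">
  </head>
  <body>
    <!-- parsed into the DOM tree -->
    <p>Hello</p>
    <!-- parsed and executed before the render tree is built -->
    <script src="app.js"></script>
  </body>
</html>
```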
Now that you understand how parsing is done in a browser, let's see what blocks the construction of the render tree.
CSS is treated as render blocking. To construct the CSSOM, all CSS files are downloaded, regardless of whether they are used on the current page or not.
To solve this render blocking, go through the steps below:

- Inline the critical CSS, that is, the most important styles used by the page above the fold, in a `<style>` tag inside `<head>`.
- Remove the unused CSS.
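One common pattern for the first step looks like this (a sketch; `styles.css` is a placeholder name, and the `media="print"` trick is one widely used way to load non-critical CSS without blocking render):

```html
<head>
  <!-- critical, above-the-fold styles inlined -->
  <style>
    .hero { font: 16px/1.4 sans-serif; color: #222; }
  </style>
  <!-- non-critical stylesheet loaded without blocking render:
       applied only once it finishes downloading -->
  <link rel="stylesheet" href="styles.css" media="print"
        onload="this.media='all'">
</head>
```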
So how do I find unused CSS?
- Use PageSpeed Insights to get stats like unused CSS, render-blocking CSS and JS files, etc. E.g.: Flipkart's PageSpeed Insights result here.
- Use CSS Stats to get the total number of elements used, the number of unique styles, fonts, etc.
- PageSpeed Insights Chrome extension.
- Tag Counter Chrome extension.
When the parser encounters a `<script>` tag in the HTML markup, parsing is stopped. Only after the script is executed does HTML parsing continue. So JS blocks the rendering of the page.

To solve this, use the `async` or `defer` attribute on your `<script>` tags:
- `<script async>` will download the file during HTML parsing and execute it as soon as the file is downloaded.
- `<script defer>` will download the file during HTML parsing and execute it after HTML parsing is completed.
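Side by side, the two attributes look like this (a sketch; the file names are placeholders):

```html
<!-- async: downloads during HTML parsing, executes as soon as it
     arrives; execution order across scripts is not guaranteed -->
<script async src="analytics.js"></script>

<!-- defer: downloads during HTML parsing, executes after parsing
     completes, in document order -->
<script defer src="app.js"></script>
```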
Both `async` and `defer` are used in the Google Analytics snippet.
Memory leaks and bloat are among the problems faced by web developers. Let's see how to find a memory leak and then solve it.
- Use the Chrome Task Manager to check the memory used by the app as well as the JS memory (total + live memory). If the memory keeps growing on each action, you can suspect there is a memory leak.
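As a sketch of the kind of growth the Task Manager exposes (the names here are illustrative, not from the post): a module-level cache that grows on every action and is never pruned.

```javascript
// Hypothetical leak: a cache that grows on every user action and is
// never pruned, so JS memory climbs with each action.
const actionCache = [];

function onAction() {
  // each call keeps another large array alive forever
  actionCache.push(new Array(10000).fill('*'));
}

onAction();
onAction();
console.log(actionCache.length); // 2 -- and it keeps growing
```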
- Use the Heap Profiler to find memory leaks. Open Chrome DevTools, go to the Profiles tab, and select "Take Heap Snapshot".
If you don't know about Chrome DevTools, read my previous post.
- Summary view: shows the total number of objects allocated and their instances, with Shallow Size (the memory held by the object itself) and Retained Size (the memory that will be freed once the object is deleted, including the objects it alone keeps reachable).
- Comparison view: compares two or more snapshots taken before and after an operation to check for a memory leak.
- Containment view: shows an overall view of your app's object structure, including DOMWindow objects (i.e. global objects), GC roots, and native objects (from the browser).
- Dominators view: shows the dominators tree of the heap graph.
Read more in detail about Heap profiler.
Detached DOM elements that are still referenced from JavaScript cause DOM leaks and prevent the automatic garbage collection (GC) process.

Let's see an example:
```html
<div id="container">
  <h1 id="heading">I am just a heading nothing much</h1>
</div>
```
```js
var parentEle = document.getElementById('container'); // get parent element reference
var headingEle = document.getElementById('heading');  // get child element reference

parentEle.remove(); // removes the parent element from the DOM,
// but a reference to its child still exists, so parentEle won't be
// GC'd and this causes a DOM leak
```
Let's fix this DOM leak by setting the child reference to `null`:

```js
headingEle = null; // now parentEle can be GC'd
```
The above are common problems faced by web developers. That's all for today. If you like my post, share it, or if you have a doubt, comment below. Thanks!!