HTTP/2 from scratch

What is HTTP/2?

HTTP/2 is a replacement for how HTTP is expressed “on the wire.” It is not a ground-up rewrite of the protocol; HTTP methods, status codes and semantics are the same, and it should be possible to use the same APIs as HTTP/1.x (possibly with some small additions) to represent the protocol.

The focus of the protocol is on performance; specifically, end-user perceived latency, network and server resource usage. One major goal is to allow the use of a single connection from browsers to a Web site.

It also transfers data in a binary format instead of text.

The basis of the work was SPDY, but HTTP/2 has evolved to take the community’s input into account, incorporating several improvements in the process.

Please check my GitHub project for a clearer picture.

What are the key differences to HTTP/1.x?

At a high level, HTTP/2:

  • is binary, instead of textual
  • is fully multiplexed, instead of ordered and blocking
  • can therefore use one connection for parallelism
  • uses header compression to reduce overhead
  • allows servers to “push” responses proactively into client caches


HTTP/2 consists of two specifications:

  • Hypertext Transfer Protocol version 2 – RFC7540
  • HPACK – Header Compression for HTTP/2 – RFC7541

For more details:

Performance and optimization?

Have you ever noticed that when you surf the web, some sites load much more quickly than others? Performance is how fast a particular web page loads in your browser. A site that loads quickly has good performance; a site that makes you stare at a white screen for a long time has bad performance. The performance of a website depends on several factors.


Studies have shown that visitors will abandon a site in as little as three seconds if it does not load properly, which is very bad for business.

As of July 2016, the average size of a web page is 2.4 megabytes. That means every time you visit a link, you are likely downloading 2.4 megabytes of content, of which only a small fraction actually matters.

Every bit of data costs money, so serving up a bloated website wastes the company's money. If the average size of your web pages is 2.4 megabytes and you get a thousand visitors, you just pushed 2.4 gigabytes of data through the web. And if each of those visitors loads a couple of pages, that number is multiplied.


Most web hosts operate with bandwidth tiers, and once you exceed them, you pay a premium.

Performance also matters for discoverability: search engines such as Google factor page speed into how they rank your site, and better rankings can increase your company's revenue. So, if you want your site to be found on search engines, making sure it is optimized for performance is an important step.

So, what is performance? It is how fast and effectively your site loads in the visitor's browser. To build a website with great performance, we need to optimize everything within it: from the images, to the code and other elements, to how they are handled on the server, delivered through the network, and processed by the browser.


Where can we optimize?

Concatenating CSS and JS files into big bundles is no longer necessary with HTTP/2 (minification still helps, as we will see later). Images need to be processed well, reduced in size, and served responsively. Server push will take care of priming the cache, etc.

Before you start up and get into the project..

Go through PostCSS and the npm package manager first.

Let's see how the DOM gets loaded sequentially when we have assets like HTML, CSS, images, and JavaScript files. The browser starts with the HTML and then loads the CSS; if there are multiple CSS files, they are loaded one after the other, and the same goes for images and JavaScript files. The DOM loads when the browser requests the page and its assets from the web server using the HTTP/1.1 protocol. The following image shows how HTTP/1.1 loads the DOM and all its assets.


HTTP/1.1 uses up to six TCP connections per host, loading assets one after another, as shown below.


Now let's see how HTTP/2 handles the same..

But first, let's look at the requirements for switching to the HTTP/2 protocol:


If any of the above requirements is not met, the browser falls back to the HTTP/1.1 protocol without breaking anything. In practice, browsers support HTTP/2 only over HTTPS, not plain HTTP (you can generate a certificate for free with OpenSSL). HTTP/2 evolved from SPDY, Google's experimental protocol, to deliver better performance.

Let's see how HTTP/2 handles requests..


Using HTTP/2, the browser can request and receive many different files at the same time; it doesn't have to wait for one file to finish before starting the download of the next one.

How to measure performance?

The most familiar way to check performance is the Network panel in the browser's DevTools, as shown below.


HTTP/1.1 (left) vs HTTP/2 (right)

You can also check the performance of your website using the following tools:

Optimizing Images:

Images are often what drags down the performance of a site: on average, image loading accounts for about 70% of the DOM assets downloaded.

Common image formats

GIF – small in size but limited in colors; suitable only for simple graphics and animations

JPEG – universal support, progressive loading, lossy compression, and relatively small file size

PNG – lossless compression and transparency support; a complex PNG is larger in file size than the equivalent JPEG.

When to use PNG? For computer-generated graphics and images that need transparency.


SVG – vector format

  • code-based, rendered in the browser
  • can be styled and manipulated using CSS and JavaScript
  • scales to any size or resolution
  • not universally supported; requires a PNG fallback


You can also optimize manually using Photoshop, for example by blurring unimportant areas so they compress better.


Let's get our hands on the code now..

You can check my project at this link and just clone it.

Once you've cloned it, run npm install and then gulp --verbose to run the project.

Did you know that you can optimize your images using gulp? Seriously, I only just found out myself ;).

Try the gulp-image or gulp-imagemin modules to automate image optimization.

Open the project in cmd/terminal and install gulp-image:

npm install gulp-image --save-dev

Now go to the gulpfile and add the following task.
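A sketch of what that task might look like (gulp 3 style, matching the era of this post; the development/production paths are assumptions from my project layout):

```javascript
// Image-optimization task using gulp-image.
var gulp  = require('gulp');
var image = require('gulp-image');

gulp.task('imageoptim', function () {
  return gulp.src('development/images/**/*')   // all source images
    .pipe(image())                             // compress png/jpg/gif/svg
    .pipe(gulp.dest('production/images'));     // write compressed copies
});
```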


Now run the task "gulp imageoptim" in cmd, and you will see all the images and their folders in the production folder in compressed form.

Code Optimization for HTTP/2..


Automated Minification of HTML, JS and CSS:

We will use gulp-htmlmin to minify HTML:

npm install gulp-htmlmin --save-dev

Include it in the gulpfile like this:

htmlmin = require('gulp-htmlmin')
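A sketch of such an html task (gulp 3 style; paths are assumptions from my layout, and collapseWhitespace is what produces the one-line output):

```javascript
// HTML minification task using gulp-htmlmin.
var gulp    = require('gulp');
var htmlmin = require('gulp-htmlmin');

gulp.task('html', function () {
  return gulp.src('development/*.html')
    .pipe(htmlmin({ collapseWhitespace: true }))  // strip whitespace/newlines
    .pipe(gulp.dest('production'));
});
```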


Write the above code in the gulpfile as the html task, run "gulp html" in the command prompt, and verify that index.html in the production folder is now a single line of HTML.

gulp-minify – minify JavaScript

Do the same thing as before, using .pipe(minify()), and run "gulp javascript", which creates new minified files like flexslider-min.js. Now reference these files in index.html so the page loads faster.
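A sketch of the javascript task (gulp 3 style; paths are assumptions from my layout):

```javascript
// JS minification task using gulp-minify; by default it emits
// *-min.js files alongside the originals (e.g. flexslider-min.js).
var gulp   = require('gulp');
var minify = require('gulp-minify');

gulp.task('javascript', function () {
  return gulp.src('development/JS/**/*.js')
    .pipe(minify())
    .pipe(gulp.dest('production/JS'));
});
```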

cssnano – minify CSS

cssnano = require('cssnano')

Go to the gulp css task and write the code as shown below; finally, run gulp to execute all the tasks (html, javascript, css) and see how much faster your site loads in the browser.
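Since cssnano is a PostCSS plugin (the post mentions PostCSS earlier), a sketch of the css task might run it through gulp-postcss; paths are assumptions from my layout:

```javascript
// CSS minification task: cssnano run through gulp-postcss.
var gulp    = require('gulp');
var postcss = require('gulp-postcss');
var cssnano = require('cssnano');

gulp.task('css', function () {
  return gulp.src('development/CSS/**/*.css')
    .pipe(postcss([cssnano()]))          // minify with cssnano
    .pipe(gulp.dest('production/CSS'));
});
```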


Modularize CSS for HTTP/2

Instead of loading one gigantic CSS file, we split it into modules and load only the modules a page needs; when a module is updated, only that module has to be re-downloaded.

I merged a few CSS files and referenced them individually in index.html. Don't panic about the multiple CSS loads; we will load only the necessary CSS files over HTTP/2. For more info, go to my GitHub project and check my "modularize css" commit.

Deferring Non-Critical CSS

What is defer? What is async?

Check here

There is an established way to defer or async JS files, but for CSS files there is no equally simple mechanism. Hence we do it as shown below.

In our scenario, we remove the stories and footer CSS from the head of index.html and place them at the bottom of index.html, just before the closing </body> tag, with a custom script that loads the deferred styles. For more, check my GitHub commit.
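A minimal sketch of such a deferral script (the loadCSS-style pattern; the file paths are assumptions from my layout):

```html
<!-- At the bottom of index.html, just before </body>. -->
<script>
  // Create <link> elements after the page has rendered, so these
  // stylesheets no longer block the first paint.
  ['CSS/stories.css', 'CSS/footer.css'].forEach(function (href) {
    var link = document.createElement('link');
    link.rel = 'stylesheet';
    link.href = href;
    document.body.appendChild(link);
  });
</script>
```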

Can we load <link> stylesheets in the body for better performance? Yes, with HTTP/2 support. Normally, DOM rendering stops when the parser encounters CSS files, until all the CSS has finished loading. This behavior changes with HTTP/2. For more, see here.

JavaScript loading using defer and async

First of all, HTTP/2 makes JavaScript concatenation pretty much unnecessary. You'll remember that when a page loads over HTTP/1, any time the browser encounters a JavaScript file, it stops rendering the page, downloads the JavaScript file, and then resumes rendering the page.

Let's see how JavaScript loads by default..


When the browser parses the page, it starts with the HTML and starts parsing it until it encounters the call for the JavaScript. Then it stops parsing the HTML while it downloads and then executes the JavaScript. Only after both the download and execution are complete does it continue parsing the HTML. And here you see why we usually place our JavaScripts at the bottom of the page, so we’re not blocking the HTML parsing until the JavaScript kicks in.


If we append the async attribute to our script call, the way the browser parses the content changes. Now the browser will continue to parse the HTML while the JavaScript is downloading, and the only time the parsing stops is when the JavaScript is executed.

So this improves the performance of the page quite dramatically. The problem with async is that you don't really have control over when the JavaScript will be executed. That means if you have a bunch of different scripts all called in asynchronously, each one starts executing as soon as it finishes downloading, and if you have dependencies, a file may execute before its dependency has loaded into the page, and that will cause problems.


The defer attribute kinda solves some of this, and it does it by completely deferring the execution of a JavaScript until everything else has happened.

So, we parse all the HTML, the JavaScript is downloaded whenever it’s encountered, and the execution only happens once the entire HTML is fully parsed. The other thing to know about defer is, when you use defer, the execution of the JavaScript will happen in the order the deferred elements are listed on the page. That means, if you list a bunch of different elements with defer, they will execute from the top down.


In the above image, we used async for four scripts (which can load in parallel while the DOM renders), defer for one script (which needs to load last, after the DOM renders), and no async/defer for the jQuery script.
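In markup, that pattern looks roughly like this (file names here are illustrative assumptions, not the exact scripts from the image):

```html
<!-- No async/defer: jQuery is a dependency, so it must load and run first. -->
<script src="JS/libs/jquery.min.js"></script>
<!-- async: independent scripts download in parallel and run when ready. -->
<script src="JS/analytics.js" async></script>
<script src="JS/social.js" async></script>
<!-- defer: runs only after the whole HTML document is parsed. -->
<script src="JS/main.js" defer></script>
```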


Compress data using GZIP

This depends on your hosting server; you may use Apache, nginx, etc. On Apache, you configure this in the .htaccess file, as explained below. Check this for other servers and their configurations.
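A sketch of such an .htaccess fragment for Apache's mod_deflate (adjust the MIME types to what your site actually serves):

```apacheconf
# Enable gzip compression for text-based assets.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
  AddOutputFilterByType DEFLATE application/json image/svg+xml
</IfModule>
```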

Cache files in browser

You can cache files (CSS, JS, and any image type) and set a limit on their expiration.
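A sketch of such a caching policy using Apache's mod_expires in .htaccess (CSS/JS for one year, images for one month, matching the setup described in this section):

```apacheconf
# Set browser cache lifetimes per MIME type.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 year"
  ExpiresByType application/javascript "access plus 1 year"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
</IfModule>
```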


In the above image we can see that we cached static assets like CSS and JS for 1 year and images for 1 month. Now it may strike you: how will my updated site load if I cache for 1 year? You may update your site with new images, a new look via CSS, new modules, etc. So we have a way to reload specific files, called cache-busting.



We literally force the browser to download new files, and we can do this in a really smart way by simply renaming the files any time we change them.

On the face of it, that sounds really complicated, because we don't want to rename style.css to style-v1.css, style-v2.css, and so on, but we can use something called a file hash to do this. A file hash is a unique value generated from the contents of a file. That means any time you change a file, the file hash changes as well, the file gets a new name, and the browser fetches it fresh. There are simple tools to achieve this functionality.


Now that you know about file hashes: we name each file with a hash suffix, and it gets cached in the browser. If you make changes to style-123456.css, it gets renamed; when the browser hits the server, it sees a new file, which is fetched and then cached for 1 year, as shown below.


We have tools to achieve this functionality:

gulp-rev – This tool automatically appends hashes to the end of file names.

gulp-rev-replace – which goes into index.html and any other HTML files, or other files, finds references to the files that were just hashed and changes them so that we don’t have to manually change these files.

rev-del – goes through the manifest, figures out which files are the current versions, and deletes the outdated ones, so we don't end up with a ton of stale hashed versions in storage.

Let's see how to use them in the gulpfile..

Install and include all three modules in the gulpfile. I am creating a folder named 'limbo' as a temp folder: all changed files (HTML, CSS, JS) are moved there before renaming, and from there they are moved to production after renaming (revisioning and deleting old files using gulp-rev-replace and rev-del).
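A sketch of that pipeline (gulp 3 style; the 'limbo' and 'production' folder names follow the text, and the exact globs are assumptions):

```javascript
// Revision pipeline: hash file names, write a manifest, delete stale
// hashed files, and rewrite references in HTML.
var gulp       = require('gulp');
var rev        = require('gulp-rev');
var revReplace = require('gulp-rev-replace');
var revDel     = require('rev-del');

gulp.task('revision', function () {
  return gulp.src(['limbo/**/*.css', 'limbo/**/*.js'])
    .pipe(rev())                              // append content hash to names
    .pipe(gulp.dest('production'))
    .pipe(rev.manifest())                     // map old name -> hashed name
    .pipe(revDel({ dest: 'production' }))     // delete stale hashed files
    .pipe(gulp.dest('production'));
});

gulp.task('revreplace', ['revision'], function () {
  var manifest = gulp.src('production/rev-manifest.json');
  return gulp.src('limbo/**/*.html')
    .pipe(revReplace({ manifest: manifest })) // rewrite references in HTML
    .pipe(gulp.dest('production'));
});
```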

Now run gulp to see the changes shown below: the limbo folder, and hashes added to the file names.


Check rev-manifest.json to find the hash mappings for the changed files…


Now let's test it. Go to development folder → CSS → header.css and change a color value in .masthead, but don't save it immediately. Go to production folder → CSS and note the file name carefully. After you save header.css, it should be replaced, for example from header-123456.css to header-321342.css.

This is how cache-busting gets an updated file into the browser.

To understand more, check my GitHub commit.

Server Push


For server push, you configure the files you want to load in advance in the .htaccess file of your hosting server: set a Link header on the response to push all the files specified in it.
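A sketch of such a configuration using mod_headers (this assumes the server's HTTP/2 module honors Link preload headers for push; the file paths are assumptions from my project layout):

```apacheconf
# Push critical assets when index.html is requested.
<IfModule mod_headers.c>
  <FilesMatch "index\.html$">
    Header add Link "</CSS/header.css>; rel=preload; as=style"
    Header add Link "</JS/libs/jquery.flexslider-min.js>; rel=preload; as=script"
  </FilesMatch>
</IfModule>
```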


You can also do the same server push by renaming index.html to index.php and adding the following code to it.


function push_to_browser($as, $uri) {
    header('Link: ' . $uri . '; rel=preload; as=' . $as, false);
}

$assets = array(
    '<//,400i,700,700i,900,900i>' => 'style',
    '</style-main.css>' => 'style',
    '</CSS/header.css>' => 'style',
    '<//>' => 'script',
    '</JS/libs/jquery.flexslider-min.js>' => 'script',
    '</images/mainpromo/welcome01-1600.jpg>' => 'image'
);

array_walk($assets, 'push_to_browser');


Note: please install PHP on your machine along with the npm module php2html in the gulpfile, and use it in the html task. For more details, check my GitHub commit.

Leverage CDNs for performance


You can also check this

Now let's try adding an SSL certificate and enabling HTTPS..
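One way to get a certificate for local development is a self-signed pair generated with OpenSSL (the ssl/ directory and file names here are assumptions, reused by the gulp-webserver config later):

```shell
# Generate a self-signed certificate and private key for localhost.
mkdir -p ssl
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout ssl/server.key -out ssl/server.crt \
  -days 365 -subj "/CN=localhost"
```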



Now, enable HTTPS in gulp-webserver as shown below..
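A sketch of serving the production folder over HTTPS with gulp-webserver (the cert/key paths are assumptions for a self-signed pair):

```javascript
// Serve the built site over HTTPS using gulp-webserver.
var gulp      = require('gulp');
var webserver = require('gulp-webserver');

gulp.task('serve', function () {
  return gulp.src('production')
    .pipe(webserver({
      https: {
        cert: 'ssl/server.crt',   // self-signed certificate
        key:  'ssl/server.key'    // matching private key
      },
      open: true                  // open the site in the default browser
    }));
});
```

Because the certificate is self-signed, the browser will warn you before loading the site.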


Let's run gulp now to see the following:


Click Proceed and see your project run over HTTPS. For more, check my GitHub commit.

The HTTP/2/SPDY protocol implementation reference was taken from: