Revised and edited blog post on web optimization with node.

Commit cc54d44937 by jrtechs (pull/4/head), 3 changed files with 178 additions
and 139 deletions:

1. BIN entries/web-development/media/3e2e138f85024c1a96ba0ad55bc5d2ed.png (958 x 729, 128 KiB)
2. BIN entries/web-development/media/a6594f978c7925bcf3194a1c97029bd3.png (1230 x 744, 140 KiB)
3. entries/web-development/node-website-optimization.md (+178, -139)

entries/web-development/node-website-optimization.md
====================================================
It is a well-known fact that a fast website is critical to having high user
retention. Google looks favorably upon websites which are well optimized and
fast. If you are using a CMS like WordPress or Wix, a lot of optimization is
done automatically. If you like to build stuff from scratch like me, there is a
ton of work required to optimize a website.

This post will cover the eight things that I did to decrease the load time of
this node blog by two seconds.
#### After Optimization
![Website Speed After Improvements](media/a6594f978c7925bcf3194a1c97029bd3.png)
#### Before Optimization
![Website Speed Before Improvements](media/a6594f978c7925bcf3194a1c97029bd3.png)
1: Optimize Images
------------------
Since images are the largest portion of a website's size, optimizing and
reducing the size of images will greatly decrease load time. In a perfect web
development world, everyone would use SVG images, which are extremely small and
don't need compression. Since most of us use PNG and JPEG images instead, I
wrote a script to automatically optimize them for the web.
```
#!/bin/bash
# Simple script for optimizing all images for a website

# ... (body of the script elided in this diff hunk) ...

        optipng -o7 -preserve "$f"
    done
done
```
When run, this script goes through the `img` and `entries` folders recursively
and optimizes all the images in them. If an image is more than 690px wide, the
script scales it down to save size. In most cases it is useless to have images
with a width greater than 690px because they will just get scaled down by the
client's web browser.
If you are running a Debian-based Linux distro, you can download the
dependencies for this script with the following commands:
```
apt-get install jpegoptim
apt-get install optipng
```
The goal of this script is to get most of the images under 100kb for the web.
It is okay to have a few images above 100kb; however, you should really avoid
having images above 200kb.
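
If you would rather drive the same tools from Node, here is a minimal sketch
that shells out to jpegoptim and optipng; this is an assumption-laden
illustration of the idea, not the post's actual script, and it assumes both
tools are already installed:

```
const { execFileSync } = require('child_process');
const fs = require('fs');
const path = require('path');

// Recursively optimize every JPEG/PNG under a folder,
// mirroring what the bash script above does.
function optimizeFolder(dir)
{
    for(const entry of fs.readdirSync(dir, { withFileTypes: true }))
    {
        const full = path.join(dir, entry.name);

        if(entry.isDirectory())
            optimizeFolder(full);
        else if(/\.jpe?g$/i.test(entry.name))
            execFileSync('jpegoptim', ['--max=90', '--strip-all', full]);
        else if(/\.png$/i.test(entry.name))
            execFileSync('optipng', ['-o7', '-preserve', full]);
    }
}

optimizeFolder('img');
```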
2: Take advantage of Async calls
--------------------------------
One of the largest benefits of Node is its async abilities, where code is
executed in a non-blocking fashion. This can become callback hell if not
handled correctly, but, with good code structure it can become very useful.
When code is executed in parallel, you can decrease run time by doing other
stuff while waiting on costly file IO and database calls.
The problem with async code is that it is hard to coordinate. Node has a lot of
ways to handle synchronization, but, I prefer to use
[Promises](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise).
Here is a simple example where async code can be misused.
Good Async Code:
```
Promise.all([includes.printHeader(),
        require(file).main(filename, request),
        includes.printFooter()]).then(function(content)
{
    // ... (writing the response is elided in this diff hunk) ...
})
.catch(function(err)
{
    console.log(err);
});
```
Bad Async Code:
```
includes.printHeader(res).then(function()
{
    return require(file).main(res, filename, request);
})
// ... (the remaining chained .then() calls are elided in this diff hunk) ...
.catch(function(err)
{
    console.log(err);
})
```
In the first example, three blocks of async code are executed in parallel; in
the second example, three blocks of async code are executed one after another.
Many people may initially write the second version because it seems like you
must create and render the footer after you render the header and body of the
page.
A great way to handle async calls is by having most of your methods return
promises which resolve to the HTML or DB information that they produce. When
you run Promise.all, it returns an array of the results, which enables you to
preserve the order (i.e. header, body, footer). After you do this for all your
code, it creates a "perfect" async tree which runs very fast.
Another Good Async Example:
```
/**
 * Calls posts and sidebar modules to render blog contents in order
 *
 * ... (rest of the doc comment and body elided in this diff hunk) ...
 */
main: function(requestURL)
{
    // ...
}
```
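
To see the whole pattern in one self-contained place, here is a minimal
sketch; renderHeader, renderBody, and renderFooter are hypothetical stand-ins
for the blog's actual modules:

```
// Hypothetical stand-ins for the blog's includes/posts modules
function renderHeader() { return Promise.resolve("<header>...</header>"); }
function renderBody()   { return Promise.resolve("<main>...</main>"); }
function renderFooter() { return Promise.resolve("<footer>...</footer>"); }

// All three promises run in parallel; Promise.all preserves their
// order, so joining the results yields header, body, footer.
Promise.all([renderHeader(), renderBody(), renderFooter()])
    .then(function(content)
    {
        console.log(content.join(""));
    })
    .catch(function(err)
    {
        console.log(err);
    });
```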
3: Client-Side Caching
----------------------
Client-side caching is where the client's web browser stores static content it
downloads from your website. For example, if a client caches a CSS style sheet,
they won't have to download it again for the next page they visit.
You should cache all images, JavaScript, and CSS files since those typically
don't change. It is a good idea to set the expiration date of the cache to
something longer than a week; I typically set mine to a month.
For a web browser to accept and cache files, you must set some tags in the
HTTP header of the response. In the header you must specify the content type
and cache variables like the max age. You also must assign an ETag to the
header to give the client a way to verify the content of the cache. This
enables the client to detect if there was a change to the file and download it
again. Some people set the ETag equal to the version of the stylesheet or
JavaScript, but, it is far easier to just set it equal to the hash of the file.
I use md5 to hash the files since it is fast and I'm not worried about hash
collisions for this application.
You can do this in NGINX if you use it to serve static files, but, you can
also do it directly in Node.
#### Caching CSS
```
var crypto = require('crypto');

var eTag = crypto.createHash('md5').update(content).digest('hex');
result.writeHead(200, {'Content-Type': 'text/css', 'Cache-Control':
                'public, max-age=2678400', 'ETag': '"' + eTag + '"',
                'Vary': 'Accept-Encoding'});
result.write(content);
result.end();
```
#### Caching Images
```
var eTag = crypto.createHash('md5').update(content).digest('hex');
result.writeHead(200, {'Content-Type': 'image/png',
                'Cache-Control': 'public, max-age=2678400',
                'ETag': '"' + eTag + '"'});
result.write(content);
result.end();
```
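
The ETag only pays off if the server also answers conditional requests. Here
is a minimal sketch of handling If-None-Match with a 304, reusing the
`request`, `result`, `content`, and `eTag` variables assumed in the snippets
above:

```
// If the client already holds a fresh copy, skip the body entirely.
var clientTag = request.headers['if-none-match'];

if(clientTag === '"' + eTag + '"')
{
    result.writeHead(304); // not modified; the cached copy is still valid
    result.end();
}
else
{
    result.writeHead(200, {'Content-Type': 'text/css',
                    'Cache-Control': 'public, max-age=2678400',
                    'ETag': '"' + eTag + '"'});
    result.write(content);
    result.end();
}
```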
4: Server-Side Caching
----------------------
Even with the best async server, there are still ways to improve performance.
If you cache all the static pages that you generate in a HashMap, you can
quickly access them for the next web user without ever having to query the
database or read files.
#### Ex:
```
const cache = require('memory-cache');

var html = cache.get(filename);

if(html == null)
{
    // Generate page contents
    Promise.all([includes.printHeader(),
            require(file).main(filename, request),
            includes.printFooter()]).then(function(content)
    {
        // ... (assembling, caching, and sending the page is elided
        // in this diff hunk) ...
    });
}
else
{
    res.write(html);
    res.end();
}
```
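
The memory-cache module's core API is essentially just get and put; a tiny
usage sketch with hypothetical keys:

```
const cache = require('memory-cache');

// put(key, value[, ttlMillis]); omit the TTL to keep the entry forever
cache.put('/blog/index', '<html>...</html>');
cache.put('/blog/post', '<html>...</html>', 60 * 60 * 1000); // one hour

console.log(cache.get('/blog/index')); // '<html>...</html>'
console.log(cache.get('/missing'));    // null when nothing is cached
```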
I found that it is fastest to cache everything: static HTML pages, CSS,
JavaScript, and images. For a larger site this may consume a boatload of RAM,
but, storing images in a HashMap reduces load time since you don't need to
read the file from disk. For a smaller node application like my blog,
server-side caching nearly cut my load time in half.
Make sure that you don't accidentally cache a dynamic page like the CMS page
in your admin section; that mistake is hard to notice while debugging.
To demonstrate the performance increase of this method, I restarted my web
server (clearing the cache) and ran a speed test which ran three trials. The
first two trials were slow since the server didn't have anything in its cache.
However, the third trial ran extremely fast since all the contents were in the
server's cache.
![Server Cache Example](media/3e2e138f85024c1a96ba0ad55bc5d2ed.png)
5: Enable Compression
---------------------
Compressing content before it is transferred over the internet can
significantly decrease the loading time of your website. The only trade-off of
this approach is that it takes more CPU resources; however, it is well worth
it for the performance gains. Using Gzip on CSS and HTML can reduce the size
by 60-70%.

If you are running an NGINX server, you can enable Gzip there. There is also a
simple node module which will apply Gzip compression to an Express app.
#### Gzip on Express App
```
npm install compression

var compression = require('compression')
app.use(compression());
```
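
If, like this blog, you are not using Express, Node's built-in zlib module can
compress responses directly. A minimal sketch of a gzip-aware handler, under
the assumption that you are serving from a plain http server:

```
const http = require('http');
const zlib = require('zlib');

http.createServer(function(request, result)
{
    const body = '<html><body>Hello, compressed world!</body></html>';
    const accepts = request.headers['accept-encoding'] || '';

    if(accepts.includes('gzip'))
    {
        // Compress the page before sending it to the client
        zlib.gzip(body, function(err, compressed)
        {
            if(err)
            {
                result.writeHead(500);
                result.end();
                return;
            }
            result.writeHead(200, {'Content-Type': 'text/html',
                            'Content-Encoding': 'gzip'});
            result.end(compressed);
        });
    }
    else
    {
        result.writeHead(200, {'Content-Type': 'text/html'});
        result.end(body);
    }
}).listen(8080);
```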
6: Remove Unused CSS Definitions
--------------------------------

If you use a CSS library like Bootstrap or W3.CSS, you will have a ton of CSS
classes which go unused. The standard Bootstrap CSS file is around 210kb.
After I removed unused CSS definitions, the size of the Bootstrap file was
only 16kb for my website.

There are tons of tools online; however, for my blog I used PurgeCSS, which is
a node library.
This command will install PurgeCSS for the CLI (command line interface).
```
npm i -g purgecss
```
This is an example of how you could use PurgeCSS to remove unused CSS
definitions.
```
purgecss --css css/app.css --content src/index.html --out build/css/
```
PurgeCSS CLI options:
```
purgecss --css <css> --content <content> [option]

Options:
  ... (most options elided in this diff hunk) ...
                                        [array] [default: []]
  -h, --help     Show help             [boolean]
  -v, --version  Show version number   [boolean]
```
This is not the ideal solution since some CSS definitions may be used on some
pages yet unused on other pages. When running this command, be sure to select
a page which uses all of your CSS to prevent losing some CSS styling on
certain pages.
You don't have to use this through the command line; you can run it directly
in your node app to make it automated. Check out their
[documentation](https://www.purgecss.com/) to learn more.
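
As a sketch of what that could look like (the exact interface depends on your
PurgeCSS version, and the file paths here are hypothetical placeholders):

```
const { PurgeCSS } = require('purgecss');

// Strip selectors from app.css that index.html never uses
new PurgeCSS().purge({
    content: ['src/index.html'],
    css: ['css/app.css']
}).then(function(results)
{
    // One result per input CSS file, holding the purged stylesheet
    results.forEach(function(result)
    {
        console.log(result.file + ': ' + result.css.length + ' bytes');
    });
});
```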
7: Minify CSS and JavaScript
----------------------------
This is possibly the easiest thing you can do to reduce the size of your
website. You just run your CSS and JavaScript through a program which strips
out all unnecessary characters.
Ex of Minified CSS:
```
.bg-primary{background-color:#3B536B!important}#mainNav{font-family:Montserrat,'Helvetica Neue',Helvetica,Arial,sans-serif;font-weight:700;text-transform:uppercase;padding-top:15px;padding-bottom:15px}#mainNav .navbar-nav{letter-spacing:1px}#mainNav .navbar-nav li.nav-item a.nav-link{color:#fff}#mainNav .navbar-nav li.nav-item a.nav-link:hover{color:#D2C0FF;outline:0}#mainNav .navbar-toggler{font-size:14px;padding:11px;text-transform:uppercase;color:#fff;border-color:#fff}.navbar-toggler{padding:.25rem .75rem;font-size:1.09375rem;line-height:1;background-color:transparent;border:1px solid transparent;border-radius:.25rem}.table .thead-dark{color:#fff;background-color:#513E7D;border-color:#32383e}footer{color:#fff}footer h3{margin-bottom:30px}footer .footer-above{padding-top:50px;background-color:#3B536B}footer .footer-col{margin-bottom:50px}footer .footer-below{padding:25px 0;background-color:#3B536B}
```
There are Node libraries which can minify CSS and JavaScript; however, if you
are lazy, just use a website like [this](https://cssminifier.com/).
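
If you want it automated, here is a sketch using the clean-css and terser
packages; both choices are assumptions on my part, and any minifier works the
same way:

```
const CleanCSS = require('clean-css');
const { minify } = require('terser');

const css = 'footer h3 { margin-bottom: 30px; }';
const js  = 'function add(first, second) { return first + second; }';

// clean-css is synchronous and returns the minified styles directly
console.log(new CleanCSS().minify(css).styles);

// terser returns a promise that resolves with the minified code
minify(js).then(function(result)
{
    console.log(result.code);
});
```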
8: Keep Minimal JavaScript
--------------------------
Ignoring the gross amount of Node dependencies you have, it is critical to
minimize the number of dependencies the client needs. I completely removed
Bootstrap's JavaScript and jQuery from my blog by simply writing a JavaScript
function for my nav bar. This reduced the size of my website by 100kb.
```
const e = document.querySelector(".navbar-toggler");
const t = document.querySelector(".navbar-collapse");

e.onclick = function()
{
    // Toggle logic reconstructed; the diff hunk only shows the closing lines
    if(t.classList.contains('collapse'))
        t.classList.remove('collapse');
    else
        t.classList.add('collapse');
}
```
You should also debate how much you need 3rd party scripts like Google
Analytics. In most cases people don't fully take advantage of Google
Analytics; a simple backend analytics service would work just as well while
saving the client load time.
Resources
---------

- [Pingdom Speed Test](https://tools.pingdom.com/)
- [Google Website Speed Test](https://developers.google.com/speed/pagespeed/insights/)
- [Code to My "Optimized" Node Blog](https://github.com/jrtechs/NodeJSBlog)
- [Purge CSS](https://www.purgecss.com/)
- [CSS and JavaScript Minifier](https://www.minifier.org/)
