It is a well-known fact that a fast website is critical to high user
retention. Google looks favorably upon websites that are well optimized and
fast. If you are using a CMS like WordPress or Wix, a lot of optimization is
done automatically. If you like to build stuff from scratch like me, there is a
ton of work required to optimize a website.

This post will cover the eight things that I did to decrease the load time of
this Node blog by two seconds.

#### After Optimization

![Website Speed After Improvements](media/a6594f978c7925bcf3194a1c97029bd3.png)

Website Speed After Improvements

#### Before Optimization

![Website Speed Before Improvements](media/a6594f978c7925bcf3194a1c97029bd3.png)

Website Speed Before Improvements

1: Optimize Images
------------------

Since images make up the largest portion of a website's size, optimizing and
reducing the size of images will decrease load time. In a perfect web
development world, everyone would use SVG images, which are extremely small and
don't need compression. Since most people don't use SVG images, I wrote a script
to automatically optimize JPEG and PNG images for the web.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#!/bin/bash
# Simple script for optimizing all images for a website
#
# @author Jeffery Russell 7-19-18

WIDTH="690>" # the ">" flag tells ImageMagick not to scale images up

folders=("./entries" "./img")

for folder in "${folders[@]}"; do
    # Resize and compress JPEG images
    for f in $(find "$folder" -name '*.jpg' -or -name '*.JPG'); do
        convert "$f" -resize "$WIDTH" "$f"
        jpegoptim --max=80 --strip-all --preserve --totals --all-progressive "$f"
    done

    # Resize and losslessly compress PNG images
    for f in $(find "$folder" -name '*.png' -or -name '*.PNG'); do
        convert "$f" -resize "$WIDTH" "$f"
        optipng -o7 -preserve "$f"
    done
done
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

When run, this script goes through the 'entries' and 'img' folders recursively
and optimizes all the images in them. If an image is more than 690px wide, it
will be scaled down to save size. In most cases it is useless to have images
with a width greater than 690px because they will just get scaled down by the
client's web browser anyway.

If you are running a Debian-based Linux distro, you can download the
dependencies for this script with the following commands:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
apt-get install jpegoptim
apt-get install optipng
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The goal of this script is to keep most of the images under 100kb for the web.
It is fine to have a few images above 100kb; however, you should really avoid
having images above 200kb.
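
If you want to check how close you are to that goal, a short Node script can
walk the image folders and flag anything over the target. This is a minimal
sketch, not part of the original build; the folder names and the 100kb
threshold simply mirror the ones above, and it assumes Node 10+:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
const fs = require('fs');
const path = require('path');

const LIMIT = 100 * 1024; // 100kb target from above
const folders = ['./entries', './img'];

/** Recursively prints every JPEG/PNG larger than LIMIT */
function flagLargeImages(dir)
{
    for(const entry of fs.readdirSync(dir, {withFileTypes: true}))
    {
        const full = path.join(dir, entry.name);
        if(entry.isDirectory())
        {
            flagLargeImages(full);
        }
        else if(/\.(jpe?g|png)$/i.test(entry.name))
        {
            const size = fs.statSync(full).size;
            if(size > LIMIT)
            {
                console.log(full + ' is ' + (size / 1024).toFixed(1) + 'kb');
            }
        }
    }
}

folders.forEach(flagLargeImages);
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~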

2: Take Advantage of Async Calls
--------------------------------

One of the largest benefits of Node is its async abilities: expensive
operations like file IO and database calls don't block the main thread while
they complete. This can become callback hell if not handled correctly, but with
good code structure it can be very useful. When independent async calls are
executed in parallel rather than one after another, you cut down the total time
spent waiting on costly file IO and database calls.

The problem with async code is that it is hard to coordinate. Node has a lot of
ways to handle synchronization, but I prefer to use
[Promises](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise).

Here is a simple example of how async code can be used well and misused.

Good Async Code:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Promise.all([includes.printHeader(),
    require(file).main(filename, request),
    includes.printFooter()]).then(function(content)
{
    res.write(content.join(''));
    res.end();
}).catch(function(err)
{
    console.log(err);
});
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Bad Async Code:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
includes.printHeader(res).then(function()
{
    return require(file).main(res, filename, request);
}).then(function()
{
    return includes.printFooter(res);
}).catch(function(err)
{
    console.log(err);
});
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

In the first example, the three blocks of async code are executed in parallel;
in the second example, they are executed one after another. Many people may
initially write the second version because it seems like you must create and
render the footer after you render the header and body of the page.

A great way to handle async calls is to have most of your methods return
promises which resolve to the HTML or DB information that they produce. When you
run Promise.all, it resolves to an array of those results in the order they were
passed in, which lets you preserve the order of header, body, and footer. After
you do this for all your code, it creates a "perfect" async tree which runs very
fast.

Another Good Async Example:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/**
 * Calls posts and sidebar modules to render blog contents in order
 *
 * @param requestURL
 * @returns {Promise|*}
 */
main: function(requestURL)
{
    return new Promise(function(resolve, reject)
    {
        Promise.all([renderPost(requestURL),
            require("../sidebar/sidebar.js").main()]).then(function(content)
        {
            resolve(content.join(''));
        }).catch(function(error)
        {
            reject(error);
        });
    });
}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

3: Client-Side Caching
----------------------

Client-side caching is where the client's web browser stores static content it
downloads from your website. For example, if a client caches a CSS style sheet,
it won't have to be downloaded again for the next page they visit.

You should cache all images, JavaScript, and CSS files since those typically
don't change. It is a good idea to set the expiration date of the cache to
something longer than a week; I typically set mine to a month.

For a web browser to accept and cache files, you must set some fields in the
HTTP response header: the content type and cache directives like max-age. You
also must assign an ETag to the header to give the client a way to verify the
content of the cache. This enables the client to detect whether the file has
changed and download it again. Some people set the ETag equal to the version of
the stylesheet or JavaScript, but it is far easier to just set it equal to a
hash of the file. I use MD5 to hash the files since it is fast and I'm not
worried about hash collisions for this application.

You can do this in NGINX if you use it to serve static files, but you can also
do it directly in Node.

#### Caching CSS

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
const crypto = require('crypto');

var eTag = crypto.createHash('md5').update(content).digest('hex');
result.writeHead(200, {'Content-Type': 'text/css',
    'Cache-Control': 'public, max-age=2678400',
    'ETag': '"' + eTag + '"',
    'Vary': 'Accept-Encoding'});
result.write(content);
result.end();
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

#### Caching Images

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
var eTag = crypto.createHash('md5').update(content).digest('hex');
result.writeHead(200, {'Content-Type': 'image/png',
    'Cache-Control': 'public, max-age=2678400',
    'ETag': '"' + eTag + '"'});
result.write(content);
result.end();
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
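
The ETag only pays off if the server also honors it on the next request: when
the browser sends the hash back in an If-None-Match header, you can reply with a
304 and skip sending the body entirely. This is a sketch of that check, not code
from the original post; it assumes the same request/result objects, content
buffer, and eTag variable used above:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
// If the client already has this exact version cached, answer 304 with no body
if(request.headers['if-none-match'] === '"' + eTag + '"')
{
    result.writeHead(304, {'ETag': '"' + eTag + '"'});
    result.end();
}
else
{
    result.writeHead(200, {'Content-Type': 'image/png',
        'Cache-Control': 'public, max-age=2678400',
        'ETag': '"' + eTag + '"'});
    result.write(content);
    result.end();
}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~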

4: Server-Side Caching
----------------------

Even with the best async server, there are still ways to improve performance. If
you cache all of the static pages that you generate in memory (a simple map
keyed by URL), you can serve them to the next web user without ever having to
query the database or read files again.

#### Ex:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
const cache = require('memory-cache');

var html = cache.get(filename);
if(html == null)
{
    // Generate page contents and store them in the cache for next time
    Promise.all([includes.printHeader(),
        require(file).main(filename, request),
        includes.printFooter()]).then(function(content)
    {
        res.write(content.join(''));
        res.end();
        cache.put(filename, content.join(''));
    });
}
else
{
    // Cache hit: serve the page straight from memory
    res.write(html);
    res.end();
}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

I found that it is fastest to cache everything: static HTML pages, CSS,
JavaScript, and images. For a larger site this may consume a boatload of RAM,
but keeping images in memory reduces load time since you don't need to read the
files from disk. For my blog, server-side caching nearly cut my load time in
half.

Make sure that you don't accidentally cache a dynamic page, like the CMS page in
your admin section; that mistake is hard to spot while debugging.
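
One cheap guard is to only write to the cache when the request is not for an
admin page. The sketch below is not from the original code; it assumes, as in
the example above, that filename is the request path used as the cache key, and
the /admin prefix is just a placeholder for wherever your CMS lives:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
// Never cache anything under the admin section
function cacheable(url)
{
    return !url.startsWith('/admin');
}

if(cacheable(filename))
{
    cache.put(filename, content.join(''));
}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~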

To demonstrate the performance increase of this method, I restarted my web
server (clearing the cache) and ran a speed test which ran three trials. The
first two trials were slow since the server didn't have anything in its cache.
However, the third trial ran extremely fast since all the contents were in the
server's cache.

![Server Cache Example](media/3e2e138f85024c1a96ba0ad55bc5d2ed.png)

Server Cache Example

5: Enable Compression
---------------------

Compressing content before it is transferred over the internet can significantly
decrease the loading time of your website. The only trade-off of this approach
is that it takes more CPU resources; however, it is well worth it for the
performance gains. Using Gzip on CSS and HTML can reduce their size by 60-70%.

If you are running an NGINX server, you can enable Gzip there. There is also a
simple Node module which will apply Gzip compression to an Express app.

#### Gzip on an Express App

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
npm install compression
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
var compression = require('compression');
app.use(compression());
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

6: Remove Unused CSS Definitions
--------------------------------

If you use a CSS library like Bootstrap or W3-CSS, you will have a ton of CSS
classes which go unused. The standard Bootstrap CSS file is around 210kb. After
I removed unused CSS definitions, the Bootstrap file for my website was only
16kb.

For my blog I used PurgeCSS, which is a Node library.

This command will install PurgeCSS for the CLI (command line interface).

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
npm i -g purgecss
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

This is an example of how you could use PurgeCSS to remove unused CSS
definitions.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
purgecss --css css/app.css --content src/index.html --out build/css/
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

PurgeCSS CLI options:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
purgecss --css <css> --content <content> [option]

Options:
  --con, --content  glob of content files                                [array]
  -c, --config      configuration file                                  [string]
  -o, --out         Filepath directory to write purified css files to   [string]
  -w, --whitelist   List of classes that should not be removed
                                                          [array] [default: []]
  -h, --help        Show help                                          [boolean]
  -v, --version     Show version number                                [boolean]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

This is not an ideal solution, since some CSS definitions may be used on some
pages yet unused on others. When running this command, be sure to select a page
which uses all of your CSS so that you don't lose styling on certain pages.

You don't have to use PurgeCSS through the command line; you can run it directly
in your Node app to make it automated. Check out the
[documentation](https://www.purgecss.com/) to learn more.
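
As a sketch of what that automation could look like, the snippet below uses the
PurgeCSS JavaScript API. The option names match recent PurgeCSS releases and the
file paths are just the ones from the CLI example above, so check the
documentation for the version you actually install:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
const { PurgeCSS } = require('purgecss');
const fs = require('fs');

new PurgeCSS().purge({
    content: ['src/index.html'], // pages scanned for the selectors actually used
    css: ['css/app.css']         // stylesheets to strip down
}).then(function(results)
{
    // Each result holds the purged CSS for one input stylesheet
    fs.writeFileSync('build/css/app.css', results[0].css);
}).catch(function(err)
{
    console.log(err);
});
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~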

7: Minify CSS and JavaScript
----------------------------

This is the easiest thing you can do to reduce the size of your website. You
just run your CSS and JavaScript through a program which strips out all
unnecessary characters.

Example of minified CSS:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.bg-primary{background-color:#3B536B!important}#mainNav{font-family:Montserrat,'Helvetica Neue',Helvetica,Arial,sans-serif;font-weight:700;text-transform:uppercase;padding-top:15px;padding-bottom:15px}#mainNav .navbar-nav{letter-spacing:1px}#mainNav .navbar-nav li.nav-item a.nav-link{color:#fff}#mainNav .navbar-nav li.nav-item a.nav-link:hover{color:#D2C0FF;outline:0}#mainNav .navbar-toggler{font-size:14px;padding:11px;text-transform:uppercase;color:#fff;border-color:#fff}.navbar-toggler{padding:.25rem .75rem;font-size:1.09375rem;line-height:1;background-color:transparent;border:1px solid transparent;border-radius:.25rem}.table .thead-dark{color:#fff;background-color:#513E7D;border-color:#32383e}footer{color:#fff}footer h3{margin-bottom:30px}footer .footer-above{padding-top:50px;background-color:#3B536B}footer .footer-col{margin-bottom:50px}footer .footer-below{padding:25px 0;background-color:#3B536B}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

There are Node libraries which can minify CSS and JavaScript; however, if you
are lazy, you can just use a website like [this one](https://cssminifier.com/).
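
If you would rather bake this into your build instead of pasting files into a
website, a library such as clean-css can do the same job. This is only a rough
sketch, and the file paths are placeholders:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
const CleanCSS = require('clean-css');
const fs = require('fs');

const source = fs.readFileSync('css/app.css', 'utf8');
const output = new CleanCSS().minify(source);

// output.styles holds the minified stylesheet
fs.writeFileSync('css/app.min.css', output.styles);
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~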

8: Keep JavaScript Minimal
--------------------------

Ignoring the gross amount of Node dependencies you have on the server, it is
critical to minimize the number of dependencies the client needs to download. I
completely removed Bootstrap's JavaScript and jQuery from my blog by simply
writing a JavaScript function for my nav bar. This reduced the size of my
website by 100kb.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
const e = document.querySelector(".navbar-toggler");
const t = document.querySelector(".navbar-collapse");

// Toggle the collapsed nav bar without Bootstrap's JavaScript or jQuery
e.onclick = function()
{
    if (e.getAttribute("aria-expanded") == "false")
    {
        t.classList.remove('collapse');
        e.setAttribute('aria-expanded', true);
    }
    else
    {
        e.setAttribute("aria-expanded", false);
        t.classList.add('collapse');
    }
};
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

You should also debate how much you need third-party scripts like Google
Analytics. In most cases people don't take full advantage of Google Analytics; a
simple backend analytics service would work just as well while saving the client
load time.
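
As a sketch of what a bare-bones server-side alternative could look like, the
Express middleware below appends one line per page view without shipping any
JavaScript to the client. It assumes the same Express app as in the compression
example, and the log file name is arbitrary:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
const fs = require('fs');

// Log every request to a flat file instead of a client-side tracker
app.use(function(request, response, next)
{
    const line = new Date().toISOString() + ' ' + request.url + ' ' +
        (request.headers['user-agent'] || '') + '\n';
    fs.appendFile('analytics.log', line, function(err)
    {
        if(err) console.log(err);
    });
    next();
});
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~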

Resources
---------

- [Pingdom Speed Test](https://tools.pingdom.com/)
- [Google Website Speed Test](https://developers.google.com/speed/pagespeed/insights/)
- [Code to My "Optimized" Node Blog](https://github.com/jrtechs/NodeJSBlog)
- [Purge CSS](https://www.purgecss.com/)
- [CSS and JavaScript Minifier](https://www.minifier.org/)