<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0">
    <channel>
        <title><![CDATA[Voya's blog RSS]]></title>
        <description><![CDATA[RSS feed for all blogs posted on Voya Code.]]></description>
        <link>https://voyacode.com</link>
        <generator>RSS for Node</generator>
        <lastBuildDate>Sat, 14 Mar 2026 19:14:48 GMT</lastBuildDate>
        <atom:link href="https://voyacode.com/blogs/rss" rel="self" type="application/rss+xml"/>
        <language><![CDATA[en]]></language>
        <ttl>60</ttl>
        <item>
            <title><![CDATA[Voya Code's 3 year anniversary]]></title>
            <description><![CDATA[It has now been three years since I first published this website, so it's yet again time to take a look at what has been happening on the site during the past year.<br><br><b>Year of the backend</b><br><br>In 2017 the focus was very much on the frontend, with the transition to Angular and many new features. In 2018, however, the focus was more behind the scenes, with a massive backend rework.<br><br>During last summer <a href="https://voyacode.com/blogs/39">I switched from a Node.js Express backend to Nest</a>, which is a TypeScript-based framework similar to Angular, but for the backend. With the full rewrite I was able to make the code structure clearer, and with stronger typing it will be easier to make modifications in the future. The backend is now also thoroughly unit tested, which will make it easier to maintain and to add new features.<br><br><b>Progressive web apps and app shell</b><br><br>Another major behind-the-scenes change was that the website is now a <a href="https://voyacode.com/blogs/38">progressive web app</a>. With this change users get a better experience through improved caching and prefetching mechanics, and many parts of the website also work without an internet connection.<br><br>With the PWA's service workers the website now supports <a href="https://voyacode.com/blogs/40">push notifications</a>, which can be used for blog notifications. Another related change is that the website now uses the same project structure as Angular CLI, which will make future Angular version updates easier.<br><br>Another major improvement is the use of an <a href="https://voyacode.com/blogs/41">app shell</a>, in which parts of the website have been pre-rendered beforehand. With this technique the user will be able to see something even before JavaScript has had time to load, which increases perceived performance.<br><br><b>What's next?</b><br><br>Last year the focus was very much on the general infrastructure of the website, with few new usable features. 
This year I would like to do some more interactive projects, such as small games. I'm not completely sure yet what they are going to be, but I have many possible ideas. I'll likely be using WebGL for at least one of them, as it's something I would like to try out.<br><br>Overall, how much I'm able to work on personal projects depends heavily on how busy I'll be with my studies and potential work. I won't make any promises, but I'm hoping to release at least one project during this spring.]]></description>
            <link>https://voyacode.com/blogs/45</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/45</guid>
            <pubDate>Tue, 08 Jan 2019 21:11:03 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[Advent of Code 2018]]></title>
            <description><![CDATA[As the last blog of the day, I'll quickly mention that I've completed all of last December's <a href="https://adventofcode.com/2018/about">Advent of Code 2018</a> coding challenges.<br><br>For those of you who don't know, Advent of Code is a series of programming challenges running from December 1 to 25. On each day there are two challenges, in which the second is typically a continuation of the first. Each challenge gives a "puzzle" and an input, and the goal is to determine the correct output. The challenges are not tied to any specific programming language, so they can be done with a language of the programmer's choice.<br><br>I completed the challenges for days 1 - 12 with JavaScript/NodeJS. Even though the challenges weren't that big, I started to miss the types and auto-completion of TypeScript, so I switched to TypeScript/NodeJS for days 13 - 25. I got most of the challenges done on the day they got released, but had to do some of the last ones after a small delay because I was busy with exams and work. If you are interested in my solutions, you can check them out at <a href="https://github.com/Voya100/advent-of-code-2018"><b>GitHub</b></a>.<br><br>I really enjoyed the coding challenges. Some of them required the use of specific data structures and algorithms, such as linked lists, trees, graphs, breadth-first search and octrees. Because JavaScript doesn't come with many data structures and algorithms by default, I needed to implement many of them myself. There haven't been too many opportunities to use them in other projects, so it was nice to put some of that knowledge to use (as well as learning some new things).<br><br>That's all from me for now. I'll likely do a new blog for the Voya Code anniversary in a week or so, so see you then, and happy new year! :)<br>]]></description>
            <link>https://voyacode.com/blogs/44</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/44</guid>
            <pubDate>Tue, 01 Jan 2019 15:46:19 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[WebGL]]></title>
            <description><![CDATA[I recently completed a computer graphics course at my university, and that's how I got to take a closer look at WebGL. I have previously used the normal canvas 2D context for a couple of projects, but WebGL is something a little different.<br><br><b>What is WebGL?</b><br><br>For those of you who don't know what WebGL is, it's an OpenGL-based API for JavaScript which can be used to render 2D and 3D graphics. Unlike the normal canvas API, it can be used to write code for the GPU, which can enable much greater performance.<br><br>One of the most interesting features of WebGL is that it doesn't use (only) JavaScript. Instead, the GPU logic used to render the pixels is written in a language called GLSL (OpenGL Shading Language). It uses syntax very similar to C++ and it has static typing, but it is in general more restricted. As an example, it doesn't support recursion. The program needs to know how much memory it requires before execution, because it needs to reserve it beforehand.<br><br>To have some comparison between the traditional canvas API and WebGL, let's imagine a situation where we have a 100x100 px canvas, and we need to calculate a value for each pixel separately. JavaScript is typically single-threaded, so if we calculated each pixel value with JavaScript, we would need to do 10000 calculations sequentially. If those calculations are instead done on the GPU, it is possible to do them in parallel across potentially hundreds of cores. In other words, the calculation speed could be 100+ times faster. This is particularly important if the canvas is animated, because then the calculations need to be done roughly 60 times a second to keep up with 60 FPS.<br><br><b>My WebGL project and how the shader works</b><br><br>Because my WebGL experiment is heavily linked to a course project, and I typically don't post course projects here, I won't be publishing it here as a project. 
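To make the per-pixel comparison above concrete, here is a small CPU-side TypeScript sketch (purely illustrative, not GLSL and not my course code). Conceptually a "shader" is just a pure function from pixel coordinates to a colour; the GPU can evaluate it for all pixels in parallel, while the CPU loop below has to do it sequentially.

```typescript
// Illustrative sketch: a 'shader' is a pure function from pixel
// coordinates to a colour. On the GPU each pixel would be computed
// in parallel; here we just loop sequentially on the CPU.
type Color = { r: number; g: number; b: number };

// Hypothetical example shader: a simple horizontal greyscale gradient.
function shader(x: number, _y: number, width: number, _height: number): Color {
  const t = x / (width - 1);
  return { r: t, g: t, b: t };
}

function render(width: number, height: number): Color[] {
  const pixels: Color[] = [];
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      // A 100x100 canvas means 10000 sequential calls on the CPU.
      pixels.push(shader(x, y, width, height));
    }
  }
  return pixels;
}
```

The GPU version of this would run the same per-pixel function once per pixel across many cores at the same time, which is where the large speedup comes from.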
Below is, however, a quick screenshot to give some idea of what it contains.<br><img src="https://voyacode.com/images/blogs/webgl-demo.jpg" style="width: 450px; margin: 5px;"><br>The demo uses a GLSL shader that I've mostly implemented myself. Some of the noteworthy features I've implemented are reflections (including reflections of reflections), refractions (as seen with the cube), HDR, soft shadows and basic antialiasing. Objects have been made intentionally unrealistically reflective, so that they display reflections more clearly. Some of the objects are animated, although that can't be seen in the image.<br><br>The shader in this case essentially defines one function, which takes a pixel coordinate as input and gives the pixel's colour as output. This means that the logic needs to be quite generic, so that the correct colour can be determined for any pixel.<br><br>The colour of a pixel is determined by sending rays from the location of the camera in the direction of the pixel. The code then determines which object the ray hits, what the colour of the object is, what material properties it has, what the collision angle is, how the lights are located, and so on. If there are reflections or refractions, new rays get sent, and their recursive colour value is added to the final colour. This process is done separately for each pixel on the GPU, dozens of times a second.<br><br>Typically objects in 3D graphics are represented with triangles, but in this particular project signed distance field functions were used instead. Instead of being told the coordinates of each edge, the GPU is given a function which tells how far away the object is from a given coordinate. If a ray moves in a certain direction in steps, always moving a distance smaller than the distance to the closest object, it will eventually get so close to an object that it can be said that they intersect. 
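That stepping process is often called "ray marching" or "sphere tracing", and it can be sketched in TypeScript roughly as follows (a simplified CPU-side illustration, not the GLSL from my project):

```typescript
type Vec3 = { x: number; y: number; z: number };

// Signed distance function for a sphere: the distance from point p to
// the sphere's surface (negative when p is inside the sphere).
function sphereSdf(p: Vec3, center: Vec3, radius: number): number {
  const dx = p.x - center.x, dy = p.y - center.y, dz = p.z - center.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz) - radius;
}

// March a ray from `origin` in (unit) direction `dir`: each step moves
// exactly the distance to the closest surface, so we can never overshoot.
// Returns the distance travelled on a hit, or null on a miss.
function rayMarch(origin: Vec3, dir: Vec3, sdf: (p: Vec3) => number): number | null {
  let t = 0;
  for (let i = 0; i < 128; i++) {
    const p = {
      x: origin.x + dir.x * t,
      y: origin.y + dir.y * t,
      z: origin.z + dir.z * t,
    };
    const d = sdf(p);
    if (d < 1e-4) return t;   // close enough to the surface: intersection
    if (t > 100) return null; // ray escaped the scene
    t += d;
  }
  return null;
}

// Example: a ray aimed straight at a sphere of radius 1 centred 5 units
// away hits its surface at distance 4.
```

A real shader would run this per pixel and then compute lighting, reflections and refractions from the hit point.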
One benefit of this approach is that some shapes, such as spheres, can be represented with much greater accuracy than with triangles.<br><br><b>Future WebGL projects</b><br><br>I currently have at least two projects in mind in which I could be using WebGL. One thing I'm still debating, however, is whether I should use a higher-level API, such as Three.js, or stick with the lower-level API by using GLSL more directly. I don't know yet when I'll have time to work on those projects, but I'm hoping to work on at least one of them during this year.]]></description>
            <link>https://voyacode.com/blogs/43</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/43</guid>
            <pubDate>Tue, 01 Jan 2019 15:42:34 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[Autumn 2018 and Job as Teaching Assistant]]></title>
            <description><![CDATA[Hi everyone, and happy new year!<br><br>It has been quite a while since I've made a blog, so I thought to do a status update and tell a little about what I've been up to in the past months. Even though I haven't worked on projects for this website for some time, it doesn't mean that I have been idle all this time when it comes to programming in general. To make up for the lack of blogs, I'll be posting three blogs today. In this one I'll talk briefly about my recent experience as a teaching assistant, in the second I'll talk about WebGL and in the third about Advent of Code 2018.<br><br><b>Course assisting</b><br><br>One thing that happened during the past autumn was that I got a part-time job as a teaching assistant for a basic web application course at my university. My responsibilities include providing assistance with the course's weekly exercises and large project work, both on the course's Slack channel and at in-person code help sessions at the university, as well as grading course projects.<br><br>Overall the work has been really pleasant, as I really like to help people and solve different kinds of problems. One of the main reasons why I was asked for the job was that I had voluntarily given assistance to other students back when I was doing the course myself, and it has been nice to do it in a more official capacity. It has also been a good learning experience, both in helping and in technical skills.<br><br>The course in question uses Django as the backend framework, and normal JavaScript/jQuery on the frontend. I had gotten familiar with Django when I did the course myself, but while assisting I got an even deeper insight into its capabilities. One piece of wisdom I often carry with me is that the best way to learn is to teach others, and I'd say it definitely applies here as well.<br><br>Django is overall a great and interesting framework, with great contrast to some of the Node solutions. 
I likely won't be using it in my personal projects any time soon, but I could definitely see myself working with it in the future. I might do a full blog about it sometime.<br><br>Besides my official responsibilities, I also got to work a little on the course's automatic grading system for weekly exercises. I had noticed some small issues with it when I had been doing the course as a student, and I was given permission to do some small fixes and improvements to it. The grading system uses Python, Node, QUnit, Docker and Selenium for different kinds of tests.<br><br>I have used Docker once before in a work context, but this was the first time I had to delve slightly deeper into its inner workings. I also finally got to install it on my personal computer, which might come in handy if I end up needing it in any future projects.<br><br>Selenium is maybe one of the first proper UI end-to-end testing tools I've used. Besides the grading system, I also used it recently to do automated testing on a testing course project at university. It's overall quite simple to use, and I might consider using it in the future if I feel a need to do some automated UI testing.<br><br>I don't know if I'll be doing any assisting work in the future, but it has overall been a good experience.]]></description>
            <link>https://voyacode.com/blogs/42</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/42</guid>
            <pubDate>Tue, 01 Jan 2019 15:16:11 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[App Shell]]></title>
            <description><![CDATA[Single-page applications are amazing, aren't they? With them it's very easy to make interactive web applications. With them, developers can have full control over the website's content and navigation. Page navigation is also smoother, because the browser only needs to download and render the content that is different on that page.<br><br>But there is a cost to this. And that cost is called JavaScript.<br><br><b>The cost of JavaScript</b><br><br>The way single-page applications are generally implemented is that all of the site's content is created with JavaScript. Let's take Voya Code's HTML body as an example:<br><br><code>&lt;body&gt;<br>    &lt;voya-app&gt;&lt;/voya-app&gt;<br>&lt;/body&gt;</code><br><br>When a user visits a website, this happens:<br>1. Browser downloads the index.html file<br>2. Browser downloads CSS and JavaScript<br>3. Browser parses CSS and JavaScript<br>4. JavaScript runs, and may download other JavaScript files it requires (return to step 3)<br><br>We can easily see that the HTML in index.html is not going to show the user anything interesting: it's just a blank page. After the CSS has downloaded it may show the website's background color, but that still isn't much. In other words, the user needs to wait for the browser to reach the last step, in which JavaScript is run and HTML is generated for the user to see.<br><br>Now, is this a problem? Often not, because JavaScript runs pretty fast on most devices. Also, with service workers almost everything gets cached, so download speeds should be very fast after the first visit. But what about users that have slower devices or a slow internet connection? A good first impression is important, <a href="https://developer.akamai.com/blog/2016/09/14/mobile-load-time-user-abandonment">considering that many mobile users will abandon a site after just 3 seconds of loading</a>.<br><br>But is there a solution for this that doesn't involve converting the website to use static pages? 
There are a couple of ways to make things better, and app shell is one of them.<br><br><b>What is app shell?</b><br><br>An app shell is a container for your web application. It is generally the basic structure that stays the same on every page, such as the header, navigation and footer. The idea behind the app shell is that it is included in the index.html file that is served, so that the browser can render it before it starts working on JavaScript.<br><br>It is good to note that the app shell technically doesn't make the website load faster; instead it makes it <i>feel</i> faster. With the app shell the main body of the website is rendered almost immediately, which increases the <i>perceived</i> performance. The user will still need to wait for the main content to load, but it's a much better user experience than an empty page.<br><br><b>How to make an app shell and how does it work (with Angular)?</b><br><br>Back when I first heard about the concept of an app shell in the context of web development, my impression was that you had to implement the app shell manually in your index.html: that it was something outside of the single-page app, and the SPA would just be contained inside it, making the app shell uninteractive. This, as I recently found out, is not true at all.<br><br>Angular has a project called Angular Universal, which can be used to pre-render Angular application routes into HTML. The Universal project is mainly focused on doing server-side rendering, which can also be done on demand, but it also works for generating an app-shellified index.html as a part of a production build.<br><br>Transforming an existing Angular application to use an app shell is really easy. First, you need to have a 'root' component that has a <code>&lt;router-outlet/&gt;</code>. The root component acts as the main app shell, and it contains routes that lead to the content pages of the website. 
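As an illustration (a sketch, not Voya Code's actual code), the root component's template could look roughly like this, with the shell elements surrounding the routed content:

```html
<!-- Hypothetical root component template: the header and footer form
     the app shell, while routed page content renders in the outlet. -->
<app-header></app-header>
<router-outlet></router-outlet>
<app-footer></app-footer>
```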
This is how Angular apps are generally structured, so this is nothing new.<br><br>Then you just need to run a handy <a href="https://github.com/angular/angular-cli/wiki/generate-app-shell">Angular CLI command</a> and it will automatically generate Universal configurations and an app-shell route that will be rendered as the app shell. You can modify this generated app shell component to decide what content (if any) will be shown 'inside' the app shell while the content loads. This can, as an example, be a simple loading indicator that is shown until JavaScript takes over. And that's all you need to get it to work, assuming there aren't any environment-related build errors.<br><br>So, what exactly did this CLI tool do, and how does it all work? Well, it essentially generated a new Angular Universal project, and that project pre-renders one of the application paths (the app-shell path) and generates the output into the index.html file. One very important thing to note is that pre-rendering is done with Node, meaning that it doesn't have a browser context. In other words, you can't reference any browser APIs (such as the window object) in paths that are processed by Angular Universal. Luckily you can quite easily check whether the environment is the browser or the server inside the application logic and do different things based on that. There may also be some pieces of code that you don't want to run in the pre-rendering process, such as analytics, so that's good to keep in mind.<br><br>The way the app shell works from the user's perspective is as follows:<br>1. Browser downloads index.html, and is immediately able to show the app shell to the user (no JavaScript parsing required). The app shell doesn't have any JavaScript interactivity yet, because JavaScript hasn't loaded yet.<br>2. JavaScript gets loaded and parsed. Angular code runs, and the code takes control of the app shell, including all pre-rendered elements (nav, footer, etc.). 
The page now acts just like an ordinary SPA, and it can fully render the rest of the page content.<br><br>Assuming that the pre-rendered app shell isn't visually too different from the browser-rendered SPA, the transition from app shell to JavaScript control should be completely smooth to the end user.<br><br><b>App shell on Voya Code</b><br><br>As you most likely have guessed from all this lead-up, Voya Code now has a working app shell. This app shell contains the header and footer sections of the page. The app shell also shows the same loading indicator that you may have seen while navigating on this website.<br><br>I used Google's Lighthouse tool to audit the website's performance before and after the app shell.<br><br>Before:<br><img src="/blog-content/img/app-shell-lighthouse-before.jpg"><br>After:<br><img src="/blog-content/img/app-shell-lighthouse-after.jpg"><br>There is some variance in the performance results, but it can easily be seen that the first contentful paint appears roughly three times faster with the app shell. Time to interactive is slightly longer, which is partly caused by the fact that JavaScript needs to wait for the HTML DOM to be ready. It is not longer by much, however, and it's definitely worth the other time benefits.<br>(Quick note: tests were run with the 'mobile' setting, which is why the test images don't use the full width of the page)<br><br><b>Issues with app shell development</b><br><br>Overall, implementing the app shell didn't present any major issues, but there were some minor 'bugs' that had to be solved. There are very few good up-to-date guides on app shells, so I'll mention those here in case you want to implement an app shell yourself and run into the same issues.<br><br>- If the Angular app has a wildcard path '/**', for a 404 page as an example, the pre-rendering build will fail. Routes of the main app need to be reset to fix this. 
(see <a href="https://github.com/angular/angular-cli/issues/8929#issuecomment-361884581">Github issue</a>)<br>- If the app uses service workers, there is a bug where pre-rendering of index.html does not generate the correct hash value for it. This breaks the service worker's caching. This was fixed just 17 days ago, so you'll need to update to the newest beta of @angular-devkit/build-angular (fixed in version "0.8.0-beta.1"). (<a href="https://github.com/angular/angular-cli/issues/8794">Github issue</a>)<br>- The Angular CLI tool doesn't generate all related production settings for the app-shell build in angular.json. You'll need to add some of them yourself.<br><br><b>Universal's server-side rendering for all paths?</b><br><br>The next logical step from the app shell would be to implement server-side rendering so that all HTML would arrive ready for the user. The basic idea is very similar to the app shell, but it renders all page content on the server, often in real time.<br><br>Doing server-side rendering would have a few benefits. One would be that search engines (and some users) could see the page content even without JavaScript being enabled. Another is that the content could be visible to the user much faster, because JavaScript doesn't need to run to display it.<br><br>There are, however, also a few things which I could see as problems for server-side rendering. One is that if rendering is done in 'real time', it will increase the work that the server needs to do. Rendering on the server can also cause a small delay in the index.html request. These can likely be solved with efficient caching or by pre-rendering routes beforehand.<br><br>For users, one potential risk of fully pre-rendered pages is that the page looks complete but isn't truly interactive yet. Even with pre-rendering, the browser still needs to load all those JavaScript files, and the website won't be very interactive before that. 
If there is a significant delay to JavaScript, users could get confused about why a button or some other interactive element doesn't do anything. I think this could potentially be worse than displaying an incomplete page.<br><br>In terms of development, one of the biggest challenges of server-side rendering is that all routes should be renderable in the server environment. In other words, there needs to be conditional logic to ensure that browser APIs are not called in the server context. The developer should also pay some attention to which components should or should not be rendered on the server. As an example, if there is a very heavily interactive component, it could be a good idea to render it only after JavaScript is ready to handle those interactions.<br><br>At the moment I'm quite happy with the app shell implementation. Server-side rendering is likely something that I will investigate more in the future, but at the moment it isn't at the top of my priorities.<br><br><b>In other news</b><br>- There has been an issue where the website takes only around half of the browser width for a very brief moment on initialisation. JavaScript logic has been fixing this issue, but it has caused distracting 'content jumps'. This is now fixed; the website will now take full width by default even without JavaScript.<br>- The <code>&lt;noscript&gt;</code> message for those who don't have JavaScript enabled has been moved inside the app shell. Now, instead of just text and a background color, they'll be able to see the header and footer elements.<br>- The website logo in the header now has a fixed size. This reserves space for the image beforehand, which removes the content jumps its download could cause on a slow internet connection. The logo size has also been reduced.<br>- The font size has been increased for smaller devices, making text easier to read. This also increases Lighthouse's SEO score to 100.<br>- The server now has rate limiting for requests to reduce the risk of DOS attacks and login brute-force attempts. 
Login (admin only) gives 5 attempts before it starts to throttle requests.<br>- The server has been updated to use DigitalOcean's updated server package, which has twice as much RAM (1 GB, previously 500 MB).]]></description>
            <link>https://voyacode.com/blogs/41</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/41</guid>
            <pubDate>Mon, 06 Aug 2018 16:02:57 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[Push Notifications]]></title>
            <description><![CDATA[Have you always wanted to get notified immediately when a new blog gets published on Voya Code? If you have, this is your lucky day.<br><br>Voya Code now has support for push notifications, which will notify you whenever a new blog is released. To subscribe, go to the <a href="/blogs">blogs page</a> and click the handy 'Subscribe to push notifications' button. The browser will then prompt you to give permission for notifications. After you have accepted, you will get notified when the next blog gets released.<br><br>Note that at least the desktop versions require the browser to be open for the notification to be shown. If the browser is closed, the notification is delayed until the browser is opened again. I'm not sure yet how it works on mobile browsers.<br><br>(Update: Mobile browsers can send notifications even when the browser is 'closed'.)<br><br><b>What are push notifications and how do they work?</b><br><br>Push notifications are messages that can be sent to the user even when the website/app isn't open. These are the same messages that mobile apps may send you, except in this instance the 'app' is the browser. The notifications also work on desktop browsers.<br><br>Because push notifications can also be received when the website is closed, they require service workers in order to work. Once the push data comes to the browser (from a server, as an example), it's the service worker's job to display the notification to the user.<br><br>Here is a basic push notification 'life cycle':<br>1. User visits the site, and a service worker gets installed.<br>2. Browser asks the user's permission for notifications. The user accepts, and the browser sends identifying key values to the backend.<br>3. User leaves the site, and may even close the browser.<br>4. An event happens which causes the server to send a message to a 'push server', which is responsible for push message delegation. This message contains the identifying values obtained earlier from the browser. 
The actual message contents are encrypted.<br>5. Once the user's browser becomes active again, the push server will send the push message to the browser.<br>6. The service worker will catch the push message and choose what to do with it. Most of the time it will generate a notification from the message details and show it to the user. The service worker also handles all notification interactions, such as click events.<br><br>Here are the push notification 'features' I've implemented for the site:<br>- Ability to register for push notifications<br>- Ability to unregister from push notifications without having to go to browser settings.<br>- If the user's browser doesn't support push notifications or doesn't have service workers enabled, the 'subscribe' button will be disabled.<br>- A 'navigate' action handler for notifications, which will open a window to the blog post that got released. If the browser doesn't support notification 'actions', clicking the notification will also cause the navigation.<br>- Support for multiple 'subscription topics'. At the time of writing there is only one topic ('blogs'), but in the future there could be a need for more categories. With the current database it is easy to allow users to subscribe to different notification topics separately, if there ever is a need for that.<br>- If a user removes the notification permission, their now-invalid subscription will get removed from the database automatically.<br><br><b>Browser support</b><br><br>One sad reality is that push notifications are not yet supported on all browsers. At the moment Chrome and Firefox should have full support for them, at least on desktop and likely also on mobile.<br><br>Safari and all iOS browsers don't support web push notifications yet. They have some custom implementation for it, but I haven't looked too much into it. 
Hopefully they'll receive support for the standardized implementation in the future.<br><br>Microsoft Edge <i>should</i> have support for service workers and push notifications, but I haven't got it to work on my machine. I suspect it may be a browser-related problem, because I couldn't get service workers to work on any other site either.<br><br>Internet Explorer will likely never get support for service workers and push notifications. If you are for some reason still using it, please consider upgrading to a modern browser.<br><br><b>In other news</b><br>- The email newsletter now greets you by your name. The 'name' is picked from the email address.<br>- Email newsletter subscription URLs have been changed to follow a URL scheme more similar to the push notification subscription URLs.]]></description>
            <link>https://voyacode.com/blogs/40</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/40</guid>
            <pubDate>Fri, 03 Aug 2018 17:20:09 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[Backend rewrite with TypeScript and NestJS]]></title>
            <description><![CDATA[As I have mentioned a couple of times earlier, I've wanted for quite some time to rewrite my backend logic with TypeScript. Well, now I've done it. And not just that: I've also switched from Express to the <a href="https://docs.nestjs.com/">NestJS</a> framework. This has resulted in a much better and more manageable structure; more details below.<br><br><b>What's Nest?</b><br><br>Nest is a Node.js framework that is very heavily inspired by Angular on the frontend. Just like Angular, it uses TypeScript as the primary programming language and is very modular by design. Unlike most Node libraries, it provides a clear project architecture that is easy to scale up and test.<br><br>Nest uses Express behind the scenes, which allows it to use any of the libraries that work with Express. Nest however brings a lot of abstraction to the code, which allows swapping in a completely different underlying library with minimal code changes.<br><br><b>Nest vs. Express</b><br><br>When I originally thought of switching to TypeScript, Express was the initial direction I was going to take. I felt, however, that I didn't just want to directly convert my old project to a TypeScript version file by file: I wanted a good project structure that would be easier to manage in the future. So I started looking for good TypeScript Express starters, but I couldn't find any that I liked. Being a big fan of Angular, I found the Nest framework really starting to attract my attention.<br><br>One thing that Express is really good at is simplicity. It has a very simple router model, where you just pass function callbacks to each route. Each function ('middleware', route handler) gets the request and response objects and may modify them, send a response to the user or do whatever needs to be done. Nearly everything is done with middlewares.<br><br>Nest is slightly more complicated than that. 
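Before looking at how Nest differs, the Express-style middleware chain just described can be sketched without the framework itself (a simplified, framework-free TypeScript illustration; the types and names here are made up for the example):

```typescript
// Framework-free sketch of the Express-style middleware chain:
// each middleware gets the request/response objects and a `next`
// callback that passes control to the following middleware.
type Req = { url: string; user?: string };
type Res = { body?: string };
type Middleware = (req: Req, res: Res, next: () => void) => void;

function runChain(middlewares: Middleware[], req: Req, res: Res): void {
  let i = 0;
  const next = () => {
    const mw = middlewares[i++];
    if (mw) mw(req, res, next);
  };
  next();
}

// Hypothetical example: an 'auth' middleware followed by a route handler.
const auth: Middleware = (req, _res, next) => {
  req.user = "guest"; // a real app would e.g. parse a session cookie here
  next();
};
const handler: Middleware = (req, res) => {
  res.body = `Hello, ${req.user}!`; // the handler ends the chain
};

// Usage:
//   const res: Res = {};
//   runChain([auth, handler], { url: "/blogs" }, res);
//   // res.body is now "Hello, guest!"
```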
The concept of 'middleware' is split into more specific concepts such as exception filters, pipes, guards, interceptors and decorators. Because of this, responsibilities are clearer, and the single responsibility principle is easier to enforce.<br><br>One major difference in Nest compared to Express's middlewares is the much higher level of abstraction: you very rarely need to access the request or response objects directly. As an example, in controllers you can just return an object directly, and it will get converted to a JSON response behind the scenes. This gives much better decoupling and also makes testing easier.<br><br>Compared to 'plain' Express, here are some of the things that Nest has brought out of the box:<br>- Stronger typing with TypeScript, making it easier to find programming errors<br>- Dependency injection: easier to follow the dependency inversion principle and easier unit testing<br>- TypeORM: maps database entities to TS objects and vice versa, also enforcing types<br>- class-validator: validates database entities and user inputs with a handy decorator syntax. No more validation logic inside controllers!<br>- A more modular and consistent directory structure<br><br>It is worth noting that most of the things above can also be used with Express, but you would need to do some more research on your own to figure out how to implement them. Many of them however require decorators to reach their full potential, which would require either TypeScript or Babel.<br><br>Nest brings a lot of cool features, but there is a cost: learning. It is a lot more complex than Express, where route resolution follows a very linear path from function to function. In Nest there is a lot more 'magic' in the background, such as dependency injection and decorators, that requires more effort to understand fully.<br><br>I feel that the comparison between Express and Nest is quite similar to that between React and Angular. 
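As a sketch of why dependency injection makes unit testing easier, here is the idea in plain hand-wired TypeScript (Nest automates exactly this wiring with decorators and its injector; the class names here are invented for illustration):

```typescript
// A service depends on an abstraction it receives, not one it creates.
class BlogRepository {
  findAll() {
    return [{ id: 1, title: 'Hello' }];
  }
}

class BlogService {
  constructor(private readonly repo: BlogRepository) {}
  titles(): string[] {
    return this.repo.findAll().map(b => b.title);
  }
}

// Production wiring: real repository injected through the constructor.
const service = new BlogService(new BlogRepository());

// Test wiring: a stub is injected instead, no database needed.
const stubbed = new BlogService({ findAll: () => [{ id: 2, title: 'Stub' }] });
```

Because `BlogService` never constructs its own repository, swapping the real one for a stub is a one-line change in tests.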
In Express/React starting out and learning the basics is very easy, but they don't provide much structure by default, which makes them harder to scale. With Nest/Angular it takes a little longer to learn the basics, but in the long term they can produce more scalable applications where it's easier to follow SOLID principles. That is my view on the matter anyway; both approaches have their benefits and downsides.<br><br><b>Testing</b><br><br>One of the biggest things that has changed in how I work is testing. In the past I haven't done as much unit testing as I probably should have. I have had some tests, but not even close to covering the entire application. As a result, making updates was slightly troublesome, because any change could potentially break something without me knowing. Everything would require manual testing, which obviously takes time.<br><br>With this backend rewrite I've taken a more test-driven approach: as I'm making a feature, I'll also write tests for it. I have two kinds of tests: unit tests and end-to-end (e2e) tests. I have put more focus on e2e tests, because they give better assurance that the actual functionality works as it should: they take all the integrations into account as well. They are also really easy to make, because HTTP requests are really easy to test: you only need to generate a request as an 'input' and check that the response 'output' matches your expectations. e2e tests also require very minimal mocking. I've opted to use a real database for these tests (obviously separate from production), because I feel that is the best way to get assurance that everything works, considering that database interactions play a big role in APIs.<br><br>As the testing framework I've used Jest, which is the recommendation from Nest. I've previously used both Mocha and Jasmine, as well as other related libraries such as Chai and Karma, but Jest is the clear winner here. It provides all the mocking/spying/etc. 
features, has a great expect API (easy to write and understand) and built-in code coverage, and there is no need to figure out how to get testing functions into the global scope. It also supports promises and async functions, the lack of which has caused me a lot of issues with Jasmine, for example. I will definitely be using Jest in my future projects.<br><br>I can happily say that the line coverage of the tests is currently 99.3 %. That is 48 unit tests and 56 end-to-end tests. Branch coverage is slightly lower, 73.5 %. That figure however seems to be wrong, because there is a <a href="https://github.com/istanbuljs/istanbuljs/issues/70">bug</a> where the coverage tool counts constructors as unexplored branches. Based on manual examination I would estimate that it's really much closer to 95 %. Code coverage of course isn't enough to say that everything works as it should, but it gives a lot more confidence than nothing.<br><br><b>Overview</b><br><br>Overall I've been very pleased with Nest. So far I have mostly implemented the same behaviour that previously existed in the Express version, but the code structure and the responsibilities of each component are much clearer. I've also found and fixed a few bugs that existed in the old version. With stronger typing and tests it will be easier to maintain the system and make new changes in the future without breaking existing behaviour.<br><br>I'm actually confident enough to finally release the backend repository to the public, so <b><a href="https://github.com/Voya100/VoyaCode-Server">here it is!</a></b> If you wish to give feedback or suggestions, feel free to do so either in the comments here or by opening an issue on GitHub.<br><br>This is only the beginning, and I will be improving the backend in the future. 
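The request-in, response-out style of e2e test described above can be sketched with nothing but Node's built-in http module (an illustrative sketch; the real tests use Jest against the full Nest app, and the route here is made up):

```typescript
import * as http from 'node:http';

// Pure comparison helper: does the response body match the expectation?
function matchesExpectation(body: string, expected: object): boolean {
  return JSON.stringify(JSON.parse(body)) === JSON.stringify(expected);
}

// A stand-in server; in a real e2e test this would be the actual app.
const server = http.createServer((_req, res) => {
  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify({ ok: true }));
});

server.listen(0, () => {
  server.unref(); // let the process exit even if something goes wrong
  const { port } = server.address() as { port: number };
  // Input: a real HTTP request. Output: the response body.
  http.get({ port, path: '/blogs' }, res => {
    let body = '';
    res.on('data', (chunk: Buffer) => (body += chunk));
    res.on('end', () => {
      if (!matchesExpectation(body, { ok: true })) {
        throw new Error('e2e expectation failed');
      }
      server.close();
    });
  });
});
```

The whole assertion is input versus output, with no mocking, which is what makes this style of test so cheap to write.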
Stay tuned!<br><br><b><i>In other news</i></b><br>- Fixed a bug where the service worker would redirect every API request and the RSS page to index.html on navigation attempts.<br>- Fixed a bug where only one [url] tag would get converted to HTML in blogs and comments.<br>- APIs have stronger validation and more user-friendly messages, meaning fewer 'TypeError'-style errors.<br>- The comment submission form can now show multiple errors at once.<br>- Fixed a few (rare) API-related issues revealed by more thorough testing.<br>- 404 pages no longer change the URL to /404, allowing you to fix potential typos in the address directly]]></description>
            <link>https://voyacode.com/blogs/39</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/39</guid>
            <pubDate>Tue, 10 Jul 2018 20:16:11 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[Progressive Web App and Service Workers]]></title>
            <description><![CDATA[Voya Code is now a progressive web app (PWA) and service workers have been enabled for the site. More details below!<br><br><b>What's a progressive web app?</b><br><br>Progressive web apps try to combine the best of the web and traditional apps. They have many characteristic features, but some of the best known ones are the ability to install them on your (mobile) device and the use of service workers.<br><br>One key factor separating standard web apps from progressive ones is the manifest file, which includes settings for how the app should look. As an example, the settings include the app's color theme and icons. Once the app manifest is set up, mobile browsers can prompt the user to install the app on their device. After this they can launch it from their home screen almost like a native app.<br><br>Putting web apps on the home screen is nothing new, as the feature has long existed in the form of 'bookmarks'. The key factor that makes this different is the use of service workers.<br><br><b>What are service workers?</b><br><br>Service workers are JavaScript scripts that run separately from the website itself, meaning that they can run even after the website has been closed. One of the most common uses of service workers is as an HTTP proxy. This gives developers a lot more control over caching than traditional cache settings can offer.<br><br>With this more controlled caching it is possible to save all the resources that the website needs locally and retrieve them later. This way the website can remain usable even without any online connection, just like traditional mobile apps.<br><br>The caching works essentially like this:<br>1. The user comes to the site for the first time. The page is loaded normally. A service worker is installed (if supported by the browser).<br>2. The service worker starts fetching website resources in the background. 
This may include resources/pages the user hasn't visited yet, decreasing the loading time when they do access them.<br>3. The user leaves the website.<br>4. The user returns to the website. The service worker fetches resources from the cache, making the loading nearly instant. If there are new versions of the resources, the service worker now starts to load them in the background. The updated resources will be shown to the user the next time they return to the site.<br><br>It is worth noting that this isn't only for apps installed on mobile devices; it works in browsers as well. You can test it out by disabling the online connection from dev tools and refreshing this page.<br><br><b>Angular and service workers</b><br><br>Angular luckily has good support for PWAs and service workers, which makes transitioning to a PWA really easy. By running a simple CLI command almost everything is set up and ready to go. Almost, anyway: there is currently a <a href="https://github.com/angular/angular-cli/issues/8779">bug</a> which can cause it not to work with certain dependencies or app configurations. There is luckily a workaround, but that one needed some extra research on my part. The tooling could also use some minor improvements. As an example, the default CLI build settings don't currently include minification for the service worker script. Despite this, it serves its purpose and will no doubt improve in the future.<br><br><b>Service workers on Voya Code</b><br><br>By default Angular's service worker configuration preloads all main bundles of the app, which essentially means the JavaScript, HTML and CSS. This means that all of the pages on this site should currently work (more or less) regardless of internet connection, if the service worker is installed. It also means that moving around the website is slightly faster, because bundles for other pages are also preloaded. 
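The cache-then-update flow described in steps 1-4 above can be sketched as a tiny TypeScript model (an illustrative toy over a plain Map, not the actual Angular service worker implementation):

```typescript
// Toy model of the "serve from cache, refresh in background" strategy.
type Fetcher = (url: string) => Promise<string>;

class SwCache {
  private cache = new Map<string, string>();

  constructor(private readonly fetchFromNetwork: Fetcher) {}

  async get(url: string): Promise<string> {
    const cached = this.cache.get(url);
    if (cached !== undefined) {
      // Serve instantly from cache; fetch the newest version in the
      // background, so it is visible on the *next* visit (step 4).
      this.fetchFromNetwork(url).then(fresh => this.cache.set(url, fresh));
      return cached;
    }
    // First visit (step 1): load from the network and fill the cache.
    const fresh = await this.fetchFromNetwork(url);
    this.cache.set(url, fresh);
    return fresh;
  }
}
```

The first `get` for a URL hits the network; every later `get` returns the cached copy immediately while quietly refreshing it, which is exactly why a returning visitor can briefly see a slightly stale page.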
I may however need to check if there are some things that shouldn't be preloaded, because not all of them are used as often.<br><br>Blog API results are also cached for a shorter amount of time, but those cached results are used only if there isn't an internet connection.<br><br>Other assets such as images are loaded on demand and cached for future use. Because they aren't downloaded beforehand, they won't work without an online connection if you haven't visited those pages before while the service worker has been active.<br><br>One 'feature' of service workers is that they don't always show the newest version of the site instantly. This is because service workers will always use the cache if available and download the newest version in the background. For most sites this is useful, because it makes the website faster for the users, and it may not be critical that they have the exact newest version immediately. On this site, however, I quite often talk about the latest changes I've made, and it could be confusing for the reader if the changes haven't been applied yet. Because of this I've added a small popup that will appear once the service worker has downloaded the newest assets and recommend refreshing the page. You may see it the next time you visit the website.<br><br><b>In other news</b><br>- A small &lt;noscript&gt; element has been added so that those who have disabled JavaScript won't see just a blank screen.<br>- More metadata has been added to &lt;head&gt; for search engines.<br>- More favicons of different sizes have been added to support different devices.]]></description>
            <link>https://voyacode.com/blogs/38</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/38</guid>
            <pubDate>Sun, 01 Jul 2018 20:47:40 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[Angular CLI and updating to Angular 6]]></title>
            <description><![CDATA[It has been a while since I updated the website's dependencies, so I have now updated from Angular 4 to Angular 6 and swapped webpack for Angular CLI. More details below.<br><br><b>Angular CLI</b><br><br>Back when I first started using Angular, Angular CLI was still in its very early stages. Because of that, webpack was one of the few options for building Angular apps at the time. I managed to piece together a working build setup with the help of multiple guides. It worked quite well - it handled AOT builds, minification, uglifying, SCSS, the necessary file copying and even gzipping. It did however take some effort to keep up to date, and the setup's performance was no longer ideal.<br><br>Angular CLI is now the de facto build tool for Angular apps. It handles all build-related tasks with very minimal configuration, although it also provides options for more complex build setups. It can also upgrade itself and the Angular dependencies (mostly) automatically, so it's a lot easier to keep things up to date.<br><br>One of the biggest challenges with setting up the CLI was that it has quite a different project layout compared to my old one. Most files had to be moved under a new directory, and I needed to change how the frontend and backend interact during development (previously the app was hosted on the Node backend; now the app is served with a CLI command and API calls are proxied to the backend). 
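For reference, the Angular CLI development proxy is configured with a small JSON file passed to `ng serve --proxy-config proxy.conf.json` (a sketch; the target port here is an assumption, not the site's actual configuration):

```json
{
  "/api": {
    "target": "http://localhost:3000",
    "secure": false
  }
}
```

With this in place, requests the dev server cannot answer under `/api` are forwarded to the locally running backend.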
Another thing that changed is that there is no longer a need for a JIT (just-in-time) setup, because AOT (ahead-of-time) compilation is now fast enough to be used in development as well.<br><br>Overall the switch to Angular CLI went without major issues, and it will surely help with keeping up to date with the newest Angular updates.<br><br><b>Angular 4 -> Angular 6</b><br><br>There have been two new major versions of Angular, and those versions brought some breaking changes. Some of the most notable ones are the changes to <a href="http://brianflove.com/2017/07/21/migrating-to-http-client/">HttpModule</a> and <a href="https://github.com/ReactiveX/rxjs/blob/master/docs_app/content/guide/v6/migration.md">rxjs</a>. This meant that I had to update the code to match the new APIs. The HTTP-related changes had to be done manually, but the rxjs changes were luckily mostly automated. The new Observable API will however take some time to get accustomed to.<br><br><b>Code formatting</b><br><br>As I was making major updates to the entire project structure, I decided to take a closer look at code formatting. Even though I have used tslint and eslint to find errors and maintain some code styling rules, I have kept the rules quite light. One major reason for this was that it often simply took too much effort to keep the style completely consistent.<br><br>I have now started to use Visual Studio Code's auto-format feature, which formats files automatically on each save. I'm currently using Visual Studio Code's default formatting alongside tslint rules. This way keeping the code style consistent and clean takes very little effort and is thus easier to maintain. I would greatly recommend the auto-formatting feature to everyone, because it makes things so much easier.<br><br><b>Overview</b><br><br>Now that the website uses Angular CLI and is updated to the latest (stable) version of Angular, it will be easier to keep up with the latest Angular updates. 
The Angular CLI setup has already proven to be more efficient than my old webpack setup: previously the homepage's script bundles were 312.5 kB, now they are only 200.4 kB (gzipped). That is an over 35 % decrease, which is quite significant.<br><br><b>What's next?</b><br><br>Now that the website uses the Angular CLI project structure, there are a few things I can now implement more easily. In the short term, one thing I would like to look at is making this website a Progressive Web App (PWA). Another, slightly longer-term goal would be to look at Angular Universal, which would add server-side rendering to the mix. A Universal app would impose some limitations on the website, so I will need to investigate how practical it would be to implement.<br><br>Until next time! :)]]></description>
            <link>https://voyacode.com/blogs/37</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/37</guid>
            <pubDate>Sat, 30 Jun 2018 21:08:37 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[GDPR and status update]]></title>
            <description><![CDATA[It has been a while since my last blog, so let's talk a little about GDPR and have a quick status update.<br><br><b>GDPR</b><br><br>As most of you have likely heard from multiple other sources already, GDPR (General Data Protection Regulation) took effect on the 25th of May. In essence, this regulation gives users better privacy rights and adds more limitations on how data can be gathered and used. I have now made some small tweaks to make the website more compliant with these changes.<br><br>The overall effects on this website are very minimal, because I'm not storing any personal information. One thing that could be impacted is the use of Google Analytics, which tracks which pages are used and some other generic information. The gathered information is however anonymous, not enough to identify a person, and isn't sent to any third parties (for advertising or similar purposes). Because of this it doesn't fall under GDPR consent requirements, based on the articles that I've read.<br><br>One thing that is required, however, is that users are informed about what is tracked and how cookies are used. Because of this I've added a <a href="https://voyacode.com/cookie-policy">cookie policy page</a> which has more details on what is collected and how the information is used. The page also includes an option for opting out of cookies if you don't want any tracking. I would however be thankful if you keep the cookies on, because it's helpful to know how much the website and its different pages are used.<br><br><b>Status update on Chess</b><br><br>I originally hoped that I would finish Chess's online multiplayer feature this spring, but I sadly had very little time for my hobby projects due to my university studies. I have however made quite a lot of progress on other Chess updates I've been planning, such as a full implementation of all the rules, unit tests and some code rewrites. 
The progress on the websocket features has been slower than I would like, but there has been some progress on that as well. I may write another blog on the challenges I've faced later on.<br><br>Even though there has been some progress, I have decided to put the Chess online multiplayer project on pause for the time being. I have a couple of reasons for this. The first is that I want to update the backend to use TypeScript before I move further with the websocket features. The more I delay this transition, the harder it will be. Stronger typing will also make the development of websocket features easier overall, because they have more moving parts than simple HTTP requests.<br><br>Another reason for the pause is that there are a few other projects that I would also like to work on. I'm hoping to share more about them in the coming weeks, so keep an eye out for them!<br><br>TL;DR: Online chess is not forgotten, but other projects will have higher priority for the time being.]]></description>
            <link>https://voyacode.com/blogs/36</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/36</guid>
            <pubDate>Sun, 24 Jun 2018 17:56:55 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[Voya Code's 2 year anniversary]]></title>
            <description><![CDATA[It has now been two years since I first launched the website, so let's take a look at what has been going on during the past year!<br><br><b>What's new?</b><br><br>This year Voya Code has seen some massive changes to both the frontend and the backend. At the start of the year the whole website was changed to use Angular instead of the older PHP/jQuery combo. This change also included some soft rewrites of older projects such as Rock, Paper, Scissors and Chess, which were also changed to use Angular instead of jQuery.<br><br>The backend has also received some big changes, the most notable one being the switch to a Node.js based server. This change gives me much better control over the site and has allowed the use of HTTPS and gzip, among other things. The code is also easier to organize and can be more easily shared with the JavaScript frontend, which makes my life a lot easier.<br><br>Multiple projects were also finished during the year. My first React Native app for Android, <a href="https://play.google.com/store/apps/details?id=com.voyacode.chess&pcampaignid=MKT-Other-global-all-co-prtnr-py-PartBadge-Mar2515-1"><i>Chess</i></a>, was finished during the summer, and later in autumn <a href="/projects/color-picker"><i>Color Picker</i></a> was added to the site. Behind the scenes, a better admin section was also implemented for my personal use, with authentication and proper tools for publishing new blogs to the site.<br><br><b>What's next?</b><br><br>I have been working on online multiplayer for Chess during the past couple of months, implemented with websockets. The progress has been slower than I would like due to university studies, but I'm hoping to get its browser version finished during this spring.<br><br>Last year I got the Node.js environment set up and things are looking good, but the code isn't yet perfect. 
One thing that I'm missing on that front is TypeScript's type checking, which is why I'm planning to switch to it some time in the future.<br><br>As for new projects, I have one or two in mind that I would like to work on this year, and likely more ideas to come. You will have to wait and see what they will be.<br><br>Happy new year everyone! :)]]></description>
            <link>https://voyacode.com/blogs/35</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/35</guid>
            <pubDate>Mon, 08 Jan 2018 20:52:11 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[Color Picker]]></title>
            <description><![CDATA[I have just released a new feature for the website, <a href="/projects/color-picker"><b>Color Picker</b></a>!<br><br>With this color picker you can select colors using the RGB and HSL formats. You can also save colors locally for future use.<br><br>This project was my first experiment with the HTML5 canvas element. The process in this case was quite simple - I only needed to go through each pixel and set its red/green/blue values with some math every time the canvases needed updating.<br><br>Some of the sliders could also have been made with the already existing gradient options, but those tended to skip colors in their results, so I decided to make them by hand as well.<br><br>I may try to do something else with canvases in the future if I find an interesting use for them. As always, the code can be found on <a href="https://github.com/Voya100/VoyaCode/tree/master/app/projects/color-picker">GitHub</a>. Until next time!]]></description>
            <link>https://voyacode.com/blogs/34</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/34</guid>
            <pubDate>Sun, 17 Sep 2017 18:14:50 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[RSS feed and email newsletter]]></title>
            <description><![CDATA[<b>RSS feed</b><br><br>The RSS feed has been broken since the transition to the new server, and I have now managed to get it fixed. The feed is now also slightly less error prone and more likely to be up to date than the previous version.<br><br>The path of the RSS feed has also changed. The new URL is: <a href="https://voyacode.com/blogs/rss">https://voyacode.com/blogs/rss</a><br><br>The old URL will still work for the time being, but I will disable it eventually, so switch over to the new URL as soon as possible.<br><br><b>Email newsletter</b><br><br>If you want to stay up to date with my blogs, but don't want to check the website every day or use 'outdated' RSS feeds, you can use the email newsletter system I have just created.<br><br>By using <a href="/blogs/subscribe"><b>this link</b></a> you get to a page where you can add your email to the mailing list. Once you are subscribed, you will get an email notification every time a new blog is posted. I don't have any regular blog schedule, but I likely won't be posting more than four blogs a month, so it hopefully shouldn't cause any 'spam'.<br><br>If you ever feel a need to unsubscribe from my blog newsletter, you can always do so with a link that is provided in every newsletter email.<br><br><i>Note: <br><br>At the moment Outlook/Hotmail blocks all newsletter emails, likely due to the 'shared' email IP provided by Mailgun, which may affect the 'reputation' of the emails. One solution could be to handle the sending of emails completely myself, but that would require some extra work.<br><br>I may look more into it at a later date, but for now there is no guarantee that Outlook/Hotmail users can subscribe to the system. Sorry for the inconvenience.</i>]]></description>
            <link>https://voyacode.com/blogs/33</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/33</guid>
            <pubDate>Sun, 10 Sep 2017 15:08:41 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[Blog management and authentication]]></title>
            <description><![CDATA[Hey everyone! I've just done a major update to the website - although, due to its nature, it may not be very visible to you.<br><br><b>New admin tools</b><br><br>The website now has 'proper' administration pages which I can use to add, edit and remove blogs. Previously I needed to send them with direct HTTP requests or database manipulation, but now I have a nice UI for it. Since you don't have permissions to use them (obviously), below are some pictures of it.<br><br><img src="https://voyacode.com/images/blogs/admin-blog-1.jpg" /><br><img src="https://voyacode.com/images/blogs/admin-blog-2.jpg" /><br><br>It's nothing too complex, but it suits my needs really well. I took some of the comments page code and applied it here. The preview feature is especially helpful - now I can see what a blog will look like before I send it.<br><br>I have also re-enabled the admin ability to see the private comment messages you have sent - feel free to send me more, if you wish. :)<br><br><b>Authentication</b><br><br>One major part of this is obviously authentication - I don't want just anyone to be able to post blogs or see 'private' comment messages. For authentication I have used JSON Web Tokens.<br><br>Web tokens work essentially like this: first the user authenticates to the server (with a username and password, for example), and the server responds with a signed token. The token can contain some (unencrypted) details about the user, such as their username and whether they are an admin user or not.<br><br>Every time the user wants something from the server, they send the token with the request. The server verifies the token (it will notice if it has been changed), and can thus see whether the user is authenticated. The server doesn't need to store any sessions. 
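The sign-and-verify idea can be sketched in a few lines of TypeScript with Node's built-in crypto module (a minimal illustration of the mechanism; a real app should use a vetted JWT library, and the secret here is a placeholder):

```typescript
import { createHmac } from 'node:crypto';

// Header and payload are only base64url-encoded, i.e. readable by anyone.
const b64url = (data: string): string => Buffer.from(data).toString('base64url');

function sign(payload: object, secret: string): string {
  const header = b64url(JSON.stringify({ alg: 'HS256', typ: 'JWT' }));
  const body = b64url(JSON.stringify(payload));
  // The signature covers header and payload, so any change is detectable.
  const signature = createHmac('sha256', secret)
    .update(`${header}.${body}`)
    .digest('base64url');
  return `${header}.${body}.${signature}`;
}

function verify(token: string, secret: string): boolean {
  const [header, body, signature] = token.split('.');
  const expected = createHmac('sha256', secret)
    .update(`${header}.${body}`)
    .digest('base64url');
  return signature === expected;
}
```

Because only the secret holder can produce a matching signature, the server can trust the token's contents without storing any session state.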
The token will expire after a set amount of time (stored in the token itself), after which the user will need to authenticate again.<br><br>All of the frontend code for the admin features can be found on <a href="https://github.com/Voya100/VoyaCode/tree/master/app">GitHub</a>, if you wish to take a look. Do note, however, that even if you were able to get to the admin pages by some trickery, all real authentication is done in the backend, so you won't be able to do anything you shouldn't be able to. ;)<br><br>At the moment the authentication system is relatively simple, because there is only one user (me), but it can easily be expanded in the future if there is a need for it.<br><br><b>What's next?</b><br><br>Next I will be fixing the RSS feed and hopefully adding a mailing list feature for blogs. At some point I will expand the admin features to include comment management, and I have many other ideas I want to try this autumn (some of which are more visible to you, I hope). Until next time!<br><br><i>In other news:</i><br>- Google Analytics has been enabled for the site.<br>- The home page has been slightly updated.<br>- It is now possible to link to specific <a href="/blogs/32">blogs</a>.<br>- Blog filters have been fixed.<br>- Comment error messages have been fixed.<br>- Other minor fixes to things the Node.js transition broke.]]></description>
            <link>https://voyacode.com/blogs/32</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/32</guid>
            <pubDate>Mon, 04 Sep 2017 20:46:55 GMT</pubDate>
        </item>
        <item>
            <title><![CDATA[Voya Code has moved to new server!]]></title>
            <description><![CDATA[If you are reading this, then your DNS has updated itself properly and directed you to Voya Code's new server!<br><br>As I have mentioned before, my previous server had some limitations - such as not being able to use Node, run websocket servers or use gzip. For those reasons I decided to move to a new server. I chose DigitalOcean as the service provider because it seems to provide plenty of freedom, is relatively cheap and allows scaling up if needed.<br><br>In terms of the front end not much has changed, but in the back end things are quite different. Instead of Apache/PHP it now uses Nginx/Node. What is visible to you is that the web page should now be faster and more secure than before. More details about the transition and changes are below.<br><br><b>The transition to the new server</b><br><br>Unlike on my previous server, on this one I have root access. This gives me much more freedom to use the server how I like, but at the same time a lot more responsibility. Pretty much everything was already set up on my previous server when I started to use it, so I didn't need to think much about configuration on the server side.<br><br>Even though this new server also has some things preinstalled (such as Node), I needed to configure most things myself. It was a lot more work compared to my previous server, but luckily there were many step-by-step guides to help me through it. It has definitely given me a chance to get more familiar with the command line.<br><br><b>Server structure</b><br><br>I'm using both Nginx and Node to host the server. Nginx sits at the front - it handles serving static files, caching and gzipping. If Nginx can't handle the request, it passes it to Node, which can respond to it depending on the route. If it's an API route, for example, Node returns the expected JSON response. 
If Node doesn't have a specific path for the route, it redirects to the Angular app's index.html file, which can handle the navigation inside the app depending on the route (or show the 404 page, if all else fails).<br><br><b>PHP vs Node</b><br><br>The back end on my previous server was written in PHP, and now I have moved to Node instead.<br><br>PHP is very 'script based' - it is very different from Node, which is more clearly a server that handles all requests. PHP files are often just scripts that are run when a user visits a page at that path. PHP files don't need to have any connection to other PHP files, unless they share some code. Each PHP file can be thought of as a separate program entity - they don't have any shared 'state', at least not directly.<br><br>Node, on the other hand, is often just one program that is always running in the background and handles requests as it gets them. You need to think more about how the routes work, which middleware to use and how to respond to different kinds of requests. It will take some time to get used to, but I think it will be worth it in the long run. It's also easier to share JS code between the front end and back end, if there is a need for that.<br><br>With PHP I could never figure out how to run it on localhost together with the Angular front end, or how to set up databases for testing. As a result I often tested PHP on the live server, which wasn't an optimal solution. With Node it is a lot easier to link them together and run tests.<br><br>At the moment Node only needs to handle the blog and comment APIs and redirects to the Angular app. I had already moved to a REST API approach with PHP when I switched to the Angular front end, so the front end didn't need many changes.<br><br><b>New features: gzip, HTTPS and HTTP2</b><br><br>Previously I couldn't use gzip, because the server didn't support it. I have now enabled it on the new server, and as a result download speeds have improved. 
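The Nginx side of the setup described above can be sketched roughly like this (an illustrative fragment only; the domain, port and paths are assumptions, not the site's actual configuration):

```nginx
server {
    listen 443 ssl http2;
    server_name example.com;

    # Compress text-based responses before sending them.
    gzip on;
    gzip_types text/css application/javascript application/json;

    # Hand API routes to the Node process.
    location /api/ {
        proxy_pass http://localhost:3000;
    }

    # Serve static files directly; fall back to the Angular app.
    location / {
        root /var/www/app;
        try_files $uri /index.html;
    }
}
```

The `try_files` fallback is what lets the Angular router handle any path that doesn't correspond to a real file or API route.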
Around five months ago I improved loading speed almost threefold when I moved from Angular JIT compilation to AOT compilation. After that I got the following results from a web speed test:<br><br><img src="/images/blogs/aot-performance.jpg" /><br><br>On the new server, after enabling gzip, the results look like this:<br><br><img src="/images/blogs/gzip-performance.jpg" /><br><br>In other words, the loading time is now over twice as fast as before, or almost 6 times faster than it was over five months ago. That's a huge improvement!<br><br>In addition to gzip I have also enabled HTTPS and HTTP/2. With HTTPS the connection is encrypted and thus more secure, so you can send me all of your darkest secrets through the comments, if you so wish. ;) <br><br>These days HTTPS certificates can be obtained for free from <a href="https://letsencrypt.org/">Let's Encrypt</a>. I would recommend HTTPS to anyone who has a website - it is very easy to set up.<br><br><b>Email server</b><br><br>On my old website, email was set up automatically for the server. Even though I very rarely use it, I still wanted my old email address to keep working. Hosting a full email server is a huge task, so I opted to use Mailgun instead, which does all the hard parts for me. It is free for up to 10,000 emails and provides a very good API, so it fits my current and future needs well.<br><br><b>What's next</b><br><br>I have now restored most of the previous functionality to the website - the RSS feed is a little out of date, but I will try to either update it or add a mailing list to announce new blogs.<br><br>Another thing I need to think about is the admin page, because the server has very few control panels by default. I will probably install some kind of GUI for my PostgreSQL database (formerly MySQL), so that I don't need to do everything from the command line. 
I should also add management tools for blogs and comments, and proper authentication for all the admin features (this blog was posted with a direct POST request, which isn't optimal).<br><br>Further in the future I can use the new Node server for WebSocket apps, which would allow me to build things that require a constant connection, such as online games or chat-like apps. I'm not sure yet what I'll try first, but I'll definitely figure something out.<br><br>That's all I have for now - until next time.<br><br><i>PS: If you want to host your own website, you can use <b><a href="http://www.digitalocean.com/?refcode=8d805767dfc0">this link</a></b> to get $10 of free credit at DigitalOcean. That's enough to host a website for two months, so it's a good way to try it out.</i>]]></description>
            <link>https://voyacode.com/blogs/31</link>
            <guid isPermaLink="true">https://voyacode.com/blogs/31</guid>
            <pubDate>Sun, 02 Jul 2017 12:14:37 GMT</pubDate>
        </item>
    </channel>
</rss>