Create a secured RESTful API with Node.JS in 30 minutes

Today APIs are a must for most apps running on devices that collect or fetch data of any kind, and for websites, whether they run as single page applications or simply read and write data from internal or external systems.

Often, the API is even more important than the application itself, so it makes sense to design and implement the API first.

In this article I am going to describe a simple Node.JS based RESTful API I created for my app “Puckify” to collect some statistical data and the in-app purchases users make. Today it is very easy to set up a simple API, so I am going to show how to do that in 30 minutes.

This API, once finished, could be registered as a systemd service (managed via systemctl) on a Linux based OS to make it start on boot and even restart on crashes.

I will follow a simple MC (Model-Controller) architecture and set up a version based API with some GET and POST functions, connecting to the MongoDB cloud (you can also connect to your local MongoDB).

The following figure shows the folder structure of the API we are going to create. At the end of this article I will offer an archive containing the complete API, installable on any computer that has Node.JS installed.

As you can see we have two subfolders: “api”, which holds the main logic of our API, and “config”, where we will put our DB connection settings.

server.js is the main script we start to run the API; main.js contains the bootstrapping code that sets up everything from the api folder.

This API will be able to support multiple versions. Keep in mind that an API in production rarely runs only a single version. Once you update your running API, and your clients are iOS or Android apps installed from the app stores, people might update the app weeks or months after your new API version has been published. That means those users will still access your API the way the old client on their smartphone expects. So we need to support multiple versions in our API to avoid breaking users when you update it.

I basically follow the version string format [MAJOR].[MINOR].[PATCH], like 1.0.0. If you fix a bug or make some simple text changes, you increase the PATCH number, e.g. from 1.0.0 to 1.0.1. MINOR stands for smaller changes in the application which in some cases can affect clients connecting to your API. So going from 1.0.0 to 1.1.0 means adding 1.1 alongside your currently running versions, so that both 1.0 and 1.1 are served by your API. MAJOR stands for big changes which definitely affect your clients and are likewise added alongside the currently running versions.

Patching your API from 1.2.3 to 1.2.4 should NOT affect clients connecting to the 1.2 version. This is why my controller and model files only carry the MAJOR and MINOR part of the version; the PATCH part is irrelevant for the API routes.

It is up to you how you define the version format. You could use a simple revision number, a single number or a longer one like 1.2.10.344.23. Just make sure to reflect it consistently in your API (you might also have a website running along with the API using the same version format).

This API will also support running in multiple environments. For now we will define a TEST and a PROD mode. The difference between the two is the virtual host the API listens on: in TEST mode it listens on api-test.eser-esen.de, while in PROD it listens on api.eser-esen.de.

We will also secure this API with a simple HMAC-SHA256 hashing scheme. That means clients accessing this API will need to send an Authorization header along with the request, containing a plaintext value (let’s call it username) and a hash generated from that username. The API will generate a hash using the username and its stored secret key and compare the hashes. If they match, it grants access; otherwise it blocks the request with a 401 status code.

Our application is Node.JS based, so we will code in JavaScript. The modules we are going to use for our API will be:

  • mongodb (the MongoDB driver)
  • https (the tool for the network part to handle HTTPS)
  • express (the web framework handling the requests & responses)
  • vhost (to define virtual host like api.domain.com, as we do on usual web servers)
  • fs (to handle the certificates as we will run with HTTPS)
  • crypto (to handle the authorization part as we will allow access with simple HMAC-SHA256 tokens only)
  • more modules we will use:
    • moment (to format timestamps with momentJS, e.g. when storing a book)
    • body-parser (to ease parsing request bodies in your API)

server.js

First, we will run in strict mode and load all modules we need for server.js. Using “use strict”; simply makes YOUR code cleaner, for example by not allowing you to use undeclared variables.
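As a tiny illustration (not part of server.js): with “use strict”; an assignment to an undeclared variable throws, while without it the variable would silently be created as a global:

"use strict";

count = 1; // ReferenceError: count is not defined (without strict mode this would create a global variable)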

"use strict";

var express = require("express");
var vhost = require("vhost");
var crypto = require("crypto");
var fs = require("fs");
var https = require("https");
var port = 9999;
var params = {};
var current_key = "";
var hash_key = "MySecretPassphrase";

Our port will be 9999, but you can define any port you want. Keep in mind that if you run this API on a server which already has a web application running on port 80 or 443, this API will not start when you try to use the same port. Later I will explain how to make it reachable on port 80 or 443 in parallel with your other web applications on the same server, behind a web server like Nginx.

params will hold all parameters sent along with the application on the command line.

current_key is used while parsing the parameters you pass to this application on start. As we support multiple environments, we will check for the “env” parameter, expecting it to be either empty (defaulting to TEST), TEST or PROD.

hash_key is used for our simple token based authorization logic. We will use HMAC-SHA256, which takes the plain-text username the client sends, hashes it with hash_key and compares the result with the hash the client sent along the request, to allow or block the request. This is a very simple authorization method but still good enough to secure your API. Later I will explain what other steps you can take to make your API even more secure.

Now let’s add the code that parses the arguments we get from the command line and stores them in params:

// iterate through parameters providing for this server and store
process.argv.forEach(function (val, index) {
  if( index >= 2) {
    if( index % 2 == 0 && val.startsWith("-")) {
      current_key = val.substr(1).toLowerCase();
      params[current_key] = "";
    } else {
      params[current_key] = val;
    }
  }
});

As the first two elements of process.argv contain the path to the node binary and to your server.js file, we need to start parsing at index 2. The way we will call our API with parameters looks like the following:

node server.js -env PROD

This is why the first IF block checks for the dash, stores the name (without the dash) in current_key and expects the next argv element to be the parameter value, so node server.js -env PROD ends up as params = { env: "PROD" }.

In the following code we define our encryption function to generate the hash using HMAC-SHA256 and our hash_key:

// main encrypt function using hmac256
function encrypt( text){
  var crypted = crypto.createHmac("sha256", hash_key).update(text).digest("hex");
  return crypted;
}

Now we check our params object for the env parameter, default to TEST if it is empty, and exit with an error if the caller provided neither TEST nor PROD.

if( !params["env"]) {
  params["env"] = "TEST";
}

// env param is mandatory
if( params.env != "TEST" && params.env != "PROD") {
  console.log("Error: env parameter is mandatory. Allowed values are TEST and PROD.");
  process.exit(1);
}

As we will support HTTPS, we need a certificate (and its private key) to secure our connections. You can either use a self-signed certificate (not recommended), buy a certificate for your production environment, or use Let’s Encrypt (recommended), which is free and very easy to set up using the certbot-auto script.

Check this link to see how you can set up your certificate using certbot-auto for free.

The steps to create your certificate with certbot-auto are easy:

  1. Run certbot-auto certonly
  2. Choose Nginx if you have a running Nginx server, or webroot if you prepared a web folder certbot can write to, as it needs to verify that you control the domain you are requesting the certificate for.
  3. Type in the domain name you want to create the certificate for.
  4. Once finished, you will find your certificate files (e.g. privkey.pem and fullchain.pem) under

    /etc/letsencrypt/live/xyz.yourdomain.com/

In my case I set up a certificate for api-test.eser-esen.de. So add the following code to server.js (using my path in this case):

var options = {
  key: fs.readFileSync("/etc/letsencrypt/live/api"+(params["env"] == "TEST" ? "-test" : "")+".eser-esen.de/privkey.pem"),
  cert: fs.readFileSync("/etc/letsencrypt/live/api"+(params["env"] == "TEST" ? "-test" : "")+".eser-esen.de/fullchain.pem")
};

As you can see, I check the “env” parameter to switch between the certificate files; I created one certificate for TEST and one for PROD. This is required because my app “Puckify” runs on iOS, and iOS does NOT allow connecting to unsecured or self-signed URIs by default. There is a way to make iOS accept self-signed certificates, but I found it way too hard to get working, so instead spend three minutes with certbot-auto to create a valid certificate.

Now the main part of server.js comes into play. Using everything from above we now create our express instance which allows us to define rules to handle incoming requests in a sequential order.

First we check for authorization, then we check for the domain the client is accessing this application from, finally blocking everything else if none of the above rules match.

var app = express()
  .use(function(req, res, next) { // handle authorization first
    var path = req.path && req.path.length > 0 && req.path.startsWith("/") ? req.path.substr(1) : "";
    var tokens = path.split("/");
  
    var auth = req.get("Authorization");
    if( !auth || !auth.match(/:/)) {
      res.status(401).send({error: "001: Unauthorized"});
    } else {
      tokens = auth.split(":");
      var encrypted = encrypt(tokens[0]);
      if( encrypted != tokens[1]) {
        res.status(401).send({error: "002: Unauthorized "});
      } else {
        // OK
        next();
      }
    }
  })
  .use(vhost("api"+(params.env == "TEST" ? "-test" : "")+".eser-esen.de", require("./main.js").app))
  .use(function(req, res) { // if nothing above matched, block with 403
    res.status(403).send({error: "Forbidden"});
  });

In the first “use” function call we fetch the “Authorization” header which has to look as follows:

Authorization: itsme:3b4852a227a9ffb99cb0e5480ff12cff6de88fd3410859f5c36499af839b4d58

We take the plain text part, which in this case is “itsme”, use the encrypt function to generate the hash and then compare it to the one from the Authorization header. If it matches, we continue; if not, we block the request with status code 401 (Unauthorized) and return a short error message in the body (just for information). I also added two kinds of error messages: the first (001) is used if something is wrong with the header itself, the second (002) if the hash does not match (wrong passphrase used?).
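For illustration, here is a minimal sketch of how a Node.JS client could build that header and call the API. It assumes the client knows the same hash_key and that the API from this article runs on api-test.eser-esen.de, port 9999:

"use strict";

var crypto = require("crypto");
var https = require("https");

var hash_key = "MySecretPassphrase"; // must match the key stored on the API server
var username = "itsme";
var hash = crypto.createHmac("sha256", hash_key).update(username).digest("hex");

var req = https.request({
  hostname: "api-test.eser-esen.de",
  port: 9999,
  path: "/1.0/datetime",
  method: "GET",
  headers: { "Authorization": username + ":" + hash }
}, function(res) {
  var body = "";
  res.on("data", function(chunk) { body += chunk; });
  res.on("end", function() { console.log(res.statusCode, body); });
});
req.on("error", function(err) { console.log(err); });
req.end();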

In the second “use” function call we check the virtual host the API is actually being called on; if the defined host matches, we hand the request over to the express app exported by main.js.

If neither of the two “use” calls matches, we end up with 403 (Forbidden). This can happen if the authorization is correct but the client somehow reached this API with a host other than the one defined in server.js. So the last “use” call should never be hit if you set up everything correctly.

The main code for server.js is done, and we finally start the server, printing a line to the console telling us that the API is now running.

https.createServer( options, app).listen(port); // start listening
console.log("test api "+params.env+" started on port " + port);

main.js

The code in main.js is only a few lines. It basically connects to MongoDB and, once connected, initializes the routes the API will listen on.

"use strict";

var express = require("express");
const MongoClient = require('mongodb').MongoClient;
var bodyParser = require("body-parser");
var app = express();
var config = require("./config/db");

app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json());

// init mongo db client then init routes
const client = new MongoClient(config.url, { useNewUrlParser: true });
client.connect(function(err, database) {
  if (err) return console.log(err);
  const db = client.db(config.dbname); // connect to database with name "test"
  require("./api/routes").default(app, db);
});

exports.app = app;

db.js

db.js simply holds the DB credentials plus the database name. The connection string can be found in your MongoDB Atlas account; I use the short SRV string, which works with MongoDB version 3.6+. Depending on how you want to connect, MongoDB Atlas shows you the matching connection details. More details about connecting with Node.JS can be found here.

"use strict";

exports.url = "mongodb+srv://[USERNAME]:[PASSWORD]@clusterXXX-XYZ.mongodb.net?retryWrites=true";
exports.dbname = "test";

routes.js

In the routes.js file we initialize the controller for version 1.0 and define the routes our API will listen on. We map each route to a function on the controller using the method functions the express router provides. After all valid routes we need a wildcard route that catches every unknown route and returns a 404 with a basic error message; without it the API would not respond at all and your client would wait until its timeout triggers.

"use strict";

exports.default = function(app, db) {
  var v1_0 = require("./controllers/controller.1.0");

  v1_0.model.db = db;
  // Routes
  app.route("/1.0/datetime").get(v1_0.get_datetime);
  app.route("/1.0/book").post(v1_0.post_book);
  app.route("/1.0/book/:id").get(v1_0.get_book);

  // we need this to end all other routes this API does not know with 404
  app.get('*', function(req, res){
    res.status(404).json({error: "Unknown route"});
  });
};

Now imagine you are implementing a version 2 of your API offering more functions. The routes.js file could look as follows:

"use strict";

exports.default = function(app, db) {
  var v1_0 = require("./controllers/controller.1.0");
  var v1_1 = require("./controllers/controller.1.1");
  var v2_0 = require("./controllers/controller.2.0");

  // assign the db instance to all models behind all controllers
  v1_0.model.db = db;
  v1_1.model.db = db;
  v2_0.model.db = db;

  // Routes
  app.route("/1.0/datetime").get(v1_0.get_datetime);
  app.route("/1.0/book").post(v1_0.post_book);
  app.route("/1.0/book/:id").get(v1_0.get_book);

  // routes for 1.1, two more functions added in this version
  // post_book for example could work differently here compared to version 1.0
  app.route("/1.1/datetime").get(v1_1.get_datetime);
  app.route("/1.1/book").post(v1_1.post_book);
  app.route("/1.1/book/:id").get(v1_1.get_book);
  app.route("/1.1/article").post(v1_1.post_article);
  app.route("/1.1/articles").get(v1_1.get_articles);

  // routes for 2.0, two more functions added compared to 1.1
  // again, functions existing in earlier versions could work differently here
  // older clients probably won't work against 2.0, so you keep offering 1.0 and 1.1
  app.route("/2.0/datetime").get(v2_0.get_datetime);
  app.route("/2.0/book").post(v2_0.post_book);
  app.route("/2.0/book/:id").get(v2_0.get_book);
  app.route("/2.0/article").post(v2_0.post_article);
  app.route("/2.0/articles").get(v2_0.get_articles);
  app.route("/2.0/video").post(v2_0.post_video);
  app.route("/2.0/video/:id").get(v2_0.get_video);

  // we need this to end all other routes this API does not know with 404
  app.get('*', function(req, res){
    res.status(404).json({error: "Unknown route"});
  });
};

To ease the handling of your growing routes, you could split the routes.js file into version based files like routes.1.0.js and routes.1.1.js etc.

So in main.js you simply add as many

require("./api/routes.X.Y.js").default(app, db);

as you have versions deployed. Then every time you update or patch a single version you only need to deploy that one file, and your routes.X.Y.js files will not keep growing like the example above does as you add more versions.
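A minimal sketch of what such a version specific file could look like (here a hypothetical routes.1.0.js, reusing the controller from above):

"use strict";

// routes.1.0.js - only the 1.0 routes live in this file
exports.default = function(app, db) {
  var v1_0 = require("./controllers/controller.1.0");
  v1_0.model.db = db;

  app.route("/1.0/datetime").get(v1_0.get_datetime);
  app.route("/1.0/book").post(v1_0.post_book);
  app.route("/1.0/book/:id").get(v1_0.get_book);
};

Keep the wildcard 404 route out of these per-version files and register it once in main.js after all version files have been loaded; otherwise the first file would already answer every unknown route before the later versions get a chance to register theirs.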

controller.1.0.js

The controller file defines all functions your API offers for the given version. Here you basically do some validation (some people do it in the model, it is up to you), and the model handles the actual read/write work.

For now we have a simple datetime function which returns the current date and time in three formats, then a function to add a book by storing a document into MongoDB (timestamped with momentJS) and another one to read a book by its ID.

"use strict";
var model = require("../models/model.1.0");
var moment = require("moment");

exports.model = model;
exports.get_datetime = function(req, res) {
  var date = new Date();
  res.json({datetime: date.getTime(), datetime2: date.toUTCString(), datetime3: date.toISOString()});
};

exports.post_book = function(req, res) {
  var params = req.body;
  params.datetime = moment().format("DD.MM.YYYY HH:mm:ss");
  model.addBook(params, function(ok, details) {
    if(ok) {
      res.status(200).json({details: details});
    } else {
      res.status(400).json({error: "Error storing book", details: details});
    }
  });
};

exports.get_book = function(req, res) {
  console.log(req.params.id);
  model.getBook(req.params.id, function(results) {
    res.json(results);
  });
};

As you remember, the route for reading a single book has a single parameter, “:id”. Whatever you put into the URL right after /book/ will be available in the req object as req.params.id.
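As a small standalone sketch (not part of this API, the route is made up for illustration), a route can carry several such parameters and express collects them all in req.params:

"use strict";

var express = require("express");
var app = express();

// hypothetical route with two parameters, just to show how express fills req.params
app.get("/1.0/book/:id/page/:page", function(req, res) {
  // GET /1.0/book/5c1a2f/page/12  ->  req.params = { id: "5c1a2f", page: "12" }
  res.json(req.params);
});

app.listen(3000);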

model.1.0.js

The model file holds the real logic for reading and writing. Here you do the actual work, like reading documents from MongoDB (or whatever source) and writing into it.

I basically work with callbacks, as all DB related actions are asynchronous. Be careful when you try to do this synchronously: calling an asynchronous driver as if it were synchronous leads to empty results and hours of headaches wondering why you don’t get any data (see the small sketch after the model code). Database and network related actions are better handled asynchronously. There are ways to fetch and write data synchronously, but that blocks the current thread of your API as long as the action runs, which matters when your API handles other requests in the same thread at the same time.

"use strict";

// simple test function
exports.test = function() {
  console.log("hi");
};

// insert a new book document and report success plus the stored document via the callback
exports.addBook = function(params, callback) {
  exports.db.collection("books")
  .insertOne(params)
  .then(result => {
    console.log("book insert done. inserted:"+result.insertedCount);
    console.log(result);
    callback(result.insertedCount >= 1, result.ops[0]);
  })
  .catch(err => {
    callback(false, err);
  });
};

// look up a single book by its MongoDB ObjectID and hand it to the callback
exports.getBook = function(id, callback) {
  var mongo = require('mongodb');
  return exports.db.collection("books").find({_id: new mongo.ObjectID(id)}).toArray(function(err, results){
    console.log(results);
    callback(results[0]);
  });
};
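To illustrate the callback pitfall mentioned above, here is a small sketch (not part of the model; it assumes the same exports.db instance and uses the driver’s findOne call): the synchronous-looking variant returns before the driver has fetched anything, while the callback variant hands back the result once it is there.

// WRONG: result is undefined, the query has not finished when we return
exports.getBookBroken = function(id) {
  var result;
  exports.db.collection("books").findOne({_id: id}, function(err, doc) {
    result = doc; // runs later, long after the return below
  });
  return result;
};

// RIGHT: hand the result back through a callback once the driver is done
exports.getBookAsync = function(id, callback) {
  exports.db.collection("books").findOne({_id: id}, function(err, doc) {
    callback(err, doc);
  });
};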

That’s it! Now run the following commands to start the API:

npm install
node server.js -env TEST

You can find the archive with all files, ready to install and start, here: testapi.nodejs.tar.gz. I also added certbot-auto from https://certbot.eff.org/docs/install.html.

Node.JS App on Port 80 while Webserver runs on Port 80?

Nginx

You need to define a server block which catches the API domain. To do so, add the following configuration (using my API test domain, put yours instead) to sites-enabled (or to sites-available and then symlink it into sites-enabled) and restart Nginx.

server {
    listen 80;
    server_name api-test.eser-esen.de;

    location / {
        proxy_set_header   X-Forwarded-For $remote_addr;
        proxy_set_header   Host $http_host;
        proxy_pass         "http://127.0.0.1:9999";
    }
}

For port 443 (SSL) it is easiest to create a port 80 configuration in Nginx first and then run certbot-auto with the Nginx option. This will automatically adapt your configuration file, adding all parameters including the certificates to your Nginx configuration.

In this case you don’t need to load the certificates in your Node.JS app, because Nginx handles the SSL part while the connection between Nginx and Node.JS is unsecured, as it runs locally. If your Node.JS app and Nginx run on different servers, keep the certificate options in your Node.JS app and point proxy_pass to https://.. instead so that connection is secure as well.

Apache

For this you need mod_proxy to be installed and ready. Then prepare a vhost configuration for your API domain and use the following (again using my host as an example):

<VirtualHost *:80>
ServerName api-test.eser-esen.de
ProxyPreserveHost on
ProxyPass / http://localhost:9999/
</VirtualHost>

The same applies here: if you expose the API on port 443 (SSL), you need to load the certificates in this vhost as well. If you need SSL between Node.JS and Apache, replace http with https in the ProxyPass directive.

Additional steps to make your API even more secure

A completely public API is the hardest to secure, as you never know who is actually connecting to it.

There are different questions you need to clarify when you design and implement your API:

  • Will the API be completely public?
    • A free app fetching data from an API? Think of Google Maps on devices: there is no need to log in, yet you can use it for free, search for locations and calculate routes, all loaded from Google’s API.
  • Do users always need login credentials to use my API?
    • Think of Facebook, which cannot be used without logging in.
  • Do I want my API to be public but ensure it is only accessed by my mobile app?
    • Again Google Maps: it is free, but users can log in and use their profile data (stored locations, location history, etc.) while using Google Maps.

Depending on your answers, you need to follow some rules and apply patterns to secure your API.

If you think you can fully meet the requirement from point 3 (allowing the API to be used only by your app), you are wrong. In fact, it is impossible to keep your API completely safe from bots, humans or whatever else pretending to be your app. What you can do is make it as hard as possible for others to abuse your API or access it in a way you don’t want.

You can never fully verify the client itself, but you can verify its behaviour. With that in mind, the following approaches make your API more secure, or at least as hard as possible to abuse for those you don’t want to have access.

  • Use timestamps, integrated into the request headers (see the sketch after this list).
    • Having the client send its timestamp and comparing it with the current timestamp on your server allows you to block so-called “replay” attacks. Allow a time difference of a few seconds, as you cannot expect every entity on planet earth to be in perfect sync all the time.
    • If a bad guy copies a request by sniffing and resends it to your API hoping to get data he is not allowed to see, it will fail, as the timestamp in the request will be in the past (old enough that the difference is way higher than the few seconds you allow).
  • User logins.
    • If possible, make your API accessible only with user credentials.
    • A stranger won’t get any access, as he has to sign up first (which you might only allow via your app).
    • Still, if someone steals valid credentials he will get access to your API.
  • Throttling mechanism.
    • With the API as described in this article, someone could take it down by simply bombing it with requests, or run a brute force like attack, trying to fetch data with different request parameters.
    • The API would try to handle each request and end up overloaded by requests coming from a few IPs or maybe even a single IP.
    • Integrate a throttling mechanism by defining a frequency of allowed requests per IP (see the sketch after this list).
    • Additionally you could define a per-app usage quota, either based on a user login or on some unique ID you make your app send to the API, and control the traffic by that ID, limiting it to a reasonable frequency. Basic delays are enough, making users wait longer the more requests they make. Be careful: users connecting from the same network share the same IP and will be affected when another user from that network hits the quota or threshold.
  • No keys in the URI.
    • Extending your API with new functions and security related content might tempt you to put keys into the URI. Never do that, as URIs are logged on servers and readable by humans, while body payloads and headers are not.
    • So always put your keys either in the payload or in a header.
  • Use Apple’s DeviceCheck API and Google’s SafetyNet API.
    • Both APIs let you verify a token your app sends along, which confirms the app’s integrity and proves that the request really comes from your app installed on a real (iOS or Android) device.
    • Since both are APIs themselves, there is some chance that someone might spoof or even steal a valid token and get access to your API despite this technique, but as I said, make it as hard as possible.
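To make the timestamp and throttling points more concrete, here is a minimal sketch of two extra express middlewares that could be chained into server.js before the vhost rule. The X-Timestamp header name, the 30 second window and the 30-requests-per-minute limit are my own assumptions, not part of the API above:

// replay protection: reject requests whose timestamp is too far away from the server time
var MAX_DIFF_MS = 30 * 1000; // allow a difference of a few seconds, clocks are never perfectly in sync
function checkTimestamp(req, res, next) {
  var ts = parseInt(req.get("X-Timestamp"), 10); // client sends its current time in milliseconds
  if (!ts || Math.abs(Date.now() - ts) > MAX_DIFF_MS) {
    return res.status(401).send({error: "Request expired"});
  }
  next();
}

// very simple in-memory throttling: at most 30 requests per IP per minute
var hits = {};
function throttle(req, res, next) {
  var now = Date.now();
  var ip = req.ip;
  hits[ip] = (hits[ip] || []).filter(function(t) { return now - t < 60 * 1000; });
  if (hits[ip].length >= 30) {
    return res.status(429).send({error: "Too many requests"});
  }
  hits[ip].push(now);
  next();
}

// in server.js they would be registered like the other rules:
// var app = express().use(checkTimestamp).use(throttle) ...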

Always keep in mind that making your API more secure should NOT hurt your users. Your API should not require users to take more steps than needed just because you want to triple-check whether they are allowed in, nor should it slow down because of complex, time-consuming security tasks; in the end that kills your app’s success.

If you publish your iOS or Android app to the world, you basically publish your source code too, since anyone can decompile the app running on their device and figure out how you connect to your API and what data you are sending. Passwords stored in plain text in your source code can be exposed this way. So always follow the security rule: “Make it as hard as possible”.

Create textures in Swift apps without images

Creating games is fun. Making games on your own and seeing the results gives every developer a reason to keep going. But working on a game as a lone wolf takes time, and especially when it comes to graphics (images, sprites, textures, etc.) you can easily spend weeks or months finding and creating the proper artwork.

This mini tutorial is not about designing graphics or how to use tools like Photoshop or Inkscape; it is about how to create textures without a single image file (PNG, JPG, etc.). The reason I decided to “code” my textures in-game was simply to save time. When I started working with Xcode, I was impressed by how easy it is to code and also to set up assets. But you still need to design and create (or buy) every single image, and to deliver the best results on all devices you should offer three sizes of each image.

(Image from apple.com)

The technique I am about to describe is implemented in my iOS game “Puckify” (check here).

The idea is to create all textures used in your game on the fly when the app starts: a so-called TextureManager “builds” all images during app start using one of the following methods:

  • UIBezierPath. Code your image/sprite/whatever by hand and create an SKShapeNode out of it in a first step.
  • Paintcode. Draw or import any image or SVG file, adjust it to your needs, turn it into Swift code and put it into a function which returns a single UIBezierPath for that image.
  • SVG files with a parser. Import SVG files into your project and parse them into a UIBezierPath or CGPath to create an SKShapeNode instance out of it in a first step.

The SVG file method is obviously not quite the “no image files” technique I am describing here, but it is still easier than drawing images on your own.

UIBezierPath

If you are a pro, you might be able to code your image/sprite right in Xcode using UIBezierPath. The following function generates the UIBezierPath for a simple star shape:

func star() -> UIBezierPath {
    let star = UIBezierPath()
    star.move(to: CGPoint(x: 112.79, y: 119))
    star.addCurve(to: CGPoint(x: 107.75, y: 122.6), controlPoint1: CGPoint(x: 113.41, y: 122.8), controlPoint2: CGPoint(x: 111.14, y: 124.42))
    star.addLine(to: CGPoint(x: 96.53, y: 116.58))
    star.addCurve(to: CGPoint(x: 84.14, y: 116.47), controlPoint1: CGPoint(x: 93.14, y: 114.76), controlPoint2: CGPoint(x: 87.56, y: 114.71))
    star.addLine(to: CGPoint(x: 72.82, y: 122.3))
    star.addCurve(to: CGPoint(x: 67.84, y: 118.62), controlPoint1: CGPoint(x: 69.4, y: 124.06), controlPoint2: CGPoint(x: 67.15, y: 122.41))
    star.addLine(to: CGPoint(x: 70.1, y: 106.09))
    star.addCurve(to: CGPoint(x: 66.37, y: 94.27), controlPoint1: CGPoint(x: 70.78, y: 102.3), controlPoint2: CGPoint(x: 69.1, y: 96.98))
    star.addLine(to: CGPoint(x: 57.33, y: 85.31))
    star.addCurve(to: CGPoint(x: 59.29, y: 79.43), controlPoint1: CGPoint(x: 54.6, y: 82.6), controlPoint2: CGPoint(x: 55.48, y: 79.95))
    star.addLine(to: CGPoint(x: 71.91, y: 77.71))
    star.addCurve(to: CGPoint(x: 81.99, y: 70.51), controlPoint1: CGPoint(x: 75.72, y: 77.19), controlPoint2: CGPoint(x: 80.26, y: 73.95))
    star.addLine(to: CGPoint(x: 87.72, y: 59.14))
    star.addCurve(to: CGPoint(x: 93.92, y: 59.2), controlPoint1: CGPoint(x: 89.46, y: 55.71), controlPoint2: CGPoint(x: 92.25, y: 55.73))
    star.addLine(to: CGPoint(x: 99.46, y: 70.66))
    star.addCurve(to: CGPoint(x: 109.42, y: 78.03), controlPoint1: CGPoint(x: 101.13, y: 74.13), controlPoint2: CGPoint(x: 105.62, y: 77.44))
    star.addLine(to: CGPoint(x: 122, y: 79.96))
    star.addCurve(to: CGPoint(x: 123.87, y: 85.87), controlPoint1: CGPoint(x: 125.81, y: 80.55), controlPoint2: CGPoint(x: 126.64, y: 83.21))
    star.addLine(to: CGPoint(x: 114.67, y: 94.68))
    star.addCurve(to: CGPoint(x: 110.75, y: 106.43), controlPoint1: CGPoint(x: 111.89, y: 97.34), controlPoint2: CGPoint(x: 110.13, y: 102.63))
    star.addLine(to: CGPoint(x: 112.79, y: 119))
    star.close()
    return star
}

If you are not familiar with UIBezierPath, check out Apple’s documentation here.

Use Paintcode

Paintcode is a great commercial product that lets you either import media or draw on your own. Once you are done, it gives you code in various languages which programmatically renders the image you created or imported in Paintcode. Paintcode has a 5 day free trial, which I highly recommend trying if you don’t know the tool. If you like it, buy it for $99 (as of 27th Dec 2018; one-time purchase). Companies pay $199 per seat per year.

Paintcode does all the work of coding your image for you. Simply copy the code and use it in your game/app. Besides Swift, Paintcode offers other targets such as Objective-C, C# Xamarin, Android Java, Web SVG, JavaScript Canvas and some more.

SVG files with parser

Another way to code your images is to parse SVG files into UIBezierPath in your Swift app. This requires you to bundle the SVG file with your project so your app can read the file at runtime and convert it to a UIBezierPath.

I know of four solutions so far, all on github.com. I tried PocketSVG, which works great unless you have some monstrously complex SVG file to handle; in that case you might get weird results, and it makes sense to cut your SVG into pieces, load each part and merge them into one in Swift. These are the projects I found so far:

  • https://github.com/pocketsvg/PocketSVG
  • https://github.com/exyte/Macaw
  • https://github.com/mike-engel/swiftvg
  • https://github.com/mchoe/SwiftSVG

With the following lines of code you can load your SVG and turn it into a UIBezierPath (using PocketSVG here):

let svgpath = Bundle.main.url(forResource: "star", withExtension: "svg")!
let star_bp = SVGBezierPath.pathsFromSVG(at: svgpath)[0] // pathsFromSVG(at:) returns an array of paths, take the first one
let node = SKShapeNode(path: star_bp.cgPath, centered: true)
node.fillColor = .white
node.strokeColor = .black

First step done. What next?

Well, you could now start putting your SKShapeNode instances right into your scene. But wait, this is a bad approach: SKShapeNode represents a vector based shape, which would be added to your scene as such.

Your app would then render a vector graphic every frame. If you do that, you will see your CPU usage rise quickly, since it is busy calculating and drawing your shape. For a simple shape that you want to scale and manipulate every frame this might be useful, but it is not the way to go for most sprites, as it would eat up all the CPU power of the device.

Instead we need to work with SKSpriteNode, which is built from an SKTexture instance containing your sprite.

So after you have created your SKShapeNode with a proper fill color, stroke color and probably some more, you need to do the following:

let node = SKShapeNode(path: star_bp.cgPath, centered: true)
let view = SKView(frame: UIScreen.main.bounds)
let texture = view.texture(from: node)!
let el = SKSpriteNode(texture: texture)
self.addChild(el)

Lines 2-4 are the missing steps to create your sprite, which you can now add to your scene; this definitely performs better than handling SKShapeNode instances directly.

There is no need to use your running SKView in your app to convert your SKShapeNode instances into a texture. It works fine with a separate instance.

But what about the sizes?

This is the next thing we need to handle. The first question you need to answer is which devices you are going to support and which is the smallest of them.

When I worked on “Puckify”, I had to check the current iOS version distribution and which devices people were using. There are different sites providing the information you need. One site I found was from David Smith (link). Using the statistics from his app “AudioBooks”, he offers a good overview of the current iOS distribution. You also need to know the device sizes and their resolutions, which you can find here.

As you can see at iosres.com, the smallest device on the left is the iPhone 8, which I chose as my default device. Looking at its logical resolution of 375×667, you need to find a proper size either for all your sprites or for each one individually. My sprites are square, so I decided that 100×100 sprites look good on an iPhone 8.

So my TextureManager class was building up buttons, backgrounds, icons like shields, stars, my player sprites, enemies etc. all sized 100×100.

But once you start your app on an iPad or the iPhone 8+ your sprites will look blurry and pixellated, simply because 100×100 is way too small for a bigger device.

So why not just scale up? My approach was very simple.

In my TextureManager I defined 375×667 (or 667×375 when running in landscape mode) as my default screen size. Once the screen gets bigger, I calculate the scale factor looking only at the width of the device.

For this I built a simple sizeMultiplier function whose result I multiply into the width and height of each SKShapeNode using the CGAffineTransform class.

struct DefaultResolution {
  static let width = CGFloat(667)
  static let height = CGFloat(375)
}

static func sizeMultiplier() -> CGFloat {
  let ssize = UIScreen.main.bounds.size
  if ssize.width > DefaultResolution.width && ssize.height > DefaultResolution.height {
    let w = ssize.width / DefaultResolution.width
    return (w * 10).rounded() / 10 // round to one decimal place
  }
            
  return 1.0;
}

Now, using the code above, I scale all my sprites, walls, enemies, just everything, with sizeMultiplier().

One full example using my TextureManager looks like this:

let scalefactor = Constants.Config.sizeMultiplier()
let scaleRefresh = CGAffineTransform(scaleX: scalefactor, y: scalefactor)
// ### create REFRESH texture for refresh button
let r = UIBezierPath.refresh() // one of my UIBezierpath function returning the path for a refresh icon
r.apply(scaleRefresh) // now scale the bezierpath before creating SKShapeNode
var refresh = SKShapeNode(path: r.cgPath, centered: true)
refresh.fillColor = .white
refresh.strokeColor = .black
self.addTexture(withName: "refresh.icon", texture: self._view.texture(from: refresh)!)

Now I don’t need to care about my refresh icon anymore: no matter on which device I use it, it is scaled properly. I tested this on an iPhone 6, iPhone 6+, iPhone 7, iPhone 8+, iPad 2 and the latest iPad (9.7 inch, 6th generation), and everything scales properly. The following screenshots from the iOS Simulator show the main screen of my app “Puckify”.

Warning: The logic described only works if your app runs either in landscape or in portrait mode, not in both. If you want to support both orientations, you need to handle that in your code. Look again at the iPhone 8 with its logical resolution of 375×667: rotating the device turns the width of 375 into 667, but you don’t want to scale up, since it is still the same device. Check the value given by UIScreen.main.bounds.size and compare both width and height to make sure the screen is really bigger than the default you chose in the beginning. In that case, when merely rotating, you should NOT scale up.


A note about scaling

Never ever use .setScale on the SKShapeNode, because it is a simple scaler without any proper algorithm behind it; if you scale a shape up to, let’s say, twice its width, you will clearly see it get pixellated.

Instead, use the following code (as seen above) to scale the UIBezierPath before you create the SKShapeNode instance:

let scaleRefresh = CGAffineTransform(scaleX: scalefactor, y: scalefactor)
let r = UIBezierPath.refresh() // one of my UIBezierpath function returning the path for a refresh icon
r.apply(scaleRefresh) // now scale the bezierpath before creating
// now do create the SKShapeNode instance using the bezierpath instance