Designing Elixir Systems With OTP – Part One

These are my notes as I work my way through this book from PragProg. I am reading a beta version, so things may change. I’ll be posting them to my GitHub repo: https://github.com/chriseyre2000/designing_elixir

The introduction starts by building a simple counter application as the domain and wraps it in a send/receive process with an API.
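The shape of that example is roughly this – a minimal sketch of the pattern from memory, not the book’s actual code (module and message names are my own):

defmodule Counter do
  # Pure functional core: the domain is just an integer
  def new(start \\ 0), do: start
  def tick(count), do: count + 1
end

defmodule Counter.Server do
  # Process boundary: a hand-rolled send/receive loop around the core
  def start(initial \\ 0) do
    spawn(fn -> listen(Counter.new(initial)) end)
  end

  defp listen(count) do
    receive do
      :tick ->
        listen(Counter.tick(count))

      {:state, caller} ->
        send(caller, {:count, count})
        listen(count)
    end
  end
end

defmodule Counter.API do
  # Client-side API that hides the message protocol
  def tick(pid), do: send(pid, :tick)

  def state(pid) do
    send(pid, {:state, self()})

    receive do
      {:count, count} -> count
    end
  end
end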

The code samples here are clean and clearly ordered. The only issue that I have found so far is that I needed to delete the autogenerated Count.ex file to avoid a warning.

Takeaway from the intro: Supervisors are about lifecycles.

Takeaway from type introductions: Send the functions to the data!

Chapter 3 typo: response.ex is misspelt (reported).

excoveralls looks to be a better code coverage tool than the built-in mix test --cover.

Essential Docker and Docker-Compose Commands

I spent some of today trying to work out why a Docker image that I was using did not work correctly.

This is the most useful command for debugging a docker image:

docker run -it ubuntu /bin/bash

This example is based upon the ubuntu image, but feel free to replace it with whatever image you are trying to debug.

You can do the same with docker-compose (the service name from the compose file is required):

docker-compose run --entrypoint /bin/bash <service>

I am now looking at how to run a Gradle build inside a Docker container and then extract the log files afterwards, along the lines of the sketch below. It may be possible to add a web server to view them.
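A minimal sketch of the extraction step (the image name and paths are assumptions):

# Run the build in a named container (my-gradle-image and /app are assumptions)
docker run --name gradle-build my-gradle-image gradle build

# Copy the log/report files out of the now-stopped container
docker cp gradle-build:/app/build/reports ./reports

# Tidy up
docker rm gradle-build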

It’s now also possible to stand up a Jenkins server on your dev machine to test Jenkinsfiles before deploying them to the build server.

docker run -p 8080:8080 -p 50000:50000 jenkins/jenkins:lts

Designing for Scalability with Erlang/OTP using Elixir – Part 2

Chapter 4

I am still working my way through this book.

The main difficulty that I am finding with chapter 4 is the scattered nature of the example code. It’s all small snippets, some of which reference code from the previous chapter.

This is in addition to the Erlang => Elixir translations:

  • atom => :atom
  • VariableName => variable_name
  • moduleName => ModuleName

Some of the Elixir functions also have their arguments reordered for consistency – compare the two start_link calls below:

# Erlang
1> gen_server:start_link({local, timeout}, timeout, [], []).

# Elixir
1> c "Timeout.ex"
2> GenServer.start_link(Timeout, [], name: :timeout)
3> GenServer.call(:timeout, {:sleep, 100})
4> GenServer.call(:timeout, {:sleep, 5001})

A GenServer.call will, by default, time out if the server takes more than 5 seconds (5000 ms) to reply – hence the {:sleep, 5001} call above fails.
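The Timeout module itself is roughly this (my reconstruction, not the book’s listing):

defmodule Timeout do
  use GenServer

  def init(args), do: {:ok, args}

  # Sleeps before replying; GenServer.call/2 gives up after 5000 ms by default,
  # so {:sleep, 5001} trips the timeout while {:sleep, 100} succeeds
  def handle_call({:sleep, ms}, _from, state) do
    Process.sleep(ms)
    {:reply, :ok, state}
  end
end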

Samples for Chapter 4 have been pushed to

https://github.com/chriseyre2000/scalability_ex with hash: cc8dd44

Chapter 5

The Erlang sys module can be referenced from iex as :sys.

The log commands require atoms:

:sys.log(Frequency, :print)

:sys.log(Frequency, :get)
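Note that, as far as I can tell, logging has to be switched on first (with the boolean form) before :print or :get have anything to show:

:sys.log(Frequency, true)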

Now at hash: 830c1ea

Chapter 6

This is an odd chapter for Elixir, as the Finite State Machine behaviour (gen_fsm) was considered defective and was never adapted as an Elixir wrapper – Erlang itself has since superseded it with gen_statem. I am going to skip this chapter.

Chapter 7

GenEvent has also been deprecated.

Designing for Scalability with Erlang/OTP using Elixir – Part 1

This is another of my book walk through series. This time I am working on Designing for Scalability with Erlang/OTP.

This is the book that goes into detail about OTP in general.

Chapter 2: Introducing Erlang

https://github.com/chriseyre2000/scalability_ex (edc0862)

The above is the repo, with the commit hash marking how far I have currently got.

So far we have some basic recursion and some manual send/receive logic.

We have now seen the first use of ets (Erlang Term Storage) – a simple in-memory term store (it lives and dies with its owning process, so it is not really a persistence layer).
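Basic usage looks like this (a generic sketch, not the book’s exact example – the table and key names are my own):

# Create a named table, insert tuples, then look them up by key
:ets.new(:frequencies, [:set, :named_table])
:ets.insert(:frequencies, {155, :allocated})
:ets.lookup(:frequencies, 155)
# => [{155, :allocated}]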

Notes from Twitter

You build it, you support it (even if not first line). It’s the other side of continuous delivery. You also need the right to fix anything that could wake you up. It focuses developers’ minds on proper testing, logging, auditing and error handling. There is no wall.

Management frequently believes that the value of advice is proportional to its cost. Similarly, knowledge gained outside an organisation is treated as more valuable than that gained within it.

Automate Your Life: Banking

I have been encouraging my team to automate everything.

To live by this principle I have started to automate my bank spreadsheet. I keep a spreadsheet with all of my banking transactions. This comes in handy should I need to investigate an old transaction. My current sheet covers the last 9 years. I had been manually copying the details over.

My bank allows statements to be exported as CSV files.

Here is a simple bash one-liner that puts the data into the format that I want:

cat filename.csv | awk -F, '{print $1 "," $5 "," $7 "," $6}' | sed '1d' | tail -r
# awk reorders the columns, sed drops the header row, and tail -r (BSD/macOS; use tac on GNU) reverses the lines

I like the newest transactions at the bottom and credits before debits.

The result is much easier to import into a Google Sheet than fighting with OpenOffice.

I have multiple current accounts and rebalance the main current account to a fixed amount at the end of each month. The remainder is moved into an offset mortgage. Credit card bills get paid (when due) from the mortgage account. This maximises the offset benefit.

How Jenkinsfiles Really Work

I only recently encountered the Jenkinsfile format. Previously I had used the clunky Jenkins UI or the cleaner Circle CI options.

Some of my colleagues had described it as using a special Groovy declarative syntax. It is in fact simply Groovy code using some neat tricks.

Groovy allows a closure (what other languages may call a lambda) to be used as a function parameter:

def block(Closure closure) {
    closure.call()
}

This can then be used as follows:

block({ print('hello') })

Groovy allows the closure to be moved outside the brackets as a code block:

block() {
    print 'hello'
}

Here I have used the Groovy trick of dropping brackets. In fact you can drop the leading parentheses entirely:

block {
    print 'hello'
}

This is beginning to look like the pipeline or stage steps from a Jenkinsfile.

You can even add parameters:

def wrap(marker, Closure closure) {
    println marker
    closure.call()
    println marker
}

// Which can be used as:

wrap('name') {
    // something …
}
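Nesting these blocks is all it takes to get something Jenkinsfile-shaped. Here is a toy sketch (entirely my own – nothing like Jenkins’ actual internals):

// Toy versions of pipeline/stage built from the closure trick above
def pipeline(Closure body) { body.call() }

def stage(String name, Closure body) {
    println "stage: ${name}"
    body.call()
}

pipeline {
    stage('Build') {
        println 'compiling…'
    }
    stage('Test') {
        println 'running tests…'
    }
}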

Jenkinsfiles are code pretending to be config, with the added benefit of being able to become code again when needed.

Painless is Painful

Elasticsearch 5.6 reached its end-of-life date on Monday. I work on a project that uses Elasticsearch extensively. The downside is that a recent refresh of the development team has left us with no more than six months’ exposure to a two-to-four-year-old code base.

The first discovery of the upgrade (to 6.6.1) was the new restriction that you need to be explicit about content types. This is not too difficult, but it does require a few changes.
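In practice it just means every request with a body needs the header (the index name here is made up):

curl -H 'Content-Type: application/json' -XPUT 'localhost:9200/myindex/_doc/1' -d '{"title": "example"}'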

The second big discovery was the move to not allow multiple types within an index. This can be resolved by adding a type field and using that to discriminate.
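So where one index previously held several types, they collapse into a single mapping type plus our own discriminator field (the field and value names here are hypothetical):

curl -H 'Content-Type: application/json' 'localhost:9200/myindex/_search' -d '
{
  "query": { "term": { "doc_type": "order" } }
}'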

The third discovery was `application/x-ndjson`, which is used by the bulk update process. This content type takes a list of JSON items, each terminated with a newline – an equivalent to a CSV file. For bulk updates you send pairs of lines: an action/metadata line followed by a body line with the details.
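A bulk request then looks something like this (the index and field names are made up):

# bulk.ndjson – one complete JSON object per line, each newline-terminated
{"index":{"_index":"myindex","_type":"_doc","_id":"1"}}
{"doc_type":"article","title":"first"}
{"index":{"_index":"myindex","_type":"_doc","_id":"2"}}
{"doc_type":"article","title":"second"}

curl -H 'Content-Type: application/x-ndjson' -XPOST 'localhost:9200/_bulk' --data-binary @bulk.ndjson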

The fourth discovery is the new scripting language called Painless, which replaces the Groovy scripting we had previously used. Despite various claims it is not a drop-in replacement. It’s a pared-down Java with almost no syntactic sugar. Adding arrays – nope. Converting to sets or lists – nope. String split requires regexes to be enabled (which the documentation advises against). Passing in parameters requires the sparsely documented params object. There is no CLI or compiler to test with – just error messages that try to point you in the right direction. The name Painless itself makes it hard to search for. I understand what it is trying to be (efficient and secure) but it comes across as clumsy.
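For the record, the params pattern looks like this (the index, field and value are made up):

curl -H 'Content-Type: application/json' -XPOST 'localhost:9200/myindex/_doc/1/_update' -d '
{
  "script": {
    "lang": "painless",
    "source": "ctx._source.counter += params.amount",
    "params": { "amount": 4 }
  }
}'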

Investigating Amplify

Here are my notes on creating a React app in Amplify, following the steps here: https://aws-amplify.github.io/docs/js/react

To make this interesting I have started with an AWS user that has zero access rights. Rights will be added as needed. I am developing this on a Mac.

I started by installing the Amplify CLI.

npm install -g @aws-amplify/cli

This starts with 5 warnings:

npm WARN deprecated circular-json@0.3.3: CircularJSON is in maintenance only, flatted is its successor.

npm WARN deprecated kleur@2.0.2: Please upgrade to kleur@3 or migrate to 'ansi-colors' if you prefer the old syntax. Visit <https://github.com/lukeed/kleur/releases/tag/v3.0.0> for migration path(s).

npm WARN deprecated minimatch@2.0.10: Please update to minimatch 3.0.2 or higher to avoid a RegExp DoS issue

node-pre-gyp WARN Using request for node-pre-gyp https download

npm WARN graphql-import@0.4.5 requires a peer of graphql@^0.11.0 || ^0.12.0 || ^0.13.0 but none is installed. You must install peer dependencies yourself.

I now have @aws-amplify/cli@1.1.7 installed.

Step 2 configure:

amplify configure

This fails with:

SyntaxError: Unexpected token ...
    at createScript (vm.js:74:10)
    at Object.runInThisContext (vm.js:116:10)
    at Module._compile (module.js:533:28)
    at Object.Module._extensions..js (module.js:580:10)
    at Module.load (module.js:503:32)
    at tryModuleLoad (module.js:466:12)
    at Function.Module._load (module.js:458:3)
    at Module.require (module.js:513:17)
    at require (internal/module.js:11:18)

nvm reveals that I am using Node 8.0.0 – the unexpected `...` token suggests the CLI uses spread syntax that this version does not support.

Now move Node up to the current LTS version (10.15.3), repeat the CLI install (same warnings as before), and on to the configure step:

This time it prompts me to login to aws and create a new user.

Apparently the minimum access rights are: AdministratorAccess

The user has been created and the access key stored in a profile.

Now I need to ensure that create-react-app is installed (JavaScript is very fashion-conscious and all the cool kids now use yarn):

yarn global add create-react-app

Now working on the init step:

amplify init

Here I get prompted for a name (matching the app helps).

Also:

  • Environment (dev)
  • Preferred editor (VS Code)
  • Type of app (javascript)
  • Framework (react)

There are several others here but the defaults work fine.

It then prompts you for a profile to use, then starts doing the AWS magic.

This creates all of the local config without yet pushing it to the cloud.

The next command is:

amplify add hosting

There are 2 options here: dev (HTTP, S3 only) and prod (HTTPS, via CloudFront).

For now I went for dev and followed the defaults.

The following will deploy the app:

amplify publish

This will create the application in S3, with the provisioning itself handled by CloudFormation.

My trivial app is now deployed.

Next step is to add authentication.

amplify add auth
amplify push
yarn add aws-amplify aws-amplify-react

I have added in the required boilerplate code and then called: amplify publish
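The boilerplate amounts to roughly this in src/App.js (a sketch based on the docs, not my exact code – the aws-exports module is generated for you by the Amplify tooling):

// src/App.js – sketch only; aws-exports is generated by amplify push
import React from 'react';
import Amplify from 'aws-amplify';
import { withAuthenticator } from 'aws-amplify-react';
import awsconfig from './aws-exports';

Amplify.configure(awsconfig);

const App = () => <div>Only visible once signed in</div>;

// withAuthenticator wraps the component in Cognito's sign-up/sign-in flow
export default withAuthenticator(App);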

Testing account creation hits one of my pet hates: invisible limits on password entry systems. The default Cognito password rules require one capital letter and one symbol (lowercase letters and numbers are already required).

The next trick will be to learn how to configure the sign-up and login screens and the verification codes – but that is outside the scope of this exercise.

To avoid leaving you with Amazon bills, there is the option of:

amplify delete

Mischief Managed.

Another Useful Dependabot Command

Commands are sent to dependabot by commenting on a pull request.

@dependabot ignore this major version

This command will prevent Dependabot from suggesting any further upgrades to a given dependency until it has been manually advanced past a certain point. This is really useful if you want to keep your Node version on LTS and don’t want to keep getting notified of bleeding-edge changes.