Livebooks are an amazing piece of technology

Let’s start by showing the amazing power of livebooks without even touching custom code.

First, install Livebook from this site: https://livebook.dev/

This provides a host to run a livebook.

Start a new livebook.

Paste in the following URL, then hit enter:

https://github.com/chriseyre2000/livebooks/blob/main/EventStorming.livemd

You now have a local copy of my mermaid starter for Event Storming.

Yes, livebooks allow you to open sample pages that happen to be sitting on a website.

If you click on the edit button, you can see the source used to create this.
Hover over the main image and hit +Block, and you can add another diagram block and copy and edit the key that I created.

Theoretically these can be edited collaboratively.

Event Schema Evolution in CQRS Event Sourcing

This is a collection of thoughts on this big topic. If you are working on an event-sourced system, it will need to cope with changes to the structure of events over time.

Here is some of the literature that I have found on this topic:

Avro and Protobuf are wire protocol schemas.

For Avro, I am quoting from https://docs.oracle.com/cd/E26161_02/html/GettingStartedGuide/schemaevolution.html:

These are the modifications you can safely perform to your schema without any concerns:

  • A field with a default value is added.
  • A field that was previously defined with a default value is removed.
  • A field’s doc attribute is changed, added or removed.
  • A field’s order attribute is changed, added or removed.
  • A field’s default value is added, or changed.
  • Field or type aliases are added, or removed.
  • A non-union type may be changed to a union that contains only the original type, or vice-versa.
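As a concrete illustration of the first rule (the record and field names here are mine, not from the Oracle docs), a reader using this v1 schema:

```json
{"type": "record", "name": "UserCreated", "fields": [
    {"name": "id", "type": "string"}
]}
```

can be upgraded to this v2 schema and still consume records written with v1, because the new field carries a default:

```json
{"type": "record", "name": "UserCreated", "fields": [
    {"name": "id", "type": "string"},
    {"name": "email", "type": ["null", "string"], "default": null}
]}
```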

Beyond these kind of changes, there are unsafe changes that you can do which will either cause the schema to be rejected when you attempt to add it to the store, or which can be performed so long as you are careful about how you go about upgrading clients which use the schema. These type of issues are identified when you try to modify (evolve) schema that is currently enabled in the store. See Changing Schema for details.

Avro tends to be used with centralised registries. This is both a benefit for consistency and a constraint for rapid change.

Protobuf is slightly more flexible. Given that the underlying storage is keyed by field ids, it is the practice to change an id when the meaning of a field changes. It is entirely possible to have two Protobuf definition files that are compatible yet don’t share any field names. This is great for translations across bounded contexts. Protobuf also permits a slight read/write disparity: a field that is required on the write side may be optional or omitted on the read side.
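To make that concrete, here is a sketch of two Protobuf definitions (all names invented) that are wire-compatible because compatibility hinges on field numbers and types, not names:

```proto
syntax = "proto3";

// customer.proto — one bounded context's view of the message.
message Customer {
  string customer_name = 1;
  int32 loyalty_points = 2;
}

// account.proto — another bounded context reads the same bytes.
// Same field numbers and types, so the wire formats are compatible.
message Account {
  string display_name = 1;
  int32 reward_balance = 2;
}
```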

Working with releases on GitHub

I have built a great command for working with releases:

 gh alias set --shell pending 'git log $(gh release list -L 1 | cut -f 1)..@ --pretty="%an - %s" | sort '

This requires you to have `gh` installed along with git, cut and sort.

If you have a specific branch that is merged into the one used for releases, this identifies the people whose commits will form part of the next release!

Kubernetes and Elixir Part 3

In part one of this series I linked to an article on how to integrate Elixir with Kubernetes.

The steps so far:

  • Create an Elixir application
  • Use Slipway to create the Dockerfile
  • Build the Dockerfile (and possibly test locally)

The next steps involve getting the Docker image to a registry that Kubernetes can see.
Once this has happened, we need to create a Kubernetes deployment.

This can be performed via `kubectl apply`; see https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
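A minimal sketch of such a deployment (the image name, labels, and port are placeholders for whatever you pushed to your registry):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-elixir-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: my-elixir-app
  template:
    metadata:
      labels:
        app: my-elixir-app
    spec:
      containers:
        - name: my-elixir-app
          image: registry.example.com/my-elixir-app:0.1.0
          ports:
            - containerPort: 4000
```

Saved as deployment.yaml, this would be applied with `kubectl apply -f deployment.yaml`.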

Presumably there is an equivalent using Helm; see https://helm.sh/docs/intro/

Once we have a basic application running in K8S the trick is to cluster it correctly, but that is a later article.

Using and abusing Docker Registry and Minikube: https://minikube.sigs.k8s.io/docs/handbook/registry/

Mermaid Event Storming

I have recreated the Event Storming PlantUML for Mermaid:

```mermaid
graph TD;
classDef facadeCommand fill:#779fae
classDef command fill:#aec6cf
classDef result fill:#cfcfc4 
classDef event fill:#ffb853
classDef domainEvent fill:#ffcb81
classDef integrationEvent fill:#ffdeaf
classDef query fill:#62d862
classDef readModel fill:#77dd77
classDef userInterface fill:#a2e8a2
classDef aggregate fill:#fdfd9d
classDef service fill:#fcfc78
classDef policy fill:#b6a2db
classDef saga fill:#c9bbe5
classDef process fill:#ddd4ee
classDef timer fill:#cfcfc4
classDef person fill:#ffd1dc
classDef system fill:#ffd1dc
classDef comment fill:transparent

FacadeCommand:::facadeCommand --> Command:::command
Result:::result --> Event:::event
DomainEvent:::domainEvent --> IntegrationEvent:::integrationEvent
Query:::query --> ReadModel:::readModel
UserInterface:::userInterface --> Aggregate:::aggregate
Service:::service --> Policy:::policy
Saga:::saga --> Process:::process
Timer:::timer --> Person:::person
System:::system --> Comment:::comment

```
Image colours as before

Announcing Slipway

This is a Mix extension that creates a boilerplate Dockerfile for any Mix project.

https://github.com/chriseyre2000/slipway

To use this, download the release and run:

mix archive.install slipway-0.1.0.ez

This adds a new task to Mix.

mix slipway.gen.docker

This generates a minimal Dockerfile for the Elixir project, based upon https://pentacent.medium.com/getting-started-with-elixir-docker-982e2a16213c but updated to work with `mix release` instead of Distillery.
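For reference, the kind of multi-stage Dockerfile that article describes looks roughly like this (the Elixir version, base images, and app name are illustrative, not the generator’s exact output):

```dockerfile
# Build stage — version tags are illustrative
FROM elixir:1.14-alpine AS build
WORKDIR /app
RUN mix local.hex --force && mix local.rebar --force
ENV MIX_ENV=prod
COPY mix.exs mix.lock ./
RUN mix deps.get --only prod
COPY . .
RUN mix release

# Runtime stage — a small image that only carries the release
FROM alpine:3.17
RUN apk add --no-cache libstdc++ ncurses-libs openssl
WORKDIR /app
COPY --from=build /app/_build/prod/rel/my_app ./
CMD ["bin/my_app", "start"]
```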

Slipway will gain several other generators over time. The intent is to create a tool that eases the use of Elixir in Kubernetes.

Elixir Dependencies

I was going to write a dependency checker for Elixir using :digraph to model the dependencies.

However, while trying to work out how to read the deps from a mix.exs file (without parsing it myself), I found an existing command.

mix deps.tree

This solved the problem that I had, namely identifying why a given dependency is used.

This has a useful switch which, combined with a Graphviz utility, renders the tree as an image:

mix deps.tree --format dot && dot -Tpng deps_tree.dot -o deps_tree.png

I updated the Phoenix generator and built a simple Phoenix application, naturally called Fawkes.

This is the dependency tree:

Full dependency graph of a phoenix application.

Jason Schemas and Validation

I am currently working on an API. The frontend that feeds the API uses GraphQL to send the requests to the server.
The frontend code does have a certain amount of validation, but since it is impossible to fully secure a web client, some of the validation needs to be repeated on the server. Anything that the client can do over HTTPS can be automated as an API, and the client code used by the user contains all the information needed to do this!

Given that I have been receiving a JSON document, it would be useful to have one central place to describe what is expected to be sent. This is where JSON Schemas come in.

The obvious starting point is https://json-schema.org/

This ends up with a document that looks something like:

{
    "$schema": "http://json-schema.org/schema",
    "title": "My Schema",
    "description": "Some description",
    "required": [
        "list_of_things",
        "name",
        "my_object"
    ],
    "type": "object",
    "properties": {
        "list_of_things": {
            "type": "array",
            "items": {
                "$ref": "#/$defs/thing"
            }
        },
        "name": {
            "type": "string"
        },
        "my_object": {
            "$ref": "#/$defs/my_object"
        },
        "age": {
            "type": "integer"
        }
    },
    "$defs": {
        "thing": {
            "type": "object",
            "required": [
                "lines",
                "postcode"
            ],
            "properties": {
                "lines": {
                    "type": "array",
                    "items": {
                        "type": "string"
                    }
                },
                "postcode": {
                    "type": "string"
                }
            }
        },
        "my_object": {
            "type": "object",
            "required": [],
            "properties": {
                "length": {"type": "integer"}
            }
        }
    }
}

Once you have that, you can use https://hex.pm/packages/ex_json_schema to create a validator for the JSON object.

  schema =
    File.read!("myschema.json")
    |> Jason.decode!()
    |> ExJsonSchema.Schema.resolve()

  ExJsonSchema.Validator.validate(schema, %{"foo" => "bar"})
  # Returns this:
  {:error,
   [
     {"Required properties list_of_things, name, my_object were not present.",
      "#"}
   ]}

The practical use is that you get a description of what is wrong with the document and an idea of where in the document the error is.

In terms of validation this is a good start: you can find missing fields and fields of the wrong type.
I have yet to make it handle smarter validations (such as only one item in the list being allowed to have the main boolean set).
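One way to cover such a rule is a hand-written check that runs after the schema validation passes. A sketch (the module name and the "main" field are my invention, not part of ex_json_schema):

```elixir
defmodule ExtraValidation do
  @moduledoc "Checks that JSON Schema alone cannot express."

  # Exactly one item in the list may have its "main" flag set to true.
  def exactly_one_main(items) when is_list(items) do
    case Enum.count(items, &(&1["main"] == true)) do
      1 -> :ok
      n -> {:error, "expected exactly one main item, found #{n}"}
    end
  end
end

ExtraValidation.exactly_one_main([%{"main" => true}, %{"main" => false}])
# => :ok
```

The schema validator and this check can simply be chained, returning the first error found.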