I am currently working for a medium-sized company that is slowly moving towards using DDD. We have notionally been domain based for the last 18 months.
It's an international company with distinct businesses in three countries. The difficult part is working out what can be made global without breaking the existing offerings.
Each country has different preferences and works at a different scale.
Common infrastructure pieces could be extracted. It makes sense to only integrate with payment providers once.
Each country will have regulatory requirements for reporting that need to remain country specific.
I like Vaughn Vernon's definition of DDD as
Developing ubiquitous language within a bounded context
I like to have a ubiquitous dictionary so we have somewhere to document a Ubiquitous Language.
Apparently I have been a pasta snob since childhood. My parents tell a tale of me objecting to tinned pasta when I first encountered it on holiday (Isle of Wight, early 1970s). At home we normally had dried pasta.
Since then I have been using either dried pasta or shop-bought “fresh” pasta.
A week or so ago I had a team-building trip to Rome where we were taught to make fresh pasta by an instructing chef.
When I got home I purchased a pasta machine. This is a specialised press and slicer. As this is experimental, I bought a cheap one from Lidl.
Pasta Machine
I have some experience with parts of the process as I have been baking bread for the last decade. One of the tricks I have learned is that dough is linearly scalable.
Fresh pasta is made from very simple ingredients:
– Pasta Flour
– Semolina (which apparently is for beginners)
– Water
– Salt
– Egg (optional)
The recipe that came with the machine called for equal amounts of flour and semolina, slightly less water, and a pinch of salt. The chef advised that these are only starting points and that you will learn to adjust them based upon the ingredients.
I would start with 25g of each per person, plus one extra portion to clean the machine (for four people that is 125g of flour, 125g of semolina and a little under 125g of water). You are meant to put a small test sample through the machine and dispose of it to clean the blades.
Mix all the ingredients together and knead for 10 minutes. This should give a soft ball of dough. Wrap it in clingfilm and leave it to prove for half an hour.
Take pieces of the dough (covering the remainder) and put them through the pasta machine about 7 times on the widest setting, folding them in half each time. Then you can reduce the setting to get the thickness of pasta you want. These sheets can then be sliced using the other head of the pasta machine. Mine can make spaghetti or tagliatelle.
Be careful to catch the cut pasta as it can start to stick together. I need to get a pasta drying rack to go with the machine. The pasta needs to dry for a while (the book says an hour, the chef said much less).
One of the fundamental GDPR rights is the right to be forgotten.
I recently received a cold call on my mobile number from a company that I had never given it to. They had purchased it from Cognism.
Now Cognism are reasonable and have decent records of where they have sold it (kind of: they needed to be reminded that the initial dataset was incomplete). Their website has a simple form to remove me from their database.
The problem is now the companies they have shared my details with:
lead-forensics
zymplify
lightrun
revgen
ukfast
These all have GDPR-compliant messages on their sites but don't have a simple form to access them.
Lots to see, and the food is great. I attended a pasta-making training session.
Rome is said to be on seven hills. The hills are not very big; it's more that the city is not as flat as, say, Milan.
Hotel rooms in Rome have kettles – something that the Milan equivalent lacks.
The international airport is a 35-minute train ride from the main Rome station. The arrivals board is more prominent than the departures board. Getting onto the train is more complex than necessary due to some repair work: you need to leave the platform and rejoin it (or walk through the building work).
The travelator at the station moves at less than walking pace!
Playwright e2e tests are significantly easier to work with than Cypress.
While developing tests, the `playwright test --ui` option makes things really easy. You get to see what is happening at each step and can simply copy the final URL to allow experimentation.
The only drawback is that on CI you can’t always see the server logs. This can be alleviated by adding small healthcheck endpoints.
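As a rough sketch (assuming the server exposes a `/healthz` endpoint and `baseURL` is set in `playwright.config.ts`; both names are illustrative, not from a specific project), a pair of tests like this makes CI failures easier to read even without the server logs:

```typescript
// healthcheck.spec.ts - hypothetical example, not taken from a real project.
import { test, expect } from "@playwright/test";

test("server healthcheck responds", async ({ request }) => {
  // Uses Playwright's built-in request fixture; the relative URL relies on
  // baseURL being configured in playwright.config.ts.
  const response = await request.get("/healthz");
  expect(response.ok()).toBeTruthy();
});

test("home page loads", async ({ page }) => {
  await page.goto("/");
  // Any non-empty title is enough to prove the app rendered something.
  await expect(page).toHaveTitle(/./);
});
```

Running these locally with `npx playwright test --ui` gives the step-by-step view described above; on CI, the healthcheck failing first is a strong hint that the problem is the server rather than the test.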
I recently found Storybook to be a great means of simplifying the design of a component. Storybook is a component design tool: you build a catalogue of examples showing how a component can be used.
This is great for ensuring system consistency as it is possible to lay out all the possible options for a component. Given that it can be time-consuming to achieve certain states by using a full application, having all of the possible states laid out in one place is a big time saver.
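As an illustration, here is roughly what that catalogue looks like in Storybook's CSF3 format. The `Button` component and its props are hypothetical, not from the application discussed below:

```typescript
// Button.stories.ts - hypothetical component and props, shown only to
// illustrate laying out all the states of a component in one place.
import type { Meta, StoryObj } from "@storybook/react";
import { Button } from "./Button";

const meta: Meta<typeof Button> = {
  title: "Design System/Button",
  component: Button,
};
export default meta;

type Story = StoryObj<typeof Button>;

// Each named export is one state of the component, visible side by side.
export const Primary: Story = { args: { variant: "primary", children: "Save" } };
export const Disabled: Story = { args: { disabled: true, children: "Save" } };
export const Loading: Story = { args: { loading: true, children: "Saving…" } };
```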
I am working with a React application that uses both Relay and Formik. I am just getting the hang of using Formik in Storybook. Relay is more difficult.
In typical open source fashion, a component exists because someone had a problem to solve. However, when people move on to new projects (or the project is replaced) these eventually become abandoned. The Relay Storybook components are in this state. They worked once, but are now broken by the continuously moving environment. This leaves two choices: try to fix the library or work around the limitation.
Currently I am using the work-around: extracting “pure” components and testing them. For example, I could put all of a dialog box into a component and isolate it from the button that triggers the effect.
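A minimal sketch of that work-around, with hypothetical names (`ConfirmDialog` and its props are illustrative, not the real components): the Relay-aware container keeps the fragment and the mutation, and passes plain props into a pure component that Storybook can render on its own.

```tsx
// ConfirmDialog.tsx - hypothetical "pure" component extracted from a
// Relay-backed dialog so that Storybook never needs a Relay environment.
import * as React from "react";

export interface ConfirmDialogProps {
  title: string;
  body: string;
  onConfirm: () => void;
  onCancel: () => void;
}

// No Relay fragments, no network calls: everything arrives via props,
// so every state can be shown as a story.
export function ConfirmDialog({ title, body, onConfirm, onCancel }: ConfirmDialogProps) {
  return (
    <div role="dialog" aria-label={title}>
      <h2>{title}</h2>
      <p>{body}</p>
      <button onClick={onCancel}>Cancel</button>
      <button onClick={onConfirm}>Confirm</button>
    </div>
  );
}
```

The button that fires the Relay mutation stays in a thin container inside the application; only the pure component goes into Storybook.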
Last weekend was the start of British Summer Time, which is described as daylight saving. This was first implemented during the First World War and is best characterised as a politician's syllogism:
We must do something.
This is something.
We must do this.
The net effect of moving all the clocks forward an hour is to inflict jet lag on an entire country for two weeks. During that time the sun will be rising earlier anyway, so it will be as light when you wake up as if the system had not been imposed.
The argument that it helps farmers is demonstrably crazy. Pet owners know how confused their animals are by randomly changing meal times. Can you imagine how confused a herd of dairy cows would be!
With the loss of the Heroku free tier, the old solution I used no longer works.
The first problem to solve is to detect PRs in need of merging.
declare -a arr=("name1" "name2" "name3")
for i in "${arr[@]}"; do
  gh pr list -R owner/"$i"
done
The above is a bash script that requires the gh CLI tool to be installed and configured to access your repos.
This will give you a report of the pending PRs to merge. It may need adapting if you have too many (gh pr list only shows the 30 most recent by default).
The next step is to start merging them.
Dependabot text commands are useful here. Commenting `@dependabot merge` on a PR tells Dependabot to merge it once CI passes.
The step beyond that is detecting the number of merged PRs to deploy. You don’t want a huge deploy in case it needs to be reverted.
You will never be clear of the upgrade treadmill. The best solution is to fully automate it.
To do that you need several things:
– a fast, reliable deploy/rollback process
– a sufficient test suite
The best option is to automate the merging of Dependabot PRs that pass all the tests. Beware of the false positives that other integrations (such as Snyk) can give.
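A sketch of what that automation could look like, driven from the same gh CLI as above (the repository name is a placeholder, and the `app/dependabot` author filter and check handling are assumptions about your setup rather than a definitive recipe):

```typescript
// merge-dependabot.ts - hypothetical helper: comment "@dependabot merge" on
// every open Dependabot PR whose checks are currently green.
import { execSync } from "node:child_process";

const repo = "owner/name1"; // placeholder repository

interface Pr {
  number: number;
  title: string;
}

// List open PRs raised by Dependabot as JSON via the gh CLI.
const prs: Pr[] = JSON.parse(
  execSync(
    `gh pr list -R ${repo} --author "app/dependabot" --json number,title`,
    { encoding: "utf8" }
  )
);

for (const pr of prs) {
  try {
    // gh pr checks exits non-zero when checks are failing or still pending.
    execSync(`gh pr checks ${pr.number} -R ${repo}`, { stdio: "ignore" });
    // Checks look green, so ask Dependabot to merge.
    execSync(`gh pr comment ${pr.number} -R ${repo} --body "@dependabot merge"`);
    console.log(`Requested merge for #${pr.number}: ${pr.title}`);
  } catch {
    console.log(`Skipping #${pr.number}: checks not green yet`);
  }
}
```

This leans on the test suite and deploy process from the list above; without those, automatic merging is a liability.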
You will also need an automated deploy process. Deploying the latest build every day at a fixed time would help with this (it also ensures that you could at least deploy yesterday's build).
It is possible to rate-limit Dependabot to only 10 open PRs at a time. This could help, but could be problematic if you are in a fast-moving environment like JavaScript.