Thursday, 31 December 2020

#rEUnion Manifesto

Now that we've left the EU, I thought I'd write a blog post on what I think the priorities should be for re-joining the EU at some point in the future. I want to start with an assessment of how we got here, followed by a step-by-step manifesto for how we get back. This will take a long time, perhaps a decade or two (it took Euroskeptics nearly five decades to engineer our exit), so we need to plan strategically for that kind of timescale.

Why We Are Here

The easiest explanation for why the UK has split from the EU is that we're following a parody of the 1930s. I covered this in a much earlier blog post from 2008, on the subject of the global crash, where I wrote:

"The 1929 Crash was a culmination of 5 years of massive stock market growth which was ultimately boosted by heavy speculative investment. The market initially recovered over the next several months of 1930, but this was not enough to prevent the subsequent Great Depression and corresponding global recessions in Britain and more importantly in Germany (where the economic (and social) instability led directly to the rise in power of extreme political parties and subsequently the Nazi dictatorship and World War II)."

"So, when we come to look at the Crunch we actually see the hallmarks of previous crashes all over again. We see deregulated markets leading to a financial boom and subsequent serious bust."
"What we can predict is that this is only the start of the problem"

You can read the rest in the blog post itself. My concern at the time - although I didn't state it directly, because I didn't think that governments would actually do it - was that we would follow the same path as the 1930s, even though we know where that path led.

However, I was wrong. In reality we followed pretty much exactly the same path, with the exception of the US for the first eight years and the UK for the first two, which implemented a half-baked Keynesian solution that refinanced banks at the expense of refinancing people. The EU followed the path of austerity.

The Five Steps Of Failure

The 1930s followed a basic pattern which we've been roughly copying. This can be summarised as:

  1. A global crash, which led nations to...
  2. Implement austerity in order to 'manage' their finances. This is a classic right-wing economic approach which treats national economies like domestic economies. It cannot be emphasised enough that it doesn't work; I have a blog entry from 2015 which explains why they are not the same. Nevertheless, the EU imposed austerity (or SAPs) from 2009 and the UK imposed it on itself from 2010. In reality, austerity causes...
  3. People to react by becoming more politically extreme, in particular by shifting towards right-wing nationalism. I think the reasons for this are pretty simple. When people are faced with austerity, they spend far more of their time looking after their immediate needs, so their cognitive horizons shrink. In essence, austerity prevents people from looking at the wider problems, and so people are led towards more populist political thinking: simple immediate solutions rather than complex wider solutions. Right-wing nationalism has the edge, because the shrinking of horizons forces a more nationalistic viewpoint: what happens outside those horizons is threatening. This leads to...
  4. The destabilisation of Europe. In the austerity Europe of the 1920s (post Versailles) and 1930s (post Wall Street Crash), both phases led to destabilisation by Fascist governments. Italy broke away in the 1920s under Mussolini, Spain entered a (partly Nazi-supported) civil war in the mid 1930s, and of course Germany went Nazi in the very early 1930s. The same patterns of political destabilisation appeared across Europe and the US, ranging from Oswald Mosley's Blackshirts in Britain to the America First movement in the US. Europe was destabilised.
  5. Ultimately, because nationalist governments have a relatively local focus, they are led into conflict, either internally or with other countries. So the whole process ended in war.

Now, it's understandable that the EU, led largely by Germany (which I think essentially has control of the Eurogroup), would choose austerity rather than Keynesianism. The reason is that, from the German perspective, the reaction to the conditions of the Treaty of Versailles in the early 1920s was to print money. It was this printing of money that led to hyper-inflation and the emergence of nationalist and subsequently fascist groups, most obviously the National Socialists. Thus Germany, and in particular its Ordoliberal school of economics, has a deep aversion to anything other than austerity: they believe in keeping a tight control over money. But this is, at least in circumstances like these, the wrong lesson.

With the final step of Brexit under a hard right, nationalist government we are fully into Stage 4. It's taken a whole decade of austerity to get there (cf 1919 to 1929) and there's been a lot of resistance, but it's safe to say we're at the beginning of step 4 now.

The important thing to note in all of these, even above the individual steps is that the further we progress with them, the harder it is to turn around and recover.

So, in the 1930s, the US recovered first because it didn't go very far down step 2. It pursued poor policies during the Hoover period (Dust Bowl, great depression), but then FDR was elected and he instituted the New Deal, which dug the US out of its mess, thus avoiding steps 3 to 4 (though there were plenty of elements of 3 in the US of the 1930s).

The UK managed to take some steps in the late 1930s towards economic recovery, but was still partially in the grip of appeasement (i.e. Nazi Sympathy) at the outbreak of World War 2.

The Manifesto

1. A Common Understanding Of The Root Problem

Without being able to agree on the five steps above, we cannot agree on the root cause of the problem, and at the moment the pro-EU movement has no such agreement. The consequence is that because we don't recognise austerity as the primary cause that led towards World War 2 (alongside other specific, concrete political events), we have no insight into the underlying forces that drove Brexit and no map for where we're heading. For us, Brexit came out of nowhere - just individual Euroskeptics who forced a decision on the Conservative party. It was something we couldn't have predicted. Nor do we know what's coming next, because we only see the problem in terms of the issues Brexit presents us as a country. In other words, we have a nationalistic view of Brexit, where the EU plays the part of the good guys and we're constantly reacting to the situation.

And our myopic view of the EU as "the good guys" is also what prevents us from criticising the EU (or rather, in this case, the Eurogroup) where it is part of the problem: because the Eurogroup is pro-austerity, British Europhiles are pro-austerity. This position has to be rejected. By rejecting austerity as a reasonable reaction to the crash of 2008, Centrists will be able to work together with left-wing Europhiles (though it'll be harder to work with right-wing Europhiles).

But to reiterate: without a common understanding of the root problem, we have no control over Brexit.

2. A Common Narrative

The Remain campaign and Remain movement, to this day, five years later, are obsessed with economic technicalities as a foundation for EU membership. This is a mistake.

The primary reason why Brexiters won was because they have a narrative about Britain as the plucky buccaneers that can do anything when not hindered by the continent. The EU plays the part of the oppressive King or evil dictator and Brexit is about being free of that. All Brexiter arguments are driven by this sense of identity, even though it's inaccurate.

For us to regain the initiative we have to have a narrative about Britain as belonging with the EU. We need a narrative that says that our natural place is alongside the rest of the continent: helping to make its decisions; supporting it at every step; sharing its culture, its history, its people, its languages and its purpose of diversity, responsibility and liberation.

Note how 'belonging with' is emphasised. The 'with' is important because we need to convey a peer relationship, not a subservient relationship.

3. Addressing Media Responsibility and Accountability

This is a short point. 80% of the print media during the referendum was owned by tax-dodging pro-Brexit billionaires. Unless this changes we'll almost certainly lose again. We need a law comparable to media laws in some of the rest of Europe whereby, firstly, ownership is based on a trust which provides a remit for the outlet's general political flavour (it's OK to be right-wing or left-wing; it's not OK to be merely a mouthpiece for the proprietor), and the paper itself is owned by its journalists and readers. This applies to whatever form of media is relevant in the future.

Secondly, there needs to be some level of accountability so that the media cannot play fast and loose with the facts the way it did during the referendum. For example, media could be forced to display corrections with the same prominence as the original erroneous articles.

4. Guerrilla Ops (Picking Battles We Can Win)

Ultimately we need to get back into the EU, and we should do it the same way the Euroskeptics got us out: by fighting a series of minor battles. Most of theirs were fabrications or merely symbolic, and most of them they lost.

But to keep up morale we should pick fights we can win. I would suggest that the first fight we pick is over metrication. Arch-Brexiters want us to return to Imperial measurements - JRM, for example, has mandated that imperial units be used in all his correspondence - and I would expect that, at the earliest opportunity, they will try to revert to Imperial measurements for general use.

We should stop this and push back, to get everyone using metric in ordinary day-to-day activities and communications as well as all formal information. Dump Imperial at all levels.

We can win this one too. That's because:
  1. We've gotten used to talking in terms of metres over the past year in a way we never did before.
  2. Educational establishments in the UK will back us up: they won't want, particularly in the sciences, to backtrack on 50 years of progress.
  3. Industry will back us, because Imperial units have a direct financial cost. 
  4. The NHS could back us, by switching to e.g. only giving out metric weights for children.

5. Building a Shadow EU in the UK

We should start preparing for a future with the EU, and the way to do that is to build business and cultural resilience. Even though the Brexit deal is thin, it provides for British companies to adhere to EU standards. By building networks of British companies that operate on that basis, these companies will be incentivised to exclude business that breaks those rules: they gain financially from EU commerce, while companies that don't adhere will find it harder to compete, despite the UK government's attempts to tip the level playing field.

My suggestion for a name: BEBA: the British European Business Alliance.

BECA would be the cultural counterpart. The New European could provide the basis for educational material, holidays, cultural exchanges and language tuition and, importantly, instil more of a sense of European identity in the young until the point when we have #rEUnion.

6. Proportional Representation

Remainers failed in the 2019 General Election for a whole host of reasons, but the simplest is that we failed to collaborate in our opposition to the Conservative and Brexit parties, and they went on to win the election on just 43% of the vote.

The next election will be easier for the Conservatives than the last one, because Scotland may have left the UK by then and constitutional boundary changes will lead to a net loss of over 20 current Labour seats.

It is therefore all the more imperative for Labour to collaborate with other opposition parties in the 2024 (or 2025) election, in an environment friendlier to Conservative electioneering, since the Electoral Commission's powers will have been curtailed by the Conservatives in the intervening period.

The only way I can see for Labour to gain the confidence of other parties is to promise proportional representation if they win - and the form of PR must be specified, so that Labour can't pull the same trick the Conservatives pulled after the 2010 election, when a voting-reform referendum was held but AV was the only option. Future edits to this post will include more references and possibly diagrams!

7. A Robust Mechanism For A New Referendum

With steps 1 to 6 in place, we would finally be in a suitable position for a fair referendum, along the lines of the one held in 1975, which was based on facts rather than propaganda. Media balance would be better, representation of the people would be more equitable; relationships with the EU would be coherent and ready for re-admittance; cultural affinity for the EU and Europe would be higher; we'd have a UK narrative that would fit into EU membership and rEUnion groups would be more easily able to work together.

8. A Formal British Constitution

Finally, and within the EU, a reformed UK constitution could be defined to make it harder, much harder for the UK to be subverted in the way it was up through the Brexit referendum. Part of the reason why we belong with the EU is because of the checks and balances it provides, but the same applies domestically. There's never a substitute for active participation in politics, but the mechanisms within the state should facilitate both representation and accountability in such a way as to protect both security and prosperity for the people.

Conclusion

We've lost every battle since 2008, primarily because we lack insight into the wider picture and a model for what to expect. We can't get to #rEUnion by carrying on as we are, with the same arguments. Instead we need a common framework for why we're here; a common EU-centric narrative for the UK that embodies our belonging with it; and finally a strategy that addresses all the institutional failings that prevented us from being able to mount a robust defence of our existing constitution.

The end result should put the UK on a much firmer foundation for the good of all within and without the UK, for the rest of the 21st century.


Tuesday, 14 July 2020

Toggle Booting a PIC MCU!

Before the 1970s people had to boot computers by laboriously flipping a set of toggle switches on the front of the main processor. Today, all computers have built-in boot ROMs and even embedded microcontrollers are programmed using powerful external computers.

I only consider a processor to be a computer if it can support self-hosted programming, so I wondered what it would take to manually toggle in a program on an MCU with no computer and minimal logic. I've produced a video based on this blog post here.



I chose a primitive 8-bit PIC, because it has a simple programming algorithm, but even so, I have to enter 56 bits per instruction flashed. It was so tedious I printed out the entire sequence and used a ruler to make sure I didn't lose my position. Here's the 11-instruction program itself:

 0 Bsf Status,5 ;01 0110 1000 0011
 1 Bcf Trisc,4  ;01 0010 0000 0111
 2 Bcf Status,5 ;01 0010 1000 0011
 3 Movlw 53     ;11 0000 0011 0101
 4 Movwf T1con  ;00 0000 1001 0000 65536 cycles at 8us/cycle=
 5 Btfss Pir1,0 ;01 1100 0000 1100 3.8s per full blink.
 6 Goto 5       ;10 1000 0000 0101
 7 Movlw 16     ;11 0000 0001 0000
 8 Xorwf PortC,f;00 0110 1000 0111 (Don't care about RMW issues)
 9 Bcf Pir1,0   ;01 0000 0000 1100
10 Goto 5       ;10 1000 0000 0101

You don't need a GOTO start at address 0: if you don't have any interrupt code you can just start your program at address 0.

The reason why it's so tedious is that programming each instruction involves four command sequences:

  1. Load the instruction into (I presume) an internal buffer.
  2. Burn the instruction into flash memory.
  3. Verify the flash memory instruction you've just programmed (always worthwhile as it's so easy to make a mistake).
  4. Increment the address of the PC. Unlike many MCU flash programming algorithms, the 8-bit PICs can only have their programming address reset (by unpowering the MCU, connecting the PIC's /MCLR signal to 12V, then powering the rest of the MCU at 5V) and then incremented. Thus making a mistake (most likely because you've left out a bit or duplicated one) means you have to start again.

In addition, each programming command expects all the data to be entered from the LSB to the MSB, so in fact I toggled in all the commands backwards, reading each line from right to left. So, the first full command really looked as follows (with '|' separating individual fields):

Bsf Status,5 |0|01 0110 1000 0011|0|000010 Bit 5 is RP0, status is 3
  (Prog)     |                     |001000 Bsf is 01 01bb bfff ffff.
  (ReadBack) |0|-- ---- ---- ----|0|000100
  (Inc)      |                     |000110
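
The 56 bits per instruction quoted earlier can be generated mechanically. Here's a minimal Python sketch: the 6-bit command opcodes and the start/stop-zero framing are taken from the toggle chart above, while the helper names are my own.

```python
def lsb_first(value, nbits):
    # The PIC expects every field least-significant bit first.
    return [(value >> i) & 1 for i in range(nbits)]

# 6-bit programming commands, from the toggle chart above.
LOAD_DATA, BEGIN_PROG, READ_DATA, INC_ADDR = 0b000010, 0b001000, 0b000100, 0b000110

def toggle_sequence(instruction):
    # Load: 6 command bits, then the 14-bit word framed by start/stop zeros.
    load = lsb_first(LOAD_DATA, 6) + [0] + lsb_first(instruction, 14) + [0]
    burn = lsb_first(BEGIN_PROG, 6)
    # Read-back clocks out a framed 16-bit word (data bits driven by the PIC).
    read = lsb_first(READ_DATA, 6) + [None] * 16
    inc = lsb_first(INC_ADDR, 6)
    return load, burn, read, inc

# Bsf Status,5 is 01 0110 1000 0011:
parts = toggle_sequence(0b01011010000011)
print(sum(len(p) for p in parts))  # 22 + 6 + 22 + 6 = 56 bits
```

The totals line up with the 56 bits per flashed instruction mentioned above: 22 for the load, 6 to burn, 22 to verify and 6 to increment.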

The PIC needs just two signals to program it: a clock input and data signal. My hardware is (relatively speaking) super simple.

I use an astoundingly primitive 555 timer chip in its monostable (one shot) mode to debounce the clock. All I needed was a couple of capacitors and resistors to give a 0.2s delay and it would eliminate bounce. All the information I needed came from the wikipedia page on the 555 timer.
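
As a rough check, a 555 monostable's pulse width is t = 1.1*R*C. The component values below are illustrative guesses rather than the exact ones I used:

```python
# Monostable pulse width of a 555: t = 1.1 * R * C.
R = 180e3  # ohms (illustrative value)
C = 1e-6   # farads (illustrative value)
t = 1.1 * R * C  # ~0.198 seconds, close to the 0.2s debounce delay
```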

The data button was a bit more challenging. I used a few resistors and a red LED as I needed Vdd when I pressed the button, less than 0.8V when I let go, but I also needed to enable the DAT line to drive the LED when verifying data and not exceed current limits even if I accidentally pressed the button when the DAT was trying to output. One of the downsides to this approach is that the LED is barely visible on the video, though in retrospect I think I could have halved the resistor values and it would have been fine.

Finally, I needed a 12V programming voltage for the PIC, and then used a basic 7805 voltage regulator to generate the 5V Vdd voltage for the 555 and PIC. The currents are so small I didn't need to worry about a heatsink for the 7805.

On a PIC it's not good enough just to program the flash: I also needed to change the internal configuration to stop the watchdog from resetting the PIC after 4ms and to use the internal oscillator instead of an external RC oscillator. The spec on programming the configuration is rather confusing, because the sequence to go into config mode requires the full 14-bit configuration setting, and then you have to enter it again as a normal programming instruction.

With some experimentation I got it right in the end! In a bit more detail: I started off by finding out how to enter one instruction (after running the erase command), and then two instructions. I made two attempts to program the entire program - in the first attempt I made a mistake on the last-but-one instruction and had to start again. I videoed both of them, so although the toggling sequence is complete and not spliced from multiple attempts, it wasn't the first attempt.

I had a similar problem with the configuration. It took a few attempts at that to get it right and at one point I thought I had configured it to use an external oscillator and couldn't read the flash anymore. In fact I had mis-inserted the VPP to the right of the 7805's IN pin.

So, it's possible, but not very practical, to manually toggle a program into an MCU - but perhaps survivalist geeks might find it useful in some post-apocalyptic dystopia!

Saturday, 14 March 2020

COVID-19 Herd Immunity Sim

Why Herd Immunity Doesn't work for Covid-19

This is a simulation of an epidemic, for the sake of working out how much herd immunity is needed to protect the population.
The answer is around 90+% (98% in this sim). Herd immunity, as a strategy for Covid-19, is mistaken.

Explanation

Herd immunity is designed for vaccines.
In normal epidemics, herd immunity works because the contagion distance between someone who contracts the disease and the next person who isn't immune is large, thanks to the number of people already immune (depending on the % immunity and the infection rate). But here no one has immunity, so the disease distributes itself throughout the population, and the contagion distance remains small even after nearly everyone has become immune. So in this case herd immunity is the wrong strategy, despite what you may have heard from our PM, who says he's following the advice of the chief scientist.

How To Use

The easiest way is simply to press the StartSim button. It simulates a population of 160,000 where each person potentially infects 0.8 people in their vicinity each day until they are isolated (in the example, on day 5). The default vicinity is 256 (+/-16 in each direction), and if by chance an infected person tries to infect an already-infected person, it has no effect.

You can see in realtime how an infection spreads across the population. Initially it appears to be exponential, but fairly quickly isolation cuts in and people who have it tend to come into contact with those who already have it, so they don't infect so many new people.
So, the epidemic then spreads at the boundary.
As it does so, a few people remain uninfected by chance, and as the disease progresses at the boundary, the mass of people now immune effectively protects the very few who have never been infected. But this is a very low figure, because the whole population has no prior immunity and is not vaccinated.

You can play around with the infection rate (<=1) and mobility to simulate a ghastly pandemic that seems to spring from everywhere all at once, but ultimately it has little effect on the herd immunity as it effectively doesn't exist.

Remember, this is only a simulation to demonstrate the lack of herd immunity. It does not really project actual figures for infection, nor are the rates correct, nor the population, nor the mobility, nor the potential impact of warmer weather on the virus.
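
For anyone wanting to reproduce the behaviour without the canvas version, here's a minimal Python sketch of the same idea. The parameters match the description above (0.8 contagion/day, isolation on day 5, +/-16 vicinity); the grid mechanics are my own simplification:

```python
import random

def simulate(width=400, height=400, contagion=0.8, isolate_day=5,
             reach=16, days=120, seed=1):
    # grid[y][x] is 0 if never infected, else the day of infection.
    random.seed(seed)
    grid = [[0] * width for _ in range(height)]
    grid[height // 2][width // 2] = 1  # patient zero, infected on day 1
    for day in range(2, days + 1):
        newly = []
        for y in range(height):
            for x in range(width):
                t = grid[y][x]
                # Infectious until isolated; one infection attempt per day.
                if t and day - t < isolate_day and random.random() < contagion:
                    ny = min(max(y + random.randint(-reach, reach), 0), height - 1)
                    nx = min(max(x + random.randint(-reach, reach), 0), width - 1)
                    if grid[ny][nx] == 0:  # already-infected targets: no effect
                        newly.append((ny, nx))
        for y, x in newly:
            grid[y][x] = day
    never = sum(row.count(0) for row in grid)
    return never / (width * height)  # fraction never infected
```

On the full 400x400 grid (the post's 160,000 people) this is slow in pure Python, but a smaller grid shows the same qualitative result: almost everyone ends up infected, leaving only a tiny 'herd-protected' remnant.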

Simulation

[Interactive simulation: controls for Pop, Contagion/day, Isolate on day and Mobility, rendered on an HTML5 canvas in the original post.]

Monday, 30 December 2019

Monopoly Sim



Our family plays Monopoly every Boxing Day and the winner gets to keep our Monopoly trophy for a year, with their name inscribed on a new lolly stick. I've never won; I should do something about that!

I wanted to simulate the game of Monopoly to find out which locations are most likely to be landed on. The distribution won't be even, because Go To Jail takes you to Jail (or Just Visiting once you pay or throw a double to get out) and some of the cards take you to different locations on the board (Go, Pall Mall, Marylebone Station, Trafalgar Square or Mayfair).

Naturally, it's not good enough just to simulate the game; the question is whether it's possible to do it on a 1K ZX81. We can answer that with some analysis of what really counts for the simulation.

Firstly, obviously, to find out which locations are most popular, we don't need to draw the board; simply listing the popularity of each location as a series of numbers will do. On a ZX81 we need to count the memory used by the screen for this. Assuming each location could have a popularity of up to 255, we need 4 characters per location: 160 bytes in total.


Secondly, we don't need to store the names of the properties, nor their values or rents, since if we know the popularities we can calculate the income we'd get from them and from the card values.

Thirdly, we don't need to maintain any money, because we only care about popularity; this means we don't need to simulate the bank or money for passing GO and we don't need to simulate chance and community chest cards that involve money. So, the only cards that matter are those that take you to a new location.

Fourthly, we can note that there are only two Community Chest cards that don't involve money - a 'go to GO' and a 'go to Jail' - which are also among the Chance cards, so we can re-use two of the Chance cards to simulate Community Chest.

Fifthly, we can assume that a player will get out of jail immediately, by paying or by having a card - it makes little difference to the distributions (though we could simulate runs where the player prefers to get out of jail by throwing a double; this means they'd only move on a double).

This means the simulation is pretty simple:

  • We need an array of 40 locations to hold popularities.
  • We need to simulate two dice for moving pieces: the sum of two random numbers from 1 to 6.
  • We need to redirect the locations if we land on one of the 3 Chance or 3 Community Chest locations or Goto Jail.
  • The redirect locations are: 0 (GO), 10 (Jail), 11 (Pall Mall), 15 (Marylebone Station), 24 (Trafalgar Square), 39 (Mayfair) or back 3 spaces, but Community Chest is limited to the first two.
  • We could calculate going to jail if the last 3 rolls are matching pairs; this would mean an extra jail visit every 216 throws on average. Getting out of jail by throwing a matching pair wouldn't affect the results, because the throw only puts the player into the 'just visiting' location and thus doesn't affect where they land after leaving jail.
  • Periodically we should display the current popularities.
With all these rules, apart from the three-successive-pairs jail rule, and using the normal ZX81 BASIC space-saving tricks, we can squeeze the code into 1Kb.
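
For readers without a ZX81 to hand, the same simulation can be sketched in Python. The card destinations follow the rules above; the 16-card deck size and uniform card draw are my assumptions:

```python
import random

CHANCE, CHEST = {7, 22, 36}, {2, 17, 33}  # board positions of the card squares
GO_TO_JAIL, JAIL = 30, 10
# Movement cards from the rules above: GO (0), Jail (10), Pall Mall (11),
# Marylebone Station (15), Trafalgar Square (24), Mayfair (39), back 3 (-3).
CHANCE_MOVES = [0, 10, 11, 15, 24, 39, -3]
CHEST_MOVES = [0, 10]   # Community Chest is limited to the first two
DECK_SIZE = 16          # assumed deck size

def simulate(throws=4000, seed=1):
    random.seed(seed)
    pop = [0] * 40
    pos = 0
    for _ in range(throws):
        pos = (pos + random.randint(1, 6) + random.randint(1, 6)) % 40
        if pos == GO_TO_JAIL:
            pos = JAIL
        elif pos in CHANCE and random.randrange(DECK_SIZE) < len(CHANCE_MOVES):
            card = random.choice(CHANCE_MOVES)  # a movement card was drawn
            pos = (pos - 3) % 40 if card == -3 else card
        elif pos in CHEST and random.randrange(DECK_SIZE) < len(CHEST_MOVES):
            pos = random.choice(CHEST_MOVES)
        pop[pos] += 1
    return pop
```

As on the ZX81, Jail (position 10) comes out well above the average popularity, and the Go To Jail square itself is never recorded as a resting place.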


Line numbers take up 21*5 = 105b and the main code another 305b. M$ takes about 45 bytes, and the variables about 6*4 = 24 bytes. The screen takes 160 bytes, making a total of 639 bytes, which is within the limit of a 1K ZX81.

Results

It's just a list of numbers representing the popularity of every location from GO to Mayfair:

The ZX81 is really slow when run in SLOW mode, taking 0.6 seconds per go, or about 84 seconds per round including the printing. About 4,000 throws were taken, which took nearly an hour. We could speed this up by running in FAST mode, removing line 25 and replacing line 130 with GOTO 30; then it would take 0.15s per throw, so 4,000 throws would take 10 minutes.

What the results mean:

Site Pop Site Pop
Old Kent 32 Whitechapel 41
King's Cross 29 Angel 40
Euston 43 Pentonville 36
Pall Mall 46 Electric 41
Whitehall 35 Northumberland 40
Marylebone 40 Bow St 47
Marlborough 51 Vine St 44
Strand 47 Fleet St 52
Trafalgar 58 Fenchurch 43
Leicester 44 Coventry 40
Water 47 Piccadilly 49
Regent 46 Oxford 39
Bond 32 Liverpool 45
Park Lane 29 Mayfair 51

In this run there were 4173 throws, so there would have been about 19 extra visits to jail due to three successive pairs (boosting the squares from Electric to Strand by about 7%).

We can see that the four most popular places are Trafalgar, Fleet St, Marlborough and Mayfair and at the other end of the scale, Park Lane, King's cross, Old Kent Road, and Bond Street are significantly less likely. It's understandable that Marlborough would be likely, but I would have thought Bow Street would have been equally likely (both 5/36 probability), but Trafalgar was unexpected - except that it's an actual card destination. We know that Chance was hit (47+70+54 = 171) times without the player being redirected (11/16), so in total it was hit 248 times and therefore 15 of these involved jumps to Trafalgar (lowering the hit rate to 52). A similar number of hits redirected to Mayfair (51-15 = 45 still leaving it more popular than Park Lane).

The sample size is about 4173 across 40 locations, about 104 hits per location, so the variance will be fairly significant. Therefore I haven't deduced too many details from the results.

Conclusions

The main conclusion is this: it's perfectly possible to answer an interesting question by simulation on an old computer with minimal memory. Even on a 1Kb ZX81 there was spare space. It's worth considering what the minimal computing resources to solve the same problem would be. It's certainly under 1Kb, so a basic Jupiter Ace or a 1Kb MSP430 should easily manage it, along with a PIC16F54 or maybe even an ancient 1802-based ELF computer with the standard 256b of RAM.

Saturday, 3 August 2019

Danger! Danger! Devaluation!

Right now, Brexiters keep claiming that devaluation is good for the economy. They're almost certainly wrong.

Intuitively, a fall in the pound is a bad thing: anything we buy from overseas goes up in price and since a high proportion of what we buy comes from overseas, our cost of living will go up.

But then Brexiters jump in and say "Devaluation is great, because it means more exports and that will make the economy grow." As a general rule, because Brexiters often distort the truth, you should be skeptical. So, are they wrong here too?

It's a good question. Recently I was at my Dad's house and we were watching a TV show called "Factory Wars" on the Yesterday channel. I was surprised to find a bloke from The Institute of Economic Affairs. Why was an economist from a lobbying organisation with secretive funding (hint, Big tobacco and the climate denying US groups like the Heartland Institute) on a history show?

Basically he was using it to push free-market ideology. He claimed that the UK escaped the recession in the 1930s because it implemented austerity, unlike the US which implemented a Keynesian New Deal. This was completely different from how I'd understood these economies in the 1930s: austerity empirically doesn't work (and it's easy to explain why), and it was the Keynesian New Deal that empowered the US to manufacture the armaments that helped the Allies overcome the Nazis.

It's no coincidence that the IEA offered a completely different explanation from the one normally understood: they're based at 55 Tufton Street, where Vote Leave was also based. So again we have another ideologue whose views we should suspect.

So, I checked out what really happened to the UK's economy in the 1930s (Economics Help is really useful). Basically, the crash of 1929 caused a lot of hardship in the UK, but in 1931 we dropped the gold standard, which led to a devaluation of the pound, and this led to a mild economic recovery in the south - though the North still had it tough.

So, it looks like the Brexiters might be a bit right, and the IEA were misleading as usual. But are they?

Devaluation Model

Actually we can work this out ourselves, because it's easy to model. In our model (for a business, but it can apply at a larger scale) we consider just three variables:

  1. Profits
  2. Internal costs (labour, maintenance of equipment, etc., excluding rises in the cost of living due to buying overseas goods). We assume this is constant.
  3. Imports (including the increases in the cost of living for employees from overseas goods).

Case A

In the first scenario, we consider a company with reasonably high import costs (50%), a profit margin of 30% and the rest internal costs (20%). If the pound devalues by 25%, imports go up to 50*1.25 = 62.5% of the original sale price, so profit falls to 30 - 12.5 = 17.5%. This means the company needs to sell 30/17.5 = 71% more goods to make the same absolute profit, but the domestic market will buy less, and the overseas market will only find its products 25% cheaper (due to the devaluation).

So, the question is: are overseas customers likely to buy 71% more if the product is 25% cheaper? This seems unlikely to me.

Case B

Let's consider the second case: profits are 70%, internal costs 20% and imports 10%. Imports increase by 25%, to 12.5%, so we now need to sell 70/67.5 = 1.037 times as much, i.e. 3.7% more. In this case our product is 25% cheaper, and we only need to sell 3.7% more to make the same profit - that seems plausible.
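
Both cases can be checked with a tiny sketch of the model (the function name and structure are mine):

```python
def extra_sales_needed(profit, internal, imports, devaluation=0.25):
    # Each argument is a share of the sale price before devaluation.
    assert abs(profit + internal + imports - 1.0) < 1e-9
    squeezed_profit = profit - imports * devaluation  # dearer imports eat profit
    # Volume increase needed to restore the original absolute profit.
    return profit / squeezed_profit - 1

print(round(extra_sales_needed(0.30, 0.20, 0.50) * 100))     # Case A: 71
print(round(extra_sales_needed(0.70, 0.20, 0.10) * 100, 1))  # Case B: 3.7
```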

So, we have to ask ourselves, what kinds of businesses are like this? Mass market manufacturing (cars, aircraft, smoke alarms) will rely a lot on imports and will make a relatively small profit. Apple, for example has margins between 20% to 30%. Dyson is also about 21%. DELL has an operating profit of just 1.1%. Companies producing consumer items such as smoke alarms have fairly low margins which explains why they would move to cheaper EU countries.

What companies are like case B? Services and financial companies. However, financial companies are likely to move out of the UK due to Brexit, so in practice that leaves service companies to benefit.

However, to some people, UK manufacturing companies can look good: if you hold a lot of offshore wealth, then UK manufacturers struggling under devaluation look like a good buy. In other words, devaluation suits disaster capitalists with offshore accounts - people like Jacob Rees-Mogg and a large proportion of Conservative party members and MPs.

Epilogue

Brexiters are wrong: devaluation is just a front for their disaster capitalist mission. We can summarise our model and convey why in a simple table:

  Case  Business type   Imports  Margin  Extra sales needed  Plausible?
  A     Manufacturing   50%      30%     71%                 No
  B     Services        10%      70%     3.7%                Yes

So, why did the UK not crash under the double whammy of austerity and devaluation in the 1930s? That's pretty simple: Britain was the biggest power bloc at the time, so although most of its food (91%?) was imported along with a significant proportion of goods, a great deal of this trade was essentially an internal market, with the added advantage that many resources, particularly coal, were domestic.

Saturday, 6 July 2019

Let the EVs Take the Strain

A recent BBC article says EVs won't solve congestion problems. It's yet another negative headline about EVs, following yesterday's negative EV headline where they said EV sales were falling for the first month in whatever (when in fact BEV sales had gone up 67%). They even go to the trouble of showing a picture of a rare EV, an 8-year-old early prototype Smart ED TwoFour, rather than, say, EVs hundreds of times more popular, to get across the idea that EVs are toys. Next week, watch out for the Tesla-bashing article ;-) and no mention of how sales of real EVs in the EU, the US and globally are rocketing.

Similarly, this article uses a bit of truth to hide a bigger lie. In fact EVs will go quite a long way to solving congestion.

Car Use is Falling

Car use is already going down in some parts of the UK: in London, because cars aren't needed much, and elsewhere because insurance for young drivers is prohibitive.
But the nature of EVs will itself radically change our vehicle usage. Because they have so few moving parts, and because their batteries last much longer than originally expected (and will get several times better), we're going to have to drive them much more often to get sufficient wear and tear out of them.

ICE Drives Congestion

The problems we see with urban vehicles are really problems with ICEs themselves. For example, you can't have a filling station at everyone's house - it's far too dangerous and far too expensive! ICEs force us to space filling stations as widely as can be tolerated, and because of the effort it takes to fill up (compared with plugging in an EV), this in turn forces infrequent filling, large tanks and very long ranges.

But long ranges themselves have the side effect of increasing our journey lengths which impacts everything: distance travelled to shop, to our workplaces, to schools and hospitals and all this increases traffic.

EV Transformations Will Blow Our Minds

EVs will change this radically. We'll have to share cars to get the wear and tear out of them, and because charging will become ubiquitous (think every forecourt where your car might hang around), we'll need cars with much shorter ranges on average than even the first generation of EVs: think 10kWh or even 5kWh batteries for the majority of cars. In turn, two-person EVs will dominate for the vast majority of journeys, and because we can charge easily, we can expect journeys to shorten too.

Remember in this model, people don't own their cars as much.

Why will people choose tiny, 'under-capacity' cars? It's simple: they'll be much cheaper to build, sell and drive! My Zoe (22kWh) gets about 4 to 4.5 miles per kWh at maybe 12p/kWh. Given a gallon of petrol (4.5L) = 4.5*£1.25 = £5.63, I get 5.63/0.12*4.5 = up to about 210mpg-equivalent running costs.

But a Renault Twizy (a two-person EV with a 6.7kWh battery) will get 6 to 8 miles per kWh, equivalent to roughly 280 to 375mpg running costs.
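These mpg-equivalent figures follow directly from the prices assumed above; here's a quick sketch (function name my own, prices as given in the text):

```python
def mpg_equivalent(miles_per_kwh, pence_per_kwh,
                   litre_price=1.25, litres_per_gallon=4.5):
    """Miles travelled on electricity costing the same as one gallon of petrol."""
    gallon_cost = litre_price * litres_per_gallon       # £5.63 as in the text
    kwh_for_gallon_cost = gallon_cost / (pence_per_kwh / 100.0)
    return kwh_for_gallon_cost * miles_per_kwh

print(round(mpg_equivalent(4.5, 12)))  # Zoe upper bound: 211
print(round(mpg_equivalent(6, 12)))    # Twizy lower bound: 281
print(round(mpg_equivalent(8, 12)))    # Twizy upper bound: 375
```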

Given that a typical day's travelling in the UK is only about 10 to 20 miles - about 3kWh - that's only half a Twizy's battery. And considering the sheer number of charging points there will be, the average journey needed between charges will only be 5 to 10 miles, just 1.5kWh.

On that basis, a future EV with a 5kWh battery will seem ample, even though right now all the talk is of 50kWh to 100kWh batteries.

So, EVs will go a long way to reduce congestion in themselves owing to the different driving model.

Friday, 12 April 2019

Plottastic ZX80!

A guide to plotting pixels in Basic on a ZX80!


Introduction

Both the ZX80 and ZX81 support character-only displays built from 8x8 pixel characters, and contain 16 graphic characters that can be used to plot fat 4x4-pixel blocks on a two-by-two grid within each character:

These are called Battenberg graphics after the cake of the same name 😀

On a ZX81 it's easy to use these characters to plot (crude) graphics on the screen: the computer provides PLOT x,y and UNPLOT x,y commands for this purpose. But with half the ROM, the ZX80 makes it much harder - so I wondered, just how much harder is it to plot pixels? It turns out to be pretty formidable!

Challenges


  • The ZX80 doesn't have any PLOT or UNPLOT commands.
  • The screen on a ZX80 is like that on a ZX81: when you run a program, the screen first collapses to a minimum of 25 newline characters and expands as you display text. However, on a ZX80, unlike the ZX81, you can't print anywhere on the screen as there's no PRINT AT command, which means we'll have to POKE onto the screen.
  • The memory map on a ZX81 has the display immediately after the program, but on a ZX80 the display comes after the variables and the editing workspace, which means it'll move around just by creating a new variable or possibly by performing input (a potential problem).
  • Ideally, to convert from pixel coordinates to Battenberg graphics you'd want to map the odd and even x and y coordinates to successive bit positions to index the character.


  • But, unlike the ZX81, the ZX80's character set doesn't have the Battenberg characters in the right order for this mapping. Instead they contain gaps; the last 8 codes are generated from the first 8 but in reverse order; and some of the first 8 are taken from what ought to be the inverse characters!

The Solution

The solution really comes in a few parts. Firstly, the easiest way to be able to map an ideal pixel value to its Battenberg character is to create a table in RAM, by putting them into a REM statement (the ZX80 has no READ and DATA commands so it's hard to put a table of data into an array, but the first REM statement in a program is at a fixed location). However, even this presents a problem, because only 8 of the graphics characters can be typed. The easiest way to handle that is to write a program which converts a different representation of the characters into the correct ones.

So, on the ZX80, first type this:
After you run it, you'll get this:
Even this was tricky; I had to use character codes above 32, since symbols such as +, -, /, ", etc get translated into keyword codes for the Basic interpreter. The above program illustrates an interesting feature of ZX80 Basic that differs from ZX81 and ZX Spectrum Basic in that AND and OR operations are actually bitwise operations rather than short-circuit boolean operations. Thus P AND 15 masks in only the bottom 4 bits.

Once we have generated the symbols we can delete lines 10 to 30.

The next step is to actually write the code to generate pixels and plot them. Once we know how, it's actually a bit simpler. Firstly, we fill out the screen with spaces (or in my case, with '.'):
This gives us 20 full lines of characters. Because it would be difficult to plot a pixel by reading the screen, figuring out the already-plotted pixels and then incorporating the new pixel, I cache a single character in the variable P and its location in the variable L. All my variables are single letters, to save space if you try to run it on a 1K ZX80. The idea is that if the new print location in L changes, we reset P back to 0; otherwise we incorporate the new pixel.

Next we start the loop and calculations for plotting, in this case a parabola. We loop X=0 to 63 and calculate Y on each loop (it's a parabola that starts at the bottom of the screen):

Finally we perform the pixel plotting and loop round.

This involves saving the previous location in K so we can compare it later, then calculating the new location based on X and Y (since each character position is 2 pixels wide and 2 pixels tall, we need to divide X and Y by 2, and since there are 32 visible characters plus an invisible NEWLINE character on every screen line, we must multiply Y/2 by 33). Note: a quirk of ZX80 Basic means that multiplies happen before divides, so Y/2*33 would calculate Y/66!
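Translated out of ZX80 Basic, the location calculation looks like this (a Python sketch with my own function name; integer division stands in for the Basic's divide, and the stride of 33 is the 32 visible columns plus the NEWLINE):

```python
def char_offset(x, y):
    """Offset of the character cell holding fat pixel (x, y) within the
    display file, not counting the leading NEWLINE. Each cell covers a
    2x2 block of fat pixels; each screen row is 32 characters + NEWLINE."""
    return (y // 2) * 33 + (x // 2)

print(char_offset(0, 0))    # 0: top-left cell
print(char_offset(63, 0))   # 31: top-right visible cell
print(char_offset(0, 2))    # 33: first cell of the second character row
```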

The pixel bit calculation in line 50 makes use of the ZX80's bitwise operators. We want to generate 1 or 2 depending on bit 0 of X (1+(X AND 1) does that), and then multiply by the effect of the Y coordinate, which is 1 on the top row of the cell and 4 on the bottom: 1+(Y AND 1)*3 does that. Hence this generates the values 1, 2, 4 or 8 depending on bit 0 of X and Y.
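The same formula in Python, with & playing the role of the ZX80's bitwise AND, confirms the four bit values:

```python
def pixel_bit(x, y):
    """Which of the four Battenberg bits fat pixel (x, y) sets in its cell:
    1 or 2 from the x parity, multiplied by 1 or 4 from the y parity."""
    return (1 + (x & 1)) * (1 + (y & 1) * 3)

# The four parity combinations give the four bit values:
print([pixel_bit(x, y) for y in (0, 1) for x in (0, 1)])  # [1, 2, 4, 8]
```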

We must POKE the location L plus an offset of 1 (because the first character of the display file is a NEWLINE), and we must also add the address of the display file (it turns out that these calculations don't make the display file move around). We POKE it with the character code for the desired pixel pattern, indexed from the REM statement table. Finally we loop round X.

This code generates this graph:
It looks reasonable even though it's generated with integer arithmetic. It's the first pixel plotted graph I've ever seen on a ZX80!

More Examples

My original reason for doing this was to see if I could generate sine waves using a simple method I once found for a Jupiter Ace. Here's the program and the result:
It looks fairly convincing, especially as it's all done with integer arithmetic. Because the code generates both the sine and cosine value, it's easy to turn this into a circle drawing program which produces the following:

It looks a bit odd, that's because the algorithm doesn't quite generate sine and cosine curves. What it's really doing is computing a rotation matrix, but only one of the terms is computed for sine and cosine each time. Hence, the circle looks a bit like an oval with a slight eccentricity.
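This one-term-at-a-time rotation is the well-known Minsky circle algorithm (from HAKMEM); a minimal Python sketch, with eps as an assumed step size, shows the behaviour:

```python
def minsky_circle(x, y, eps=0.1, steps=200):
    """Trace an almost-circle: each step rotates the point, but y is
    updated using the *already updated* x, which is why the result is
    a slightly eccentric ellipse rather than a true circle."""
    points = []
    for _ in range(steps):
        x = x - eps * y
        y = y + eps * x   # note: uses the new x, not the old one
        points.append((x, y))
    return points

# Starting from (1, 0), the points stay close to the unit circle:
pts = minsky_circle(1.0, 0.0)
print(max(abs(px * px + py * py - 1.0) for px, py in pts) < 0.1)  # True
```

The update exactly conserves x² - eps·x·y + y², which is why the orbit never spirals in or out, merely bulges slightly along one diagonal.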

My graph drawing algorithm has one very serious limitation. Because it doesn't read the screen in order to compute a new pixel, if the graph goes over the same part of the screen twice, the second pass will muck up what was there before. Doing the correct calculations is possible using some table lookups and bitwise operations, though it would slow down the graph generation. I didn't bother, because I only wanted to generate simple graphs.

Conclusion

The ZX80 and ZX81 have very similar hardware, but a number of design decisions in the ZX80's firmware made plotting pixels much harder than you might expect. With a lot of effort it is possible to generate some simple graphs 😀