Monday 22 January 2024

Starring The Computer: Holly's Psion MC400 In Die Hard 2

Well, Die Hard 2 fans! This is the moment you’ve been waiting for… What laptop was Holly Gennaro-McClane using on the plane???

It turns out the answer is: a Psion MC400, i.e. a notebook computer from 1989! There aren't many shots of the computer in the movie, and the only purpose they seem to serve is to contrast Holly's pro-technology attitude with John's anti-technology attitude. We'll start with the last shot, where she packs it away, at about 1h16 into the film:


It's quite clearly the same computer: you can see the PSION insignia at the top left, the ribbing running through the centre, and the model name on the metal band along the bottom.

There's more to the Psion MC400 than the outside. At 2:20 you get a fairly decent shot of the keyboard. I enlarged it so you can see it against a stock image.




It's really satisfying to have done this, for three main reasons. The most important is that I worked it out myself and I can't find any other reference to it on the net (I've tried googling "Die Hard 2" and "Psion MC400", and I've checked the excellent website Starring the Computer; it doesn't appear in either).

The next reason is that it was only when my wife and I saw the movie at the beginning of January 2024, on BBC iPlayer, that it occurred to me that I could find out what computer it was. Given that even in the late 80s and early 1990s there were probably a number of laptop-style computers it could be, tracking it down from the few clips we have of it seemed a tantalising but challenging exercise. The weird thing was that when I looked at the computer, I felt I’d seen it before, and it wasn’t long before I guessed it might be a Psion MC series, because of the light keys against a dark casing (PC laptops of the time were either too big to fit on an airplane seat (e.g. the Toshiba T1000), or had a beige case, or both). So, really, the first computer I tried to check it against was the Psion MC400.

The third reason is to do with the nature of the MC400 itself: it was a remarkably advanced computer from a British company, and also a very unusual one, partly because it wasn't very successful. Seeing a British laptop in a US film is particularly remarkable.

You can see how advanced the MC400 looks in comparison with, say, a Toshiba T1200 from 1989 (when the movie was shot, assuming post-production took about 6 months). Can you imagine Holly lifting that with one hand to put it in a bag, or it even fitting on her seat's tray table?


And this tells you why the MC400 was the perfect laptop for demonstrating hi-tech Holly in the movie: the computer can be handled easily (it's only about 2kg) and fits nicely on the tray table, something that wasn't really possible with any other laptop of the era. I have to be careful with the phrasing here, because there were palmtops around at the same time, and also some awkward notebooks like the Epson PX-8, but they were quite clearly not like a modern laptop.

To illustrate why it was so progressive: one of the cleverest things about the MC series is the trackpad, which sits above the keyboard. As far as I can remember from the very brief time I used one, it used absolute positioning (top-left and bottom-right on the pad correspond to top-left and bottom-right on the screen), and accurate placement involved rolling your finger on the pad. At the time, laptops either had no mouse/trackpad/trackball at all (because PCs didn't yet need one); or you'd plug in a serial mouse (as many people still do with a PC laptop); or, in a few cases (maybe a few years later), you could get a weird mini trackball that clipped onto the side of the keyboard.
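
To make the absolute-versus-relative distinction concrete, here's a little C sketch (purely my illustration: the pad resolution is invented, as I have no documentation for the real MC400 driver; only the 640x400 screen size is the MC400's):

#include <stdio.h>

/* Hypothetical pad resolution - not the real MC400 values. */
#define PAD_W 100
#define PAD_H 60
#define SCREEN_W 640   /* the MC400's real screen size */
#define SCREEN_H 400

/* Absolute mode (MC400-style): each pad position maps to exactly
 * one screen position, so you roll your finger for fine placement. */
void absoluteToScreen(int padX, int padY, int *sx, int *sy)
{
  *sx = padX * SCREEN_W / PAD_W;
  *sy = padY * SCREEN_H / PAD_H;
}

/* Relative mode (modern trackpads): pad motion nudges the pointer. */
void relativeMove(int dx, int dy, int *sx, int *sy)
{
  *sx += dx;
  if (*sx < 0) *sx = 0;
  if (*sx >= SCREEN_W) *sx = SCREEN_W - 1;
  *sy += dy;
  if (*sy < 0) *sy = 0;
  if (*sy >= SCREEN_H) *sy = SCREEN_H - 1;
}

int main(void)
{
  int sx = 0, sy = 0;
  absoluteToScreen(50, 30, &sx, &sy);    /* centre of the pad...   */
  printf("absolute: (%d,%d)\n", sx, sy); /* ...centre of screen    */
  relativeMove(10, 5, &sx, &sy);         /* nudge from where we are */
  printf("relative: (%d,%d)\n", sx, sy);
  return 0;
}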

After the failure of the MC series, Psion re-used its multi-tasking architecture and OS (EPOC) in a new product: the first genuinely useful (and incredibly successful) PDA, the Psion Series 3 🙂 !



You can see how it shares some of the design language of the MC400 series: the ribbed casing, and the silver logo followed by the dark model-name banner. In addition, the keys have a similar colour scheme. The big differences are the screen size (240x80 pixels vs 640x400) and the built-in apps.

It was only with the arrival of the PowerBook 100 in 1991 that the pointing device (a trackball) moved in front of the keyboard, and it took until the mid-1990s for Apple, or anyone else, to ship a trackpad - which, again, shows how advanced the MC400 was.

After the PB100 appeared, all laptops were laid out this way (apart from IBM's flirtation with the ThinkPad 'nipple'):


The MC400 (and Psion Series 3) had one last trick up their sleeves: they were based around a pure solid-state storage model, using SRAM and serial Flash cartridges that plugged into the unit rather like modern USB memory sticks, except that they inserted fully.




Ground-breaking machines!

Friday 29 December 2023

The Humble Microfloppy Disk: A Vehicle of Insidious Cultural Imperialism

I think this is the longest title I've had for a blog post!

And yet the post should be relatively short.

I came across this video about the history of the microfloppy disk: the 720kB/800kB/1.4MB removable disk format that lives on in the shape of the Save icon, and in the classic (but only marginally funny) joke about a kid thinking one is a 3-D print of the Save icon.


[https://youtu.be/djsyVgTGaRk?si=Kd0Z1nrqXfmUG15c]

It's an intriguing history, mostly because there was a fairly rapid transition from 8" floppy disks to 5.25" floppy disks in the 1970s; but then, despite Sony's microfloppy arriving at the very beginning of the 1980s and being so superior, it took about 5 to 7 years before it started to dominate (hint: the IBM PC standard held it back).

But one fact really blew my mind: it turns out the 3.5" microfloppy doesn't exist. Let's say that again - the 3.5" microfloppy doesn't exist.

In reality it's 9cm, not 3.5". I've used them since the mid-1980s, and in almost 40 years I never knew this - I was duped by some Cultural Imperialism!

In retrospect, it should be pretty obvious that the 3.5" microfloppy is unlikely to have a specification in inches, simply because it was designed by Sony, a Japanese company, and Japan uses metric. CDs, for example, are 12cm: they were designed in Europe and Japan. 3.5 inches is 8.89cm, just over 1mm less than the correct size for a microfloppy disk - and that 1mm matters.

We can prove this to ourselves by measuring a disk (which I did) and taking a photo. The trick is to compensate for parallax: if you're looking at the disk from the centre, the width really can look about 1mm shorter, depending on the thickness of the ruler you use. In this photo I compensated by using a panoramic shot. That way I can measure 0cm (actually 20cm) directly above the left-hand side of the disk and 9cm (actually 29cm) directly above the right-hand side, and you can see that I didn't move the ruler or cheat by some other mechanism (though vertically, you can see it isn't straight).



Why is cultural imperialism important? Because metric versus imperial measurement is a practical issue blocked by political games: metric measurements are objectively better, but many people in power have an agenda to maintain historical measurement systems.

Why would they do that? Because Imperial measurements are more complex, and that complexity makes it easier to manipulate people - to pull the wool over their eyes. It arises because different types of unit aren't easily comparable (e.g. weight, mass, volume, length and time), and because different scales of the same kind of unit use different bases (12 inches per foot, 3 feet per yard, and almost no-one knows that there are 1,760 yards in a mile).


This presents a barrier to understanding, which reduces people's ability to reason about units to merely comparing values of the same kind. It has a measurable impact on maths attainment in the UK [@todo MetricViewsLink].

For example, someone sells 7oz of cherries for 2 crowns and 1lb of cherries for £1.5s.6d. Which is better value? To know that, you need to know that there are 16 ounces in a pound (weight); 5 shillings in a crown; 20 shillings in a pound (money); and 12 pennies in a shilling. Then you convert everything into ounces and shillings (or maybe pennies), giving 7oz for 10 shillings and 16oz for 25.5 shillings. Now you know that the 7oz price is the cheaper per ounce (just).
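
If you'd rather let a computer do it, here's the comparison as a tiny C sketch, with everything reduced to the smallest units (pennies and ounces):

#include <stdio.h>

int main(void)
{
  /* 2 crowns = 2 x 5 shillings x 12 pennies = 120d, for 7oz.  */
  /* £1 5s 6d = 240d + 60d + 6d = 306d, for 1lb = 16oz.        */
  double perOzA = 120.0 / 7.0;   /* 17.14d per ounce */
  double perOzB = 306.0 / 16.0;  /* 19.13d per ounce */

  printf("A: %.2fd/oz, B: %.2fd/oz\n", perOzA, perOzB);
  return 0;
}

Notice how much of the work is just remembering the conversion constants; in a decimal system the sums collapse to a single division each.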

That's how it was in the UK before 15 February 1971, when we switched from £sd on Decimal Day. It took well over a century, from the mid-1800s to the mid-1960s, before the UK finally managed to agree. At the time, people were worried that decimalisation would let traders con customers, yet they never considered that it was much easier to con people using £sd money.

Nobody alive in the UK would consider shifting back to that awful system, yet we, who are generally in favour of metric measurements, are quite happy to let Imperialists force us to use non-metric units. And that's because there is effectively a deliberate attempt to switch everyone back: they convert metric to imperial units, then delete references to the metric units, and when questioned they appeal to ‘patriotism’ or to your compassion for their stubbornness.

A case in point is the 2022 UK government consultation on Imperial measurements, billed as allowing us to use imperial measurements. But that was a lie: we can already use imperial measurements in the UK; we just have to include metric measurements and make them at least as prominent. What the government actually wanted was to be able to omit metric measurements, and to further that aim it rigged the consultation so that it wasn't possible to tell the government you preferred metric. All the questions were along the lines of “Do you want things to remain as they are, or should metric be allowed to be omitted?”, so the balance of responses had to tilt in favour of eliminating metric.

In the end, over 100,000 responses were submitted, and respondents (including myself) found ways to assert their preference for metric (via the occasional "other comments" boxes). Because the consultation didn't go the way the government wanted, it didn't publish the findings within the promised 12-week period, but waited a year.

We found out the results on December 27th. Over 98.7% of respondents said, as clearly as the consultation allowed, that they preferred either the current rules or metric-only. So the government... introduced imperial measurements for bottles of wine “as a first step” towards more Imperialism: a step that no-one wanted, that supermarkets are already saying they won't stock, and that is impossible to sell on a global market.

It's all covered by the pro-metric UK society at metricviews.uk: how to respond to the survey; mistakes & bias in the consultation; how the survey could have been fixed; government ignores complaints about the survey; why no response after a year; and finally, government confirms 99% don't want more Imperialism.

In conclusion, imperial measurements are embarrassing in the 21st century, but coercion is being used to perpetuate them. What we need is #MoreMetric.



Thursday 28 December 2023

Dialog Disillusion - The Mac Programming Primer Let Me Down

 Introduction

We did a bit of Macintosh programming at UEA, my undergraduate university, between 1986 and 1989. There we mostly used the interpreted MacPascal and a couple of sheets listing the Toolbox APIs. We had infrequent access to MPW Pascal on the Mac IIs in the third year, but the vast majority of development was done on Mac 512Ks and Mac Pluses.

This meant that we didn't really learn Macintosh programming properly. That's partly because MacPascal didn't support it properly (it used wacko inline functions to access the Toolbox), partly because we didn't get enough time on the Macs, and partly because we just didn't have enough usable documentation.

So, when I found a copy of The Macintosh Pascal Programming Primer in about 1993, when I finally had a Mac (a Performa 400), I was overjoyed! I followed the entire set of examples from beginning to end and found them really educational: a whole bunch of well-written example applications that covered most of the needs of Toolbox-based programs. The only difference was that I was using THINK C 5.0.4 instead of THINK Pascal, but it was easy to translate.

I used this knowledge to write a 16-bit RISC CPU simulator that gained me access to an MPhil degree in Computer Architecture at the University of Manchester between 1996 and 1998.

The Problem

Recently I've wanted to write a simple simulation framework for the classic Mac OS that consists of a dialog box to enter parameters and a main window to run the simulation. I probably want to integrate the dialog box with the main window and allow it to be updated live, but to start with I thought it would be easier to use a modal dialog, so that the user interaction would be:
  1. Edit the parameters
  2. Start/Restart the simulation
  3. Maybe stop the simulation before it ends
  4. Go back to steps 1 or 2 or Quit.
I started by taking the demo THINK C program oopsBullseye and then adding the dialog-handling code from the Macintosh Pascal Programming Primer. But it didn't work - it just crashed the Mac every time (actually just a Mini vMac emulator, but it's still a crash).

I wondered what kind of mistake I'd made, so I went back to the original dialog chapter (chapter 6) and followed it through. Lo and behold, it worked. I still couldn't see where I'd gone wrong, but I suspected it was because, in my version, the dialog box appeared behind the main window, and that seemed to hang it. So I modified the dialog demo to make it more like my application: the window would be open all the time (not just while the countdown was happening), countdowns could be interactively started or restarted, and I had to support update events.

And then I found that this modified dialog application didn't work any more either! It had the same problem: the dialog box appeared behind the main window, and the program crashed when ModalDialog was called. I scoured my copies of Inside Macintosh and Macintosh Toolbox Essentials (I have paper copies of both) and found some example code for modal dialogs, but it still wasn't obvious what the difference was.

It turns out that the Macintosh Pascal Programming Primer does dialog boxes really badly! I was gutted! Its example uses a couple of poor idioms which would mislead other programmers, and they make all the difference.

Analysis

TMPPP does two basic things that are wrong.

Firstly, it creates a global dialog box in DialogInit() (by using GetNewDialog(..) to read it in from the application's resources), which then sits in memory the whole time as a window you can't see. This means that when any other windows are created, the dialog box will pop up behind them when ShowWindow(dialogPtr) is called, and then ModalDialog(..) will crash (the Mac!).

What it should do is create the dialog box when needed, i.e. in the dialog-box handler, using GetNewDialog(..); and, when the user has finished with it, dispose of it with DisposeDialog(). Then the operation of the modal dialog is handled all in one place, and the Mac can deallocate the dialog box's memory when it isn't needed, which is what we want.

Secondly, it violates the standard model/view/controller paradigm. Here, essentially, the model is the set of parameters edited by the dialog box. But in their example, they store the actual parameters in the dialog-box items themselves; then, when the dialog handler is called, the items are saved to an internal data structure; and only if the user presses Cancel is the internal data structure used to restore the dialog-box items.

It should be done the other way around: the model is the internal data structure. When the dialog-box handler is called, the parameters should be copied into the dialog-box items (which is the RestoreSettings(..) operation); and when the user quits by clicking [Save], the new item values should be copied back into the internal data structure, which is the SaveSettings(..) operation ([Cancel] doesn't do anything; it just quits the dialog without updating the internal data structure).

The New Dialog Box Demo

So, my new dialog-box demo is included here. It's significantly shorter (<500 lines); it doesn't use the Notification Manager; but, importantly, it does use the modal dialog box the way it's supposed to be used. I avoid most of the repeated copying of near-identical lines by factoring out the code that sets and gets controls. I think this will be just as easy for new programmers to understand, because they won't have to scan a whole set of very similar lines to work out what each group is doing: they can just refer to the lower-level getter/setter code, and when they write their own dialog boxes, they'll be more likely to factor theirs too.

The resources are almost exactly the same. I removed a menu option because it no longer applied, and you don't need the SICN icon.


To build the resources, create the project's .rsrc file in ResEdit. For each resource below: first create it and fill in its fields (then Close Window); then choose Get Info to set its ID, name and attributes (then Close, Close):

  • DITL (see image below; start with the Save button, then Cancel, then the other fields). Info: ID=400, Name="Alarm", Purgeable (only).
  • DITL (see image below; start with the OK button, then the text field). Info: ID=401, Name="About", Purgeable (only).
  • ALRT: Top=40, Bottom=142, Left=40, Right=332, DITL=401, Default Color. Info: ID=401, Name="About", Purgeable (only).
  • DLOG: Top=40, Bottom=200, Left=60, Right=320, DITL=400, Default Color, standard double-border dialog style, not initially visible, no close box. Info: ID=400, Name="Alarm", Purgeable (only).
  • MENU: [X] Enabled, Title=• Apple Menu [ENTER]; [X] Enabled, Title="About..." [ENTER]; [ ] Enabled, • separator line. Info: ID=400, no attributes.
  • MENU: [X] Enabled, Title="File" [ENTER]; [X] Enabled, Title="Settings..", Cmd-Key: S [ENTER]; Title="Run", Cmd-Key: R [ENTER]; Title="Quit", Cmd-Key: Q [ENTER]. Info: ID=401, no attributes.
  • MENU: [X] Enabled, Title="Edit" [ENTER]; then [ ] Enabled: Title="Undo", Cmd-Key: Z [ENTER]; separator line [ENTER]; Title="Cut", Cmd-Key: X [ENTER]; Title="Copy", Cmd-Key: C [ENTER]; Title="Paste", Cmd-Key: V [ENTER]; Title="Clear", Cmd-Key: none [ENTER]. Info: ID=402, no attributes.
  • MBAR: each time, click in '****' and choose Resource:Insert New Field(s), for menu resource IDs 400, 401 and 402; the top should say "# of menus 3" at the end. Info: ID=400, no attributes.
  • WIND: close the default editor, then choose Resource:Open Using Template [WIND] [OK]. Bounds Rect=70, 36, 106, 156 [Set], Proc ID=0, Visible=false, GoAway=false, RefCon=0, Title="Countdown", Auto Position=$0000. Info: ID=400, Name="Countdown", Purgeable (only).

Parameters DITL


About DITL




When you've finished, close the .rsrc file. ResEdit will ask you to save it - save it. Then open up the Dlog.π project. Choose File:New and create a stub of a C program:

int main(void)
{
    return 0;
}

Choose File:Save to save it as Dlog.c. Choose Project:Add "Dlog.c" to add the file to the project. You don't need to do anything clever to add the .rsrc file to the project: THINK C automatically associates the .rsrc file that has the same prefix as your project.

Now replace the dummy program with the real contents, as follows.

Dlog.h

/**
 * @file: Dlog.h
 */

#ifndef Dlog_h
#define Dlog_h

#define kBaseResId 400
#define kAboutAlert 401
#define kBadSysAlert 402

#define kSleep 60

#define kSaveButton 1
#define kCancelButton 2
#define kTimeField 4
#define kSOrMField 5
#define kSoundOnBox 6
#define kIconOnBox 7
#define kAlertOnBox 8
#define kSecsRadio 10
#define kMinsRadio 11

#define kDefaultSecsId 401
#define kDefaultMinsId 402

#define kOff 0
#define kOn 1

#define kSecondsPerMinute 60

#define kTop 25
#define kLeft 12

#define kMarkApplication 1
#define kAppleMenuId (kBaseResId)
#define kFileMenuId (kBaseResId+1)
#define kAboutItem 1

#define kChangeItem 1
#define kStartStopItem 2
#define kQuitItem 3

#define kSysVersion 2

typedef enum{
  kBoolFalse=0,
  kBoolTrue=1
}tBool;

typedef enum {
  kTimeUnitSeconds=0,
  kTimeUnitMinutes=1
}tTimeUnit;

typedef struct {
  long iTime;
  int iSound, iIcon, iAlert;
  tTimeUnit iUnit;
}tSettings;



extern Handle DlogItemGet(DialogPtr aDialog, int aItem);
extern void CtlSet(DialogPtr aDialog, int aItem, int aValue);
extern int CtlGet(DialogPtr aDialog, int aItem);
extern void CtlFlip(DialogPtr aDialog, int aItem);
/* extern void ITextSet(DialogPtr aDialog, int aItem, Str255 *aStr); - unused; its definition is commented out in Dlog.c. */

extern void StartCountDown(long aNumSecs);
extern void HandleCountDown(void);
extern void UpdateCountDown(void);

extern void RestoreSettings(DialogPtr aSettingsDialog);
extern void SaveSettings(DialogPtr aSettingsDialog);
extern void HandleDialog(void);
extern void HandleFileChoice(int aTheItem);
extern void HandleAppleChoice(int aTheItem);
extern void HandleMenuChoice(long aMenuChoice);
extern void HandleMouseDown(void);
extern void HandleEvent(void);
extern void MainLoop(void);
extern void MenuBarInit(void);
extern void DialogInit(void);
extern void WinInit(void);
extern tBool Sys6OrLater(void);
extern void ToolboxInit(void);
extern int main(void);

#endif // Dlog_h

Dlog.c

/**
 * Dlog.c
 */

#include "Dlog.h"

tBool gDone;

EventRecord gTheEvent;
tSettings gSavedSettings;


WindowPtr gCountDownWindow;
long gTimeout, gOldTime;
tBool gIsCounting;

Handle DlogItemGet(DialogPtr aDialog, int aItem)
{
  int itemType;
  Rect itemRect;
  Handle itemHandle;
  GetDItem(aDialog, aItem, &itemType, &itemHandle, &itemRect);
  return itemHandle;
}

void CtlSet(DialogPtr aDialog, int aItem, int aValue)
{
  Handle itemHandle=DlogItemGet(aDialog, aItem);
  SetCtlValue((ControlHandle)itemHandle, aValue);
}

int CtlGet(DialogPtr aDialog, int aItem)
{
  Handle itemHandle=DlogItemGet(aDialog, aItem);
  return GetCtlValue((ControlHandle)itemHandle);
}

/*
void ITextSet(DialogPtr aDialog, int aItem, Str255 *aStr)
{
  Handle itemHandle=DlogItemGet(aDialog, aItem);
  SetIText(itemHandle, aStr);
}
*/
void CtlFlip(DialogPtr aDialog, int aItem)
{
  Handle itemHandle=DlogItemGet(aDialog, aItem);
  SetCtlValue((ControlHandle)itemHandle,
    (GetCtlValue((ControlHandle)itemHandle)==kOn)? kOff:kOn);
}

void StartCountDown(long aNumSecs)
{
  GetDateTime(&gOldTime);
  if(gSavedSettings.iUnit==kTimeUnitMinutes) {
    aNumSecs*=kSecondsPerMinute;
  }
  gTimeout=gOldTime+aNumSecs; // this is the timeout.
  gIsCounting=kBoolTrue;
}

// Called on Null event.
void HandleCountDown(void)
{
  if(gIsCounting==kBoolTrue) {
    long myTime;
    GetDateTime(&myTime);
    if(myTime!=gOldTime) {
      GrafPtr oldPort;
      gOldTime=myTime; // gTimeout-gOldTime==remaining seconds.
      // generate an update event by invalidating the window's rect.
      GetPort(&oldPort);
      SetPort((GrafPtr)gCountDownWindow);
      InvalRect(&gCountDownWindow->portRect);
      SetPort(oldPort);
    }
  }
}

void UpdateCountDown(void)
{
  //
  WindowPtr win=(WindowPtr)gTheEvent.message;
  if(win==gCountDownWindow) {
    long remaining=gTimeout-gOldTime;
    Str255 myTimeString;
    BeginUpdate(win);
    MoveTo(kLeft, kTop);
    if(remaining<=0 || gIsCounting==kBoolFalse) {
      remaining=0;
      gIsCounting=kBoolFalse;
    }
    NumToString(remaining, myTimeString);
    EraseRect(&(gCountDownWindow->portRect));
    DrawString(myTimeString);
    EndUpdate(win);
  }
}

void RestoreSettings(DialogPtr aSettingsDialog)
{
  Handle itemHandle;
  Str255 timeString;
  tBool isInSeconds=(gSavedSettings.iUnit==kTimeUnitSeconds)?
      kBoolTrue:kBoolFalse;
 
  itemHandle=DlogItemGet(aSettingsDialog, kTimeField);
  NumToString(gSavedSettings.iTime, timeString);
  SetIText(itemHandle, timeString);
 
  CtlSet(aSettingsDialog, kSoundOnBox, gSavedSettings.iSound);
  CtlSet(aSettingsDialog, kIconOnBox, gSavedSettings.iIcon);
  CtlSet(aSettingsDialog, kAlertOnBox, gSavedSettings.iAlert);
  CtlSet(aSettingsDialog, kSecsRadio, (isInSeconds==kBoolTrue)?kOn:kOff);
  CtlSet(aSettingsDialog, kMinsRadio, (isInSeconds==kBoolFalse)?kOn:kOff);

  itemHandle=DlogItemGet(aSettingsDialog, kSOrMField);
  SetIText(itemHandle,(gSavedSettings.iUnit==kTimeUnitSeconds)?
      "\pseconds":"\pminutes");
}

void SaveSettings(DialogPtr aSettingsDialog)
{
  Handle itemHandle;
  Str255 timeString;

  itemHandle=DlogItemGet(aSettingsDialog, kTimeField);
  GetIText(itemHandle, timeString);
  StringToNum(timeString, &gSavedSettings.iTime);
 
  gSavedSettings.iSound=CtlGet(aSettingsDialog, kSoundOnBox);
  gSavedSettings.iIcon=CtlGet(aSettingsDialog, kIconOnBox);
  gSavedSettings.iAlert=CtlGet(aSettingsDialog, kAlertOnBox);
  gSavedSettings.iUnit=(CtlGet(aSettingsDialog, kSecsRadio)==kOn)?
        kTimeUnitSeconds:kTimeUnitMinutes;
}

void HandleDialog(void)
{
  tBool dialogDone;
  int itemHit;
  long alarmDelay;
  Handle itemHandle;
  DialogPtr settingsDialog;
 
  settingsDialog=GetNewDialog(kBaseResId, NULL, (WindowPtr)-1);

  ShowWindow(settingsDialog);
  RestoreSettings(settingsDialog);
 
  dialogDone=kBoolFalse;
  while(dialogDone==kBoolFalse) {
    ModalDialog(NULL, &itemHit);
    switch(itemHit) {
    case kSaveButton:
      SaveSettings(settingsDialog); // update them.
      dialogDone=kBoolTrue;
      break;
    case kCancelButton:
      dialogDone=kBoolTrue;
      break;
    case kSoundOnBox:
    case kIconOnBox:
    case kAlertOnBox:
      CtlFlip(settingsDialog, itemHit);
      break;
    case kSecsRadio:
      CtlSet(settingsDialog, kSecsRadio, kOn);
      CtlSet(settingsDialog, kMinsRadio, kOff);

      itemHandle=DlogItemGet(settingsDialog, kSOrMField);
      SetIText(itemHandle, "\pseconds");
      break;
    case kMinsRadio:
      CtlSet(settingsDialog, kSecsRadio, kOff);
      CtlSet(settingsDialog, kMinsRadio, kOn);

      itemHandle=DlogItemGet(settingsDialog, kSOrMField);
      SetIText(itemHandle, "\pminutes");
      break;
    }
  }
  DisposeDialog(settingsDialog);
}

void HandleFileChoice(int aTheItem)
{
  switch(aTheItem) {
  case kChangeItem:
    HandleDialog();
    break;
  case kStartStopItem:
    HiliteMenu(0);
    StartCountDown(gSavedSettings.iTime);
    break;
  case kQuitItem:
    gDone=true;
    break;
  }
}

void HandleAppleChoice(int aTheItem)
{
  Str255 accName;
  MenuHandle appleMenu;
  switch(aTheItem) {
  case kAboutItem:
    NoteAlert(kAboutAlert, NULL);
    break;
  default:
    appleMenu=GetMHandle(kAppleMenuId);
    GetItem(appleMenu, aTheItem, accName);
    OpenDeskAcc(accName);
    break;
  }
}

void HandleMenuChoice(long aMenuChoice)
{
  int theMenu, theItem;
  if(aMenuChoice!=0) {
    theMenu=HiWord(aMenuChoice);
    theItem=LoWord(aMenuChoice);
    switch(theMenu) {
    case kAppleMenuId:
      HandleAppleChoice(theItem);
      break;
    case kFileMenuId:
      HandleFileChoice(theItem);
      break;
    }
    HiliteMenu(0);
  }
}

void HandleMouseDown(void)
{
  WindowPtr whichWindow;
  int thePart;
  long menuChoice;
  thePart=FindWindow(gTheEvent.where, &whichWindow);
  switch(thePart) {
  case inMenuBar:
    menuChoice=MenuSelect(gTheEvent.where);
    HandleMenuChoice(menuChoice);
    break;
  case inSysWindow:
    SystemClick(&gTheEvent, whichWindow);
    break;
  case inDrag:
    DragWindow(whichWindow, gTheEvent.where, &screenBits.bounds);
    break;
  case inGoAway:
    gDone=kBoolTrue;
    break;
  }
}

void HandleEvent(void)
{
  char theChar;
  WaitNextEvent(everyEvent, &gTheEvent, kSleep, NULL);
  switch(gTheEvent.what){
  case mouseDown:
    HandleMouseDown();
    break;
  case keyDown: case autoKey:
    theChar=(char)(gTheEvent.message & charCodeMask);
    if((gTheEvent.modifiers & cmdKey)!=0) {
      HandleMenuChoice(MenuKey(theChar));
    }
    break;
  case nullEvent:
    HandleCountDown();
    break;
  case updateEvt:
    UpdateCountDown();
    break;
  }
}

void MainLoop(void)
{
  gDone=kBoolFalse;
  while(gDone==kBoolFalse) {
    HandleEvent();
  }
}

void MenuBarInit(void)
{
  Handle myMenuBar;
  MenuHandle aMenu;
  myMenuBar=GetNewMBar(kBaseResId);
  SetMenuBar(myMenuBar);
  DisposHandle(myMenuBar);
  aMenu=GetMHandle(kAppleMenuId);
  AddResMenu(aMenu, 'DRVR');
  DrawMenuBar();
}

void WinInit(void)
{
  gCountDownWindow=GetNewWindow(kBaseResId, NULL, (WindowPtr)-1);
  gIsCounting=kBoolFalse;
  SetPort(gCountDownWindow);
  TextFace(bold); // it's the same in THINK C.
  TextSize(24);
  ShowWindow(gCountDownWindow);
}

void DialogInit(void)
{
  gSavedSettings.iTime=12;
 
  gSavedSettings.iSound=kOn;
  gSavedSettings.iIcon=kOn;
  gSavedSettings.iAlert=kOn;
 
  gSavedSettings.iUnit=kTimeUnitSeconds;
}

tBool Sys6OrLater(void)
{
  OSErr status;
  SysEnvRec SysEnvData;
  tBool result=kBoolTrue;
  status=SysEnvirons(kSysVersion, &SysEnvData);
  if(status!=noErr || SysEnvData.systemVersion<0x600) {
    StopAlert(kBadSysAlert, NULL);
    result=kBoolFalse;
  }
  return result;
}

void ToolboxInit(void)
{
  InitGraf(&thePort);
  InitFonts();
  InitWindows();
  InitMenus();
  TEInit();
  InitDialogs(NULL);
  MaxApplZone();
}

int main(void)
{
  ToolboxInit();
  if(Sys6OrLater()) {
    DialogInit();
    MenuBarInit();
    WinInit();
    InitCursor();
    MainLoop();
  }
  return 0;
}

Conclusion

As a whole, The Macintosh Pascal (and C) Programming Primer is a brilliantly simple introduction to traditional (non-OO) Macintosh programming. However, the dialog-box chapter ("Working With Dialogs") is a major exception. After a bit of sleuthing (which I should never have needed to do), I worked out the problem and wrote a better demo (and made sure the cursor is an arrow instead of the watch icon). This means I'm closer to my goal of writing a simple simulation framework.

For the chronically lazy amongst you, feel free to download the full project from here.

Friday 15 December 2023

Matching Pairs / The Dyslexia Advantage

It turns out there are some advantages to dyslexia.

The Advantage

I recently came across this YouTube video.


It starts with how dyslexic people can be better at identifying impossible shapes (like a Necker cube), but it turns out there are a number of cases where dyslexic people are better at processing images as a whole, or images containing scattered information, or at interpreting information at the periphery of their vision.

And part of this is the trade-off from the relatively short period since humanity developed writing, compared with the relatively long period during which a variety of visual abilities were more advantageous.

The interesting thing for me is that it made me think about why my younger sister (who has some degree of dyslexia) always used to beat me at card games, starting with Matching Pairs, even when we were really young. And this is perhaps the reason why.

Matching Pairs

I'd have a basic strategy for Matching Pairs - and I needed a strategy, because I couldn't remember the layout of the images once they were flipped back over, nor the set of cards that had already been flipped.

So, my strategy was this: firstly, if I couldn't remember where two matching cards were (the normal case), I'd turn over a card that I didn't think had been picked before (or at least not recently). If it was familiar, I could guess at picking its match. If it wasn't familiar, I'd pick a second card that also wasn't in the set of what I thought were recent cards, because then there's a better chance of hitting a matching card.

The trouble was that even if the first card was familiar, I'd never know where the matching card was; I'd merely have a feeling about the general area. That meant I was wrong most of the time.

Reflecting on the way my memory works - which is visual - it's somewhat like this:

Imagine that in Matching Pairs your whole visual field either contains a single card turned over; or a grid of unturned cards; or two turned cards, where you can see how far one is from the other but can't remember their absolute positions.

Then, in the history of the game so far, you can flag the areas where cards have been found, but each of those positions could contain any one of the cards seen so far. That's why it's challenging for me, even though it's such a simple game.

For someone with dyslexia it could be an easier game, because they'd have a better sense of the whole playing area for the cards that have been turned over - rather like being able to visualise it in that state. Hence my sister was much better at playing the game than me.

ZX81 Version

Because it's such a simple game, it didn't take me long to imagine how I'd write a version in BASIC for a ZX81. Here's the listing for it. You can play it yourself by going to the ZX81 Javascript emulator and typing in the listing below:

Note: In line 5, the 16 graphics characters are: <space><graphic 1>..<graphic 7><graphic Q>..<graphic Y><graphic space>.



Playing The Game

The game works as a two-player game. A grid of 6 x 6 (i.e. 36) tiles is generated, containing 18 unique pairs of random 4x4-pixel patterns. The tiles are initially shown face-down, labelled with the characters 0..9 then A..Z in the top left-hand corner of each tile.

Player 1 then types one of these characters (e.g. '0') and that tile is turned over. Then Player 1 types another character (e.g. 'J') and its tile is turned over. If the tiles match, Player 1's score is incremented; otherwise the tiles remain turned over for 2 seconds, to give the players a chance to memorise them, before being turned back.

Player 2 always has the next go, choosing the codes for two tiles in sequence, as per Player 1.

Play returns to player 1 and both players repeat the process. Over time, more and more pairs are found.



The game finishes when all the tiles have been matched. The player who has matched the most tiles wins (or, if both have 9 pairs each, it's a draw).

This version of the game is good for exploring players' relative strengths. In particular, one of the differences between this game and a classical Matching Pairs game is that the patterns on the tiles are abstract - just a random arrangement of 4x4 pixels rather than familiar images. This favours people who find it easier to process abstract images.

The Automatic Game

It's fairly easy to change the game so that the computer plays itself. In this version, the program selects a random character in the right range, as though it had been typed, and then the game plays out just as it would with human players. We don't need to pause at the end of every go where there's no match, but we do need to make sure the computer doesn't pick the same tile twice, nor overturn tiles that are already matched: so we set matched pairs to '<space>' characters and build in an extra rule.
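
Here's a quick C sketch of that turn logic, including the extra rule (my reconstruction, not a translation of the BASIC):

#include <stdio.h>

#define NTILES 36

/* One go: reveal two tiles. tiles[] holds one pattern letter per
 * tile, set to ' ' once matched. Returns 1 on a match. */
int takeTurn(char tiles[], int pick1, int pick2, int *score)
{
  /* The extra rule for the automatic game: never pick the same
   * tile twice, and never pick an already-matched tile. */
  if (pick1 == pick2 || tiles[pick1] == ' ' || tiles[pick2] == ' ')
    return 0;
  if (tiles[pick1] == tiles[pick2]) {
    tiles[pick1] = ' ';  /* mark the pair as found */
    tiles[pick2] = ' ';
    (*score)++;
    return 1;
  }
  /* No match: in the real game both tiles stay visible for about
   * 2 seconds so the players can memorise them. */
  return 0;
}

int main(void)
{
  char tiles[NTILES + 1] = "AABBCCDDEEFFGGHHIIJJKKLLMMNNOOPPQQRR";
  int score[2] = {0, 0};

  takeTurn(tiles, 0, 1, &score[0]);  /* player 1 finds the AA pair */
  takeTurn(tiles, 2, 4, &score[1]);  /* player 2 misses: B vs C    */
  printf("P1=%d P2=%d\n", score[0], score[1]);
  return 0;
}

(An unshuffled deck, just to keep the demo deterministic.)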


There are three things I really noted about the automatic game. Firstly, the ZX81 is terribly slow. Even though its effective reaction times are much faster than a human's, I found it would take over 20 minutes to solve a game - about 3 or 4 times slower than a pair of humans.

Secondly, even though I consider myself to have a terrible memory for this kind of game, the automatic version shows that I do in fact remember the patterns much better than my personal algorithm described above implies: I would end up remembering the positions of several pairs while the computer was repeatedly choosing the wrong second tile for one I already knew. Some of this came from the sheer repetition of the computer's wrong moves, but it still demonstrated that I was memorising locations even though I thought I wasn't.

Thirdly, watching it was quite pleasantly calming and after several minutes I struggled to stay awake! Yay for boring programs as a cure for insomnia!

Program Analysis

I originally wanted the game to fit in 1kB, but that turned out not to be possible - at least, I don't yet know how to squeeze it down. The two-player code itself is about 1124 bytes long if I remove the REM'd statements in lines 83 and 86 but include the extra automatic-game checking in lines 125, 130, 203 and 206. To get there it uses all the normal short-cuts, like NOT PI (0), SGN PI (1), CODE "X" for some values in the range 10..255, and VAL "X" for others.

Generating Patterns

The first challenge with the game was to create the random set of patterns. I didn't store any fixed pictures: the ZX81 has such terrible graphics that it hardly seemed worthwhile, and it would take up memory I didn't want to waste.

So, instead, I created 4x4 block patterns. The first issue is how to create random patterns without duplicating them - the ZX81 is slow, so checking each new pattern against all the previous ones would take O(n²) time, up to about 1000 checks. Instead, I simply generated a random number and then used the ZX81's 16-bit system variable SEED to generate each pattern directly: every time RND executes, SEED is updated, and it goes through all 65536 values before repeating itself, so I never needed to check for repeats. I stored each 16-bit random pattern in a pair of 8-bit characters, so I needed an array of 18 two-byte strings (M$) to do this.
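
If you want to see why the SEED trick works, here's the same idea in C. If I recall correctly, the ZX81's RND steps its SEED as s → 75·(s+1) mod 65537 − 1, a Lehmer generator whose sequence visits all 65536 states before repeating, so consecutive values can never collide:

#include <stdio.h>

/* One step of the ZX81 RND seed update (as I remember it). */
unsigned zx81RndStep(unsigned seed)
{
  return (75u * (seed + 1u)) % 65537u - 1u;
}

int main(void)
{
  unsigned pattern[18];
  unsigned seed = 12345;  /* any starting SEED will do */
  int i;

  /* 18 guaranteed-distinct 16-bit patterns, one RND step each;
   * on the ZX81 each one is stored as 2 characters in M$. */
  for (i = 0; i < 18; i++) {
    seed = zx81RndStep(seed);
    pattern[i] = seed;
    printf("pattern %2d = %04X\n", i, pattern[i]);
  }
  return 0;
}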

The second challenge was to make sure I created proper pairs of values in a random order. To do this, I first split the card generation into two levels: the 18 card patterns, which are 16 bits each, and the cards themselves, which are indexes into those patterns and only need 8 bits.

For those, I adapted my card-shuffling algorithm. I essentially created a string containing a list of pairs of cards: AABB..RR (the first to the 18th letter). Then I picked and removed a random card from the 'remaining' set of sorted cards and placed it at the beginning of the string, after the previously picked cards; the remaining set is then just the (n+1)th card onwards once n cards have been picked.

This card-shuffling algorithm guarantees that all the cards get shuffled, whereas a more literal algorithm would leave some pairs of cards still in order if they didn't happen to be picked for shuffling.
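
In C, the equivalent is essentially the classic Fisher-Yates shuffle - a sketch, not a translation of the BASIC:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define NTILES 36  /* 18 pairs */

int main(void)
{
  /* The deck: pairs in order, "AABB..RR"; each letter indexes
   * one of the 18 patterns in M$. */
  char deck[NTILES + 1] = "AABBCCDDEEFFGGHHIIJJKKLLMMNNOOPPQQRR";
  int i;

  srand((unsigned)time(NULL));

  /* Pick a random card from the 'remaining' set (position i
   * onwards) and move it to the front of that set: after n picks,
   * positions 0..n-1 hold the shuffled cards. */
  for (i = 0; i < NTILES - 1; i++) {
    int j = i + rand() % (NTILES - i);
    char tmp = deck[i];
    deck[i] = deck[j];
    deck[j] = tmp;
  }

  printf("%s\n", deck);
  return 0;
}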

Displaying Tiles

I wanted to be able to display the backs of tiles without using any of the graphics characters used by the tiles themselves, and without spaces between them, so I used ':'s for the main background and graphic characters between them. The routine for displaying tiles is a bit convoluted, as it involves a loop for each row, but it saves on 3 different AT calculations. Perhaps a single PRINT AT would have been better (and faster than the loop).

For the front side I didn't need to consider the display of the separators, because they've already been displayed. All I needed to do was display each row, which was fairly easy, as each row is 8 pixels. So, I obtained the pattern from the tile in P$, indexed by the position (1..36). This gave me a two-character M$ string, one character per row. Then I extracted each nibble and indexed that into the graphics patterns in C$.
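
As a sketch of that nibble indexing in C ('#' and '.' stand in for the ZX81's block-graphics characters in C$, and the exact bit-to-pixel layout here is my assumption):

#include <stdio.h>

/* A tile is 16 bits = 4 nibbles = 2x2 block characters, where each
 * nibble selects one of the 16 possible 2x2 pixel blocks (C$ on
 * the ZX81). Here we just print the pixels directly. */
void drawTile(unsigned pattern)
{
  int blockRow, blockCol, y;
  for (blockRow = 0; blockRow < 2; blockRow++) {
    for (y = 0; y < 2; y++) {            /* 2 pixel rows per block */
      for (blockCol = 0; blockCol < 2; blockCol++) {
        int shift = 12 - 4 * (blockRow * 2 + blockCol);
        int nibble = (pattern >> shift) & 0xF;  /* index into C$ */
        putchar((nibble >> (3 - 2 * y)) & 1 ? '#' : '.');
        putchar((nibble >> (2 - 2 * y)) & 1 ? '#' : '.');
      }
      putchar('\n');
    }
  }
}

int main(void)
{
  drawTile(0xBEEF);  /* any 16-bit value is a valid tile */
  return 0;
}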

The Y, X PRINT AT calculations were common to both front- and back-facing tiles, so I factored them out. None of this is fast, but it's fast enough for playing interactively.

A 1K Version

It always seems that I should be able to squeeze a ZX81 program into 1K, but perhaps the only way to do that here is to resort to machine code. The screen takes up about 20 chars per row x 21 rows + 4 = 424 bytes. The patterns and tiles take up 36 x 2 bytes = 72 bytes, and the graphics take another 16, for 88 bytes in all.

This means the total space needed in BASIC is about 1124+424+88 = 1636 bytes, which means it ought to fit on a Timex Sinclair 1000, as that has 2kB instead of 1kB of RAM.

It might be shorter to compute the patterns from scratch by resetting SEED each time. The ZX81 has about 800 or so bytes free in 1K, so this leaves about 300 bytes for the program - pretty tight, but at least it wouldn't take too long to write ;-) . And I could use things like the printer buffer for some extra space if needed.

Friday 23 June 2023

QUADFIT: Least Squares Quadratic Curve Fitting on a 1K ZX81

 Yet another snappy blog post title!

The sub-theme for this one is that by rethinking a problem in a different way, we can squeeze the solution into far fewer resources.

In this case, I'd recently been doing some work (at work) that involved fitting a quadratic curve to a set of noisy data points (they're derived from ADCs, which often need a lot of filtering to get decent results, despite their nominal 12-bit accuracy).

In practice, I found a website that covers the least-squares quadratic fitting algorithm:

https://www.omnicalculator.com/statistics/quadratic-regression

The interesting thing for me is that this algorithm is explained on quite a lengthy webpage; and on top of that there's a whole side box that lets you enter a set of (x,y) coordinates, and it figures out the quadratic coefficients (and the correlation).

That's quite a lot of resources, which implies it's fairly involved - but is it really? Looking at the equations: for a fit y = ax² + bx + c over n points, the standard derivation boils down to the terms:

Sxx = ∑x² − (∑x)²/n
Sxy = ∑xy − (∑x)(∑y)/n
Sxx² = ∑x³ − (∑x)(∑x²)/n
Sx²y = ∑x²y − (∑x²)(∑y)/n
Sx²x² = ∑x⁴ − (∑x²)²/n

from which the coefficients fall out as:

a = (Sx²y·Sxx − Sxy·Sxx²) / (Sxx·Sx²x² − (Sxx²)²)
b = (Sxy·Sx²x² − Sx²y·Sxx²) / (Sxx·Sx²x² − (Sxx²)²)
c = (∑y − b∑x − a∑x²)/n

It looks like any programmer would need to store the set of (x,y) coordinates used, along with the number of coordinates n, and then perform a number of loops to sum all the terms, starting with the averages ∑x/n and ∑y/n.

But the format of all the S** terms reminded me of how 1980s Casio scientific calculators managed to perform linear regression without needing to store an array of data - they needed ∑x, ∑x² and ∑xy type terms too. And it was possible to correct the data you'd entered, even though they didn't store the data itself. How was this possible?

Well, the answer lies in observing that, firstly, we don't need to pre-calculate the averages of x, x² or y: you can simply accumulate ∑x, ∑x² and ∑y as each new coordinate is entered, and divide by the current value of n. The same reasoning applies to all the S** terms: you simply update the running sums whenever you enter a new coordinate. This reduces the problem to updating about 10 stored values on each new entry (a doesn't need to be stored; it can merely be displayed).

The final question, then, is how to delete data. That's surprisingly simple too: take the erroneous coordinate (x',y') and subtract the corresponding x', x'², x'³, x'⁴, y', x'y' and x'²y' values from the running sums, then decrement n.
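
Here's the whole running-sums technique as a minimal C sketch (my reconstruction, not the ZX81 listing): update() adds a point with w=+1 or deletes one with w=−1, and coeffs() solves for a, b and c using the S** formulas above.

#include <stdio.h>

/* Running sums for an incremental least-squares quadratic fit. */
typedef struct {
  double n, sx, sx2, sx3, sx4, sy, sxy, sx2y;
} tQuadFit;

/* w = +1 to add a point, -1 to remove (i.e. correct) one. */
void update(tQuadFit *q, double x, double y, double w)
{
  q->n    += w;
  q->sx   += w * x;
  q->sx2  += w * x * x;
  q->sx3  += w * x * x * x;
  q->sx4  += w * x * x * x * x;
  q->sy   += w * y;
  q->sxy  += w * x * y;
  q->sx2y += w * x * x * y;
}

void coeffs(const tQuadFit *q, double *a, double *b, double *c)
{
  double n = q->n;
  double sxx   = q->sx2  - q->sx  * q->sx  / n;
  double sxy   = q->sxy  - q->sx  * q->sy  / n;
  double sxx2  = q->sx3  - q->sx  * q->sx2 / n;
  double sx2y  = q->sx2y - q->sx2 * q->sy  / n;
  double sx2x2 = q->sx4  - q->sx2 * q->sx2 / n;
  double det   = sxx * sx2x2 - sxx2 * sxx2;

  *a = (sx2y * sxx   - sxy  * sxx2) / det;
  *b = (sxy  * sx2x2 - sx2y * sxx2) / det;
  *c = (q->sy - *b * q->sx - *a * q->sx2) / n;
}

int main(void)
{
  tQuadFit q = {0};
  double a, b, c;
  int x;

  /* Points from y = 2x^2 - 3x + 1 should be recovered exactly. */
  for (x = 0; x < 5; x++)
    update(&q, x, 2.0 * x * x - 3.0 * x + 1.0, +1);

  update(&q, 10, 999, +1);  /* oops, a bad reading...   */
  update(&q, 10, 999, -1);  /* ...corrected by deletion */

  coeffs(&q, &a, &b, &c);
  printf("a=%g b=%g c=%g\n", a, b, c);  /* a=2 b=-3 c=1 */
  return 0;
}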

And because the computation really can be reduced to that degree, it can be squeezed into a 1K ZX81 (and probably an early-1980s Casio programmable!). Go to the Javascript ZX81 emulator, type POKE 16389,17*4 [Newline], then NEW. The ZX81 is then effectively a 1kB ZX81.


You'll find that as you get towards the end of the program, the edit line starts jumping up as you run out of memory. In fact, there are a whole 78 bytes free (including the screen space used by the program) for any enhancements you want (e.g. correlation!).

Saturday 22 April 2023

Barely Charging Network: Maybe we can't make it to Fully Charged in our Zoe EV.

 Introduction

We're planning to go to the Fully Charged Live show in Farnborough in late April. We've had our 22kWh Renault Zoe for over 6 years. Our Zoe is AC-only, but can charge at 22kW, which means that we can travel decent distances - if 22kW charging posts are available.

Six years ago, the Ecotricity-sponsored Electric Highway chargers along UK motorways provided a relatively excellent means of getting around the country on AC charging. It helped (ironically) that even though there were few chargers, there were also few EVs, so there wasn't much competition. Today most EVs charge via DC (CCS, though there are a few Leafs around that still charge via CHAdeMO), so there still isn't much competition for AC chargers.

Unfortunately, the 22kW AC charging network in the UK has grown and simultaneously been trashed over this period. It's easy to describe why:

  • The Electric Highway has been replaced by GRIDSERVE, which initially took out all the 22kW charging facilities and replaced them with nothing. Then they installed '22kW' charging points which never charge at the full rate. GRIDSERVE do not seem to understand that spending 2 hours charging at 11kW or less is ABSOLUTELY UNACCEPTABLE. Bosses at GRIDSERVE, a message for you: why would anyone spend 2 hours charging at your charging points? I mean, literally anyone? Your AC market is literally ZERO people. No-one! ZERO!!!!!!
  • Open competition, combined with a total lack of regulation of key aspects, means that there are literally dozens of different charging apps you need to download onto your phone in order to charge. It was OK when it was only the Electric Highway; it's not OK when it's dozens of charging apps. And they are all variants on exactly the same thing. We only need one.
  • Many, many, MANY charging points are broken. Currently we have to plan for at least 2 alternatives to the main charging point we want to use. Why are there no requirements for charging-point maintenance? It can't all be done remotely; e.g. a software upgrade won't fix slow BP chargers that have failed due to water ingress.
  • Many charging points advertised as 22kW simply aren't: they don't charge faster than 11kW. For example, the ones at Bicester OX26 6BP. In this case, because I know they won't charge at 22kW, there is no point in charging at them. YOU HAVE WASTED MONEY INSTALLING THESE CHARGERS. But what's worse is that because we no longer know whether any proprietary '22kW' charger will actually charge at 22kW, we can't risk using any of them, even from a different company, anywhere in the UK, unless there is proof that it actually charges at 22kW.
As it turns out, some companies' chargers really do work as advertised: BP Pulse (formerly Chargemaster), Pod Point and Swarco E-Connect all charge at 22kW (though some individual units are broken).

The Plan

Getting to Farnborough involves going down the M40, where we had problems charging a year ago, and I can't see that it's any better now. There are GRIDSERVE charging points at Cherwell services after 57 miles, so those are out, because they won't charge at 22kW. And there's nothing reliable close to them.

However, there are some charging points at Kidlington, at about 67 miles:
Of these, only a single charging point is acceptable. The top blue one is out of the way; the remaining blue one I can't be sure about; the second BP one is out of order; and the bottom one is GRIDSERVE, which won't charge at 22kW. That's a failure rate of 80%.

Moving on: there's a SWARCO E-Connect charger at OX1 4NA in Oxford, about 76.7 miles away - at the practical limit for the car. SWARCO chargers seem to work. Then there's Westgate in Oxford, where there are several 22kW charging points, but I don't know the company. Perhaps I can check.

So, that covers the mid-way charging - literally about half way there. Then we get to Fully Charged Live at the Farnborough International Exhibition & Conference Centre. Here I expanded the criteria to 7kW chargers, and even so there seem to be only a few charging points... how is this viable, given the number of people who are likely to go there by EV?



If this turns out to be viable, we then need to charge again at Kidlington on the way back. All of it is at the boundary of practicality.

Conclusion

Underneath all of these woes is a simple cause: our government, which has utterly failed to provide regulations that ensure a working charging network. This is 'liberated' free enterprise in action: a totally dysfunctional industry, free from any sense of responsibility to its actual users.

It would not have been hard to regulate; here's what needs to be done:

  • Chargers should work as advertised: 22kW should mean exactly that, or a stated range of charging rates if the rate is uncertain.
  • Every charging point should be registered with Zap-Map (or a national body) as it's installed, along with its capabilities.
  • There should be at most one charging app covering all charging points, even from different companies; and contactless payment should be available on all motorway charging points.
  • Maintenance should be swift, again, it should be provided at a national level.
  • Support should always be available 24/7 from a single national body.
  • Charging locations should be distributed according to the need to cover the country, not just installed where the market penetration is highest.
  • Adequate 22kW charging coverage should be maintained until 80% of the cars that support it are no longer on the roads.
Let's finish with this, because it is, surprisingly, not hard to achieve. Consider the surface area of England: 130,279km². A viable charging network would need a charging point every 80km, or one per 692km² - so we'd need 188 22kW charging points to cover the country, at a cost of about £188K.

In the meantime, I'm not sure we'll be able to make it to the Fully Charged Show from Birmingham in our Zoe.



Monday 10 April 2023

Decarbonisation Sim

The IPCC AR6 Synthesis Report came out on March 20th. It doesn't say anything new; it just says it louder. Climate scientist Katharine Hayhoe has a good summary of the main points in a Twitter thread, and they make for sober reading.

One of the charts, in particular, shows how global heating will impact people increasingly severely the later they were born. For example, I was born in 1968 (323ppm CO2), so if I live to 85 or so, that will be just into the zero-carbon era - if the world can achieve the intention of the Paris Agreement - but that zero-carbon era will almost certainly be over 1.5ºC warmer than pre-industrial times.

There is increasing controversy over the way in which IPCC policy statements have been watered down in order to please politicians and the fossil-fuel industry; Kevin Anderson covers this well in an article in The Conversation. In a sense, although we're just at the point where renewable-energy deployment has reduced the rate of increase of CO2 emissions to near zero (300MT of growth in 2022 vs 1.4GT in 2021 [Refs]), we are also heading in the wrong direction: for example, COP28 being presided over by a fossil-fuel industry CEO, or new UK coal mines being approved shortly after we hosted COP26.

In addition, I've been watching TickZero's YouTube videos explaining why there's little chance we can meet net zero by 2050. The modelling there uses deployment latency to conclude that we have to reduce energy usage by about 60% between now and 2050 to retain a 50% chance of staying under 1.5ºC. @KevinClimate's recent SGR article also makes the point forcefully:
“But such a rapid deployment of existing zero carbon technologies, in itself, can no longer be sufficient. We’ve left it so late that technology will never deliver in isolation”
People have interpreted this to mean steep reductions in energy demand, though I'm currently struggling to find the relevant Kevin Anderson quote for it:
“As always @KevinClimate sees through the smoke and mirrors! We cannot achieve deep mitigation without steep reductions in demand.” https://twitter.com/kristiansn89
As someone outside academic circles, I'm not aware of the datasets and models needed to accurately determine what kind of mitigation is required, but I am interested in exploring simple models that can provide rough (but reasonable) answers to questions about the depth of mitigation and demand reduction.

The Model

My approach is fairly simple. Let's assume we start with the current global energy budget. Some (most) is provided by fossil fuels, and some (mostly electricity) is provided by renewable energy. Some is provided by nuclear energy (which I don't think I've included in my model, though I could update it fairly easily to do so). If we want to fully decarbonise between now and 2050 using renewable energy (which is far easier than using nuclear power or carbon capture), then we have to allocate some of our energy budget each year to building renewable technology. The amount we allocate directly translates into the amount of renewable energy we generate, how quickly it's deployed, and how much additional energy it provides.

Global Energy

So, first we need to know how much energy we used in 2022. You may find it fairly surprising, but a simple Google search for "Global energy used 2022" doesn't give you a straight answer: a figure in TWh. However, what I did find was the amount of electrical energy produced (about 27,000TWh [1]) and the proportion of global energy that's electric (20.4% in 2021 [2], which I interpreted to mean it could reach 21% in 2022). So this gives about 129,000TWh for global energy. I also found out that about 11% is solar heating [3].

Secondly, we need to know how much electricity is produced from renewable energy. It's currently about 29% [4], so we know that 27,000TWh × 0.29 = 7,830TWh comes from renewable sources (e.g. wind, solar, hydro).

Renewable Investments

We know that wind turbines produce an energy return on investment, given their manufacturing costs, of about 21:1 (over twice that of fossil fuels). If we assume a wind turbine lasts 25 years, then 1kWh invested in a wind turbine produces 21kWh of energy over 25 years, which is 0.84kWh per year. I assume the same is true for solar PV, and that there's an even mix of both [5]. Finally, I know that wind and solar energy are getting cheaper every year; I assume a 7% improvement per year, which translates into a 7% ROI increase every year [6][7]. It's actually been twice as good as that ([6] says the improvement is $5.66/W to $0.27/W, i.e. 16% per year, and [7] directly says 16%/year), and the 21:1 ratio is from the early 2010s, so it's significantly better now; in effect I am assuming diminishing improvements.

So, just based on this information, we can generate a model of how much energy we need to invest per year to reach 100% renewable energy (not just electricity) by 2050. That's the first value you can control in the model.

Energy Reductions

The TickZero videos and Kristiann89's tweet argue that we also need to reduce energy usage over this period. We can combine the previous model with energy reductions by taking the total reduction we expect and applying its (2050−2022) = 28th root. For example, a 60% reduction means we're left with 40% of the energy, so the annual factor is ²⁸√0.4 = 0.968: each year we have 96.8% of the energy of the year before, a fall of 3.2% per year.

Energy reductions for people are like applying austerity. If we have to use 3.2% less each year, and we can't gain 3.2% more efficiency (a certainty), then we will have to ration energy usage. The combined energy reduction felt by people is the renewable investment plus the demand reduction. If TickZero and Kristiann89 are correct, and this model is approximately right, that's 3.2 + 1.52 = 4.72% energy reduction per year - roughly a Covid-19-pandemic-sized impact every year between now and 2050.

CO2

The primary limitation on energy and renewables is the carbon budget. The remaining budget as of 2023 is taken to be 455,000 MTonnes of CO2 (or maybe CO2e, which this model doesn't consider) for a 50% chance of staying under 1.5ºC. For a 50% chance of staying under 2.0ºC, the budget is taken to be 1,375,000 MTonnes of CO2. It's possible I'm quoting budgets that allow a temperature overshoot provided global temperatures later return to 1.5ºC or 2.0ºC respectively, but this isn't crucial to the model, since the budgets can be adjusted.

We can calculate how much in emissions to expect from the global energy usage and the fossil-fuel fraction (TotalEnergy − Renewables − ZeroCarbonHeating) by knowing the CO2 produced by burning fossil fuels. This depends on the mix of fuels in general. We consider the primary components, coal (at 0.85kg of CO2e per kWh [9]) and gas (at 0.49kg of CO2e per kWh [9]), so the overall emissions are governed by the mix, which in this model is just a constant (by default 1:1 coal:gas [10]).

Simulations

It's possible to write a crude simulation from these few variables: total energy, energy reductions over time, renewables investment, and a fossil-fuel mix that determines CO2 emissions. The simulation provides a default set of values, and you can tweak them to see what happens under various conditions. Most of the information is shown as simple curves covering the years 2022 to 2050. When cumulative CO2 emissions exceed the 1.5ºC budget, the background is banded in light yellow, shading linearly to pink as emissions reach the 2.0ºC budget. It's not an accurate scale - mid-way between yellow and pink doesn't necessarily mean 1.75ºC has been breached; it's an indication of the severity of emissions.
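
For anyone who'd rather read code than prose, here's a minimal C sketch of the yearly loop as described above (my reconstruction of the model; the JavaScript in the page source is the reference version, and these defaults are just the figures quoted in this post):

#include <stdio.h>

int main(void)
{
  /* Starting point, from the figures above (energy in TWh/year). */
  double totalEnergy   = 129000.0; /* global energy, 2022           */
  double renewables    = 7830.0;   /* renewable electricity, 2022   */
  double solarHeatFrac = 0.11;     /* zero-carbon solar heating [3] */

  /* The knobs the simulation exposes. */
  double investFrac    = 0.0152;   /* energy invested per year      */
  double roiPerYear    = 0.84;     /* 21:1 over 25 years            */
  double roiImprove    = 1.07;     /* 7% better every year          */
  double declineFactor = 0.968;    /* 28th root of 0.4: -3.2%/year  */

  /* 1:1 coal/gas mix => the mean of 0.85 and 0.49 kgCO2e/kWh.      */
  double kgCO2PerKWh = (0.85 + 0.49) / 2.0;
  double budget15    = 455000.0;   /* MT CO2: 50% chance of <1.5C   */
  double budget20    = 1375000.0;  /* MT CO2: 50% chance of <2.0C   */
  double cumulative  = 0.0;
  int year;

  for (year = 2023; year <= 2050; year++) {
    /* Invest a slice of this year's energy in new renewables; in
     * this sketch the capacity pays back forever (no 25-year
     * retirement), and the ROI improves each year. */
    double invested = totalEnergy * investFrac;
    double fossil;
    renewables += invested * roiPerYear;
    roiPerYear *= roiImprove;

    /* Demand reduction applies to the whole energy budget. */
    totalEnergy *= declineFactor;
    if (renewables > totalEnergy)
      renewables = totalEnergy;

    /* Whatever isn't renewable or solar heating is fossil.
     * TWh x 1e9 kWh/TWh x kg/kWh / 1e9 kg/MT = MT, so the 1e9s
     * cancel and MT = TWh x kgCO2PerKWh. */
    fossil = totalEnergy * (1.0 - solarHeatFrac) - renewables;
    if (fossil < 0.0)
      fossil = 0.0;
    cumulative += fossil * kgCO2PerKWh;

    printf("%d: %6.0f TWh, renewables %6.0f TWh, cum. %7.0f MT%s\n",
           year, totalEnergy, renewables, cumulative,
           cumulative > budget20 ? " [>2.0C]" :
           cumulative > budget15 ? " [>1.5C]" : "");
  }
  return 0;
}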

Results

The default simulation hits +1.5ºC sometime in the late 2020s and gets to +1.8ºC by 2050. This is roughly similar to actual climate models, both in the scale of global investment involved and in when it hits +1.5ºC. It involves a decline in energy usage of 3.2% per year.

It's possible to alter the parameters so that the overall energy loss to humanity is reduced - and, correspondingly, the renewables investment must be higher. For example, if we don't reduce energy usage over time and want to hit the same maximum temperature of +1.8ºC, then we need to invest 3.55% of global energy every year, and we hit zero carbon by 2043.

It's possible to explore trade-offs that reach zero carbon before 2050 (and for developed countries there's a strong argument that it should be before), but scenarios that avoid 1.5ºC altogether require investment plus reductions exceeding 10% per year. Again, this concurs with IPCC and other climate-science models, which emphasise the difficulty of achieving this: i.e. the virtual impossibility, given the current lack of political will. As Kevin Anderson has said:

“There are no non-radical futures.” https://twitter.com/70sBachchan/status/1415023625183404036?s=20

Conclusions

I came into this model accepting the premise that we need steep declines in energy usage in order to reach zero carbon. The scenario that matches TickZero's energy decline involves a 1.52% global energy investment per year, but in effect this has a 4.7% impact on energy per year at a personal level - something close to the crash of 2008 or the Covid pandemic, every year.

I am not sure that society could deal with that kind of stress year on year for the next 27 years. Even then, the temperature rise is +1.8ºC, significantly higher than Paris and we hit it this decade.

The model also shows that for constant energy usage - remaining at today's level - and the same limit of +1.8ºC, we would need a 3.55% global energy investment per year. That's a 3.55% hit in the first year, followed by roughly 0% impact per year at a personal level, and we get to zero carbon about 7 years earlier.

It's certainly possible for society to take that kind of hit for a single year. To my mind, this is an easier and better course of action.

Why does it work that way? The simple answer is that a decline in energy usage leaves fewer resources available to invest in renewable energy. The argument for energy decline is that it will be impossible to ramp up renewable energy quickly enough, but ironically this model implies that decline actually makes decarbonisation harder to achieve. In other words, reductionism is a far less feasible solution.

You are free to take this model and improve it, since it is undoubtedly crude. The source code can be read easily by simply viewing the HTML for the page. Please bear in mind that the objective is to provide a simple model that can be readily understood; essentially, it achieves that by treating the carbon budget as a proxy for complex climate equations.

Refs

[1], [4]: https://www.weforum.org/agenda/2023/03/electricity-generation-renewables-power-iea/ (search for "29% to 35%"; nuclear growth 3.6%/year).

Possible Corrections

https://www.carbonbrief.org/guest-post-what-the-tiny-remaining-1-5c-carbon-budget-means-for-climate-policy/ says that the carbon budget for 1.5ºC in 2020 was about 500GT CO2, and that we generated 70-80GT CO2 over 2020-2021 (a 37.5GT per year average), with 40GT CO2 in 2022. So my figure is about twice that estimate, because I have 71GT CO2 for 2022. This leaves a remaining budget of 380GT CO2 from the start of 2023, about 70GT lower than mine. Or it could be as low as 260GT.


Further Reading


Decarbonisation Sim

[The interactive simulation lives here on the original page: an HTML canvas driven by four controls - Renewables Investment %, Renewables Improvement %, Energy Decline % and Coal/Gas Ratio %.]