Friday 23 November 2012

National Lottery: A Government Wealth Warning

I just met a guy buying a lottery ticket at a local mini supermarket. I tried to talk him out of it, but it didn't work: he thinks he has lucky gypsy genes in him :-s . There must be ways of putting people off buying lottery tickets. At the very least, I figure every lottery ticket should be a bit more honest and carry a government wealth warning. Something along the lines of:

"Play every week for 50,000 years and odds are: You'll win!"

Well, nearly the truth: for the warning to hold, (1-1/14million)^(52×years) has to drop below 0.5, so that losing becomes less likely than winning. Actually, I just checked, and 50,000 years isn't enough - you really need to play for around 190,000 years.
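If you want to check that figure yourself, here's a quick back-of-envelope script (a sketch assuming the classic 1-in-14-million jackpot odds and one draw a week):

import math

p = 1.0 / 14000000.0                       # chance of one ticket winning the jackpot
draws = math.log(0.5) / math.log(1.0 - p)  # draws needed before P(never won) < 0.5
print(draws / 52.0)                        # roughly 186,000 years - call it 190,000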

Another way of looking at the National Lottery is that it's a tax on the less well off. To quote from the article, 14% of the population play the lottery, but 36% of the less well-off (those earning under £20,000) do. Since those earning under £20K represent 50% of the population, that 36% accounts for pretty much every lottery player. Which gives plenty of scope for other warnings:

"Help fund the causes of richer people than you! Play the lottery Now!"

By and large that'd be true too. Simply by not playing, you come out ahead on lottery-funded government initiatives, and since I've never played it, I've benefited by roughly £14million × 0.3 × 52 × (2012-1994) / 60million ≈ £65 since its inception :-) Woohoo, this leads me to my third warning!

"Don't play the Lottery and WIN EVERY WEEK!"

I can't argue with that, but then I had a better idea. Why not get the top 25% of the population to pay for the lottery? This is how it'd work: the top 25% are taxed an extra £4 every week (no sweat, they can easily cope) and anyone who wants a lottery ticket can have a single ticket for free! The same rules then apply for winnings, depending on how many numbers you get right, and the other aspects, such as government income and charity funding, work as before. Brilliant, and all the money flows in the right direction! Best of all, it's no longer even gambling, since you're not actually risking any money to gain a return!

What do you think? I think I'll call it the Rational Lottery.

Monday 24 September 2012

An iPod mini Adventure

Remember the old black and white iPod generations? Well, yesterday I retrieved my long-lost first-generation iPod mini and it thrilled me to bits!

The iPod mini was the shortest-lived iPod and somewhat derided on introduction as being over-priced and kinda girly. It turned out to be so stunningly popular Apple had difficulty meeting demand. I ordered mine just two hours after they became available in the UK Apple Store in June 2005.

You'd think their internal 1.4" spinning hard disk would make them pretty delicate, but in fact they could take real abuse, and my iPod mini's no exception:


It's a battered little critter, mostly from being driven over for a few hours in a Stockport car park where I once accidentally dropped it. Chunky, isn't it? They made electronics that way back then ;-)

Yet two years ago I managed to leave it behind on holiday at Dove Farm cottages in Ellestone with my fiancée and some friends. I'd kept meaning to track it down since then, and after a little autumn camping trip this weekend with a couple of friends at the wonderful, good-value and basic Pudding Room campsite:

I thought I'd make a real effort to find it. So I headed off back to Dove Farm and had to knock a couple of times before a lady opened the door (they weren't expecting me, 'natch). I cautiously asked whether they remembered seeing an old iPod mini left there back in 2010.

“That’s a long time ago”, she said.

“I understand,” I said as I described it. It was a bit of a long-shot - enough of one that I'd made a personal point of giving it back to God a few minutes before.

“But if it's there, I think I know where it'll be,” and she headed off back into the house. I thought she was going to head to the dining hall where we'd actually played iPod music during the holiday, but she returned just a minute later with it, exactly as I'd left it. Rather than chucking it all out, they keep a bit of a stash of things guests have left behind, which is impressive considering how often guests must do that.

Speaks volumes about Dove Farm, thank you.

I charged it up last night and I've been listening happily to its contents this morning; the battery is still good after all these years :-D

Thursday 13 September 2012

SIA Later!

Or rather we won't, as in a few years it'll all be gone!

This is a little blog post about the current Arctic Sea Ice Area (SIA) as we near the 2012 record-breaking summer minimum.

Take a look at the image:

It's a section of the Arctic SIA as of yesterday.

We can see that the whole of the top right-hand edge is the Northern Sea Route. You can see it's open water, and there's a lot of it. It first opened in 2009, and within 3 years it's become so wide you could pretty much sail the UK straight through!

The opening of the Northern Sea Route means that ocean currents can sweep more easily round the eastern edge of the North Pole, bringing warmer waters from the Gulf Stream (not to be confused with the Jet Stream) and thus accelerating the collapse of the Arctic Sea Ice. We can see the effect quite clearly: the eastern edge continues to melt significantly, just days away from the supposed end of the melting season.

By contrast, the western edge of the Arctic Sea Ice is pinned by an extensive set of islands, as well as the all-important Greenland land mass. This is why the sea ice is clinging to that edge: the frozen land keeps it cooler and protects it from ocean currents.

The last thing to note is the colour scheme. Red means 60% ice, pink means 80% ice and purple is near 100%. This means that the white dot in the middle, which is the North Pole itself, is only 80% sea ice at best, and has a large amount of 60% (i.e. rotten) sea ice relatively close by. Given that the eastern edge of the ice is about half-way to the North Pole, I'd guess it'll be gone within 3 years.

SIA Later, in a week or so for my post on the Arctic minimum!

Friday 3 August 2012

Horses That Run...

...are the ones that aren't starved.

The BBC recently linked to an article on PowerPoint's 25th anniversary. I'd always thought it was a Windows app until now, but actually it was a Mac app about 6 years before it made it to Windows. Suddenly that made PowerPoint about 10x more interesting than I'd ever considered, so I did a bit of searching and found a book called Sweating Bullets.

The free Google Books edition leaves out quite a bit, but I was mostly interested in the history up until the first release: the kind of development difficulties Forethought had working on early Macs (they targeted the 512KB Mac), and the online book covers that quite nicely.

Actually, most of the book is about the marketing of PowerPoint: why the concept was such a hard sell, and Robert Gaskins, the project manager, deserves a lot of credit for his persistence. But the thing I found most interesting was their whole perspective on the development platform.

This is how it goes. They started work on PowerPoint in late 1984 / early 1985, after spending 18 wasted months working on a PC-based graphical predecessor that included an operating system (because graphical OSs weren't really available back then). Now, Windows 1.0 had kinda just been released, but the Mac had been around for 18 months. However, Gaskins initially discounted the Mac version and planned all the design and development for Windows. It was only when he found out that the state of Windows in 1985 was:
  1. Unusable;
  2. Years away from being useful (in fact, about 6 years, but they didn't know that then); and
  3. Not even where Microsoft's own priorities lay, since Microsoft weren't planning to release practical graphical apps for Windows until years after their Mac versions of e.g. Word and Excel,
that they decided (grudgingly) to go with the Mac version first. When they did, they found that, with the exception of having to run the development tools on its predecessor the Lisa, not only was the Mac wonderful to use even in 1985, but its design gave it ample performance and software components to complete both the first and second versions of PowerPoint with less than 1/3 of the effort it took merely to port it to Windows when that day finally came.

What this really shows is how mindshare affects business decisions. In this case, Windows had already won in the minds of software developers even though it was a complete train crash at the time - the quality and capability of an existing competitor wasn't even a consideration until the alternative was known to be infeasible. And even then, the Mac version (which saved their company) was really only developed in order to springboard their way to Windows.

It's not the way I think: I'd sooner back quality and good ideas even if they're not going to be the obvious winners - after all, any horse can win if you starve the rest, and that's not something to be proud of. So it's not surprising that Apple found it so hard to make headway for the first 20 years; still, being caught out by the Windows mindshare effect in this case was a real surprise for me!

Friday 6 July 2012

Are They Human, or Are They Bankers?

Tony Robinson raised that question on Question Time last week. Although he gets a couple of facts wrong (e.g. bailouts were billions, not millions), his stinging litany of bankers' greed leading up to 2008 and the aftermath makes for sobering listening.

However, the latest scandal involving Barclays' Libor rates appears to possibly implicate the previous Labour Government. Based on the evidence I can muster, though, I don't think that really makes sense. Here's why:

Banks make short-term (daily-ish) loans to each other to cover short-falls in each other's reserves. The largest 8 banks publish the rates they have to pay when they take a loan of this kind, and the middle 6 of those are averaged to derive what is called the Libor rate. The scandal is that Barclays fiddled their published Libor rate from late October 2008 to 2010 (or was it 2009?).
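For what it's worth, the derivation itself is just a trimmed mean. Here's a toy sketch of the scheme as described above (illustrative only - the real panel sizes and trimming rules varied):

def libor(submissions):
    # sort the published borrowing rates, discard the highest
    # and lowest, and average the remaining middle quotes
    rates = sorted(submissions)
    middle = rates[1:-1]
    return sum(middle) / len(middle)

# eight banks submit, the middle six set the rate
print(libor([4.2, 4.3, 4.3, 4.4, 4.5, 4.6, 4.8, 6.0]))

The point being: an outlying quote (like the 6.0 above) gets trimmed out of the average, but it's still published, so everyone can see whose borrowing costs are highest - which is exactly why a high submission is embarrassing.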

However, Barclays now claim that they only manipulated it because they'd heard that senior Whitehall figures were concerned about their Libor rate being in the top quartile or decile. This is how the history pans out:

Before October 8, 2008

This Guardian article gives the sequence of events, but you'll need to start at the bottom. HBOS crashes 34% on Sep 15 as Lehman Brothers employees are turfed out of Canary Wharf. Barclays seals a deal for Lehman assets on Sep 16 while HBOS shares crash to 88p. On Sep 17, Libor hits a 7-year high; Lloyds takes over HBOS for £12.2bn while Morgan Stanley crashes 30% (and turns into a non-investment bank 5 days later). Sep 25: Bradford & Bingley let go of 350 staff (the bank was later bought by Santander) and HSBC axes 500 a day later. By Sep 29, RBS are down 20%, Barclays another 9%, and Libor goes 'through the roof'. Just before October 8, Icesave goes into default, and then the UK government announces its rescue package.

Barclays Libor During The Rescue Package

The UK Government's bank rescue package provided £500bn for a number of major banks; of those, only Barclays declined to get involved, despite having shrunk pretty much as much as Lloyds.


Barclays needed a cash injection of £6.5bn, but they chose to get it privately. Now, if we switch to looking at their Libor graph over the period:
To my mind, the first oddity is that their Libor had been rising in mid-September, but it dips at the point where they decide to buy the Lehman Brothers assets, before shooting back up by about the 20th. I'm not sure that makes complete sense: the Libor rate reflects the perceived insecurity of the bank, so why would it go down when they buy the assets of a bankrupt company?


Their Libor then climbs sharply (along with other banks') through to the bailout on October 8, drops briefly, and then continues to rise to the end of October. Now, I'd be pretty sure there were civil servants in Whitehall monitoring the Libor of particular banks, because given the banks' prior financing behaviour, it would not be unexpected for them to engage in financial cover-ups. Barclays figured it needed a small fraction of what the other major banks needed (£6.5bn), but would finance it privately, so it was claiming its finances were secure. However, their rising Libor implies they weren't.


The question is: does it make sense for Labour to push Barclays into artificially lowering its Libor rate, given that it had been willing to bail out Barclays a month before (Barclays' £6.5bn would have been 1.3% of the overall £500bn bailout)? My thinking is it wouldn't make sense: it'd make more sense for Labour to push Barclays into joining the bailout.


On the other hand, would it make sense for Barclays not to want the UK government (and by that token the UK tax-payer) to part-own Barclays? Yes, that would make sense.


Would it make sense for Whitehall to be concerned about Barclays' Libor figures? Given the financial situation at the time, yes, it'd make sense.


One further thing to note: by artificially lowering Barclays' Libor rate to a negative one, doesn't that mean that Barclays would effectively be gaining money on inter-bank lending - money that had come from the UK government bailout? Note that at the time, Barclays justified its financial security by arguing that it would raise £7.3bn via its Middle East investments (Guardian, October 31, 2008), but its Libor rate had started falling 2 days before.


Friday 1 June 2012

Global Crunch Twin Pack

Most of my blogs are techie, but occasionally I stray into politics. It looks like the global economy is about to take a second bite of the same un-nutritious Global Crunch candy bar. I don't think it has to happen.

In 2008 I wrote a post about the Global Economy Crunch, in which I compared it with previous major economic collapses. My assessment at the time connected those crashes with unsustainable booms driven by market deregulation, followed by depressions deepened by protectionist (i.e. austerity) measures and only overcome by practical implementations of Keynesian economic theory. I predicted that this crash would last at least as long as the worst of the previous ones.

Since then, although we managed to avoid a meltdown through the courageous step of major bailouts for banks and global financial institutions, governments have followed the predictably damaging path of austerity, to the point where it looks to me like we're heading for another crunch, worse than 2008's. Here's why.

Austerity measures in the UK mean that we're back in recession (as Labour predicted in 2010), whereas the US is not (though their crash was worse). The Conservative government has been championing manufacturing, yet manufacturing is shrinking in the UK today (at its fastest pace for 3 years). So there's no fall-back in the public sector (which has been downsized) and no pick-up from the service sector either.

Manufacturing in Spain and Greece is shrinking; there are major issues with the funding of banks in Spain, and unemployment there is at a record 24%-plus; Italy's bond yields are virtually unsustainable; Greece is about to vote on whether to exit the Euro (it'll be awful for them either way); and Ireland's just voted for a heavier austerity package.

More disturbing, though, is what's happening in East Asia, where everywhere you look the economy is slowing down. China's growth is down and even internal growth is slowing. India's growth is similarly slowing (an article from just an hour before I linked it asks whether India's growth is over) and its manufacturing sector recently even shrank. South Korea is suffering a slow-down. In my opinion this is all relevant, because a major reason we survived the original Crunch in 2008 was that the Far East was doing so well it could basically bail out the West - doing so well, in fact, that it was understood they'd carry on pretty well even if the West went under.

The other concerning aspect is the way right-wing think tanks continually argue for business deregulation: eliminating workers' rights through, for example, no-fault dismissals (which are supported by the PM) and more 'flexible' working. Similarly in Europe, right-wing economists are arguing for more austerity.

I don't believe that austerity works at a macro-economic level, though it can work at a personal level. The reason is that, at a personal level, if you're frugal everyone else can cope with you spending less - they only lose a fraction of a per cent of their income; but if everyone suddenly cuts back to make ends meet, the result is a self-inflicted vicious circle of deprivation. For example, tying the Greek bail-out to austerity deprives Greece of the very engine (i.e. the workforce) it needs to get out of its debt crisis. In that sense it'd be better to tie a mandatory European trade boost with Greece to its internal cut-backs. That would be analogous to personal frugality: the Greeks making do with a lower average standard of living, but gaining full employment (by virtue of the trade agreement) and thereby the means to overcome their debts. Europe would be taking up the economic strain, but we're in a better position to do so.

On the other hand, I think there's a better way: cooperative economics. Rather than penalising the very people who lost the most in the initial Crunch, we'd be better off Enfranchising The Workforce: making it ludicrously easy for people to form cooperative microenterprises, and for working people to have a greater stake in their companies in lieu of the pay-rises they're not going to get for the foreseeable future. The thing is, we already have the resources with us; we don't need to Crunch twice on the same disastrous candy bar.

Tuesday 15 May 2012

More Xubuntu-based Emulation: xz81

My experience of getting a ZX Spectrum emulator to run on my iBook 600 led me to wonder how easy it would be to get a ZX81 emulator running. The tricky part was finding a suitable X11-based emulator. There wasn't an obvious port like FUSE or Xspectemu, but after a while I found one: the z81 emulator by Russell Marks.

This emulator can be compiled for svgalib or plain X11, which is the version I chose. It's easy to compile: you just unpack it and type:

make xz81
sudo make install xz81

You also need to download a ZX81 ROM - I used the Shoulders of Giants ROM.

xz81 was only about 300Kb unpacked, and 100Kb of that was a directory of ZX81 games. It compiled in about 1 minute on my iBook. What I'm most impressed by is that not only did it not require an involved ./configure step, but the compile was quick and it worked, despite the code last being updated about 7 years ago. xz81 is really quite nice and quirky:

Quirky, mostly because the keyboard help screen is rendered as a ZX81 bitmapped image and the file-selector option is managed as a ZX81 program (though it's really written in C as part of the emulator).

xz81 is pretty crude though: it needs to be recompiled to run at a different scale (the normal scale is 2x), and although it supports some typical ZX81 hardware (like the printer, by producing .pbm files), it doesn't support proper high-resolution ZX81 graphics, so some programs don't work so well.

Since I'd had a comment from a previous user about a ZX81 Forth ROM, I thought I'd try it; but the h4th.rom is quite hard to track down, and in the process I found an alternative: a recently written (2011) ZX81 version of Forth called Toddy Forth.

It's also quite nice and simple, though as standard it doesn't include ZX81 features such as INKEY, PLOT or AT. It's based on Camel Forth, and since it's a DTC (direct-threaded code) Forth it's actually pretty fast. I benchmarked it using the same benchmarks I used for FIGnition (times in seconds):

Benchmark   Jupiter-Ace (fast mode)   Toddy Forth (slow mode)   FIGnition   Ratio (Ace:FIGnition)   FIGnition KIPS
BM1         0.16                      0.21                      0.02        8                       50
BM2         0.54                      0.54                      0.088       6.14                    56.82
BM3         7.66                      7.87                      0.41        18.5                    36.23
BM4         6.46                      7.94                      0.47        13.69                   31.78
BM5         6.52                      8.22                      0.51        12.78                   33.33
BM6         7.38                      9.78                      0.64        11.53                   39.06
BM7         12.98                     15.12                     1.27        10.22                   25.98
BM3L        1.0                       1.58                      0.05        20                      200

Mean speed ratio (ZX81:Ace): 0.82
Mean ratio (Ace:FIGnition): 12.61 (11.55 w/o BM3L)

Although Toddy Forth in slow mode is merely 82% of the speed of the Jupiter Ace, we must remember that in ZX81 slow mode the computer spends about 75% of its time just generating the display: in fast mode the ZX81 should be about 3 or 4 times faster (0.82 × 4 ≈ 3.3× the Ace), easily beating the Jupiter Ace.

Toddy Forth's download includes some extra definitions in a library. I added its plot command so that I could time plot operations.

Benchmark          Jupiter-Ace (fast mode)   Toddy Forth (slow mode)   ZX81:Ace Ratio   FIGnition            Ratio (Ace:FIGnition)   FIGnition KIPS
Plot Full Screen   3.02 (2816 pixels)        4.66 (3072 pixels)        0.65             0.10 (2400 pixels)   29.61                   94.1

Here, Toddy Forth is significantly slower, though not outrageously so.

Here are the benchmarks themselves (in Toddy Forth):

: bm1
  cr ." S"
  1000 0 do loop
  ." E" ;

: bm2
  cr ." S"
  0 begin
    1+ dup 999 >
  until
  ." E" drop ;



: bm3
  cr ." S"
  0 begin
    1+ dup dup / over
    * over + over -
    drop dup 999 >
  until ." E"
  drop ;

: bm4
  cr ." S"
  0 begin
    1+ dup 2 / 3
    * 4 + 5 - drop dup
    999 > until
  ." E" drop ;

: bm5sub ;
: bm5
  cr ." S" 0 begin
    1+ dup 2 / 3
    * 4 + 5 - drop bm5sub
  dup 999 > until
  ." E" drop ;


: bm6
  cr ." S" 0 begin
    1+ dup 2 / 3 * 4 + 5 -
    drop bm5sub
    5 0 do loop
  dup 999 > until
  ." E" drop ;

: array
  create 2 * allot
  does> over + + ;

5 array m

: bm7
  cr ." S" 0 begin
   1+ dup 2 / 3 * 4 + 5 -
   drop bm5sub
   5 0 do dup i m ! loop
  dup 999 > until
  ." E" drop ;
: bm3l
  0 10000 0 do
    i + minus i and i or i xor
  loop drop ;


: bm1g
  48 0 do i
    50 0 do
      i over 3 plot
    loop drop
  loop ;



16436 constant clock  ( the ZX81's FRAMES system variable )

: time-bm
  find
  clock @ swap execute
  clock @ swap - . ;
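time-bm reads the frame counter either side of executing the named word, so the figure printed is in TV frames - 50 per second on a UK ZX81 (and since the ZX81's FRAMES variable counts down, the result may come out negated). Presumably you invoke it like this (my example, not part of the original listing):

time-bm bm1   ( run bm1 and print the elapsed frame count )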



Tuesday 8 May 2012

Should Have Gone To Xspect Savers!

I'm an old-time ZX Spectrum owner from 1982, and since it was the machine's 30th birthday just last week, I figured it's mandatory to run an emulator on every Mac of mine, including my cranky, old, 600MHz Xubuntu-running G3 iBook. I had no idea how much effort would be involved!

For lots of more modern systems it's pretty easy to install FUSE, the Free Unix Spectrum Emulator; but it's not readily available as a package for my iBook, so I had to compile it myself. When I did (first against the SDL library and then just for plain X), I found it would crash with an 'illegal instruction' message before doing anything.

My overall impression of compiling FUSE, though, was of the immensity of the work involved and the size of the code: the download expands to 10.3Mb, and then I needed SDL at a whopping 29.8Mb - you can't get much more simple than that, eh!?

And of course, in the end it didn't work.

However, I managed to find an alternative: Spectemu-X11, which, wonderfully, I could download as a package for my machine. Unfortunately, that didn't quite work either: on my iBook I couldn't generate a Symbol Shift, because the keyboard didn't produce the Right-Shift key-code. So I downloaded the source (which, expanded and compiled, uses 1.7Mb - almost small!) plus the diff for the 0.94a-9 version, and then had to learn how to apply a diff to a folder: place the diff file in the root directory of the package (in my case just inside my spectemu-0.94 folder) and then type:

patch -p1 < theNameOfThePatchFile.diff

And it did it!

Then I compiled SpectEmu, which built duly enough but left me with the same problem: no Symbol Shift. So I did some tracking down of why that was the case. It's entirely down to the key codes defined in spkey_p.h. The definition for Symbol Shift is derived from SK_Shift_R, which is:

#define SK_Shift_R        0xFFE2    /* Right shift */

With a bit of debugging I found out that the command key on my Mac generated 0xFFEB (the X keysym for Super_L), so all I had to do was replace the above line with:

#define SK_Shift_R        0xFFEB    /* Right shift */

touch all the files that #include spkey_p.h, and then make it again. Easy, and now I have a Spectrum emulator on my lowly (but lovely) iBook!

Next, back to working on my own project, the FIGnition DIY 8-bit computer!

Tuesday 1 May 2012

The FIGnition/Raspberry PI Challenge!

Hi folks,

I managed to meet up with a fellow embedded-techie friend at my old church last Sunday, and he mentioned he’d been chatting with some friends about whether my FIGnition project stands much of a chance against the Raspberry PI, so I thought I’d blog my response.

Although I’m persuaded that there should be room in the market for both of us, it’s of course perfectly true that the publicity around the Raspberry PI presents a major challenge, as both it and FIGnition are being sold as machines to encourage children to program. However, my conclusion is that kids don’t program today primarily because modern computers (like the Raspberry PI) are 10,000 times more complex than those of the 80s. I blogged about this more fully a while ago.

FIGnition is as simple as an old 80s computer, but suffers from some practical disincentives. The keypad is unconventional, requires learning and is slow (though better than texting); getting code in and out of FIGnition is still awkward; and Forth isn’t an ideal language for beginners. I don’t plan to change the keypad, though I expect that at some point FIGnition owners will have created a PS/2 shield for it. I want the keypad as it is because I don’t want to be dependent on an artificial protocol for reading keys; that is, I’d probably have used a switch-matrix membrane if I could have sourced one cheaply, but my primary goal was to have a machine that was as self-contained as possible. I’m working on the uploading/downloading issue by enabling audio saving and loading, which is still being debugged. I’m not absolutely committed to Forth, though I personally enjoy the language; what I am committed to is user-code performance comparable with the 8-bitters (because otherwise I can’t say that FIGnition is a substitute for an 80s 8-bit computer), and that, given my original development schedule, meant Forth was the natural choice. Nevertheless I’m working on making it easier to use, starting with a new editor once the audio I/O is ready.

At the heart of it, though, I think FIGnition is the right kind of machine, because it can be understood, whereas the Raspberry PI can’t be understood by a single person. This is why I think FIGnition is worthwhile. Because the Raspberry PI is a full machine, its very complexity leads to the same mentality and solutions as for other modern machines. You can either give it a dumbed-down language and programming environment, which teaches you less about programming than a ZX81; or you can give it an emulated 80s computer (which is to admit the host computer isn’t the right tool for the job); or you can program it with modern tools and put up with the same complexity that turned kids off programming in the first place.

Let’s extrapolate what’s going to happen with the Raspberry PI. The Raspberry PI is a full Linux system on which you can recompile thousands of programs, tools and games, so I strongly suspect kids will find it easier just to run pre-written programs and games than to do actual programming (note that MAME was ported to the PI before its general availability). Kids will simply stuff their own SD cards into their PIs and do what’s easiest. It’s an unavoidable effect of the complexity of the system, which makes it overwhelmingly easier to go with that flow.

Consider what happens when a child makes an effort to resist. Let’s say they learn Python and want to write a game, e.g. Snake or Blitz. Python’s fine for scripting text-processing stuff, but even doing something as simple as the equivalent of a Spectrum’s PRINT AT y,x; (or FIGnition’s x y at) requires substantial effort (see the sketch below), and at the end you don’t even get user-defined graphics, just plain characters. Scratch can’t really handle that complexity, so they’re forced into developing a Java or JavaScript game (with all the overheads) or developing on (e.g.) a BBC Micro emulator. At this point the hurdles and disappointment will make it far more inviting to just start downloading and running other people’s stuff.
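To illustrate, here’s roughly what that PRINT AT one-liner costs you in Python, using the standard curses module (one plausible route; none of the others are simpler):

import curses

def main(screen):
    # PRINT AT 10,5;"Hello" - one statement on a Spectrum,
    # but here it needs a whole curses session around it
    screen.addstr(10, 5, "Hello")  # row, column, text
    screen.refresh()
    screen.getch()                 # wait for a keypress

curses.wrapper(main)

And that still only gives you plain characters in a terminal - no user-defined graphics.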

Let’s say educators or individual adults see the problem with the above, so they develop an ‘easy’ programming environment as an alternative. The problem is that, because of the underlying complexity of the system, developing this ‘easy’ environment becomes far more than a one-person job; so it’s either left unfinished, or they take short-cuts and expose the underlying complexity for the ‘real’ stuff, or they open-source it and the several(?) collaborators create a bloated composite of their partially overlapping variants of the idea.

Or let’s say they want to by-pass the Linux OS and re-write a classic BASIC (or similar) kernel for the Raspberry PI. What’ll happen is that the effort of developing a kernel from scratch means they’ll get outmanoeuvred by programmers who take short-cuts and bolt something that looks like the classic environment onto the normal kernel. Then people will get so frustrated by the way this dedicated Raspberry PI programming kernel doesn’t interface with all the other full-desktop goodies (editors/scripts/games/drivers/protocols etc.) that they’ll add a full Linux bootup in the background, and then basically we’re back to square one.

A typical example is the BBC Micro 2.0 project. It looks like they’ve employed a number of 21st-century programmers to create a ‘new’ programming environment. The idea is to have the ‘simplicity’ of BBC Micro programming, but hosted on a full Linux/Mac/Windows system with bindings for multiple languages and programming paradigms. So already it’s just the same multi-pane, monstrous-class-library, over-complex nonsense that put kids off programming in the first place, because the programmers and designers are immersed in modern environments and aren’t asking the key question about why kids don’t code today: that modern systems are 10,000 times more complex and can’t be genuinely understood. Their failure to see that no-one understands modern systems forces them to replicate the problem on the new BBC Micro 2.0 platform.

But once you accept that the issue is the actual, literal complexity of modern systems, then the way to get kids to code becomes a case of constructing real, but genuinely simple, computers. Computers like FIGnition. You can make use of genuinely new concepts, and it’s no longer a problem if these computers don’t do everything, as long as they meet the criteria of being simple, complete, immediate and understandable. FIGnition isn’t perfect, but it gets the primary thing right: you can build it, code it and understand it. It is to complex computers what Phonics and The Gruffalo are to War and Peace, and no-one would ever think of teaching kids to read and write using the latter.

In a future blog, I’ll argue why the issue isn’t cost, nor primarily ICT teaching, nor the inevitability of progress. These kinds of arguments are smokescreens for the only issue: computers are 10,000 times more complex than computers that can be learned and understood. I’ll finish with one comment about price: both FIGnition and the R-PI are so cheap you could buy more than 10 of either for the adjusted cost of a ZX Spectrum in 1982, and millions of recession-hit, cash-strapped parents bought Spectrums and their ilk then. If journalists, parents and educators can’t afford or encourage a market big enough for both, then our culture and economic prospects must be so messed up that the challenges I face in promoting FIGnition are the least of my problems :-)

Wednesday 21 March 2012

The New FIGnition ROM

I've been working on a new version of FIGnition's Forth ROM and I'd like to share some of the details with you.

History

There are really two main sections to FIGnition's firmware (three if you count the bootloader). The initial 10 to 11Kb or so is the code that interfaces to the hardware: the video, the SRAM, the Flash and the keypad; it's written in a mixture of C and assembler. The inner Forth interpreter is also here; it's about 1.5Kb long and is written in assembler.

The rest of the firmware is the Forth language itself, written in Forth byte codes. My original version of this was taken primarily from the Forth ROM of Andrew Holmes' Mark 1 Forth computer, because it was a Forth ROM written in Forth; and that ROM was in turn taken from the 6502 assembler version of FIG-Forth, which is in the public domain.

We can see how this history works out using a few code snippets. The 6502 version implements most of its entire word set in 6502 code, so a word such as >R (which moves the top of the data stack to the return stack) is written in assembler:

L563 .BYTE $82,'>',$D2
.WORD L548 ; link to LEAVE
TOR .WORD *+2
LDA 1,X ; move high byte
PHA
LDA 0,X ; then low byte
PHA ; to return stack
INX
INX ; popping off data stack
JMP NEXT

The Mark 1 Forth computer is a true virtual machine, so Andrew had to rewrite the ROM using his core set of 32 words, and since > isn't one of them, he defines it as follows:

$Colon GreaterThan, '>'
DW Swap, LessThan, Exit

< in turn is:

$Colon LessThan, '<'
DW Subtract, ZeroLt, Exit

Subtract isn't even a core word, it's defined as:

$Colon Subtract, '-'
DW Minus, Plus, Exit;

So here we see a number of key changes: he's factored the code quite a lot; he's used the macro facilities of the MASM assembler to make headers easier to define; and finally he's converted it all to compiled Forth expressed as words.

In FIGnition Forth's ROM it looked like this:

_Colon GreaterThan, ">",1
.byte kFigSwap
.word LessThan
.byte kFigExit

There are some minor changes. The macro definitions and word directive use as-avr syntax macros. I couldn't get the macros to calculate the length of the text properly, so it's an extra parameter. In addition, in FIGnition Forth I can mix byte codes and words.

FIGnition Forth isn't a true virtual machine, since the Forth ROM and the main firmware share the same code and address spaces. This means, for example, that I can access the data stack pointer simply by declaring a constant which is the address of the register used for the stack pointer, and I can then access it directly from Forth - I don't need a primitive definition to do that:

_Const GSP, "sp", 2,gDP

means that GSP is the label used for sp and gDP is the actual address (which is an AVR register pair). In addition, I recently added the primitive kFigNative, which allows me to intersperse Forth and AVR assembler. key, for example, is now:

_Colon FigKey, "key", 3
.byte kFigNative
.align 1
call Key
rjmp RomPushByteRet

FIGnition Forth Today

The Mark 1 Forth ROM, at around 4Kb, is much smaller than the 6502 version (at around 8Kb), because compiled Forth code is more compact than native 6502 code. In turn, FIGnition's Forth ROM is more compact still, because it uses byte codes: the same code becomes about 3.5Kb.

With FIGnition Forth I made a number of changes to the set of primitives, so that it now supports shifts, internal RAM access, native cmove and fill, and flash block firmware access, amongst others. In addition I had to add some basic missing words such as vlist, forget, list (for blocks) and of course edit. Finally I had to add definitions such as x, so that both byte codes and execution addresses could be compiled in correctly.

FIGnition Forth Tomorrow

I've made a lot of comments about the future of FIGnition Forth on the FIGnition Google group and perhaps in previous blogs, but progress is slow. The primary change is that I want FIGnition Forth to support a nice editor, written in Forth, to replace the primitive screen editor we have at the moment. The major impact of this is that I don't want to write the entire editor using .byte and .word directives, because that's just too error-prone; instead I want to write it in Forth. But I also don't want to write it in Forth on FIGnition itself because, although the editor is only going to be 1 to 1.5Kb, I want it to integrate with a different set of lower-level string functions and compiler words, which means re-writing other parts of the FIGnition Forth ROM - again, fairly error-prone. Remember, the Mark 1 Forth ROM is basically a translation of existing (working) code, whereas this is new development.

So what I've decided to do is re-write the entire FIGnition Forth ROM in proper Forth, and modify a rudimentary Forth compiler called NFC, which I wrote in C around 3 or 4 years ago, to generate the ROM.

This means that > for example is now defined as:

: > swap < ;


Simple!
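And presumably the rest of the chain factors exactly as the Mark 1 ROM above does - something like this (my sketch, not the actual FigRom.fth source):

: - minus + ;
: < - 0< ;
: > swap < ;

Each definition is one line of readable Forth instead of a handful of .byte and .word directives.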

Current Progress

So far I've re-written the entire ROM as FigRom.fth and it's much smaller, at around 1,000 lines of code instead of 2,300. Now I'm working on the Forth ROM Generator (FRG), which is mostly complete but untested. This is the main reason why the most recent ROM update (which enables bitmapped graphics and sound to work together) was done by hacking the .hex files rather than by modifying the actual firmware.

The first stage is to get FRG to generate the same ROM as I had before, before I move on to improving it; I'm still a little way from seeing that happen.

I'll blog the rest of the progress as I go along!

Monday 9 January 2012

Lucid Lynx on an iBook G3

Way back in September '09 I blogged about my xubuntu 9.04 install (twice), and finally, in January 2012, I decided to upgrade to xubuntu 10.04, Lucid Lynx... a whole year later! As usual there were problems with the display, but now they're sorted I have an even better machine than before, so I thought I'd use it to write this blog.

Every new version of xubuntu seems to improve upon the old one; despite the sluggishness of new software, the O/S somehow seems to keep up and delivers a nicer experience, as well as a few hurdles to get over.

So how is this one different? My first step this time was to back up my old home folder to some CDs, because I wanted to triple-boot the iBook into Mac OS X, Mac OS 9 or Linux. I wanted Mac OS 9 because I have an old Dazzle USB composite video grabber which doesn't work with Mac OS X AFAIK, and I wanted to grab better-quality videos and images of my FIGnition DIY 8-bit computer in action!

But of course, Linux is essential for running anything new, e.g. this version of Firefox I'm writing the blog on.

The first hurdle was the installation. I found a guide on LowEndMac which recommended running the alternate CD installer. Either way, I found the disk image too big to write to a CD, even using the command-line hdiutil, so in the end I wrote it to a DVD using my iMac G5 and then booted it on my combo-drive iBook. Installing via the text screens wasn't a problem; in fact it's better, because the response time is so much quicker. I recommend it from here on!

As usual, the second hurdle was the video display mode. For some reason no xubuntu setup I've ever tried has initially worked in the correct video mode, and this is no exception. What was worse this time is that after 9.10, ubuntu stopped using xorg.conf and instead auto-detects the modes, so I couldn't simply edit the file. Fortunately, a number of links tell you what to do.

Basically it involved booting into text mode from Yaboot:

Linux init=/usr/bin/bash rw

creating the xorg.conf:

sudo Xorg -configure

and copying xorg.conf.new to /etc/X11/xorg.conf. I didn't need to add all the special device and monitor sections as before, because the auto-generated file actually had the correct primary setting listed as I wanted (except it chose 24-bit colour instead of 16-bit).
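For reference, the whole dance condenses to something like this (a sketch - Xorg -configure prints the path where it wrote xorg.conf.new, so adjust the copy accordingly):

sudo Xorg -configure
sudo cp xorg.conf.new /etc/X11/xorg.conf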

So now I have a new lease of life for my iBook - yet AGAIN! Firefox looks modern; I've restored all my data; the eject button works for CDs (in 9.04 I needed a command-line utility); and of course it includes lots of newer features. One of the best things is that the trackpad is no longer over-sensitive: the cursor doesn't skit around the screen and set the typing position to a random location merely because my fingers are close to it! Lovely, lovely - these people are doing a great job!

I'm happy! Hope this is of some help :-)