Let's code some Atari ST "C"

Before we venture back to writing C code for a 1985 Atari ST, let's go for a little wander through what led to wanting to write C code for an Atari ST in the first place, and why we should bother.  Surely, we should just learn and write some code in Python, grow a hipster beard and drink a chai latte.....

.
{}{}{}{}{}{}{wiggly lines portraying going back in time (Wayne's World style)}{}{}{}{}{}{}

.
.
.
.
.
So........"C".  That wonderful language that was created around the year of my birth.  Back when flares and beards were trendy.

Here are the inventors of the language, with the required beards:
And no, they do not do a cooking show or host a radio show nowadays :-)


Right, well, if you're reading this, you know what C is.  It's a programming language that sits one level up from machine code; it allows us humans to type something that isn't Assembler or lower into a computer.
If I recall right, I was first introduced to C coding back in 1990/1991-ish, doing an Industrial Computer course at Anglia University.  We wrote C code to communicate between 2 PCs (2/3/486-spec machines!), plus a program to monitor a toilet cistern level, output basic graphics to the screen and activate an overflow valve if the water level rose too high.  Then there was a final project of writing an ECG monitor that allowed connecting various sensors to a human, converting the signals and rendering them as an output to the screen.  I can still see the oscilloscope-style green checked circle with the luminous animated green heartbeat trace.  I probably still have the code sitting on a 3.5" floppy disc, somewhere..... :-)
There were probably more programs that were written, but those are the ones I remember.  Boy, was that a long time ago - 25+ years ago.

Well, the beautiful thing about C is..... that length of time means nothing, really.  There has been a vast array of higher-level languages produced that sit above the C programming level; I know I've used quite a lot of them and paid my monthly mortgage payments by developing with them.  Who remembers Delphi?  Ah... now that was a language I loved.  Mixed with an Oracle database, it was great and kept me busy for a good 4-5 years.

Having spent the past decade working with a whole variety of products, tools / languages, mainly based around the basic Web technologies: HTML / JavaScript / CSS and usually some form of server-side processing, I'd drifted away from being "that close to the hardware" I was working with.  I was mainly working inside the container of a web-browser.  Again, monthly mortgage payments successfully paid [check].

When not doing the day job, I became interested in the Arduino, which is a little open-source embedded hardware device that you can program with C code to flash LEDs on and off (OKAY, you can do a lot more than Blink LED 13, I know!).  I played around with a whole variety of Arduino-based devices, making all sorts of things, such as speech recognition devices that would open/close relays to turn lights on & off - with a plan to fit one inside my custom car.
I even built little robots to do various things.  It was great to have a little play around with and to write some minimal C code on.

In the middle somewhere, I got distracted by the BeagleBone device.  It was a bit like the Arduino on steroids.  I even remember spending most of a night recompiling the Linux OS so that I could get a custom 7" LCD screen to work on it....  anyway, that resides in the back of a cupboard nowadays, as then along came the Raspberry Pi with all of its AWESOMEness!

Wait, hang on - "don't all the hipster people write Python on the Pi?"... well, probably, some no doubt do, it's a simple enough language to pick up and use.  I've heard it referred to as a modern day BASIC (with Object Orientation thrown in).  Not a bad analogy, as they are both Interpreted languages and both are pretty easy to pick up and you can just start coding from a flashing command-line prompt.  Something that we used to love doing back in the 1980s.... because, well, because that is all we had when we turned the "Home Computer" on.  It booted in seconds to a flashing prompt.  You could then choose to start writing some code or load & play a game (well, in 15-20 minutes after it had loaded from tape or in 15-30 seconds if you were rich & had a floppy disc drive!)

Well, back to C....yes, you can write C code for the Raspberry Pi, there is even a nice Udemy course on it, if you wish to take a guided tour or follow these instructions on setting it up.

So, whilst technology has moved on fantastically over the past 25+ years, as you can see, even the newest technology still supports being programmed in C.  C is not going anywhere anytime soon.  It's at the heart of so much server-side technology and embedded technology, and is probably running on chips inside your washing machine and inside your car.

Which also brings me to the work I was doing during mid/late 2016: I was working with a lot of the new Internet of Things devices and projects.  Whilst it could be argued that I was kind of already playing with that sort of thing 25+ years ago, it wasn't really using "the internet", just a network - okay, it could be argued that it was the grandfather of what we are using today.  Anyway, whilst playing around setting these new IoT things up for Smarter Buildings and Cars, it occurred to me that when it came to it, you still had to drop back to the common denominator to work with them: C.

Now, I could have just fired up Xcode on my Mac and written some funky C code (I confess, I actually did), but it'd be a bit boring.  I don't know, since the PC arrived in the early 90s, everything kind of went a bit "beige" and the "fun" of a "Home Computer" went away.  Maybe it's nostalgia, maybe it's old age?  I don't know, maybe it's because I remember writing C code on the early PCs?  Instead, I spent far too much of my personal time sourcing and setting up a C compiler to run on an old 1985 Atari 1040 STF.
Now, I'm not planning on writing the next Xenon 2 or Bubble Bobble:


But, who knows, I'll make small steps and see how I get on.  There's no harm.  Coding in C will not be a waste of time, as I'll be brushing up on skills that I can re-use in the modern-day world.  Who knows, it might actually end up being beneficial in paying off more of those mortgage payments in the future.

..and yes, I confess today I am sporting a rather fetching beard!  I'm not wearing sandals or drinking warm ale though, so I'm not a total cliché of the 1970s C programmer :-D
.
.
.
.
.
{}{}{}{}{}{}{wiggly lines portraying going back in time (Wayne's World style)}{}{}{}{}{}{}
.

Right!  I set about wanting to write code for the Atari ST, but as explained at the bottom of this article, I also set up a Raspberry Pi 3 to run an Atari ST emulator.  This got me thinking... why don't I just write the code, then compile, link and make it in the emulator, and when I'm happy with the final result either copy the .PRG file to the real machine (via the nifty FTP setup, also explained in that previous article) or copy the C source code over and build it on the real machine?

What I did find was that the Hatari emulator was good, but it didn't quite have ALL of the keyboard mappings that were needed.  For some reason I could not type the {} [] or _ - characters via the PC keyboard plugged into the Raspberry Pi.  Hmmm.... those characters are kind of important when writing C code.

This got me to thinking, I wonder......

So, I use Atom on the Mac to write the C source code.

When I've finished, I use FileZilla (I could have used any old FTP mechanism, but I just happened to have this available) to send the .C file over SSH to the RetroPie.
You'll notice that I have a Hard Drive folder set up; this is for the Hatari emulator and, as I found out, the Megamax C compiler only works if all the files are at the top level, so that is where I drop the C file:

A quick check on the Atari ST Emulator screen and there we see the BITMAP.C file:

Now, the Megamax C compiler is a little bit odd by today's standards.  It has a separate EDITOR.PRG program that you can run to type the C code into and save the file (we've already done this bit), but when you want to Compile/Link and make the .PRG you need to run the SHELL.PRG program:
As I've already set this up, I can go straight to the "Execute | Compile" command; under the "Locate" menu you set up the locations of the Compiler/Linker/Editor/Disassembler and Librarian programs.
You then select the BITMAP.C file itself:
 and the Megamax C compiler does some compiling:
Ah! It found an error and will not compile the .C program.  Now, we could just open up the file using the EDITOR.PRG program, modify it and repeat the exercise, OR we could go back to the original source on the Mac.
As it nicely told us, the error is on line 101...so let's track it down:
Ah, there we go, fat-fingers.  I put an "=" instead of a "-" in the code.  All fixed and re-saved.

Time to use FileZilla again to overwrite the BITMAP.C on the Emulator:

BTW - just to show the EDITOR.PRG program that runs on the Atari ST: as you can see, the BITMAP.C file is shown and can be manually edited - although the lack of line numbers makes it a bit tricky, the fact that ERRORS.OUT shows the actual line of code makes it a little easier to find the right place.  I just thought I'd show that you could still do all of this directly on the Atari if needed.

Then we re-compile.  If all is good, this screen is shown and then, whoosh! it just disappears - so no news is good news!

We then select "Execute | Linker".  Now, this let's us pick the BITMAP.O object file that was created from the compiled BITMAP.C file.  We specify the name of the executable to create: BITMAP.PRG and we must remember to press the [>> ADD >>] button to get the BITMAP.O file over to the right hand side:

Then we get to see the following output.  This is pulling in the references from the #include parts of the BITMAP.C code, and that final section tells us some important information - it's so important that it doesn't stay on the screen; just like the Compiler program, you blink and, whoosh! it's gone.  Well, the good news is: no news is good news!
Then we Quit the SHELL.PRG program and return to the Desktop, where we can now see the newly created BITMAP.PRG application program.
Time to double-click on the program to start it:
Well, that was exciting wasn't it :-)

Actually, it's a test piece of code showing how to write values to a memory location that is then reflected as the output on the screen.  We can build up in code what we want to appear in a bitmap area of memory and then choose to display it, like so:
 Then we can flip back:
 Change the contents of the memory and then flip back:
 and go back and forth doing this:
 until we get tired or we have written the code to do something "useful" after this key press:

Yay, we actually wrote code that output some blocks / stairs(?) into memory and then flipped the screen to render this new bitmap onto the display.
It may sound a little dumb, but it's actually how most of the animation, graphics and games worked on these machines.  With these basic concepts, I can now go forth and create a masterpiece (or not).
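To give a flavour of what that flip looks like in code, here's a minimal sketch of the idea (this is not the actual BITMAP.C, which I haven't reproduced here).  It assumes an osbind.h header wrapping the TOS calls - Physbase() to read the current screen address, Setscreen() to point the hardware at a different 32000-byte block, and Cnecin() to wait for a key press - which most ST C compilers provide in some form.  It's also written in slightly more modern C than Megamax's K&R-era dialect, so the declarations would need adjusting before Megamax would accept it:

/* Minimal screen-flip sketch for the Atari ST (illustrative, untested). */
#include <osbind.h>    /* Physbase(), Setscreen(), Cnecin() wrappers      */
#include <stdlib.h>    /* malloc()                                        */

#define SCREEN_BYTES 32000L    /* one ST screen is 32000 bytes             */

int main(void)
{
    char *physical;    /* the screen the hardware is showing right now     */
    char *buffer;      /* our off-screen bitmap                            */
    long i;

    physical = (char *) Physbase();

    /* The ST wants a screen base on a 256-byte boundary, so grab a
       little extra memory and round the pointer up. */
    buffer = (char *) malloc(SCREEN_BYTES + 256L);
    if (buffer == NULL)
        return 1;
    buffer = (char *) (((long) buffer + 255L) & ~255L);

    /* "Draw" by poking bytes straight into the buffer: whatever ends up
       here becomes pixels once the hardware points at it.  In low res a
       line is 160 bytes, so this clears the screen and then paints a
       thin strip down the left-hand edge. */
    for (i = 0; i < SCREEN_BYTES; i++)
        buffer[i] = 0;
    for (i = 0; i < SCREEN_BYTES; i += 160L)
        buffer[i] = 0xFF;

    Cnecin();                                /* wait for a key press...    */
    Setscreen((void *) -1L, buffer, -1);     /* ...then flip to our bitmap */

    Cnecin();                                /* another key press...       */
    Setscreen((void *) -1L, physical, -1);   /* ...and flip back again     */

    return 0;
}

The real BITMAP.C no doubt does rather more than this, but the shape is the same: build the image in memory, then tell the hardware to look at it.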

The real purpose here though was to demonstrate that I could actually write the C code on a Mac, ping it across to a Raspberry Pi and then Compile/Link/Build a real .PRG program and execute it on an Atari ST.

Actually, thinking about it, that's pretty awesome :-)  I surprise myself at just how geeky I can be sometimes.


Right, time to do something else non-geeky; the missus is complaining that I need to do "housie" stuff.  After all, isn't that why we do this geeky stuff, to pay all those monthly mortgage payments?  And isn't that why those payments take 25+ years... just long enough for us to do all the geeky things, and then when we stop paying them, well, we stop doing geeky stuff.... probably because we'll be retired / old & senile by then.  But, and I can assure you of this, someone, somewhere will STILL be coding in C as a day job  :-D




