
DOS tools risc gcc discussion

Posted: Sat May 17, 2014 12:00 am
by a31chris
Because of what ex-High Voltage Software employees recently told me about the compiler setup they used on the Jaguar, I posted the 'paging system' question to a linker expert (Frank Wille, author of vbcc's vasm assembler and vlink linker), asking what it would take to implement the RISC C setup HVS described. This was asked before I found the rest of the article, which states HVS did indeed know how to run GPU code out of main RAM successfully. I don't know how much that will change the landscape of the discussion, but here it is, along with the initial question. So I'm assuming that he assumed, as I did at the time, that the compiler setup did NOT execute code out of main RAM at all. --[edit: And now it seems that this may have been the case anyway] Here we go:
High Voltage Software wrote:For Dactyl Joust, we were using an automatic memory paging system which was started with Ruiner. This worked by augmenting function calls to load in each function in 256-byte chunks, as many as needed, and doing address fixups. Rarely called support routines remained in main store, specially tagged to avoid being loaded in. (See above re: running from main RAM and crossing page boundaries. The addresses had to be guaranteed by creating a million sections in the link file. Can you say link file nightmare?) In the end though, C and eventually C++ use became pretty invisible (read easy and efficient) even on the GPU RISC processor.
Now I've always assumed this 'automatic paging system' refers to a linker?

Certainly not. The actual paging must be performed by a runtime system
they had written. The linker can help you manage all these 256-byte
sections. I guess by "link file" he is referring to the linker script.

I don't quite understand what "augmenting function calls" means, though.
Probably each function was divided into several small sub-functions, each
of which fits into 256 bytes. Then you would have to call all these small
functions instead of a single big one? But I have no idea how that could
be transparent when programming in C.

Another possibility would be to connect these 256-byte pages by a jump
to the next page at the end. Jumps to pages which are not already loaded
into the 4K local RAM could point to a paging service routine first, and
then, after the required page is loaded, would be relocated to the correct
destination. A little bit like a mix between virtual memory and a dynamic
linking system. Who knows?
Those of us who knew of this rumor always assumed it was HVS's own custom compiler they came up with. But now one of the old HVS alumni is saying it was Atari's GPU GCC they were using.

Maybe they fixed and enhanced it?
It's just a custom paging system they had to build to get it to work.

Since you're the linker expert, what kind of crazy linker would that have to be to accomplish what he's describing?

I guess the linker could be more or less standard, with a complex linker
script. But they need a specialized compiler to make that paging system
transparent to C.
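To make "a million sections in the link file" concrete, here is a hypothetical sketch of what such a linker script might look like in GNU ld syntax. The object names, section names, and DRAM layout are my own invention for illustration; only the general idea (one pinned section per 256-byte chunk, plus a tagged resident area) comes from the HVS quote, and the GPU local RAM base is the Jaguar's documented $F03000.

```ld
/* Hypothetical fragment: pin each 256-byte code page to a known
 * main-RAM address so a runtime pager can find and copy it. */
MEMORY
{
    dram (rwx) : ORIGIN = 0x00004000, LENGTH = 0x1F0000  /* main RAM     */
    gpu  (rwx) : ORIGIN = 0x00F03000, LENGTH = 0x1000    /* 4K local RAM */
}

SECTIONS
{
    .page000 : { obj/func000.o(.text) } > dram  /* one chunk per section */
    .page001 : { obj/func001.o(.text) } > dram
    .page002 : { obj/func002.o(.text) } > dram
    /* ...hundreds more, one 256-byte-aligned entry per chunk... */
    .resident : { *(.resident) } > dram  /* rarely called code, tagged to
                                            stay in main store */
}
```

Hand-maintaining one entry per function chunk is exactly the "link file nightmare" the HVS quote complains about.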


Frank Wille

Re: DOS tools risc gcc discussion

Posted: Sun Jun 01, 2014 4:53 am
by a31chris
For all of us poking around, let's make sure to check all the libraries to see whether a workaround is already in there, and check the Bin directory to make sure these tools aren't just sitting there, for, uh, 19 years now.

You see a binary you don't recognize, poke sticks at it! See what happens.

June marks the 19th year of these tools being released. Let's make it a good year.

Re: DOS tools risc gcc discussion

Posted: Tue Jun 17, 2014 10:18 pm
by a31chris
Mike Fulton wrote:Chris, there was no problem with developer communications with Jaguar. Any story you hear to the contrary is just sour grapes combined with hindsight, and perhaps a bit of fuzzy memory about the available methods of the day. There'd be occasional problems where we didn't have a solution, but it wasn't for lack of trying.

First of all, keep in mind that the web was brand spanking new at that time. We didn't have a developer website because there was barely an "Internet" at all.

Secondly, we didn't even have email as we know it today. Some people had internet email, some had Compuserve, others had GEnie, America Online, even Prodigy. Even so, we didn't even have email addresses at all for many developers. But the big thing was that there was no email client that could talk to more than one service. You'd have to run the CompuServe program to get CompuServe email, the Genie program to get Genie email, etc. Frequently we'd have developers at the same company who used different services for their email, and conversations involving more than one other person sometimes took place over several different email systems. Sending out a mass email to a few hundred people could take hours.

We did have a developer BBS which could handle up to four users online at once. We posted the latest tools in the download area on a regular basis. They were available both as a big archive or as individual files so you could grab whatever you needed. The BBS had uncensored message forums, where any of our developers could have asked anything they wanted to of Atari or each other at any time.

However, there was never much message traffic. A couple of years later when I was at SCEA, the situation wasn't much different with the PlayStation developer message forums.

As far as developers sharing information between themselves goes, keep in mind a couple of things. First, while Atari could facilitate the process, we couldn't make developers share what they might consider to be trade secrets. In many cases developers were reluctant to share things with ATARI, let alone each other. If one developer figured out how to get better performance out of something, that was a competitive advantage they didn't necessarily want to share.

Any Jaguar developer could get anybody on the Atari support staff on the phone pretty much at any time they wanted during business hours. Maybe not quite so reliably as today, but this was before the age of the cellphone and we were occasionally away from our desks.

Anybody who couldn't get in touch with us simply wasn't trying that hard.

The C compiler you mentioned is the same one I was referring to. We had a French company, Brainstorm, doing a lot of our tools, and they were working on a C compiler for the RISC processors. We knew going into the project that there would be problems but we hoped to overcome them.

It wasn't until early 1995 that we got the first early versions of the compiler from Brainstorm. However, many of those aforementioned problems were still unaddressed, which kept it from being put into general distribution.

While it was not suitable for most developers to use, the guys at High Voltage did manage to get something out of it, with the use of some clever support code they wrote themselves and by following very specific coding guidelines. I would have loved to share the tricks they used, but HV didn't share the details with us at the time, so that was never an option.

That message thread you linked has some problems: "The revisions for the risc GCC go up to June 1995. Why would Brainstorm keep updating a gpu compiler up to the bitter end of the Jaguar's life? If they could not have gotten something to work, wouldn't they have abandoned the effort some time earlier?"

The RISC compiler project was actually no more than a few months old at that point. It was a fairly late development and not part of the original tool chain. Early on, we didn't think developers would have such difficulties with creating assembly code for the RISC processors, and that, combined with the hardware issues, made the creation of a RISC compiler lower priority. It later became a higher priority after we saw how many developers were struggling to do RISC assembly code and how they were using the 68000 far more than we wanted.

Also, referring to June 1995 as the "bitter end of the Jaguar's lifecycle" is misleading. Admittedly, things started to nose dive a few months later, but at that time, development, both internal and external, was still going ahead at full speed.

Re: DOS tools risc gcc discussion

Posted: Fri Jun 27, 2014 7:16 am
by a31chris
I've sent a message to Carl Forhan asking him to look in his HVS files. They probably threw those tools in there with the rest of the files. The Jaguar's commercial life was long over, and with no more money to be made off of it, I doubt they would have cared about proprietary tools anymore. It seems the assets with the actors' faces on them would always be a concern, but they let those go with warnings, so I'll bet those tools have been sitting right in the community's face for years.

Re: DOS tools risc gcc discussion

Posted: Tue Aug 26, 2014 1:25 pm
by txg/mnx
Any update from Carl on this?

Re: DOS tools risc gcc discussion

Posted: Tue Aug 26, 2014 5:17 pm
by a31chris
Yeah he says he does not have any HVS files.

Re: DOS tools risc gcc discussion

Posted: Sat Feb 07, 2015 9:12 pm
by a31chris
Something has occurred to me. John Carmack targeted a compiler at the Jaguar's GPU when working on Doom. He said the compiler wasn't very good and that he has 'lost it'.

However, he must have run into the comparison error with his compiler, since according to Mike Fulton the GPU hardware bug causing it was basically a compiler-crippling bug, and it's a problem that also affects assembly programmers, just in a different way. It seems to me Carmack must have found a workaround for it. He must have. Either way, the solution would be in the JagDoom sources: the assembly his GPU compiler created is still there.

When the time comes, working out what the compiler generates that trips the GPU bug, and then looking at a comparable area in the Doom sources, might yield the solution to our problem.

Re: DOS tools risc gcc discussion

Posted: Sun Feb 08, 2015 2:13 am
by a31chris
I've sent Carmack a tweet to see if he remembers anything.

Re: DOS tools risc gcc discussion

Posted: Sun Feb 08, 2015 6:13 pm
by a31chris
John Carmack wrote:
a31chris wrote:When creating the gpu compiler for jag do you remember a gpu hardware bug causing problems w/ its ability to do comparisons?
I remember the read after write hazard, but not a compare one. Been a long time, though.

Portland Retro Game Expo rumors

Posted: Mon Oct 19, 2015 3:48 am
by a31chris
I went to the 2015 PRGE on Sunday and hung out with a vendor friend of mine from the area, the Atari enthusiast 8-bit fix.

Anyway, 8-bit fix said that on Saturday an ex-High Voltage Software developer was at the 2015 Portland Retro Gaming Expo, chatting with him at his booth about his Jaguar stuff. What I am told is that this guy claims to remember the bugfix workaround for the RISC GCC in specific detail. His name wasn't caught because the expo was busy and 8-bit fix was just as busy, so details are understandably hard to come by.

The ex-developer was encouraged to write a blog post about it. However, I don't know if he was referred to any forum or website.