As mentioned in my blog, the compiler is mostly of interest to people who want to write embedded software. There is no need for 16-bit programs to run on Linux when there are already 32-bit and 64-bit compilers, although you may be able to partly test the concept on Linux by compiling the same C program with the standard gcc.
The packages do include stdio.h and stdlib.h, which contain constants and function prototypes. It certainly linked with a library containing printf; there were no undefined symbols. So it probably linked to the DOS console putchar. It was probably done this way because DOS is a known target that can be tested against.
printf will eventually call putchar or putc. It's just a variadic function that calls various formatters, which can be written in C. An embedded program may not even need putchar. Think of a ROMed program that reads switches and drives displays and motors over I2C.
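To illustrate the structure, here is a minimal sketch of a printf-style variadic function that bottoms out in putchar. The names (mini_printf, emit_int) are hypothetical, not from any actual library, and it handles only %c, %s and %d, but a real printf is built the same way with more formatters:

```c
#include <stdarg.h>
#include <stdio.h>

/* Print a decimal integer one character at a time via putchar().
   Returns the number of characters written. */
static int emit_int(int v)
{
    char buf[12];
    int i = 0, n = 0;
    unsigned u;
    if (v < 0) { putchar('-'); n++; u = -(unsigned)v; }
    else       { u = (unsigned)v; }
    do { buf[i++] = (char)('0' + u % 10); u /= 10; } while (u);
    while (i--) { putchar(buf[i]); n++; }
    return n;
}

/* Hypothetical mini printf: walks the format string and dispatches
   each conversion to a formatter, everything ending in putchar().
   Returns the number of characters printed, like the real printf. */
int mini_printf(const char *fmt, ...)
{
    va_list ap;
    int n = 0;
    va_start(ap, fmt);
    while (*fmt) {
        if (*fmt != '%' || fmt[1] == '\0') {
            putchar(*fmt++);
            n++;
            continue;
        }
        fmt++;                          /* skip '%' */
        switch (*fmt++) {
        case 'c': putchar((char)va_arg(ap, int)); n++; break;
        case 's': { const char *s = va_arg(ap, const char *);
                    while (*s) { putchar(*s++); n++; } break; }
        case 'd': n += emit_int(va_arg(ap, int)); break;
        default:  putchar(fmt[-1]); n++; break;  /* unknown: echo it */
        }
    }
    va_end(ap);
    return n;
}
```

On a bare-metal target you would swap putchar for whatever drives your display or UART, and the formatters come along for free.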
COM files are in fact already "flat" in the sense that everything has to fit in 64kB; it's just that execution starts at offset 0x100 IIRC. An embedded binary would have to start at the reset vector, 0xFFFF0.
I'm just wondering what exactly you're trying to achieve. Are you trying to develop for some old x86 chips you have without the whole PC architecture? Are you wanting to write DOS applications? It sounds like the former.
As you're no doubt aware real mode x86 uses a segmented memory model, which is why rather few programs written in C were larger than 64kB. It seems you've found a C compiler which supports far pointers and can exceed the 64kB limitation.
The far pointer you put at the reset vector will contain both segment and offset, so you could set the start of your program wherever you like and still have it "begin" at the start of a 64kB chunk. If you don't know how to set the base address in your compiler I'm afraid I can't help you.
Remember the first kB of the address space is used for the CPU's Interrupt Vector Table, so you probably don't want to start your program there.
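For anyone following along, the arithmetic behind all this is simple: a real-mode physical address is segment × 16 + offset, so many different segment:offset pairs alias the same byte. A quick sketch (phys is a made-up helper name):

```c
#include <stdint.h>

/* Real-mode address translation: physical = segment * 16 + offset.
   Different segment:offset pairs can name the same physical byte. */
static uint32_t phys(uint16_t seg, uint16_t off)
{
    return ((uint32_t)seg << 4) + (uint32_t)off;
}
```

The reset vector at 0xFFFF0, for example, can be written as FFFF:0000 or equivalently F000:FFF0, and the IVT occupies physical 0x00000 through 0x003FF.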
As I have written in the blog, I don't have any use for it at this point, and probably never will. I'm just posting it in case someone finds it useful. There are a few people on HAD building 8088/6 computers.
Actually this gcc variant doesn't do far pointers; (again) as I have already written, the author is working on them. In that respect this compiler is less capable than the Turbo C++ compiler I have somewhere in my software stash, which could handle several memory models. But gcc has the virtue that it compiles up-to-date C/C++.
I don't want to do DOS. When I want to run DOS programs, I just fire up my FreeDOS VM.
Every now and then I look at my 8088/6 chip collection and wonder if it's worth the effort and cost to do something with them. Indeed, modern MCUs are cheaper, more powerful, and easier to use, so the answer is persistently no, mostly because of the need to interface peripheral chips that come built into modern MCUs, plus a host of other hassles.
So TL;DR: my blog post was just a PSA based on a quick experiment with packages that are available for download (remember my discussion of getting them from the developer's Ubuntu PPA?). So speculate no more.
I see, sorry for the misunderstanding (I'm a bit tired).
Watch out for the methylene chloride. :)
Looks like the cross-compiler you found is designed for making MS-DOS executables. You may be able to create a flat binary by changing your linker settings (or you may have already by accident) but rest assured whatever file you create won't run on Linux.
If you want to run 16-bit programs on a PC without DOS (or any other OS) you won't have any libraries to call or link to. This includes all but the most basic headers (meaning no "stdio.h", "stdlib.h" and certainly no "printf"). You will however have access to the old BIOS interrupt calls and I believe one of those is effectively a "putchar," which you can use to write a "putstring" function.
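A sketch of that idea, using the BIOS "teletype output" call (INT 10h with AH=0Eh, AL=character). The BARE_METAL_8086 macro is hypothetical, standing in for however your toolchain marks the freestanding 16-bit build; the hosted fallback just lets you try the string walker on a normal system:

```c
#include <stdio.h>

/* putchar via BIOS INT 10h teletype output (AH=0Eh, AL=char).
   BARE_METAL_8086 is a hypothetical build flag for the real target;
   on a hosted system we fall back to stdio for demonstration. */
static void bios_putchar(char c)
{
#ifdef BARE_METAL_8086
    __asm__ __volatile__("int $0x10"
                         :
                         : "a"((unsigned short)(0x0E00 | (unsigned char)c)),
                           "b"((unsigned short)0x0007)); /* BH=page 0, BL=attr 7 */
#else
    putchar(c);
#endif
}

/* "putstring" built on putchar; returns the character count. */
static int bios_putstring(const char *s)
{
    int n = 0;
    while (*s) {
        bios_putchar(*s++);
        n++;
    }
    return n;
}
```

On the real chip you'd build this freestanding with no libc at all; the BIOS call is the only "library" you get.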
© 2019 Hackaday