Question

I found this amazing piece of work by Arthur Whitney - http://www.jsoftware.com/jwiki/Essays/Incunabulum

It compiled with a few warnings:

$ gcc-4.7 incuna.c -o incuna.o
incuna.c: In function 'ma':
incuna.c:8:15: warning: incompatible implicit declaration of built-in function 'malloc' [enabled by default]
incuna.c: In function 'pi':
incuna.c:26:7: warning: incompatible implicit declaration of built-in function 'printf' [enabled by default]
incuna.c: In function 'nl':
incuna.c:26:24: warning: incompatible implicit declaration of built-in function 'printf' [enabled by default]
incuna.c: In function 'pr':
incuna.c:28:10: warning: incompatible implicit declaration of built-in function 'printf' [enabled by default]
incuna.c: In function 'ex':
incuna.c:35:36: warning: assignment makes integer from pointer without a cast [enabled by default]
incuna.c:35:25: warning: return makes pointer from integer without a cast [enabled by default]
incuna.c: In function 'noun':
incuna.c:37:57: warning: return makes integer from pointer without a cast [enabled by default]
incuna.c: In function 'wd':
incuna.c:39:21: warning: incompatible implicit declaration of built-in function 'strlen' [enabled by default]

But it segfaulted as soon as I entered basic input like 1 + 1.

./incuna.o
warning: this program uses gets(), which is unsafe.
1 + 1
[1]    11525 segmentation fault  ./incuna.o

I'm guessing this has something to do with differences in C compilers since 1989.

How would I be able to run this? Can I get this working on a recent Linux/Mac, on a VirtualBox VM, or anything else?

My Google searches turned up nothing related.

Solution

The code converts pointers to int and long and vice versa. This breaks on 64-bit architectures, where pointers and these integer types no longer have the same size.

Compile it for a 32-bit target: e.g., with -arch i386 using clang/LLVM on Mac OS X, or with -m32 using gcc on Linux.

OTHER TIPS

My guess is that it segfaulted because of this warning:

incuna.c:8:15: warning: incompatible implicit declaration of built-in function 'malloc' [enabled by default]

If malloc is not declared, the compiler implicitly assumes it returns an int. On a 64-bit target the returned pointer gets truncated to 32 bits, so you end up dereferencing a mangled pointer, and that can lead to a segfault.

After including:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

The warnings for printf(), malloc(), and strlen() are gone. The code runs and works if you input:

1+1

Note that the spacing is important here:

1 + 1

will still segfault.

Run it through the preprocessor only:

gcc -E interp.c > interp-pp.c

Then prettify it in an editor, then use a debugger to watch what it does.

On my system (AMD64, Windows 8), pointer values often have the top bit set, so treating a pointer as a signed integer (which this program does) will misbehave and crash.

Changing the qv(a) function ("query verb") allows the program to run:

qv(a){R a<'a';}

should be

qv(a){R a<'a'&&a>0;}

or

qv(a)unsigned a;{R a<'a';}

Here's a link to a minimally-modified version that should compile without warnings (with gcc, default options) and execute (with correct input).

Licensed under: CC-BY-SA with attribution