On Sun, 20 Oct 2019 04:04:42 -0500, David Entwistle wrote:
Post by David Entwistle
Post by Robert Wessel
If you're running x86 Linux, gfortran is probably already installed.
https://gcc.gnu.org/wiki/GFortran
Yes, indeed. I was impressed to find I could run GFORTRAN from within the
Eclipse IDE and get access to full debugging facilities. I was able to
run the Lucifer FORTRAN program and happily produce output (see below).
The cipher text decrypted back to the original plain text.
It was when I was trying to reproduce the same output using a Java
implementation that I noticed the behaviour within the FORTRAN program wasn't
what I expected. Reading a byte (stored as an integer) into the Confuser
uses an integer array as the transport. If the integer was, say, hex 'E9'
(binary 11101001), then (ignoring the possible input hex-digit interchange)
it was presented to the S-Box as hex '97' (binary 10010111). The S-Boxes
permuted nibbles '9' and '7' and presented the expected output, but as this
output was read back into an integer from the integer array, the bit order
was again reversed. As part of the Confuser, that may be intentional, but it
seems unlikely, and it isn't documented as far as I can see.
Running on a little-endian system, I suspect there would be no bit
reversal, but that's just my thought and I may have it quite wrong.
Figure 1 - Confuser Interrupter Diffuser
<http://www.radiometeor.plus.com/Lucifer/CID.jpg>
Figure 2 - S-Boxes.
<http://www.radiometeor.plus.com/Lucifer/S-Box.jpg>
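As a sanity check (a throwaway C snippet, illustrative only and not part of
the Lucifer source), hex '97' is exactly hex 'E9' with its eight bits
reversed:

#include <stdio.h>

/* reverse the bit order of an eight-bit value (illustrative helper only) */
static unsigned char reverse8(unsigned char v)
{
    unsigned char r = 0;
    for (int i = 0; i < 8; i++)
        r = (unsigned char)((r << 1) | ((v >> i) & 1));
    return r;
}

int main(void)
{
    printf("%02X\n", reverse8(0xE9));   /* prints 97 */
    return 0;
}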
key
0123456789abcdeffedcba9876543210
plain
00000000000000000000000000000000
key
0 0 0 0 0 0 0 1 0 0 1 0 0 0 1 1
0 1 0 0 0 1 0 1 0 1 1 0 0 1 1 1
1 0 0 0 1 0 0 1 1 0 1 0 1 0 1 1
1 1 0 0 1 1 0 1 1 1 1 0 1 1 1 1
1 1 1 1 1 1 1 0 1 1 0 1 1 1 0 0
1 0 1 1 1 0 1 0 1 0 0 1 1 0 0 0
0 1 1 1 0 1 1 0 0 1 0 1 0 1 0 0
0 0 1 1 0 0 1 0 0 0 0 1 0 0 0 0
plain
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
cipher
C318179D5848D88C322F7462C4F82B2A
cipher
1 1 0 0 0 0 1 1 0 0 0 1 1 0 0 0
0 0 0 1 0 1 1 1 1 0 0 1 1 1 0 1
0 1 0 1 1 0 0 0 0 1 0 0 1 0 0 0
1 1 0 1 1 0 0 0 1 0 0 0 1 1 0 0
0 0 1 1 0 0 1 0 0 0 1 0 1 1 1 1
0 1 1 1 0 1 0 0 0 1 1 0 0 0 1 0
1 1 0 0 0 1 0 0 1 1 1 1 1 0 0 0
0 0 1 0 1 0 1 1 0 0 1 0 1 0 1 0
plain
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
key
0123456789ABCDEFFEDCBA9876543210
plain
00000000000000000000000000000000
You seem to have some misunderstanding about byte order issues.
First, x86 is little-endian. Your usage of the term is a bit confused.
Second, byte order does not impact the bit order within bytes. Thus a
byte 0xe9 will get stored as such on both little-endian x86 and
big-endian S/360. You'll never see it as 0x97. A larger type will
get the order of its bytes changed - thus a 32-bit variable holding
the value 0x12345678 on x86 would have the bytes 0x78, 0x56, 0x34,
0x12, while on S/360 they'd be 0x12, 0x34, 0x56, 0x78.
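If it helps, here's a minimal C sketch (illustrative only, nothing to do
with the Lucifer source) showing both points - the byte sequence of a
32-bit value depends on the machine, while a single byte such as 0xe9 is
stored identically everywhere:

#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void)
{
    uint32_t w = 0x12345678;     /* the 32-bit example value */
    unsigned char b[4];
    unsigned char c = 0xe9;      /* a lone byte has no byte order */

    memcpy(b, &w, 4);            /* view the word as raw bytes */
    /* little-endian x86 prints 78 56 34 12; big-endian S/360 prints 12 34 56 78 */
    printf("%02X %02X %02X %02X\n", b[0], b[1], b[2], b[3]);
    printf("%02X\n", c);         /* E9 on either machine */
    return 0;
}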
Note that this does not apply to some types of I/O, where the bit
order can also vary on the "wire".
Now that doesn't mean a program can't expose the byte order of the
machine in various ways, but it can only happen if you access data as
both the larger (byte order dependent) and smaller types. I think you
mentioned in one of the other posts that some of the structures in the
Fortran source you're using make use of Fortran EQUIVALENCE, which
would certainly provide some opportunities for doing that (as would,
say, a C union). But again, you should not see reinterpretation of
units smaller than a byte.
If the code is doing something like storing the individual bits of a
"byte" as separate objects (perhaps one bit per byte), then any byte
order changes might be visible.
Consider a C union:
union BITS {unsigned long long ull; unsigned char b[8];} bits;
bits.ull = 0x0101010001000001ULL;  /* the bits of 0xe9, one bit per byte */
for (int i = 0; i < 8; i++) printf("%i\n", bits.b[i]);
This would print 1/1/1/0/1/0/0/1 on a big-endian machine, and
1/0/0/1/0/1/1/1 on a little-endian machine.
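Note that 1/0/0/1/0/1/1/1 is 10010111, which is 0x97 - exactly the
bit-reversed form of the 0xe9 you started with. So if the Fortran
source equivalences a packed word with a one-bit-per-element array,
the byte reordering on a little-endian machine could well account for
the reversal you observed, rather than anything in the Lucifer
algorithm itself.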