mdw
|
|
response 50 of 55:
|
Apr 24 04:18 UTC 2000 |
The multiple pass feature actually weakens the key. How much it weakens
the key depends greatly on the relative sizes of the key file & data
file. A pass count of exactly 256 on key & data files of identical size
is an extreme case, but it's easy to find other combinations that
result in an expanded key stream that is either always constant, or
repeats with a small period. Either of these is bad. If the key file
is at least the length of the data file and contains "true random" (*not
generated by a PRNG) data, then there is no value to a pass count other
than 1. If the key file is *not* "true random" then adding bytes
together does not improve the entropy and may well hurt it badly
depending on the exact values selected.
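The 256-pass degenerate case is easy to demonstrate. A minimal sketch, assuming a simplified model where each pass adds the key byte-wise mod 256 (restarting the key each pass); in this model 256 passes always cancel, since 256*k mod 256 = 0:

```python
def encrypt(data: bytes, key: bytes, passes: int) -> bytes:
    # Byte-wise addition mod 256, repeated `passes` times.
    # The key is cycled if shorter than the data.
    out = bytearray(data)
    for _ in range(passes):
        for i in range(len(out)):
            out[i] = (out[i] + key[i % len(key)]) & 0xFF
    return bytes(out)

data = b"attack at dawn"
key = b"some key bytes"                   # same length as data
assert encrypt(data, key, 256) == data    # 256 passes: identity, no encryption
assert encrypt(data, key, 1) != data      # a single pass does change the data
```

The actual program may advance its key pointer differently across passes, which is why the relative file sizes matter there; the cancellation principle is the same.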
One of the problems with using addition to mix bytes is that it
introduces a 1-bit left-shift dependency. In the worst case (repeatedly
adding a value to itself) this can result in very bad things like the
all-0 expanded key case. In the best case, a clever algorithm can use
this for bit avalanche; IDEA uses multiplication as a clever way to mix
bits, and basically takes advantage of the bit avalanche from addition
in doing so.
Encrypting a continuous stream of data is pretty common in the data
market; this is what you might want to use with a telephone
conversation, video, or with telnet/ssh. Especially with video, you are
dealing with humongous amounts of data; reading through the key file
multiple times would not be very attractive even if the exact length of
the data stream were known in advance. In any event, this is a pretty
obvious case where what's happening in the computer world doesn't match
up well at all with the "human wants to encrypt a file" model.
Public key technology generally relies on a set of one-way functions
that can be nested two different ways yet yield the same result. For
instance, here's Diffie-Hellman (which can be used for key exchange, but
not directly for encryption or decryption):
(1) Alice chooses a large random integer x and calculates
X = (g**x)mod n
(2) Bob chooses a large random integer y and calculates
Y = (g**y)mod n
(3) x,y are private keys; Alice & Bob don't share these.
X and Y are public keys, Alice & Bob tell each other
their public keys.
(4) Alice computes k = (Y**x)mod n
(5) Bob computes k' = (X**y)mod n
k and k' are both equal to (g**(x*y))mod n. This value
is the shared secret no one else can compute; this can be
used in any ordinary block or stream cipher.
A 3rd party may know n,g,X,Y (the latter by eavesdropping) but still
can't compute k unless they can solve the "discrete logarithm" problem;
this problem hasn't been solved despite considerable mathematical
attention. n and g are public parameters (Alice & Bob need to agree on
these); n & g need to be chosen carefully to ensure the best security. n
should be prime, (n-1)/2 must also be prime, and n should be large, like
at least 1024 bits. g should be a primitive root mod n.
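The five steps above can be run directly with toy numbers (n = 2579 is a safe prime since (n-1)/2 = 1289 is also prime, and g = 2 is a primitive root mod 2579; a real deployment needs n of at least 1024 bits):

```python
import random

# Toy parameters for illustration only; far too small to be secure.
n = 2579          # safe prime: (n - 1) / 2 = 1289 is also prime
g = 2             # primitive root mod 2579

x = random.randrange(2, n - 1)   # Alice's private key
y = random.randrange(2, n - 1)   # Bob's private key

X = pow(g, x, n)  # Alice's public key
Y = pow(g, y, n)  # Bob's public key

k_alice = pow(Y, x, n)   # Alice computes (g**y)**x mod n
k_bob   = pow(X, y, n)   # Bob computes   (g**x)**y mod n
assert k_alice == k_bob  # both equal (g**(x*y)) mod n: the shared secret
```

An eavesdropper sees n, g, X, and Y, but recovering x from X = (g**x) mod n is the discrete logarithm problem.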
|
eyenot
|
|
response 52 of 55:
|
Apr 25 22:16 UTC 2000 |
for the process i have:
if you multiply the difference between the size of the infile and the size
of the key by the number of passes, and the result is a number equal to the
size of the infile, and you then multiply the number of passes in this case
by 256, you will end up with no change to the infile.
i have not tested this, say using a file of 3000 bytes and a key of 2999
bytes, running 3000*256 passes (well, # passes is limited to 65,535), but it
would theoretically result in 0 encryption.
infile of 500 bytes and key of 495 bytes (diff of 5),
number of passes 25600: you will end up with 0 encryption
(theoretically)
i installed the passes feature ... well let's illustrate why
values in decimal (0-255)
Infile bytes: 010 033 050 009 017 100
key bytes : 000 001 005 013 100 255
(5 passes)
result 1 : 010 034 055 022 117 099
result 2 : 010 035 060 031 217 098
result 3 : 010 036 065 040 061 097
result 4 : 010 037 070 049 161 096
result 5 : 010 038 075 058 005 095
anyways, as you can see, using the bytes 000 or 255 is foolish, but only
in the following situations:
1. no. passes * (diff in file sizes) = size of one file
2. or, if the size of the one file is evenly divisible by the other
(which would be the case if both files are the same size, or if say one file
was 50 and the other was 1 (duh), 2, 5, 10, or 25)
because these two situations are not hard to produce, i also added
features to mr.jima that will count the number of 00h-value bytes in a
file, and a feature that, after encryption, will tell you how many bytes
of the original file have been left unchanged in the same position.
i am thinking of adding a third feature that will tell you whether or not
the byte values have been effectively 'scrambled'.
for example, here are three bytes: 003, 010, and 050.
here they are encrypted: 013, 020, and 060.
in this case the difference in each of the three bytes was 10. mr.jima
would see that the bytes are indeed 'changed', but would not know that
they are effectively no different than before.
bytes A B C D E decrypted
a b c d e encrypted
in another version i will install a feature that checks the diff. between
bytes A and a, and then the diff. between bytes B and b. if they are
no different, then it will count this as a 'weak spot'. if they are
different, it will count by how much (the diff. between the two diffs.)
and then keep this in the back of its mind. it will then compare
the diff. between B and b to the diff. between C and c, same as
before. if they are different, it will check by how much and compare this
to its last 'back of the mind' 'diff between diff between diff'. if this
is the same, then it will count this as a 'pass weak spot'. if not, then
it will take the newer value to the 'back of its mind' and then keep
going.
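a rough sketch of that check (a hypothetical helper for illustration, not mr.jima's actual code): take the per-position differences between plaintext and ciphertext, count repeated adjacent differences as 'weak spots', and count a repeating difference-of-differences as a 'pass weak spot':

```python
def weak_spots(plain: bytes, cipher: bytes):
    """Count positions where adjacent plain/cipher differences repeat
    (weak spots) or where the change in those differences repeats
    (pass weak spots)."""
    diffs = [(c - p) & 0xFF for p, c in zip(plain, cipher)]
    weak = pass_weak = 0
    last_dd = None                      # the 'back of the mind' value
    for a, b in zip(diffs, diffs[1:]):
        if a == b:
            weak += 1                   # same shift applied twice in a row
        else:
            dd = (b - a) & 0xFF         # diff between the diffs
            if dd == last_dd:
                pass_weak += 1          # repeating pattern of shifts
            else:
                last_dd = dd
    return weak, pass_weak

# the example from this posting: every byte shifted by +10
assert weak_spots(bytes([3, 10, 50]), bytes([13, 20, 60])) == (2, 0)
```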
hopefully this will tell a person more about the encryption they received.
see what i mean ?
|
eyenot
|
|
response 55 of 55:
|
Jul 13 13:00 UTC 2000 |
one final message to end the thread; i've updated the mr.jima encryption
program. it's still written in turbopascal because i still haven't found
a good ASM compiler (namely borland asm, for sale, used, etc.), so i still
haven't tried to work it out in assembly.
the update isn't finished yet. it's going to be called jima06h.zip and will
be available here: http://www.lunarsurf.com/~eyenot/jima06h.zip for
download. i'll leave this version up permanently.
changes from the last version: fixed the debugging output so that it is
more specific and easy to understand, and fixed some problems in said
output concerning saying the wrong thing to the end-user, more or less.
no more future versions in store, but i've designed several routines for
scrambling the content of a file so that the original bytes are simply
re-arranged to other positions in the file; routines for both keyed and
unkeyed scrambling. also working on a routine that uses prime numbers to
generate public keys; hopefully i will be able to also implement additional
encryption stages for jima that will allow for use of public keys
without endangering the security of the file and/or the randomness of
generated keyfiles.
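a keyed scramble of the kind described can be sketched as a key-seeded permutation; this is a python illustration of the general idea (seed a generator from the key, shuffle the byte positions), not the turbopascal routine itself:

```python
import hashlib
import random

def _permutation(length: int, key: bytes) -> list:
    # Derive a repeatable permutation of positions from the key.
    idx = list(range(length))
    rng = random.Random(hashlib.sha256(key).digest())
    rng.shuffle(idx)
    return idx

def scramble(data: bytes, key: bytes) -> bytes:
    # Re-arrange bytes to new positions; no byte values change.
    return bytes(data[i] for i in _permutation(len(data), key))

def unscramble(data: bytes, key: bytes) -> bytes:
    # Invert the permutation to restore the original order.
    out = bytearray(len(data))
    for out_pos, src in zip(_permutation(len(data), key), data):
        out[out_pos] = src
    return bytes(out)

msg = b"rearranged, not substituted"
assert sorted(scramble(msg, b"k")) == sorted(msg)    # same bytes, new order
assert unscramble(scramble(msg, b"k"), b"k") == msg  # key recovers the file
```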
still haven't uploaded mentioned file to server; rewriting documentation,
being lazy with graph paper as of now. check mentioned filename periodically
and it will appear in probably a week.
(end of thread)
|