Self-destruct search program

For scripts to aid with computation or simulation in cellular automata.
simsim314
Posts: 1823
Joined: February 10th, 2014, 1:27 pm

Re: Self-destruct search program

Post by simsim314 » May 13th, 2017, 2:36 pm

I saw that the current destroy.exe is limited to a 248x248 grid. Could you change or remove the limit, make it adjustable, or provide an additional version where it is higher or gone entirely?

simeks
Posts: 402
Joined: March 11th, 2015, 12:03 pm
Location: Sweden

Re: Self-destruct search program

Post by simeks » May 13th, 2017, 4:43 pm

simsim314 wrote:I saw that the current destroy.exe is limited to a 248x248 grid. Could you change or remove the limit, make it adjustable, or provide an additional version where it is higher or gone entirely?
The reason for the limit is that intermediate results from one round are written to a byte stream, using one byte for each coordinate value. I could make a fix for that, but it would be helpful to know what is likely to be an acceptable limit, 504×504, 1016×1016, or more than that?
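To make the trade-off concrete, here is a hypothetical sketch (the names and layout are illustrative, not destroy.exe's actual format): one byte per coordinate caps values at 255, while two bytes raise the cap to 65535 at double the storage cost.

```c
#include <stdint.h>
#include <stddef.h>

/* One byte per coordinate: values above 255 cannot be represented. */
static size_t pack_coords_u8(uint8_t *out, const uint16_t *coords, size_t n) {
    for (size_t i = 0; i < n; i++)
        out[i] = (uint8_t)coords[i];   /* silently truncates above 255 */
    return n;                          /* bytes written */
}

/* Two bytes per coordinate (little-endian): values up to 65535,
   at the cost of doubling the stream size. */
static size_t pack_coords_u16(uint8_t *out, const uint16_t *coords, size_t n) {
    for (size_t i = 0; i < n; i++) {
        out[2 * i]     = (uint8_t)(coords[i] & 0xff);
        out[2 * i + 1] = (uint8_t)(coords[i] >> 8);
    }
    return 2 * n;
}
```

A variable-length encoding would be a middle ground, but fixed-width keeps decoding trivial.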

simsim314
Posts: 1823
Joined: February 10th, 2014, 1:27 pm

Re: Self-destruct search program

Post by simsim314 » May 13th, 2017, 8:30 pm

simeks wrote:I could make a fix for that, but it would be helpful to know what is likely to be an acceptable limit, 504×504, 1016×1016, or more than that?
Well, for now I managed to fit exactly within the 248 boundary, but I think some of the patterns posted for the Orthogonoid etc. could be much larger. The largest construction I can think of (say, a quadratic replicator) is even larger than 1016x1016.
simeks wrote:The reason for the limit is that intermediate results from one round are written to a byte stream, using one byte for each coordinate value
If you don't need this as a hash, maybe it's better to copy the whole result, calculating the bounding box at the beginning of the run. I'm afraid baked-in assumptions like this hurt the app, while the major performance cost is probably in the iterator anyway.

If you only need this as a hash for "been there, done that" checks, you could easily create some hash function, even a 64-bit one. You can take the bounding box, divide it into 64-bit streams, and then apply some non-trivial hash function. For example (taken from LifeAPI):

for (int i = state->min; i <= state->max; i++)
{
    result += CirculateRight(state->state, i);
    result += CirculateLeft(state->state, (i + 7) % 47);
}
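For context, the loop can be made self-contained. This is only one reading of the LifeAPI calls: I'm assuming `state->state` is an array of uint64_t rows and that CirculateRight/CirculateLeft rotate a single row by the given amount; the real LifeAPI semantics may differ.

```c
#include <stdint.h>

/* Hypothetical stand-ins for LifeAPI's CirculateRight/CirculateLeft,
   read here as plain 64-bit rotations of one row. */
static uint64_t rot_right64(uint64_t v, unsigned k) {
    k &= 63;
    return k ? (v >> k) | (v << (64 - k)) : v;
}

static uint64_t rot_left64(uint64_t v, unsigned k) {
    k &= 63;
    return k ? (v << k) | (v >> (64 - k)) : v;
}

/* The quoted loop, self-contained: sum rotated copies of every
   row inside the bounding box [min, max]. */
static uint64_t rotate_sum_hash(const uint64_t *rows, int min, int max) {
    uint64_t result = 0;
    for (int i = min; i <= max; i++) {
        result += rot_right64(rows[i], (unsigned)i);
        result += rot_left64(rows[i], (unsigned)((i + 7) % 47));
    }
    return result;
}
```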

calcyman
Moderator
Posts: 2932
Joined: June 1st, 2009, 4:32 pm

Re: Self-destruct search program

Post by calcyman » May 14th, 2017, 8:26 am

248-by-248 has always been fine for me (and given the way GoLGrid works, I think increasing the grid area also proportionally increases the time it takes to compute a generation). For destroying larger mechanisms, it's more time-efficient to break them into pieces, run destroy256 on each piece in parallel, and add a few extra still-lifes to trigger the destruction of each piece from a single input glider.
What do you do with ill crystallographers? Take them to the mono-clinic!

simeks
Posts: 402
Joined: March 11th, 2015, 12:03 pm
Location: Sweden

Re: Self-destruct search program

Post by simeks » May 14th, 2017, 12:11 pm

simsim314 wrote:If you only need this as a hash for "been there, done that" checks, you could easily create some hash function, even a 64-bit one. You can take the bounding box, divide it into 64-bit streams, and then apply some non-trivial hash function.
The byte stream is not used as a hash. GoLGrid already has a nice vectorizable hash function that I've used in most projects so far. It's based on the public-domain MurmurHash by Austin Appleby, but instead of multiplying the current hash by a constant for each word (which prevents vectorization), each word in the key is xor-ed with a random word which is specific to the position of that key word in the grid.
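A sketch of such a scheme, under stated assumptions: fmix64 is MurmurHash3's public-domain finalizer, but the combining details here are guesses for illustration, not GoLGrid's actual code.

```c
#include <stdint.h>

/* MurmurHash3's 64-bit finalizer (public domain, by Austin Appleby). */
static uint64_t fmix64(uint64_t k) {
    k ^= k >> 33;
    k *= 0xff51afd7ed558ccdULL;
    k ^= k >> 33;
    k *= 0xc4ceb9fe1a85ec53ULL;
    k ^= k >> 33;
    return k;
}

/* Hypothetical position-keyed grid hash: each word is xor-ed with a
   random word fixed for its position, mixed on its own, and the
   results combined with addition. Because no lane depends on the
   previous one (unlike a serial hash = hash * CONST + word chain),
   the loop is free to vectorize. `noise` holds per-position random
   constants generated once at startup. */
static uint64_t grid_hash(const uint64_t *words, const uint64_t *noise, int n) {
    uint64_t h = 0;
    for (int i = 0; i < n; i++)
        h += fmix64(words[i] ^ noise[i]);
    return h;
}
```

Addition as the combiner keeps the reduction associative, which is exactly what lets the compiler vectorize it.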
simsim314 wrote:For example (taken from LifeAPI):

for (int i = state->min; i <= state->max; i++)
{
    result += CirculateRight(state->state, i);
    result += CirculateLeft(state->state, (i + 7) % 47);
}

The problem with such a simple hash function is that it's easy to find different patterns that hash to the same value. For example, these two do (note that the positions of the blocks differ):

Code: Select all

x = 144, y = 64, rule = LifeHistory
64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B
$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.
64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B
16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$
64B16.64B$64B16.64B$64B16.31B2A31B$64B16.30BA2BA30B$64B16.31B2A31B$
64B16.64B$64B16.64B$64B16.64B$64B16.37B2A25B$36B2A26B16.37B2A25B$36B
2A26B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.
64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B
16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$
64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B$64B16.64B
!
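Collisions can be even cheaper to find than that. Under one plausible reading of CirculateRight/CirculateLeft as plain 64-bit row rotations (an assumption about LifeAPI, not a fact from its source), any row value whose rotations all coincide contributes the same amount at every row index; the all-ones row is the extreme case:

```c
#include <stdint.h>

static uint64_t rot_right64(uint64_t v, unsigned k) {
    k &= 63;
    return k ? (v >> k) | (v << (64 - k)) : v;
}

static uint64_t rot_left64(uint64_t v, unsigned k) {
    k &= 63;
    return k ? (v << k) | (v >> (64 - k)) : v;
}

/* The quoted rotate-and-sum hash, self-contained. */
static uint64_t rotate_sum_hash(const uint64_t *rows, int min, int max) {
    uint64_t result = 0;
    for (int i = min; i <= max; i++) {
        result += rot_right64(rows[i], (unsigned)i);
        result += rot_left64(rows[i], (unsigned)((i + 7) % 47));
    }
    return result;
}

/* Two different boards, a solid row at index 3 versus index 5:
   rotating an all-ones word is a no-op, so both boards hash to the
   same value even though the patterns differ. */
```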
calcyman wrote:248-by-248 has always been fine for me (and given the way GoLGrid works, I think increasing the grid area also proportionally increases the time it takes to compute a generation). For destroying larger mechanisms, it's more time-efficient to break them into pieces, run destroy256 on each piece in parallel, and add a few extra still-lifes to trigger the destruction of each piece from a single input glider.
I pretty much agree, and I think the program will have trouble producing useful results for larger patterns anyway. Still, it would be an easy fix to increase the limit a bit, and it would not affect performance much, because the GoLGrid datatype keeps track of the bounding box of the pattern it contains at all times, and only evolves those parts.
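To illustrate the bounding-box point, here is a deliberately naive sketch (GoLGrid's actual implementation is bit-parallel on 64-bit words; this only shows why the loop bounds, not the grid size, set the cost): one step touches only the tracked box expanded by one cell on each side, since a Life pattern can grow at most one cell per tick.

```c
#include <stdint.h>
#include <string.h>

#define N 256  /* grid side; illustrative only */

typedef struct {
    uint8_t cell[N][N];   /* 1 = alive */
    int x0, y0, x1, y1;   /* bounding box of live cells, inclusive */
} Grid;

/* One generation of Life over the bounding box expanded by one cell.
   The work is proportional to the pattern's box, not to N. Assumes
   the pattern stays at least two cells from the border and does not
   die out completely (no empty-box handling in this sketch). */
static void step(Grid *g) {
    static uint8_t next[N][N];
    int x0 = g->x0 - 1, y0 = g->y0 - 1, x1 = g->x1 + 1, y1 = g->y1 + 1;
    int nx0 = N, ny0 = N, nx1 = -1, ny1 = -1;

    for (int y = y0; y <= y1; y++) {
        for (int x = x0; x <= x1; x++) {
            int n = 0;
            for (int dy = -1; dy <= 1; dy++)
                for (int dx = -1; dx <= 1; dx++)
                    if (dx || dy)
                        n += g->cell[y + dy][x + dx];
            uint8_t alive = g->cell[y][x] ? (n == 2 || n == 3) : (n == 3);
            next[y][x] = alive;
            if (alive) {
                if (x < nx0) nx0 = x;
                if (x > nx1) nx1 = x;
                if (y < ny0) ny0 = y;
                if (y > ny1) ny1 = y;
            }
        }
    }
    /* Copy back only the region that was recomputed, then shrink the
       box to the new live cells. */
    for (int y = y0; y <= y1; y++)
        memcpy(&g->cell[y][x0], &next[y][x0], (size_t)(x1 - x0 + 1));
    g->x0 = nx0; g->y0 = ny0; g->x1 = nx1; g->y1 = ny1;
}
```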
