Random Glider Collision Search
Posted: November 22nd, 2014, 7:06 pm
Here is my version of a "soup search". The script generates long, random glider streams; after the collision it collects still lifes (SLs) or oscillators with 16 bits or more.
It's an incremental script with save and load capability. The longer you run it, the more interesting the results become, as it never reports the same result twice.
The results are placed in your patterns folder, in a ScriptResults directory, in files prefixed GliderSoupResults. The script currently writes a results report every 5 successes, but the underlying data is saved on each success.
Stopping the script will not erase the analyzed data: it is saved into GliderSoupResultsData.json (in the same folder) and loaded on start.
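The persistence is the standard dump-and-reload JSON pattern; a minimal standalone sketch of the idea (the file name and state layout here are illustrative, not the script's exact `data` structure):

```python
import json
import os
import tempfile

# Illustrative state: a dict of already-seen objects plus a success counter,
# mirroring the idea (not the exact layout) of the script's `data`.
state = {"seen": {"[[0, 0], [0, 1]]": 1}, "successes": 3}

path = os.path.join(tempfile.gettempdir(), "GliderSoupResultsData_demo.json")

# Save after every success...
with open(path, "w") as f:
    json.dump(state, f)

# ...and reload on the next start, so no work is repeated.
loaded = state
if os.path.exists(path):
    with open(path) as f:
        loaded = json.load(f)
```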
---
The result files and the JSON data can be downloaded here
---
A few words on how to read the result report (see the attached GliderSoupResults000105.rle).
On the left you can see all the SLs that were found, together with their index and bit count. To see a specific synthesis, go to the column with that index; the SL and its index are repeated above each column, so it's easy to find. Each SL can have up to 5 "recipes", and each recipe in the column produces the SL above it.
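Since each recipe column occupies a fixed-size square (REPORT_SQUARE_SIZE = 1000 cells in the script), finding the column for a given index is a simple multiplication; a small illustrative helper (not part of the script itself):

```python
REPORT_SQUARE_SIZE = 1000  # column width used by the report, per the script


def column_x_offset(index):
    """x-coordinate where the recipe column for a given SL index starts."""
    return index * REPORT_SQUARE_SIZE


# The SL with index 3 has its recipes stacked in the column starting at x = 3000.
```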
---
Feature history:
1. Rotations and symmetries removed. Oscillators are shown only once.
2. Numbering and results report added.
3. Adds up to 5 synths per SL (to provide additional options in case a rarer synthesis of the same SL turns up).
4. Trivial cases removed.
5. Bit count added to the header report.
6. Logs every 5 successes (as non-trivial results are rarer).
7. File operations work on all operating systems.
8. setbase call added to support other defaults.
9. Small performance boost.
10. Reports use the LifeHistory rule, which prevents interactions between the results.
11. Informative constants (with comments) added at the top of the script for tweaking.
12. Support added for oscillation periods that are multiples of 5 (periods 64, 3 and 5 are now supported).
13. Small performance boost by reducing the max generation, plus #12.
14. Creates a SoupCleaner rule file in the rules folder and uses it to clear common objects. Major performance boost.
15. Performance report added, plus a SHOW_PERFORMACE_EVERY constant.
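The oscillation handling behind features 1 and 12 boils down to one test: compare the pattern with itself a fixed number of generations later, which detects any period that divides that span (period 1 being a still life). A toy sketch of that test, outside Golly (all names here are illustrative):

```python
def stable_under(step, state, span, tries):
    """Return True if `state` repeats after `span` applications of `step`
    within `tries` attempts -- i.e. it has settled into an oscillation
    whose period divides `span` (period 1 = still life)."""
    for _ in range(tries):
        nxt = state
        for _ in range(span):
            nxt = step(nxt)
        if nxt == state:
            return True
        state = nxt
    return False


# Toy "rule": a counter that ramps up to 10 and then flips between 10 and 11.
def toy_step(x):
    return x + 1 if x < 10 else 10 + (x + 1) % 2


# The period-2 tail is caught with span=2 but missed with span=1.
```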
Code:
import random
import golly as g
import copy
import os
import json
import time
#The distance from which gliders start to be placed
STRATING_DISTANCE_OF_FIRST_GLIDER = 35
#Each glider arm has a number of gliders; the minimum and maximum are defined here.
#The actual value is a random number between min and max.
MINIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT = 6
MAXIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT = 22
#Each glider is placed not only at a random distance but also at a random offset in (-x, x). This is that variance constant.
RANGE_VARIANCE_OF_GLIDERS = 25
#Each glider is placed at some distance from the previous glider. The minimum and maximum are defined here.
#The actual value is random between min and max. Note that a minimum below 5 can place two gliders on top of each other.
MINIMAL_DISTANCE_BETWEEN_TWO_GLIDERS = 5
MAXIMAL_DISTANCE_BETWEEN_TWO_GLIDERS = 21
#If the soup continues to evolve after this number of generations, the script generates a new soup.
MAXIMAL_GENERATION_EVOLVE_TO_STABILITY = 60 * 512
#The directory where the results will be placed.
SCRIPT_RESULTS_DIRECTORY = os.path.join(g.getdir("patterns"),"ScriptResults")
#The directory where json dump file will be placed.
JSON_DUMP_DIRECTORY = os.path.join(g.getdir("patterns"),"ScriptResults")
#The minimal number of bits an object must have to be considered a "finding" and included in the report.
MINIMAL_BITS_TO_REPORT = 16
#Each SL has several soups reported for it. This is the maximal number to report.
MAXIMAL_REPORTS_PER_SINGLE_SL = 5
#Each SL is evolved this number of generations and saved in all 8 symmetries, to avoid reporting the same oscillator multiple times.
MAXIMAL_OSCILLATION_PERIOD_TO_STORE = 6
#Create report every N successes. This is N.
CREATE_REPORT_EVERY = 5
#Show performance in soups per second after collecting this number of soups.
SHOW_PERFORMACE_EVERY = 1000
#The side of the square in which the final soup and the initial glider configuration are assumed to fit.
REPORT_SQUARE_SIZE = 1000
g.new("")
g.setbase(8)
g.setstep(2)
g.setrule("B3/S23")
g.setalgo("HashLife")
def CreateSoupRule():
fname = os.path.join(g.getdir("rules"), "SoupCleaner.rule")
with open(fname, 'wb') as f:
f.write('@RULE SoupCleaner\n')
f.write('@TABLE\n')
f.write('\n')
f.write('n_states:27\n')
f.write('neighborhood:Moore\n')
f.write('symmetries:rotate4reflect\n')
f.write('\n')
f.write('var a = {0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26}\n')
f.write('var b = {a}\n')
f.write('var c = {a}\n')
f.write('var d = {a}\n')
f.write('var e = {a}\n')
f.write('var f = {a}\n')
f.write('var g = {a}\n')
f.write('var h = {a}\n')
f.write('\n')
f.write('var known1 = {1,7,9,12,14,16,19,22,24,25,26}\n')
f.write('var s1 = {5,7,9,12,14,16,19,22,24,25,26}\n')
f.write('var s0 = {6,8,11,13,15,17,18,20,21,23}\n')
f.write('\n')
f.write('#2 - 0 with 1 or two neighbour \n')
f.write('#3 - confirmed 1 \n')
f.write('#4 - confirmed deletion \n')
f.write('\n')
f.write('#5 - suspected block \n')
f.write('\n')
f.write('#6 - suspected beehive inside\n')
f.write('#7 - suspected beehive outside (1)\n')
f.write('#8 - conformed beehive inside\n')
f.write('\n')
f.write('#9 - suspected blinker\n')
f.write('#10 - suspected blinker step 2\n')
f.write('\n')
f.write('#11 - suspected boat inside\n')
f.write('#12 - suspected boat outside (1)\n')
f.write('\n')
f.write('#13 - suspected ship inside\n')
f.write('#14 - suspected ship outside (1)\n')
f.write('\n')
f.write('#15 - suspected loaf inside\n')
f.write('#16 - suspected loaf outside (1)\n')
f.write('#17 - conformed loaf inside\n')
f.write('\n')
f.write('#18 - suspected pond inside\n')
f.write('#19 - suspected pond outside (1)\n')
f.write('#20 - conformed pond inside\n')
f.write('\n')
f.write('#21 - suspected tub inside\n')
f.write('#22 - suspected tub outside (1)\n')
f.write('\n')
f.write('#23 - suspected glider1 inside\n')
f.write('#24 - suspected glider1 outside (1)\n')
f.write('\n')
f.write('#25 - suspected glider2 inside\n')
f.write('#26 - suspected glider2 outside (1)\n')
f.write('\n')
f.write('#initial read\n')
f.write('0,1,0,0,0,0,0,0,0,2\n')
f.write('0,1,1,0,0,0,0,0,0,2\n')
f.write('0,0,1,1,1,0,0,0,0,2\n')
f.write('\n')
f.write('#block deletion\n')
f.write('1,2,2,1,1,1,2,2,0,5\n')
f.write('5,5,5,5,0,0,0,0,0,0\n')
f.write('5,a,b,c,d,e,f,g,h,3\n')
f.write('\n')
f.write('#beehive deletion\n')
f.write('0,1,0,1,1,0,1,1,0,6\n')
f.write('1,2,0,0,1,6,1,0,0,7\n')
f.write('1,0,0,2,2,1,6,6,1,7\n')
f.write('6,7,0,7,7,6,7,7,0,8\n')
f.write('8,7,0,7,7,8,7,7,0,4\n')
f.write('\n')
f.write('7,1,b,c,d,e,f,g,h,3\n')
f.write('7,a,1,c,d,e,f,g,h,3\n')
f.write('\n')
f.write('#blinker deletion\n')
f.write('1,2,0,2,2,1,2,2,0,9\n')
f.write('1,1,2,2,2,1,2,2,2,9\n')
f.write('9,9,0,0,0,9,0,0,0,10\n')
f.write('10,9,0,0,0,9,0,0,0,4\n')
f.write('10,a,b,c,d,e,f,g,h,3\n')
f.write('\n')
f.write('#boat deletion\n')
f.write('0,1,0,1,1,1,0,1,0,11\n')
f.write('1,2,0,0,1,11,1,0,0,12\n')
f.write('1,0,0,2,2,1,1,11,1,12\n')
f.write('1,1,2,2,0,2,2,1,11,12\n')
f.write('11,12,0,12,12,12,0,12,0,4\n')
f.write('\n')
f.write('#ship deletion\n')
f.write('0,1,1,1,0,1,1,1,0,13\n')
f.write('1,2,2,1,1,13,1,0,0,14\n')
f.write('1,2,0,2,2,1,13,1,2,14\n')
f.write('13,14,14,14,0,14,14,14,0,4\n')
f.write('\n')
f.write('#Loaf deletion\n')
f.write('0,1,1,0,0,1,0,1,0,15\n')
f.write('0,1,0,1,1,0,1,0,1,15\n')
f.write('\n')
f.write('1,2,2,1,15,15,1,0,0,16\n')
f.write('1,2,0,0,1,15,15,1,2,16\n')
f.write('1,15,1,0,0,2,0,0,1,16\n')
f.write('1,15,15,15,1,0,0,0,1,16\n')
f.write('\n')
f.write('15,16,16,15,15,16,0,16,0,17\n')
f.write('15,16,0,16,16,15,16,15,16,17\n')
f.write('\n')
f.write('17,16,16,17,17,16,0,16,0,4\n')
f.write('17,16,0,16,16,17,16,17,16,4\n')
f.write('\n')
f.write('#Pond deletion\n')
f.write('0,1,1,0,0,0,1,1,0,18\n')
f.write('1,2,2,1,18,18,1,0,0,19\n')
f.write('\n')
f.write('18,19,19,18,18,18,19,19,0,20\n')
f.write('20,20,20,20,19,19,0,19,19,4\n')
f.write('\n')
f.write('#tub deletion\n')
f.write('0,1,0,1,0,1,0,1,0,21\n')
f.write('1,2,0,0,1,21,1,0,0,22\n')
f.write('\n')
f.write('21,22,0,22,0,22,0,22,0,4\n')
f.write('\n')
f.write('#glider1 deletion\n')
f.write('0,1,0,1,1,1,1,0,0,23\n')
f.write('\n')
f.write('1,2,2,1,1,23,1,0,0,24\n')
f.write('1,2,0,2,2,1,23,1,2,24\n')
f.write('1,1,2,2,2,1,0,23,1,24\n')
f.write('1,0,1,23,0,2,0,2,0,24\n')
f.write('1,1,2,2,0,2,0,0,23,24\n')
f.write('\n')
f.write('23,24,24,24,24,0,0,24,0,4\n')
f.write('\n')
f.write('#glider2 deletion\n')
f.write('1,1,0,1,0,0,1,0,1,25\n')
f.write('\n')
f.write('1,2,2,1,25,0,0,2,0,26\n')
f.write('1,2,0,0,1,25,0,1,2,26\n')
f.write('1,0,0,2,0,2,0,25,1,26\n')
f.write('1,0,25,0,0,2,0,2,0,26\n')
f.write('\n')
f.write('25,1,0,1,2,0,1,0,1,25\n')
f.write('25,26,0,26,0,0,26,0,26,4\n')
f.write('\n')
f.write('#SL support \n')
f.write('2,a,b,c,d,e,f,g,h,0\n')
f.write('\n')
f.write('1,2,a,b,c,e,f,g,h,3\n')
f.write('known1,3,a,b,c,e,f,g,h,3\n')
f.write('known1,a,3,b,c,e,f,g,h,3\n')
f.write('\n')
f.write('s0,3,b,c,d,e,f,g,h,0\n')
f.write('s0,b,3,c,d,e,f,g,h,0\n')
f.write('s1,1,b,c,d,e,f,g,h,3\n')
f.write('s1,b,1,c,d,e,f,g,h,3\n')
f.write('s1,3,b,c,d,e,f,g,h,3\n')
f.write('s1,b,3,c,d,e,f,g,h,3\n')
f.write('1,s0,b,c,d,e,f,g,h,3\n')
f.write('1,b,s0,c,d,e,f,g,h,3\n')
f.write('\n')
f.write('#universal deletion \n')
f.write('4,a,b,c,d,e,f,g,h,0\n')
f.write('a,4,b,c,d,e,f,g,h,0\n')
f.write('a,b,4,c,d,e,f,g,h,0\n')
snakeLineHor = g.parse("2obob2obo$ob2obob2o!")
snakeLineVer = g.transform(snakeLineHor, -3, 3, 0, 1, 1, 0)
figure8 = [snakeLineVer, g.transform(snakeLineVer, 0, 13), snakeLineHor, g.transform(snakeLineHor, 0, 13), g.transform(snakeLineHor, 0, 26), g.transform(snakeLineVer, 13, 0), g.transform(snakeLineVer, 13, 13)]
def PlaceDigit(digit, x = 0, y = 0):
digitIdx = [[0,1,2,4,5,6], [5,6],[1,2,3,4,5],[2,3,4,5,6],[0,3,5,6],[0,2,3,4,6],[0,1,2,3,4,6],[2,5,6],[0,1,2,3,4,5,6],[0,2,3,4,5,6]]
if digit >= 0 and digit <= 9:
for idx in digitIdx[digit]:
g.putcells(figure8[idx], x, y)
def NumDigit(num):
if num < 10:
return 1
else:
return 1 + NumDigit(int((num - (num % 10))/10))
def PlaceNumber(number, x = 0, y = 0):
curNum = number
d = 20 * NumDigit(number)
while True:
PlaceDigit(curNum%10, x + d, y)
curNum = (curNum - curNum%10) / 10
if curNum == 0:
return
d -= 20
gld = g.parse("3o$o$bo")
glds = []
soupIdx = 0
data = [{}, [], 0, 0, {}]
for i in xrange(0, 4):
glds.append(g.evolve(gld, i))
def FindActive():
rect = g.getrect()
cells = g.getcells([rect[0], rect[1], rect[2], 1])
return [cells[0], cells[1]]
def FindConnected(listXY, step2 = True):
result = copy.copy(listXY)
for xy in listXY:
x = xy[0]
y = xy[1]
for i in xrange(-2, 3):
for j in xrange(-2, 3):
if (abs(i) == 2 and abs(j) >= 1) or (abs(j) == 2 and abs(i) >= 1):
continue
if not step2:
if abs(i) == 2 or abs(j) == 2:
continue
if g.getcell(x + i, y + j) > 0 and len([i for t in listXY if (t[0] == x + i and t[1] == y + j)]) == 0:
if len([i for t in result if (t[0] == x + i and t[1] == y + j)]) == 0:
result.append([x + i, y + j])
return result
def RemoveList(listXY):
for xy in listXY:
x = xy[0]
y = xy[1]
g.setcell(x, y, 0)
def CountSL(step2 = True):
result = []
while int(g.getpop()) > 0:
xy = [FindActive()]
while True:
newXY = FindConnected(xy, step2)
#g.getstring(str(xy) + " : " + str(newXY) )
if len(newXY) == len(xy):
break
xy = newXY
result.append(xy)
RemoveList(xy)
return result
def GliderSoup():
global data
g.new(str(data[3]))
data[3] += 1
for i in xrange(-1, 2, 2):
for j in xrange(-1, 2, 2):
d = STRATING_DISTANCE_OF_FIRST_GLIDER
numGld = random.randint(MINIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT, MAXIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT)
for k in xrange(0, numGld):
x = d * i + random.randint(-RANGE_VARIANCE_OF_GLIDERS, RANGE_VARIANCE_OF_GLIDERS)
y = d * j
g.putcells(glds[random.randint(0, 3)], x, y, i, 0, 0, j)
d += random.randint(MINIMAL_DISTANCE_BETWEEN_TWO_GLIDERS, MAXIMAL_DISTANCE_BETWEEN_TWO_GLIDERS)
def EvolveToStability():
g.setbase(8)
g.setstep(2)
while int(g.getgen()) < MAXIMAL_GENERATION_EVOLVE_TO_STABILITY:
pop = int(g.getpop())
for i in xrange(0, 15):
g.step()
if int(g.getpop()) == pop:
cells = str(g.getcells([-REPORT_SQUARE_SIZE / 2, -REPORT_SQUARE_SIZE / 2 , REPORT_SQUARE_SIZE, REPORT_SQUARE_SIZE]))
for i in xrange(0, 15):
g.step()
if cells == str(g.getcells([-REPORT_SQUARE_SIZE / 2, -REPORT_SQUARE_SIZE / 2 , REPORT_SQUARE_SIZE, REPORT_SQUARE_SIZE])):
return True
return False
def SortSL(sl):
sl.sort(key=lambda r: 100000 * r[0] + r[1])
x1 = sl[0][0]
y1 = sl[0][1]
for o in xrange(0, len(sl)):
sl[o][0] -= x1
sl[o][1] -= y1
def PlaceSL(sl, x, y, make0 = False):
for xy in sl:
x1 = xy[0]
y1 = xy[1]
if make0:
g.setcell(x1 + x, y1 + y, 0)
else:
g.setcell(x1 + x, y1 + y, 1)
def IsTrivial(sl):
g.new("")
PlaceSL(sl, 0, 0)
slList = CountSL(False)
if len(slList) == 1:
return False
for s in slList:
g.new("")
PlaceSL(sl, 0, 0)
PlaceSL(s, 0, 0, True)
g.run(2)
cells = g.getcells(g.getrect())
g.run(2)
if str(g.getcells(g.getrect())) != str(cells):
return False
return True
if not os.path.exists(SCRIPT_RESULTS_DIRECTORY):
os.makedirs(SCRIPT_RESULTS_DIRECTORY)
if not os.path.exists(JSON_DUMP_DIRECTORY):
os.makedirs(JSON_DUMP_DIRECTORY)
def SaveData():
global data
fname = os.path.join(JSON_DUMP_DIRECTORY,"GliderSoupResultsData.json")
with open(fname, 'wb') as f:
json.dump(data, f)
def LoadData():
global data
fname = os.path.join(JSON_DUMP_DIRECTORY,"GliderSoupResultsData.json")
if os.path.exists(fname):
with open(fname, 'r') as f:
data = json.load(f)
LoadData()
CreateSoupRule()
totalTime = 0
numRuns = 0
while True:
numRuns += 1
startTime = time.time()
GliderSoup()
curPat = g.getcells(g.getrect())
if EvolveToStability():
g.setrule("SoupCleaner")
g.run(10)
g.setrule("B3/S23")
g.setalgo("HashLife")
g.setbase(8)
g.setstep(2)
sls = CountSL()
g.reset()
for sl in sls:
if(len(sl)) >= MINIMAL_BITS_TO_REPORT:
SortSL(sl)
if data[0].has_key(str(sl)):
key = data[4][str(sl)]
for d in data[1]:
if str(d[0]) == key:
if len(d[1]) < MAXIMAL_REPORTS_PER_SINGLE_SL:
d[1].append(curPat)
break
else:
if not IsTrivial(sl):
data[2] += 1
data[1].append([sl, [curPat]])
cl = ""
for o in xrange(0, MAXIMAL_OSCILLATION_PERIOD_TO_STORE):
g.new("")
PlaceSL(sl,0,0)
g.run(o)
cl = g.getcells(g.getrect())
for i in xrange(-1, 2, 2):
for j in xrange(-1, 2, 2):
g.new("")
g.putcells(cl, 0,0,i, 0, 0, j)
sl0 = CountSL()[0]
SortSL(sl0)
data[0][str(sl0)] = 1
data[4][str(sl0)] = str(sl)
g.new("")
g.putcells(cl, 0,0,0, i, j, 0)
sl0 = CountSL()[0]
SortSL(sl0)
data[0][str(sl0)] = 1
data[4][str(sl0)] = str(sl)
g.show(str(data[2]))
g.update()
SaveData()
if data[2] % CREATE_REPORT_EVERY == 0:
d = 0
dy = -1000
g.new("")
g.setrule("LifeHistory")
linehor = g.parse(str(REPORT_SQUARE_SIZE) + "F!")
linever = g.parse(str(REPORT_SQUARE_SIZE) + "F!", 0, 0, 0, 1, 1, 0)
for patList in data[1]:
yDist = 0
for pat in patList[1]:
g.putcells(pat, d, yDist)
g.putcells(linehor, d - REPORT_SQUARE_SIZE / 2, yDist - REPORT_SQUARE_SIZE / 2)
g.putcells(linever, d - REPORT_SQUARE_SIZE / 2, yDist - REPORT_SQUARE_SIZE / 2)
g.putcells(linehor, d - REPORT_SQUARE_SIZE / 2, yDist + REPORT_SQUARE_SIZE / 2)
g.putcells(linever, d + REPORT_SQUARE_SIZE / 2, yDist - REPORT_SQUARE_SIZE / 2)
yDist += REPORT_SQUARE_SIZE
PlaceSL(patList[0], -1200, dy + 10)
PlaceNumber(int(d / REPORT_SQUARE_SIZE), -1100, dy)
PlaceNumber(len(patList[0]), -1000, dy)
PlaceSL(patList[0], d, -600)
PlaceNumber(int(d / REPORT_SQUARE_SIZE), d + 100, -600)
d += REPORT_SQUARE_SIZE
dy += 60
prefix = ""
power = 10
for i in xrange (0, 5):
if data[2] < power:
prefix += "0"
power *= 10
fname = os.path.join(SCRIPT_RESULTS_DIRECTORY,"GliderSoupResults" + prefix + str(data[2]) + ".rle")
g.save(fname, "rle")
g.new("")
g.setrule("B3/S23")
g.setalgo("HashLife")
g.setbase(8)
g.setstep(3)
endTime = time.time()
totalTime += endTime - startTime
if numRuns % SHOW_PERFORMACE_EVERY == 0:
g.show("Soups per second = " + "{0:.2f}".format(numRuns / totalTime) + "; Num total successes = " + str(data[2]))
g.update()