
Random Glider Collision Search

Posted: November 22nd, 2014, 7:06 pm
by simsim314
Here is my version of a "soup search". The script generates long, random glider streams; after the collision it collects SLs (or oscillators) with 16 bits or more.

Code: Select all



import random 
import golly as g 
import copy 
import os 
import json
import time 

#The distance from which gliders start to be placed
STRATING_DISTANCE_OF_FIRST_GLIDER = 35

#Each glider arm has gliders. Its minimal and maximal numbers are defined here. 
#The actual value is set to a random number between min and max. 
MINIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT = 6
MAXIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT = 22

#Each glider is placed not only at a random distance but also offset within (-x, x). This is the variance constant. 
RANGE_VARIANCE_OF_GLIDERS = 25

#Each glider is placed at some distance from the previous glider. The minimal and maximal distances are defined here. 
#The actual value is random between min and max. Notice that a minimum below 5 has a chance to place two gliders on top of one another. 
MINIMAL_DISTANCE_BETWEEN_TWO_GLIDERS = 5
MAXIMAL_DISTANCE_BETWEEN_TWO_GLIDERS = 21

#If the soup continues to evolve after this number of generations, the script generates a new soup. 
MAXIMAL_GENERATION_EVOLVE_TO_STABILITY = 60 * 512

#The directory where the results will be placed.
SCRIPT_RESULTS_DIRECTORY = os.path.join(g.getdir("patterns"),"ScriptResults")

#The directory where the json dump file will be placed. 
JSON_DUMP_DIRECTORY = os.path.join(g.getdir("patterns"),"ScriptResults")

#The minimal number of bits for an object to be considered a "finding" and included in the report. 
MINIMAL_BITS_TO_REPORT = 16

#Each SL has several soups reported for it. This is the maximal number to report. 
MAXIMAL_REPORTS_PER_SINGLE_SL = 5

#Each SL is evolved this number of times and saved in all 8 symmetries. Made to avoid multiple reports of the same oscillator. 
MAXIMAL_OSCILLATION_PERIOD_TO_STORE = 6

#Create report every N successes. This is N. 
CREATE_REPORT_EVERY = 5

#Show performance in soups per second after collecting this number of soups. 
SHOW_PERFORMACE_EVERY = 1000

#The square size in which we assume the final soup and the initial glider configuration will fit. 
REPORT_SQUARE_SIZE = 1000

g.new("")
g.setbase(8)
g.setstep(2)
g.setrule("B3/S23")
g.setalgo("HashLife")

def CreateSoupRule():
   fname = os.path.join(g.getdir("rules"), "SoupCleaner.rule")
   with open(fname, 'wb') as f:
      f.write('@RULE SoupCleaner\n')
      f.write('@TABLE\n')
      f.write('\n')
      f.write('n_states:27\n')
      f.write('neighborhood:Moore\n')
      f.write('symmetries:rotate4reflect\n')
      f.write('\n')
      f.write('var a = {0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26}\n')
      f.write('var b = {a}\n')
      f.write('var c = {a}\n')
      f.write('var d = {a}\n')
      f.write('var e = {a}\n')
      f.write('var f = {a}\n')
      f.write('var g = {a}\n')
      f.write('var h = {a}\n')
      f.write('\n')
      f.write('var known1 = {1,7,9,12,14,16,19,22,24,25,26}\n')
      f.write('var s1 = {5,7,9,12,14,16,19,22,24,25,26}\n')
      f.write('var s0 = {6,8,11,13,15,17,18,20,21,23}\n')
      f.write('\n')
      f.write('#2 - 0 with one or two neighbours \n')
      f.write('#3 - confirmed 1 \n')
      f.write('#4 - confirmed deletion \n')
      f.write('\n')
      f.write('#5 - suspected block \n')
      f.write('\n')
      f.write('#6 - suspected beehive inside\n')
      f.write('#7 - suspected beehive outside (1)\n')
      f.write('#8 - confirmed beehive inside\n')
      f.write('\n')
      f.write('#9 - suspected blinker\n')
      f.write('#10 - suspected blinker step 2\n')
      f.write('\n')
      f.write('#11 - suspected boat inside\n')
      f.write('#12 - suspected boat outside (1)\n')
      f.write('\n')
      f.write('#13 - suspected ship inside\n')
      f.write('#14 - suspected ship outside (1)\n')
      f.write('\n')
      f.write('#15 - suspected loaf inside\n')
      f.write('#16 - suspected loaf outside (1)\n')
      f.write('#17 - confirmed loaf inside\n')
      f.write('\n')
      f.write('#18 - suspected pond inside\n')
      f.write('#19 - suspected pond outside (1)\n')
      f.write('#20 - confirmed pond inside\n')
      f.write('\n')
      f.write('#21 - suspected tub inside\n')
      f.write('#22 - suspected tub outside (1)\n')
      f.write('\n')
      f.write('#23 - suspected glider1 inside\n')
      f.write('#24 - suspected glider1 outside (1)\n')
      f.write('\n')
      f.write('#25 - suspected glider2 inside\n')
      f.write('#26 - suspected glider2 outside (1)\n')
      f.write('\n')
      f.write('#initial read\n')
      f.write('0,1,0,0,0,0,0,0,0,2\n')
      f.write('0,1,1,0,0,0,0,0,0,2\n')
      f.write('0,0,1,1,1,0,0,0,0,2\n')
      f.write('\n')
      f.write('#block deletion\n')
      f.write('1,2,2,1,1,1,2,2,0,5\n')
      f.write('5,5,5,5,0,0,0,0,0,0\n')
      f.write('5,a,b,c,d,e,f,g,h,3\n')
      f.write('\n')
      f.write('#beehive deletion\n')
      f.write('0,1,0,1,1,0,1,1,0,6\n')
      f.write('1,2,0,0,1,6,1,0,0,7\n')
      f.write('1,0,0,2,2,1,6,6,1,7\n')
      f.write('6,7,0,7,7,6,7,7,0,8\n')
      f.write('8,7,0,7,7,8,7,7,0,4\n')
      f.write('\n')
      f.write('7,1,b,c,d,e,f,g,h,3\n')
      f.write('7,a,1,c,d,e,f,g,h,3\n')
      f.write('\n')
      f.write('#blinker deletion\n')
      f.write('1,2,0,2,2,1,2,2,0,9\n')
      f.write('1,1,2,2,2,1,2,2,2,9\n')
      f.write('9,9,0,0,0,9,0,0,0,10\n')
      f.write('10,9,0,0,0,9,0,0,0,4\n')
      f.write('10,a,b,c,d,e,f,g,h,3\n')
      f.write('\n')
      f.write('#boat deletion\n')
      f.write('0,1,0,1,1,1,0,1,0,11\n')
      f.write('1,2,0,0,1,11,1,0,0,12\n')
      f.write('1,0,0,2,2,1,1,11,1,12\n')
      f.write('1,1,2,2,0,2,2,1,11,12\n')
      f.write('11,12,0,12,12,12,0,12,0,4\n')
      f.write('\n')
      f.write('#ship deletion\n')
      f.write('0,1,1,1,0,1,1,1,0,13\n')
      f.write('1,2,2,1,1,13,1,0,0,14\n')
      f.write('1,2,0,2,2,1,13,1,2,14\n')
      f.write('13,14,14,14,0,14,14,14,0,4\n')
      f.write('\n')
      f.write('#Loaf deletion\n')
      f.write('0,1,1,0,0,1,0,1,0,15\n')
      f.write('0,1,0,1,1,0,1,0,1,15\n')
      f.write('\n')
      f.write('1,2,2,1,15,15,1,0,0,16\n')
      f.write('1,2,0,0,1,15,15,1,2,16\n')
      f.write('1,15,1,0,0,2,0,0,1,16\n')
      f.write('1,15,15,15,1,0,0,0,1,16\n')
      f.write('\n')
      f.write('15,16,16,15,15,16,0,16,0,17\n')
      f.write('15,16,0,16,16,15,16,15,16,17\n')
      f.write('\n')
      f.write('17,16,16,17,17,16,0,16,0,4\n')
      f.write('17,16,0,16,16,17,16,17,16,4\n')
      f.write('\n')
      f.write('#Pond deletion\n')
      f.write('0,1,1,0,0,0,1,1,0,18\n')
      f.write('1,2,2,1,18,18,1,0,0,19\n')
      f.write('\n')
      f.write('18,19,19,18,18,18,19,19,0,20\n')
      f.write('20,20,20,20,19,19,0,19,19,4\n')
      f.write('\n')
      f.write('#tub deletion\n')
      f.write('0,1,0,1,0,1,0,1,0,21\n')
      f.write('1,2,0,0,1,21,1,0,0,22\n')
      f.write('\n')
      f.write('21,22,0,22,0,22,0,22,0,4\n')
      f.write('\n')
      f.write('#glider1 deletion\n')
      f.write('0,1,0,1,1,1,1,0,0,23\n')
      f.write('\n')
      f.write('1,2,2,1,1,23,1,0,0,24\n')
      f.write('1,2,0,2,2,1,23,1,2,24\n')
      f.write('1,1,2,2,2,1,0,23,1,24\n')
      f.write('1,0,1,23,0,2,0,2,0,24\n')
      f.write('1,1,2,2,0,2,0,0,23,24\n')
      f.write('\n')
      f.write('23,24,24,24,24,0,0,24,0,4\n')
      f.write('\n')
      f.write('#glider2 deletion\n')
      f.write('1,1,0,1,0,0,1,0,1,25\n')
      f.write('\n')
      f.write('1,2,2,1,25,0,0,2,0,26\n')
      f.write('1,2,0,0,1,25,0,1,2,26\n')
      f.write('1,0,0,2,0,2,0,25,1,26\n')
      f.write('1,0,25,0,0,2,0,2,0,26\n')
      f.write('\n')
      f.write('25,1,0,1,2,0,1,0,1,25\n')
      f.write('25,26,0,26,0,0,26,0,26,4\n')
      f.write('\n')
      f.write('#SL support \n')
      f.write('2,a,b,c,d,e,f,g,h,0\n')
      f.write('\n')
      f.write('1,2,a,b,c,e,f,g,h,3\n')
      f.write('known1,3,a,b,c,e,f,g,h,3\n')
      f.write('known1,a,3,b,c,e,f,g,h,3\n')
      f.write('\n')
      f.write('s0,3,b,c,d,e,f,g,h,0\n')
      f.write('s0,b,3,c,d,e,f,g,h,0\n')
      f.write('s1,1,b,c,d,e,f,g,h,3\n')
      f.write('s1,b,1,c,d,e,f,g,h,3\n')
      f.write('s1,3,b,c,d,e,f,g,h,3\n')
      f.write('s1,b,3,c,d,e,f,g,h,3\n')
      f.write('1,s0,b,c,d,e,f,g,h,3\n')
      f.write('1,b,s0,c,d,e,f,g,h,3\n')

      f.write('\n')
      f.write('#universal deletion \n')
      f.write('4,a,b,c,d,e,f,g,h,0\n')
      f.write('a,4,b,c,d,e,f,g,h,0\n')
      f.write('a,b,4,c,d,e,f,g,h,0\n')

snakeLineHor = g.parse("2obob2obo$ob2obob2o!")
snakeLineVer = g.transform(snakeLineHor, -3, 3, 0, 1, 1, 0)

figure8 = [snakeLineVer, g.transform(snakeLineVer, 0, 13),  snakeLineHor, g.transform(snakeLineHor, 0, 13), g.transform(snakeLineHor, 0, 26), g.transform(snakeLineVer, 13, 0), g.transform(snakeLineVer, 13, 13)]

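#Seven-segment digit rendering: figure8 holds the seven snake-line segments,
#and digitIdx below lists which segments are lit for each digit 0-9.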
def PlaceDigit(digit, x = 0, y = 0):
   digitIdx = [[0,1,2,4,5,6], [5,6],[1,2,3,4,5],[2,3,4,5,6],[0,3,5,6],[0,2,3,4,6],[0,1,2,3,4,6],[2,5,6],[0,1,2,3,4,5,6],[0,2,3,4,5,6]]
   
   if digit >= 0 and digit <= 9:
      for idx  in digitIdx[digit]:
         g.putcells(figure8[idx], x, y)
def NumDigit(num):
   if num < 10: 
      return 1 
   else: 
      return 1 + NumDigit(int((num - (num % 10))/10))
      
def PlaceNumber(number, x = 0, y = 0):
   curNum = number
   d = 20 * NumDigit(number)
   
   while True:
      PlaceDigit(curNum%10, x + d, y)
      curNum = (curNum - curNum%10) / 10 
      
      if curNum == 0:
         return 
      
      d -= 20

gld = g.parse("3o$o$bo")

glds = [] 
soupIdx = 0 
data = [{}, [], 0, 0, {}]

for i in xrange(0, 4):
   glds.append(g.evolve(gld, i))

def FindActive():

   rect = g.getrect()
   
   cells = g.getcells([rect[0], rect[1], rect[2], 1])
   return [cells[0], cells[1]]
         
def FindConnected(listXY, step2 = True):
   result = copy.copy(listXY)

   for xy in listXY:
      x = xy[0]
      y = xy[1]
      
      for i in xrange(-2, 3): 
         for j in xrange(-2, 3): 
              
            if (abs(i) == 2 and abs(j) >= 1) or (abs(j) == 2 and abs(i) >= 1):
               continue 

            if not step2: 
               if abs(i) == 2 or abs(j) == 2:
                  continue 

            if g.getcell(x + i, y + j) > 0 and len([i for t in listXY if (t[0] == x + i and t[1] == y + j)]) == 0:
               if len([i for t in result if (t[0] == x + i and t[1] == y + j)]) == 0:
                  result.append([x + i, y + j])

   return result
   
def RemoveList(listXY):

   for xy in listXY:
      x = xy[0]
      y = xy[1]
      g.setcell(x, y, 0)

def CountSL(step2 = True):
   
   result = []
   while int(g.getpop()) > 0: 
      xy = [FindActive()]
      
      while True: 
        newXY = FindConnected(xy, step2)
        
        #g.getstring(str(xy) + " : " + str(newXY) )
        
        if len(newXY) == len(xy):
          break
          
        xy = newXY
      
      result.append(xy)
      RemoveList(xy)
     
   return result
   
def GliderSoup():
   
   global data
   g.new(str(data[3]))
   data[3] += 1
   
   for i in xrange(-1, 2, 2):
      for j in xrange(-1, 2, 2):
         d = STRATING_DISTANCE_OF_FIRST_GLIDER 
         numGld = random.randint(MINIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT, MAXIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT)
         
         for k in xrange(0, numGld):
            x = d * i  + random.randint(-RANGE_VARIANCE_OF_GLIDERS, RANGE_VARIANCE_OF_GLIDERS)
            y = d * j 
            g.putcells(glds[random.randint(0, 3)], x, y, i, 0, 0, j)
            d += random.randint(MINIMAL_DISTANCE_BETWEEN_TWO_GLIDERS, MAXIMAL_DISTANCE_BETWEEN_TWO_GLIDERS)


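#Stability test: run the soup in chunks of 15 steps (step size 8^2 = 64
#generations); if the population is unchanged after a chunk, confirm by
#comparing the cell list inside the report square after one more chunk.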
def EvolveToStability():
   g.setbase(8)
   g.setstep(2)
   
   while int(g.getgen()) < MAXIMAL_GENERATION_EVOLVE_TO_STABILITY:

      pop = int(g.getpop())
      
      for i in xrange(0, 15):
         g.step()
      
      if int(g.getpop()) == pop:
         cells = str(g.getcells([-REPORT_SQUARE_SIZE / 2, -REPORT_SQUARE_SIZE / 2 , REPORT_SQUARE_SIZE, REPORT_SQUARE_SIZE]))

         for i in xrange(0, 15):
            g.step()

         if cells == str(g.getcells([-REPORT_SQUARE_SIZE / 2, -REPORT_SQUARE_SIZE / 2 , REPORT_SQUARE_SIZE, REPORT_SQUARE_SIZE])):
            return True

   return False
   
def SortSL(sl):

   sl.sort(key=lambda r: 100000 * r[0] + r[1])
   
   x1 = sl[0][0]
   y1 = sl[0][1]
   
   for o in xrange(0, len(sl)):
      sl[o][0] -= x1
      sl[o][1] -= y1

def PlaceSL(sl, x, y, make0 = False):
   
   for xy in sl:
      x1 = xy[0]
      y1 = xy[1]
      
      if make0:
         g.setcell(x1 + x, y1 + y, 0)
      else:
         g.setcell(x1 + x, y1 + y, 1)
         
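#An object is considered "trivial" if it splits into several pieces and the
#pattern stays stable (period 1 or 2) after any single piece is removed.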
def IsTrivial(sl):
   g.new("")
   PlaceSL(sl, 0, 0)
   
   slList = CountSL(False)
   
   if len(slList) == 1:
      return False
   
   for s in slList:
      g.new("")
      PlaceSL(sl, 0, 0)
      PlaceSL(s, 0, 0, True)
      g.run(2)
      cells = g.getcells(g.getrect())
      g.run(2)
      if str(g.getcells(g.getrect())) != str(cells):
         return False
   
   return True
   


if not os.path.exists(SCRIPT_RESULTS_DIRECTORY):
    os.makedirs(SCRIPT_RESULTS_DIRECTORY)

if not os.path.exists(JSON_DUMP_DIRECTORY):
    os.makedirs(JSON_DUMP_DIRECTORY)

def SaveData():
   global data
   fname = os.path.join(JSON_DUMP_DIRECTORY,"GliderSoupResultsData.json")
   with open(fname, 'wb') as f:
      json.dump(data, f)

def LoadData():
   global data
   fname = os.path.join(JSON_DUMP_DIRECTORY,"GliderSoupResultsData.json")
   
   if os.path.exists(fname):
      with open(fname, 'r') as f:
         data = json.load(f)


LoadData()
CreateSoupRule()

totalTime = 0 
numRuns = 0 

while True:
   numRuns += 1
   startTime = time.time()

   GliderSoup()
   curPat = g.getcells(g.getrect())
   if EvolveToStability():
      
      g.setrule("SoupCleaner")
      g.run(10)

      g.setrule("B3/S23")
      g.setalgo("HashLife")
      g.setbase(8)
      g.setstep(2)

      sls = CountSL()

      g.reset()
        
      for sl in sls:
        if len(sl) >= MINIMAL_BITS_TO_REPORT:

          SortSL(sl)
          
          if data[0].has_key(str(sl)):
            key = data[4][str(sl)]
            
            for d in data[1]:
               if str(d[0]) == key:
                 if len(d[1]) < MAXIMAL_REPORTS_PER_SINGLE_SL:
                   d[1].append(curPat)
                   
                 break
          else:
            if not IsTrivial(sl):
               data[2] += 1
               data[1].append([sl, [curPat]])
               
               cl = ""
               
               for o in xrange(0, MAXIMAL_OSCILLATION_PERIOD_TO_STORE):
                 
                 g.new("")
                 PlaceSL(sl,0,0)
                 g.run(o)
                 cl = g.getcells(g.getrect())
                 
                 for i in xrange(-1, 2, 2):
                   for j in xrange(-1, 2, 2):
                     g.new("")
                     g.putcells(cl, 0,0,i, 0, 0, j)
                     sl0 = CountSL()[0]
                     SortSL(sl0)
                     data[0][str(sl0)] = 1
                     data[4][str(sl0)] = str(sl)
                     g.new("")
                     g.putcells(cl, 0,0,0, i, j, 0)
                     sl0 = CountSL()[0]
                     SortSL(sl0)
                     data[0][str(sl0)] = 1
                     data[4][str(sl0)] = str(sl)
                     
               g.show(str(data[2]))
               g.update()

               SaveData()

               if data[2] % CREATE_REPORT_EVERY == 0:
                 d = 0 
                 dy = -1000
                 g.new("")
                 g.setrule("LifeHistory")
                 
                 linehor = g.parse(str(REPORT_SQUARE_SIZE) + "F!")
                 linever = g.parse(str(REPORT_SQUARE_SIZE) + "F!", 0, 0, 0, 1, 1, 0)
                 
                 for patList in data[1]:
                   
                   yDist = 0 
                   
                   for pat in patList[1]:
                     g.putcells(pat, d, yDist)
                     
                     g.putcells(linehor, d - REPORT_SQUARE_SIZE / 2, yDist - REPORT_SQUARE_SIZE / 2)
                     g.putcells(linever, d - REPORT_SQUARE_SIZE / 2, yDist - REPORT_SQUARE_SIZE / 2)
                     
                     g.putcells(linehor, d - REPORT_SQUARE_SIZE / 2, yDist + REPORT_SQUARE_SIZE / 2)
                     g.putcells(linever, d + REPORT_SQUARE_SIZE / 2, yDist - REPORT_SQUARE_SIZE / 2)
                     
                     yDist += REPORT_SQUARE_SIZE 
                     
                     
                   PlaceSL(patList[0], -1200, dy + 10)
                   PlaceNumber(int(d / REPORT_SQUARE_SIZE), -1100, dy)
                   PlaceNumber(len(patList[0]), -1000, dy)
                   
                   PlaceSL(patList[0], d, -600)
                   PlaceNumber(int(d / REPORT_SQUARE_SIZE), d + 100, -600)
                   
                   d += REPORT_SQUARE_SIZE 
                   dy += 60
                 
                 prefix = ""
                 power = 10
                 
                 for i in xrange (0, 5):
                   if data[2] < power:
                     prefix += "0"
                   power *= 10
                  
                 fname = os.path.join(SCRIPT_RESULTS_DIRECTORY,"GliderSoupResults" + prefix + str(data[2]) + ".rle")

                 g.save(fname, "rle")
                 
                 g.new("")
                 g.setrule("B3/S23")
                 g.setalgo("HashLife")
                 g.setbase(8)
                 g.setstep(3)

   endTime = time.time()
   totalTime += endTime - startTime
   
   if numRuns % SHOW_PERFORMACE_EVERY == 0: 
      g.show("Soups per second = " + "{0:.2f}".format(numRuns / totalTime) + "; Num total successes = " + str(data[2]))
      g.update()


It's an incremental script, with save and load capability. The longer you run it, the more interesting results it will give (as it doesn't report the same result twice).

The results are located in your pattern folder, under the ScriptResults directory, with the prefix GliderSoupResults. The script currently saves a report every 5 successes, but the underlying data is saved on each success.

Stopping the script will not erase the analyzed data, as it's saved into GliderSoupResultsData.json (in the same folder) and loaded on start.
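
For reference, here is the layout of the saved "data" list, as used throughout the script (the annotations are mine):

Code: Select all

data = [{},  #data[0]: every seen SL variant (all phases/orientations), keyed by str(sl)
        [],  #data[1]: list of [sl, [soup1, soup2, ...]] - each found SL with up to 5 soups
        0,   #data[2]: number of non-trivial successes so far
        0,   #data[3]: running soup index, used to name each new universe
        {}]  #data[4]: maps each variant's str(sl0) back to its canonical str(sl) key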

---

You can get the results files and the json here

---
A few words on how to read the result report (see GliderSoupResults000105.rle, attached).

On the left you can see all the SLs that were found, with their index and bit count. If you want to see a specific synthesis, go to the column with that index. Above each column is the SL that was found, together with its index, so it's very easy to find. Each SL can have up to 5 "recipes", and each recipe in the column produces the specified SL.


---

Feature history:

1. Rotations and symmetries removed. Oscillators are shown once.
2. Numbering and results report added.
3. Stores up to 5 syntheses per SL (to have additional options in case some rare SL synthesis is found).
4. Trivial cases removed.
5. Bit count added to the report header.
6. Logs every 5 successes (as the non-trivial results are rarer).
7. File operations support all OSes.
8. setbase added to support other defaults.
9. Small performance boost.
10. Reports in LifeHistory format - doesn't allow interactions between the results.
11. Informative constants (with comments) added at the top of the script for tweaking.
12. Oscillation support for multiples of 5 added (64, 3, 5 are now supported).
13. Small performance boost, by reducing the max generation + #12.
14. Creates the SoupCleaner rule file in the rules folder and uses it to clear common objects. Major performance boost.
15. Performance report added + SHOW_PERFORMACE_EVERY constant added.

Re: Random Glider Collision Search

Posted: November 22nd, 2014, 7:21 pm
by simsim314
Here are some samples of "cleaned up" results:

Code: Select all

x = 235, y = 46, rule = B3/S23
12bo$10bobo$11b2o5$bo$2bo$3o4$131b3o6$153bo$152b2o$151b2o$152bo$153bo
2$225b3o3b3o2$229bo$127b2o100bo$127b2o100bo$131b2o$130bo2bo$131b2o$43b
o$42bobo186b3o$42bobo186bobo$31bo11bo187bo2bo$30bobo199b2o$31bo6b2o7b
2o$37bo2bo5bo2bo$38b2o7b2o2$26b2o15bo$26b2o14bobo$42bobo$43bo!
EDIT Noticed that sometimes a *WSS emitted from one interaction collides with another and ruins it. If you have something suspicious, check for this possibility.

Another point to notice: an SL is defined by close proximity as well, so two blocks at distance 1 are considered a single SL. This gives some false positives, but avoids missing some interesting SLs.
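
For the record, the proximity rule in FindConnected (with step2 = True) treats two cells as connected if one is in the other's radius-1 Moore neighborhood, or exactly two cells away orthogonally. A minimal sketch of the predicate (the helper name is mine, not from the script):

Code: Select all

#Offsets that FindConnected treats as "connected": the 8 Moore neighbors
#plus the 4 cells two steps away orthogonally.
def IsConnectedOffset(i, j):
   if i == 0 and j == 0:
      return False
   if (abs(i) == 2 and abs(j) >= 1) or (abs(j) == 2 and abs(i) >= 1):
      return False
   return abs(i) <= 2 and abs(j) <= 2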

Re: Random Glider Collision Search

Posted: November 23rd, 2014, 12:55 pm
by dvgrn
simsim314 wrote:Here is my version of a "soup search". The script generates long, random glider streams; after the collision it collects SLs (or oscillators) with 16 bits or more.
...
EDIT Added a few fixes.
A few small items:

1) Here again it's a good idea to do g.setbase(8) before your g.setstep(3) -- on my system the script crashes because EvolveToStability() doesn't run far enough to stabilize most patterns.
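
For example (g.setbase and g.setstep are standard Golly script commands):

Code: Select all

g.setbase(8)   #make the step exponent base 8...
g.setstep(3)   #...so each g.step() advances 8^3 = 512 generations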

2) Your file handling is Windows-specific, I think, with all the double backslashes. If you use os.path.join for lines like

directory = os.path.join(g.getdir("patterns"),"ScriptResults")

then the script should run equally well on Mac, Linux or Windows.

Re: Random Glider Collision Search

Posted: November 23rd, 2014, 1:03 pm
by simsim314
dvgrn wrote:A few small items:
Thx will fix soon.

Small but important note - the script is in rapid development, so when you take a new version, please delete the GliderSoupResultsData.json file from the ScriptResults folder.

EDIT
@dvgrn
OK, this has all been fixed now.

EDIT2
To me the script looks like good beta quality. Not too much junk is reported. Orientation and oscillation are taken into account and saved. Save and load work nicely. The report looks very informative and allows easy access to the results. A few syntheses per SL (if found). Looks very good already.

The main functionality missing is sharing + multi-core. It would be nice if people shared the GliderSoupResultsData.json files they've got, so the application could combine all the results together and continue looking from there. New people using the combined data could then continue the search from where others have stopped.
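
A minimal sketch of what such a combine step could look like, given the data layout above - just extracting the stored soups from someone else's file and replaying them through the normal search pipeline (the function name is my own illustration):

Code: Select all

import json

def ExtractSoups(fname):
   #load another user's GliderSoupResultsData.json and pull out its soups
   with open(fname, 'r') as f:
      otherData = json.load(f)
   soups = []
   for entry in otherData[1]:   #entries are [sl, [soup1, soup2, ...]]
      soups.extend(entry[1])
   return soups

#Putting each returned soup into a fresh universe and running the usual
#EvolveToStability/CountSL path would merge the findings without
#double-counting, since data[0] already records every seen SL variant.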

Let's see the adoption of this script by the community.

Re: Random Glider Collision Search

Posted: November 23rd, 2014, 1:45 pm
by dvgrn
simsim314 wrote:Small but important note - the script is in rapid development, so when you take a new version, please delete the GliderSoupResultsData.json file from the ScriptResults folder.
I had a little trouble finding the ScriptResults folder, because I've set my Golly Patterns folder to a directory outside of the Golly folder, so that I can use the treeview to browse other pattern collections. A more standard location for things like ScriptResults would be

os.path.join(g.getdir("data"),"ScriptResults")

though I admit that that can be annoying to find also, depending on the OS.

My first report RLE showed a strange-looking pulsar core instead of the full pulsar.

24 to 80+ gliders seems like more than enough to get randomish soup going at the center of four salvos, most of the time. Have you checked yet to see if you get a similar density and distribution of interesting objects with half of the input gliders?

Re: Random Glider Collision Search

Posted: November 23rd, 2014, 2:13 pm
by simsim314
dvgrn wrote:My first report RLE showed a strange-looking pulsar core instead of the full pulsar.
Yep, the pulsar core is an annoying bug. The script is not optimized to work with oscillators, although it supports them.

Consider the pulsar core a false positive - there is plenty of other filtering done, so this small issue (as well as the HWSS) is just a minor annoyance.

If you let it run long enough, it gives very nice SLs without mess (like three beehives in a row, etc.).

I also recently changed the report format to "LifeHistory" with boundaries - so check it out.
dvgrn wrote: Have you checked yet to see if you get a similar density and distribution of interesting objects with half of the input gliders?
Hmm, no - but I see the reason to do it. I wonder if I should let the user decide? How do we know when there are "too many" gliders, or too few? I also thought about adding *WSS input - but I'm not sure I like the idea.

Re: Random Glider Collision Search

Posted: November 23rd, 2014, 3:43 pm
by simsim314
dvgrn wrote:Have you checked yet to see if you get a similar density and distribution of interesting objects with half of the input gliders?....though I admit that that can be annoying to find also, depending on the OS.
This all seems to be the result of missing settings. Maybe I'll export all the options into a GUI. There aren't many options really:

0. Starting distance of the first glider (number)
1. Range of number of gliders (min, max)
2. Disturbance variance of each glider (-number, number)
3. Distance variance between gliders (min, max)
4. Maximal generation to evolve to stability (number)
5. Script results directory (directory name)
6. Data store directory for json dump + last settings (directory name)
7. Minimal bit-count to report (number)
8. Maximal number of soups to report for a specific SL (number)
9. Maximal oscillation period to include (number)
10. Create report file every (number)
11. Size of reporting square (number)

Maybe a *WSS flag with similar parameters to the gliders. But I really like the gliders-only approach - it seems more elegant in some way.

EDIT I have no time (or motivation) to write a tinkering GUI, so I just added the above constants at the top of the script, with comments and naming, so that everyone who wants to tweak some "magic number" can easily do so without going deep into the script itself.

Notice that it's also possible to stop the script, tweak some number, and continue the execution, as the result list has nothing to do with the initial glider soup parameters. No need to delete the json - that was only needed a couple of times before (when a major change in the report design was introduced).

Re: Random Glider Collision Search

Posted: November 24th, 2014, 7:13 am
by simsim314
Here is a significant performance boost using Adam's trick.

I've created a rule with 27 states that cleans all the common ash objects: blocks, blinkers, tubs, ships, beehives, loaves, boats, ponds and gliders. After the soup has settled, the script runs it for 10 generations (most of the time 5 would suffice; the extra is for the rarest cases), cleaning all the common ash, and then the SL reader analyzes what's left.

This really boosts the performance, by a factor of 5-10.
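
In the main script the cleanup pass is just a rule switch around the census, along these lines:

Code: Select all

g.setrule("SoupCleaner")   #switch the settled soup to the 27-state cleanup rule
g.run(10)                  #10 generations is enough to delete all the common ash
g.setrule("B3/S23")        #back to Life for the still-life census
g.setalgo("HashLife")
sls = CountSL()            #only uncommon objects are left to analyze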

Here is the rule:

Code: Select all


@RULE SoupCleaner

@TABLE

n_states:27
neighborhood:Moore
symmetries:rotate4reflect

var a = {0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26}
var b = {a}
var c = {a}
var d = {a}
var e = {a}
var f = {a}
var g = {a}
var h = {a}

var known1 = {1,7,9,12,14,16,19,22,24,25,26}
var s1 = {5,7,9,12,14,16,19,22,24,25,26}
var s0 = {6,8,11,13,15,17,18,20,21,23}

#2 - 0 with one or two neighbours 
#3 - confirmed 1 
#4 - confirmed deletion 

#5 - suspected block 

#6 - suspected beehive inside
#7 - suspected beehive outside (1)
#8 - confirmed beehive inside

#9 - suspected blinker
#10 - suspected blinker step 2

#11 - suspected boat inside
#12 - suspected boat outside (1)

#13 - suspected ship inside
#14 - suspected ship outside (1)

#15 - suspected loaf inside
#16 - suspected loaf outside (1)
#17 - confirmed loaf inside

#18 - suspected pond inside
#19 - suspected pond outside (1)
#20 - confirmed pond inside

#21 - suspected tub inside
#22 - suspected tub outside (1)

#23 - suspected glider1 inside
#24 - suspected glider1 outside (1)

#25 - suspected glider2 inside
#26 - suspected glider2 outside (1)

#initial read
0,1,0,0,0,0,0,0,0,2
0,1,1,0,0,0,0,0,0,2
0,0,1,1,1,0,0,0,0,2

#block deletion
1,2,2,1,1,1,2,2,0,5
5,5,5,5,0,0,0,0,0,0
5,a,b,c,d,e,f,g,h,3

#beehive deletion
0,1,0,1,1,0,1,1,0,6
1,2,0,0,1,6,1,0,0,7
1,0,0,2,2,1,6,6,1,7
6,7,0,7,7,6,7,7,0,8
8,7,0,7,7,8,7,7,0,4

7,1,b,c,d,e,f,g,h,3
7,a,1,c,d,e,f,g,h,3

#blinker deletion
1,2,0,2,2,1,2,2,0,9
1,1,2,2,2,1,2,2,2,9
9,9,0,0,0,9,0,0,0,10
10,9,0,0,0,9,0,0,0,4
10,a,b,c,d,e,f,g,h,3

#boat deletion
0,1,0,1,1,1,0,1,0,11
1,2,0,0,1,11,1,0,0,12
1,0,0,2,2,1,1,11,1,12
1,1,2,2,0,2,2,1,11,12
11,12,0,12,12,12,0,12,0,4

#ship deletion
0,1,1,1,0,1,1,1,0,13
1,2,2,1,1,13,1,0,0,14
1,2,0,2,2,1,13,1,2,14
13,14,14,14,0,14,14,14,0,4

#Loaf deletion
0,1,1,0,0,1,0,1,0,15
0,1,0,1,1,0,1,0,1,15

1,2,2,1,15,15,1,0,0,16
1,2,0,0,1,15,15,1,2,16
1,15,1,0,0,2,0,0,1,16
1,15,15,15,1,0,0,0,1,16

15,16,16,15,15,16,0,16,0,17
15,16,0,16,16,15,16,15,16,17

17,16,16,17,17,16,0,16,0,4
17,16,0,16,16,17,16,17,16,4

#Pond deletion
0,1,1,0,0,0,1,1,0,18
1,2,2,1,18,18,1,0,0,19

18,19,19,18,18,18,19,19,0,20
20,20,20,20,19,19,0,19,19,4

#tub deletion
0,1,0,1,0,1,0,1,0,21
1,2,0,0,1,21,1,0,0,22

21,22,0,22,0,22,0,22,0,4

#glider1 deletion
0,1,0,1,1,1,1,0,0,23

1,2,2,1,1,23,1,0,0,24
1,2,0,2,2,1,23,1,2,24
1,1,2,2,2,1,0,23,1,24
1,0,1,23,0,2,0,2,0,24
1,1,2,2,0,2,0,0,23,24

23,24,24,24,24,0,0,24,0,4

#glider2 deletion
1,1,0,1,0,0,1,0,1,25

1,2,2,1,25,0,0,2,0,26
1,2,0,0,1,25,0,1,2,26
1,0,0,2,0,2,0,25,1,26
1,0,25,0,0,2,0,2,0,26

25,1,0,1,2,0,1,0,1,25
25,26,0,26,0,0,26,0,26,4

#SL support 
2,a,b,c,d,e,f,g,h,0

1,2,a,b,c,e,f,g,h,3
known1,3,a,b,c,e,f,g,h,3
known1,a,3,b,c,e,f,g,h,3

s0,3,b,c,d,e,f,g,h,0
s0,b,3,c,d,e,f,g,h,0
s1,1,b,c,d,e,f,g,h,3
s1,b,1,c,d,e,f,g,h,3
s1,3,b,c,d,e,f,g,h,3
s1,b,3,c,d,e,f,g,h,3

#universal deletion 
4,a,b,c,d,e,f,g,h,0
a,4,b,c,d,e,f,g,h,0
a,b,4,c,d,e,f,g,h,0



And here is the script that uses it (not yet merged into the main branch, as I want to create a function that builds the rule automatically and places it into the rules folder from inside the script itself).

Code: Select all


import random 
import golly as g 
import copy 
import os 
import json

#The distance from which gliders start to be placed
STRATING_DISTANCE_OF_FIRST_GLIDER = 35

#Each glider arm has gliders. Its minimal and maximal numbers are defined here. 
#The actual value is set to a random number between min and max. 
MINIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT = 6
MAXIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT = 22

#Each glider is placed not only at a random distance but also offset within (-x, x). This is the variance constant. 
RANGE_VARIANCE_OF_GLIDERS = 25

#Each glider is placed at some distance from the previous glider. The minimal and maximal distances are defined here. 
#The actual value is random between min and max. Notice that a minimum below 5 has a chance to place two gliders on top of one another. 
MINIMAL_DISTANCE_BETWEEN_TWO_GLIDERS = 5
MAXIMAL_DISTANCE_BETWEEN_TWO_GLIDERS = 21

#If the soup continues to evolve after this number of generations, the script generates a new soup. 
MAXIMAL_GENERATION_EVOLVE_TO_STABILITY = 60 * 512

#The directory where the results will be placed.
SCRIPT_RESULTS_DIRECTORY = os.path.join(g.getdir("patterns"),"ScriptResults")

#The directory where the json dump file will be placed. 
JSON_DUMP_DIRECTORY = os.path.join(g.getdir("patterns"),"ScriptResults")

#The minimal number of bits for an object to be considered a "finding" and included in the report. 
MINIMAL_BITS_TO_REPORT = 16

#Each SL has several soups reported for it. This is the maximal number to report. 
MAXIMAL_REPORTS_PER_SINGLE_SL = 5

#Each SL is evolved this number of times and saved in all 8 symmetries. Made to avoid multiple reports of the same oscillator. 
MAXIMAL_OSCILLATION_PERIOD_TO_STORE = 6

#Create report every N successes. This is N. 
CREATE_REPORT_EVERY = 5

#The square size in which we assume the final soup and the initial glider configuration will fit. 
REPORT_SQUARE_SIZE = 1000

g.new("")
g.setbase(8)
g.setstep(2)
g.setrule("B3/S23")
g.setalgo("HashLife")

snakeLineHor = g.parse("2obob2obo$ob2obob2o!")
snakeLineVer = g.transform(snakeLineHor, -3, 3, 0, 1, 1, 0)

figure8 = [snakeLineVer, g.transform(snakeLineVer, 0, 13),  snakeLineHor, g.transform(snakeLineHor, 0, 13), g.transform(snakeLineHor, 0, 26), g.transform(snakeLineVer, 13, 0), g.transform(snakeLineVer, 13, 13)]

def PlaceDigit(digit, x = 0, y = 0):
   digitIdx = [[0,1,2,4,5,6], [5,6],[1,2,3,4,5],[2,3,4,5,6],[0,3,5,6],[0,2,3,4,6],[0,1,2,3,4,6],[2,5,6],[0,1,2,3,4,5,6],[0,2,3,4,5,6]]
   
   if digit >= 0 and digit <= 9:
      for idx  in digitIdx[digit]:
         g.putcells(figure8[idx], x, y)
def NumDigit(num):
   if num < 10: 
      return 1 
   else: 
      return 1 + NumDigit(int((num - (num % 10))/10))
      
def PlaceNumber(number, x = 0, y = 0):
   curNum = number
   d = 20 * NumDigit(number)
   
   while True:
      PlaceDigit(curNum%10, x + d, y)
      curNum = (curNum - curNum%10) / 10 
      
      if curNum == 0:
         return 
      
      d -= 20

gld = g.parse("3o$o$bo")

glds = [] 
soupIdx = 0 
data = [{}, [], 0, 0, {}]

for i in xrange(0, 4):
   glds.append(g.evolve(gld, i))

def FindActive():

   rect = g.getrect()
   
   cells = g.getcells([rect[0], rect[1], rect[2], 1])
   return [cells[0], cells[1]]
         
def FindConnected(listXY, step2 = True):
   result = copy.copy(listXY)

   for xy in listXY:
      x = xy[0]
      y = xy[1]
      
      for i in xrange(-2, 3): 
         for j in xrange(-2, 3): 
              
            if (abs(i) == 2 and abs(j) >= 1) or (abs(j) == 2 and abs(i) >= 1):
               continue 

            if not step2: 
               if abs(i) == 2 or abs(j) == 2:
                  continue 

            if g.getcell(x + i, y + j) > 0 and len([i for t in listXY if (t[0] == x + i and t[1] == y + j)]) == 0:
               if len([i for t in result if (t[0] == x + i and t[1] == y + j)]) == 0:
                  result.append([x + i, y + j])

   return result
   
def RemoveList(listXY):

   for xy in listXY:
      x = xy[0]
      y = xy[1]
      g.setcell(x, y, 0)

def CountSL(step2 = True):
   
   result = []
   while int(g.getpop()) > 0: 
      xy = [FindActive()]
      
      while True: 
        newXY = FindConnected(xy, step2)
        
        #g.getstring(str(xy) + " : " + str(newXY) )
        
        if len(newXY) == len(xy):
          break
          
        xy = newXY
      
      result.append(xy)
      RemoveList(xy)
     
   return result
   
def GliderSoup():
   
   global data
   g.new(str(data[3]))
   data[3] += 1
   
   for i in xrange(-1, 2, 2):
      for j in xrange(-1, 2, 2):
         d = STRATING_DISTANCE_OF_FIRST_GLIDER 
         numGld = random.randint(MINIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT, MAXIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT)
         
         for k in xrange(0, numGld):
            x = d * i  + random.randint(-RANGE_VARIANCE_OF_GLIDERS, RANGE_VARIANCE_OF_GLIDERS)
            y = d * j 
            g.putcells(glds[random.randint(0, 3)], x, y, i, 0, 0, j)
            d += random.randint(MINIMAL_DISTANCE_BETWEEN_TWO_GLIDERS, MAXIMAL_DISTANCE_BETWEEN_TWO_GLIDERS)


def   EvolveToStability():
   g.setbase(8)
   g.setstep(2)
   
   while int(g.getgen()) < MAXIMAL_GENERATION_EVOLVE_TO_STABILITY:

      pop = int(g.getpop())
      
      for i in xrange(0, 15):
         g.step()
      
      if int(g.getpop()) == pop:
         cells = str(g.getcells([-REPORT_SQUARE_SIZE / 2, -REPORT_SQUARE_SIZE / 2 , REPORT_SQUARE_SIZE, REPORT_SQUARE_SIZE]))

         for i in xrange(0, 15):
            g.step()

         if cells == str(g.getcells([-REPORT_SQUARE_SIZE / 2, -REPORT_SQUARE_SIZE / 2 , REPORT_SQUARE_SIZE, REPORT_SQUARE_SIZE])):
            return True

   return False
   
def SortSL(sl):

   sl.sort(key=lambda r: 100000 * r[0] + r[1])
   
   x1 = sl[0][0]
   y1 = sl[0][1]
   
   for o in xrange(0, len(sl)):
      sl[o][0] -= x1
      sl[o][1] -= y1

def PlaceSL(sl, x, y, make0 = False):
   
   for xy in sl:
      x1 = xy[0]
      y1 = xy[1]
      
      if make0:
         g.setcell(x1 + x, y1 + y, 0)
      else:
         g.setcell(x1 + x, y1 + y, 1)
         
def IsTrivial(sl):
   g.new("")
   PlaceSL(sl, 0, 0)
   
   slList = CountSL(False)
   
   if len(slList) == 1:
      return False
   
   for s in slList:
      g.new("")
      PlaceSL(sl, 0, 0)
      PlaceSL(s, 0, 0, True)
      g.run(2)
      cells = g.getcells(g.getrect())
      g.run(2)
      if str(g.getcells(g.getrect())) != str(cells):
         return False
   
   return True
   


if not os.path.exists(SCRIPT_RESULTS_DIRECTORY):
    os.makedirs(SCRIPT_RESULTS_DIRECTORY)

if not os.path.exists(JSON_DUMP_DIRECTORY):
    os.makedirs(JSON_DUMP_DIRECTORY)

def SaveData():
   global data
   fname = os.path.join(JSON_DUMP_DIRECTORY,"GliderSoupResultsData.json")
   with open(fname, 'wb') as f:
      json.dump(data, f)

def LoadData():
   global data
   fname = os.path.join(JSON_DUMP_DIRECTORY,"GliderSoupResultsData.json")
   
   if os.path.exists(fname):
      with open(fname, 'r') as f:
         data = json.load(f)


LoadData()


while True:

	GliderSoup()
	curPat = g.getcells(g.getrect())
	if EvolveToStability():
		
		g.setrule("SoupCleaner")
		g.run(10)

		g.setrule("B3/S23")
		g.setalgo("HashLife")
		g.setbase(8)
		g.setstep(2)
		
		sls = CountSL()

		g.reset()
		  
		for sl in sls:
		  if(len(sl)) >= MINIMAL_BITS_TO_REPORT:

			 SortSL(sl)
			 
			 if data[0].has_key(str(sl)):
				key = data[4][str(sl)]
				
				for d in data[1]:
				   if str(d[0]) == key:
					  if len(d[1]) < MAXIMAL_REPORTS_PER_SINGLE_SL:
						 d[1].append(curPat)
						 
					  break
			 else:
				if not IsTrivial(sl):
				   data[2] += 1
				   data[1].append([sl, [curPat]])
				   
				   cl = ""
				   
				   for o in xrange(0, MAXIMAL_OSCILLATION_PERIOD_TO_STORE):
					  
					  g.new("")
					  PlaceSL(sl,0,0)
					  g.run(o)
					  cl = g.getcells(g.getrect())
					  
					  for i in xrange(-1, 2, 2):
						 for j in xrange(-1, 2, 2):
							g.new("")
							g.putcells(cl, 0,0,i, 0, 0, j)
							sl0 = CountSL()[0]
							SortSL(sl0)
							data[0][str(sl0)] = 1
							data[4][str(sl0)] = str(sl)
							g.new("")
							g.putcells(cl, 0,0,0, i, j, 0)
							sl0 = CountSL()[0]
							SortSL(sl0)
							data[0][str(sl0)] = 1
							data[4][str(sl0)] = str(sl)
							
				   g.show(str(data[2]))
				   g.update()

				   SaveData()

				   if data[2] % CREATE_REPORT_EVERY == 0:
					  d = 0 
					  dy = -1000
					  g.new("")
					  g.setrule("LifeHistory")
					  
					  linehor = g.parse(str(REPORT_SQUARE_SIZE) + "F!")
					  linever = g.parse(str(REPORT_SQUARE_SIZE) + "F!", 0, 0, 0, 1, 1, 0)
					  
					  for patList in data[1]:
						 
						 yDist = 0 
						 
						 for pat in patList[1]:
							g.putcells(pat, d, yDist)
							
							g.putcells(linehor, d - REPORT_SQUARE_SIZE / 2, yDist - REPORT_SQUARE_SIZE / 2)
							g.putcells(linever, d - REPORT_SQUARE_SIZE / 2, yDist - REPORT_SQUARE_SIZE / 2)
							
							g.putcells(linehor, d - REPORT_SQUARE_SIZE / 2, yDist + REPORT_SQUARE_SIZE / 2)
							g.putcells(linever, d + REPORT_SQUARE_SIZE / 2, yDist - REPORT_SQUARE_SIZE / 2)
							
							yDist += REPORT_SQUARE_SIZE 
							
							
						 PlaceSL(patList[0], -1200, dy + 10)
						 PlaceNumber(int(d / REPORT_SQUARE_SIZE), -1100, dy)
						 PlaceNumber(len(patList[0]), -1000, dy)
						 
						 PlaceSL(patList[0], d, -600)
						 PlaceNumber(int(d / REPORT_SQUARE_SIZE), d + 100, -600)
						 
						 d += REPORT_SQUARE_SIZE 
						 dy += 60
					  
					  prefix = ""
					  power = 10
					  
					  for i in xrange (0, 5):
						 if data[2] < power:
							prefix += "0"
						 power *= 10
						
					  fname = os.path.join(SCRIPT_RESULTS_DIRECTORY,"GliderSoupResults" + prefix + str(data[2]) + ".rle")

					  g.save(fname, "rle")
					  
					  g.new("")
					  g.setrule("B3/S23")
					  g.setalgo("HashLife")
					  g.setbase(8)
					  g.setstep(3)



NOTE No need to delete the json - this just improves performance; the script can continue running from where it stopped.

EDIT Merged into the main branch. The script creates the rule itself; no need to copy the rule into the rules folder.

Re: Random Glider Collision Search

Posted: November 24th, 2014, 9:00 am
by dvgrn
simsim314 wrote:Here is a significant performance boost using Adam's trick.
The cleanup rule looks good! I haven't tried really hard to find objects that it cleans up when it shouldn't, but it definitely passed a quick sanity check -- there don't seem to be any likely ways to fool it.

Just out of curiosity, have you tried Adam's other speedup trick? Tom Rokicki's idea of running a large number of soups in parallel, and cleaning them up in parallel, is rumored to have given apgsearch another 4x speed boost.

You just have to put lots of individual experiments in one universe, separated from each other by more than MAXIMAL_GENERATION_EVOLVE_TO_STABILITY/2. Then it's easy enough to figure out from the coordinates which experiment a given successful result came from.
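
A rough sketch of that layout (the grid spacing, perRow and the helper names are mine, purely illustrative):

Code: Select all

#Hypothetical layout: N experiments on a grid, spaced far enough apart
#that they cannot interact within the allowed number of generations
#(comfortably more than the MAXIMAL_GENERATION_EVOLVE_TO_STABILITY/2
#separation suggested above).
SPACING = MAXIMAL_GENERATION_EVOLVE_TO_STABILITY

def ExperimentOrigin(idx, perRow = 10):
   return ((idx % perRow) * SPACING, (idx // perRow) * SPACING)

def ExperimentIndex(x, y, perRow = 10):
   #map a result's coordinates back to the experiment it came from
   col = int(round(float(x) / SPACING))
   row = int(round(float(y) / SPACING))
   return row * perRow + col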

EDIT: You're not identifying or collecting infinite-growth patterns, correct? I keep thinking that somewhere in these larger search spaces there ought to be another self-sustaining high-period puffer of some kind -- could be as simple as a pi or Herschel or LOM or whatever, bouncing off of some debris in a way that creates an offset/reflected copy of the debris, plus a copy of the active reaction in the right place.

But apgsearch has been on the lookout for such things for a while now. Nothing but switch engines so far...!

Re: Random Glider Collision Search

Posted: November 24th, 2014, 9:13 am
by simsim314
dvgrn wrote:The cleanup rule looks good!
I've fixed a small bug. You can just take the script from the first post - it creates the fixed SoupCleaner in the rules folder.

Anyway, this script is great in that it can run just via the "run from clipboard" option in Golly.
dvgrn wrote: have you tried Adam's other speedup trick?
It shouldn't be too hard to make the change. I'm not sure why it makes such an improvement, but let's try it.

EDIT Just tried the trick with many soups on a single screen - it didn't do anything (22 soups per second). Also tried using join + transform instead of putcells for each glider - also no significant difference.

The only thing I've noticed consistently is that an open Golly window takes a bit more time, so minimizing the Golly instance which runs the script helps a bit.

I will add the part with the timer report to the main script soon.

EDIT2
Added the performance report.

To check that there's no performance boost from running several soups per universe, place a loop inside the GliderSoup function after the g.new and add some value to x. Nothing else will be influenced by it (you'll only ruin your json with some junk soups). I placed a loop of 10 soups and got 2.1 soups per second, which is 10 times slower than when I run 1 soup instead of ten in this function. So no boost there.
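
The test variant looked roughly like this (GliderSoupX10 and the offset value are mine, for illustration):

Code: Select all

#Test variant of GliderSoup: build 10 soups in one universe instead of one.
def GliderSoupX10():
   global data
   g.new(str(data[3]))
   data[3] += 1
   for s in xrange(0, 10):
      xOffset = s * 50000   #spread the soups so they cannot interact
      for i in xrange(-1, 2, 2):
         for j in xrange(-1, 2, 2):
            d = STRATING_DISTANCE_OF_FIRST_GLIDER
            numGld = random.randint(MINIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT, MAXIMAL_NUMBER_OF_GLIDERS_TO_SHOOOT)
            for k in xrange(0, numGld):
               x = d * i + random.randint(-RANGE_VARIANCE_OF_GLIDERS, RANGE_VARIANCE_OF_GLIDERS)
               y = d * j
               g.putcells(glds[random.randint(0, 3)], xOffset + x, y, i, 0, 0, j)
               d += random.randint(MINIMAL_DISTANCE_BETWEEN_TWO_GLIDERS, MAXIMAL_DISTANCE_BETWEEN_TWO_GLIDERS)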

Re: Random Glider Collision Search

Posted: November 25th, 2014, 12:40 am
by wildmyron
dvgrn wrote:Just out of curiosity, have you tried Adam's other speedup trick? Tom Rokicki's idea of running a large number of soups in parallel, and cleaning them up in parallel, is rumored to have given apgsearch another 4x speed boost.

You just have to put lots of individual experiments in one universe, separated from each other by more than MAXIMAL_GENERATION_EVOLVE_TO_STABILITY/2. Then it's easy enough to figure out from the coordinates which experiment a given successful result came from.
simsim314 wrote:It shouldn't be too hard to make the change. I'm not sure why it makes such an improvement, but let's try it.
simsim314 wrote:EDIT Just tried the trick with many soups on a single screen - it didn't do anything (22 soups per second). Also tried using join + transform instead of putcells for each glider - also no significant difference.

The only thing I've noticed consistently is that an open Golly window takes a bit more time, so minimizing the Golly instance which runs the script helps a bit.

I will add the part with the timer report to the main script soon.

EDIT2
Added the performance report.

To check that there's no performance boost from running several soups per universe, place a loop inside the GliderSoup function after the g.new and add some value to x. Nothing else will be influenced by it (you'll only ruin your json with some junk soups). I placed a loop of 10 soups and got 2.1 soups per second, which is 10 times slower than when I run 1 soup instead of ten in this function. So no boost there.
apgsearch doesn't actually run the soups in parallel. It runs 100 individual soups until stabilisation (or stabilisation failure) and then puts the results of all of these into a single universe. The census process then runs on these 100 soups in parallel. This involves running the auxiliary rules, including APGSearchCoalesece for up to 2^12 gen and the simple still life + glider detection/removal, and then conducting the object detection. I believe it is running the Coalesce rule in parallel which results in the speedup, and from what I understand there's no similar process in your random glider collision search. I suspect that running the soups in parallel slows it down because there's very little similarity between the soups, and any long-running soup extends the time you run the already-stabilised soups, though I am surprised by the magnitude of the slowdown.

Re: Random Glider Collision Search

Posted: November 26th, 2014, 6:16 am
by simsim314
dvgrn wrote: You're not identifying or collecting infinite-growth patterns, correct?
Hmm, the current stability check also checks by population and not only by cells, so it will think the pattern didn't come to stability.

But it's very easy to make it include switch engines and oscillators with other periods; this is something I'm working on right now.
wildmyron wrote:This involves running the auxiliary rules, including APGSearchCoalesece for up to 2^12 gen and the simple still life + glider detection/removal,
I really don't see a reason to run something for 2^12 generations - my rule deletes SLs in 6 gens. Maybe to check oscillations?

I've tried to run my soups in apgsearch, and it ran at approximately the same speed as my search (the glider collisions just take longer to stabilize). I don't see any performance boost significantly better than what my search currently has.

---

My main concern for now is a unification function, which I plan to implement very simply - just extract the soups from a report and run them as new soups. This will unify all the existing soups into one large report automatically.

Another small point to notice is that the only issue is with false positives: having a few soups running in parallel can only cause extra reporting, which is not so bad. Also, adding symmetry would be nice. Of course, a solution for oscillators and infinite growth should be added.

Re: Random Glider Collision Search

Posted: November 26th, 2014, 2:50 pm
by U03BB
The script does not work for me under Debian Sid.
I get the following error:

Code: Select all

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/philipp/.golly/golly_clip.py", line 413, in <module>
    os.makedirs(SCRIPT_RESULTS_DIRECTORY)
  File "/usr/lib/python2.7/os.py", line 157, in makedirs
    mkdir(name, mode)
OSError: [Errno 13] Permission denied: '/usr/share/golly/Patterns/ScriptResults'
I've tried changing the script to use my home folder, to no avail. The folder ~/.golly exists, and is used by apgsearch. I'm hesitant to either a) change the permissions of a folder in /usr/ or b) run golly as root. Any ideas?

Re: Random Glider Collision Search

Posted: November 26th, 2014, 3:19 pm
by dvgrn
U03BB wrote:The script does not work for me under Debian Sid... I've tried changing the script to use my home folder, to no avail. The folder ~/.golly exists, and is used by apgsearch. I'm hesitant to either a) change the permissions of a folder in /usr/ or b) run golly as root. Any ideas?
Things should work fine if the script is set up to store things in Golly's Data directory, instead of in Patterns:

Code: Select all

SCRIPT_RESULTS_DIRECTORY = os.path.join(g.getdir("data"),"ScriptResults")
JSON_DUMP_DIRECTORY = os.path.join(g.getdir("data"),"ScriptResults")

Re: Random Glider Collision Search

Posted: November 27th, 2014, 1:45 am
by wildmyron
simsim314 wrote:
wildmyron wrote:This involves running the auxiliary rules, including APGSearchCoalesece for up to 2^12 gen and the simple still life + glider detection/removal,
I really don't see a reason to run something for 2^12 generations - my rule deletes SLs in 6 gens. Maybe to check oscillations?

I've tried to run my soups in apgsearch, and it ran at approximately the same speed as my search (the glider collisions just take longer to stabilize). I don't see any performance boost significantly better than what my search currently has.
Indeed, in pretty much all cases this is wasted computation in apgsearch. I mentioned it as a likely explanation for why parallelisation of the census operation is a performance boost for apgsearch. The 2^12 is a maximum - in groups of 100 soups without infinite growth or oscillators of unusual period (population periodicity not a factor of 30), the step used for the long-running auxiliary rules is 2^4 or 2^5.

The reason for doing it is indeed oscillator detection - in particular, connecting remote regions of large oscillators. In the unlikely event that some oscillator composed of a ship (or a far-reaching shuttle) reflected between two isolated islands appears from a soup, it will be detected as a single object. I suspect that Adam wanted to guarantee that such objects would be correctly detected. In the case of infinite growth, I'm not sure that 2^12 gen is necessary or useful, but it doesn't happen very often in Life.

Re: Random Glider Collision Search

Posted: November 27th, 2014, 9:36 am
by simsim314
wildmyron wrote:The reason for doing it is indeed oscillator detection
I'm currently developing oscillation detection. It's pretty easy to calculate the period of an oscillator: you just need to get the cells of the region and evolve them step by step until they return to the initial state. No need to run anything for 2^12 generations, unless that's just the maximal limit.
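
A minimal sketch of that period check in Golly Python (the helper is mine; it assumes the object sits alone in the universe and doesn't move):

Code: Select all

#Evolve the current pattern one generation at a time until it repeats,
#returning the period, or -1 if none is found within the limit.
def OscPeriod(cells, maxPeriod = 4096):
   g.new("")
   g.putcells(cells, 0, 0)
   start = str(g.getcells(g.getrect()))
   for p in xrange(1, maxPeriod + 1):
      g.run(1)
      if str(g.getcells(g.getrect())) == start:
         return p
   return -1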

Once the period has been calculated, one can simply run "AdvancedLifeHistory" (which sets state 2 on all neighbors of living cells) to connect the oscillator parts, for just one oscillation period's worth of generations.

For the stuff that leaves the detection region, you can assume it should be included in the final report if it's not a glider or a *WSS.

I can imagine some "crazy" oscillator with period 2^12, but there is no need to run some rule for that many generations every time - it's a serious waste.

---

On another note, I've considered making a "hierarchical" search, which would collide gliders with already-found objects. This could increase the success rate significantly.

Re: Random Glider Collision Search

Posted: November 27th, 2014, 10:56 am
by dvgrn
simsim314 wrote:On another note, I've considered making a "hierarchical" search, which would collide gliders with already-found objects. This could increase the success rate significantly.
I've started writing something like this a couple of times, but haven't yet managed to get around to finishing it. My version would be the other half of the exhaustive constellation enumeration project: start with a block-or-whatever, hit it with all possible gliders and record the unique output constellations. Hit each of those with all possible single gliders, either from all directions or from one particular direction, and record new constellations again.

Repeat until out of memory or disk space (which doesn't take very many gliders). See when the first switch engine shows up, among other things.
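
A skeleton of one enumeration step, in the same Golly Python style as the search script (the lane range, glider placement and HitWithAllGliders name are my own illustration; gld and EvolveToStability come from the main script):

Code: Select all

#Hit a target constellation with a single glider on each lane/phase from
#one direction, and collect the distinct settled results.
def HitWithAllGliders(target, lanes = range(-20, 21)):
   results = {}
   for lane in lanes:
      for phase in xrange(0, 4):
         g.new("")
         g.putcells(target)
         g.putcells(g.evolve(gld, phase), lane, -60)   #glider placed above the target
         if EvolveToStability():
            results[str(g.getcells(g.getrect()))] = 1
   return results.keys()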

If all gliders come from one direction, and if intermediate targets are discarded when they're too big or emit gliders, then we're back to the slow-salvo search that has been done several times now in different ways. What hasn't been done, I think, is to hit the final list of cheaply constructible constellations with gliders from all possible directions, and see which ones are clean glider turners or glider splitters.

A big catalogue of these, along the same lines as we've discussed here, would allow the construction of switching circuitry and self-destruct circuitry that's much closer to optimal (since sometimes a one-time turner will need to be built from one direction, but the triggering glider comes from another direction.)

EDIT: Hmm, now I see I should have put these last speculations in the new "Splitters with common SL" topic.