Re: Meta-meta-blinker
Posted: August 18th, 2017, 11:40 pm
I once had an idea for a Life program where the cells would become metapixels once you zoomed in far enough, and then the cells of those metapixels would become metapixels, and so on. This would continue, with procedural generation of metapixels corresponding to single cells one level up, going through their cycles, and the speed of the pattern would change automatically with the zoom level, increasing smoothly from one generation per step to one meta-generation per step.
The cells wouldn't suddenly appear to turn into metapixels when you zoomed to the right level. In fact, the program would apply a function to each pixel that would return 0 if the number of on cells in the pattern corresponding to that pixel was less than average, and 1 if the number was greater than average. This way, at a scale of one metapixel per pixel, the pixel would be off if and only if the metapixel was off. Usually, though, the pixels would only be drawn as metapixels: internally each would be stored as a single cell, with its metapixel appearance played back from an evolution animation stored in a data file. This would usually happen at farther zooms, when there were more generations between steps. Only when the user zoomed in past a scale of about 256 pixels per meta-cell would the metapixel be calculated rather than animated, to conserve storage memory. But even then, there would be just one ON and one OFF metapixel each running in the background and appearing wherever needed. (Actually there would probably need to be 4: off-to-off, off-to-on, on-to-off, and on-to-on.)
Doing the math: It appears that for this to work, screen pixels would need to be rendered as "on" if and only if the cell density of the pixel was at least (22524 + 7*c)/(2^22 - 63422 + 22524), where c is the total number of the rule's birth and survival transitions. (This is the limit superior of the cell densities of an OFF-state metapixel, meta-metapixel, meta-meta-metapixel...) This would apply whenever cells were zoomed out by any number of power-of-2 steps. In fact, due to the specially chosen cutoff density, it even appears to apply when metapixels, meta-metapixels, etc. are rendered as a single pixel, because as noted above the pixels would be "on" if and only if the metapixels were in their ON state.
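To make the rendering rule concrete, here is a minimal Python sketch of the density test, taking the cutoff formula above at face value. The count of rule transitions c and how the metapixel encodes it are assumptions here (for B3/S23 I'm guessing c = 3: one birth condition plus two survival conditions):

```python
def cutoff_density(c):
    """Limit superior of the cell density of an OFF-state metapixel
    (and meta^k-pixel), per the formula in the post, where c is the
    number of birth + survival transitions in the rule."""
    return (22524 + 7 * c) / (2**22 - 63422 + 22524)

def pixel_on(live_cells, area, c):
    """A screen pixel covering `area` cells, of which `live_cells`
    are alive, is rendered ON iff its density reaches the cutoff."""
    return live_cells / area >= cutoff_density(c)
```

For example, an empty 2^11-by-2^11 block would render OFF, while a half-full one would comfortably exceed the cutoff (which is well under 1% density) and render ON.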
The period of a metapixel is 35328, and the size is 2^11 by 2^11. For this reason, it makes sense for the program to zoom in and out by factors of two. To transition smoothly from the speed of evolution to the speed of meta-evolution in eleven zooms, each zoom step would speed evolution up by a factor of 35328^(1/11). Specifically, t steps into the cycle at a zoom level of 2^n cells per pixel, the generation floor(t*35328^(n/11)) would be shown. I think the value of t should reset to zero every 35328 generations, even though this might make the speed rather inaccurate when n is high (say 9 or 10), because fewer steps would pass before the value of t resets. At a zoom level of 2^n, each pixel would represent a square of 2^(2*n) cells. The squares would be the same every time; the coordinates of their corners would always be (p*2^n, q*2^n) for all p, q such that p and q are between 0 and 2^(11 - n), relative to an arbitrary center point.
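The zoom-to-speed mapping above can be sketched in a few lines of Python. This is just the floor(t*35328^(n/11)) rule with the suggested reset of t every 35328 generations, not a claim about how an actual implementation would schedule frames:

```python
PERIOD = 35328  # period of the OTCA metapixel

def generation_shown(t, n):
    """Generation to display t steps into the cycle at a zoom level
    of 2^n cells per pixel.  At n = 0 one step is one generation;
    at n = 11 one step is one full meta-generation."""
    t = t % PERIOD  # t resets to zero every PERIOD steps
    return int(t * PERIOD ** (n / 11))
```

At n = 11 a single step jumps a whole metapixel period, which is exactly the "one meta-generation per step" endpoint of the interpolation.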
When the program is opened, there is an empty grid of cells which look like zoomed-out metapixels but behave, and are stored in memory, like normal cells. They can be turned on and off and everything. When zoomed out, the cells get less and less detailed. The rendering method of the scaled display works as described above.
The application should feature editing, but when zooming in past a level of about 2^-8 (meta)cells per pixel, editing the pattern changes the states of the cells making up the metapixels instead of treating the metapixels like cells themselves. However, as even small alterations of the workings of a metapixel can cause chaos to spread, I think the default mode for observing cells past this zoom level should be view only. If the alteration of the metapixel is meaningful, like changing the rule of the meta-pattern (or even making a single cell follow a different rule), the program should allow a way to do it without manually changing the states of cells, such as a toolbar option. If one did decide to edit the metapixels in a way that caused chaos, the pattern might run slowly when zoomed farther out (frequency of rendered generations is tied to zoom level, remember). The program should also be sure not to render metapixels as single cells once they are destroyed; it should check how far the destruction has spread each generation, so it knows which parts must be simulated directly and which can still be rendered from the memorized evolution.
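The bookkeeping in that last sentence might look something like the following sketch. The data model is entirely hypothetical; the idea is just to track a set of "dirty" metapixels that must be simulated cell-by-cell, growing it conservatively each generation since chaos spreads at most one metapixel outward per meta-step:

```python
class MetapixelGrid:
    """Hypothetical tracker for which metapixels need real simulation
    versus playback of the stored metapixel animation."""

    def __init__(self):
        self.dirty = set()  # (x, y) metapixel coordinates that were damaged

    def mark_edited(self, x, y):
        """Called when the user hand-edits the cells inside a metapixel."""
        self.dirty.add((x, y))

    def spread_damage(self):
        """Once per generation: conservatively assume destruction can
        reach every neighbouring metapixel."""
        grown = set()
        for (x, y) in self.dirty:
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    grown.add((x + dx, y + dy))
        self.dirty = grown

    def render_mode(self, x, y):
        return "simulate" if (x, y) in self.dirty else "replay"
```

A real implementation would shrink the dirty set again when a region is verified to still match the canonical metapixel, but the over-approximation above is the safe default.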
How feasible is this idea? When I started drafting this post in December, I was expecting it would be done with an executable plus files that give the animations of zoomed-out metapixels and possibly a collection of pattern files. Now that Golly has the overlay, though, it might be possible to do it as a Python/Lua script, possibly as part of an interactive Scale of the Life Universe. Thoughts?