01/06/2013, 05:03 PM

When looking at the commonly used definition of zeration, there are three rules:

• a [0] b = a + 1 for a > b

• a [0] b = b + 1 for a < b

• a [0] b = a + 2 = b + 2 for a = b

The last one is needed to get a [0] a = a + 2. It also makes the function y = a [0] c discontinuous at a = c. This third rule therefore seems a bit of a hack to me, needed to make zeration fall in line with the rest of the hyperoperation sequence. However, if we look more closely at how the other operators are defined, they do not seem to be defined consistently with the + operator.
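As a sanity check, the three rules can be written out as a small Python sketch (the function name is my own); evaluating it near a = c shows the jump the third rule creates:

```python
def zeration(a, b):
    # a [0] b per the three rules above
    if a > b:
        return a + 1
    if a < b:
        return b + 1
    return a + 2          # a == b: the extra "+2" rule

# approaching a = c from either side gives max(a, c) + 1 = 5,
# but at a = c the value jumps to 6
print(zeration(4, 3.9))   # 5
print(zeration(3.9, 4))   # 5
print(zeration(4, 4))     # 6
```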

The expression "a * b" is taken to mean "take b a's and put plus signs between them." This also leads to "a * 1" meaning "take just 1 a", which is of course just the value "a". In the same vein we would expect "a + 1" to mean "take 1 a without any zeration operators between them", which would also be just the value "a" - an uncomfortable set-up.
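To make the inconsistency concrete, here is a sketch (my own naming, integer b >= 1 only) of the conventional "take b a's and put [n-1] signs between them" reading. It yields a [n] 1 = a for n >= 2, while a [1] 1 is a + 1, not a:

```python
def hyper_std(n, a, b):
    # conventional reading for integer b >= 1:
    # "take b a's and put [n-1] signs between them"
    # (the left-to-right fold below is only valid up to exponentiation,
    #  where commutativity of * keeps the association harmless)
    if n == 1:
        return a + b          # addition taken as the primitive
    result = a
    for _ in range(b - 1):
        result = hyper_std(n - 1, result, a)
    return result

print(hyper_std(2, 3, 1))   # 3  -> "a * 1" is just a
print(hyper_std(1, 3, 1))   # 4  -> but "a + 1" is not just a
```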

Another way we can approach this discrepancy between the definition of + and the higher operators is by indeed taking "a [0] a" to mean "a + 1", and redefining the higher operators instead. A more consistent way would then be to define "a [N] b" as "take b [N-1] operators and put a's around them." This does lead to "a * 1" having the value 2a, so it gets weird fast from there. A benefit, however, is that the identity value for all operators N > 0 is 0, and e.g. "c = a * b" means c < a if b < 0 and c > a if b > 0, which also seems nicer to me.
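Under stated assumptions (my own function names, integer b >= 0 only, and zeration reduced to a [0] b = max(a, b) + 1 with a [0] a = a + 1, so no separate "+2" rule), the alternative "take b [N-1] operators and put a's around them" reading can be sketched as:

```python
def zer(a, b):
    # alternative zeration: a [0] a = a + 1, otherwise max(a, b) + 1
    return a + 1 if a == b else max(a, b) + 1

def hyper_alt(n, a, b):
    # a [n] b = "b [n-1] operators with b+1 a's around them", b >= 0
    if n == 0:
        return zer(a, b)
    result = a
    for _ in range(b):
        result = hyper_alt(n - 1, result, a)
    return result

print(hyper_alt(1, 3, 4))   # 7  -> addition still works: a + b
print(hyper_alt(2, 3, 0))   # 3  -> 0 is the identity for *
print(hyper_alt(2, 3, 1))   # 6  -> but a * 1 = 2a
```

Note how addition comes out right with only the max(a, b) + 1 behaviour: a + b is a's with b zeration signs between them, and each application of [0] bumps the running maximum by one.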

I haven't really looked at the further implications of these proposed alterations to the standard operators - maybe working it all out will make the new set-up fail miserably. I lack the knowledge to go deeper into e.g. deriving the correct inverse operators in the new situation (/, log, etc.) or deriving a continuous extension of the altered power operator, but this angle does look interesting with regard to setting up the correct rules needed for zeration.

All insights appreciated!

Carl Colijn
