Some rough notes on modularity, composite state maps, cascades and aperiodics.
The larger question is how to factor discrete systems into modules. Suppose \(f:E^*\to X\), where \(E^*\) is the set of finite sequences over \(E\). The question is then about the structure of \(g(s)= h(g_1(u_1),\dots, g_n(u_n))\) so that \(g(s)=f(s)\), where the \(u_i\) are dependent sequences as defined previously. To start with something simple, suppose \(f(s)\) counts events mod \(k\) for some \(k > 1\).
\[f(\epsilon) = 0,\qquad f(s.e)= (f(s)+1)\bmod k\]
where \(\epsilon\) is the empty sequence and \(s.e\) is the sequence obtained by appending event \(e\) on the right to sequence \(s\). These two equations define \(f\) on all elements of \(E^*\). One question is whether \(f\) can be constructed from 2-state components as a cascade product even though it cycles. The answer is obvious to anyone who has seen a carry-lookahead adder, or possibly to anyone who has seen Lagrange’s theorem in group theory or the Krohn-Rhodes theorem. If \(k= 2^n\) for some \(n\geq 1\), then the answer is “yes”: \(n\) binary-state components in series (for the carry) do the job (the underlying group for \(f\) is then \(Z_k\)).
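The two defining equations translate directly into a left fold over the sequence. A minimal sketch (the name `count_mod` is mine; event identity is ignored, since \(f\) only counts):

```python
# f(epsilon) = 0;  f(s.e) = (f(s) + 1) mod k.
# Events can be any values: f ignores their identity and only counts them.

def count_mod(s, k):
    state = 0
    for _e in s:                  # each appended event bumps the count mod k
        state = (state + 1) % k
    return state
```

For example, `count_mod("abcdefg", 4)` folds seven events and returns `3`.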
Let \(E_2=\{0,1\}\), \(\beta(\epsilon) =0\), and \(\beta(s.e) = (\beta(s)+e)\bmod 2\). Then let \(g(s) = \sum_{i=1}^n 2^{i-1}\,\beta(u_i(s))\), where \(u_i(\epsilon) =\epsilon\) as usual, \(u_1(s.e)=u_1(s).1\), and
\[ \mbox{for } i>1,\quad u_i(s.e) = u_i(s).x\mbox{ where }x = \prod_{j=1}^{i-1}\beta(u_{j}(s))\]
Then \(g(s) = f(s)\), and the \(x\) above is the computed carry. But the implication is that if, for example, \(k\) is prime, then we need a cycle somewhere. If that’s right, it’s amazing. Number theory is the king of all.
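As a sanity check, the cascade can be simulated directly: each component keeps the one bit \(\beta(u_i(s))\), and the carry into component \(i\) is the product of the lower bits taken at the previous prefix \(s\), not at \(s.e\). A sketch under those definitions (the names `cascade_count` and `bits` are mine):

```python
# Simulate the cascade for k = 2**n: bits[i-1] holds beta(u_i(s)).

def cascade_count(s, n):
    bits = [0] * n
    for _e in s:
        old = bits[:]                 # beta(u_j(s)) at the previous prefix s
        bits[0] ^= 1                  # u_1(s.e) = u_1(s).1
        for i in range(1, n):
            x = 1
            for j in range(i):        # carry x = prod_{j<i} beta(u_j(s))
                x *= old[j]
            bits[i] ^= x              # u_{i+1}(s.e) = u_{i+1}(s).x
    # g(s) = sum_i 2^{i-1} * beta(u_i(s))
    return sum(b << i for i, b in enumerate(bits))
```

Running `cascade_count(range(m), 3)` for increasing `m` walks through `0, 1, ..., 7, 0, 1, ...`, agreeing with counting mod \(2^3\).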
Aperiodic state maps, for example ones that do saturation counting \(0,1,2,\dots, n,n,n,\dots\), should also factor according to the Krohn-Rhodes insight. But in that case, I don’t see a good “modular” decomposition. Say \(f\) is aperiodic iff there is some \(n\) so that for any sequence \(s\) of length \(n\) or more and any event \(e\) in the alphabet of \(f\), it must be that \(f(s) = f(s.e)\). Then let \(E^f\) be the finite set of all sequences over \(E\) of length \(n\) or less, and let \(h(s) = f(s)\) for \(s\in E^f\). We define a store machine so that
\[ store(\epsilon)\not\in E,\qquad store(s.e) = e \mbox{ if } store(s)\not\in E,\mbox{ and } store(s.e)=store(s)\mbox{ otherwise}\]
Let \(g(s) = h(concat(store(u_1(s)),\dots, store(u_n(s))))\). Set \(u_1(s)=s\). Set \(u_{i+1}(s.e) = u_{i+1}(s)\) if \(store(u_i(s)) \not\in E\), and \(u_{i+1}(s.e) = u_{i+1}(s).e\) otherwise. Then the factors “remember” the first \(n\) events. But each component has \(|E|+1\) possible states, and so, even though these “store” machines are themselves aperiodic and can easily be reduced to \(\log_2(|E|+1)+1\) bits, the reduction is hard to prise out.
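The store cascade can also be simulated. Since \(u_{i+1}\) only starts receiving events once component \(i\) has stored one, each incoming event lands in the first still-empty store, and the concat recovers the first \(n\) events of \(s\). A sketch, assuming \(f\) is the saturating counter from above (so \(h\) on a short prefix is just its length); `NONE` stands for the initial store value \(store(\epsilon)\not\in E\), and all names are mine:

```python
# Simulate the store cascade: stores[i] holds store(u_{i+1}(s)).
NONE = object()        # plays the role of store(epsilon), a non-event

def store_cascade(s, n):
    """Return concat(store(u_1(s)), ..., store(u_n(s))) with empty
    stores dropped, i.e. the first n events of s."""
    stores = [NONE] * n
    for e in s:
        # u_{i+1} receives e only once component i has stored an event,
        # so each event lands in the first still-empty store.
        for i in range(n):
            if stores[i] is NONE:
                stores[i] = e
                break
    return [e for e in stores if e is not NONE]

def g(s, n):
    # For the saturating counter, h(prefix) = f(prefix) = len(prefix)
    # on sequences of length n or less (my reading of h above).
    return len(store_cascade(s, n))
```

For example, `store_cascade("abcdef", 3)` yields `['a', 'b', 'c']`, and `g("abcdef", 3)` saturates at `3`.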