Here is the usage:
  -- uses "" as the default pattern.
  active-buffer[N,t] some-superposition

  -- uses your chosen pattern (we can't use "" as the pattern, due to a broken parser!)
  active-buffer[N,t,pattern] some-superposition

where:
  N is an int -- the size of the active buffer
  t is a float -- the drop-below threshold
  pattern is a string -- the pattern we are using

Here is the python:
def console_active_buffer(one,context,parameters):   # one is the passed in superposition
  try:
    N,t,pattern = parameters.split(',')
    N = int(N)
    t = float(t)
  except:
    try:
      N,t = parameters.split(',')
      N = int(N)
      t = float(t)
      pattern = ""
    except:
      return ket("",0)

  one = superposition() + one                        # make sure one is a superposition, not a ket.
  result = superposition()
  data = one.data
  for k in range(len(data)):
    for n in range(N):
      if k < len(data) - n:
        y = superposition()
        y.data = data[k:k+n+1]                       # this is the bit you could call the buffer.
        result += context.pattern_recognition(y,pattern).drop_below(t)
  return result

Yeah, the code is a bit ugly! I'm not currently sure of the best way to tidy it. Probably once I have working sequences.
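To make the nested loop easier to follow, here is a minimal, self-contained sketch of just the buffer-slicing part, using plain Python lists in place of superpositions (the function name `active_buffers` is mine, not part of the project). For each start position k it collects every window of length 1 up to N that still fits inside the data:

```python
def active_buffers(data, N):
    # Collect all sliding windows of size 1..N, mirroring the
    # k/n loop in console_active_buffer above.
    result = []
    for k in range(len(data)):
        for n in range(N):
            if k < len(data) - n:
                result.append(data[k:k + n + 1])
    return result

# eg, with three elements and N = 2:
# active_buffers(['a', 'b', 'c'], 2)
# -> [['a'], ['a', 'b'], ['b'], ['b', 'c'], ['c']]
```

In the real code, each of these windows is wrapped back into a superposition and fed to context.pattern_recognition, with weak matches dropped by drop_below(t).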
1) This is implemented using superpositions, but it would really work better with sequences. Unfortunately I haven't fully worked out how sequences will work, let alone written any code for them.
2) The code above uses fixed-size buffers (N). It seems likely that in the brain there are end-buffer kets that signal the end of a buffer and trigger its processing, rather than a fixed size. eg, for letters, non-word chars trigger processing of a word.
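The end-buffer idea in point 2 can be sketched like this: instead of emitting fixed-size windows, a delimiter triggers processing of whatever has accumulated. This is purely illustrative (the function name `delimiter_buffers` and the `is_end` predicate are my assumptions, not project code):

```python
def delimiter_buffers(items, is_end):
    # Accumulate items into a buffer; whenever is_end(item) fires,
    # close off the current buffer and start a new one.
    buffers = []
    current = []
    for item in items:
        if is_end(item):
            if current:
                buffers.append(current)
            current = []
        else:
            current.append(item)
    if current:
        buffers.append(current)
    return buffers

# eg, for letters, non-word chars trigger processing of a word:
# delimiter_buffers(list("the cat!"), lambda c: not c.isalpha())
# -> [['t', 'h', 'e'], ['c', 'a', 't']]
```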
3) The general supervised pattern-recognition algo somewhat decreases the use case for this code.
Examples in the next post!