So in the new_context() class (this being the beasty that stores all state in a convoluted hash table, something I'm hoping to replace with something more elegant later), we have two key pieces: context.learn(a,b,c) and context.recall(a,b).
The first of these does the hard work of learning knowledge.
The second of these does the hard work when answering questions:
def recall(self,op,label,active=False):
    coeff = label.value if type(label) == ket else 1
    # coeff = 1      # use this to switch off the multiply(coeff) feature
    op = op.label.split("op: ")[-1] if type(op) == ket else op
    label = label.label if type(label) == ket else label
    ...

where:
  op is the operator, either ket("op: friends") or a direct string "friends"
  label is the ket label from our learn rule, either ket("Fred") or a direct string "Fred"
  active is a variable that keeps track of when we want to activate stored rules
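To make the learn/recall pairing concrete, here is a hypothetical usage sketch. I'm assuming a context object and the ket class from earlier posts, that learn() takes its arguments in (op, label, rule) order to match recall(), and the friends rule itself is just made up for illustration:

# learn: friends |Fred> => |Sam> + |Liz>
context.learn("friends","Fred",ket("Sam") + ket("Liz"))

# later, answer the question friends |Fred>:
context.recall("friends",ket("Fred"),True)      # returns |Sam> + |Liz>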
OK. So we have that. Now in the ket() class we have:

def apply_op(self,context,op):
    return context.recall(op,self,True)

And in the superposition() class we have:

def apply_op(self,context,op):
    result = superposition()
    for x in self.data:
        result += context.recall(op,x,True)
    return result

where, in the superposition class, self.data stores the list of kets (perhaps later we will re-implement this list of kets as an ordered dictionary, because the list representation sometimes has terrible big-O behaviour), and so this for loop is what implements the linearity of operators.
Where by "linearity of operators" I mean the following. Recall:
if:
    op1 |x> => |a> + |b> + |c> + |d> + |e>
then:
    op2 op1 |x>
      = op2 (|a> + |b> + |c> + |d> + |e>)
      = op2 |a> + op2 |b> + op2 |c> + op2 |d> + op2 |e>

So what point am I trying to make? Just trying to make it clearer what cell.apply_op() meant in the walking-our-grid post.
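In code, that linearity is exactly the for loop above: applying an operator to a superposition gives the same answer as applying it to each ket separately and adding up the results. A hypothetical check, assuming a context that already has op2 rules for |a>, |b> and |c>:

sp = ket("a") + ket("b") + ket("c")
lhs = sp.apply_op(context,"op2")            # op2 (|a> + |b> + |c>)
rhs = ket("a").apply_op(context,"op2") + ket("b").apply_op(context,"op2") + ket("c").apply_op(context,"op2")
# lhs and rhs are the same superposition, since apply_op on sp just loops over sp.data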
Anyway, a couple of examples:
sa: friends |Fred>

is translated to this python:

ket("Fred").apply_op(context,"friends")

and:

op (2|x> + 3|y> + |z>)

is translated to this python (noting that when you add kets you get a superposition):

(ket("x",2) + ket("y",3) + ket("z")).apply_op(context,"op")

I guess that is enough for now. I hope things are a little clearer!