forked from PacktPublishing/Graph-Machine-Learning
In the notebook 02_Shallow_embeddings.ipynb, the rows of the matrix returned by glp.predict_proba(G) do not sum to 1, even though each row should be a probability assignment over the two classes.
I think the issue is that the probability distributions are not explicitly normalized at each iteration. In the iterative update step within your while loop, there should be a normalization step after computing the new Y:
```python
while it < self.max_iter and c_tool > self.tol:
    Y = A * Y_prev  # Matrix multiplication for label propagation
    Y = Y / Y.sum(axis=1, keepdims=True)  # Normalize rows to sum to 1
```
Furthermore, I think the Hadamard (element-wise) product `*` there is not intended; it should be a matrix multiplication. The final code should therefore be:
```python
while it < self.max_iter and c_tool > self.tol:
    Y = A @ Y_prev  # Matrix multiplication for label propagation
    Y = Y / Y.sum(axis=1, keepdims=True)  # Normalize rows to sum to 1
```
The same fix applies to both the GraphLabelPropagation and the GraphLabelSpreading classes.
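To illustrate the point outside the notebook, here is a minimal self-contained sketch of the propagation loop on a hypothetical toy graph (the graph, labels, and iteration count are my own assumptions, not taken from the book). With NumPy arrays, `*` is element-wise, so `A @ Y` is needed to actually aggregate labels over neighbors, and the row normalization keeps each row a valid distribution:

```python
import numpy as np

# Hypothetical toy example: a 4-node path graph where nodes 0 and 3
# are labeled with classes 0 and 1, and nodes 1 and 2 are unlabeled.
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# Label matrix: one row per node, one column per class.
Y = np.array([
    [1.0, 0.0],  # node 0: known class 0
    [0.5, 0.5],  # node 1: unlabeled, uniform prior
    [0.5, 0.5],  # node 2: unlabeled, uniform prior
    [0.0, 1.0],  # node 3: known class 1
])

for _ in range(10):
    Y = A @ Y                             # matrix multiplication: propagate labels from neighbors
    Y = Y / Y.sum(axis=1, keepdims=True)  # normalize rows so each remains a distribution

# After the fix, every row of Y sums to 1.
print(Y.sum(axis=1))
```

With `A * Y` instead (element-wise, broadcast across shapes (4, 4) and (4, 2)), NumPy would raise a shape error here; in the notebook's setting the analogous misuse silently produces rows that fail to sum to 1, which is what the predict_proba output shows.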