
Another way of gradient backpropagation #81


Description

@ickma2311

Hi @karpathy, I followed your code and created my own implementation. I also made some modifications that I believe enhance the code's clarity and understanding.

In the forward pass (using the add operation as an example), the output node records its parents together with the local partial derivatives:

    out.backwards.extend([(self, 1), (other, 1)])
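For context, here is a minimal, self-contained sketch of this idea (not the picograd code itself; the `Value` class name and the `__add__`/`__mul__` bodies are my assumptions, while `grad` and `backwards` follow the snippet above):

    # Minimal sketch: each op stores (parent_node, local_partial_derivative)
    # pairs on its output so the backward pass knows how to route gradients.
    class Value:
        def __init__(self, data):
            self.data = data
            self.grad = 0.0
            self.backwards = []  # (parent, d(out)/d(parent)) pairs

        def __add__(self, other):
            out = Value(self.data + other.data)
            # d(out)/d(self) = 1 and d(out)/d(other) = 1
            out.backwards.extend([(self, 1), (other, 1)])
            return out

        def __mul__(self, other):
            out = Value(self.data * other.data)
            # d(out)/d(self) = other.data and d(out)/d(other) = self.data
            out.backwards.extend([(self, other.data), (other, self.data)])
            return out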

In the backward pass:

    def backward(self):
        # Push this node's gradient into each parent recorded during the
        # forward pass, scaled by the stored local partial derivative.
        for node, partial_derivative in self.backwards:
            node.grad += self.grad * partial_derivative
            node.backward()
            # Reset intermediate (non-Parameter) nodes so their gradients
            # do not accumulate across calls.
            if not isinstance(node, Parameter):
                node.zero_grad()
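Continuing the sketch above, a simplified backward (omitting the Parameter / zero_grad handling for brevity) and a tiny run show how the stored local derivatives propagate; the attach-to-class step is just for the sketch:

    # Sketch continuation: recursively push self.grad into each recorded
    # parent, scaled by the stored local partial derivative.
    def _backward(self):
        for node, partial_derivative in self.backwards:
            node.grad += self.grad * partial_derivative
            node.backward()

    Value.backward = _backward  # attach to the Value sketch above

    # Example: y = a*b + a, so dy/da = b + 1 = 4 and dy/db = a = 2
    a, b = Value(2.0), Value(3.0)
    y = a * b + a
    y.grad = 1.0            # seed the output gradient
    y.backward()
    print(a.grad, b.grad)   # 4.0 2.0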

I tested it, and it appears to be working well. On the Iris dataset, I achieved an accuracy of 93%. I believe it could reach 100% if I use a more effective loss function.
My code: https://github.com/ickma/picograd
