Softmax regression implemented from scratch: automatic differentiation fails

When I run l.backward() from the book, I get the error below. Could someone explain what causes it?
src/imperative/imperative.cc:293: Check failed: !AGInfo::IsNone(*i) Cannot differentiate node because it is not in a computational graph. You need to set is_recording to true or use autograd.record() to save computational graphs for backward. If you want to differentiate the same graph twice, you need to pass retain_graph=True to backward

I ran into this problem too. My development environment is the Datalore online environment.

The problem is that the book's code never allocates gradient storage for the variables being differentiated. Before the with statement, add:
X.attach_grad()
b.attach_grad()
w.attach_grad()


Thanks, that solved it.