Getting an error asking me to build with USE_INT64_TENSOR_SIZE=1

Traceback (most recent call last):
  File "F:/Python_Project/SemiSegment/TrainFirst.py", line 37, in <module>
    y_hats = [net(X) for X in Xs]
  File "F:/Python_Project/SemiSegment/TrainFirst.py", line 37, in <listcomp>
    y_hats = [net(X) for X in Xs]
  File "E:\Anaconda\lib\site-packages\mxnet\gluon\block.py", line 693, in __call__
    out = self.forward(*args)
  File "E:\Anaconda\lib\site-packages\mxnet\gluon\block.py", line 1158, in forward
    return self.hybrid_forward(ndarray, x, *args, **params)
  File "E:\Anaconda\lib\site-packages\mxnet\gluon\nn\basic_layers.py", line 119, in hybrid_forward
    x = block(x)
  File "E:\Anaconda\lib\site-packages\mxnet\gluon\block.py", line 693, in __call__
    out = self.forward(*args)
  File "E:\Anaconda\lib\site-packages\mxnet\gluon\block.py", line 1158, in forward
    return self.hybrid_forward(ndarray, x, *args, **params)
  File "E:\Anaconda\lib\site-packages\mxnet\gluon\nn\conv_layers.py", line 149, in hybrid_forward
    act = self.act(act)
  File "E:\Anaconda\lib\site-packages\mxnet\gluon\block.py", line 693, in __call__
    out = self.forward(*args)
  File "E:\Anaconda\lib\site-packages\mxnet\gluon\block.py", line 1158, in forward
    return self.hybrid_forward(ndarray, x, *args, **params)
  File "E:\Anaconda\lib\site-packages\mxnet\gluon\nn\activations.py", line 53, in hybrid_forward
    return act(x, act_type=self._act_type, name='fwd')
  File "<string>", line 44, in Activation
  File "E:\Anaconda\lib\site-packages\mxnet\_ctypes\ndarray.py", line 107, in _imperative_invoke
    ctypes.byref(out_stypes)))
  File "E:\Anaconda\lib\site-packages\mxnet\base.py", line 278, in check_call
    raise MXNetError(py_str(_LIB.MXGetLastError()))
mxnet.base.MXNetError: [18:43:17] C:\Jenkins\workspace\mxnet\mxnet\src\c_api\c_api_ndarray.cc:59: Check failed: inp->shape().Size() < (int64_t{1} << 31) - 1 (2516582400 vs. 2147483647) : [SetNDInputsOutputs] Size of tensor you are trying to allocate is larger than 2^31 elements. Please build with flag USE_INT64_TENSOR_SIZE=1

My code is roughly the same as in the FCN chapter, except that I replaced the first few layers of the network with VGG-11, and then this error appeared. It looks like the ndarray's size is too large and a flag needs to be changed, but I'm not sure where that flag is set.
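To make the number in the error concrete: as far as I can tell, USE_INT64_TENSOR_SIZE is a compile-time option, so it cannot be switched on from Python in a prebuilt pip/conda package; it only takes effect when MXNet is rebuilt from source. Below is a rough size check (the batch size, channel count and crop size are just illustrative guesses, not necessarily the real shapes from my script):

    # An NDArray may not exceed 2**31 - 1 elements unless MXNet was built
    # with USE_INT64_TENSOR_SIZE=1 (a build-time flag, not a runtime switch).
    LIMIT = 2**31 - 1  # 2147483647, the cap reported in the error

    def n_elements(batch, channels, height, width):
        """Element count of a 4-D activation of shape (N, C, H, W)."""
        return batch * channels * height * width

    # Hypothetical shapes; substitute the batch size / crop size actually used.
    big = n_elements(batch=32, channels=512, height=480, width=320)
    small = n_elements(batch=8, channels=512, height=480, width=320)

    print(big, big > LIMIT)      # 2516582400 True  -> triggers the MXNetError
    print(small, small > LIMIT)  # 629145600 False  -> fits without rebuilding

So shrinking the batch size or the crop size should keep the intermediate activation under the cap and avoid having to recompile MXNet.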

Hoping someone can shed some light on this.

Hi, I ran into a similar problem. Did you manage to solve yours?