BUG: linalg: emit a MemoryError on a malloc failure (#29811) #29839
Backport of #29811.
Otherwise, a malloc failure in `init_gesv(...)` is not acted upon, and the Python return value is silently wrong: `inv(non_zero_array)` returns an array of zeros.

On main, using the script under the fold, the outcome is either a `numpy._core._exceptions._ArrayMemoryError: Unable to allocate 191. MiB for an array with shape (5000, 5000) and data type float64` raised from somewhere else in the gufunc machinery, or `res` being an array of all zeros with no error.
Note that I don't know what I'm doing with `NPY_ALLOW_C_API_DEF` etc.; here I simply parrot the macros from elsewhere in this source file, e.g. https://github.com/numpy/numpy/blob/main/numpy/linalg/umath_linalg.cpp#L1202

The same issue seems present in other linalg functions, so if this fix looks reasonable, I'll extend it to the other functions.
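For readers skimming the diff, here is a minimal sketch of the pattern this PR applies inside the gufunc inner loop in `umath_linalg.cpp`. The call signature of `init_gesv`, the variable names `params` and `n`, and the exact placement of the early return are schematic assumptions, not the verbatim diff; the macros and `PyErr_NoMemory()` are the pieces named above.

```cpp
/* Illustrative sketch, not the verbatim diff: raise MemoryError when the
 * workspace allocation inside init_gesv fails, instead of silently
 * continuing and returning an all-zeros result. */
if (!init_gesv(&params, n)) {
    /* The inner loop may run with the GIL released, so re-acquire it
     * before touching the CPython error machinery, using the same
     * macros used elsewhere in this file. */
    NPY_ALLOW_C_API_DEF
    NPY_ALLOW_C_API;
    PyErr_NoMemory();   /* sets a MemoryError for the caller */
    NPY_DISABLE_C_API;
    return;
}
/* ... otherwise proceed with the per-matrix LAPACK calls and
 * release_gesv(&params) as before ... */
```

`PyErr_NoMemory()` is the standard CPython helper for setting a `MemoryError`; wrapping it in the `NPY_ALLOW_C_API` / `NPY_DISABLE_C_API` pair matches how other error paths in this file report failures from code that may be running without the GIL.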