luni
Triggered by this https://forum.pjrc.com/threads/6820...ill-a-bad-idea?p=288694&viewfull=1#post288694 I was doing a few experiments with detecting out-of-memory conditions with operator new and stumbled over some very weird behavior. The code below tries to allocate 150 kB chunks using new, new(std::nothrow) and malloc. It checks the returned pointer and stops if it gets a nullptr.
Might be some strange cache effect?
Code:
#include "Arduino.h"
#include <memory>

void setup()
{
    while (!Serial) {}
    if (CrashReport)
        Serial.println(CrashReport);
}

void loop()
{
    uint8_t* ptr = new uint8_t[1024 * 150];                    // CRASHES without -fcheck-new
    // uint8_t* ptr = new (std::nothrow) uint8_t[1024 * 150]; // OK
    // uint8_t* ptr = (uint8_t*)malloc(1024 * 150);           // OK

    Serial.printf("Addr: %p\n", ptr);
    if (ptr)
    {
        Serial.print(" try write...");
        ptr[0] = 42;
        Serial.println(" ok, still alive\n");
    }
    else
    {
        Serial.printf(" OUT OF MEMORY %p\n", ptr);
        while (true) yield();
    }
}
This prints:
Code:
Addr: 0x20203068
try write... ok, still alive
Addr: 0x20228870
try write... ok, still alive
Addr: 0x2024e078
try write... ok, still alive
Addr: 0x0
try write...
So 'new' seems to return a nullptr, and Serial.printf prints it correctly, but the if clause is not able to detect it?? The code therefore writes through the null pointer and crashes, and CrashReport correctly complains about a write to a null pointer.
I never saw such a thing. As mentioned in the linked thread above, new is supposed to throw an exception on failure, so I'd expect a crash but not this weird behavior. Presumably the compiler, knowing that the standard operator new never returns null (it throws instead), is allowed to optimize the if (ptr) check away. If you don't write through the null pointer the program doesn't even crash, but the if clause still doesn't detect the null pointer.
If you use the new(std::nothrow) version or malloc, everything works as expected: this time the if clause detects the null pointer as it should.
Using the nothrow version of new is a bit tedious, and I assume that most libraries don't do it. The gcc compiler flag -fcheck-new seems to fix this: with it, all three versions return a "checkable" nullptr when out of memory and everything works as expected. The description of this flag (below) is a bit confusing, but it might be worth testing it in the next beta version.
-fcheck-new
Check that the pointer returned by operator new is non-null before attempting to modify the storage allocated. This check is normally unnecessary because the C++ standard specifies that operator new only returns 0 if it is declared throw(), in which case the compiler always checks the return value even without this option. In all other cases, when operator new has a non-empty exception specification, memory exhaustion is signalled by throwing std::bad_alloc. See also ‘new (nothrow)’.