[ODE] ODE memory bug

Thomas Harte thomasharte at lycos.co.uk
Fri Sep 20 15:10:02 2002


I am writing a Win32 program using SDL, zlib and ODE 0.03, and I am running into a curious 
problem: my program crashes at exit. I have traced the source of the crash, and the problem 
may lie within ODE.

ODE is not directly causing the crash; it simply happens to be fprintf'ing to stderr as part of 
the atexit / destructor stage. As far as I have been able to trace, this crashes because of the way 
SDL's debug build provides stderr and stdout files for you under Win32, since a non-console 
Win32 application does not ordinarily have them. That obviously isn't ODE's fault.
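One workaround I am considering (untested) is to reopen stderr onto a file of my own, first 
thing in main(), so that anything ODE prints during the atexit stage still has a valid stream. 
The helper name and the filename below are just examples of mine:

#include <stdio.h>

/* Untested workaround sketch: point stderr at a stream I control, so a late
   fprintf(stderr, ...) from ODE's atexit/destructor code does not go through
   whatever SDL set up (and may already have torn down by then). */
static void redirect_stderr_for_ode(void)
{
	if (freopen("ode_stderr.txt", "w", stderr) == NULL) {
		/* Reopening failed; nothing is lost, the original behaviour remains. */
	}
}

I would call redirect_stderr_for_ode() at the top of main(), before dWorldCreate().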

My question is: why is ODE trying to fprintf? This is the entirety of my program:

#ifdef _WINDOWS
#define WIN32_LEAN_AND_MEAN
#include <windows.h>
#endif

#include "SDL.h"
#include <GL/gl.h>
#include <GL/glu.h>
#include <math.h>
#include <malloc.h>
#include <ode/ode.h>

int main( int argc, char* argv[] )
{
	dWorldID world;
	
	world = dWorldCreate();
	dWorldDestroy(world);

	return 0;
}

The problem is that dAllocDontReport decides there has been some error in memory allocation 
within ODE. Why does it decide this, and how can I avoid the problem?
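In case it helps, here is another untested idea: if that exit-time report is routed through ODE's 
message / error machinery rather than a raw fprintf, installing my own handlers with 
dSetErrorHandler / dSetMessageHandler from <ode/error.h> might both show me exactly what ODE is 
complaining about and keep the output away from SDL's stderr. I am assuming here that those 
functions are present in 0.03 and that the report really does go through them; the log filename 
is just an example.

#include <stdarg.h>
#include <stdio.h>
#include <ode/ode.h>

/* Untested sketch: append ODE's errors/messages to a log file of my own
   instead of letting them hit stderr. */
static void log_ode_message(int errnum, const char *msg, va_list ap)
{
	FILE *f = fopen("ode_messages.txt", "a");	/* example filename */
	if (f) {
		fprintf(f, "ODE message %d: ", errnum);
		vfprintf(f, msg, ap);
		fputc('\n', f);
		fclose(f);
	}
}

int main( int argc, char* argv[] )
{
	dWorldID world;

	dSetErrorHandler(log_ode_message);
	dSetMessageHandler(log_ode_message);

	world = dWorldCreate();
	dWorldDestroy(world);

	return 0;
}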

-Thomas

