C binary reaches 1.2G memory limit


I am dealing with a C program that reads in a file of many lines of 60 characters each, allocating strings in memory and requesting more memory as it reads the file in. After each malloc request, it checks with a function oom() whether the request for more memory was successful.

I have tested the program with increasingly large input files, and oom() reports "out of memory" more or less when memory usage reaches 1.2G, according to the top command while the program is running. This is on a 64-bit Linux machine with plenty more memory available. The output of file /my/binary/program is:

ELF 64-bit LSB executable, x86-64, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.18, not stripped

My question is: why is it reaching this 1.2G limit? I remember a sysadmin saying that binaries used to be able to use only 1.2G, which is coincidentally what I am seeing here.

When I run the same execution via qsub on a node of the same 64-bit Linux SGE grid, reserving 50GB of memory, it also reports "out of memory", and the SGE log gives the following memory footprint:

max vmem         = 2.313G

Any ideas why the program is reaching this memory limit? Are there any compilation flags I should be aware of that could cause or solve this?

Find below the relevant flags in the current Makefile:

CC = gcc
CFLAGS = -Wall -O3 -funroll-loops -DNDEBUG -fomit-frame-pointer -std=gnu99 -msse2 -Wno-unused-function -Wno-unused-result
CFLAGSSFMT = -msse2 -DHAVE_SSE2 -O9 -finline-functions -fomit-frame-pointer \
             -DNDEBUG -fno-strict-aliasing --param max-inline-insns-single=1800 -std=c99
LD = ld
LDFLAGS =  -lm -lc -lblas -llapack
INCFLAGS =
DEFINES = -D_GNU_SOURCE -DUSE_BLAS

Some of the relevant code is shown below.

In mystring.h:

#ifndef _MYSTRING_H_
#define _MYSTRING_H_

struct __mystring_struct {
    char * string;
    int len, maxlen;
};
typedef struct __mystring_struct * mystring;
#define MYSTRING_SIZE sizeof(struct __mystring_struct)

mystring new_mystring (const int len);
void free_mystring (mystring string);
void append_char_to_mystring ( const char c, mystring string);
char * cstring_of_mystring(const mystring string);
mystring mystring_of_cstring (const char * str);
#endif

In mystring.c:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include "mystring.h"

/* abort with a message if an allocation came back NULL */
#define oom(a) { if (NULL == (a)) { fputs("out of memory\n", stderr); exit(EXIT_FAILURE); } }

static void check_is_mystring (const mystring string);
static void double_length_of_mystring (mystring string);

Later on:

static void double_length_of_mystring (mystring string){
    char * new_mem;
    check_is_mystring(string);

    new_mem = malloc(string->maxlen * 2 * sizeof(char));
    oom(new_mem);
    memcpy (new_mem, string->string, string->len * sizeof(char));
    free(string->string);
    string->string = new_mem;
    string->maxlen *= 2;

    check_is_mystring (string);
}

It seems you use int to keep the size of the string. In gcc (and in other compilers for PC platforms) this type is 32 bits wide even on 64-bit platforms. You should use size_t instead.
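
As a sketch of the fix, assuming the rest of the code is updated to match (only the two snippets you posted are shown; new_mystring, append_char_to_mystring, etc. would need the same int-to-size_t treatment):

#include <stddef.h>

struct __mystring_struct {
    char * string;
    size_t len, maxlen;   /* was int: 32-bit signed, overflows past 2^31 */
};

static void double_length_of_mystring (mystring string){
    char * new_mem;
    check_is_mystring(string);

    /* maxlen is now size_t, so the doubling is done in 64-bit arithmetic */
    new_mem = malloc(string->maxlen * 2 * sizeof(char));
    oom(new_mem);
    memcpy (new_mem, string->string, string->len * sizeof(char));
    free(string->string);
    string->string = new_mem;
    string->maxlen *= 2;

    check_is_mystring (string);
}

Any printf of len or maxlen then needs %zu, and the const int len parameter of new_mystring should become size_t as well.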

The mechanism of the failed allocation is as follows:

  • 1.2 GB ~= 1288490189
  • 2 * 1.2 GB ~= 2576980378, which is over 2^31 (2147483648); after overflow, the 32-bit result is -1717986918 in two's complement arithmetic
  • when calling malloc, -1717986918 is sign-extended to 64 bits and then cast to unsigned 64-bit, which gives 2^64 - 1717986918, marginally less than 2^64, and more memory than you have in your system
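
A quick way to see this outside your program: the following standalone snippet (an illustration of the arithmetic above, not code from your project) reproduces the failing request on a 64-bit Linux/gcc system:

#include <stdio.h>
#include <stdlib.h>

int main(void){
    int maxlen = 1288490189;          /* ~1.2 GB, still fits in a 32-bit int */
    int doubled = maxlen * 2;         /* signed overflow: gcc on x86-64 wraps this to -1717986918 */
    size_t request = (size_t)doubled; /* sign-extended and reinterpreted: 2^64 - 1717986918 */
    void * p = malloc(request);       /* no machine has this much memory, so malloc returns NULL */
    printf("doubled = %d, request = %zu, malloc -> %p\n", doubled, request, p);
    free(p);
    return EXIT_SUCCESS;
}

Expect something like request = 18446744071991564698 and a null pointer back from malloc, which is exactly the condition oom() traps at the 1.2G mark.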
