Efficient memory management in R


I have 6 GB of memory on my machine (Windows 7 Pro, 64-bit), and in R I get:

> memory.limit()
[1] 6141

Of course, when dealing with big data, a memory allocation error occurs. To make R use virtual memory, I use:

> memory.limit(50000) 

Now, when running my script, I no longer get a memory allocation error, but R hogs all the memory on the computer and I can't use the machine until the script is finished. I wonder if there is a better way to make R manage the machine's memory. I think it should only use virtual memory once it is using more physical memory than the user specified. Is there an option for that?
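Before raising the limit, it can help to see where the memory is actually going. A minimal sketch using R's built-in diagnostics (the `memory.size()` helpers are Windows-only, matching the setup above; `my_big_df` is a hypothetical object name):

```r
# How much memory R is using right now, and the session's peak (Windows-only)
memory.size()             # MB currently in use
memory.size(max = TRUE)   # peak MB used this session

# Size of a specific object (my_big_df is a placeholder for your own data)
object.size(my_big_df)

# Drop objects you no longer need and force garbage collection,
# which also prints a summary of memory in use
rm(my_big_df)
gc()
```

Freeing intermediate objects with `rm()` followed by `gc()` between steps of a script is often enough to stay under the physical-memory ceiling without touching `memory.limit()`.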

Look at the ff and bigmemory packages. They use functions that know about R objects to keep those objects on disk, rather than letting the OS handle it (the OS only knows about chunks of memory, not what they represent).
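As a rough illustration of the disk-backed approach, here is a minimal sketch using the ff package (assumes `install.packages("ff")` has been run; the sizes and chunking scheme are arbitrary choices for the example):

```r
library(ff)

# Create a 10-million-element double vector backed by a file on disk;
# only small pages of it are held in RAM at any one time.
x <- ff(vmode = "double", length = 1e7)

# Fill it in chunks rather than materializing 10 million values at once.
chunk_size <- 1e6
for (i in seq(1, length(x), by = chunk_size)) {
  idx <- i:min(i + chunk_size - 1, length(x))
  x[idx] <- rnorm(length(idx))
}

# Indexing works much like an ordinary vector.
mean(x[1:1000])
```

bigmemory offers the analogous `filebacked.big.matrix()` for matrices, which can also be shared between R processes via its descriptor file.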

