My general rule of thumb is that you should have 3-4 times as much RAM as the
largest object you are working with, so hopefully you have at least 4 GB
of RAM on your system. Also, exactly what processing (packages, functions,
algorithms, etc.) are you doing? Some functions create multiple copies of
the data, or temporary objects larger than the original, so help us
out and provide more information. You might be able to add virtual memory,
but paging may slow your processing down considerably. If you do go
that direction, learn how to use the performance-monitoring tools on
your system to see what is happening.
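To see what your objects actually cost, and whether a function is silently copying them, here is a minimal base-R sketch (no packages assumed; note that `tracemem()` only works if your R build has memory profiling enabled, which the standard CRAN binaries do):

```r
## Create a moderately sized object: 1e6 doubles is roughly 8 MB.
x <- matrix(rnorm(1e6), ncol = 100)

## How big is the object itself?
print(object.size(x), units = "MB")

## gc() reports R's current memory usage (the "used" column) and
## collects objects you have already removed with rm().
gc()

## tracemem() prints a message whenever R duplicates the object --
## those hidden copies are what blow past your RAM.
tracemem(x)
y <- x          # no copy yet: R uses copy-on-modify
y[1, 1] <- 0    # modifying y triggers the actual duplication
untracemem(x)
```

Running something like this before and after each step of your analysis will usually show which call is creating the 1.8 GB allocation the error message complains about.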

Jim Holtman
Data Munger Guru

What is the problem that you are trying to solve?
Tell me what you want to do, not how you want to do it.

On Sun, Nov 22, 2015 at 10:08 AM, Tamsila Parveen via R-help wrote:

Hello, is there anyone who can help me resolve a memory issue in R? When
I try to analyze data from a 1 GB file, R returns: Error: cannot
allocate vector of size 1.8 Gb. I tried on Linux as well as on
Windows, on a 64-bit system using 64-bit R 3.2.2. If anyone
knows, please guide me to resolve this issue.

R-help at r-project.org mailing list -- To UNSUBSCRIBE and more, see
PLEASE do read the posting guide
and provide commented, minimal, self-contained, reproducible code.


Discussion overview
group: r-help
posted: Nov 22, '15 at 3:08p
active: Nov 24, '15 at 2:07a