[BlindRUG] Problem with processing capacity.

shey lontum sheylontum at gmail.com
Wed Mar 25 04:19:42 UTC 2020


Greetings to the BlindRUG community. Please, I need help with the
following problem:

I am working on a modeling project, part of which involves fitting a
Bayesian model with a uniform prior. I am using the bas.lm function
from the BAS package. The project is written in R Markdown and
rendered with the rmarkdown package.
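For context, a minimal sketch of the kind of chunk described above, with hypothetical data and formula (the actual variables and model are the author's; only bas.lm, the BAS package, and the uniform() model prior are taken from the message):

```r
# Sketch of a Bayesian linear model with a uniform model prior,
# as described in the message, using bas.lm from the BAS package.
library(BAS)

# Hypothetical small data set standing in for the real one.
set.seed(42)
df <- data.frame(y = rnorm(100), x1 = rnorm(100), x2 = rnorm(100))

fit <- bas.lm(y ~ x1 + x2,
              data = df,
              modelprior = uniform(),  # uniform prior over the model space
              method = "BAS")          # deterministic sampling over all models

summary(fit)
```

Note that with p candidate predictors the model space has 2^p models, so memory use can grow very quickly as predictors are added, which may be relevant to the allocation error below.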

When I try to render the document, execution stalls on the Bayesian
modeling chunk for ages, only to come up later with an error that
says: "cannot allocate vector of size 10.3 Gb". I used memory.limit()
to increase the available memory, and this time the process freezes my
computer entirely for a whole day, and I finally have to terminate it
with a forced shutdown.

I have tried RStudio Cloud and a number of cloud alternatives, but I
struggle to figure out how they work. Please, can someone suggest a
cloud option that works, or any other possible solution to the
problem?

I use a Lenovo ThinkPad with an Intel Core i7 processor (quad core at
2.2 GHz) and 16 GB of RAM. If someone has superior processing
capability and can help me, I can send over the Rmd file and the data.

Regards.
Shey.


-- 
Lontum E. Nchadze (M.Sc)
2017 Mandela Washington Fellow (Georgia State University)
Statistical Analyst (Cameroon Ministry of Finance)
+(237) 677 199 500
"To win a war is not to win every battle, but to win crucial battles."
