Memory Management for OoS-Analysis

jonasdovern
Posts: 97
Joined: Sat Apr 11, 2009 10:30 am

Memory Management for OoS-Analysis

Unread post by jonasdovern »

Dear RATS-Team,
I have trouble understanding the memory demands of RATS during an out-of-sample forecasting analysis I am performing with a really large VAR model (a global VAR of the type proposed by Pesaran et al. in a sequence of papers).

The basic structure of my code is as follows. In a first exercise I do the OoS analysis for a single GVAR specification:

Code: Select all

* There is an array that is used to store the forecasts made at each time for different horizons (dimensions refer to #countries[#variables[#fsteps]]; further dimensioning is done elsewhere)
decl vec[vec[vec[ser]]] gvarfcts(nc) 
do time=simstart,simend
   * This is a procedure that estimates the GVAR-Model, produces forecasts. This uses a lot of internal (local) arrays and also produces a number of global arrays that are used to return results
   @GVAR(...) ...
   * In a second step the forecasts are stored to the gvarfcts-array
   do c=1,nc
      do v=1,nv
         do f=1,fsteps
            set gvarfcts(c)(v)(f) time+f-1 time+f-1 = %GVARFCTS(c)(v)(t)
         end do f
      end do v
   end do c
end do time
This works fine, and at the end 169,747,414 bytes are in use by RATS.

Now, in a second exercise, I would like to add a loop over nm different model specifications. As a first attempt to save memory, I do not keep all of the forecasts (which would let me aggregate them after the entire OoS analysis); instead, at each simulation time I store only the nm individual-model forecasts, average them after looping over all models, and reuse the same temporary array at the next simulation time. The basic code looks as follows:

Code: Select all

* There is an array that is used to store the (average) forecasts made at each time for different horizons (dimensions refer to #countries[#variables[#fsteps]]; further dimensioning is done elsewhere) ...
* ... and an additional array that temporarily stores the forecasts from individual models (dimensions refer to #models[#countries[#variables[#fsteps]]]; further dimensioning is done elsewhere) 
decl vec[vec[vec[ser]]] gvarfcts(nc) 
decl vec[vec[vec[vec]]] tempvec(nm)
do time=simstart,simend
do m=1,nm
   * This is a procedure that estimates the GVAR-Model, produces forecasts. This uses a lot of internal (local) arrays and also produces a number of global arrays that are used to return results
   @GVAR(options[m]) ...
   * In a second step the forecasts are stored to the tempvec-array
   do c=1,nc
      do v=1,nv
         do f=1,fsteps
            comp tempvec(m)(c)(v)(f) = %GVARFCTS(c)(v)(time+f-1)
         end do f
      end do v
   end do c
end do m
* At this point, I average the forecasts stored in tempvec and store these in the gvarfcts-array
@aggregatefc gvarfcts tempvec time 
end do time
I expected this to use only a little more memory, since RATS needs to store the new tempvec. What happens, however, is that the loop stops at some point with the message "a request for additional memory cannot be ...". RATS is using something like 1,700,000,000 bytes at that point. The local and global arrays that are used/created by @GVAR are the same at each call of the procedure. Does it nevertheless use more memory to run the procedure twice rather than only once? How can I make my program release memory at the end of each loop iteration? It is not clear to me which of the arrays that I do not need for storing the basic forecast results are the ones using up all the memory.
TomDoan
Posts: 7814
Joined: Wed Nov 01, 2006 4:36 pm

Re: Memory Management for OoS-Analysis

Unread post by TomDoan »

Memory leaks are usually the result of re-dimensioning upwards things like VECTORS, and a VEC[VEC[VEC[SERIES]]] would be a prime candidate for that. Does that (in each dimension) have the same dimensions each time through? (The problem is that you keep needing a bigger block each time the size goes up, and it can sometimes be hard to "garbage collect" the unused space).
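To make that failure mode concrete, here is a small sketch in Python (not RATS; the function and sizes are purely hypothetical): if an array is re-dimensioned upward every iteration, each resize requests a fresh, larger block, so the memory requested over the whole loop far exceeds a single allocation at the final size, and the freed smaller blocks may be too fragmented to reuse.

```python
# Illustration of why repeated upward re-dimensioning is expensive:
# every upward resize pays for a full new block at the new size.

def bytes_requested(sizes, item_bytes=8):
    """Total bytes requested if the array is reallocated from
    scratch at each new size (the re-dimensioning pattern)."""
    return sum(s * item_bytes for s in sizes)

# Growing case: the array is re-dimensioned upward every iteration.
growing = [100 + t for t in range(1000)]
# Fixed case: allocated once, at the maximum size, before the loop.
fixed = [max(growing)]

print(bytes_requested(growing))  # cumulative cost of all the resizes
print(bytes_requested(fixed))    # one allocation covers all iterations
```

The contrast is why dimensioning such arrays once, before the loop, avoids the leak-like growth described above.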
jonasdovern

Re: Memory Management for OoS-Analysis

Unread post by jonasdovern »

The vec[vec[vec[vec]]] tempvec is actually dimensioned only once, before the loop structure for the OoS analysis starts, and its structure is not changed at all during the loop. The same holds true for the vec[vec[vec[ser]]] gvarfcts (though during each iteration one additional forecast is stored into the series).

The only things that can change dimensions during each call of @GVAR are local arrays used during the estimation of the model. These change size due to the increasing sample and the different specifications (e.g. lag orders). Would it help to explicitly release the memory used for these local arrays at the end of the procedure?
TomDoan

Re: Memory Management for OoS-Analysis

Unread post by TomDoan »

Are they local arrays or local arrays of series? And how big are they likely to be?
jonasdovern

Re: Memory Management for OoS-Analysis

Unread post by jonasdovern »

Both: @GVAR itself constructs arrays of series with data (e.g. foreign variables for each country-specific model that are trade-weighted averages of the data of all other countries), which will be on the order of 30-by-7 (vec[vec[series]]). Furthermore, several sub-procedures also construct a lot of other simple arrays: often country-specific ones of size k-by-T, where k is the model size (which differs across countries but can be as large as 20) and T, the sample size, is roughly 125; but also arrays of results, which will be of order 30(-by-7)-by-something.

So yes, the model is really large, but it does not seem to use too much memory when it is estimated only once.

I would send you the files, but it is probably several thousand lines of code spread over many files, and I am not sure I could put together a package that runs on a different computer at the first attempt.
jonasdovern

Re: Memory Management for OoS-Analysis

Unread post by jonasdovern »

I tried to include a release command to free up the memory used by local arrays at the end of most of the procedures that I use. This does not, however, seem to reduce the problem.

Does writing arrays to text files use up a lot of memory?

Update: Even if I use the procedure to estimate the same model multiple times, each call of @GVAR increases the memory used by RATS:

Code: Select all

In use by RATS       24724882 bytes

In use by RATS       37929488 bytes

In use by RATS       41488530 bytes

In use by RATS       44379346 bytes

In use by RATS       47234318 bytes
The first status is from before running the procedure. The first estimation adds about 13,000k bytes to the memory that RATS uses. Each further estimation of exactly the same specification adds roughly another 3,000k bytes.
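The increments implied by those readings can be checked directly; a small Python sketch, using only the byte counts quoted above:

```python
# Successive "In use by RATS" readings, in bytes, as quoted above,
# and the per-call increments they imply.
readings = [24724882, 37929488, 41488530, 44379346, 47234318]
deltas = [b - a for a, b in zip(readings, readings[1:])]
print(deltas)  # first call ~13.2 MB; each repeat ~2.9-3.6 MB more
```

So even with an identical specification, each call leaves roughly 3 MB behind, which is consistent with something allocated inside @GVAR not being reclaimed between calls.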
jonasdovern

Re: Memory Management for OoS-Analysis

Unread post by jonasdovern »

While I couldn't solve the real memory-management problem in RATS, I found the following way around it: I now make the outer loop run over the different model specifications and save the forecasts from each model to an Excel file before overwriting them with the forecasts from the next model.

Later, in a separate RATS program, I average the different forecasts by reading them back from the Excel files. Not very convenient ... but it works.
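For readers who want the shape of this workaround in a language-neutral form, here is a Python sketch (not RATS; the file names and the forecast-generating function are hypothetical stand-ins): one file per model is written inside the model loop, so only one model's forecasts live in memory at a time, and a second pass reads the files back and averages.

```python
import csv
import os
import tempfile

def run_model(m, n_fcst=4):
    # Stand-in for the real per-model forecasts (e.g. from @GVAR).
    return [float(m + h) for h in range(n_fcst)]

outdir = tempfile.mkdtemp()
nm = 3  # number of model specifications

# Pass 1: write each model's forecasts to its own file, then move on,
# so memory use does not grow with the number of models.
for m in range(nm):
    with open(os.path.join(outdir, f"model_{m}.csv"), "w", newline="") as fh:
        csv.writer(fh).writerow(run_model(m))

# Pass 2 (could be a separate program): read the files back and average.
rows = []
for m in range(nm):
    with open(os.path.join(outdir, f"model_{m}.csv"), newline="") as fh:
        rows.append([float(x) for x in next(csv.reader(fh))])
avg = [sum(col) / nm for col in zip(*rows)]
print(avg)
```

The trade-off is exactly the one described above: an extra read/write pass per model in exchange for memory use that stays flat across specifications.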