Hi,
I am observing that my VEE application uses too much memory, especially on file reading operations.
If I read a file in a loop, like the unix tail program does, my laptop runs out of memory very quickly and the VEE application crashes.
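For reference, the loop pattern I mean is roughly this (a minimal Python-style sketch of the idea, since VEE programs are graphical; print is just a placeholder for my real handling):

    import time

    def tail(path, poll_s=1.0):
        # Open the file once and remember the read position
        # instead of re-reading the whole file on every pass.
        with open(path, "r") as f:
            f.seek(0, 2)  # start at the end of the file, like tail -f
            while True:
                line = f.readline()
                if line:
                    print(line, end="")  # placeholder: handle the new line
                else:
                    time.sleep(poll_s)   # poll until the writer appends more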
I also observe that other VEE components (I am trying to track them down) use memory and do not free it. If my application runs for a long time, memory fills up again.
How is memory handled by VEE? Does anybody have tips on how to avoid this kind of problem?
Thanks.
Win7 32-bit and VEE 9.2
Bill Ossmann (2005 vrf):
I had a similar problem myself a few years ago. I had only about 2k points per trace, but I was passing the data along through many objects to extract measurements. With debug features turned on, VEE saves a copy of the data on every input pin, so pretty soon I ran out of virtual memory. I could not turn the debug features off because I needed to use them in development.
What worked for me was to save the data in global variables with names derived from a time stamp, e.g., "x"+asText(intPart(1000*now())). Then I could just pass the variable name around with the extracted measurements, basically just using a pointer. Depending on how much data you have, and how many objects you have to pass it through, this may or may not be useful.
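In pseudocode terms the pattern is something like this (a minimal Python sketch of the idea, not actual VEE code; the GLOBALS dict stands in for VEE's global variable table):

    import time

    GLOBALS = {}  # stand-in for VEE's global variable table

    def store_trace(data):
        # Name derived from a millisecond timestamp, mirroring the VEE
        # expression "x" + asText(intPart(1000*now())).
        name = "x" + str(int(1000 * time.time()))
        GLOBALS[name] = data
        return name  # pass this short string around instead of the data

    def get_trace(name):
        return GLOBALS[name]

    def free_trace(name):
        del GLOBALS[name]  # delete when finished, or the globals become the leak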
Some observations:
1. Turning debug features off could save lots of memory if your data pass through multiple objects.
2. If you can't turn debug features off, using pointers as described above or files as Shawn suggested might help; alternatively, try to minimize the number of objects the data pass through.
3. As Shawn points out, you are dealing with a gross amount of data; just plotting it will chew up loads of virtual memory, and likely be slow besides.
4. Given 3, give serious thought to Shawn's suggestion to plot a decimated (thinned) set of points. An on-screen display will not show a horizontal resolution of 2M points. Of course, if you want to be sure to catch narrow spikes, etc., this will take some intelligent selection of points (see the sketch below). If you want to be able to zoom in to small sections, that is probably even more tedious.
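One such intelligent selection is min/max decimation: keep the smallest and largest sample of each bucket so narrow spikes survive the thinning. A minimal Python sketch of the idea (max_points is an assumed display budget, not a VEE setting):

    def decimate_minmax(y, max_points=2000):
        # Plain striding (y[::k]) can step right over a narrow spike;
        # keeping each bucket's min and max preserves the envelope.
        n = len(y)
        if n <= max_points:
            return list(y)
        bucket = -(-2 * n // max_points)  # ceil(2n/max_points); 2 samples kept per bucket
        out = []
        for i in range(0, n, bucket):
            chunk = y[i:i + bucket]
            out.append(min(chunk))
            out.append(max(chunk))
        return out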