Hi Shahin,
I use build 2436 and execute Python code to control the light engine. After a couple of hours the code stops working, but the plate still goes up and down. When I checked the memory usage, I found that it grows by about 30% every time I add a plate, until at about 85% commands are no longer executed. I was surprised that even if I delete the plate, the memory is not freed; the only way to free it is to restart the Pi. Another thing: when I disable the 3D preview, the memory increase per added plate is smaller. Is there any way to free the memory without rebooting?
Offline
Hi,
You just add a plate and memory keeps adding up? Very strange; I will check and see if I can reproduce it.
Offline
Memory usage is pretty stable in my tests. Make sure it has nothing to do with the Python calls.
Offline
Upgraded to 2454.
None of the Python scripts run when adding plates. Test results for a 42 MB plate:
boot up NanoDLP: memory 2%
adding first plate: memory 34%
waiting for slicing to finish: still 34%, no decrease in memory usage
adding the same plate again: 57%
adding it again: 57%
adding a 150 MB plate: CPU 104% and 101%!
CPU usage drops after slicing finishes, but memory does not.
Hardware: RPi 3 B+.
Installed a fresh image from the site; the same thing happened.
Checking memory with htop over SSH, the memory used by ./printer stays the same even after removing the plate; memory usage does not decrease.
How long does it take to free the memory after slicing?
Last edited by F.m (2019-12-01 12:49:53)
Offline
It is OK; it is not supposed to decrease after slicing. The process keeps the memory and does not release it until the OS demands it.
My guess about the increase after the second upload is that the second plate's validation starts before the first one's has finished, so memory usage rises.
Offline
Thanks Shahin, but the problem is that when memory usage goes above 70%, the "exec" command that runs the Python script is no longer executed by NanoDLP, and I need to reboot NanoDLP to free the memory. Interestingly, when I run the script from bash, it works fine even with memory usage above 70%.
Last edited by F.m (2019-12-02 09:42:31)
Offline
Is there any way to free the memory each time it increases, without interrupting the NanoDLP job?
Offline
What is the error?
Try running the command as sh -c your_command in a terminal and see if it still runs.
Offline
As I said in my previous post, after memory usage goes above 70% NanoDLP does not execute the command, but in bash (over PuTTY) the Python script works fine when started with "python script_address". The problem looks like the one in this post: https://www.nanodlp.com/forum/viewtopic.php?id=1358
Last edited by F.m (2019-12-07 14:23:15)
Offline
Run your script using
sh -c python script_address
and see if it still works in the terminal after you have reached 70%.
Offline
When I run sh -c python (my script address) in the terminal, it just opens Python and does nothing. The scripts are located at /home/pi.
pi@raspberrypi:~ $ sh -c python /home/pi/PROJ_ON.py
Python 2.7.16 (default, Oct 10 2019, 22:02:15)
[GCC 8.3.0] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>>
Last edited by F.m (2019-12-08 06:26:52)
Offline
Does it work correctly at lower memory usage?
Offline
"sh - c python script_address" in bash not working on high or low memory and just opens python program. but, as i told before, "python script_address" in bash works always in high or low memory
Offline
Use "sh -c python script_address" instead of "sh - c python script_address"
Offline
I typed that on my mobile phone and it inserted a space automatically. See the results below: I typed "sh -c python script_address" and it did nothing, just opened Python without running the script, at both high and low memory usage.
pi@raspberrypi:~ $ sh -c python /home/pi/PROJ_ON.py
Python 2.7.16 (default, Oct 10 2019, 22:02:15)
[GCC 8.3.0] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>>
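The behaviour in the transcripts above is standard POSIX sh semantics rather than a memory problem: `sh -c` takes only the next word as its command string, and any word after that becomes `$0`, so `python` starts interactively with no script argument. Quoting the whole command fixes it. A minimal demonstration, using `echo` instead of the actual script path:

```shell
#!/bin/sh
# Unquoted: sh executes just "echo"; "hello" is assigned to $0,
# so nothing is printed.
out1=$(sh -c echo hello)
echo "unquoted: [$out1]"   # prints: unquoted: []

# Quoted: the whole string is the command, so the argument survives.
out2=$(sh -c 'echo hello')
echo "quoted: [$out2]"     # prints: quoted: [hello]

# The same applies to the script in this thread:
#   sh -c 'python /home/pi/PROJ_ON.py'
```

With the quoted form, the 70% test becomes meaningful, since `python` actually receives the script path instead of dropping it into `$0`.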
Offline
Mr. Shahin, any comments?
Offline
Not sure about this issue.
Offline
Is there any way to free the memory without a reboot?
Offline
Call http://ip/printer/mem/free on the latest beta and see if it helps.
Offline
It did not have any impact on the occupied memory. Why can't NanoDLP run the script when memory usage is high, while it runs fine from bash?
Offline
Dear Shahin, I upgraded to an RPi 4 with 4 GB of memory and the memory problem is solved. But I would like to know whether there is a memory limiter in NanoDLP, because on lower builds the memory never goes above a certain level.
Last edited by F.m (2019-12-28 12:45:37)
Offline
Memory usage on older builds also goes up after plate upload, so nothing new in that regard.
There is functionality in NanoDLP that slows down processing when memory usage gets high.
Offline
Here is a screenshot from the latest beta build: memory goes above 90% (see the memory level in the image) and it restarts. The plate size is 140 MB.
It does not release memory until restart.
Last edited by F.m (2020-01-15 12:16:47)
Offline
I need the STL file and a debug file to reproduce it locally. That is very high memory usage for such a small file.
Offline
When the 3D preview is off, memory usage does not increase this much. How can I send the STL file?
Offline