Everything 1.4.0.713b Beta for Windows
Windows 7 x64
I tried to do Folder Indexing on a very big network share.
After a few hours of indexing I got this error:
---------------------------
Everything
---------------------------
.\src\mem.c(566): mem_alloc(): Fatal error: out of memory 00010000
---------------------------
OK
---------------------------
I got the same error after adding this big share in the File List Editor; it ran for an hour with the window showing "Everything (Not responding)".
1) Can you write new data to disk in chunks to avoid this memory bloat?
2) Can you show some progress during Folder Indexing and while adding File Lists? I.e. the number of files processed and time elapsed, maybe with the total count and an estimated time remaining?
3) What is the easiest way to generate an EFU file list on Linux? Maybe a bash/awk/python script?
Out of memory during Folder Indexing and adding File List
Re: Out of memory during Folder Indexing and adding File List
> very big network share
How big?
EFU is a CSV.
Filename,Size,Date Modified,Date Created,Attributes
Full filename in quotes
(Unicode is allowed, if I'm saying that correctly)
Size in decimal
(bytes)
Dates, I don't recall which method is used, but it is not based on Unix time
(I seem to recall it being mentioned on these forums?)
Attributes, not sure?
https://en.wikipedia.org/wiki/65536_%28number%29 (0x00010000 = 65536, presumably the size in bytes of the allocation that failed)
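For question 3, here is a minimal Python sketch based on the format described above. Two things are guesses, not confirmed in this thread: that the dates are Windows FILETIME values written as decimal 100-nanosecond ticks since 1601-01-01 UTC, and that Attributes is a Windows attribute bitmask (16 for a directory). Check the output against a small EFU exported by Everything itself before trusting it.

Code:
#!/usr/bin/env python3
# Sketch of an EFU generator for Linux.
# Assumptions (not confirmed here): dates are Windows FILETIME ticks,
# Attributes is a Windows attribute bitmask (16 = FILE_ATTRIBUTE_DIRECTORY).
import csv
import os
import stat
import sys

EPOCH_DIFF = 11644473600  # seconds between 1601-01-01 and 1970-01-01

def to_filetime(unix_seconds):
    # Convert a Unix timestamp to 100-ns ticks since 1601-01-01 UTC.
    return int((unix_seconds + EPOCH_DIFF) * 10_000_000)

def write_efu(root, out):
    # Quote every field; the EFU description above only requires the
    # filename to be quoted, but standard CSV readers accept both.
    writer = csv.writer(out, quoting=csv.QUOTE_ALL)
    writer.writerow(["Filename", "Size", "Date Modified", "Date Created", "Attributes"])
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path, follow_symlinks=False)
            except OSError:
                continue  # unreadable entry: skip it
            is_dir = stat.S_ISDIR(st.st_mode)
            writer.writerow([
                path,
                "" if is_dir else st.st_size,          # size in decimal bytes
                to_filetime(st.st_mtime),
                to_filetime(st.st_ctime),              # Linux has no true creation time
                16 if is_dir else 0,
            ])

if __name__ == "__main__":
    if len(sys.argv) != 3:
        sys.exit("usage: make_efu.py <root> <output.efu>")
    with open(sys.argv[2], "w", newline="", encoding="utf-8") as f:
        write_efu(sys.argv[1], f)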
Re: Out of memory during Folder Indexing and adding File List
There are 6,800,000 files in this share.
I made the list with a script; filelist.efu is 1.2 GB.
Now Everything crashes while loading it:
---------------------------
Everything
---------------------------
.\src\mem.c(566): mem_alloc(): Fatal error: out of memory 01b53d74
---------------------------
OK
---------------------------
At that point the Everything process was using ~700 MB of RAM (working set)
and 4 GB of RAM was still free.
So why not load it in chunks?
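For what it's worth, one possible workaround until the loader itself streams in chunks: split the 1.2 GB EFU into several smaller file lists, each repeating the header row, and add them one at a time. A rough Python sketch (the chunk size and output names are made up, and it assumes no filename contains an embedded newline):

Code:
#!/usr/bin/env python3
# Hypothetical workaround, not a feature of Everything: split a huge EFU
# into smaller file lists so they can be added one at a time.
import sys

def split_efu(src, lines_per_chunk=1_000_000):
    with open(src, "r", encoding="utf-8", errors="replace") as f:
        header = f.readline()          # keep the CSV header for every chunk
        out, part, count = None, 0, 0
        for line in f:
            if out is None:
                part += 1
                out = open(f"{src}.part{part:03d}.efu", "w", encoding="utf-8")
                out.write(header)
                count = 0
            out.write(line)
            count += 1
            if count >= lines_per_chunk:
                out.close()
                out = None
        if out is not None:
            out.close()

if __name__ == "__main__":
    split_efu(sys.argv[1])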
Re: Out of memory during Folder Indexing and adding File List
Good Evening @md55
Have a look at this post: viewtopic.php?f=5&t=5279. I had a very similar issue, and after I defragmented my HDD all was good.
Hope it helps.