Export file list automatically from running process

Hello everybody,

I would like to achieve the following and unfortunately cannot find an option in the help. Maybe someone can help me or show me a creative workaround.

I have a running instance of Everything on a file server with ~6 million files. I would like to export the file list to a text file on a regular basis (GUI -> File | Export). In the GUI this is no problem, but I would like to automate it and run it every 3 hours.

However, if I start the export via the CLI with the parameter "-create-filelist", a new instance of Everything is started, which reads the 6 million files, uses a lot of memory and takes several minutes. Is there a way to have the export executed by the already running instance? In the GUI it only takes a few seconds.
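As a side note on the every-3-hours part, here is a minimal sketch of how a periodic run could be registered with the built-in Windows schtasks tool; the task name and the script path are placeholders, and the export command itself is what the replies below address:

rem Sketch only: run a (placeholder) export script every 3 hours via Task Scheduler.
rem "Everything export" and C:\scripts\export-filelist.cmd are placeholder names.
schtasks /create /tn "Everything export" /tr "C:\scripts\export-filelist.cmd" /sc hourly /mo 3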
Re: Export file list automatically from running process
Please try the command line interface: ES.
With ES you can automate exporting your Everything index to a text file, file list or csv file:
es.exe -export-txt backup.txt
-export-txt <out.txt>
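A minimal batch sketch of what could be put into a scheduled task like the one above, assuming es.exe is on the PATH (or next to the script) and using C:\exports\filelist.txt as a placeholder output path:

@echo off
rem Sketch only: export the index of the already running Everything instance
rem to a plain text file. ES queries the running instance, so no second
rem instance has to load the 6 million files.
es.exe -export-txt C:\exports\filelist.txt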
Re: Export file list automatically from running process
Thanks, that's what I was looking for.
Re: Export file list automatically from running process
@void:
Would using db2efu be an option too?
Like this:
Everything.exe -update
db2efu.exe Everything.db out.efu
The EFU created this way is slightly different from the ES.exe version. Don't know if that matters.
Can't judge what would be faster (I don't have 6 million files ...), but it looks like it uses less RAM at least (which surprises me).
Re: Export file list automatically from running process
Since the running Everything process with the 6 million files is used anyway, it's not a problem that ~1.3 GB of RAM is in use. But if the export is started via the Everything CLI, another ~1.3 GB of RAM is used, and that shouldn't have to be the case. I don't care whether es.exe or db2efu.exe occupies a few MB of RAM more or less. Both tools seem to do their job.
Thanks.
Re: Export file list automatically from running process
Would using db2efu be an option too?
Yes, this would work too.
You could also back up only the Everything.db if space is a concern.
The important thing here is you call Everything.exe -update to force Everything to flush its database to the Everything.db on disk.
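A minimal sketch of that backup-only approach, assuming an installed (non-portable) Everything that keeps its database in the default %APPDATA%\Everything location; the destination folder is a placeholder:

rem Sketch only: ask the running instance to flush its index to Everything.db,
rem then copy the database file away.
rem %APPDATA%\Everything\Everything.db is the default location for an installed
rem Everything and may differ on your setup; D:\backup is a placeholder.
Everything.exe -update
copy /y "%APPDATA%\Everything\Everything.db" "D:\backup\Everything.db"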
The EFU created this way is slightly different from the ES.exe version. Don't know if that matters.
The columns are ordered differently: ES leaves unknown extended information (such as date created) blank, while db2efu will use 0 for unknown extended information.
You may wish to zip your exported filelist.
I recommend the 7z command line interface.
"C:\Program Files\7-Zip\7z.exe" a filelist.zip filelist.txt
Re: Export file list automatically from running process
Thanks for your tips. They all work fine. My goal is to make several file servers / desktops etc. searchable from my workstation. To achieve this, the running computers simply generate a file list export every x hours and store it on a file share. The EFU files are then read by Everything at the workstation. So the search results are not always, but mostly, up to date. Unfortunately, an ETP server does not work m:n. That would be perfect.
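For that setup, a rough per-machine sketch; \\nas\everything is a placeholder share, and the -export-efu switch name is my assumption for ES's file list export, so it should be checked against es.exe's help output:

@echo off
rem Sketch only: scheduled on each file server / desktop, this exports the
rem local Everything index as an EFU named after the machine and drops it on
rem the shared folder, where the workstation's Everything reads it as a file list.
es.exe -export-efu "\\nas\everything\%COMPUTERNAME%.efu"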