systemx17 Posted July 22, 2009

Any suggestions? I want some software that will let me store an index of files, paths, and attributes from various hard drives that I no longer access regularly. I originally planned to head down the FreeNAS path, but I don't need the drives chewing power that often.

- SystemX17
TheFu Posted July 22, 2009

Is this a trick question? `ls -lR > file_list.txt` has worked for years. In fact, I maintain my DVD collection this way, numbering the disks and saving the output of `ls -lR > disk-###.txt` for each one.

`find / -ls > diskname.txt` might be closer to what you want, since it captures user, group, permissions, and filename on a single line. (Note: `-ls -print` would list each entry twice; `-ls` alone is enough.)

If you want full-content indexing - lots of .doc, .pdf, .txt, .html, etc. files - then take a look at http://swish-e.org/ and htdig. My collection gets indexed this way, so a search finds these text files too.

You might also look at "COPS" from 15 years ago. I think it is commercial now, but the older versions could track ownership, groups, permissions, directories, and files to see if anything changed. It was used by sysadmins to tell whether their systems had been changed by crackers. http://docstore.mik.ua/orelly/networking/tcpip/ch12_04.htm
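A minimal sketch of the per-disk index idea above. The directory names and the `disk-01.txt` output name are made up for illustration; a real run would point `find` at the drive's mount point instead of a scratch directory:

```shell
# Stand-in for a mounted drive (hypothetical contents)
DRIVE=$(mktemp -d)
mkdir -p "$DRIVE/photos"
touch "$DRIVE/photos/img001.jpg" "$DRIVE/readme.txt"

# One line per entry: inode, blocks, perms, links, owner, group, size, date, path
find "$DRIVE" -ls > disk-01.txt

# The drive can now go offline; the index stays searchable with plain grep
grep 'img001.jpg' disk-01.txt
rm -rf "$DRIVE"
```

Once the drive is unplugged, `grep` against `disk-01.txt` answers "is it on this disk?" without spinning anything up.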
systemx17 Posted July 23, 2009 Author

Hmm, yeah, I didn't quite think that through. Since I'm on Vista, I can just run:

dir *.* /S > .\drivemap.txt

then import the output via PHP into my MySQL cluster for fast searching. Thanks heaps for the push in the right direction. I did check that swish-e link, but I'm only doing this for non-web files.

Cheers, SystemX17
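The import-into-MySQL step above can be sketched without a database at all: with one plain-text map per drive, `grep -l` alone tells you which offline disk holds a file. The drive labels and filenames below are made up, and `grep` here is a deliberately simple stand-in for the PHP/MySQL search:

```shell
# Hypothetical per-drive maps, one path per line (as dir /S might yield after cleanup)
cat > drivemap-disk01.txt <<'EOF'
D:\archive\photos\img001.jpg
D:\archive\photos\img002.jpg
EOF
cat > drivemap-disk02.txt <<'EOF'
E:\backup\docs\taxes-2008.pdf
EOF

# -l prints only the map files that match, i.e. which disk to fetch off the shelf
grep -l -i 'taxes' drivemap-*.txt
```

A database earns its keep once the maps grow to millions of lines or you want attribute queries (size, date); below that, flat files and `grep` are hard to beat.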