Monday, June 20, 2011 4:25 PM
I'm looking at running the OMPM tool on network file shares. I know there are a lot of variables (complexity of macros, etc.), but is there a general guideline on how long the scanning tool takes to run?
E.g., 60 minutes per 100,000 files?
I have a client who would like a rough time estimate of how long this might take for resource scheduling purposes.
Monday, June 20, 2011 7:06 PM
I'm afraid we don't have specific guidance in this area yet. As you might guess, performance varies and depends on multiple factors. We work closely with consultants who use OMPM and publish their findings from time to time on our blog. The best I can do is suggest you monitor the blog for updated OMPM information as it becomes available.
- Marked as answer by Harry Yuan, Thursday, June 23, 2011 7:44 AM
Thursday, June 23, 2011 9:10 PM
I have been working as an Office consultant for three years now and have delivered several OMPM projects. And I can tell you one thing: there is no rule of thumb. It really depends on your network and how many network segments lie between your scanning machine and your target file share.
I would recommend the following approach:
Configure one scan job to "flat scan" your targeted shares (set DeepScan=0 in the offscan.ini) and kick it off. A flat scan is really fast; it should scan several thousand files per minute. When the scan finishes, the console output will state the number of XML files created (that information is also written to the log file). That gives you a rough estimate of your number of Office files. E.g., "number of xmls created 100000" means you have about 100,000 Office documents on your file share.
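As a sketch, the flat-scan configuration described above might look like this in offscan.ini (DeepScan=0 comes from the post; the other section and key names are from memory of the OMPM tool and may differ in your version, so compare against the offscan.ini that ships with it; the paths are hypothetical):

```ini
[Run]
; Flat scan: inventory only, no deep macro/compatibility analysis
DeepScan=0
; Hypothetical output location - replace with your own
DestinationPath=C:\OMPM\ScanOutput

[FoldersToScan]
; Hypothetical target share
Folder=\\fileserver01\departments
```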
Configure a second scan job as a deep scan and point it at a share or subdirectory on the same server that holds 50 to 100 GB of data. When the scan finishes, it will show the duration at the console (e.g., "seconds: 600") and the number of XML files created.
Now you can calculate the average time needed per file (divide the duration in seconds by the number of XML files). Multiply that by the number of Office documents discovered in the first scan, and you have the total amount of time you would need if you employ one scan job that scans all files.
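The arithmetic above can be sketched like this (the sample numbers are illustrative, built from the figures quoted in the two steps above):

```python
def estimate_scan_time(sample_seconds, sample_xml_count, total_office_files):
    """Extrapolate total deep-scan time from a sample deep scan."""
    seconds_per_file = sample_seconds / sample_xml_count
    return seconds_per_file * total_office_files

# Sample deep scan: 600 seconds produced 5,000 XML files;
# the flat scan found 100,000 Office documents in total.
total = estimate_scan_time(600, 5_000, 100_000)
print(f"{total:.0f} s = {total / 3600:.1f} h")  # 12000 s = 3.3 h
```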
Now start to play around with the number of scan jobs and the total number of files you will scan (you can, and should, set a time limit (LastModifiedDate) to reduce the number of files in scope for your scan).
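The date filter mentioned above might be sketched like this (I'm reusing the LastModifiedDate name from the post; verify the exact key name and date format against your own offscan.ini before relying on it):

```ini
[Run]
DeepScan=1
; Illustrative cutoff: only scan files modified since 1 Jan 2009
LastModifiedDate=2009-01-01
```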
Hope that helps, greetings from Germany.
Monday, June 27, 2011 3:15 PM
Thanks for the replies.
@M. Nothnagel Many thanks for sharing your real-world experience and solutions. Much appreciated!
Tuesday, February 21, 2012 8:29 PM
Is there any way to scan an internal SharePoint site using OMPM?
Thursday, November 15, 2012 11:40 PM
As many have posted, there are no generic guidelines for estimating this. In our experience, a deep scan usually takes about 15 seconds per file on average.
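To put that figure in perspective, a quick back-of-the-envelope calculation (the 15 seconds per file is the average quoted above; the 100,000-file share is an illustrative number):

```python
# Rough single-job duration at ~15 s per file for a deep scan
SECONDS_PER_FILE = 15
FILE_COUNT = 100_000  # illustrative share size

total_seconds = SECONDS_PER_FILE * FILE_COUNT
days = total_seconds / 86_400  # seconds per day
print(f"{days:.1f} days with a single scan job")  # 17.4 days
```

At that rate, splitting the work across several parallel scan jobs is clearly worthwhile.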
Friday, November 16, 2012 2:02 PM
I used this robocopy script to discover where the files are, how many there are, etc. Then I performed a couple of light scans and a test deep scan on a small percentage of the total that needed scanning to make my calculations.
I found this script online so I can't take any credit for it but am happy to pass this useful info on.
robocopy \\servername c:\temp\DocIno /xj /w:5 /r:2 /s /ndl /l /if *.xls /if *.xlt /if *.xla /if *.xlc /if *.xlm /if *.ppt /if *.pot /if *.pps /if *.ppa /if *.doc /if *.dot /if *.wiz >> "c:\temp\DocInfo-DocumentInfo.log"
Not only does this help in making your calculations, but it also helps in accurately aiming your scans. Very useful.
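As a sketch, the log produced by the robocopy command above could be tallied per extension with something like this (it assumes the /l /ndl switches leave one file path per line, which may not hold for every robocopy version or locale):

```python
import re
from collections import Counter

# Matches the Office extensions targeted by the robocopy command above
OFFICE_EXT = re.compile(r"\.(xl[satcm]|ppt|pot|pps|ppa|doc|dot|wiz)\s*$",
                        re.IGNORECASE)

def count_office_files(log_lines):
    """Tally Office file extensions found in robocopy /l /ndl log lines."""
    exts = Counter()
    for line in log_lines:
        m = OFFICE_EXT.search(line)
        if m:
            exts[m.group(1).lower()] += 1
    return exts

# Usage (log path taken from the command above):
# with open(r"c:\temp\DocInfo-DocumentInfo.log", errors="ignore") as f:
#     for ext, count in count_office_files(f).most_common():
#         print(f".{ext}: {count}")
```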
I hope it's of some use to you.
Also, distribute your scans as much as possible. This helps in narrowing down errors and speeds things up. One thing I have learned recently is that more is better in terms of the number of machines to run scans from.