Having issues with the files coming up slow

  • Question

  • Hello

    We are using MS Project 2007 and have multiple files with a Master. About a month ago I moved Baseline into Baseline1, which increased the size of the files. However, I am not sure that is the only thing causing more of a delay than we used to get.

    I am having issues with the time it takes to open up the Master file, which has 30 files inserted and linked. It takes a couple of minutes to open, and then when we try to update anything it takes 5 seconds or longer for each update.

    I have listed the sizes of the files below; the total is 110,323 KB.

    We all have at least 2.89 GB of RAM and 2.9 GHz processors.

    I have a few questions:

    1. How can we get this to be faster when we have the Master open and we want to update?

    2. Would running a viewer on all the files through the Master be a lot faster? We have many people who only want to view the files.

    3. Is it normal to get this slow a response with files totalling this size (110,323 KB)?

    4. What causes the file size to grow, and can we cut the file size down in other ways in order to speed up updating when the entire Master is open?

    5. If a viewer is faster, could we use the viewer and then open the individual file and make an update?

    6. Is there something like an external add-on device that would speed up the processing?

    Thanks

    I have 31 files including a Master and I just copied Baseline into Baseline1.

    All of the sizes below are in KB:

    3955
    1712
    4032
    10064
    6416
    2944
    1728
    3856
    5360
    3040
    1472
    1792
    1840
    2128
    12128
    6496
    8960
    1248
    5232
    1760
    1744
    2400
    3008
    768
    1568
    1696
    2240
    992
    2944
    5344
    1456
    Saturday, July 30, 2011 3:48 PM

All replies

  • Firstly, linked files, or any links between any files, are a breeding ground for corruption. If you ever rename a file, move one, or overwrite any of them, then the corruption dice get rolled. It's a question of when, not if, though it may be 5 days or 5 years away. Corruption may be responsible for the slow performance.

    Another cause is a slow or less-than-perfect network. Do not link files over a WAN!

    Try taking a copy of all the files and saving them in a folder on your C: drive. Create a new Master and compare performance. If it's faster, then the difference is due to network performance.
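
    As a rough way to compare, a short Project VBA sketch along these lines times how long the Master takes to open; the path is only a placeholder, so point it first at the network copy and then at the local C: drive copy.

        Sub TimeMasterOpen()
            Dim t0 As Single
            t0 = Timer
            ' Placeholder path - point this at the network copy, then at the local copy
            FileOpen Name:="C:\ProjectCopy\Master.mpp", ReadOnly:=True
            Debug.Print "Open took " & Format(Timer - t0, "0.0") & " seconds"
            FileCloseAll pjDoNotSave
        End Sub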

    Personally, I only ever use masters where projects are inserted with NO link. This creates a snapshot for audit purposes and consolidates all resource information. As there are no links, there is no corruption risk. I record a macro to automate the process (make sure File, New is the first recorded command).
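
    A minimal sketch of that kind of macro is below; the file names and folder are placeholders only, and in practice the Filenames string would be built from the full list of your 30 subprojects.

        Sub BuildSnapshotMaster()
            FileNew                             ' File, New must be the first command
            ' Insert the subprojects with NO link back to the source files
            ConsolidateProjects _
                Filenames:="\\server\schedules\SubA.mpp" & ListSeparator & _
                           "\\server\schedules\SubB.mpp", _
                AttachToSources:=False
            ' Save the stand-alone snapshot with a date stamp for audit purposes
            FileSaveAs Name:="C:\Snapshots\Master " & Format(Date, "yyyy-mm-dd") & ".mpp"
        End Sub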


    Rod Gill

    The one and only Project VBA Book Rod Gill Project Management
    Saturday, July 30, 2011 10:36 PM
    Moderator
  • Hello

    Would it be possible for someone to answer the specific questions listed in my first submittal?

    I am NOT using Project Server, but we are using MS Project 2007 Professional.

    Thanks

    Sunday, July 31, 2011 1:20 PM
  • The "someone" was Rod Gill and he answered.
    --rms www.rmschneider.com
    Sunday, July 31, 2011 3:15 PM
  • Thanks

    I was wondering specifically about these, in case people have experience with files of this size:

    Is it normal to get this slow a response with files totalling this size (110,323 KB)?

    Is there something like an external add-on device that would speed up the processing?

    How can we get this to be faster when we have the Master open and we want to update?

    Would running a viewer on all the files through the Master be a lot faster? We have many people who only want to view the files.

    Sunday, July 31, 2011 3:41 PM
  • Hard to conclude from afar, but as Rod suggested, your files may be corrupted.

    Re the viewer ... contact the vendor of the viewer program about how their product performs. Re an "external add-on device" ... the best performance would of course come from loading from a local drive.

    File loading performance will generally, I think, be related to the machine and its disk speed. Opening from a local machine should be relatively fast. From a remote network drive, depending on bandwidth, it will probably be much slower. If it is really slow, the files could be corrupted ... which is what Rod was suggesting, and what your other question points to as well.


    --rms www.rmschneider.com
    Sunday, July 31, 2011 3:48 PM
  • Hotmail1,

    Rod gave one viewpoint; let me offer a counterpoint, and after that I'll even answer your questions. It just doesn't get any better than that.

    First, I agree with Rod that a linked file structure is fertile ground for corruption. A linked structure is not advisable in the absence of very strict user rules. Working with a linked structure requires a great deal of discipline to create and maintain, and in my opinion most organizations do not have that level of user discipline.

    A bit of history. A linked structure is workable and can exist corruption free. I know because we had such a structure. It consisted of 70 linked files and a master that I rebuilt each month for reporting metrics. The links between subproject tasks predated the introduction of external predecessors/successors; that is, they were achieved using the much more fragile paste-link process. To make things even more interesting, the whole structure resided on a WAN server. Although our subproject files were not as large as yours, we had several hundred inter-project links. We operated the structure for at least 2 years and I do not recall ever having a corruption problem. However, our users received formal training on how Project worked and what they were and were not allowed to do with their individual project files. One person built and maintained the master structure. It was a very disciplined approach and therefore it worked. The bottom line is that a linked file structure does work and it will work over a network.

    Rod suggested copying your files to your local C: drive and checking performance. A word of caution: copying a linked structure can in fact duplicate the link structure and increase the probability of corruption. If the whole linked structure, all 30 subprojects and the master, is in a single folder on a WAN, then moving that whole folder to your local PC should work without problems. We had to do that once when our IT department changed servers.

    Okay, that's my take on a linked file structure. Now to address your questions:

    1. How can we get this to be faster when we have the Master open and we want to update?

    That's a little difficult to answer without actually seeing your whole structure, but generally I suggest that updates be done at the individual subproject level. Then open the master, let Project calculate, and save all files (see the sketch after question 6 below).

    2. Would running a viewer on all the files through the Master be a lot faster? We have many people who only want to view the files.

    I'm not sure what "viewer" you are referring to. A third-party viewer will certainly allow users to view files without making changes to them, but it will have no effect on how fast the files open.

    3. Is it normal to get this slow a response with files totalling this size (110,323 KB)?

    If you really have a total [master] file size of 110 MB, then indeed your structure is huge and I'm not surprised it takes a long time to open. Looking at the sizes you list for each individual file, I wonder if your subproject files really have that much data or if they are just bloated. I know we've discussed your file bloat issues before in other posts on this forum. What I don't know is whether those issues were ever fully resolved or whether you still have residual bloat and/or corruption.

    4. What causes the file size to grow, and can we cut the file size down in other ways in order to speed up updating when the entire Master is open?

    I think we've mentioned before in previous posts that if you have the master and all subproject files fully open (i.e. in the foreground), then the whole structure will be very sluggish. When working with the master, it is best to open it by itself and let all subprojects be open only in background mode.

    5. If a viewer is faster, could we use the viewer and then open the individual file and make an update?

    Again, I don't know what viewer you are referring to, but see my answers to questions 1 and 2 above.

    6. Is there something like an external add-on device that would speed up the processing?

    None that I know of. Since you have had continuing problems with your file structure, I suggest you consider hiring a qualified consultant to come in, analyze your file structure, and make a recommendation. There is just no way we can adequately give you an on-line synopsis.
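
    Going back to question 1, here is a minimal sketch of that update flow (the master path is only a placeholder): make the edits in the individual subproject files first, then open the master, let Project recalculate, and save everything in one pass.

        Sub RefreshAndSaveMaster()
            FileOpen Name:="\\server\schedules\Master.mpp", ReadOnly:=False
            CalculateAll                ' recalculate the consolidated schedule
            FileCloseAll pjSave         ' save the master and the subprojects opened in the background
        End Sub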

    Hope this helps.

    John

    Sunday, July 31, 2011 4:24 PM
  • John

    Thanks for taking the time and answering these questions!

    You mentioned the following (perhaps I am just confused): I'm not sure I understand opening the Master by itself and the subprojects only in background mode. How does one do this?

    "I think we've mentioned before in previous posts that if you have the master and all subproject files fully open (i.e. in the foreground), then the whole structure will be very sluggish. When working with the master, it is best to open it by itself and let all subprojects be open only in background mode."

    There is a lot of data in each of the files in a number of text fields, plus a copy of Baseline saved into Baseline1 throughout ALL the files. I probably don't really need Baseline1, since I can compare against the files the client had the last time we did a baseline.

    Some of these subprojects have 2,500+ tasks with 20+ text fields being used (data).

    Anyway..............

    Sunday, July 31, 2011 6:56 PM
  • Hotmail1,

    What I meant by opening only the master is just that. If you open the master, Project will open the subproject files in the background. However, some users open each subproject and the master; doing so opens everything in the foreground. You can verify that the subprojects are only open in the background by opening the master and then going to Window/New Window. The only project that should appear in the New Window dialog is the master. If any subprojects also appear, it means they are open in the foreground.
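
    If it helps, a small VBA check (nothing in it is specific to your files) can be run after opening only the master; it lists each inserted project and the file it points to, confirming the subprojects are being reached through the master rather than through separately opened, foreground copies.

        Sub ListInsertedProjects()
            Dim sp As Subproject
            For Each sp In ActiveProject.Subprojects
                ' Print the inserted project's summary task name and its source file path
                Debug.Print sp.InsertedProjectSummary.Name & " -> " & sp.Path
            Next sp
        End Sub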

    Baselines are normally used for earned value metrics and for variance analysis, so saving a baseline is good practice. If you do not use earned value and don't care about variances, then you probably could do without a baseline. However, deleting the data already saved in the baseline is not going to have a large impact on your file sizes. I would take a serious look at the 20+ text fields and make sure you really need all that extra data. But again, depending on the type and quantity of data in those fields, deleting it probably won't have a significant impact on file size. The bottom line answer may be that you really need Project Server.
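
    To help with that review, a rough sketch like the one below (the field numbers are generic, not specific to your setup) counts how many tasks in the open subproject actually carry data in each of Text1 through Text30, which makes it easier to judge which of the 20+ fields are really needed.

        Sub AuditTextFieldUsage()
            Dim t As Task
            Dim i As Integer
            Dim used As Long
            For i = 1 To 30
                used = 0
                For Each t In ActiveProject.Tasks
                    If Not t Is Nothing Then    ' skip blank task rows
                        If t.GetField(FieldNameToFieldConstant("Text" & i, pjTask)) <> "" Then
                            used = used + 1
                        End If
                    End If
                Next t
                If used > 0 Then Debug.Print "Text" & i & " used on " & used & " tasks"
            Next i
        End Sub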

    I still recommend you bite the bullet and get some qualified professional help, at least to take a live look at your whole file structure. A good Project consultant can also set up training and help establish operational protocol.

    John

    Sunday, July 31, 2011 8:25 PM
  • Thanks John!!

    We only ever open the Master and then perhaps one other file, and that is it. We never open all of the files, as Project will update them anyway. I had never thought about foreground/background.

    Does opening everything in the foreground take more memory? It sounds logical that it would, but again, we don't do that anyway.

    When we have two files open, it is the Master to view overallocations and perhaps move resources from one subproject to another through the Resource Usage view. We do NOT have a Resource Pool; instead we have set up a Group By on resource that shows the resources across subprojects.

    In regard to baselines, when I added Baseline1 to every file, I believe it added about 15 to 20% to the size of the files, if I remember correctly.

    Thanks again for the input and advice!

    Sunday, July 31, 2011 10:21 PM
  • Hotmail1,

    First, you're welcome.

    Yes, each file that is open in the foreground occupies space in working memory. With the added baseline, unfortunately, once an application has allocated space for data, deleting that data won't necessarily free up the space. Computer operating systems just don't work that way. However, there may be some ways you can clear a lot of the "bulk" out of your files and make them leaner.

    Again, a little history. Our master file/subproject linked structure used various methods to keep file size minimized.

    First, even though our program spanned multiple years, the detail of our individual project files only covered a current working schedule window of about 9 months. Activities beyond that were laid out in planning packages. A planning package contains a general description of the expected work and a budgeted cost associated with that package, but it does not contain actual schedule working detail. The detail is not expanded until that planning package comes into the working window. The premise is that attempting to fully plan out the detail of a multi-year project is fruitless. So many things can happen in the near term that will negate future detail plans thus causing unnecessary rework of the plan.

    Second, once an inter-project link becomes historical (i.e. the tasks associated with it are complete), that link can be deleted. It no longer provides any useful information and only adds fodder for possible corruption.
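
    A rough sketch of how to spot candidates (it only reports, deletes nothing, and assumes your cross-project links show up as external tasks in the 2007 style): list the external tasks in the active file that are already 100% complete, then review each one before removing its link.

        Sub ListCompletedExternalTasks()
            Dim t As Task
            For Each t In ActiveProject.Tasks
                If Not t Is Nothing Then
                    ' External tasks that are fully complete are candidates for link removal
                    If t.ExternalTask And t.PercentComplete = 100 Then
                        Debug.Print t.ID & vbTab & t.Name
                    End If
                End If
            Next t
        End Sub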

    Third, as our program progressed, several users complained that their files were more difficult to read because of all the included past history. I agreed and developed a macro that extracted historical data such as cost and automatically restructured the files to replace detailed historical data with a summarized equivalent - basically the planning package reborn. Since we periodically (monthly) saved a snapshot of our whole project, an audit of historical data was always possible.

    The above suggestions will help cut down your file bulk, but they do require effort and probably some new thinking to initiate the change.

    John

    Monday, August 1, 2011 2:18 AM