The Content library cleanup tool: kick off multiple cleanups locally at the same time or not?

  • Question

  • Hi

    When reading the documentation for the content library cleanup tool, the requirements state:

    Only run the tool against a single distribution point at a time.

    I know that you can't run multiple instances of ContentLibraryCleanup.exe on the same system (for example, starting multiple cleanups against remote systems from the same server).

    But does that also mean it's not recommended to run a task locally on all my DPs at the same time to clean up their content libraries? In other words, should the cleanup be done for all DPs sequentially, completing one before running it on the next? Or is it perfectly all right to kick off ContentLibraryCleanup.exe regularly (say, every 90 days) locally on all DPs at the same time?

    Kind Regards, 

    Jakob

    Sunday, October 13, 2019 8:17 PM

Answers

  • meaning you should do the cleanup for all DP's sequentially

    Correct, that's exactly what the statement that you quoted means.

    Why would you kick it off regularly? Orphaned content certainly can and does happen and may build up over time, but it's not hurting anything, and there's a fair amount of overhead in running the tool, so regular scheduled runs aren't recommended either.
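    If you do need to clean several DPs, the "one at a time" rule can still be honored by scripting the runs back to back. A minimal Python sketch, assuming the tool's documented /dp, /q, and /delete switches; the tool path and DP names are placeholders, not values from this thread:

```python
# Sketch: run ContentLibraryCleanup.exe against each DP sequentially,
# waiting for one cleanup to finish before starting the next.
# TOOL path and DP names below are illustrative placeholders.
import subprocess

TOOL = r"C:\Tools\ContentLibraryCleanup.exe"  # assumed install location

def build_cleanup_command(dp_fqdn: str, delete: bool = False) -> list[str]:
    """Build the command line; without /delete the tool runs in what-if mode."""
    cmd = [TOOL, "/q", "/dp", dp_fqdn]
    if delete:
        cmd.append("/delete")
    return cmd

def run_sequentially(dps: list[str], delete: bool = False) -> None:
    for dp in dps:
        # subprocess.run blocks until the tool exits, which enforces
        # "only a single distribution point at a time".
        subprocess.run(build_cleanup_command(dp, delete), check=True)

# Usage (would actually launch the tool, so not run here):
# run_sequentially(["dp1.contoso.com", "dp2.contoso.com"])
```

    Because each `subprocess.run` call blocks, only one DP is ever being cleaned at any moment, which avoids the concurrent DP/database load Jason describes below.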


    Jason | https://home.configmgrftw.com | @jasonsandys

    Sunday, October 13, 2019 10:41 PM

All replies

  • Hi Jason, 

    Thank you for your response. I thought a bit of cleanup now and then (the 90 days was just an example) wouldn't hurt. Running the process off-hours took less than an hour and saved 80 GB of space (18% of the content library). Yes, I know disk space is cheap and shouldn't be an issue, but sometimes it's just easier to automate a process, let it kick off, and extend the lifetime of an investment by 1-2 years, instead of getting a green light for adding more and more disk space (especially if the disk space isn't actually used for anything) on a remote DP far, far away from modern civilization. So kicking off an automated cleanup (at least the what-if scenario) when free disk space drops below a certain level just sounded appealing to me.
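    The free-space trigger described above can be sketched as a small check. This is a hypothetical helper, not part of the tool; the drive letter and 20% threshold are illustrative assumptions:

```python
# Sketch: decide whether to kick off a what-if cleanup based on how much
# free space remains on the content library volume.
# Drive letter and threshold are placeholders, not values from the thread.
import shutil

def below_free_space_threshold(free_bytes: int, total_bytes: int,
                               min_free_fraction: float = 0.20) -> bool:
    """True when less than min_free_fraction of the volume is still free."""
    return free_bytes / total_bytes < min_free_fraction

def should_run_cleanup(drive: str = "D:\\") -> bool:
    # shutil.disk_usage returns (total, used, free) in bytes.
    usage = shutil.disk_usage(drive)
    return below_free_space_threshold(usage.free, usage.total)
```

    A scheduled task could call `should_run_cleanup()` and, only when it returns True, launch ContentLibraryCleanup.exe in what-if mode and alert an admin to review the results.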

    I also made the post because I was hoping for an explanation of the limitation of only running the cleanup for a single DP at a time. Is it due to the overhead?

    Kind Regards,

    Jakob

    Wednesday, October 16, 2019 7:47 AM
  • Yep, I understand the value of automating it. My main thought, though, is that the whole point is cleaning up orphaned content, which can certainly build up over time due to cleanup failures; but having orphaned content is not normal or expected, so running the cleanup on a regular basis just seems wrong to me.

    As for running it concurrently: yes, it's the overhead. The process is intensive on the DP as well as on the database, depending on the amount of content.


    Jason | https://home.configmgrftw.com | @jasonsandys

    Wednesday, October 16, 2019 2:53 PM