Reading the value of the docaclms property in a custom pipeline stage

  • Question

  • Hi,

    We are required to use the ACL of crawled documents for a custom logic step in our company. After viewing the spy.txt file, we found that the attribute containing the SDDL data we need is called docaclms, but the field is listed without its property set GUID, so we can't list it in the pipelineextensibility.xml file for viewing in the custom pipeline stage. Further attempts to see this field using the PowerShell cmdlet Get-FASTSearchMetadataCrawledProperty or the C# QueryCrawledProperties method did not return this property.

    Our question is: how can the property set GUID of this attribute be obtained?

    Is there another property that can provide us with the ACL data?

     

    Thanks

     

    Thursday, November 24, 2011 9:48 AM

Answers

  • Hi,

    There is no way to get this value without modifying files you shouldn't modify :)

    If you edit "C:\FASTSearch\etc\processors\customerextensibility.xml" you can add other internal fields as well to be exposed via the propertyset 11280615-f653-448f-8ed8-2915008789f2. Remember that this is not supported in any way.
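    Once docaclms is exposed this way and listed in pipelineextensibility.xml, a custom stage receives it in the XML input file that FS4SP passes to the configured command. A minimal, illustrative Python sketch for pulling out the SDDL string and splitting its ACEs (the element and attribute names follow the pipeline extensibility input format; the function names are my own, not part of any product API):

```python
import re
import xml.etree.ElementTree as ET

# Internal propertyset from the answer above; exposing it is unsupported.
PROPSET = "11280615-f653-448f-8ed8-2915008789f2"

def read_docaclms(input_path):
    """Return the SDDL string from the docaclms crawled property, or None."""
    root = ET.parse(input_path).getroot()
    for cp in root.iter("CrawledProperty"):
        if (cp.get("propertySet", "").lower() == PROPSET
                and cp.get("propertyName") == "docaclms"):
            return (cp.text or "").strip()
    return None

def aces(sddl):
    """Split the parenthesized ACE strings out of an SDDL string."""
    return re.findall(r"\(([^)]*)\)", sddl)
```

    The custom logic step can then inspect the individual ACEs (access type, rights, trustee SID) however it needs.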

    Regards,
    Mikael Svenson 


    Search Enthusiast - SharePoint MVP/WCF4/ASP.Net4
    http://techmikael.blogspot.com/
    • Marked as answer by ISPDev Thursday, November 24, 2011 10:55 AM
    Thursday, November 24, 2011 10:52 AM

All replies

  • Hi,

    Thanks for your quick reply. I did as you described (in our dev environment), just to gain experience.

    I edited customerextensibility.xml as follows:

                <param name="InternalFieldPropertySet" value="11280615-f653-448f-8ed8-2915008789f2" type="str"/>
                <param name="ExposedInternalFields" value="url;body;data;docaclms" type="str"/>

    I added the following to the Input element in pipelineextensibility.xml:

        <CrawledProperty propertySet="11280615-f653-448f-8ed8-2915008789f2" varType="31" propertyName="docaclms" />

    and ran psctrl reset.

    After crawling, I got the following error per crawled item:

    The Content Plugin received a "Processing Error" response from the backend server for the item. ( ProcessorDeploymentException: For pipeline 'Office14 (webcluster)', creating processor CustomerExtensibility failed: ValueError: 'docaclms' is not a valid internal name

    I know docaclms is a valid name because I can see it in the spy.txt file with the values we need to use.

    What am I missing?


    Thursday, November 24, 2011 1:35 PM
  • Hi,

    It seems to be a delay. If you run "nctrl stop procserver_1" on all procservers and then start them again, it works.

    Regards,
    Mikael Svenson 


    Search Enthusiast - SharePoint MVP/WCF4/ASP.Net4
    http://techmikael.blogspot.com/
    Thursday, November 24, 2011 8:15 PM
  • Hi Mikael

    Is there a way to write a value back to the docaclms field from the pipeline extensibility stage without using the Python AttributeCopy stage?

    Thanks
    Darsh

    Thursday, May 10, 2012 5:47 PM
  • Hi,

    I suspect not.

    Regards,
    Mikael Svenson


    Search Enthusiast - SharePoint MVP/WCF4/ASP.NET4
    http://techmikael.blogspot.com/
    Author of Working with FAST Search Server 2010 for SharePoint

    Thursday, May 10, 2012 5:49 PM
  • Hi Mikael

    Have you, or anyone you know, used a BCS connector to feed the security descriptor using the WindowsSecurityDescriptorField field? I've been referring to all the articles about it and trying this for the last two days, but I haven't been able to get it to work successfully. It's not picking up the ACLs, and it doesn't populate any of the ACL fields in the FS4SP document.

    Does this even work for FAST Search for SharePoint?

    Thanks

    Darsh

    Tuesday, May 15, 2012 7:36 PM
  • Hi,

    Did you try code similar to that posted here: http://stackoverflow.com/questions/2596534/implementing-security-on-custom-bcs-net-class ?

    What have you tried so far? You should also add Spy stages to the pipeline to see what data is sent to it, and which fields the security steps assign it to.
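    As a low-tech complement to the Spy stages: if you already have a pipeline extensibility stage, it can log every crawled property it receives. A rough sketch, assuming the standard pipeline extensibility input format (the log path is only an example):

```python
import sys
import xml.etree.ElementTree as ET

def dump_crawled_properties(input_path, log_path):
    """Append every crawled property in the extensibility input file to a log."""
    root = ET.parse(input_path).getroot()
    with open(log_path, "a") as log:
        for cp in root.iter("CrawledProperty"):
            log.write("%s/%s (vt=%s): %r\n" % (
                cp.get("propertySet"), cp.get("propertyName"),
                cp.get("varType"), (cp.text or "").strip()))

if __name__ == "__main__":
    # FS4SP invokes the configured command with the input and output file
    # paths; the log location below is just an illustrative choice.
    dump_crawled_properties(sys.argv[1], r"C:\FASTSearch\var\log\custom_spy.log")
```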

    Thanks,
    Mikael Svenson


    Search Enthusiast - SharePoint MVP/MCT
    http://techmikael.blogspot.com/
    Author of Working with FAST Search Server 2010 for SharePoint

    Wednesday, May 16, 2012 8:35 AM
  • Hi Mikael


    Yes, I'm following the steps in the same article you've pointed out.

    Neither the Spy stages nor FFDDumper show any of the docacl or docaclms fields populated, so probably something is not correct in the BCS settings.

    But after spending three days on this, I decided to remove the extra details from the type name and just leave it as TypeName="System.Byte[]". This probably did the trick. It's not working for me.

    <TypeDescriptor TypeName="System.Byte[], mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" IsCollection="true" Name="SecurityDescriptor">
    

    Thanks for your response; I appreciate all the knowledge you've shared. Thanks for making our lives easier :)

    Cheers

    Darsh


    Wednesday, May 16, 2012 9:36 PM
  • Hi Darsh,

    Do you mean "it's now working for you"? Or still "not working"? Either way, glad I can sort of help without really doing much ;)

    Thanks,
    Mikael Svenson


    Search Enthusiast - SharePoint MVP/MCT
    http://techmikael.blogspot.com/
    Author of Working with FAST Search Server 2010 for SharePoint

    Thursday, May 17, 2012 7:47 PM
  • Sorry I meant it's working for me.
    Friday, July 20, 2012 1:53 PM
    We are facing the same issue in one of our environments.

    So I just wanted to confirm: was the issue you were facing fixed? We applied the same approach suggested by Mikael, but no luck.

    Please suggest another solution if you have one.

    Friday, November 15, 2013 1:59 PM
  • Hi Mikael,

    We applied the fix you suggested, but we still have the same issue. We are at a very critical phase: one environment is having this issue, while a lower environment works fine without any fix. We have troubleshot a lot, but with no success.

    Please share any other solution for this.

    Friday, November 15, 2013 2:04 PM
    We were facing the same issue ("The Content Plugin received a "Processing Error" response from the backend server for the item. ( ProcessorDeploymentException: For pipeline 'Office14 (webcluster)', creating processor CustomerExtensibility failed: ValueError: 'docaclms' is not a valid internal name"), and a restart of the document processors helped to crawl the documents successfully.


    Vineet Joshi

    Saturday, September 19, 2015 8:56 AM