Regarding multivalued property in SharePoint RSS feed

  • Question

  • Hi,

    I have a multi-lookup field in SharePoint. FAST Search does not treat the field as multivalued and shows all the values as a single text line, so I applied custom extensibility. I followed the following link for custom extensibility.

    Following is my code

      using System;
      using System.IO;
      using System.Linq;
      using System.Reflection;
      using System.Xml.Linq;

      class Program
      {
          public static readonly Guid CrawledCategorySharepoint = new Guid("00130329-0000-0130-c000-000000131346");
          public static readonly String timestamp = DateTime.Now.ToString("yyyyMMddHHmmss.ffff");

          // Write the input file to a location the application has access to write in.
          static void WriteLogFile(string inputFile, string suffix, bool check)
          {
              String pipelineInputData = @"c:\users\" + Environment.UserName; // rest of the path was cut off in the original post
              // Enable/disable debugging in real-time by creating/renaming the log directory
              if (Directory.Exists(pipelineInputData))
              {
                  string outFile = Path.Combine(pipelineInputData, timestamp + "-" +
                                         MethodBase.GetCurrentMethod().DeclaringType.Name +
                                         suffix + ".xml");
                  if (check)
                      File.Copy(inputFile, outFile);          // check == true: inputFile is a file path
                  else
                      File.AppendAllText(outFile, inputFile); // otherwise inputFile is the text itself
              }
          }

          // Handles the basic logging and exception handling
          static int Main(string[] args)
          {
              try
              {
                  DoProcessing(args[0], args[1]);
              }
              catch (Exception e)
              {
                  // This will end up in the crawl log, since exit code != 0
                  Console.WriteLine("Failed: " + e.Message + "/" + e.StackTrace);
                  return 1;
              }
              return 0;
          }

          // Actual processing
          static void DoProcessing(string inputFile, string outputFile)
          {
              WriteLogFile(inputFile, "-input", true);
              char currentSeparator = ',';
              char separator = '\u2029'; // FAST multivalue separator (Unicode paragraph separator)
              XDocument inputDoc = XDocument.Load(inputFile);
              // Fetch the crawled property from the input item
              var res = from cp in inputDoc.Descendants("CrawledProperty")
                        where new Guid(cp.Attribute("propertySet").Value).
                                      Equals(CrawledCategorySharepoint) &&
                            cp.Attribute("propertyName").Value == "ows_kbtags" &&
                            cp.Attribute("varType").Value == "4127"
                        select cp.Value;
              var res1 = from cp in inputDoc.Descendants("CrawledProperty")
                         where new Guid(cp.Attribute("propertySet").Value).
                                       Equals(CrawledCategorySharepoint) &&
                             cp.Attribute("propertyName").Value == "ows_kbtags" &&
                             cp.Attribute("varType").Value == "31"
                         select cp.Value;
              // Create the output item
              XElement outputElement = new XElement("Document");
              WriteLogFile(res.Count().ToString(), "-input1", false);  // shows count 1
              WriteLogFile(res.First(), "-input11", false);            // shows empty value
              WriteLogFile(res1.FirstOrDefault(), "-input112", false); // shows empty value
              // Add a crawled property if the lookup field was present
              if (res.Count() > 0 && res.First().Length > 0)
              {
                  WriteLogFile(res.First(), "-input12", false);
                  var val = res.First().Replace(currentSeparator, separator);
                  WriteLogFile(val, "-input122", false);
                  outputElement.Add(
                      new XElement("CrawledProperty",
                          new XAttribute("propertySet", CrawledCategorySharepoint),
                          new XAttribute("propertyName", "ows_kbtags"),
                          new XAttribute("varType", 4127), val));
              }
              outputElement.Save(outputFile);
              WriteLogFile(outputFile, "-output", true);
          }
      }
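
    The heart of the transformation above is replacing the comma separator with the Unicode paragraph separator (U+2029), which FAST uses to delimit the values of a multivalued property. A minimal standalone sketch of just that step (the tag values are made up for illustration):

```csharp
using System;

class SeparatorDemo
{
    static void Main()
    {
        // FAST treats U+2029 (paragraph separator) as the multivalue delimiter.
        string raw = "Tag1,Tag2,Tag3";               // value as crawled from the lookup field
        string multi = raw.Replace(',', '\u2029');   // now three separate values to FAST

        Console.WriteLine(multi.Split('\u2029').Length); // prints 3
    }
}
```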

    I created a log file to see what happens at every step. What I found is that I am not getting any value in the crawled property: it finds the property, but the value is empty. Why am I not getting any value in the custom processing stage?

    Any idea on this!

    Update on this: it is also not working for other properties. I tried some other fields, and the value is empty for those as well. Is this the normal behavior, or is it specific to my environment?


    • Edited by agarwal Monday, December 26, 2011 2:50 PM Update
    Monday, December 26, 2011 12:56 PM


All replies

  • Make sure that all of the crawled properties referenced in your code are listed in the <Input> section of the pipelineextensibility.xml with the correct propertySet, propertyName/propertyId and varType. Keep in mind that you need to execute the

    psctrl reset

    command for the pipelineextensibility.xml changes to take effect.
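
    For reference, a registration in pipelineextensibility.xml might look like the following sketch. The propertySet GUID and varType values are taken from the code in the question; the executable path is illustrative:

```xml
<PipelineExtensibility>
    <Run command="C:\custom\Multivalued.exe %(input)s %(output)s">
        <Input>
            <CrawledProperty propertySet="00130329-0000-0130-c000-000000131346" varType="31" propertyName="ows_kbtags"/>
            <CrawledProperty propertySet="00130329-0000-0130-c000-000000131346" varType="4127" propertyName="ows_kbtags"/>
        </Input>
        <Output>
            <CrawledProperty propertySet="00130329-0000-0130-c000-000000131346" varType="4127" propertyName="ows_kbtags"/>
        </Output>
    </Run>
</PipelineExtensibility>
```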

    Integrating an External Item Processing Component:

    Monday, December 26, 2011 5:00 PM
  • I have done the reset. In the first step, where I write the input file received as args to my log, I do get the property I mentioned, but the value is empty.

    Normally I do an incremental crawl after psctrl reset, but the log from the input file doesn't show any value. The following are the contents of the input files:

    <?xml version="1.0" encoding="UTF-8"?>
    <Document>
        <CrawledProperty propertySet="00130329-0000-0130-c000-000000131346" varType="31" propertyName="ows_activityfrom"></CrawledProperty>
    </Document>


    <?xml version="1.0" encoding="UTF-8"?>
    <Document>
        <CrawledProperty propertySet="00130329-0000-0130-c000-000000131346" varType="31" propertyName="ows_kbtags"></CrawledProperty>
        <CrawledProperty propertySet="00130329-0000-0130-c000-000000131346" varType="4127" propertyName="ows_kbtags"></CrawledProperty>
    </Document>

    I am not getting any value in the input file, while MSDN shows that the input file format should be as follows:

     <CrawledProperty propertySet='GUID' propertyName='PropertyName' propertyId='PropertyId' varType='PropertyType'>propertyValue</CrawledProperty>


    So I need to know why I am not getting the input file content in the proper format. What am I missing?





    • Edited by agarwal Tuesday, December 27, 2011 5:28 AM Update to the content
    Tuesday, December 27, 2011 5:22 AM
  • How do you know that the items being processed by the incremental crawl actually have those fields populated? Are you adding new list items or updating existing ones before re-running the incremental crawl? Have you tried running a full crawl (you may want to take out some of the verbose logging as the log file would be huge)?
    Tuesday, December 27, 2011 3:50 PM
  • First of all, I am editing an item which has that field and then running the incremental crawl.

    Secondly, I am not doing verbose logging. If you look at my code pasted above, there is a function WriteLogFile() which writes the input and output files that come through pipeline extensibility, and in those I am not getting the result. I will try the full crawl and see what happens.

    Tuesday, December 27, 2011 7:25 PM
  • Hi,

    If the input documents you are logging are empty, then there is no data for those fields. I recommend adding "url" to the input fields in order to more easily figure out which log file corresponds to which item in SharePoint.

    You also might want to enable FFDDumper or add a Spy stage to inspect what the data looks like before your custom code is executed.

    Mikael Svenson 

    Search Enthusiast - SharePoint MVP/WCF4/ASP.Net4
    • Marked as answer by agarwal Thursday, December 29, 2011 5:55 AM
    Tuesday, December 27, 2011 11:06 PM
  • Hi Mikael

    Can you elaborate on how to add the "url" field to the input fields? There is no crawled property with the name "url" in the SharePoint category.

    Moreover, my pipeline extensibility configuration is as follows, just to confirm that it is fine:

    <Run command="E:\FASTSearch\bin\Multivalued.exe %(input)s %(output)s">
        <Input>
            <CrawledProperty propertySet="00130329-0000-0130-c000-000000131346" varType="31" propertyName="ows_kbtags"/>
            <CrawledProperty propertySet="00130329-0000-0130-c000-000000131346" varType="4127" propertyName="ows_kbtags"/>
        </Input>
        <Output>
            <CrawledProperty propertySet="00130329-0000-0130-c000-000000131346" varType="4127" propertyName="ows_kbtags"/>
        </Output>
    </Run>

    I have added FFDDumper, and when I do an incremental crawl it creates multiple folders with files in them; in one of the files I have the property with values in it.


    Wednesday, December 28, 2011 7:26 AM
  • Hi,

    The registration is <CrawledProperty propertySet="11280615-f653-448f-8ed8-2915008789f2" varType="31" propertyName="url"/> and it is described on MSDN.

    If you have files with your data in the FFDDumper log files, then you should have files with the data in the ones you log yourself as well.

    Mikael Svenson 

    Search Enthusiast - SharePoint MVP/WCF4/ASP.Net4
    Wednesday, December 28, 2011 12:35 PM
  • Thanks I got it

    The issue was the name of the field I was adding in the custom extensibility configuration: "ows_kbtags". I thought it was case insensitive, but it is not. Originally the field name was ows_KBTags. That was the mistake I was making.

    Thanks everyone
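
    For anyone hitting the same problem: the registration in pipelineextensibility.xml must use the exact casing of the crawled property name, but the in-code comparison can at least be made case-insensitive so the program does not add a second casing dependency. A sketch of that idea (the XML below is illustrative test data, not actual pipeline output):

```csharp
using System;
using System.Linq;
using System.Xml.Linq;

class CaseInsensitiveMatch
{
    static void Main()
    {
        // The pipeline delivers the name as "ows_KBTags"; a == comparison
        // against "ows_kbtags" would silently match nothing.
        var doc = XDocument.Parse(
            "<Document><CrawledProperty propertyName='ows_KBTags' varType='4127'>a,b</CrawledProperty></Document>");

        var res = from cp in doc.Descendants("CrawledProperty")
                  where string.Equals(cp.Attribute("propertyName").Value, "ows_kbtags",
                                      StringComparison.OrdinalIgnoreCase)
                  select cp.Value;

        Console.WriteLine(res.First()); // prints "a,b"
    }
}
```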

    • Marked as answer by agarwal Thursday, December 29, 2011 5:55 AM
    Thursday, December 29, 2011 5:55 AM
  • Hi,

    I was thinking about this as well, but all the data you posted had lowercase, so I assumed that was what was entering the pipeline.

    Using the spy stage is a good way to check the actual casings, as the crawled props are case sensitive inside the pipeline.


    Search Enthusiast - SharePoint MVP/WCF4/ASP.Net4
    Thursday, December 29, 2011 12:16 PM