In one of my last posts I showed you how to index data from an external source. I used an XML file as the input and a custom crawler, and as a result we got indexed documents in a Solr core. That's a good base to build on, so let's extend it a little bit!
Today we will combine our CustomCrawler with the standard SitecoreItemCrawler, making it possible to index Sitecore items and external sources into one Solr core. Then we will prepare a mechanism to search through all of the indexed data.
Let's start with the input sources. In Sitecore I've prepared the structure that I'm going to put into the index. It is shown in the screenshot below.
I've also reused the XML file from the previous post, adding a Name field to it:
<?xml version="1.0"?>
<Products>
  <Product>
    <Id>1</Id>
    <Name>Product 1</Name>
    <Description>Lorem Ipsum</Description>
  </Product>
  <Product>
    <Id>2</Id>
    <Name>Product 2</Name>
    <Description>Dolor Sit Etem</Description>
  </Product>
  <Product>
    <Id>3</Id>
    <Name>Product 3</Name>
    <Description>Sed do eiusmod tempor</Description>
  </Product>
</Products>
With both input sources in place, it is time to extend our previous index configuration and add a SitecoreItemCrawler there to allow indexing of Sitecore items:
<?xml version="1.0" encoding="utf-8" ?>
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <contentSearch>
      <configuration type="Sitecore.ContentSearch.ContentSearchConfiguration, Sitecore.ContentSearch">
        <indexes hint="list:AddIndex">
          <index id="custom_index" type="Sitecore.ContentSearch.SolrProvider.SolrSearchIndex, Sitecore.ContentSearch.SolrProvider">
            <param desc="name">$(id)</param>
            <param desc="core">$(id)</param>
            <param desc="propertyStore" ref="contentSearch/indexConfigurations/databasePropertyStore" param1="$(id)" />
            <configuration ref="contentSearch/indexConfigurations/defaultSolrIndexConfiguration">
              <indexAllFields>true</indexAllFields>
              <fieldMap ref="contentSearch/indexConfigurations/defaultSolrIndexConfiguration/fieldMap"/>
              <documentOptions type="Sitecore.ContentSearch.SolrProvider.SolrDocumentBuilderOptions, Sitecore.ContentSearch.SolrProvider">
              </documentOptions>
            </configuration>
            <strategies hint="list:AddStrategy">
              <strategy ref="contentSearch/indexConfigurations/indexUpdateStrategies/onPublishEndAsync" />
            </strategies>
            <locations hint="list:AddCrawler">
              <!-- the custom crawler added previously -->
              <crawler type="SitecoreBlog.Search.Crawlers.CustomCrawler, SitecoreBlog.Search">
              </crawler>
              <!-- here we must add the new SitecoreItemCrawler with a database name and root item -->
              <crawler type="Sitecore.ContentSearch.SitecoreItemCrawler, Sitecore.ContentSearch">
                <Database>master</Database>
                <Root>/sitecore/content/Home</Root>
              </crawler>
            </locations>
          </index>
        </indexes>
      </configuration>
    </contentSearch>
  </sitecore>
</configuration>
As you've probably noticed, I added a new crawler there with two parameters that adapt it to our needs. The Root parameter points at the node we want to index.
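As a side note, the same crawler element can point at a different database or a narrower root node. For example, to index only published product content you could crawl the web database instead (the path below is illustrative, not from this solution):

```xml
<crawler type="Sitecore.ContentSearch.SitecoreItemCrawler, Sitecore.ContentSearch">
  <Database>web</Database>
  <Root>/sitecore/content/Home/Products</Root>
</crawler>
```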
After those steps, rebuilding the index should put our data from both Sitecore and the XML file into the index core. You can verify this in the Solr admin panel; here is how it looks in mine.
Indexed product:
Part of an indexed Sitecore item:
The difference now is that the whole content of a Sitecore item is indexed in the _content field. The situation is similar for the Sitecore item name, which is indexed in the _name field. We now have to modify the products indexed from XML to match the indexed Sitecore items: we will index the product name as the _name field, and the description together with the name as _content. To achieve this, we need the Product model which we prepared in the previous post.
using SitecoreBlog.Search.Attributes;

namespace SitecoreBlog.Search.Model
{
    public class Product
    {
        [IndexInfo("productid")]
        public int Id { get; set; }

        [IndexInfo("_name")]
        public string Name { get; set; }

        public string Description { get; set; }

        [IndexInfo("_content")]
        public string Content => string.Format("{0} {1}", Name, Description);
    }
}
After those changes and an index rebuild, the product in Solr should look as follows:
This is its final, proper form. Now we can move on to the search mechanism. To begin, we have to create a DTO that will be returned as the result.
namespace SitecoreBlog.Search.Dto
{
    public class ResultItem
    {
        public string Name { get; set; }

        public string Content { get; set; }
    }
}
With our DTO ready, we can move on to creating the SearchService.
using System.Collections.Generic;
using System.Linq;
using Sitecore.ContentSearch;
using Sitecore.ContentSearch.SearchTypes;
using SitecoreBlog.Search.Dto;

namespace SitecoreBlog.Search.Service
{
    public class SearchService
    {
        public ICollection<ResultItem> Search(string phrase)
        {
            using (var searchContext = ContentSearchManager.GetIndex("custom_index").CreateSearchContext())
            {
                var results = searchContext.GetQueryable<SearchResultItem>()
                    .Where(x => x.Content.Contains(phrase))
                    .Select(x => new ResultItem
                    {
                        Name = x.Name,
                        Content = x.Content
                    })
                    .ToList();

                return results;
            }
        }
    }
}
Our last step is adding a controller to test the functionality we wrote in the service a few minutes ago.
using System.Web.Mvc;
using SitecoreBlog.Search.Service;

namespace SitecoreBlog.Website.Controllers
{
    public class SearchController : Controller
    {
        [HttpGet]
        public ActionResult Search(string q)
        {
            var service = new SearchService();
            var results = service.Search(q);

            return Json(results, JsonRequestBehavior.AllowGet);
        }
    }
}
OK, that's everything we need. Now we can test our search mechanism in the browser by calling the controller method. Let's check it with various phrases.
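For illustration only, assuming default MVC routing and a hypothetical host name, a test request and its response might look roughly like this (the host and the exact JSON are assumptions, not output from this solution):

```
GET http://sitecoreblog.local/Search/Search?q=Lorem

[
  { "Name": "Product 1", "Content": "Product 1 Lorem Ipsum" }
]
```

Note that the Content value follows from the Product model above, which concatenates the name and description before indexing.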
As you can see, the results contain both products from the XML file and items from Sitecore, so our goal is reached. I hope this tutorial will help you save time when implementing a similar solution. This was the second and last part of this topic; if you've missed the first part, you can check it here.
In case of any questions, don't hesitate to ask in the comments.
Thank you for your time, I'll be back soon :)
Stay tuned!