Monday, June 10, 2019

Sitecore JSS - Real Time Personalization

Sitecore JSS is a really powerful framework that brings a lot of advantages to your websites. After a few days of digging into this technology, I fell in love with the headless approach, and I can see a bright future ahead of it. Once I started playing with it, I noticed that only JSON data comes from the server and the whole HTML is built on the client side. Based on that, I set myself a challenge: find a way to use it for real-time personalization, meaning that content changes automatically as soon as a given rule is fulfilled, all without any page reload. Long story short: it is enough to fetch the newest data from the API and replace it in the store. Let’s take a look at what I found out.


Let’s start

In this article I’ll use JSS with Vue.js in connected mode and base my mechanism on RouteHandler.vue from the official JSS GitHub repository. As I am mainly a back-end developer, please forgive any potential gaps in the JS code, and if you find something to improve, please let me know.


Monitoring changes

To know when personalized content has changed, we have to keep the information up to date and refresh it from time to time. For this purpose we need to create a worker that will periodically call the Sitecore API to get the current route data. Also, to avoid unnecessarily refreshing components that haven’t changed, it is good to check whether the old and new object values are deeply equal (just to clarify, == and === will not fly here, because they compare object references). For this purpose I’ll use lodash.isequal.

...
import isEqual from 'lodash.isequal';
import EventBus from '../../Foundation/Events/EventBus';

export default {
  name: 'Route-Handler',
  data() {
    return {
      ...,
      interval: null,
    };
  },
  created() {
    ...
    this.enableWorker();
  },
  beforeDestroy() {
    ...
    this.disableWorker();
  },
  methods: {
    ...,
    updateComponents() {
      getRouteData(this.sitecoreRoutePath, this.language).then((routeData) => {
        if (routeData !== null && routeData.sitecore.route) {
          // check whether the route data returned from the API differs from the current one
          if (!isEqual(this.appState.routeData, routeData.sitecore.route)) {
            // if it does, replace the data in the JSS store
            this.$jss.store.setSitecoreData(routeData);
          }
        }
      });
    },
    // starts the polling worker (skipped in the Experience Editor)
    enableWorker() {
      if (!this.context.pageEditing && !this.interval) {
        this.interval = setInterval(() => this.updateComponents(), 5000);
      }
    },
    // stops the polling worker and clears the interval
    disableWorker() {
      if (!this.context.pageEditing && this.interval) {
        clearInterval(this.interval);
        this.interval = null;
      }
    },
  },
};

Refreshing components

Once we’ve updated the route data object in the JSS store, we can use Vue watchers to decide when a component should be rendered again. Let’s imagine that I would like to re-render my Section component with all of its children whenever any data within the section or its children changes. For this purpose we need to watch the rendering property and make a deep comparison between the old value and the new one. When changes are detected, we trigger the code responsible for re-rendering the Section component.

<template>
  <section :class="backgroundClass" :id="sectionId" v-if="renderComponent">
    <div class="container">
      <placeholder name="jss-section" :rendering="rendering" />
    </div>
  </section>
</template>

<script>
  import { Placeholder } from "@sitecore-jss/sitecore-jss-vue";
  import isEqual from "lodash.isequal";

  export default {
    data() {
      return {
        renderComponent: true
      };
    },
    props: {
      rendering: {
        type: Object
      }
    },
    components: {
      Placeholder
    },
    watch: {
      rendering(val, oldVal) {
        // check whether the new rendering data differs from the previous one
        if (!isEqual(val, oldVal)) {
          this.refresh();
        }
      }
    },
    methods: {
      // re-renders the component by toggling the v-if flag across a tick
      refresh() {
        this.renderComponent = false;
        this.$nextTick(() => {
          this.renderComponent = true;
        });
      }
    }
  };
</script>

The nice thing about this solution is that we don’t render the whole layout again, but only the places that are handled by our watchers and that actually changed. Besides real-time personalization, this solution can be helpful in many other situations, like:

  • dynamically changing content/datasources without reloading the page
  • handling content for signed-in/signed-out users
  • synchronizing page content opened in several tabs/windows

That’s how it works in practice

On our demo page, I’ve created a Section component in which I’ve placed a few smaller components. For one of them I’ve set a personalization rule to use a different datasource once a specific timestamp is reached. You will notice that the component changes automatically without reloading the page. It is good to know that only the components from that one Section were re-rendered, because only their data changed.


A few words at the end

This solution can be really helpful, but we should be aware that traffic on the server will increase, because every active instance will make background requests to get up-to-date data. Before using it on a production site, we should observe the statistics and choose the right interval for our worker, or get rid of the worker entirely and hook into specific events instead; it depends on the situation.

Thank you for reading and I hope you enjoyed it.

Tuesday, April 16, 2019

Updating items in Sitecore indexes

Sometimes, when we work with indexes in Sitecore, it is not enough to use just the out-of-the-box fields that are stored in them. When we want to reduce or avoid database calls while running a search query, we need to store more information in the index. In such situations it is common to use computed fields, which let us add information to the index easily. When we extend the data stored in computed fields, it is almost certain that we’ll use information from other Sitecore items. There is nothing wrong with that, because we want to speed up our search as much as possible, but in such situations we should be aware of potential issues.


What kind of danger can we face?

When a computed field stores information from other Sitecore items, we need to be careful to keep the data in our indexes up to date. Once we change something in a Sitecore item, its index document will be updated as well, based on the index update strategy. In such situations we should also remember to update the index documents that use the changed item in their computed fields. Otherwise we’ll keep stale data in our index, which can cause inconsistent results for the end user.


Example of the issue

The tree structure presented above shows the potential issue. We have these types of items:
- Car store - the location where the store is. It contains various car brands.
- Car brand - an item responsible for keeping data about a brand.
- Car model - an item that contains information about a model and its price.

Let’s imagine that we want to find all stores offering cars within a specified price range. Based on the structure from the example, the price field exists only on car model items. In this situation we could go through all the cars in the index, choose those whose price falls within our range and add the related car stores to the results. It sounds fine, but with a large number of items it can take some time.


To improve this, we can prepare a simple computed field that keeps the price range for each car store item. Our custom computed field will check all cars within a car store and select the minimum and maximum values. Then our search will go only through store items instead of all the cars we have. For this purpose we create the computed field and rebuild the index. For now everything looks good, because the index was rebuilt recently. But notice that once we update a single car price in Sitecore, the index document for that item will be updated with the new value, while the store will still keep the old price range in its index document. To avoid this issue we could create a custom indexing strategy, but that would be a pretty big piece of code. Let’s look at some simple tricks that you can use in a similar situation.
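
For illustration, here is a minimal sketch of such a computed field. The namespace, template name and "Price" field are assumptions for this example, and the field would still need to be registered in the index configuration (not shown):

using System.Linq;
using Sitecore.ContentSearch;
using Sitecore.ContentSearch.ComputedFields;

namespace SitecoreBlog.Search.ComputedFields // hypothetical namespace
{
    // Computes the maximum car price stored under a car store item;
    // a "min price" twin would look the same with Min() instead of Max().
    public class MaxCarPrice : IComputedIndexField
    {
        public string FieldName { get; set; }
        public string ReturnType { get; set; }

        public object ComputeFieldValue(IIndexable indexable)
        {
            var indexableItem = indexable as SitecoreIndexableItem;
            if (indexableItem == null || indexableItem.Item.TemplateName != "Car store") // hypothetical template name
            {
                return null;
            }

            // Assumed structure: car models are descendants of the store item
            // and keep their price in a "Price" field.
            var prices = indexableItem.Item.Axes.GetDescendants()
                .Select(descendant => descendant["Price"])
                .Where(value => !string.IsNullOrEmpty(value))
                .Select(decimal.Parse)
                .ToList();

            return prices.Any() ? (object)prices.Max() : null;
        }
    }
}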


Updating Index Item

To make things simpler, we can handle Sitecore events like item:saved or publish:end and update the index only for the affected items.
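
As an example, here is a minimal sketch of such an item:saved handler. The namespace, template names, index name and the IndexHelper wrapper are all hypothetical, and the handler would still need to be registered via a configuration patch (not shown):

using System;
using System.Linq;
using Sitecore.Data.Items;
using Sitecore.Events;

namespace SitecoreBlog.Search.Events // hypothetical namespace
{
    public class UpdateRelatedIndexDocuments
    {
        // Wired up to the item:saved event via a configuration patch (not shown here).
        public void OnItemSaved(object sender, EventArgs args)
        {
            var item = Event.ExtractParameter(args, 0) as Item;
            if (item == null || item.TemplateName != "Car model") // hypothetical template name
            {
                return;
            }

            // Walk up the tree to find the car store that owns this car model.
            var carStore = item.Axes.GetAncestors()
                .FirstOrDefault(ancestor => ancestor.TemplateName == "Car store");

            if (carStore != null)
            {
                // Hand the store over to the Update helper from the snippet below
                // (wrapped here in a hypothetical IndexHelper class and index name).
                IndexHelper.Update("sitecore_master_index", new[] { carStore });
            }
        }
    }
}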

Inside the event handler, we should refresh the index documents related to the changed item. After finding all the required items, we can update them using one of the snippets below:

public static void Update(string indexName, ICollection<Item> items)
{
    var index = ContentSearchManager.GetIndex(indexName);
    foreach (var item in items)
    {
        var uniqueId = new SitecoreItemUniqueId(item.Uri);
        IndexCustodian.UpdateItem(index, uniqueId);
    }
}

The code presented above will update the index documents for the items passed in the parameters. It works asynchronously, because IndexCustodian creates the update as a Sitecore job and puts it in the job queue. I would recommend this over the synchronous way: Sitecore event handlers themselves run synchronously, so with a large number of items the synchronous approach can freeze the instance for a while.

But if you still want to do it synchronously, the example below shows how to achieve it without IndexCustodian:

public static void Update(string indexName, ICollection<Item> items)
{
    var index = ContentSearchManager.GetIndex(indexName);
    foreach (var item in items)
    {
        var uniqueId = new SitecoreItemUniqueId(item.Uri);
        index.Update(uniqueId);
    }
}


Refreshing Index Item

Besides updating an index item, we can also refresh it. The main difference is that the refresh process updates the index documents for the given item and all of its descendants. Please take a look at the code below:

public static void Refresh(string indexName, Item item)
{
    var index = ContentSearchManager.GetIndex(indexName);
    var indexableItem = (SitecoreIndexableItem) item;
    IndexCustodian.Refresh(index, indexableItem);
}

In the code above we used IndexCustodian, just like in the update example, so this process is handled asynchronously.

If we want to do it synchronously, take a look at the following example:

public static void Refresh(string indexName, Item item)
{
    var index = ContentSearchManager.GetIndex(indexName);
    var indexableItem = (SitecoreIndexableItem)item;
    index.Refresh(indexableItem);
}


Conclusion

Computed fields are really helpful when customising indexes. They let us store additional information to speed up search and other features like faceting, pagination and so on. It is important to use them wisely: remember all the relations to other items and keep the index documents up to date. We could rebuild the whole index for this, but that costs too much time and effort for single-item changes. The better solution is updating or refreshing individual index items, as long as we are aware of the differences between those two processes and use each in the right situation.

Thank you for reading this post, I hope it’ll help some of you.

Tuesday, September 26, 2017

Indexing from external sources - Part 2

In one of my last posts I showed you how to index data from an external source. I used an XML file as the input there, and with a custom crawler we ended up with indexed documents in a Solr core. That is a good base to build on, so let's extend it a little bit!

Today we will combine our CustomCrawler with the standard SitecoreItemCrawler to make it possible to index Sitecore items and external sources into one Solr core. Then we will prepare a mechanism to search through all the indexed data.

Let's start with the input sources. In Sitecore I've prepared a structure which I am going to put in the index; it is presented in the screenshot below.

I've also reused the XML file from the previous post and added a Name field to it:
<?xml version="1.0"?>
<Products>
  <Product>
      <Id>1</Id>
      <Name>Product 1</Name>
      <Description>Lorem Ipsum</Description>
  </Product>
  <Product>
      <Id>2</Id>
      <Name>Product 2</Name>
      <Description>Dolor Sit Etem</Description>
  </Product>
  <Product>
      <Id>3</Id>
      <Name>Product 3</Name>
      <Description>Sed do eiusmod tempor</Description>
  </Product>
</Products>

Now that we have both input sources, it is time to extend our previous index configuration and add a SitecoreItemCrawler to allow indexing Sitecore items:
<?xml version="1.0" encoding="utf-8" ?>
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <contentSearch>
      <configuration type="Sitecore.ContentSearch.ContentSearchConfiguration, Sitecore.ContentSearch">
        <indexes hint="list:AddIndex">
          <index id="custom_index" type="Sitecore.ContentSearch.SolrProvider.SolrSearchIndex, Sitecore.ContentSearch.SolrProvider">
            <param desc="name">$(id)</param>
            <param desc="core">$(id)</param>
            <param desc="propertyStore" ref="contentSearch/indexConfigurations/databasePropertyStore" param1="$(id)" />
            <configuration ref="contentSearch/indexConfigurations/defaultSolrIndexConfiguration">
              <indexAllFields>true</indexAllFields>
              <fieldMap ref="contentSearch/indexConfigurations/defaultSolrIndexConfiguration/fieldMap"/>
              <documentOptions type="Sitecore.ContentSearch.SolrProvider.SolrDocumentBuilderOptions, Sitecore.ContentSearch.SolrProvider">
              </documentOptions>
            </configuration>
            <strategies hint="list:AddStrategy">
              <strategy ref="contentSearch/indexConfigurations/indexUpdateStrategies/onPublishEndAsync" />
            </strategies>
            <locations hint="list:AddCrawler">
              <!--already added custom crawler-->
              <crawler type="SitecoreBlog.Search.Crawlers.CustomCrawler, SitecoreBlog.Search">
              </crawler>
              <!--here we must add the new SitecoreItemCrawler with the db name and root item-->
              <crawler type="Sitecore.ContentSearch.SitecoreItemCrawler, Sitecore.ContentSearch">
                <Database>master</Database>
                <Root>/sitecore/content/Home</Root>
              </crawler>
            </locations>
          </index>
        </indexes>
      </configuration>
    </contentSearch>
  </sitecore>
</configuration>
As you probably noticed, I've added a new crawler with two parameters that fit it to our needs. In the Root parameter I've pointed at the node we want to index.

After those steps, rebuilding the index should put our data from both Sitecore and the XML file into the index core. You can verify it in the Solr panel. Here is how it looks in mine.
Indexed product:
Part of an indexed Sitecore item:


The difference now is that the whole content of a Sitecore item is indexed in the _content field, and similarly the Sitecore item name is indexed in the _name field. Now we have to modify the indexed products from the XML so that they match the indexed Sitecore items: we will index the product name as _name and the name together with the description as _content. To achieve this, we adjust the Product model which we prepared in the previous post.

using SitecoreBlog.Search.Attributes;

namespace SitecoreBlog.Search.Model
{
    public class Product
    {
        [IndexInfo("productid")]
        public int Id { get; set; }

        [IndexInfo("_name")]
        public string Name { get; set; }

        public string Description { get; set; }

        [IndexInfo("_content")]
        public string Content => string.Format("{0} {1}", Name, Description);
    }
}

After those changes and an index rebuild, a product in Solr should look as follows:
This is its final, proper form. Now we can move on to the search mechanism. First we have to create a DTO which will be returned as the result.

namespace SitecoreBlog.Search.Dto
{
    public class ResultItem
    {
        public string Name { get; set; }
        public string Content { get; set; }
    }
}

With the DTO in place, we can move on to creating the SearchService.
using System.Collections.Generic;
using System.Linq;
using Sitecore.ContentSearch;
using Sitecore.ContentSearch.SearchTypes;
using SitecoreBlog.Search.Dto;

namespace SitecoreBlog.Search.Service
{
    public class SearchService
    {
        public ICollection<ResultItem> Search(string phrase)
        {
            using (var searchContext = ContentSearchManager.GetIndex("custom_index").CreateSearchContext())
            {
                var results = searchContext.GetQueryable<SearchResultItem>().Where(x => x.Content.Contains(phrase))
                    .Select(x => new ResultItem()
                    {
                        Name = x.Name,
                        Content = x.Content
                    }).ToList();

                return results;
            }
            
        }
    }
}

Our last step is adding a controller to test the functionality we just wrote in the service.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Mvc;
using SitecoreBlog.Search.Service;

namespace SitecoreBlog.Website.Controllers
{
    public class SearchController : Controller
    {
        [HttpGet]
        public ActionResult Search(string q)
        {
            var service = new SearchService();
            var results = service.Search(q);

            return Json(results, JsonRequestBehavior.AllowGet);
        }
    }
}

OK, that's all we need. Now we can test our search mechanism in the browser by calling the controller method. Let's check it using various phrases.


As you can see, the results contain both products from the XML and items from Sitecore, so our goal is reached. I hope this tutorial was useful and will save you some time when implementing a similar solution. This was the second and last part of the topic; if you've missed the first part, you can check it here.

In case of any questions, don't hesitate to ask in the comments.

Thank you for your time, I'll be back soon :)
Stay tuned!

Tuesday, September 19, 2017

Template inheritors

Hi there!

Today I am going to show you how to quickly get the inheritors of a specific template. Let's imagine you have a base template, for instance _ProductBase, and some templates which inherit from it, let's say ProductA, ProductB, ProductC. We know that all future product templates
will also have _ProductBase among their base templates. Whenever you need to get all product types available in the project (and that's really probable :)), this solution will fit the need very well.

Let's look at the code. I've prepared it as an extension to TemplateItem.

public static class TemplateExtensions
{
    /// <summary>
    /// Gets template inheritors
    /// </summary>
    /// <param name="template"></param>
    /// <returns></returns>
    public static ICollection<string> GetTemplateInheritors(this TemplateItem template)
    {
        if (template != null)
        {
            var inheritors = Globals.LinkDatabase.GetReferrers(template)
                .Where(l => l.GetSourceItem().Paths.FullPath.StartsWith("/sitecore/templates/")
                    && !l.GetSourceItem().Paths.FullPath.StartsWith("/sitecore/templates/system"))
                .Select(l => l.SourceItemID.ToString())
                .ToList();

            return inheritors;
        }

        return new List<string>();
    }
}

It is important to take only items under the templates root and to exclude the standard Sitecore templates under /sitecore/templates/system, because we don't need those here.
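
For completeness, here is a minimal usage sketch; the database name and template path are hypothetical:

// Hypothetical usage: resolve the base template and list the IDs of the templates inheriting from it.
var database = Sitecore.Configuration.Factory.GetDatabase("master");
var baseTemplate = new TemplateItem(database.GetItem("/sitecore/templates/Project/Products/_ProductBase"));

ICollection<string> productTemplateIds = baseTemplate.GetTemplateInheritors();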

I hope you enjoyed this short post. Thank you for your time and stay tuned!


Tuesday, September 12, 2017

Indexing from external sources - Part 1

Introduction

I am sure that most of you have used indexes in Sitecore and configured them yourselves. It may be problematic the first time, but every next time it gets easier, and eventually you do it automatically, like a robot!
Recently I faced a task where I had to prepare a search mechanism (again!), so my first thought was that I just need to index the content and prepare a search service - piece of cake! But after reading the acceptance criteria, I noticed that I had to search not only the Sitecore content, but also a huge XML file with items provided by a 3rd-party service. I realised this would be something new, so I started looking for the best solution.
I was aware that I had to create a new crawler, but I didn't know how to do it. This article was very helpful for me - many thanks to the author! (If you read this - I owe you a beer! :D )

Input source

Let's say that we have an XML file which we want to index, and it looks like this:
<?xml version="1.0"?>
<Products>
  <Product>
      <Id>1</Id>
      <Description>Lorem Ipsum</Description>
  </Product>
  <Product>
      <Id>2</Id>
      <Description>Dolor Sit Etem</Description>
  </Product>
  <Product>
      <Id>3</Id>
      <Description>Sed do eiusmod tempor</Description>
  </Product>
</Products>

Custom Crawler Configuration

To be able to index data from outside of Sitecore, we must create a custom crawler. Let's start by adding it to our index configuration. For demo purposes I've created a new index configuration.
<?xml version="1.0" encoding="utf-8" ?>
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <contentSearch>
      <configuration type="Sitecore.ContentSearch.ContentSearchConfiguration, Sitecore.ContentSearch">
        <indexes hint="list:AddIndex">
          <index id="custom_index" type="Sitecore.ContentSearch.SolrProvider.SolrSearchIndex, Sitecore.ContentSearch.SolrProvider">
            <param desc="name">$(id)</param>
            <param desc="core">$(id)</param>
            <param desc="propertyStore" ref="contentSearch/indexConfigurations/databasePropertyStore" param1="$(id)" />
            <configuration ref="contentSearch/indexConfigurations/defaultSolrIndexConfiguration">
              <indexAllFields>true</indexAllFields>
              <fieldMap ref="contentSearch/indexConfigurations/defaultSolrIndexConfiguration/fieldMap"/>
              <documentOptions type="Sitecore.ContentSearch.SolrProvider.SolrDocumentBuilderOptions, Sitecore.ContentSearch.SolrProvider">
              </documentOptions>
            </configuration>
            <strategies hint="list:AddStrategy">
              <strategy ref="contentSearch/indexConfigurations/indexUpdateStrategies/onPublishEndAsync" />
            </strategies>
            <locations hint="list:AddCrawler">
              <!--here we have to add our custom crawler-->
              <crawler type="SitecoreBlog.Search.Crawlers.CustomCrawler, SitecoreBlog.Search">
              </crawler>
            </locations>
          </index>
        </indexes>
      </configuration>
    </contentSearch>
  </sitecore>
</configuration>
As you can see, I've added the crawler to our configuration, so now it is time to add the implementation.

Custom Crawler Implementation

using System.Collections.Generic;
using Sitecore.ContentSearch;
using SitecoreBlog.Search.Model;

namespace SitecoreBlog.Search.Crawlers
{
    public class CustomCrawler : FlatDataCrawler<IndexableProduct>
    {
        protected override IndexableProduct GetIndexableAndCheckDeletes(IIndexableUniqueId indexableUniqueId)
        {
            return null;
        }

        protected override IndexableProduct GetIndexable(IIndexableUniqueId indexableUniqueId)
        {
            return null;
        }

        protected override bool IndexUpdateNeedDelete(IndexableProduct indexable)
        {
            return false;
        }

        protected override IEnumerable<IIndexableUniqueId> GetIndexablesToUpdateOnDelete(IIndexableUniqueId indexableUniqueId)
        {
            return null;
        }

        protected override IEnumerable<IndexableProduct> GetItemsToIndex()
        {
            var list = new List<IndexableProduct>
            {
                new IndexableProduct(new Product
                {
                    Description = "lorem ipsum"
                })
            };

            return list;
        }
    }
}
To achieve our goal, our crawler has to inherit from FlatDataCrawler with the indexable item type as the generic parameter - in our case IndexableProduct. In the GetItemsToIndex method we have to return the collection of items we want to index, so it is the perfect place to return the elements from the provided XML (see the sketch below).
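
A minimal sketch of such a GetItemsToIndex override, assuming the XML feed is stored on disk (the file location is hypothetical):

// Additional usings needed: System.Linq, System.Web.Hosting, System.Xml.Linq
protected override IEnumerable<IndexableProduct> GetItemsToIndex()
{
    // Hypothetical location of the feed; in a real setup it could come from configuration.
    var path = HostingEnvironment.MapPath("~/App_Data/products.xml");
    var document = XDocument.Load(path);

    return document.Root
        .Elements("Product")
        .Select(product => new IndexableProduct(new Product
        {
            Id = (int)product.Element("Id"),
            Description = (string)product.Element("Description")
        }))
        .ToList();
}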

Indexable Product

Here we collect the properties from the Product model and check whether they have the IndexInfo attribute.
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;
using System.Reflection;
using Sitecore.ContentSearch;
using SitecoreBlog.Search.Attributes;
using SitecoreBlog.Search.Model;

namespace SitecoreBlog.Search.Crawlers
{
    public class IndexableProduct : IIndexable
    {
        private readonly Product _product;

        public IndexableProduct(Product product)
        {
            _product = product;
        }

        public void LoadAllFields()
        {
            Fields = _product.GetType()
                .GetProperties()
                .Where(fi => fi.GetCustomAttribute<IndexInfo>() != null)
                .Select(fi => new IndexableProductDataField(_product, fi));
        }

        public IIndexableDataField GetFieldById(object fieldId)
        {
            return Fields.FirstOrDefault(f => f.Id.Equals(fieldId));
        }

        public IIndexableDataField GetFieldByName(string fieldName)
        {
            return Fields.FirstOrDefault(f => f.Name.Equals(fieldName));
        }

        public IIndexableId Id => new IndexableId<string>(Guid.NewGuid().ToString());

        public IIndexableUniqueId UniqueId => new IndexableUniqueId<IIndexableId>(Id);

        public string DataSource => "Product";

        public string AbsolutePath => "/";

        public CultureInfo Culture => new CultureInfo("en");

        public IEnumerable<IIndexableDataField> Fields { get; private set; }
    }
}

Indexable Product Data Field

Here the properties are prepared for indexing. Each property of the Product model gets its name from the IndexInfo attribute, which is used as the field name of the indexable data field. It means that the field in the indexed document will be named after the attribute value in the model.
using System;
using System.Reflection;
using Sitecore.ContentSearch;
using SitecoreBlog.Search.Attributes;
using SitecoreBlog.Search.Model;

namespace SitecoreBlog.Search.Crawlers
{
    public class IndexableProductDataField : IIndexableDataField
    {
        private readonly Product _product;
        private readonly PropertyInfo _fieldInfo;

        public IndexableProductDataField(Product concreteObject, PropertyInfo fieldInfo)
        {
            _product = concreteObject;
            _fieldInfo = fieldInfo;
        }

        public Type FieldType => _fieldInfo.PropertyType;

        public object Id => _fieldInfo.Name.ToLower();

        public string Name
        {
            get
            {
                var info = _fieldInfo.GetCustomAttribute<IndexInfo>();
                return info.Name;
            }
        }

        public string TypeKey => string.Empty;

        public object Value => _fieldInfo.GetValue(_product);
    }
}

Index Info Attribute

This attribute will help us determine what the property's name will look like within the index.

using System;

namespace SitecoreBlog.Search.Attributes
{
    [AttributeUsage(AttributeTargets.Property)]
    public class IndexInfo : Attribute
    {
        public string Name { get; private set; }

        public IndexInfo(string name)
        {
            Name = name;
        }
    }
}
Using this attribute on our model properties lets us add those properties to the index and define their names. If a property lacks the attribute, it will not be indexed. Let's see the attribute used in the model.

Product Model

In this step we define the model which will be used to prepare and store documents ready for indexing.
using SitecoreBlog.Search.Attributes;

namespace SitecoreBlog.Search.Model
{
    public class Product
    {
        [IndexInfo("productid")]
        public int Id { get; set; }

        [IndexInfo("description")]
        public string Description { get; set; }
    }
}
As you can see, we have used the attribute presented before. As a result, in our Solr core we will have documents with two fields: "productid" and "description".

Results

When all the steps presented above are done, we need to rebuild the index, and our XML file should then be indexed in the Solr core. As proof, I am attaching a screenshot from my Solr panel.



In the next post I am going to show you how to index data from several sources into one core and configure a search mechanism to work with all this data.

Stay tuned! :)




Monday, February 6, 2017

Switching to Solr - trick with Global.asax

Hey ya!

On the web there are a lot of tutorials which can help you switch indexes from Lucene to Solr. Most of them have a part where we need to edit the Global.asax file and change the inheritance to use an IoC container. Here is the snippet from the Sitecore documentation that I am talking about. That is totally OK, but we have to keep in mind that the Application_Start method from Global.asax.cs will no longer be triggered, so if we want to register custom routes in Global.asax.cs, for instance, we can't do it anymore - we would have to do it through pipelines etc.

Workaround

There is a smart workaround which allows us to use Global.asax.cs in the traditional way. Instead of changing the inheritance within Global.asax, we can use the approach presented below:

    public class MvcApplication : System.Web.HttpApplication
    {
        protected void Application_Start()
        {
            AreaRegistration.RegisterAllAreas();
            RouteConfig.RegisterRoutes(RouteTable.Routes);

            var container = new WindsorContainer();
            
            try
            {

                // this line is crucial: it initializes the IoC container for Solr
                new Sitecore.ContentSearch.SolrProvider.CastleWindsorIntegration.WindsorSolrStartUp(container).Initialize();
            }
            catch (Exception ex)
            {
                Sitecore.Diagnostics.Log.Error("Solr init error", ex, this);
            }
        }
    }

In this code we've initialized the Castle Windsor IoC components for Solr, and now it should work correctly.

It is worth mentioning that we can do the same for other IoC containers as well; we just have to use the proper DLL. More details about the remaining components are in the table.

See you soon!

Friday, February 3, 2017

Switching to Solr - Watch out!

I can bet that when switching from Lucene to Solr indexes you are using scripts to disable the Lucene configs and enable the Solr configs.

Am I right?

If so, please be aware that since Sitecore 8.2 there is an additional Solr-related config file called "Sitecore.ContentSearch.SolrCloud.SwitchOnRebuild.config", and your scripts can accidentally enable it on your environment, which is unnecessary unless you use SolrCloud.

While rebuilding my primary indexes, I was facing HTTP ERROR 404 issues:

  • Problem accessing /solr/sitecore_master_index_rebuild/update
  • Problem accessing /solr/sitecore_web_index_rebuild/update
  • Problem accessing /solr/sitecore_core_index_rebuild/update

The first strange thing I saw was the awkward index names. I had never seen "rebuild" in index names before, so it looked weird to me...

Solution

If you are facing a similar issue, please try disabling Sitecore.ContentSearch.SolrCloud.SwitchOnRebuild.config. It helped me and I hope it will help you too!