When I say improving, I actually mean replacing...

I wanted to add search to my site. My initial idea was to use the functionality available through Contentful's Content Delivery API, but with the abstracted approach I have taken to my Content Models, the search would have become very complex.

I then decided to change my approach and use a third-party search, such as Solr or Elasticsearch (which I have a lot of experience with). As I didn't want to spend any money, and Microsoft Azure Cognitive Search has a free tier (and looked interesting and easy to implement), I opted for that. There is also the benefit of upskilling with Azure Cognitive Search, so I can consider it for future commercial work.

The benefit of this approach is that I control what is indexed. In my case, I decided to call my Content Service, which consumes content from the Contentful Content Delivery API and creates ViewModels. This allowed me to index the full page, with all its child content, as a single record, making querying of the index very simple and very quick.
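A minimal sketch of what that indexing step might look like, assuming a hypothetical IContentService that returns page ViewModels (the service and property names here are illustrative, not the site's actual code):

```csharp
using System.Linq;
using System.Threading.Tasks;
using Azure.Search.Documents;
using TDB.Core.Entities;

public class SearchIndexer
{
    private readonly SearchClient _searchClient;
    private readonly IContentService _contentService; // hypothetical service returning page ViewModels

    public SearchIndexer(SearchClient searchClient, IContentService contentService)
    {
        _searchClient = searchClient;
        _contentService = contentService;
    }

    public async Task IndexAllPagesAsync()
    {
        var pages = await _contentService.GetAllPagesAsync();

        // Flatten each full page, including its child content, into one search record
        var entities = pages.Select(p => new SearchItemEntity
        {
            Id = p.Id,
            Title = p.Title,
            Slug = p.Slug,
            Content = p.ChildContent.ToArray(), // child blocks collapsed into the parent record
            Type = p.ContentType,
            DatePublished = p.DatePublished
        });

        // Upsert so re-indexing the same pages is idempotent
        await _searchClient.MergeOrUploadDocumentsAsync(entities);
    }
}
```

Upserting with MergeOrUploadDocumentsAsync means the indexer can be re-run safely whenever content changes.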

The diagram below shows the approach I took when implementing this, using the mediator design pattern and the MediatR package.

[Diagram: Azure Search]
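As a rough sketch of how the MediatR wiring might look, assuming a hypothetical SearchQuery request and a handler that wraps the SearchClient (the names are illustrative):

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Azure.Search.Documents;
using MediatR;
using TDB.Core.Entities;

// The request carries the user's search term through the mediator
public record SearchQuery(string Term) : IRequest<SearchItemEntity[]>;

// The handler is the only place that talks to Azure Cognitive Search
public class SearchQueryHandler : IRequestHandler<SearchQuery, SearchItemEntity[]>
{
    private readonly SearchClient _searchClient;

    public SearchQueryHandler(SearchClient searchClient) => _searchClient = searchClient;

    public async Task<SearchItemEntity[]> Handle(SearchQuery request, CancellationToken cancellationToken)
    {
        var response = await _searchClient.SearchAsync<SearchItemEntity>(
            request.Term, cancellationToken: cancellationToken);

        var results = new List<SearchItemEntity>();
        await foreach (var result in response.Value.GetResultsAsync())
        {
            results.Add(result.Document);
        }
        return results.ToArray();
    }
}
```

Keeping the SearchClient behind a handler means controllers only ever send a SearchQuery, which makes the search easy to swap out or test.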

I like that the index configuration is controlled by decorating properties with attributes, as shown:

using System;
using Azure.Search.Documents.Indexes;

namespace TDB.Core.Entities
{
    public class SearchItemEntity
    {
        [SimpleField(IsKey = true)]
        public string Id { get; set; }

        [SearchableField]
        public string Title { get; set; }

        [SimpleField]
        public string Slug { get; set; }

        [SearchableField(IsFilterable = true, IsFacetable = true)]
        public string[] Categories { get; set; } = { };

        [SearchableField(IsFilterable = true, IsFacetable = true)]
        public string[] Tags { get; set; } = { };

        [SearchableField]
        public string[] Content { get; set; } = { };

        [SimpleField(IsFilterable = true, IsSortable = true)]
        public string Type { get; set; }

        [SimpleField]
        public string ThumbnailUrl { get; set; }

        [SearchableField]
        public string Summary { get; set; }

        [SimpleField(IsSortable = true)]
        public DateTimeOffset DatePublished { get; set; }
    }
}

This allows a new index to be created per environment without needing to log in to the Azure Portal. I have found this very useful: should I need to, I can delete an index and the code will recreate it, with no risk of human error.
var indexes = _searchIndexClient.GetIndexNames().ToList();

if (!indexes.Contains(_options.IndexName))
{
    _logger.LogDebug("Creating index {IndexName}", _options.IndexName);
    CreateIndex();
    _logger.LogDebug("Created index {IndexName}", _options.IndexName);
}

private void CreateIndex()
{
    _searchIndexClient.CreateIndex(new SearchIndex(_options.IndexName)
    {
        Fields = new FieldBuilder().Build(typeof(SearchItemEntity)),
        Suggesters = {new SearchSuggester("Main", nameof(SearchItemEntity.Title))},
        ScoringProfiles =
        {
            new ScoringProfile("Main")
            {
                Functions =
                {
                    new TagScoringFunction(nameof(SearchItemEntity.Type), 4, new TagScoringParameters("high")),
                    new TagScoringFunction(nameof(SearchItemEntity.Type), 2, new TagScoringParameters("medium"))
                }
            }
        },
        DefaultScoringProfile = "Main"
    });
}

I wanted to make sure my blog posts appeared before my photos in the search results, so, as you can see in the code above, I added a scoring profile (ignore the magic strings; I added them to the example code to make it more readable). The profile can then be used when querying to tag content types as either high or medium priority.
var results = await _searchClient.SearchAsync<SearchItemEntity>(query, new SearchOptions
{
    ScoringParameters = { "high-blogPost", "medium-project" },
    /*...*/
    OrderBy = { "search.score() desc", $"{nameof(SearchItemEntity.DatePublished)} desc"}
});

You can see the results on the live search page, which performs very well. As you will see from the code, I have already planned to add faceting and autocomplete to the search. I will add these when I get the chance.
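The "Main" suggester defined during index creation should make the autocomplete straightforward. A sketch of what that might look like, assuming the SearchClient is injected as elsewhere (the wrapper class is illustrative):

```csharp
using System.Linq;
using System.Threading.Tasks;
using Azure.Search.Documents;
using Azure.Search.Documents.Models;
using TDB.Core.Entities;

public class SuggestionService
{
    private readonly SearchClient _searchClient;

    public SuggestionService(SearchClient searchClient) => _searchClient = searchClient;

    public async Task<string[]> SuggestAsync(string partialTerm)
    {
        // Uses the "Main" suggester created alongside the index,
        // which suggests against the Title field
        var response = await _searchClient.SuggestAsync<SearchItemEntity>(
            partialTerm,
            "Main",
            new SuggestOptions { Size = 5, UseFuzzyMatching = true });

        return response.Value.Results.Select(r => r.Text).ToArray();
    }
}
```

Faceting should be similarly simple, since Categories and Tags are already marked IsFacetable on the entity.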

Andy Blyth PgDip BSc(Hons) FdSc

Andy Blyth is a technical architect/senior C# developer, studies martial arts and attempts to write blog posts (when he remembers). He currently works as an Optimizely (Episerver) Technical Architect at the Dept in Manchester, UK.