Sitecore Xperiences - The things I've seen as a Sitecore Developer

ERROR [Content Testing]: Cannot find PhantomJS executable at ' (...) /data/tools/phantomjs/phantomjs.exe'. Aborting screenshot generation.

In Sitecore 8 and above, while starting a Multivariate Test, you may see broken thumbnails and popups with warnings such as the ones below, with your Sitecore log showing the error in this post's title.

Error Running Test - PhantomJS - 1
Error Running Test - PhantomJS - 2
According to what the Sitecore Development Team announced in this community post, explaining why Sitecore adopted PhantomJS:

Content Testing in Sitecore uses the Phantom JS tool for generation of the screenshot image files. In case you didn’t know, Sitecore has had screenshot generation features for quite some time. We use it to generate icons for items that you’ll be listing in the UI like renderings. These icon generation features are based on the System.Windows.Forms.WebBrowser control built into .net. So why did we not use the existing screenshot features for Content Testing? In early testing we found discrepancies with the WebBrowser control. WebBrowser uses Internet Explorer installed on the server where Sitecore is running. But the specific version that it uses isn’t always the latest. There are registry updates one can make to force the appropriate IE version, but this seemed like a big ask of users. This was one of the reasons we chose to use Phantom JS instead.

Cause and Solution

The error in question will show the full path where Sitecore is looking for PhantomJS. This path is defined in Sitecore.ContentTesting.config, in the setting “ContentTesting.PhantomJS.ExecutablePath”. Start your investigation by checking where that setting currently points.

By default this setting points to “$(dataFolder)/tools/phantomjs/phantomjs.exe”. If that is your case, check whether your Data folder actually contains the PhantomJS executable at that path. If the file is missing, copy it from a fresh Sitecore installation (make sure to use the very same revision your site is running).
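If you need the executable to live somewhere else, the setting can be overridden with a standard include patch. A minimal sketch (the file placement and the path value are illustrative; the setting name is the one mentioned above):

```xml
<!-- Illustrative patch file, e.g. /App_Config/Include/zPhantomJsPath.config -->
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <settings>
      <setting name="ContentTesting.PhantomJS.ExecutablePath">
        <patch:attribute name="value">$(dataFolder)/tools/phantomjs/phantomjs.exe</patch:attribute>
      </setting>
    </settings>
  </sitecore>
</configuration>
```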

See also: PhantomJS and security hardening

Posted in Uncategorized

Workflow not starting (state field is empty) when Versions.AddVersion after Sitecore upgrade from 6.5 to 8.1

UPDATE: A ticket at the Sitecore Helpdesk showed the issue was actually something else. During the upgrade, the site definition we were using had the attribute “enableWorkflow” removed, which in turn made the default value “false” take over, causing ALL WORKFLOWS to be simply ignored.

If your site does have enableWorkflow=”true”, then you don’t need to manually start workflows as shown in this article.
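For reference, the attribute lives on the site definition and can be restored with a patch. A minimal sketch (the site name is illustrative; only the attribute being set matters here):

```xml
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <sites>
      <!-- "website" is an example site name; adjust to your own site definition -->
      <site name="website">
        <patch:attribute name="enableWorkflow">true</patch:attribute>
      </site>
    </sites>
  </sitecore>
</configuration>
```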

After an upgrade, a certain piece of code simply stopped working as expected. When you create a new version of an item (item.Versions.AddVersion) whose _Standard Values has a Default Workflow set, the workflow simply does not start, and the “State” field is left empty. The very same code works perfectly on the non-upgraded version.

Looking at the code, it never touched workflows, suggesting that Sitecore was handling that automatically. Since it was an upgrade, I lost myself investigating things I might have left behind: pipelines, scheduled tasks, etc.

And then I discovered it was Sitecore that changed how the API behaves… a change that made me struggle for almost a week! I only learned this when I decided to create a small script to run the same code on both 6.5 and 8.1 instances.

The code gets an item by its ID, adds a version and prints the number of the newly created version:

Workflow not starting - 1

Sitecore 6.5

When I ran that on 6.5, on an item that previously had 5 versions:

Workflow not starting - 2

The new version is created:

Workflow not starting - 3

And the State is correctly filled:

Workflow not starting - 4

Sitecore 8.1

However, when I ran it on 8.1, where the item previously had 39 versions:

Workflow not starting - 5

The 40th version is created, but the State field is empty!

Workflow not starting - 6

Solution

So what I had to do, to make the same code run correctly on 8.1, was to force the item through the workflow after a new version is created. A screw hard to find, but easy to twist:

Item newVersion = myItem.Versions.AddVersion();

// Read the default workflow defined for the item
// (usually inherited from its _Standard Values)
var defaultWorkflow = newVersion.Fields[FieldIDs.DefaultWorkflow].Value;
ID defaultWorkflowId;
if (!ID.TryParse(defaultWorkflow, out defaultWorkflowId))
    return;

// Manually start the workflow for the newly created version
var workflow = newVersion.Database.WorkflowProvider.GetWorkflow(defaultWorkflowId.ToString());
if (workflow != null)
    workflow.Start(newVersion);
Posted in Sitecore API, Upgrades, Workflow

Developers working in parallel with Continuous Integration builds for Dev and QA

When multiple developers work in parallel (on the same solution, but on different parts), it is very common to use branches in source control to keep each developer working in isolation, eliminating any chance that one developer’s work interferes with another’s. Of course, one day their work will have to be integrated, and that is the moment when things can go wrong. The following article proposes one of many possible setups to enable that isolation.

Creating a branch from another branch is simple and easy. When working with a Continuous Integration system, however, your branch is supposed to be monitored, built and deployed to a place where people can access and test it.

A branch and its servers with Continuous Integration

The following image illustrates a scenario where multiple developers work on the same branch, and a second branch is used as the basis for deployments. The blue boxes on the right represent servers with specific purposes. Since it is now a Sitecore good practice to develop and test in a scaled environment, all environments have two nodes: one for Content Management (CM) and another for Content Delivery (CD).

01 - All in the same branch

The Continuous Integration server monitors the branch “All developers” and deploys changes to DEV (at every change) and QA (manually). When the time for a deployment comes, the work is merged to the Deploy branch. The Continuous Integration server then makes the final integration to the “Deploy-CM” and “Deploy-CD” servers, where things can be tested one last time before the actual deployment.

The problem with this approach is that all developers push their modifications to the very same branch, so deployments must carry everything in that branch (unless, of course, very granular merges are made, but that can be tricky and error-prone). To enable a safer separation, the following setup is proposed.

Multiple Branches

The image below now shows a second branch, with its respective DEV and QA servers. Now each developer can do their work in their own branch, keeping a physical separation that guarantees one developer’s work won’t interfere with the other’s.
02 - Two DEV branches

The image also shows that branch “Dev1” (the older branch) was used to create branch “Dev2”, which guarantees that both branches are identical at that point in time. Obviously, each branch will accumulate its own modifications from then on, and code integration becomes a bit harder to execute. As time passes, the branches will diverge more and more, requiring a more careful exercise during the integration, executed at the “Deploy” branch.

Deployments and Reverse Merges

When all developers worked on the same branch, the act of checking in their changes was the integration itself. The Continuous Integration server would push it to the servers, and integrations were continuously made over the same code base. But now, with physically separated branches, developers must communicate and keep their branches as in sync as possible. The following images show a sequence of deployments and reverse merges that keeps both Dev branches in sync while sustaining their separation.

Image 1 – First Deployment (by Dev1) and Reverse Merge (by Dev2)

03 - First Deploy

Image 2 – Second Deployment (by Dev1) and Reverse Merge (by Dev2)

04 - Second Deploy

Image 3 – First Deployment (by Dev2) and Reverse Merge (by Dev1)

05 - Third Deploy

 

Posted in Architecture, Continuous Integration, Dev, Environments, QA

Health Check builds with Continuous Integration - How big and how often do these need to be?

Background

While refactoring the TFS and Continuous Integration structure of a couple of projects, one thing took my attention: Health Check builds were stealing most of the build time, which for me meant long build queues to wait on. The reason they were crowding the queue is that they took a long time to complete (about 10 to 15 minutes) and built too often (every 5 minutes when changes were pending).

Strength or kindness?

Since our CI tool has only two build agents, the obvious answer is to increase that number to properly serve the whole company. But does that tell the whole story? More build agents mean more resources (disk space, licenses, etc.) and, unfortunately, we still live in a world of limited resources. This also means we will rarely have enough resources to properly serve everyone at peak times.

Take as a comparison the limited space most cities have for their roads. Most of the time traffic is fine, but when everyone goes out at the same time, such as at rush hour, there is no room for everybody and we get traffic jams.

Fortunately, it’s easier to interfere with and positively affect our Continuous Integration systems, so why not try a thing or two as a sign of civility, just like not taking too long to order your fast food at a drive-thru?

Health Check builds, how big?

Depending on the Solution Architect who set up the project you can have different things, but in most cases what I see is Health Checks doing everything but deployments. Since we use TDS in our projects, you can make it build a package of the whole deployment, along with some metadata. But is that really necessary? It’s a question I asked myself when I decided to do it differently.

My Health Check ended up being minimalist: only a compilation is executed, while TDS is kept entirely off. No packages, nothing. Builds ended up taking 45 to 90 seconds to finish. Much better!

Health Check builds: how often?

Being tiny means it can build more often. In my case I have it building after changes are detected, with a 5-minute latency to avoid subsequent check-ins causing multiple builds in sequence.

Full builds and Health Checks working together

No doubt the resulting Health Check builds are weaker, as they do just a tiny part of the integration process. To fill the gaps, I also set up full builds working in conjunction with them. These are also set to run automatically when changes are pending, but with a much longer latency. Having them build every 4 or 8 hours ensures a full integration is made once or twice in a normal workday.

Let’s also keep in mind that these deploys can and must be triggered manually by developers, no matter the latency, as soon as they finish the user story they are working on, so they can test what they did on the integration servers before considering their work done.

In short: deployments are made when developers finish their work, or at minimum once or twice a day if pending changes are waiting to be deployed.

Full builds with Asynchronous HTTP calls

Another improvement I made was replacing some synchronous HTTP calls in the full builds with asynchronous ones. They were mainly used to wake up instances after deployments and to publish from CM to CD. In most cases, the build agent doesn’t really need to wait for these calls to respond before going to the next step, so we can save its precious time for other teams.

Your impressions?

What about your experiences: do you agree or disagree? What other factors were left out of this analysis? Let me know your thoughts!

Posted in Continuous Integration, Health Check builds, Team City

Automatic check-up of Sitecore Server Roles with PowerShell

This week I had the boring task of checking a Sitecore installation against the Excel spreadsheet provided by Sitecore, to make sure all configs are enabled or disabled according to the server role.

You can go cross-eyed doing this manually, so of course I looked for a way to automate it. I ended up discovering this simple PowerShell script written by Michael West. The only issue with the script is that it touches the files while checking, something I’d never dare do in production.

I have modified his script so that, instead of touching the files, it takes note of the files that differ from what Sitecore’s spreadsheet specifies, so one can compare them manually afterwards. The resulting script can be seen at this link.

Multiple roles

But what if you want the same Sitecore installation handling multiple roles? According to Martina Welander, the general recommendation for mixed roles is that, if something must be enabled anywhere, that config must be enabled. While this is not a rule we can trust 100%, the script doesn’t change anything anyway, so it won’t hurt to run this check.

You can check for multiple roles by adding them separated by commas (e.g.: CM, RPT).

Posted in Uncategorized

Less than a week to the Sitecore Symposium 2016 in New Orleans!

Countdown to Symposium

I’m totally excited for the days to come: next week the Sitecore Symposium and MVP Summit 2016 starts in New Orleans.

It will be my first experience at an official Sitecore event. A great opportunity for in-person meetings with Sitecore representatives, other partners and service providers. Particularly exciting is the chance to get together and taste what other Sitecore MVPs are thinking and saying.

I’m touching down in New Orleans along with my colleagues from Nonlinear Creations; it will be a week full of social and technical experiences.

When I come back from this experience, expect to find articles and other materials with the inspiration I’ll bring!

Posted in MVP Summit, Sitecore Symposium

Slowness with Custom Properties on .NET Security Provider

If you have ever used the standard .NET Security Provider with Sitecore, you may love how easy it is to create and use Custom Profile Properties, where data can be easily saved on user profiles. But a huge issue can emerge if you attempt to retrieve users from your database by one of these custom properties.

The Problem

Let’s say you have a custom property called Document ID, and for some reason you wish to retrieve the user that has a certain number in it – for instance, if your users can log in to your website using either their logins or their Document IDs – then you may have something like this in your code:

var userFound = UserManager.GetUsers().FirstOrDefault(f => f.Profile["Document ID"] == inputDocumentId);

This simple code can bring you a big headache because of the way the .NET Security Provider builds the SQL query responsible for executing your LINQ expression. Since all custom properties are stored as serialized metadata, they are simply not directly queryable. What the Security Provider does instead is one SELECT query for each user in your database, deserializing the custom properties in memory so they can be compared to the user input.

If you have few users in your database, which is usually the case during the development phase, you’ll probably not notice any issue. But after go-live, as your user base grows, it will gradually get slower and slower. In the project that inspired this blog post, we had a sudden data load of more than 20k users, and as a consequence the system became impracticably slow overnight.

The Solution

One of the possible technical solutions, the one we used in the project in question, was to extend the table aspnet_Users in our Core database. That is the main table used by the .NET Security Provider to store users. What we did was create a new column, Document ID, where this data is stored in a clean format:

aspnet_Users
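Adding the column comes down to a simple DDL statement. A hedged sketch (the column type, length and index name are illustrative choices, not what the project necessarily used):

```sql
-- Illustrative: adds a plain, queryable column to the membership users table.
ALTER TABLE dbo.aspnet_Users ADD [Document ID] NVARCHAR(50) NULL;

-- An index keeps the lookup by Document ID fast as the user base grows.
CREATE INDEX IX_aspnet_Users_DocumentID ON dbo.aspnet_Users ([Document ID]);
```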

After that, we need to attach some code to both the “user:created” and “user:updated” events. This code will be responsible for updating the Document ID column when users are created or updated.

<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <events>
      <event name="user:created">
        <handler type="MyProject.Pipelines.UpdateUserProfileCache, MyProject" method="OnUserCreatedUpdated" />
      </event>
      <event name="user:updated">
        <handler type="MyProject.Pipelines.UpdateUserProfileCache, MyProject" method="OnUserCreatedUpdated" />
      </event>
    </events>
  </sitecore>
</configuration>

Your code will then probably look like the following. The class CustomProfileIndexManager, responsible for doing the actual UPDATE and SELECT at the database, is not shown, but you can easily guess how to build yours.

namespace MyProject.Pipelines 
{
 public class UpdateUserProfileCache
 {
   public void OnUserCreatedUpdated(object sender, EventArgs args)
   {
     var scArgs = (SitecoreEventArgs)args;
     if (!scArgs.Parameters.Any())
       return;

     var user = (System.Web.Security.MembershipUser)scArgs.Parameters.First();

     // Will only act for "extranet" users
     var username = user.UserName;
     if (!username.StartsWith("extranet"))
       return;

     var scUser = Sitecore.Security.Accounts.User.FromName(username, false);

     // Following method is responsible for saving "Document ID" 
     // into the respective column at the database
     CustomProfileIndexManager.SaveUserToCache(scUser);
   }
 }
}

So your problematic code would be replaced by something like this:

// Will do a simple "SELECT * FROM aspnet_Users WHERE [Document ID] = 'xxxx'"
var userFound = CustomProfileIndexManager.GetUserByDocumentId(inputDocumentId);

 

What about the existing base of users?

Of course, that will only cover users that are new or updated. If you already have a certain base of users, you can build a simple script to “touch” all of them, such as this:

 var startProcess = DateTime.Now;
 Response.Write("<font color=\"red\">Loading users...</font>");
 Response.Flush();

 // Get all users
 var users = DomainManager.GetDomain("extranet").GetUsers();
 Response.Write(string.Format("OK! ({0}ms)<hr>", DateTime.Now.Subtract(startProcess).TotalMilliseconds));
 Response.Flush();

 // Loop into all users
 var counter = 0;
 foreach (var user in users)
 {
   var startUserProcess = DateTime.Now;

   counter++;
   if ((counter % 10) == 0)
   {
     Response.Write(string.Format("--- Total time: {0} minutes<br>", DateTime.Now.Subtract(startProcess).TotalMinutes));
     Response.Flush();
   }
   Response.Write(string.Format("User #{0} - Email: {1} - Processing...", counter, user.Profile.Email));

   // Following method is responsible for saving "Document ID" 
   // into the respective column at the database
   CustomProfileIndexManager.SaveUserToCache(user);

   Response.Write(string.Format(" OK! ({0}ms)<br/>", DateTime.Now.Subtract(startUserProcess).TotalMilliseconds));
 }
 Response.Write(string.Format("<h3>TOTAL TIME: {0} minutes</h3>", DateTime.Now.Subtract(startProcess).TotalMinutes));
 Response.Flush();
 Response.End();

 

Posted in Development, Security Provider

Faster Sitecore for Development - All in a single include

Since Sitecore 8, load times after builds have been getting slower due to the heavy usage of SPEAK in the new UI. There is a great post by Kam Figy that summarizes the reasons and how to improve this performance.

Tired of going back to his article and repeating the steps for every project, I decided to isolate everything in a single include. There are three options to improve speed, and the patch automatically covers two of them:

  1. Disable SPEAK Precompilation – Covered!
  2. Optimize Compilation – Requires manual change at the Web.config (Check the patch file for instructions)
  3. Disable the SPEAK Experience Editor – Covered!

Please make sure you read the whole article and you’re aware of the impacts.

When you unzip this file you will see the patch is inside a folder named “XFaster”. Place this folder, and not the include file alone, inside the Include folder. It has to be there due to a conflicting setting in “/Include/ContentTesting/Sitecore.ContentTesting.config” that would otherwise overwrite our settings.

Posted in Development

SPEAK: ItemTreeView with multiple Roots

I’ve been studying SPEAK recently for a module I’m building, and it’s been a lot of fun so far. Of course, when learning anything that imposes a different paradigm, it’s common to get stuck on “simple” things that, in other programming realities, are easy to accomplish.

And then it happened to me: the ItemTreeView component offers exactly the UI experience I want to provide. It lists Sitecore items in a tree, just like the Content Editor, but also offers a way for the user to select items with checkboxes. That is perfect, but there’s one problem: you can have only one single root for each ItemTreeView component.

ItemTreeView - Out of the Box

This limitation simply made impossible one requirement I had, which is to list a certain template with all its fields and Standard Values (OK) along with all its Base Templates (impossible). Since we can’t predict how many Base Templates (if any) a certain template has, we can’t simply add one or two ItemTreeViews to the SPEAK page and expect them to cover all needs. Instead, we need to be able to dynamically add multiple roots to an ItemTreeView. But how?

First Try – Google

As a good boy I first asked Google, but it looks like SPEAK components are not yet widely debated in forums and communities worldwide. Most of the documentation consists of introductory step-by-steps, along with the official documentation, which is descriptive but not very detailed in terms of what you can do with a certain SPEAK component. Nothing really targeting the ItemTreeView component in general, or my need for multiple roots in particular.

Second try – Sitecore Community

For those who don’t know yet, the new Sitecore Community website, built on top of Zimbra/Telligent Community, is the main resource for trading Sitecore experiences. After a quick search, I also saw no entries discussing what I needed, so I decided to start my own question topic: “ItemTreeView with SPEAK: Having multiple RootItems? Dynamically adding a new one? (with C#)”

Again nothing came out of that topic, so I started to think this might be a good thing to contribute, while at the same time achieving what I was trying to have in my component. This post brings the whole solution, along with a Sitecore package with the final code and items that you can download and use.

Ok, nevermind, I’ll build my own

The ItemTreeView component does not support multiple roots out of the box, so I ended up solving this issue by creating a new version of the ItemTreeView component that does.

Looking at the original ItemTreeView component (core:/sitecore/client/Business Component Library/version 1/Layouts/Renderings/Lists and Grids/ItemTreeView) brought me to its View file (\sitecore\shell\client\Business Component Library\Layouts\Renderings\ListsAndGrids\TreeViews\ItemTreeView.cshtml).

These are the important parts we want to pay special attention to:

  • Line 17 – It’s where the rootId configured by the user is taken:
    var rootItemId = userControl.GetString("RootItem");
  • Lines 56 to 65 – RootId is used to retrieve the real Sitecore item:
    Item rootItem = null;
    if (!string.IsNullOrEmpty(rootItemId))
    {
        rootItem = database.GetItem(rootItemId, Language.Parse(contentLanguage));
    }
    if (rootItem == null)
    {
        rootItem = database.GetRootItem();
    }
  • Lines 67 to 69 – The TreeView component is set up with the configured Root Item:
    var rootItemIcon = Images.GetThemedImageSource(!string.IsNullOrEmpty(rootItem.Appearance.Icon) ? rootItem.Appearance.Icon : "Applications/16x16/documents.png", ImageDimension.id16x16);
    userControl.SetAttribute("data-sc-rootitem", rootItem.DisplayName + "," + rootItem.Database.Name + "," + rootItem.ID + "," + rootItemIcon);
    userControl.SetAttribute("data-sc-rootitempath", rootItem.Paths.Path);
  • Lines 86 to 88 – The div container is output with its configurations
    <div @htmlAttributes>
    <ul></ul>
    </div>

My first experiment was to duplicate the div markup, so I had it twice:

<div @htmlAttributes>
<ul></ul>
</div>
<div @htmlAttributes>
<ul></ul>
</div>

This ended up showing two identical roots – and the “twin” root worked perfectly. That was the confirmation I needed to create an improved version of the ItemTreeView component that can spit out multiple roots. With the goal of simplicity, my component acts the same as the original ItemTreeView, but its “RootItem” property now accepts not just one, but multiple IDs (in a pipe-delimited string).

Here are the steps I took:

STEP 1 – Create the new component

  • Duplicated the original component and gave it the name “ItemTreeView2″
  • At the duplicated item, deleted the children “ItemTreeView Parameters” (we are going to use the original parameters template)
  • “Parameters Template” field – it should stay pointing to the original (Client/Business Component Library/version 1/Layouts/Renderings/Lists and Grids/ItemTreeView/ItemTreeView Parameters)
  • At the “Path” field I made it point to my new View file: /sitecore/shell/client/Business Component Library/Layouts/Renderings/ListsAndGrids/TreeViews/ItemTreeView2.cshtml

STEP 2 – Make it accept multiple Roots

  • Comment the original line:
    //var rootItemId = userControl.GetString("RootItem");
  • Add our new logic:
    var rootItemIds = userControl.GetString("RootItem").Split('|');
    var rootItemId = rootItemIds.FirstOrDefault();

Most of the original component will stay as is, and the first Root will be handled exactly how it is natively done.

STEP 3 – Make it spit the other Roots

  • At the very bottom of the View, we are going to create a new list of HtmlAttributes, which is here used to represent the Tree setup:
    @{
        var lstAttributes = new List<HtmlString>();
        for (var i = 1; i < rootItemIds.Length; i++)
        {
            rootItem = null;
            if (!string.IsNullOrEmpty(rootItemIds[i]))
            {
                rootItem = database.GetItem(rootItemIds[i], Language.Parse(contentLanguage));
            }
            if (rootItem == null)
            {
                rootItem = database.GetRootItem();
            }
            rootItemIcon = Images.GetThemedImageSource(!string.IsNullOrEmpty(rootItem.Appearance.Icon) ? rootItem.Appearance.Icon : "Applications/16x16/documents.png", ImageDimension.id16x16);
            userControl.SetAttribute("data-sc-rootitem", rootItem.DisplayName + "," + rootItem.Database.Name + "," + rootItem.ID + "," + rootItemIcon);
            userControl.SetAttribute("data-sc-rootitempath", rootItem.Paths.Path);
            lstAttributes.Add(userControl.HtmlAttributes); 
        } 
    }
  • And then have the loop for the HTML container:
    @foreach (var htmlAttr in lstAttributes)
    {
        <div @htmlAttr>
        <ul></ul>
        </div>
    }
    

And that’s all!

Now I can add a pipe-delimited list of IDs, either by adding it to the rendering setup or by using my PageCode file:

TreeDsBaseTemplates.Parameters["RootItem"] = String.Join("|",
    ComponentItem.DatasourceTemplate.BaseTemplates.Select(p => p.ID.ToString()).ToArray());
TreeDsBaseTemplates.Parameters["Database"] = ComponentItem.DatasourceTemplate.Database.Name;

This way my module now has two ItemTreeViews: the first is a normal (native) one, the other is my extended control with roots being dynamically added:

ItemTreeView2

The package

Here is the resulting package of the new ItemTreeView2 SPEAK component. Feel free to download and use it!

Enjoy!

Posted in Development, SPEAK

Mass data processing with Rules and Actions - The Sitecore Rule Processor Module

One of the most laborious things in any CMS, and Sitecore is no different, is the mass processing and transformation of data. It’s especially remarkable when you spend a day or two manually selecting and editing individual items for whatever reason. The first time I felt that pain, I ended up obsessed with a tool that could minimally automate these kinds of jobs.

A bit of History

Surprisingly, I couldn’t find anything, so the only options were to sit and wait for something to arise, or to create my own tool. Born eager, I could not stay still and wait to, sooner or later, be parachuted into my next nightmare. I can’t sleep with that!

ETL?

When I started looking for options, based on some of my experiences with other systems, my first thought was to build or extend an ETL tool. ETL, which stands for “extract, transform and load”, is commonly used in data warehouse systems to massively retrieve, filter and modify data across different sources (SQL servers, XML files, Excel spreadsheets, web services, etc).

With so many good tools on the market, building an ETL tool from scratch would not be smart on my part. My best choice would be to pick an existing tool and make it able to read and write Sitecore databases. The ItemWebAPI would make that possible, but it is still very low-level. It would also require significant effort to create the configurations, business logic and interfaces to make it connect and speak with Sitecore.

It stood reverberating in my mind…

Sitecore Rocks?

I also considered creating an extension to Sitecore Rocks. That sounded like a good idea, as Sitecore Rocks already handles a considerable portion of what is needed to connect and interact with Sitecore databases. It also comes with XPath and query builders, whose logic could be used by my module to retrieve data from Sitecore. On the other hand, it would require Visual Studio to run, which would limit the module’s reach, and some UI and low-level implementation would still be required.

And I let it reverberate in my mind…

A Sitecore Module?

Speaking of scope, my tool would preferably be a Sitecore Module rather than anything else. This way I would save time connecting to Sitecore, as the code would be executed inside it. I would also save energy translating data back and forth as it reads and writes data on Sitecore, since I could simply use the Sitecore API as usual. That would also make things easier, as the whole environment is familiar to me, and I wouldn’t have to dig into technologies I’m not familiar with. I was trying to be pragmatic in my goal of shielding myself from another nightmare with minimum effort. Extending an existing Sitecore feature would be the best option.

Sitecore Buckets?

My first real attempt was to extend the Sitecore Buckets feature (image below). It has a nice UI for filtering and listing items, as well as some Search Operations that we can use to modify the items selected.

Sitecore Buckets

The native Buckets system would still have to be modified in order to be used for what I wanted, but it looked like the best option at the time. Some investigation demonstrated that the change wouldn’t be trivial; more study and investigation would be needed. I was also not very satisfied with the way items are filtered, conditions are programmed, and how user-friendly it is to build a query.

For some time it simply stood reverberating in my mind…

The Rules Engine – An insight from Digital Marketing

It was during the preparation of a Sitecore training course for Content Editors, targeting Digital Marketing features, that I first considered using the Sitecore Rules Engine for my purposes. Although it is originally used for other things, such as content personalization or event-triggered custom tasks, the experience was very close to what I had in mind. The way conditions and actions are chained is perfect for flexibility and is also very user-friendly.

Rules

That is actually perfect for my intents:

  • Rules would now be a way to save a data transforming pattern
  • Conditions could be used to select and filter content
  • Actions would be responsible for data processing and transformations.
  • They are also very easy to code

Then the idea started to flow out of my mind into a real tool…

The Sitecore Rule Processor

Available at the Sitecore Marketplace, Sitecore Rule Processor is the result of my efforts to automate data transformation in Sitecore. Once it is installed, an icon is shown whenever a Rule is selected in the Content Editor.

Processor Icon

When clicked, it brings up a window to process the rule, where users can easily filter items that match the rule’s conditions, then execute its actions against all or some of the items:

Set Root and Bring Results

This way any user can visually build queries to retrieve items, and set up actions to transform them.

The module comes with a series of custom Actions to increase the number of things a user can do to process and modify items, such as:

  1. Add a version to the item at a certain language;
  2. Change the item’s template;
  3. Copy, Move or Clone items to a certain path;
  4. Delete items;
  5. Empty an item’s field;
  6. Log a message with info taken from the item;
  7. Publish the item;
  8. Replace a string in a field of the item;
  9. Replace a string in an item’s name;
  10. Run a script;
  11. Serialize the item;
  12. Set a value at an item’s field;
  13. Set the value of an item’s field as the value of another item’s field;
  14. Set the value of an item’s field as the ID of another item;
  15. Set a workflow state.

I still have some useful actions left in my backlog, which will expand the value of this module even more.

Real life experience: existent actions

Then it finally happened: the next “monkey job” was lurking in the clouds, parachuted onto me without notice. One of our projects came with this demand: a “news article” template had some of its look and feel updated, which imposed the need to replace some content of a Rich Text field in all items (there were around 200 items in the repository). We had to replace all occurrences of class=”old-style” with our new class=”new-style”.

My rule was then composed by:

  • Conditions
    • Where template=’News Article’
  • Actions
    • Replace a string in a field of the item
      • Replace class=”old-style”
      • By class=”new-style”
      • At field “Body”

 

Real life experience: a custom action

The above covered part of our needs; unfortunately, some replacements were not that straightforward. For instance, some of the markup had to be replaced by different tags. In our case, some tables inserted by the client had to be replaced by better-formed <figure> tags.

This demanded the creation of a custom action, which parses the HTML, applies the replacement logic to its Document Object Model, then saves it back to the item. To learn how to create a custom action, please check this article by John West.

My rule ended up very similar to the previous:

  • Conditions
    • Where template=’News Article’
  • Actions
    • My custom action for DOM replacements
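A custom action along these lines can be sketched as follows. This is an illustrative sketch only: the class name, the target field (“Body”) and the HtmlReplacer.ReplaceTables helper are hypothetical, not the project's actual code; see John West's article for the full recipe.

```csharp
using Sitecore.Rules;
using Sitecore.Rules.Actions;

namespace MyProject.Rules.Actions
{
    // Hypothetical custom action: rewrites the markup of the "Body" field.
    public class ReplaceDomAction<T> : RuleAction<T> where T : RuleContext
    {
        public override void Apply(T ruleContext)
        {
            var item = ruleContext.Item;
            if (item == null)
                return;

            var html = item["Body"];

            // HtmlReplacer.ReplaceTables is a hypothetical helper that would
            // parse the HTML and turn the old client-inserted tables into
            // well-formed <figure> markup.
            var newHtml = HtmlReplacer.ReplaceTables(html);

            // Save the transformed markup back to the item
            item.Editing.BeginEdit();
            item["Body"] = newHtml;
            item.Editing.EndEdit();
        }
    }
}
```

Once registered like any other rule action, it can be selected in the Rule Processor window and executed against the filtered items.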

Please feel free to download and test the module (available on the Sitecore Marketplace at this link), and expand it as you need. Also, please let me know if you have questions or any feedback!

Posted in Actions, Rules, Sitecore Rule Processor