Removing duplicates from a PageDataCollection
Posted on November 27, 2009 by Frederik Vig in C#, EPiServer

Today I had to remove duplicate pages from a PageDataCollection. I checked the SDK for a method that would help me with this, but couldn't find one. I then went to the Filters namespace to see if there was a filter for it, but no luck.
I then remembered that System.Linq has a great extension method for collections that implement IEnumerable&lt;T&gt;, called Distinct, that does exactly what I want.
The code is pretty simple.
```csharp
// using System.Linq;
myPageDataCollection.Distinct();
```
Simple and elegant, but of course it didn't work! Distinct uses the default equality comparer, and since PageData doesn't override Equals, every page instance is considered unique. (Note also that Distinct returns a new sequence rather than modifying the collection in place.)
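To see why, here's a minimal, EPiServer-free sketch. The Page class below is a hypothetical stand-in for PageData, not part of the EPiServer API: because it doesn't override Equals and GetHashCode, Distinct falls back to reference equality and keeps both instances with the same ID.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical stand-in for PageData: no Equals/GetHashCode override.
class Page
{
    public int Id;
    public Page(int id) { Id = id; }
}

class Program
{
    static void Main()
    {
        var pages = new List<Page> { new Page(1), new Page(1), new Page(2) };

        // Reference equality: the two Page(1) instances are "different",
        // so nothing is removed.
        Console.WriteLine(pages.Distinct().Count()); // 3
    }
}
```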
Custom comparer
If you take a look at the documentation for the Distinct method, you'll see an overload that takes an IEqualityComparer&lt;T&gt; for comparing elements.
Below is a custom comparer for PageData that implements the IEqualityComparer&lt;PageData&gt; interface. Pretty simple code.
```csharp
public class PageDataComparer : IEqualityComparer<PageData>
{
    public bool Equals(PageData a, PageData b)
    {
        return a.PageLink.CompareToIgnoreWorkID(b.PageLink);
    }

    public int GetHashCode(PageData pageData)
    {
        // Hash on the page ID so that versions of the same page,
        // which differ only in work ID, produce the same hash code
        // (required by the IEqualityComparer contract, since Equals
        // ignores the work ID).
        return pageData.PageLink.ID.GetHashCode();
    }
}
```
The updated call to the Distinct method now looks like this. Remember that Distinct returns a new sequence rather than filtering the collection in place, so assign the result.

```csharp
var uniquePages = myPageDataCollection.Distinct(new PageDataComparer());
```
Custom filter
We can do the same with a custom filter.
```csharp
public class FilterRemoveDuplicates : IPageFilter
{
    public void Filter(PageDataCollection pages)
    {
        var pageReferences = new List<PageReference>();

        // Iterate backwards so that RemoveAt doesn't shift the
        // indexes of pages we haven't visited yet.
        for (int pageIndex = pages.Count - 1; pageIndex >= 0; pageIndex--)
        {
            PageData pageData = pages[pageIndex];

            if (pageReferences.Contains(pageData.PageLink))
            {
                pages.RemoveAt(pageIndex);
            }
            else
            {
                pageReferences.Add(pageData.PageLink);
            }
        }
    }

    public void Filter(object sender, FilterEventArgs e)
    {
        Filter(e.Pages);
    }

    public bool ShouldFilter(PageData page)
    {
        throw new NotImplementedException();
    }
}
```
```csharp
new Filters.FilterRemoveDuplicates().Filter(myPageDataCollection);
```
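The FilterEventArgs overload also lets the filter plug into EPiServer's filtering events. As a sketch, assuming a standard EPiServer:PageList control named MyPageList declared in the template markup (the control name is an assumption for illustration), you could wire it up in the code-behind:

```csharp
// In the template's code-behind. MyPageList is an assumed
// EPiServer:PageList control declared in the .aspx markup.
protected override void OnLoad(EventArgs e)
{
    base.OnLoad(e);

    // Remove duplicates before the list is data bound.
    MyPageList.Filter += new Filters.FilterRemoveDuplicates().Filter;
}
```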