http://darrelltunnell.net/Darrell's Blog20182018-08-11T17:56:44ZWelcome!http://darrelltunnell.net/blog/2017/01/24/csla-and-asp.net-coreCSLA and ASP.NET Core2017-01-24T17:50:00Z<p>I am a fan of CSLA and I recently came across a need to make a shiny CSLA business layer work nicely within the context of an ASP.NET Core application.
This post is aimed at CSLA developers with a similar need.
As of the <a href="https://www.nuget.org/packages/CSLA-Core/4.6.500">current release</a>, there are a few things that you will need to take care of in order to get CSLA working smoothly, and I will cover those in this post.</p>
<!--more-->
<h3 id="csla.applicationcontext">Csla.ApplicationContext</h3>
<p>CSLA developers should be familiar with <code>Csla.ApplicationContext</code>, which provides the means to access useful context, as well as the current <code>User</code> / <code>IPrincipal</code>.
However when running under ASP.NET Core, <code>Csla.ApplicationContext</code> doesn't work correctly: CSLA decides that your application is not a web application (because <code>HttpContext.Current</code> is null under ASP.NET Core), and so
it ends up storing things that should be stored in <code>HttpContext</code> on the current <code>Thread</code> instead.</p>
<p>Overcoming this is pretty straightforward. You just need to implement an extensibility point that Rocky has provided, called an <code>IContextManager</code>.</p>
<h3 id="icontextmanager-for-asp.net-core">IContextManager for ASP.NET Core.</h3>
<p>Here is an implementation of an <code>IContextManager</code> that resolves the <code>HttpContext</code> using the <code>IHttpContextAccessor</code> instead:</p>
<pre><code class="language-csharp">
/// <summary>
/// Application context manager that uses IHttpContextAccessor when resolving the HttpContext
/// to store context values.
/// </summary>
public class HttpContextAccessorContextMananger : IContextManager
{
    private const string _localContextName = "Csla.LocalContext";
    private const string _clientContextName = "Csla.ClientContext";
    private const string _globalContextName = "Csla.GlobalContext";

    private readonly IServiceProvider _serviceProvider;

    public HttpContextAccessorContextMananger(IServiceProvider serviceProvider)
    {
        _serviceProvider = serviceProvider;
    }

    protected virtual HttpContext GetHttpContext()
    {
        var httpContextAccessor = (IHttpContextAccessor)_serviceProvider.GetService(typeof(IHttpContextAccessor));
        return httpContextAccessor?.HttpContext;
    }

    /// <summary>
    /// Gets a value indicating whether this
    /// context manager is valid for use in
    /// the current environment.
    /// </summary>
    public bool IsValid
    {
        get { return GetHttpContext() != null; }
    }

    /// <summary>
    /// Gets the current principal.
    /// </summary>
    public System.Security.Principal.IPrincipal GetUser()
    {
        return GetHttpContext()?.User;
    }

    /// <summary>
    /// Sets the current principal.
    /// </summary>
    /// <param name="principal">Principal object.</param>
    public void SetUser(System.Security.Principal.IPrincipal principal)
    {
        var context = GetHttpContext();
        if (context != null)
        {
            context.User = (ClaimsPrincipal)principal;
        }
    }

    /// <summary>
    /// Gets the local context.
    /// </summary>
    public ContextDictionary GetLocalContext()
    {
        return (ContextDictionary)GetHttpContext()?.Items[_localContextName];
    }

    /// <summary>
    /// Sets the local context.
    /// </summary>
    /// <param name="localContext">Local context.</param>
    public void SetLocalContext(ContextDictionary localContext)
    {
        var context = GetHttpContext();
        if (context != null)
        {
            context.Items[_localContextName] = localContext;
        }
    }

    /// <summary>
    /// Gets the client context.
    /// </summary>
    public ContextDictionary GetClientContext()
    {
        return (ContextDictionary)GetHttpContext()?.Items[_clientContextName];
    }

    /// <summary>
    /// Sets the client context.
    /// </summary>
    /// <param name="clientContext">Client context.</param>
    public void SetClientContext(ContextDictionary clientContext)
    {
        var context = GetHttpContext();
        if (context != null)
        {
            context.Items[_clientContextName] = clientContext;
        }
    }

    /// <summary>
    /// Gets the global context.
    /// </summary>
    public ContextDictionary GetGlobalContext()
    {
        return (ContextDictionary)GetHttpContext()?.Items[_globalContextName];
    }

    /// <summary>
    /// Sets the global context.
    /// </summary>
    /// <param name="globalContext">Global context.</param>
    public void SetGlobalContext(ContextDictionary globalContext)
    {
        var context = GetHttpContext();
        if (context != null)
        {
            context.Items[_globalContextName] = globalContext;
        }
    }
}
</code></pre>
<h3 id="configuring-csla">Configuring CSLA</h3>
<p>Now that you have this implementation, you need to tell CSLA to use it. In the ASP.NET Core world, we are used to nice extension methods that we can call from our
<code>startup.cs</code> classes to configure things fluently.</p>
<p>CSLA doesn't provide one of these just yet, but we can create one fairly easily:</p>
<pre><code class="language-csharp">
public static class CslaConfiguration
{
    public static IServiceCollection ConfigureCsla(this IServiceCollection services, Action<CslaOptions, IServiceProvider> setupAction = null)
    {
        services.AddSingleton<CslaOptions>((sp) =>
        {
            var options = new CslaOptions();
            setupAction?.Invoke(options, sp);
            return options;
        });
        return services;
    }

    public static IApplicationBuilder UseCsla(this IApplicationBuilder appBuilder)
    {
        // Grab the options.
        var options = appBuilder.ApplicationServices.GetRequiredService<CslaOptions>();

        // Configure CSLA according to the options.
        Csla.Server.FactoryDataPortal.FactoryLoader = options.ObjectFactoryLoader;
        Csla.ApplicationContext.WebContextManager = options.WebContextManager ?? new HttpContextAccessorContextMananger(appBuilder.ApplicationServices);
        return appBuilder;
    }

    public class CslaOptions
    {
        public IObjectFactoryLoader ObjectFactoryLoader { get; set; }
        public IContextManager WebContextManager { get; set; }
    }
}
</code></pre>
<p>So we can now configure CSLA in our <code>startup.cs</code> like this:</p>
<pre><code class="language-csharp">
public class Startup
{
    // This method gets called by the runtime. Use this method to add services to the container.
    public void ConfigureServices(IServiceCollection services)
    {
        // Configure CSLA.
        services.ConfigureCsla((options, sp) =>
        {
            // Could configure other CSLA hooks / options here in future.
        });
    }

    // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
    public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
    {
        app.UseCookieAuthentication(new CookieAuthenticationOptions
        {
            AuthenticationScheme = CookieAuthenticationDefaults.AuthenticationScheme,
            LoginPath = new PathString("/Account/Login")
        });
        app.UseCsla();
    }
}
</code></pre>
<p>CSLA <code>ApplicationContext</code> should now behave correctly thanks to the custom <code>IContextManager</code> for ASP.NET Core.</p>
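<p>As a quick sanity check, you can read and write the context from a controller action and confirm that values now live in <code>HttpContext.Items</code> for the duration of the request, rather than on the thread. This snippet is purely illustrative - the controller and key names are made up:</p>
<pre><code class="language-csharp">
public class HomeController : Controller
{
    public IActionResult Index()
    {
        // With the custom IContextManager registered, this value is stored
        // against the current HttpContext, not the current Thread.
        Csla.ApplicationContext.LocalContext["RequestStartedUtc"] = DateTime.UtcNow;

        // The CSLA user should now be the same principal that ASP.NET Core sees.
        var user = Csla.ApplicationContext.User;

        return View();
    }
}
</code></pre>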
<h3 id="conclusion">Conclusion</h3>
<p>I also went a few steps further in my particular application, to enable DI to flow from my ASP.NET Core <code>IServiceProvider</code> into my CSLA business tier, but that's a topic for another article.</p>
<p>I suspect that Rocky will release improved support for ASP.NET Core in the future, so that this post becomes obsolete.
Until then, Bon Appétit!</p>
http://darrelltunnell.net/blog/2016/07/25/aspnet-core-taghelper-s-a-better-addtaghelper-type-resolverASPNET Core TagHelper's - A Better @addTagHelper type resolver2016-07-25T18:50:00Z<h3 id="whats-this-about">What's this about?</h3>
<p>This is about TagHelpers in ASP.NET Core, and how to get more flexible <code>@addTagHelper</code> directives.</p>
<!--more-->
<p>Suppose your application loads some assemblies dynamically - for example, from a plugins folder - and those assemblies contain <code>TagHelper</code> types.</p>
<p>In startup.cs you would have something like this to register your assemblies with the MVC parts system:</p>
<pre><code class="language-csharp">
var authAssembly = Assembly.LoadFile("C:\\SomePath\\Plugin.Authentication.dll");
mvcBuilder.AddApplicationPart(authAssembly);

var markdownAssembly = Assembly.LoadFile("C:\\SomePath\\Plugin.Markdown.dll");
mvcBuilder.AddApplicationPart(markdownAssembly);
</code></pre>
<p>Now suppose you have a Razor View with some markup that can be targeted by those tag helpers:</p>
<pre><code class="language-html">
<plugin-authentication />
<plugin-markdown visible="true"/>
</code></pre>
<p>If you run your application, those TagHelpers won't work.
This is because you don't have an <code>@addTagHelper</code> directive in your Razor view yet, so Razor doesn't know it should be using them. This is where things get a bit interesting!</p>
<h3 id="lets-add-an-addtaghelper-directive">Let's add an <code>addTagHelper</code> directive</h3>
<p>So we add the directive to our <code>_ViewImports.cshtml</code> file:</p>
<pre><code class="language-bat">@addTagHelper "*, Plugin.Markdown"
</code></pre>
<p>Now when we start our application, BOOM:</p>
<pre><code class="language-bat">
An error occurred during the compilation of a resource required to process this request. Please review the following specific error details and modify your source code appropriately.
/Views/_ViewImports.cshtml
Cannot resolve TagHelper containing assembly 'Plugin.Markdown'. Error: Could not load file or assembly 'Plugin.Markdown' or one of its dependencies. The system cannot find the file specified.
@addTagHelper "*, Plugin.Markdown"
</code></pre>
<p>This is because, by default, MVC does not resolve <code>TagHelper</code> assemblies registered with the parts system (at least as of RTM 1.0.0), so it complains when it processes that directive, saying it can't find such an assembly. By default it can only see assemblies in the bin folder, so it can't see your plugin assembly.</p>
<h3 id="how-do-we-solve">How do we solve this?</h3>
<p>Well if you add this line:</p>
<pre><code class="language-csharp">
mvcBuilder.AddTagHelpersAsServices();
</code></pre>
<p>That registers some replacement services which check the application parts system when resolving tag helper assemblies from the name provided in the <code>addTagHelper</code> directive.</p>
<p>However, while this now works, it's still not ideal, because we still have to add a directive for each <code>plugin</code> before it will work on our pages. So when someone develops a new plugin, it won't work until we modify our <code>_ViewImports.cshtml</code> file and add another line:</p>
<pre><code class="language-csharp">@addTagHelper "*, Plugin.Markdown"
@addTagHelper "*, Plugin.Another"
@addTagHelper "*, Plugin.YetAnother"
</code></pre>
<p>This can be incredibly frustrating: if you want an extensible system where plugins can be installed on the fly, then they should just work, without constant modifications to source code.</p>
<h3 id="so-can-we-do-better">So Can We Do Better?</h3>
<p>Yup. My solution to this issue is to allow <code>globbing</code> in the <code>addTagHelper</code> directive for the assembly name, just like it is already supported for the TypeName portion.</p>
<p>So this is how you do that.</p>
<h3 id="itaghelpertyperesolver">ITagHelperTypeResolver</h3>
<p>We need to create an <code>ITagHelperTypeResolver</code> and implement its <code>Resolve</code> method. This method takes the string provided in the <code>addTagHelper</code> directive and returns all <code>TagHelper</code> types that match that string. We will make our implementation support globbing on the assembly name so it can match <code>TagHelper</code> types across multiple assemblies registered with the <code>Application Parts</code> system, instead of just one.</p>
<p>Here is my quick and dirty implementation, where I took a lot of the code from the Microsoft implementation and added a few tweaks for globbing:</p>
<pre><code class="language-csharp">
public class AssemblyNameGlobbingTagHelperTypeResolver : ITagHelperTypeResolver
{
    private static readonly System.Reflection.TypeInfo ITagHelperTypeInfo = typeof(ITagHelper).GetTypeInfo();

    protected TagHelperFeature Feature { get; }

    public AssemblyNameGlobbingTagHelperTypeResolver(ApplicationPartManager manager)
    {
        if (manager == null)
        {
            throw new ArgumentNullException(nameof(manager));
        }
        Feature = new TagHelperFeature();
        manager.PopulateFeature(Feature);
    }

    /// <inheritdoc />
    public IEnumerable<Type> Resolve(
        string name,
        SourceLocation documentLocation,
        ErrorSink errorSink)
    {
        if (errorSink == null)
        {
            throw new ArgumentNullException(nameof(errorSink));
        }
        if (string.IsNullOrEmpty(name))
        {
            var errorLength = name == null ? 1 : Math.Max(name.Length, 1);
            errorSink.OnError(
                documentLocation,
                "Tag Helper Assembly Name Cannot Be Empty Or Null",
                errorLength);
            return Type.EmptyTypes;
        }

        IEnumerable<TypeInfo> libraryTypes;
        try
        {
            libraryTypes = GetExportedTypes(name);
        }
        catch (Exception ex)
        {
            errorSink.OnError(
                documentLocation,
                $"Cannot Resolve Tag Helper Assembly: {name}, {ex.Message}",
                name.Length);
            return Type.EmptyTypes;
        }
        return libraryTypes;
    }

    protected IEnumerable<System.Reflection.TypeInfo> GetExportedTypes(string assemblyNamePattern)
    {
        if (assemblyNamePattern == null)
        {
            throw new ArgumentNullException(nameof(assemblyNamePattern));
        }

        var results = new List<System.Reflection.TypeInfo>();
        for (var i = 0; i < Feature.TagHelpers.Count; i++)
        {
            var tagHelperAssemblyName = Feature.TagHelpers[i].Assembly.GetName();
            if (assemblyNamePattern.Contains("*")) // Is it actually a pattern?
            {
                if (tagHelperAssemblyName.Name.Like(assemblyNamePattern))
                {
                    results.Add(Feature.TagHelpers[i]);
                }
                continue;
            }

            // Not a pattern, so treat it as a normal assembly name.
            var assyName = new AssemblyName(assemblyNamePattern);
            if (AssemblyNameComparer.OrdinalIgnoreCase.Equals(tagHelperAssemblyName, assyName))
            {
                results.Add(Feature.TagHelpers[i]);
            }
        }
        return results;
    }

    private class AssemblyNameComparer : IEqualityComparer<AssemblyName>
    {
        public static readonly IEqualityComparer<AssemblyName> OrdinalIgnoreCase = new AssemblyNameComparer();

        private AssemblyNameComparer()
        {
        }

        public bool Equals(AssemblyName x, AssemblyName y)
        {
            // Ignore case because that's what Assembly.Load does.
            return string.Equals(x.Name, y.Name, StringComparison.OrdinalIgnoreCase) &&
                   string.Equals(x.CultureName ?? string.Empty, y.CultureName ?? string.Empty, StringComparison.Ordinal);
        }

        public int GetHashCode(AssemblyName obj)
        {
            var hashCode = 0;
            if (obj.Name != null)
            {
                hashCode ^= obj.Name.GetHashCode();
            }
            hashCode ^= (obj.CultureName ?? string.Empty).GetHashCode();
            return hashCode;
        }
    }
}
</code></pre>
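<p>Note that <code>Like</code> in the code above is not a framework method - it's a small glob-matching string extension that you need to supply yourself. A minimal sketch (assuming only <code>*</code> wildcards need to be supported) might look like this:</p>
<pre><code class="language-csharp">
using System.Text.RegularExpressions;

public static class StringGlobExtensions
{
    // Returns true if the input matches a simple glob pattern such as "Plugin.*".
    // The pattern is escaped, '*' is widened to '.*', and the regex is anchored.
    public static bool Like(this string input, string pattern)
    {
        var regexPattern = "^" + Regex.Escape(pattern).Replace("\\*", ".*") + "$";
        return Regex.IsMatch(input, regexPattern, RegexOptions.IgnoreCase);
    }
}
</code></pre>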
<p>Now we just register this on startup, after we have registered <code>MVC</code>:</p>
<pre><code class="language-csharp">
services.AddSingleton<ITagHelperTypeResolver, AssemblyNameGlobbingTagHelperTypeResolver>();
</code></pre>
<p>Now we can add a single directive to our <code>_ViewImports.cshtml</code> file, like this:</p>
<pre><code class="language-csharp">@addTagHelper "*, Plugin.*"
</code></pre>
<p>That will include all TagHelpers living in assemblies that match the glob. We can drop new plugins in and their tag helpers will light up automatically.</p>
<p>You are welcome.</p>
http://darrelltunnell.net/blog/2016/03/19/dnn-extensions-sources-packages-what-are-theyDnn Extensions - Sources Packages?2016-03-19T17:50:00Z<p>I have been doing some work on DnnPackager recently, and I've come across the concept of "Source" packages. I have to admit I am not entirely new to these, but I've never personally used them for my projects in the past.</p>
<p>Source packages are basically identical to the ordinary install zips for your Dnn module / extension - i.e. you "install" them into your Dnn site like any other install package - except that they also include "source code" files, like .cs and .vb files.</p>
<!--more-->
<h3 id="why-would-you-want-to-include-source-code-in-your-install-zip">Why would you want to include source code in your install zip?</h3>
<p>Well this is where things get a little interesting.</p>
<p>The two main reasons I can fathom why you would want to include source code in an install zip are that:</p>
<ol>
<li>Your module uses dynamic compilation, and so unless you include source files with the module installation, then it just won't work.</li>
<li>You want to distribute your source code, so that developers (who pay for it?) can open it up in VS and own it / make changes (improvements?). Usually you'd charge for this option, but it's feasible you are just an extremely generous developer (like me) who gives stuff away for free.</li>
</ol>
<p>Number 1 is a necessity really to cater for modules that use dynamic compilation.</p>
<p>Number 2 is an optional thing about you as a developer (or commercial entity) distributing your solution source code in a format that third parties can "own" - irrespective of whether you have used dynamic compilation or not.</p>
<p>Note: If you are using dynamic compilation for your module, then people already have the ability to make changes to the code by simply going into the website directory after the module has been installed and modifying the code files. But you already knew that right!! Whether they are legally entitled to do so, of course, would be down to the licence agreement.</p>
<p>Number 1 and 2 are different.</p>
<h3 id="why-are-they-different">Why are they different?</h3>
<p>Because in the first scenario, you are giving IIS the files it needs to compile and run your code within a Dnn website instance. In the second scenario, you are giving <strong>developers</strong> the files they need, to open up your project / solution and <strong>build</strong>, and compile your code, in a manner that spits out everything needed by scenario 1. In other words, the build and compilation that developers do, produces the output that's needed within the website for the compilation that IIS does.</p>
<h3 id="why-was-that-last-bit-important">Why was that last bit important</h3>
<p>Because the files related to the build that developers do - i.e. the ones that produce the output that actually needs to be installed to the Dnn site - arguably have no business being installed into a Dnn website. Project files, solution files, key files, etc. are all completely unrelated to the working / running of your module within Dnn, and have nothing to do with IIS dynamic compilation. They shouldn't be installed in a website, period (imho).</p>
<h3 id="dual-purpose">Dual purpose</h3>
<p>There seems to be a dual purpose for the sources package that doesn't sit right with me.
Using it to install source code into the website to support dynamic compilation seems like what it is meant for imho - it is a Dnn installation zip after all.</p>
<p>Using it to provide a third party with your VS solution / project files, so that they can open up the solution in an IDE and build and compile the code, is a completely different scenario, and I can't see how that second scenario can work reliably just by including a .csproj in a Dnn sources install zip - except in the most simplistic and basic of scenarios, which rarely happen in the real world.</p>
<h3 id="example-of-some-issues-with-including-sln-csproj-in-a-sources-zip-package">Example of some issues with including Sln / Csproj in a sources zip package.</h3>
<p>Currently, if you use widely available project templates to produce "sources" packages, they will, by default, produce a sources "zip" file for each of the module projects in your solution, and this will contain source code files copied from your project, as well as the csproj and sln file. (I think the sln will only get included if it lives within the project directory.)</p>
<p>Already we hit an issue, as if you have multiple projects in your solution, and the sln file lives in a parent directory of those projects like this:</p>
<pre><code>
solution/mysln.sln
solution/projectA/projectA.csproj
solution/projectB/projectB.csproj
</code></pre>
<p>(which is fairly normal) then the sln file usually won't be included in the sources packages for any of your particular modules as it doesn't live directly within a project directory.</p>
<p>Secondly, if ProjectA has a project reference to ProjectB, and someone downloads the sources package for your projectA module, and opens up the csproj file that you have included in that sources package - the project is going to have a missing project reference to projectB so it won't compile.</p>
<p>There are yet more problems. If your .csproj files reference assemblies from some lib directory within your checkout directory, then because this lib directory won't be included in the sources package (it doesn't live within the project dir), anyone opening the project file in VS will see missing assembly references and will have to correct them manually - otherwise the solution won't compile.</p>
<p>If your project files include some custom build targets that live on your machine, or within your checkout directory somewhere - you guessed it: the person opening the .csproj file is going to have issues, because those targets won't be present at the same location within the sources package.</p>
<h3 id="alternatives">Alternatives?</h3>
<p>If you want to give away your VS solution (or sell the source) to a third party, there are better / easier ways to provide access to it without shoehorning it in to the dnn install zip imho!</p>
<p>The easiest may be to just zip up your entire solution (checkout directory), and allow that to be downloaded from some protected location. This does not have to be in a "dnn" install package format - just a simple zip file that the person receiving it can extract before opening up the VS sln file. You want it to be as if they had just checked out the solution from source control and are now opening the VS sln file - just like you do, right?</p>
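<p>If you want to automate producing that zip (perhaps as part of a release build), a rough sketch using <code>System.IO.Compression</code> is below - the paths are illustrative, and in practice you'd probably want to exclude <code>bin</code> / <code>obj</code> folders first:</p>
<pre><code class="language-csharp">
using System.IO.Compression;

// Zip the entire checkout directory into a single distributable archive.
// Adjust these paths to your own solution layout.
var sourceDir = @"C:\Dev\MySolution";
var zipPath = @"C:\Dev\MySolution-src.zip";
ZipFile.CreateFromDirectory(sourceDir, zipPath, CompressionLevel.Optimal, includeBaseDirectory: false);
</code></pre>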
<p>If there are some pre-requisites to being able to open and build the solution, add a readme to the zip that explains what a developer must do before attempting to open the solution. This is usually handy to have in your source control anyway - in case you ever need to checkout and open the solution on a new machine one day that doesn't have your dependencies set up. These should be the same steps that any developer new to the company has to go through (including you) when checking out the code for the first time and wanting to open it.</p>
<h3 id="how-does-this-all-tie-in-with-dnnpackager">How does this all tie in with DnnPackager?</h3>
<p>Well, for the next release of DnnPackager, I have added rudimentary support for Sources packages (thanks to <a href="https://github.com/nvisionative">@nvisionative</a> for requesting this feature) - so it will now produce "sources" packages alongside the standard install zip. However this is currently for the purpose of supporting modules that need to include source files in their installation process into Dnn, which I suspect will mainly be ones that use dynamic compilation.</p>
<p>It won't include .csproj files or .sln files because at this point in time, I can't see how including them would lead to a reliable experience for the developer opening these up at the other end.</p>
<p>Disagree? Leave some comments below, I'd love to be convinced - or to just hear your views!</p>
http://darrelltunnell.net/blog/2016/03/19/dnnpackager-v2-0-6DnnPackager - v2.0.62016-03-19T17:50:00Z<h3 id="dnnpackager-release-v2.0.6">DnnPackager Release - v2.0.6</h3>
<p>A short post to announce that a new minor release of DnnPackager is out.</p>
<p>You can find the release notes here explaining what's new: <a href="https://github.com/dazinator/DnnPackager/releases/tag/2.0.6%2B2">https://github.com/dazinator/DnnPackager/releases/tag/2.0.6%2B2</a></p>
<p>For an introduction to DnnPackager in general <a href="http://darrelltunnell.net/blog/2015/12/01/dnnpackager-getting-started/">see here</a></p>
<p>Special thanks to <a href="https://github.com/nvisionative">@nvisionative</a> for requesting this feature.</p>
<!--more-->
http://darrelltunnell.net/blog/2016/01/24/aurelia-and-asp-net-5-mvc-part2ASP.NET 5 Projects - NuGet-NPM-Gulp-Bower-Jspm-Aurelia-Part22016-01-24T17:50:00Z<p><strong>This post is part two of a series. For part one see <a href="http://darrelltunnell.net/blog/2015/08/16/aurelia-and-asp-net-5-mvc">here</a></strong></p>
<h3 id="part-2-replacing-bower-with-jspm">Part 2 - Replacing Bower with JSPM</h3>
<p>In <a href="http://darrelltunnell.net/blog/2015/08/16/aurelia-and-asp-net-5-mvc">part 1 of this series</a> we created a shiny new ASP.NET 5 project, and I introduced some fundamentals.</p>
<p>For reasons discussed in <a href="http://darrelltunnell.net/blog/2015/08/16/aurelia-and-asp-net-5-mvc">part 1</a>, let's now go ahead with our first task, which is to ditch Bower in favour of JSPM as our javascript package manager.</p>
<!--more-->
<h3 id="uninstall-bower">Uninstall Bower</h3>
<p>You will notice that your ASP.NET 5 application has a number of bower packages included by default:</p>
<p><img src="/img/bowerpackages.PNG" class="img-fluid" alt="bowerpackages.PNG" /></p>
<p>First, let's uninstall Bower. In your project is a <code>bower.json</code> file. Delete it! (If you can't see it in Solution Explorer, you might need to 'show all files'.)</p>
<p><img src="/img/bowerjson.PNG" class="img-fluid" alt="bowerjson.PNG" /></p>
<p>When you install <code>Bower</code> packages, they are installed under the "lib" folder within your <code>wwwroot</code> directory. So, let's now delete this lib folder which will delete all of these packages.</p>
<p><img src="/img/wwwrootlibfolder.PNG" class="img-fluid" alt="wwwrootlibfolder.PNG" /></p>
<p>After those changes, your project should look something like this:</p>
<p><img src="/img/projectremovedbower.PNG" class="img-fluid" alt="projectremovedbower.PNG" /></p>
<p>With Bower gone and those javascript / css packages deleted, what happens if we run the application now? Let's run it and find out..</p>
<p><img src="/img/runappbowerremoved.PNG" class="img-fluid" alt="runappbowerremoved.PNG" /></p>
<p>As you can see, there are now errors displayed in the browser, and our site looks awful. This makes sense - our application is referencing javascript and css files that used to live in the lib folder, and now they are no longer found because we deleted them.</p>
<p>To fix this situation we'll need to add these packages back to our application, using <code>JSPM</code>, and then fix up the way our application is <code>loading</code> these dependencies (javascript, css files) at runtime.</p>
<h3 id="install-jspm">Install JSPM</h3>
<p>JSPM can be installed as a local <code>NPM</code> package.</p>
<ol>
<li>Open <code>package.json</code></li>
<li>Add <code>JSPM</code> at whatever the latest version is:</li>
</ol>
<p><img src="/img/addjspmnodejspackage.PNG" class="img-fluid" alt="addjspmnodejspackage.PNG" /></p>
<ol start="3">
<li>Save the file.</li>
</ol>
<p>The <code>NPM</code> package for <code>JSPM</code> should now be downloaded and installed into your project. You will see that the package is installed into the "node_modules" folder within your project.</p>
<p><img src="/img/nodemodulesfolderjspm.PNG" class="img-fluid" alt="nodemodulesfolderjspm.PNG" /></p>
<h3 id="configure-jspm">Configure JSPM</h3>
<p>Now that the <code>JSPM</code> package has been installed, we need to configure <code>JSPM</code>.
The way to do this is a little bit fiddly, as you have to drop to the command line - there is no fancy support for <code>JSPM</code> in Visual Studio at the moment like there is for <code>Bower</code>.</p>
<ol>
<li>Open a <code>command prompt</code> window, and <code>CD</code> to your project directory</li>
<li>Type <code>jspm init</code> and hit enter.</li>
</ol>
<p><img src="/img/commandlinejspminit.PNG" class="img-fluid" alt="commandlinejspminit.PNG" /></p>
<p>You will now be asked a series of questions. At the end of answering these questions, the relevant <code>config</code> will be produced within the project.</p>
<p>Here are the answers. Some of them you can just hit enter without typing anything, and the default value will be used.</p>
<p><img src="/img/jspminit.PNG" class="img-fluid" alt="jspminit.PNG" /></p>
<p>I'll quickly run through each option briefly, but you should defer to the <code>JSPM</code> documentation site for further clarification.</p>
<ol>
<li><p>"<strong>Would you like jspm to prefix the jspm package.json properties under jspm?</strong>"
We answer yes to this (the default) and this just means that JSPM will store its project configuration within a "jspm" section in our existing <code>package.json</code> file.</p>
</li>
<li><p>"<strong>Enter server baseURL (public folder path)</strong>"
The word URL is a bit confusing here. This is the relative path to your "public" folder within the project. By public folder, we mean the folder that will serve up static files and is therefore accessible to a browser. We need to set this to the path to our <code>wwwroot</code> directory. So the value we set for this question is <code>./wwwroot</code> as the value is relative to the current (project) directory.</p>
</li>
<li><p>"<strong>Enter jspm packages folder [wwwroot\jspm_packages]</strong>"
We accept the default value for this question. Previously, our Bower packages were installed under <code>wwwroot\lib</code> folder, so if you want to keep this consistent you could change this value to <code>wwwroot\lib</code>. I however am happy to keep the default.</p>
</li>
<li><p>"<strong>Enter config file path [wwwroot\config.js]</strong>"
This is the path to where you would like the config javascript file to be placed. Remember, <code>JSPM</code> is not just a package manager in a the sense of allowing you to adopt packages at <code>design time</code>. It also has features that are used your application when it runs. This means it has a <code>config</code> file (a javascript file) that your application will actually need to reference at runtime. This config file must therefore be placed in a directory that can be served up. We accept the default value (wwwroot\config.js)</p>
</li>
<li><p>"<strong>Configuration file wwwroot\config.js doesn't exist, create it? [yes]</strong>"
We accept the default which is <code>yes</code> as we want it to create this config file for us.</p>
</li>
<li><p>"<strong>Enter client baseURL (public folder URL) [/]</strong>"
This is the URL or path that the browser uses to browse to the public folder (wwwroot). We accept the default value, because our public folder (wwwroot) is served up as the root path ("/").</p>
</li>
<li><p>"<strong>Do you wish to use a transpiler? [yes]</strong>"
We accept the default answer of "yes" because transpilers are awesome. They allow us to write javascript using the latest language specifications, and then they "transpile" that javascript so that it can run in browsers that don't support the latest language specifications yet.</p>
</li>
<li><p>"<strong>Which ES6 transpiler would you like to use, Babel, Typescript, or Traceur? [babel]</strong>"
For the purposes of this blog, I am accepting the default of "Babel".</p>
</li>
</ol>
<p>The transpiler will just allow us to write javascript using ES6 language features, and this will be transpiled to run in browsers that don't support ES6 yet.</p>
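<p>For reference, answering "yes" to the first question means your <code>package.json</code> ends up with a <code>jspm</code> section along these lines (a sketch only - the exact keys and values depend on your answers and your JSPM version):</p>
<pre><code class="language-js">{
  "jspm": {
    "directories": {
      "baseURL": "wwwroot",
      "packages": "wwwroot/jspm_packages"
    },
    "configFile": "wwwroot/config.js",
    "dependencies": {}
  }
}
</code></pre>
<p>Any packages we install with <code>jspm install</code> will be recorded under that <code>dependencies</code> section.</p>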
<h3 id="installing-jspm-packages">Installing JSPM Packages</h3>
<p>Now that we have <code>JSPM</code> configured, it's time to install those packages that we previously had installed via <code>Bower</code>.</p>
<p>Back in the <code>command prompt</code> run the following commands:</p>
<ol>
<li><code>jspm install jquery</code></li>
<li><code>jspm install jquery-validation</code></li>
<li><code>jspm install github:aspnet/jquery-validation-unobtrusive</code></li>
<li><code>jspm install bootstrap</code></li>
</ol>
<p>Once that is done, those packages will now be installed under your <code>wwwroot\jspm_packages</code> folder:</p>
<p><img src="/img/jspmpackages.PNG" class="img-fluid" alt="jspmpackages.PNG" /></p>
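<p>Each install also records a mapping in <code>config.js</code>, which is how the module loader resolves a short name like <code>jquery</code> to the installed package. The registry paths and version numbers below are illustrative only - yours will reflect whatever JSPM resolved at install time:</p>
<pre><code class="language-js">System.config({
  map: {
    // illustrative versions - check your own config.js
    "bootstrap": "github:twbs/bootstrap@3.3.6",
    "jquery": "github:components/jquery@2.1.4",
    "jquery-validation": "github:jzaefferer/jquery-validation@1.14.0"
  }
});
</code></pre>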
<p>The next step is to fix up our MVC application so that it loads our javascript and css using the <code>module loader</code>.</p>
<h3 id="transitioning-to-modules">Transitioning to Modules.</h3>
<p>The changes we have been making up until now have been about managing the packages in our project at design time. This next step is about changing our application so that, rather than including javascript and css files directly into particular pages, we instead write "modular" javascript that declares any dependencies it has, and then allow a <code>module loader</code> (<code>SystemJS</code>) to satisfy those dependencies for us at runtime, loading whatever javascript / css our module requires.</p>
<p>If this sounds overwhelming, don't worry - it's simple once you get your head around the basic concept. Hopefully things will become clearer as we continue.</p>
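<p>If it helps, the essence of what a module loader does can be sketched in a few lines of plain javascript. To be clear, this is <em>not</em> how <code>SystemJS</code> is implemented - it's just an illustration of the core idea: modules declare their dependencies, and the loader resolves them before running the module.</p>

```javascript
// A toy module registry illustrating the idea behind a module loader.
// NOT SystemJS - just a sketch of dependency resolution.
var registry = {};
var cache = {};

// Register a module by name, listing the names of its dependencies.
function define(name, deps, factory) {
  registry[name] = { deps: deps, factory: factory };
}

// Load a module, recursively loading its dependencies first.
function load(name) {
  if (cache[name]) return cache[name];
  var mod = registry[name];
  var resolved = mod.deps.map(function (dep) { return load(dep); });
  cache[name] = mod.factory.apply(null, resolved);
  return cache[name];
}

// "jquery" and "bootstrap" stand in for the real packages here.
define("jquery", [], function () {
  return { name: "jquery" };
});
define("bootstrap", ["jquery"], function ($) {
  return { name: "bootstrap", usedJquery: $.name };
});
// Our site module declares both as dependencies, just as site.js will.
define("js/site", ["jquery", "bootstrap"], function ($, bootstrap) {
  return { loaded: [$.name, bootstrap.name] };
});

var site = load("js/site");
console.log(site.loaded);
```

<p>Asking the registry for <code>js/site</code> causes <code>jquery</code> and <code>bootstrap</code> to be loaded first - which is exactly the behaviour we are about to set up for real.</p>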
<p>First, we need to include the module loader itself, and its configuration file, into our application.</p>
<p>If you open <code>_Layout.cshtml</code> you will see a section like this:</p>
<pre><code class="language-xml"> <environment names="Development">
<script src="~/lib/jquery/dist/jquery.js"></script>
<script src="~/lib/bootstrap/dist/js/bootstrap.js"></script>
<script src="~/js/site.js" asp-append-version="true"></script>
</environment>
<environment names="Staging,Production">
<script src="https://ajax.aspnetcdn.com/ajax/jquery/jquery-2.1.4.min.js"
asp-fallback-src="~/lib/jquery/dist/jquery.min.js"
asp-fallback-test="window.jQuery">
</script>
<script src="https://ajax.aspnetcdn.com/ajax/bootstrap/3.3.5/bootstrap.min.js"
asp-fallback-src="~/lib/bootstrap/dist/js/bootstrap.min.js"
asp-fallback-test="window.jQuery && window.jQuery.fn && window.jQuery.fn.modal">
</script>
<script src="~/js/site.min.js" asp-append-version="true"></script>
</environment>
</code></pre>
<p>Let's comment out that whole section and replace it with this:</p>
<pre><code class="language-xml">
<script src="~/jspm_packages/system.js"></script>
<script src="~/config.js"></script>
<script>System.import("js/site");</script>
</code></pre>
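<p>One thing worth knowing: <code>System.import</code> returns a promise, so if a module fails to load, the failure can be silent. While developing, it can be useful to log failures explicitly - a minimal sketch:</p>
<pre><code class="language-js"><script>
  // Log module loading failures to the browser console.
  System.import("js/site").catch(function (err) {
    console.error(err);
  });
</script>
</code></pre>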
<p>At this point, let's run the application!</p>
<p>You should now be able to see that we no longer get any errors about failing to load javascript files <code>jquery.js</code> and <code>bootstrap.js</code>. In fact, <strong>those javascript files are not being loaded anymore.</strong></p>
<p><img src="/img/jspmmissingcss.PNG" class="img-fluid" alt="jspmmissingcss.PNG" /></p>
<h4 id="why-arent-we-loading-jquery-and-bootstrap-anymore">Why aren't we loading JQuery and Bootstrap anymore?</h4>
<p>We are no longer directly including the <code>bootstrap</code> and <code>jquery</code> scripts in our <code>_Layout.cshtml</code> file, so they aren't being loaded. That's why we no longer see any 404's in the browser console window - which is good, but don't we need those files for our site to function?</p>
<p>This is the nature of <code>modular</code> javascript. What we are transitioning to is a modular approach, in which <code>bootstrap</code> and <code>jquery</code> are modules that will only be loaded if some other module that we load via the module loader requires them as dependencies.</p>
<p>With that in mind, let's look at the module we are currently loading via the module loader - one called <code>js/site</code>:</p>
<pre><code class="language-js"><script>System.import("js/site");</script>
</code></pre>
<p>This resolves (thanks to the <code>config.js</code> file) to the <code>js/site.js</code> file in our <code>wwwroot</code> directory. This file is currently empty, so it declares no dependencies on any other modules. This is why the module loader no longer loads <code>JQuery</code> or <code>Bootstrap</code>.</p>
<p>This is good, because we are not including any javascript or css by default anymore, until it's actually required by something (with the exception of the module loader and the config.js file itself).</p>
<p>Therefore, as our <code>js/site</code> module is loaded in our <code>_Layout.cshtml</code> file - which means it's loaded on <strong>every page</strong> - we can "force" JQuery and Bootstrap to be loaded on every page by declaring them as dependencies of our module. This could be viewed as a bit of a cheat, as really we don't want to load dependencies just for the sake of it; we only want to load them if they are actually used.</p>
<p>So, let's now assume that we are willing to load <code>JQuery</code>, and <code>Bootstrap</code> as a dependency for every page:</p>
<ol>
<li>Open <code>site.js</code> and insert the following code, then save it and re-run the application:</li>
</ol>
<pre><code class="language-js">import $ from 'jquery';
import bootstrap from 'bootstrap';
</code></pre>
<p>This is <code>ES6</code> syntax for declaring a module dependency.</p>
<p>You should now see that <code>JQuery</code> and <code>Bootstrap</code> are loaded on every page:</p>
<p><img src="/img/jspmjqueryandbootstrapdependency.PNG" class="img-fluid" alt="jspmjqueryandbootstrapdependency.PNG" /></p>
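<p>With <code>jquery</code> now being loaded as a dependency, <code>site.js</code> is free to use it. For example (purely illustrative - the project doesn't need this code):</p>
<pre><code class="language-js">import $ from 'jquery';
import bootstrap from 'bootstrap';

// The module loader guarantees jquery is loaded before this module runs.
$(function () {
  console.log('site.js loaded, jquery version: ' + $.fn.jquery);
});
</code></pre>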
<h3 id="what-about-css">What about CSS</h3>
<p>Now that we have our javascript files loading again, we are still left with a 404 for the bootstrap.css file.</p>
<p>Well we can use JSPM for CSS too, but we need to install the <code>CSS</code> plugin.</p>
<h4 id="installing-the-css-plugin-for-jspm">Installing the CSS plugin for JSPM</h4>
<p>Back in the <code>command prompt</code> in your project directory, run the following:</p>
<pre><code class="language-bat">jspm install css
</code></pre>
<p>Now go back to your <code>site.js</code> file, and add an import for the bootstrap.css. It should now look like this:</p>
<pre><code class="language-js">import $ from 'jquery';
import bootstrap from 'bootstrap';
import 'bootstrap/css/bootstrap.css!';
</code></pre>
<p>Lastly, in <code>_Layout.cshtml</code>, comment out the link to the old, non-existent bootstrap.css file:</p>
<pre><code class="language-xml">
<environment names="Development">
@*<link rel="stylesheet" href="~/lib/bootstrap/dist/css/bootstrap.css" />*@
<link rel="stylesheet" href="~/css/site.css" />
</environment>
<environment names="Staging,Production">
@*<link rel="stylesheet" href="https://ajax.aspnetcdn.com/ajax/bootstrap/3.3.5/css/bootstrap.min.css"
asp-fallback-href="~/lib/bootstrap/dist/css/bootstrap.min.css"
asp-fallback-test-class="sr-only" asp-fallback-test-property="position" asp-fallback-test-value="absolute" />*@
<link rel="stylesheet" href="~/css/site.min.css" asp-append-version="true" />
</environment>
</code></pre>
<p>Now run your application!</p>
<p><img src="/img/jspmnoerrors.PNG" class="img-fluid" alt="jspmnoerrors.PNG" /></p>
<p>Wahoo! We now have no errors in the console window, our javascript and css is being loaded - and our application looks ok again.</p>
<h4 id="flash-of-unstyled-content">Flash of unstyled content</h4>
<p>You may notice that when using the CSS plugin, your page is displayed in an unstyled form for a brief moment whilst the CSS file is loaded asynchronously. This is known as a <a href="http://www.techrepublic.com/blog/web-designer/how-to-prevent-flash-of-unstyled-content-on-your-websites/">Flash of Unstyled Content</a> and is a problem with using the CSS plugin at present - <a href="https://github.com/systemjs/plugin-css/issues/57">see here</a>. Hopefully this will be addressed in the future, but in the meantime, feel free not to use the CSS plugin if this is an issue; you can instead just directly reference the <code>Bootstrap.css</code> file in the <code>_Layout.cshtml</code> file as before, but from its new location under the <code>jspm_packages</code> directory.</p>
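<p>For example, instead of the css import in <code>site.js</code>, you could link the stylesheet directly in <code>_Layout.cshtml</code>. The folder name under <code>jspm_packages</code> depends on the registry and version JSPM installed, so check your own <code>wwwroot\jspm_packages</code> directory - the path below is only an example:</p>
<pre><code class="language-xml"><link rel="stylesheet" href="~/jspm_packages/github/twbs/bootstrap@3.3.6/css/bootstrap.css" />
</code></pre>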
<h3 id="finishing-touches_validationscripspartial.cshtml">Finishing Touches - <code>_ValidationScriptsPartial.cshtml</code></h3>
<p>Our application is running again, but you may notice a few of the pages have errors.</p>
<p>If you click on "Register" link for example you will see these errors in the browser console window:</p>
<p><img src="/img/jspmregisterpageproblems.PNG" class="img-fluid" alt="jspmregisterpageproblems.PNG" /></p>
<p>This is because many of the views within our MVC application are rendering a partial called <code>_ValidationScriptsPartial.cshtml</code></p>
<p>For example, if you look at the bottom of <code>Register.cshtml</code>, you will see the following:</p>
<pre><code class="language-csharp">
@section Scripts {
@{ await Html.RenderPartialAsync("_ValidationScriptsPartial"); }
}
</code></pre>
<p>If you look at the contents of <code>_ValidationScriptsPartial</code> we can see that it is actually including additional scripts onto the page:</p>
<pre><code class="language-xml"><environment names="Development">
<script src="~/lib/jquery-validation/dist/jquery.validate.js"></script>
<script src="~/lib/jquery-validation-unobtrusive/jquery.validate.unobtrusive.js"></script>
</environment>
<environment names="Staging,Production">
<script src="https://ajax.aspnetcdn.com/ajax/jquery.validate/1.14.0/jquery.validate.min.js"
asp-fallback-src="~/lib/jquery-validation/dist/jquery.validate.min.js"
asp-fallback-test="window.jQuery && window.jQuery.validator">
</script>
<script src="https://ajax.aspnetcdn.com/ajax/mvc/5.2.3/jquery.validate.unobtrusive.min.js"
asp-fallback-src="~/lib/jquery-validation-unobtrusive/jquery.validate.unobtrusive.min.js"
asp-fallback-test="window.jQuery && window.jQuery.validator && window.jQuery.validator.unobtrusive">
</script>
</environment>
</code></pre>
<p>As you can see, depending upon the environment that ASP.NET determines your application is running in, this renders script includes for particular js files used for forms validation. These were previously located within the <code>Bower</code> packages that we have deleted.</p>
<p>To correct this, we'll just need to instruct the module loader to load the <code>jquery.validate.unobtrusive</code> module instead. Notice that you don't need to also instruct the module loader to load the <code>jquery.validate</code> module, because <code>jquery.validate</code> is a dependency of <code>jquery.validate.unobtrusive</code> so the module loader will resolve it automatically.</p>
<p>So change the contents of <code>_ValidationScriptsPartial.cshtml</code> to this:</p>
<pre><code class="language-js"><script>System.import("aspnet/jquery-validation-unobtrusive");</script>
</code></pre>
<p>And now - everything is working!</p>
<p><img src="/img/jspmallworking.PNG" class="img-fluid" alt="jspmallworking.PNG" /></p>
<h3 id="recap">Recap</h3>
<p>In this blog post, we took the out of the box ASP.NET 5 MVC application that we created in <a href="http://darrelltunnell.net/blog/2015/08/16/aurelia-and-asp-net-5-mvc">part 1</a>, and replaced <code>Bower</code> with <code>JSPM</code>. We changed the way our application resolves its javascript and css files, to use a <code>module loader</code> instead.</p>
<p>We also saw that using the module loader to load CSS currently results in a "flash of unstyled content" issue, and so if that's an issue for your application then it's probably best to stick to directly linking to your css files as before, for the time being. That's a decision for you to make!</p>
<p>In the next blog post/s in this series, I will attempt to cover:</p>
<ol>
<li>Creating a basic Aurelia application on the Home page.</li>
<li>Introducing Linting, Bundling, and Minification into the build process, using Gulp.</li>
<li>Implementing "Automatic Browser Refresh" so our page refreshes as we make changes to javascript and css files.</li>
</ol>
<p><strong>This post is part two of a series. For part one see <a href="http://darrelltunnell.net/blog/2015/08/16/aurelia-and-asp-net-5-mvc">here</a></strong></p>
http://darrelltunnell.net/blog/2016/01/17/imagining-a-dotnetnuke-project-type-for-visual-studioImagining a DotNetNuke Project Type for Visual Studio2016-01-17T17:50:00Z<h3 id="introduction">Introduction</h3>
<p>When developing DotNetNuke extensions, we typically use one of the existing Visual Studio Project Types - for example, an ASP.NET Web Application project.</p>
<p>Even when using a Project Template such as Christoc's, the project template is still based upon one of the standard Visual Studio project types - usually an ASP.NET Web Application project.</p>
<p>However these Project Types do not "gel" well with DotNetNuke development in a number of areas, the main ones being:</p>
<ol>
<li>Running the project in VS (clicking play) - wants to run the extensions as a Web Application, but this makes no sense for a Dnn extension - which has to be hosted by the DotNetNuke website.</li>
<li>Deploying the extension - there is no support for that in the project system - you have to manually deploy your extensions to the Dnn instance.</li>
<li>Debugging the extension - you have to manually attach to the process.</li>
</ol>
<p>So.. what if there was a new Project Type, one that was purpose built for DotNetNuke development? What would that look like?</p>
<!--more-->
<h3 id="introducing-the-dotnetnuke-project-type">Introducing the "DotNetNuke" Project Type</h3>
<p>I am currently developing a new VS Project Type explicitly for DotNetNuke development. The rest of this blog post will describe my vision for how this will work.</p>
<h4 id="installing-the-project-type">Installing the Project Type</h4>
<p>You would start by installing the VSIX package from the VS gallery. This will install the DotNetNuke project type, and make this project type available to you when you create new projects in VS.</p>
<h4 id="create-a-new-project">Create a New Project</h4>
<p>You can now create a new "DotNetNuke" project using Visual Studio.</p>
<p><img src="/img/new%20dnn%20project.PNG" class="img-fluid" alt="new dnn project.PNG" /></p>
<p>This creates your new project. It also imports the "DnnPackager" NuGet package automatically - <a href="http://darrelltunnell.net/blog/2015/12/01/dnnpackager-getting-started/">something I have blogged about separately.</a></p>
<p><img src="/img/adding%20DnnPackager.PNG" class="img-fluid" alt="adding DnnPackager.PNG" /></p>
<p>Your new project has its own ".dnnproj" file. Because this is a new project type, it gets its own file extension - ".dnnproj".</p>
<p><img src="/img/SolutionExplorer1.PNG" class="img-fluid" alt="SolutionExplorer1.PNG" /></p>
<h4 id="adding-content">Adding Content</h4>
<p>You can now add items to your project. If you "Add new item" - you will be able to select from a number of standard DotNetNuke item templates, for example a "Module View".</p>
<p><img src="/img/AddModuleView.png" class="img-fluid" alt="AddModuleView.png" /></p>
<p>Initially, I will just show Dnn 7 compatible item templates, but eventually I'd also like to add a separate group for Dnn 8 item templates, which would include item templates for the new MVC and SPA stuff.</p>
<p>When you add the new item, not only do the source code files get added to your project, but any required dependencies also get brought in by the magical power of NuGet:</p>
<p><img src="/img/AddingDotNetNukeCoreNuget.PNG" class="img-fluid" alt="AddingDotNetNukeCoreNuget.PNG" /></p>
<p>So for example, adding a Module View for Dnn 7, will automatically bring in the DotNetNuke.Core NuGet package for Dnn7 as depicted above.</p>
<p>In other words, you don't need to worry about adding any Dnn assembly references for the most part, as they will be brought in for you as you add items to your project. Of course, you are still free to add references to other dependencies as normal.</p>
<h4 id="running-and-debugging">Running and Debugging</h4>
<p>When you want to run and debug your extension, for those of you that have read my previous blog about DnnPackager, you may recall that this could be accomplished via a command that you could enter in the Package Manager Console window and DnnPackager would handle the deployment and attaching the debugger.</p>
<p>Well that approach was only ever necessary because there was not any first class support within VS itself. Something I am going to rectify with the DotNetNuke project type.</p>
<p>In VS, I am going to extend the debugging toolbar (where the "play" button is):</p>
<p><img src="/img/debug%20toolbar.PNG" class="img-fluid" alt="debug toolbar.PNG" /></p>
<p>You can see in the screenshot that there is an empty dropdown at present; this will list the DotNetNuke websites on your local IIS. The first one in the list will be selected by default.</p>
<p>You may also notice that there is a new debugger selected in that screenshot, called "Local Dnn Website". That's my own custom debugger, available only for this project type.</p>
<p>All you need to do is click "Play" and it will:</p>
<ol>
<li>Build your project to output the deployment zip.</li>
<li>Deploy your install zip to the Dnn website selected in the dropdown.</li>
<li>Attach the debugger to the worker process of the Dnn website selected in the dropdown.</li>
<li>Launch a new browser window, navigated to that Dnn website's home page.</li>
</ol>
<p>Therefore, to use a different Dnn website as the host for running and debugging your module, you would just select that website in the drop down instead, before you click the "play" button.</p>
<p>This is going to be wayyyy better than previous workflows for Dnn development.</p>
<h3 id="what-now">What Now?</h3>
<p>Well.. I am pretty far into the development of this at the moment, which is why I have been able to include some screenshots. However it is a steep learning curve, and I am continuously hitting hurdles with <a href="https://github.com/Microsoft/VSProjectSystem">Microsoft's new Project System (CPS)</a>. This is my first attempt at developing a VS project type and I don't have any inroads with Microsoft, or any support. All of this means I am "hoping" I can pull this off - the signs are promising, but I'm not out of the woods yet. The (very) dark, mystical woods, of VS project type development.</p>
<p>Still, I'd love to hear what others think of this - even though I appreciate it's very premature. Would you use such a system? Any ideas for improvements? I'll release a new blog post when things are looking a bit more finalised, and perhaps again when I have something for beta release.</p>
<p>Lastly, if there are any gurus out there who have expertise with <a href="https://github.com/Microsoft/VSProjectSystem">CPS</a> - I can always use a hand ;)</p>
http://darrelltunnell.net/blog/2015/12/01/dnnpackager-getting-startedDnnPackager Getting Started2015-12-01T17:50:00Z<h3 id="dnn-packager-super-smooth-module-development">Dnn Packager - Super Smooth Module Development</h3>
<p>In this post, I am going to show you how to get up and running with DnnPackager for your DotNetNuke module / extension development.</p>
<h3 id="tools-of-the-trade">Tools of the Trade</h3>
<p>I am using VS2015 Community Edition, but this should work equally well with previous versions.</p>
<!--more-->
<h3 id="installing-dotnetnuke-locally">Installing DotNetNuke Locally</h3>
<p>You will need a local instance of DotNetNuke website installed so that you have somewhere to deploy / run and debug your modules. There are plenty of tutorials out there that cover how to install a Dnn website so I am not going to cover this here. If you think this would be useful, leave a comment below and I might consider it! Once you have a working Dnn Website installed under your local IIS - please read on!</p>
<h3 id="create-a-project">Create a Project</h3>
<p>Open Visual Studio, and Create a New "ASP.NET Empty Web Application" project. Make sure you select ".NET 4" from the drop down at the top.</p>
<p><img src="/img/NewAspNetProject.PNG" class="img-fluid" alt="New Project" /></p>
<p>Note: Create your project wherever you like - where you put your source code - that's your business!</p>
<h3 id="tweak-web-project">Tweak Web Project</h3>
<p>The reason we chose to create a web project in the previous step, rather than, say, a library project, is just so that we have the appropriate context menu options in Visual Studio for doing things like adding javascript and ascx files. This is generally handy for Dnn module development. However, our project cannot actually run as a "standalone website" - we are developing a Dnn module, which can only run within the context of the Dnn website that is hosting it. The approach described in this blog should work equally well if you prefer to create other types of projects - but then you might not have those familiar menu options available, so you may have to add things like javascript files to your project by hand.</p>
<p>Select the project in Solution Explorer window, then in the properties window, change "Always Start When Debugging" to false.</p>
<p><img src="/img/alwaysstartwhendebuggingfalse.PNG" class="img-fluid" alt="alwaysstartwhendebuggingfalse.PNG" /></p>
<p>This will help later as it will prevent Visual Studio from needlessly trying to host your module project as its own website whenever you try and debug your module - which will be running in your local Dnn website instead.</p>
<h3 id="add-dnnpackager-nuget-package">Add DnnPackager NuGet Package</h3>
<p>Open the Package Manager Console (Tools --> NuGet Package Manager --> Package Manager Console) and, with your project selected in the "Default Project" dropdown, type the following command and hit enter to install the DnnPackager NuGet package:</p>
<pre><code class="language-powershell">
Install-Package DnnPackager
</code></pre>
<p><img src="/img/NuGetConsoleAddDnnPackagerNuGet.PNG" class="img-fluid" alt="NuGetConsoleAddDnnPackagerNuGet.PNG" /></p>
<p>This will add some new items to your project, and to your solution. I will cover what these are for later.</p>
<p><img src="/img/ProjectAfterAddingDnnPackager.PNG" class="img-fluid" alt="ProjectAfterAddingDnnPackager.PNG" /></p>
<h3 id="dnn-sdk-assemblies">Dnn Sdk Assemblies</h3>
<p>In order to proceed with Dnn development, we will need to add references to the Dnn assemblies. The version of DotNetNuke you want your extension to be compatible with will often determine which version of the Dnn assemblies you need to reference.</p>
<p>For the sake of this blog post, I am going to assume that you are targeting the latest version of Dnn (at the time of writing, Dnn 7).</p>
<p>Using the Package Manager Console again:</p>
<pre><code class="language-powershell">
Install-Package DotNetNuke.Core
</code></pre>
<p>This should add a reference to the DotNetNuke assembly to your project's references:</p>
<p><img src="/img/ReferencesAfterAddingDnnCore.PNG" class="img-fluid" alt="ReferencesAfterAddingDnnCore.PNG" /></p>
<h3 id="lets-develop-a-module">Let's Develop a Module!</h3>
<p>Now we have got most of the setup out of the way, it's time to get cracking on our module!</p>
<p>First add a new User Control to the project. This is going to be the default UI for our super cool DNN module.</p>
<p><img src="/img/AddUserControl.PNG" class="img-fluid" alt="AddUserControl.PNG" /></p>
<p>We then need to change our new User Control to make it inherit from <code>PortalModuleBase</code> rather than <code>System.Web.UI.UserControl</code>.</p>
<p>So change this:</p>
<pre><code class="language-csharp">
namespace MySuperModule
{
public partial class Default : System.Web.UI.UserControl
{
protected void Page_Load(object sender, EventArgs e)
{
}
}
}
</code></pre>
<p>To this:</p>
<pre><code class="language-csharp">using DotNetNuke.Entities.Modules;
namespace MySuperModule
{
public partial class Default : PortalModuleBase
{
protected void Page_Load(object sender, EventArgs e)
{
}
}
}
</code></pre>
<p><strong>Don't forget to add the 'using' statement depicted above!</strong></p>
<h3 id="making-an-awesome-module">Making an awesome module</h3>
<p>Further development of this super awesome module is beyond the scope of this post, so I am just going to make it display something really simple for the time being. There are <a href="http://www.dnnsoftware.com/community-blog/cid/141749/dotnetnuke-module-development-101-5--hello-world-3-using-visual-studio-to-create-a-module">plenty of other resources</a> out there for learning about Dnn module development. For now let's make it display some text.</p>
<p>Add the following h1 content to your markup for the user control:</p>
<pre><code class="language-html">
<%@ Control Language="C#" AutoEventWireup="true" CodeBehind="Default.ascx.cs" Inherits="MySuperModule.Default" %>
<h1>I came, I read a blog, I conquered!</h1>
</code></pre>
<h3 id="module-manifest">Module Manifest</h3>
<p>Now that we have this incredible... work of art, naturally we want to run it and test it out. In order to do this though, we first need to make sure our module is going to identify itself with DotNetNuke correctly. This means it should have a manifest.</p>
<p>One of the files that was automatically added to your project when you added the DnnPackager NuGet package was: manifest.dnn</p>
<p>Open up manifest.dnn and replace the values in square brackets with appropriate values. You only need to do this once.</p>
<p>For example, you will see something that looks like this:</p>
<pre><code class="language-xml">
<dotnetnuke type="Package" version="6.0">
<packages>
<package name="[YourPackageName]" type="Module" version="0.0.0">
<friendlyName>[FriendlyPackageName]</friendlyName>
<description></description>
<owner>
<name>[OwnerName]</name>
<organization>[OrganizationName]</organization>
<url>http://www.someurl.com</url>
<email><![CDATA[<a href="mailto:support@someorg.com">support@someorg.com</a>]]></email>
</owner>
<license src="License.lic">
</license>
<releaseNotes src="ReleaseNotes.txt">
</releaseNotes>
<dependencies>
</dependencies>
<components>
<component type="Module">
<desktopModule>
<moduleName>[YourModuleName]</moduleName>
<foldername>[FolderName]</foldername>
<businessControllerClass />
<supportedFeatures />
<moduleDefinitions>
<moduleDefinition>
<friendlyName>[Friendly Module Name]</friendlyName>
<defaultCacheTime>60</defaultCacheTime>
<moduleControls>
<moduleControl>
<controlKey>
</controlKey>
<controlSrc>[YourControllerOrPathToView]/[YourViewFileName].[YourViewFileExtension]</controlSrc>
<supportsPartialRendering>False</supportsPartialRendering>
<controlTitle>[Default title when added to page]</controlTitle>
<controlType>View</controlType>
<helpUrl>
</helpUrl>
</moduleControl>
<moduleControl>
<controlKey>settings</controlKey>
<controlSrc>[YourControllerOrPathToSettings]/[YourSettingsFileName].[YourSettingsFileExtension]</controlSrc>
<supportsPartialRendering>False</supportsPartialRendering>
<controlTitle>[Default settings title]</controlTitle>
<controlType>View</controlType>
<helpUrl>
</helpUrl>
</moduleControl>
</moduleControls>
<permissions>
</permissions>
</moduleDefinition>
</moduleDefinitions>
</desktopModule>
</component>
<component type="Assembly">
<assemblies>
<assembly>
<path>bin</path>
<name>[YourAssembly.dll]</name>
</assembly>
</assemblies>
</component>
<component type="ResourceFile">
<resourceFiles>
<basePath>DesktopModules/[FolderName]</basePath>
<resourceFile>
<name>Resources.zip</name>
</resourceFile>
</resourceFiles>
</component>
</components>
</package>
</packages>
</dotnetnuke>
</code></pre>
<p>Fill it in so it looks more like this:</p>
<pre><code class="language-xml">
<dotnetnuke type="Package" version="6.0">
<packages>
<package name="MySuperModule" type="Module" version="0.0.1">
<friendlyName>MySuperModule</friendlyName>
<description>Makes the internet work</description>
<owner>
<name>Darrell Tunnell</name>
<organization>Dazinator</organization>
<url>http://darrelltunnell.net</url>
<email><![CDATA[<a href="mailto:support@someorg.com">support@someorg.com</a>]]></email>
</owner>
<license src="License.lic">
</license>
<releaseNotes src="ReleaseNotes.txt">
</releaseNotes>
<dependencies>
</dependencies>
<components>
<component type="Module">
<desktopModule>
<moduleName>MySuperModule</moduleName>
<foldername>MySuperModule</foldername>
<businessControllerClass />
<supportedFeatures />
<moduleDefinitions>
<moduleDefinition>
<friendlyName>MySuperModule</friendlyName>
<defaultCacheTime>-1</defaultCacheTime>
<moduleControls>
<moduleControl>
<controlKey>
</controlKey>
<controlSrc>DesktopModules/MySuperModule/Default.ascx</controlSrc>
<supportsPartialRendering>False</supportsPartialRendering>
<controlTitle>Hello</controlTitle>
<controlType>View</controlType>
<helpUrl>
</helpUrl>
</moduleControl>
</moduleControls>
<permissions>
</permissions>
</moduleDefinition>
</moduleDefinitions>
</desktopModule>
</component>
<component type="Assembly">
<assemblies>
<assembly>
<path>bin</path>
<name>MySuperModule.dll</name>
</assembly>
</assemblies>
</component>
<component type="ResourceFile">
<resourceFiles>
<basePath>DesktopModules/MySuperModule</basePath>
<resourceFile>
<name>Resources.zip</name>
</resourceFile>
</resourceFiles>
</component>
</components>
</package>
</packages>
</dotnetnuke>
</code></pre>
<p>Note: I removed the entry for the "settings" for our module as we don't have a settings screen in this example. I also changed the default cache time to -1, which disables caching.. just because I have a feeling this module is going to one day become a lot more dynamic and I don't want outdated content causing confusion :-)</p>
<p>Important: I also set the version number to 0.0.1. Version numbers matter because Dnn will not let you install an older version of a module over the top of a newer one: the version number in the manifest must be equal to, or greater than, the currently installed version for the install to succeed.</p>
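<p>That precedence rule amounts to an ordinary dotted-version comparison. Here is a minimal sketch (a hypothetical helper for illustration only; Dnn performs this check itself during installation):</p>
<pre><code class="language-javascript">// Compare two dotted version strings, e.g. "0.0.1" vs "1.2.0".
// Returns -1, 0 or 1 when a is lower than, equal to, or higher than b.
function compareVersions(a, b) {
  const pa = a.split('.').map(Number);
  const pb = b.split('.').map(Number);
  const len = Math.max(pa.length, pb.length);
  for (let i = 0; i < len; i++) {
    const x = pa[i] || 0;
    const y = pb[i] || 0;
    if (x !== y) return x < y ? -1 : 1;
  }
  return 0;
}

// Dnn's rule: the manifest version must be >= the installed version.
function canInstall(manifestVersion, installedVersion) {
  return compareVersions(manifestVersion, installedVersion) >= 0;
}

console.log(canInstall('0.0.2', '0.0.1')); // true - upgrade allowed
console.log(canInstall('0.0.1', '0.0.1')); // true - same version reinstalls
console.log(canInstall('0.0.1', '1.0.0')); // false - downgrade rejected
</code></pre>
<p>So when in doubt, bump the version in the manifest before redeploying.</p>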
<h3 id="ready-to-roll">Ready to roll</h3>
<p>Sense that tension in the air? The excitement is building.. The entire blog post has been leading to this one, perfect, moment.</p>
<p>We are now going to deploy our module to our local Dnn website, and debug it.</p>
<h3 id="automating-deployment">Automating Deployment</h3>
<p>In VS, go to the "Package Manager Console" window, and make sure your project is selected from the projects dropdown.</p>
<ol>
<li>Type: <code>Install-Module [name of your website in IIS]</code> and hit enter.</li>
<li>Watch as your module project is built, packaged up as a zip, and then the zip is deployed to your local Dnn website!</li>
</ol>
<p>For example, on my IIS, the name of my Dnn website is "DotNetNuke"</p>
<p><img src="/img/IISDnnWebsite.PNG" class="img-fluid" alt="IISDnnWebsite.PNG" /></p>
<p>So I type into the Package Manager Console <code>Install-Module DotNetNuke</code> and hit enter.</p>
<p>After that completes, you can log in to your DotNetNuke website as Host, go to the Host-->Extensions page, and you should see that your module is now listed as an installed extension! Pretty cool!</p>
<p><img src="/img/hostextensionsmodules.PNG" class="img-fluid" alt="hostextensionsmodules.PNG" /></p>
<h3 id="setting-up-a-page-to-host-it">Setting up a Page to Host it</h3>
<p>Although our module has been installed onto our site, it won't display anywhere - because we need to tell DotNetNuke where it should be displayed!</p>
<p>This is a quick one time task, of simply creating a page in DotNetNuke to display our module.</p>
<ol>
<li>Login as Host</li>
<li>Pages --> Add New Page</li>
<li>Fill out page details and create it.</li>
<li>Modules --> Add New Module</li>
<li>Add your new module on to the page.</li>
</ol>
<p>You should see:</p>
<p><img src="/img/AddedModule.PNG" class="img-fluid" alt="AddedModule.PNG" /></p>
<h3 id="debugging-it-testing-changes">Debugging it & Testing Changes</h3>
<p>Let's make some changes. Add some more content:</p>
<pre><code class="language-html">
<h1>I came, I read a blog, I conquered!</h1>
<p>Everyday I'm shuffling!</p>
</code></pre>
<p>Add some code in your code behind, and set a breakpoint on it:</p>
<p><img src="/img/codebehindbreakpoint.PNG" class="img-fluid" alt="codebehindbreakpoint.PNG" /></p>
<p>Now deploy this very simply by placing your cursor in the Package Manager Console window, and hitting "up" arrow on your keyboard. This will bring up the last command:</p>
<pre><code class="language-powershell">
Install-Module DotNetNuke
</code></pre>
<p>Hit enter.</p>
<p>Once that completes, refresh the page displaying your module:</p>
<p><img src="/img/redeployedmodulewithchanges.PNG" class="img-fluid" alt="redeployedmodulewithchanges.PNG" /></p>
<p>Simples!</p>
<h3 id="but-wait-my-breakpoint-wasnt-hit">But wait - my breakpoint wasn't hit!</h3>
<p>That's because your module is being executed within the process running your DotNetNuke website. So what you need to do is "attach" the debugger to that process.</p>
<p>You can do this manually, or you can let DnnPackager do it for you. To let DnnPackager handle this, go back to the package manager console, and amend that command you are using, by adding on a couple of arguments:</p>
<pre><code class="language-powershell">
Install-Module DotNetNuke Debug Attach
</code></pre>
<p>So do that, and hit enter. You should see it deploy your module as before but this time it will also attach your debugger!</p>
<p>So.. refresh your page.. and BAM! Breakpoint is hit!</p>
<p><img src="/img/breakpointhit.PNG" class="img-fluid" alt="breakpointhit.PNG" /></p>
<p><strong>You need to be running Visual Studio as an Administrator before you can attach to the w3wp.exe process.</strong></p>
<p>The full syntax of the command is (values in braces are optional):</p>
<p><code>Install-Module [name of your website in IIS] {Build Configuration} {Attach}</code></p>
<p>However, if for some strange reason you'd prefer to attach to the process in some other way, you absolutely can do that - no one is "forcing" you to use the above command. There are VS extensions you can get that make attaching to IIS processes trivial. Otherwise, within VS, a quick way to do it is:</p>
<ol>
<li>Hit ctrl + alt + p</li>
<li>Tick show all processes (if it's not already ticked)</li>
<li>Select any process in the list, then hit "w" on your keyboard - this should scroll you to the "w3wp.exe" process.</li>
<li>Click "attach".</li>
</ol>
<p><img src="/img/attachtoprocess.PNG" class="img-fluid" alt="attachtoprocess.PNG" /></p>
<h3 id="what-about-if-i-just-want-my-zip-file">What about if I just want my Zip file</h3>
<p>If you just want the installation zip itself for some reason - perhaps you want to upload it to the Dnn store - just build your project as normal and check your project's output directory.</p>
<h3 id="issues">Issues?</h3>
<p>DnnPackager is open source on GitHub. Feel free to <a href="https://github.com/dazinator/DnnPackager">raise an issue</a>.</p>
<h3 id="in-summary">In Summary</h3>
<p>DnnPackager is an automation tool that I built to help streamline the Dnn module development workflow. Feel free to drop me a comment - does this tool help? Or have I missed my mark? Where could it be better? I'd love to hear suggestions.</p>
http://darrelltunnell.net/blog/2015/11/04/automating-android-unit-test-apps-xamarin-like-a-proAutomating Xamarin Android Unit Test Apps - Like a Pro2015-11-04T17:50:00Z<h3 id="first-off">First Off..</h3>
<p>This article is for those of you out there who use Xamarin to write Android applications and want to automate the process of running your tests on an Android device. I'll show you how you can set this up with relative ease.</p>
<p>Here is the process we want:</p>
<ul>
<li>CI Build Begins</li>
<li>Produces the APK file containing my tests.</li>
<li>Starts up an Emulator and boots an AVD</li>
<li>Installs the tests APK onto the Android Device (Emulated)</li>
<li>Kicks off the tests</li>
<li>Reports back the test results.</li>
<li>If using Team City, the tests all appear nicely in the UI - otherwise the results are in STDOUT.</li>
</ul>
<!--more-->
<h3 id="unit-test-app-android-and-its-shortcomings">Unit Test App (Android) - and its shortcomings</h3>
<p>It all begins with adding the unit tests project itself.
Xamarin have provided a project type in Visual Studio called a "Unit Test App". Add one of those projects to your Solution and define some tests.</p>
<p><img src="/img/New%20Android%20Unit%20Test%20Project.PNG" class="img-fluid" alt="New Android Unit Test Project.PNG" /></p>
<p>Here are some tests:</p>
<pre><code class="language-csharp">[TestFixture]
public class TestsSample
{
    [SetUp]
    public void Setup() { }

    [TearDown]
    public void Tear() { }

    [Test]
    public void Pass()
    {
        Console.WriteLine("test1");
        Assert.True(true);
    }

    [Test]
    public void Fail()
    {
        Assert.False(true);
    }

    [Test]
    [Ignore("another time")]
    public void Ignore()
    {
        Assert.True(false);
    }

    [Test]
    public void Inconclusive()
    {
        Assert.Inconclusive("Inconclusive");
    }
}
</code></pre>
<h3 id="shortcomings-of-running-these-tests">Shortcomings of Running these tests</h3>
<p>Naturally, you may be thinking: how do you now run these tests? Well, by default you have to run them manually. This is an app. Starting the tests project in VS is like starting any other Android application - it deploys the APK to your Android device and launches the app, which then shows a UI, and you must click various buttons on said UI to manually run the various tests that you want to run.</p>
<h3 id="an-enormous-pain-in-the-ass">An enormous pain in the ass..</h3>
<p>This, of course, is a ridiculous way forward, and we need to get these tests automated ASAP!</p>
<h3 id="the-short-answer">The short answer</h3>
<p>The short answer is that we need to take a few steps to get these tests automated. Read on..</p>
<h3 id="step-1-the-nuget-package">Step 1 - The NuGet Package</h3>
<p>I created a NuGet package called <a href="https://www.nuget.org/packages/Xamarin.TestyDroid/">TestyDroid</a>.</p>
<p>In order to write this tool, it's fair to say it has taken a fair bit of research and testing!</p>
<p>So - <a href="https://www.nuget.org/packages/Xamarin.TestyDroid/">Install the NuGet package to your tests project</a></p>
<p>It contains two things. Firstly, it contains a command line executable in its tools folder, called TestyDroid.exe. This little command line tool handles spinning up an emulator, installing your tests APK, running all of your tests and reporting the results, and lastly terminating the emulator once done.</p>
<p>Secondly, it contains an Android library that is added to your Android tests project as a reference. This library includes an improved base class that you will derive from instead of the default Xamarin one. We will cover this in the next step.</p>
<h3 id="step-2">Step 2</h3>
<p>After that is installed, we need to address how these tests get "launched" in the first place.</p>
<p>Android has the concept of "Instrumentation"</p>
<p><code>Instrumentation</code> are basically special types, that can be launched via an intent, and can run tests.</p>
<p>So, in order to "start" the tests running on the Android device (after the APK) has been installed, we need to create this "Instrumentation" class in our tests project.</p>
<p>Add the following class to your Tests project:</p>
<pre><code class="language-csharp">namespace Xamarin.TestyDroid.TestTests
{
    [Instrumentation(Name = "xamarin.testydroid.testtests.TestInstrumentation")]
    public class TestInstrumentation : TestyDroid.Android.TestyDroidTestSuiteInstrumentation
    {
        public TestInstrumentation(IntPtr handle, JniHandleOwnership transfer) : base(handle, transfer)
        {
        }

        protected override void AddTests()
        {
            AddTest(Assembly.GetExecutingAssembly());
        }
    }
}
</code></pre>
<p>Important to note (adjust the namespace appropriately) - the Instrumentation attribute above the class has a "Name" property. THIS IS VERY IMPORTANT. Make sure it matches your namespace + class name, but with the namespace in lower case.</p>
<p>So if you changed the namespace of this class to MyCoolApp.Tests
And you changed the Class Name of this class to MyCoolTestInstrumentation
Then the Attribute above the MyCoolTestInstrumentation class should look like this:</p>
<pre><code class="language-csharp"> [Instrumentation(Name = "mycoolapp.tests.MyCoolTestInstrumentation")]
public class MyCoolTestInstrumentation : TestyDroid.Android.TestyDroidTestSuiteInstrumentation
{
</code></pre>
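<p>In other words, the Name value is just the namespace lower-cased, joined to the unmodified class name. Purely to illustrate the convention (this helper is hypothetical, not part of TestyDroid):</p>
<pre><code class="language-javascript">// Build the [Instrumentation] Name value from a namespace and a class name:
// the namespace is lower-cased, the class name keeps its original casing.
function instrumentationName(namespaceName, className) {
  return namespaceName.toLowerCase() + '.' + className;
}

console.log(instrumentationName('MyCoolApp.Tests', 'MyCoolTestInstrumentation'));
// mycoolapp.tests.MyCoolTestInstrumentation
</code></pre>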
<h3 id="step-3-jot-things-down">Step 3 - Jot things down</h3>
<p>We now need to make a note of a few variables that we will need in order to call <code>TestyDroid.exe</code> to run our tests.</p>
<p>The first thing we need is the "class path" of your tests Instrumentation. This is the "Name" value of the [Instrumentation] attribute from the previous step. For example:</p>
<p><code>xamarin.testydroid.testtests.TestInstrumentation</code></p>
<p>The next thing we need is the Package name of your tests package. This you can grab from the <code>AndroidManifest.xml</code> file.</p>
<p>Here is mine:</p>
<pre><code class="language-xml"><?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android" package="Xamarin.TestyDroid.TestTests" android:versionCode="1" android:versionName="1.0">
<uses-sdk />
<application android:label="Xamarin.TestyDroid.TestTests" android:icon="@drawable/Icon"></application>
</manifest>
</code></pre>
<p>So the package name for my tests app is <code>Xamarin.TestyDroid.TestTests</code></p>
<p>You will also need to know some more general parameters about where things are in your environment:</p>
<ol>
<li>The path to <code>Adb.exe</code> (this is in your android-sdk\platform-tools directory)</li>
<li>The path to <code>Emulator.exe</code> (this is in your android-sdk\tools directory)</li>
<li>The path to your Tests APK file (I will give you a clue - it will probably be in your bin/release/ folder of your tests project!)</li>
<li>The name of the AVD that you would like to be launched in the emulator and used to run the tests on.</li>
</ol>
<p>Once you have these things, you are ready to give TestyDroid.exe a whirl!</p>
<h3 id="step-4-running-things-locally">Step 4 - Running things locally.</h3>
<p>Armed with the information in the previous step:</p>
<ol>
<li>Open up a command prompt.</li>
<li>CD to the tools directory of the Xamarin.TestyDroid nuget package you added to your project earlier. It should be something like "..path to your solution/packages/Xamarin.TestyDroid.x.x.x/tools/"</li>
<li>Run <code>Xamarin.TestyDroid.exe</code> with the arguments it needs! Look here for a breakdown of all the arguments: <a href="https://github.com/dazinator/Xamarin.TestyDroid">https://github.com/dazinator/Xamarin.TestyDroid</a> - or just execute it with the <code>--help</code> argument to see the help screen.</li>
</ol>
<p>Here is an example:</p>
<pre><code class="language-bat">Xamarin.TestyDroid.exe -e "C:\Program Files (x86)\Android\android-sdk\tools\emulator.exe" -d "C:\Program Files (x86)\Android\android-sdk\platform-tools\adb.exe" -f "src\MyTests\bin\Release\MyTests.apk-Signed.apk" -i "AVD_GalaxyNexus_ToolsForApacheCordova" -n "MyTests" -c "mytests.TestInstrumentation" -w 120
</code></pre>
<p>Substitute the argument values accordingly.</p>
<p>You should see output similar to the following:</p>
<pre><code class="language-bat">Starting emulator: D:\android-sdk\tools\emulator.exe -avd Xamarin_Android_API_15 -port 5554 -no-boot-anim -prop emu.uuid=013b8394-db8d-4224-a36f-889ce164f74e
Waiting until: 04/11/2015 19:21:29 for device to complete boot up..
INSTRUMENTATION_RESULT: passed=1
INSTRUMENTATION_RESULT: skipped=1
INSTRUMENTATION_RESULT: inconclusive=1
INSTRUMENTATION_RESULT: failed=1
INSTRUMENTATION_CODE: 0
Killing device: emulator-5554
Sending kill command.
OK: killing emulator, bye bye
Emulator killed.
</code></pre>
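<p>Those INSTRUMENTATION_RESULT lines are easy to scrape if you ever drive TestyDroid from your own scripts. A hypothetical sketch of parsing the summary (TestyDroid reports results to your build system for you, so this is only to illustrate the output format):</p>
<pre><code class="language-javascript">// Extract the INSTRUMENTATION_RESULT counters from TestyDroid's console output.
function parseResults(output) {
  const results = {};
  const re = /^INSTRUMENTATION_RESULT: (\w+)=(\d+)$/;
  for (const line of output.split('\n')) {
    const m = line.trim().match(re);
    if (m) results[m[1]] = Number(m[2]);
  }
  return results;
}

const sample = [
  'INSTRUMENTATION_RESULT: passed=1',
  'INSTRUMENTATION_RESULT: skipped=1',
  'INSTRUMENTATION_RESULT: inconclusive=1',
  'INSTRUMENTATION_RESULT: failed=1',
  'INSTRUMENTATION_CODE: 0'
].join('\n');

const counts = parseResults(sample);
if (counts.failed > 0) {
  console.log('Tests failed - a CI build would fail here.');
}
</code></pre>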
<h3 id="step-5-running-on-team-city">Step 5 - Running On Team City</h3>
<p>Once you have verified you can automate the tests locally, the next step is to set up your build system to run them during your build!</p>
<p>TestyDroid additionally supports reporting test results in a TeamCity format so that they show up as test results in the TeamCity UI - if that's your build system.</p>
<p>To configure TeamCity it's a case of:</p>
<ol>
<li>An MSBUILD step to build your Tests project (csproj file) such that it outputs the APK (remember to use the target <code>SignAndroidPackage</code> to have the APK produced)</li>
<li>A command line step that calls out to Xamarin.TestyDroid.exe with the necessary arguments.</li>
</ol>
<p>The first step is easy, but the important thing to remember is to set the target to SignAndroidPackage.</p>
<p><img src="/img/tc%20commandlineparams%20testydroid.PNG" class="img-fluid" alt="tc commandlineparams testydroid.PNG" /></p>
<p>That will now take care of producing the APK in the output directory for your project during your team city build.</p>
<p>The second step to create is the one that actually runs the tests using TestyDroid!
The following screenshot shows setting up a Command line step to do this:
<img src="/img/tc%20testydroid%20commandlinestep.PNG" class="img-fluid" alt="tc testydroid commandlinestep.PNG" /></p>
<h3 id="step-6-admire-your-tests-in-team-city">Step 6 - Admire your tests in Team City.</h3>
<p>Now you can run a build - and if all is well - you should see your tests results added to a tests tab in Team City.</p>
<p><img src="/img/tc%20Tests%20Tab.PNG" class="img-fluid" alt="tc Tests Tab.PNG" /></p>
<h3 id="any-questions">Any Questions?</h3>
<p>I have been somewhat limited by time, so this was fairly rushed together! If there is anything you would like me to elaborate on, please leave a comment below.</p>
<p>Also you can read more information about TestyDroid on Github: <a href="https://github.com/dazinator/Xamarin.TestyDroid/wiki/Getting-Started">https://github.com/dazinator/Xamarin.TestyDroid/wiki/Getting-Started</a></p>
http://darrelltunnell.net/blog/2015/08/29/ssh-on-windowsSSH access to bitbucket on Windows2015-08-29T18:50:00Z<h3 id="getting-ssh-set-up-on-windows">Getting SSH set up on Windows</h3>
<p>Follow this: <a href="http://blog.muhammada.li/setting-up-ssh-access-to-bitbucket-on-windows#comment-53292">http://blog.muhammada.li/setting-up-ssh-access-to-bitbucket-on-windows#comment-53292</a></p>
http://darrelltunnell.net/blog/2015/08/16/aurelia-and-asp-net-5-mvcASP.NET 5 Projects - NuGet-NPM-Gulp-Bower-Jspm-Aurelia2015-08-16T18:50:00Z<p><strong>This post is part 1 of a series. Part 2 is <a href="http://darrelltunnell.net/blog/2016/01/24/aurelia-and-asp-net-5-mvc-part2/">here</a></strong></p>
<h3 id="asp-a-sea-of-packages.net-5">ASP (A Sea of Packages).NET 5</h3>
<p>When you create a new ASP.NET 5 project, you will see all sorts of new-ness. I am going to guide you, the uninitiated ASP.NET 5 web developer, through creating your first ASP.NET 5 MVC application, but we won't stop there. In the next post of this series, we will then enhance the project with a number of features:</p>
<ol>
<li>Bundling and Minification.</li>
<li>Auto browser refresh (as you make changes to files during development)</li>
</ol>
<p>In addition, I will touch upon important tooling that you need to be aware of:</p>
<ol>
<li>NPM</li>
<li>Bower and why we are going to replace it with Jspm</li>
<li>Gulp - and why is it useful</li>
</ol>
<p>To be able to do all of this, we will be creating an ASP.NET MVC 5 project, and then we will be using <a href="http://aurelia.io/">Aurelia</a> to run an Aurelia application on the Home page (Index.cshtml).</p>
<!--more-->
<h3 id="new-project">New Project</h3>
<p>The first step on our quest is simply to create a new ASP.NET application. I am sure you know the drill:</p>
<ol>
<li>In VS 2015, File --> New Project</li>
<li>"ASP.NET Web Application"
<img src="/img/new%20aspnet%20project.PNG" class="img-fluid" alt="new aspnet project.PNG" /></li>
<li>"Web Application"
<img src="/img/new%20aspnet%20project%202.PNG" class="img-fluid" alt="new aspnet project 2.PNG" /></li>
</ol>
<h3 id="project-structure">Project Structure</h3>
<p>At this point, with the project created, let's stop and appreciate some noteworthy files in our new project.</p>
<p><img src="/img/asp%20net%20project%20sol%20explorer.PNG" class="img-fluid" alt="asp net project sol explorer.PNG" /></p>
<ul>
<li><code>project.json</code> - this is the new form of the project file. It replaces for example the older <code>.csproj</code> and <code>.vbproj</code> files.</li>
<li><code>package.json</code> - this file is managed by <a href="https://docs.npmjs.com/">NPM</a>. It records the dependencies that your application has on NPM packages. More on NPM later.</li>
<li><code>bower.json</code> - this file is managed by <a href="http://bower.io/">Bower</a>. It records the dependencies that your application has on Bower packages. More on Bower later.</li>
<li><code>gulpfile.js</code> - this file contains <code>tasks</code> that can be executed by <a href="http://gulpjs.com/">Gulp</a> as part of your development workflow, for example, whenever the project is built, cleaned etc. More on this later.</li>
<li><code>Startup.cs</code> this is the entry point for your application. For the purposes of this article, the default code is fine and we won't be amending anything in this file. It contains bootstrapping code such as setting up and registering services such as authentication.</li>
</ul>
<h4 id="npm-its-an-important-citizen">NPM - it's an important citizen</h4>
<p><a href="https://docs.npmjs.com/">NPM</a> is now a first class citizen of an ASP.NET 5 project. This is why you have a <code>package.json</code> file in your project.</p>
<p><img src="/img/packages%20json%20file.PNG" class="img-fluid" alt="packages json file.PNG" /></p>
<p>NPM is a package manager - the Node Package Manager to be precise. Think <code>NuGet</code> but for NodeJs packages. You could be forgiven for thinking it stands for "Not another Package Manager" - it doesn't, I checked.</p>
<p>If you aren't yet familiar with NPM, stop here and do yourself a favour - go <a href="https://docs.npmjs.com/">get familiar</a>, you will be seeing a lot of it in your ASP.NET 5 projects in the days to come!</p>
<h4 id="hold-on-another-package-manager-but-we-allready-have-nuget">Hold on, another Package Manager? But we already have NuGet?</h4>
<p>NuGet is for .NET libraries like log4net, silly. NPM has a vast array of packages not available through NuGet. Why wouldn't you want to tap into those also?</p>
<h4 id="bower">Bower</h4>
<p>Here is where things get a tiny bit confusing. Bower is a package that is also another package manager. I am tempted to move on.. but I'll explain.</p>
<p>Bower is a NodeJs program, and is therefore distributed as a NodeJs package, via <code>NPM</code>. However, its purpose in life is to be a package manager, specifically for client (website) dependencies such as javascript or css. Think jQuery. If you want to add jQuery, Bootstrap, or any other client side library to your project, then Bower would be the package manager to use to achieve that. Not NPM (<a href="https://www.npmjs.com/package/jquery">although you could</a>), and not NuGet (<a href="https://www.nuget.org/packages/jQuery/">although you could</a>). The ASP.NET team thinks <code>Bower</code> is the package manager to use, as it specialises in client dependencies - so the ASP.NET 5 project is set up by default to use Bower, and you may already see some Bower packages downloaded into the <code>bower_components</code> folder within your project. The <code>bower.json</code> file keeps track of your bower dependencies.</p>
<p>However, in this walkthrough, we shall be scrapping <code>Bower</code> and using a different package manager for our jQueries and our Bootstraps: one called <a href="http://jspm.io/">Jspm</a>. Jspm is recommended for its additional capabilities, mainly that it provides not just package management features (at dev time) but also package loading features that your application uses at runtime.</p>
<h4 id="gulp">Gulp</h4>
<p><a href="http://gulpjs.com/">Gulp</a> is what all the cool kids are using to automate their development workflows.</p>
<p>Gulp basically lets you define <code>tasks</code> in a javascript file (gulpfile.js) that can then be run at an appropriate point. VS 2015 has a <code>Task Runner Explorer</code> window in which you can pick which Gulp tasks (the ones defined in your gulpfile.js) you would like to run and when. For example, you can have your gulp task executed whenever the project is built, or cleaned etc. You can also execute your gulp tasks via the command line (see the Gulp docs).</p>
<p>We are going to write some Gulp tasks in gulpfile.js, and have them executed as part of our project's build process. These tasks are going to automatically handle bundling and minification of our javascript files for us.</p>
<p>Our web application is going to reference the "bundle" of javascript that gulp outputs, rather than the individual javascript files that we download using jspm. This means our application is going to be nice and optimised, as the browser will have to make fewer round trips to the server (network requests) to load the required javascript.</p>
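<p>Conceptually, bundling concatenates the module sources into a single payload, and minification strips out what the browser doesn't need. A deliberately naive sketch of both steps (real builds use gulp plugins such as gulp-concat and gulp-uglify, which do far more, and do it safely):</p>
<pre><code class="language-javascript">// Bundle: concatenate several source files' contents into one payload,
// so the browser makes one request instead of many.
function bundle(sources) {
  return sources.join('\n;');
}

// "Minify": strip line comments, trim indentation, drop blank lines.
// Naive on purpose - real minifiers parse the code, rename identifiers,
// and never break strings that happen to contain "//".
function minify(src) {
  return src
    .split('\n')
    .map(line => line.replace(/\/\/.*$/, '').trim())
    .filter(line => line.length > 0)
    .join('');
}

const files = [
  'function greet(name) {  // says hello\n  return "Hello " + name;\n}',
  'console.log(greet("world"));'
];
const out = minify(bundle(files));
console.log(out);
</code></pre>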
<h4 id="but-wont-bundling-and-minification-lead-to-a-poor-debugging-experience">But won't bundling and minification lead to a poor debugging experience?</h4>
<p>Not if sourcemaps are enabled. I will show you how to enable this. This will mean the browser will be requesting and running the optimised bundle of javascript - but you the developer, will be stepping through and reading the original source code in your browser's dev tools, thanks to the magical power of source maps.</p>
<p>However, I will also show you what to do if you just don't want to bundle / minify your javascript during development (not all browsers will support source maps yet). If bundling and minification is something you only want to do at the time of a release build - which is pretty sensible - then I'll cover that too.</p>
<h3 id="stay-tuned">Stay tuned</h3>
<p>In the next post/s, we will begin modifying our ASP.NET 5 project to do the things I have discussed:</p>
<ol>
<li><a href="http://darrelltunnell.net/blog/2016/01/24/aurelia-and-asp-net-5-mvc-part2/">Replace Bower with JSPM</a></li>
<li>Bring in Aurelia</li>
<li>Get an Aurelia application working on the Index.cshtml page</li>
<li>Enable bundling and minification via a Gulp task</li>
<li>Enable automatic browser refresh</li>
<li>Disable bundling when our application is running in development (to maintain an easy debugging experience should your browser not support source maps)</li>
</ol>
<p>If there is anything else you would like me to cover in this series, drop me a comment below!</p>
http://darrelltunnell.net/blog/2015/06/13/dotnetnuke-streamlining-module-developmentDotNetNuke - Streamlining Module Development Workflow2015-06-13T18:50:00Z<h3 id="module-debugging-two-approaches">Module Debugging - Two Approaches</h3>
<p>When developing DotNetNuke modules people take many different approaches but they boil down to two alternatives in terms of workflow:</p>
<ol>
<li><p>Placing / checking out your source code directly into the \DesktopModules folder of a DotNetNuke website, and having your module DLLs output directly into the DotNetNuke website's \bin folder.</p>
</li>
<li><p>Checking out and working on your code wherever you like, but having to deploy your module (content and assemblies) to a local DNN website when you are ready to run it.</p>
</li>
</ol>
<p>Both approaches require that you "attach to process" from within Visual Studio in order to debug your module.</p>
<!--more-->
<h4 id="i-hate-approach-1">I hate approach #1</h4>
<p>I have all sorts of issues with approach #1. Yes, it's technically possible, but it's also nasty in my view (not very clean) - I have elaborated on that elsewhere so won't do so again here in depth, aside to say that I believe #2 is the "cleanest" approach and that many forms of debugging use #2 as the approach, not #1. For example, Xamarin devs: when they debug an Android app, you will see that Xamarin actually deploys their project to the device / emulator, and then attaches the debugger to the remote process that's running on the device / emulator. The result is that they click "Play" in VS, and a short while later they are attached and stepping through their code. It may not be obvious that a deployment took place - but it did. Lastly, I'll point out that #1 creates a coupling between how you structure your source code and where it needs to be when it's actually deployed.</p>
<h4 id="but-approach-2-is-lacking">But approach #2 is lacking</h4>
<p>So, deciding to take approach #2, having to manually copy / deploy your module content to the DotNetNuke website each time you want to test your module is just not an efficient use of your time!</p>
<p>What's needed is some nice visual studio integration so that when you are ready to "Run / Debug" your module, you click one button and bam! chrome opens up, displaying your module, with the debugger attached so you can step through code.</p>
<h4 id="can-anything-be-done">Can anything be done?</h4>
<p>I have already made strides to address the inefficiencies of #2 so that it's now a lot more streamlined: <a href="https://github.com/dazinator/DnnPackager">https://github.com/dazinator/DnnPackager</a> - it's a NuGet package that you add to any VS project, and it will produce the Dnn module installation zip for you when you build the project. It then also extends the package manager console window in VS with an additional command you can run that will deploy the module project to a local DNN website. So this is the workflow I currently use for module debugging:</p>
<ol>
<li>Make a change to the code</li>
<li>Hit “up” arrow and then hit “enter” in package manager console (this runs the previous command which is the DnnPackager one I spoke of, that builds and deploys my module project to my local dnn website)</li>
<li>Refresh my browser page, and attach Visual Studio (ctrl + alt + p) to the w3wp process.</li>
</ol>
<p>This is a bit more streamlined! This makes approach #2 workable in my opinion.</p>
<h4 id="room-for-improvements">Room for Improvements!</h4>
<ol>
<li>What if I don’t have a DNN website already installed - for example I am new to Dnn development and just want to get up and running as quickly as possible.</li>
<li>What if I am curious to know if my module runs in DNN 6.5.1 and I only have DNN7 installed?</li>
<li>What if this is the first time I am testing this particular module - I have to make sure I go to DotNetNuke website, Create a page and add my module to that page right?</li>
</ol>
<p>These things are all tedious. Most developers (new to DNN) expect to be able to click Debug and immediately be debugging their code - they don’t expect to have to jump through these additional hurdles / barriers.</p>
<p>This is why DotNetNuke development can be a bit of a culture shock for many developers.</p>
<h4 id="next-feature">Next Feature!</h4>
<p>So the next feature I am thinking of adding to DnnPackager is one that addresses the concerns mentioned above. I'd be really grateful if anyone with such a curiosity would read it and offer their feedback on this proposed awesome feature <a href="https://github.com/dazinator/DnnPackager/issues/14">https://github.com/dazinator/DnnPackager/issues/14</a> - just so I can get a feel for whether there is much demand for such a capability.</p>
<h4 id="feedback">Feedback?</h4>
<p>Do you disagree?
Would this new feature <a href="https://github.com/dazinator/DnnPackager/issues/14">https://github.com/dazinator/DnnPackager/issues/14</a> help you?</p>
<p>Darrell Tunnell
<a href="http://darrelltunnell.net">http://darrelltunnell.net</a></p>
http://darrelltunnell.net/blog/2015/04/29/automating-dotnetnuke-deployments-with-octopus-deployAutomating DotNetNuke deployments with Octopus Deploy2015-04-29T18:50:00Z<h3 id="automating-dotnetnuke-deployments-using-octopus-deploy">Automating DotNetNuke Deployments using Octopus Deploy</h3>
<p>Because I am an awesome dude, I'll share with you how I automate DotNetNuke delivery / deployments. This works. It takes some effort to get set up, but it will be well worth it in the end.</p>
<p>First I'll explain the process for automating the deployment of the DotNetNuke website itself. Then I'll explain how you can automate the deployment of modules / extensions on a continuous basis.</p>
<!--more-->
<h3 id="preparation-work">Preparation work</h3>
<ol>
<li>Set up a brand new DotNetNuke website, and go through the install wizard until you are greeted with an empty default DotNetNuke website.</li>
<li>Stop the website. Create a NuGet package containing the website folder.</li>
<li>Put that on your internal NuGet feed.</li>
<li>Go to the DotNetNuke database, and generate the create scripts (with data).</li>
<li>Create a new console application that uses <a href="http://dbup.github.io/">DbUp</a> to run the above SQL scripts when it is executed (as described <a href="http://dbup.github.io/">here</a>). Remember to replace things like the server name in the SQL scripts with appropriate <code>$variablename$</code> placeholders. DbUp can substitute each <code>$variablename$</code> in the SQL scripts with its actual value (which you can pass through from Octopus) before it executes them.</li>
<li>Add <a href="http://docs.octopusdeploy.com/display/OD/Using+OctoPack">OctoPack</a> to your Console Application so that it is packaged up into a NuGet package. Put this NuGet package on your internal NuGet feed.</li>
</ol>
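<p>The console application from step 5 boils down to very little code. Here is a minimal sketch assuming DbUp's standard fluent API, with the SQL scripts embedded in the assembly - the <code>WebsiteHostName</code> variable and the argument order are purely illustrative:</p>
<pre><code class="language-csharp">using System.Reflection;
using DbUp;

class Program
{
    static int Main(string[] args)
    {
        // Connection string and variable values are passed in from Octopus.
        var connectionString = args[0];

        var upgrader = DeployChanges.To
            .SqlDatabase(connectionString)
            .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
            .WithVariable("WebsiteHostName", args[1]) // substituted for $WebsiteHostName$ in the scripts
            .LogToConsole()
            .Build();

        var result = upgrader.PerformUpgrade();
        return result.Successful ? 0 : -1; // a non-zero exit code fails the Octopus step
    }
}
</code></pre>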
<p>You should now be in this position:</p>
<ol>
<li>You have a NuGet package on your feed containing the DotNetNuke website content</li>
<li>You have a NuGet package on your feed containing your wonderful console application (DbUp) which will run the database scripts.</li>
</ol>
<p>Next Step - to Octopus!</p>
<ol>
<li>Create a project in Octopus to deploy a "DotNetNuke" website. For the deployment process you will need the NuGet packages prepared previously. The deployment process should:</li>
</ol>
<ul>
<li>Create a website in IIS using the website NuGet package.</li>
<li>Create the database by executing the executable within the Database NuGet package.</li>
</ul>
<p>There are lots of things to remember when deploying DotNetNuke. I won't go into detail, but things like:</p>
<ul>
<li>Granting the app pool identity that the website runs under full permissions on the website folder.</li>
<li>Updating the PortalAlias table with the appropriate access URL.</li>
</ul>
<p>... and other things. The DNN install process has been covered elsewhere, so I won't go into any further detail here.</p>
<h3 id="congratulations-partly">Congratulations (partly)</h3>
<p>You should now be in a position where you can roll out a DotNetNuke website via Octopus... BUT WHAT ABOUT THE MODULES I'M DEVELOPING!! - I hear you exclaim.</p>
<h3 id="automating-module-deployments">Automating Module Deployments</h3>
<ol>
<li><p>When you build your module projects (via build server etc) you want them packaged as DotNetNuke install packages, inside a NuGet deployment package, which is then published to your NuGet feed. You can use <a href="https://github.com/dazinator/DnnPackager">DnnPackager</a> for this (which is something I created).</p>
</li>
<li><p>You'd need something that can copy a set of zip files to the "Install/Module" folder of a DotNetNuke website, and then monitor that folder whilst calling the DotNetNuke URL that installs packages (<a href="http://www.dotnetnuke.com/install/install.aspx?mode=installresources">www.dotnetnuke.com/install/install.aspx?mode=installresources</a>). I wrote a quick console application to do this. It keeps calling that URL for as long as the number of zips in the install folder keeps decreasing (DotNetNuke deletes them after they are installed). If after x calls there are the same number of zips left in the directory, it assumes they cannot be installed and reports a failure via its return code.
You should package this tool up into a NuGet package and, you guessed it, stick it on your internal feed.</p>
</li>
</ol>
<p>3. Create a project in Octopus for "Module" deployment. You want the deployment process to:</p>
<ul>
<li>Download the NuGet package containing your module zips.</li>
<li>Download the NuGet package containing your module deployment utility (that console app I spoke of)</li>
<li>Invoke your deployment tool exe, passing in arguments for where the module zip files were placed, what the website URL is, and potentially the path to the Install/Modules folder on disk (although my own tool interrogated IIS, based on the website URL, to find the website directory)</li>
</ul>
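<p>For what it's worth, the core of that deployment utility is just a polling loop. Here is a simplified, illustrative sketch - the class and method names are hypothetical, and a real tool would want logging and timeouts:</p>
<pre><code class="language-csharp">using System;
using System.IO;
using System.Net;
using System.Threading;

public static class ModuleInstaller
{
    // Keep hitting install.aspx?mode=installresources while the number of zips
    // in the Install/Module folder keeps going down; report failure once a
    // pass installs nothing.
    public static bool InstallPendingModules(string installFolder, string installUrl)
    {
        using (var client = new WebClient())
        {
            int remaining = Directory.GetFiles(installFolder, "*.zip").Length;
            while (remaining > 0)
            {
                client.DownloadString(installUrl); // triggers DNN to process the install folder
                Thread.Sleep(TimeSpan.FromSeconds(5));

                int left = Directory.GetFiles(installFolder, "*.zip").Length;
                if (left >= remaining)
                {
                    return false; // nothing was installed on this pass - give up
                }
                remaining = left;
            }
            return true; // all zips were consumed by DNN
        }
    }
}
</code></pre>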
<h2 id="full-congratulations">Full Congratulations</h2>
<p>You will now find that you can create a release of your module project in Octopus and deploy all your latest modules to any DotNetNuke website at the push of a button.</p>
http://darrelltunnell.net/blog/2015/02/26/how-far-does-the-xrm-sdk-s-executemultiplerequest-get-youHow far does the XRM SDK's ExecuteMultipleRequest get you?2015-02-26T17:50:00Z<h3 id="executemultiplerequest-lets-take-it-to-the-max">ExecuteMultipleRequest - Let's take it to the max</h3>
<p>In this post, I will explore what kinds of things can be achieved using the SDK's ExecuteMultipleRequest, by starting off with a simple SQL command, implementing a semantically equivalent ExecuteMultipleRequest, and then slowly introducing additional complexity - so that we can see some areas where the SDK starts to fall short!</p>
<!--more-->
<h3 id="starting-simple">Starting Simple</h3>
<p>Consider this SQL:</p>
<pre><code class="language-sql">INSERT INTO contact (firstname, lastname) VALUES ('albert', 'einstein');
</code></pre>
<p>Well, you hardly need an ExecuteMultipleRequest for this, but if you really wanted to you could create one, no problem. I am going to assume you are already familiar with the code to create an ExecuteMultipleRequest - if not, it's described <a href="https://msdn.microsoft.com/en-gb/library/jj863631.aspx">here.</a></p>
<p>This equates to the following:</p>
<p>Either:-</p>
<ul>
<li>A single CreateRequestMessage.</li>
<li>An ExecuteMultipleRequest containing a single CreateRequestMessage.</li>
</ul>
<p>I hope you are with me so far..</p>
<h3 id="take-it-up-a-notch">Take It Up A Notch</h3>
<p>Let's now imagine that when a contact is INSERTED, an <code>accountnumber</code> is generated on the server, and that we want to grab this value using a single roundtrip with the server.</p>
<p>Here it is in T-SQL:</p>
<pre><code class="language-sql">INSERT INTO contact (firstname, lastname) OUTPUT inserted.accountnumber VALUES ('albert', 'einstein');
</code></pre>
<p>This equates to the following using the SDK:-</p>
<p>SORRY, DAVE. YOU CAN'T DO THAT.</p>
<p>The problem is that to do this in one roundtrip with the CRM server means building an ExecuteMultipleRequest that contains:-</p>
<ul>
<li>A CreateRequestMessage (to insert / create the contact)</li>
<li>A RetrieveRequestMessage (to retrieve the accountnumber of the inserted contact)</li>
</ul>
<p>However, in order to construct the appropriate RetrieveRequestMessage we need to know in advance what the ID of the inserted contact will be. If you look at the SQL query, we are not specifying an ID in advance - therefore we cannot perform the equivalent of this query.</p>
<h3 id="a-bit-further">A bit further..</h3>
<p>With the previous example in mind, consider the following SQL</p>
<pre><code class="language-sql">INSERT INTO contact (contactid, firstname, lastname) OUTPUT inserted.accountnumber VALUES ('2f4941ec-2f6f-4c7f-8adc-c6f4fb002d42', 'albert', 'einstein');
</code></pre>
<p>If you are quick, you've already cottoned on that this one is possible, and it equates to:-</p>
<p>An ExecuteMultipleRequest (ContinueOnError = false) containing:-</p>
<ul>
<li>A CreateRequestMessage (to insert / create the contact)</li>
<li>A RetrieveRequestMessage (to retrieve the "accountnumber" of the created entity)</li>
</ul>
<h3 id="lets-start-to-push-the-boat-out-a-little">Let's start to push the boat out a little.</h3>
<p>Here is a batch of T-SQL commands:</p>
<pre><code class="language-sql">INSERT INTO contact (firstname, lastname) VALUES ('albert', 'einstein');
UPDATE contact SET lastname = 'Johnson' WHERE contactid = '3a4941ec-2f6f-4c7f-8adc-c6f4fb002d42';
DELETE FROM contact WHERE contactid = '4b4941ec-2f6f-4c7f-8adc-c6f4fb002d42'
</code></pre>
<p>Now, we know that SQL Server would execute that SQL by executing each command within the batch in sequence, and if there were any errors it would not continue to process the rest of the commands in the same batch. It would also not execute the batch within a transaction, so it would not roll back should errors occur half way through.</p>
<p>This equates to:</p>
<p>An ExecuteMultipleRequest (ContinueOnError = false) - containing the following messages:</p>
<ul>
<li>A CreateRequestMessage (to insert / create the contact)</li>
<li>An UpdateRequestMessage(to update the contact)</li>
<li>A DeleteRequestMessage</li>
</ul>
<p>It seems like this is a good fit between the SQL and an ExecuteMultipleRequest.</p>
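<p>To make that mapping concrete, here is a rough sketch of building such a request with the SDK - assuming <code>newContact</code>, <code>updatedContact</code>, <code>contactIdToDelete</code> and <code>orgService</code> already exist:</p>
<pre><code class="language-csharp">// Sketch: the three-statement batch as one ExecuteMultipleRequest.
var batch = new ExecuteMultipleRequest
{
    Settings = new ExecuteMultipleSettings
    {
        ContinueOnError = false, // halt on the first error, like a SQL batch
        ReturnResponses = true
    },
    Requests = new OrganizationRequestCollection()
};

batch.Requests.Add(new CreateRequest { Target = newContact });
batch.Requests.Add(new UpdateRequest { Target = updatedContact });
batch.Requests.Add(new DeleteRequest
{
    Target = new EntityReference("contact", contactIdToDelete)
});

var response = (ExecuteMultipleResponse)orgService.Execute(batch);
</code></pre>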
<h3 id="the-boat-is-now-heading-towards-the-open-ocean">The boat is now heading towards the open ocean</h3>
<p>Let's add a bit of complexity to the previous T-SQL - consider this:</p>
<pre><code class="language-sql">INSERT INTO contact (contactid, firstname, lastname) OUTPUT inserted.accountnumber VALUES ('2f4941ec-2f6f-4c7f-8adc-c6f4fb002d42', 'albert', 'einstein');
UPDATE contact SET lastname = 'Johnson' WHERE contactid = '3a4941ec-2f6f-4c7f-8adc-c6f4fb002d42';
DELETE FROM contact WHERE contactid = '4b4941ec-2f6f-4c7f-8adc-c6f4fb002d42'
</code></pre>
<p>The first command in that batch of SQL commands is this:</p>
<pre><code class="language-sql">INSERT INTO contact (contactid, firstname, lastname) OUTPUT inserted.accountnumber VALUES ('2f4941ec-2f6f-4c7f-8adc-c6f4fb002d42', 'albert', 'einstein');
</code></pre>
<p>And we know that this actually equates to 2 separate RequestMessages: a CreateRequest and a RetrieveRequest. We then also need to do an Update and then a Delete. So this equates to:</p>
<p>An ExecuteMultipleRequest (ContinueOnError = false)</p>
<p>Containing:</p>
<ul>
<li>A CreateRequestMessage (to insert / create the contact)</li>
<li>A RetrieveRequestMessage - to retrieve the "accountnumber" of the created entity.</li>
<li>An UpdateRequestMessage</li>
<li>A DeleteRequestMessage</li>
</ul>
<p>Ok good so far!</p>
<h3 id="should-look-at-boat-breakdown-cover">Should look at Boat Breakdown cover</h3>
<p>Now consider this one:</p>
<pre><code class="language-sql">INSERT INTO contact (firstname, lastname) OUTPUT inserted.accountnumber VALUES ('albert', 'einstein');
GO
DELETE FROM contact WHERE contactid = '6f4941ec-2f6f-4c7f-8adc-c6f4fb002d42'
</code></pre>
<p>What this says is:</p>
<ol>
<li>We want to insert a contact and output its account number.</li>
<li>Then, in a second "batch" of SQL statements, we want to delete a contact. The second batch needs to execute regardless of any problem or outcome from the first batch (the GO keyword is used as a batch separator in T-SQL).</li>
</ol>
<p>What this translates into is:</p>
<ol>
<li>A CreateRequest that always needs to be executed.</li>
<li>A RetrieveRequest (to retrieve the "accountnumber") which should only be executed if the preceding CreateRequest succeeds.</li>
<li>A DeleteRequest that always needs to be executed.</li>
</ol>
<p>Can we construct the equivalent ExecuteMultipleRequest to do that?</p>
<p>Well.. the answer is.. we can semantically construct an appropriate ExecuteMultipleRequest, but it won't be supported by CRM - because you are not allowed to nest ExecuteMultipleRequests. If you do, the CRM server will throw an error when you send it such a request.</p>
<p>Here is what that would look like though (if only it were supported by the server!):</p>
<ol>
<li>An ExecuteMultipleRequest (ContinueOnError = true) Containing:
<ol>
<li>An ExecuteMultipleRequest (ContinueOnError = false) Containing:
<ol>
<li>A CreateRequest to create the contact</li>
<li>A RetrieveRequestMessage - to retrieve the "accountnumber" of the created entity</li>
</ol>
</li>
<li>A DeleteRequestMessage</li>
</ol>
</li>
</ol>
<p>As I say, constructing such a request is possible, but the CRM server won't process it, due to the current runtime limitation that nested ExecuteMultipleRequests are not allowed.</p>
<p>So - unfortunately we have hit a CRM limitation here.</p>
<p>But what you could do, on the client side, is split that SQL statement on the <code>GO</code> keyword to get each <code>batch</code> of T-SQL commands. Then, for each batch, construct and send an appropriate ExecuteMultipleRequest for the statements in that batch.</p>
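<p>Splitting a script into batches on the GO separator only takes a few lines of standalone code - for example, this simple sketch treats any line consisting solely of GO as a batch separator (a real splitter would also need to ignore GO inside string literals and comments):</p>
<pre><code class="language-csharp">using System.Collections.Generic;
using System.Linq;
using System.Text.RegularExpressions;

public static class SqlBatchSplitter
{
    // Split a T-SQL script into batches on lines consisting of the GO keyword.
    public static List&lt;string> SplitOnGo(string script)
    {
        return Regex.Split(script, @"^\s*GO\s*$", RegexOptions.Multiline | RegexOptions.IgnoreCase)
                    .Select(batch => batch.Trim())
                    .Where(batch => batch.Length > 0)
                    .ToList();
    }
}
</code></pre>
<p>Each resulting string can then be turned into its own ExecuteMultipleRequest.</p>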
<h3 id="what-have-we-learned-so-far">What have we learned so far</h3>
<p>The ExecuteMultipleRequest provides the ability to send a single "batch" of commands to the server. Thinking from a SQL perspective, this is akin to sending all the statements up to a "GO" keyword (batch separator). To get the same behaviour as SQL though, you should set <code>ContinueOnError</code> to false - so that processing halts if any request in the batch errors.</p>
<p>The ExecuteMultipleRequest is not a good fit for sending multiple individual <code>batches</code> of operations to the CRM server, as there is no way to group the requests within an ExecuteMultipleRequest into their batches. For this reason it's probably best to think of an ExecuteMultipleRequest as a single SQL batch, and to always use <code>ContinueOnError</code> = false if you want to mirror the behaviour of SQL as closely as possible.</p>
<h2 id="a-weird-scenario-can-send-multiple-batches-in-one-go-as-long-as-each-batch-contains-1-requestmessage-only">A weird scenario - you can send multiple batches in one go, as long as each batch contains only 1 RequestMessage.</h2>
<p>Consider the following T-SQL:</p>
<pre><code class="language-sql">INSERT INTO contact (firstname, lastname) VALUES ('albert', 'einstein');
GO
DELETE FROM contact WHERE contactid = '6f4941ec-2f6f-4c7f-8adc-c6f4fb002d42';
GO
UPDATE contact SET firstname = 'bob' WHERE lastname = 'Hoskins';
GO
</code></pre>
<p>In this scenario, each batch contains only a single command. This means you can construct an ExecuteMultipleRequest with <code>ContinueOnError</code> set to true, and there is no danger that one command in a batch will error while the rest of the commands in that batch execute regardless - because there is only a single command in each batch!</p>
<p>For an example of the danger I am referring to here, consider this:</p>
<pre><code class="language-sql">DELETE FROM contact WHERE contactid = '6f4941ec-2f6f-4c7f-8adc-c6f4fb002d42';
DELETE FROM account WHERE primarycontactid = '6f4941ec-2f6f-4c7f-8adc-c6f4fb002d42';
GO
UPDATE contact SET firstname = 'bob' WHERE lastname = 'Hoskins';
GO
</code></pre>
<p>The first batch above contains 2 operations. The second batch contains 1.</p>
<p>Now imagine that, for the above, we constructed an ExecuteMultipleRequest and set <code>ContinueOnError</code> to true (to enable the server to process both batches regardless of whether the first batch fails).
Well, in that scenario, because the first batch actually contains 2 operations, <code>ContinueOnError</code> = true would apply to each operation within that batch as well. So you could hit a scenario where the first DELETE in that first batch errored, but CRM continued on regardless to execute the second DELETE. This is not what the semantics of the above SQL query convey - i.e. the equivalent CRM behaviour would be to stop processing a particular batch as soon as it hits an error. The only way this can be satisfied at present is if each batch contains only a single RequestMessage.</p>
<h3 id="conclusion">Conclusion</h3>
<p>If you would like to send a batch of commands to the CRM server in one go, the good news is you can. The bad news is, it's not perfect, there are limitations, and hopefully I have shown you just about how far you can stretch things.</p>
<p>If you need to send multiple batches of commands to the CRM server in one go, the good news is you can - if each batch contains only a single request message (i.e. Create, Retrieve, Delete, Update etc). The bad news is, if that's not the case, then you will need to send each batch as an individual ExecuteMultipleRequest, and implement your own "ContinueOnError" behaviour client side, such that should one ExecuteMultipleRequest fail to be processed, it doesn't halt subsequent batches (ExecuteMultipleRequests) from being processed.</p>
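<p>That client-side "ContinueOnError" behaviour amounts to a simple loop. Here is a sketch, assuming <code>batches</code> is a collection of pre-built ExecuteMultipleRequests and <code>orgService</code> is your organization service:</p>
<pre><code class="language-csharp">// Sketch: send each batch independently, and carry on even if one fails outright.
foreach (ExecuteMultipleRequest batch in batches)
{
    try
    {
        var response = (ExecuteMultipleResponse)orgService.Execute(batch);
        if (response.IsFaulted)
        {
            // One or more requests within this batch failed - log it, but
            // continue with the next batch, mirroring ContinueOnError = true.
        }
    }
    catch (FaultException&lt;OrganizationServiceFault> ex)
    {
        // The whole batch was rejected - log it and move on to the next batch.
    }
}
</code></pre>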
http://darrelltunnell.net/blog/2015/01/18/asp-net-5-vnext-projects-why-your-nuget-package-may-fail-to-installASP.NET 5 (vNext) Projects - Your NuGet Package May Fail to Install Correctly2015-01-18T17:50:00Z<h3 id="dont-assume-nuget-packages-that-you-have-authored-will-continue-to-work-with-asp.net-5-vnext-projects">Don't assume NuGet Packages that you have authored will continue to work with ASP.NET 5 (vNext) projects.</h3>
<p>Over the past year or so, I have authored <a href="https://www.nuget.org/packages?q=darrell.tunnell">a number of NuGet packages</a> - because, well... I am just an all around great guy.</p>
<p>Recently, <a href="http://stackoverflow.com/questions/27762659/error-while-adding-nuget-package-to-asp-net-vnext-project#comment44383264_27762659">I was contacted by someone</a> who was trying to use one of my NuGet packages with an ASP.NET vNext project (Preview release). Not something I have tried before - and this is where things get a little interesting.</p>
<!--more-->
<h3 id="when-nuget-packages-are-installed-into-an-asp.net-vnext-project-powershell-scripts-included-in-the-package-are-not-run">When NuGet packages are installed into an ASP.NET vNext project, PowerShell scripts included in the package are not run</h3>
<p>As most NuGet package authors will already know, it's a <a href="http://docs.nuget.org/docs/creating-packages/creating-and-publishing-a-package#Automatically_Running_PowerShell_Scripts_During_Package_Installation_and_Removal">standard feature of NuGet</a> that you can include PowerShell scripts within your NuGet package that will then be executed when your package is installed (or uninstalled) into a Visual Studio project / solution.</p>
<p>Many NuGet packages out there currently rely on this feature - else they will not work.</p>
<p>Well, the issue with my NuGet package failing to install into an ASP.NET vNext project was eventually posted on the asp.net forums, and <a href="http://forums.asp.net/members/davidfowl.aspx">David Fowler</a> (who's on the ASP.NET team) - kindly responded with some insight into the matter. He seems to suggest that <a href="http://forums.asp.net/t/2027698.aspx?Error+while+adding+NuGet+package+to+ASP+NET+vNext+project">ASP.NET v5 does not support running the packages powershell scripts when you install a NuGet package into an ASP.NET v5 project.</a></p>
<p>I wanted to confirm that with him a second time - because <strong>that's a huge problem for some of my NuGet packages</strong>, but as you will see from that thread, I am still awaiting a secondary confirmation of this - although his first answer seems pretty clear cut.</p>
<h3 id="surely-this-is-documented-somewhere-or-perhaps-asp.net-5-offers-an-alternative-mechanism-for-running-tasks-on-installation-uninstallation-of-a-nuget-package">Surely this is documented somewhere - or perhaps ASP.NET 5 offers an alternative mechanism for running tasks on installation / uninstallation of a NuGet package?</h3>
<p>I have tried to look for more information. At the moment all I have to go on is David Fowler's response. Perhaps this is because there is still work in progress in this area, who knows. All I can suggest is that if your NuGet package currently requires custom tasks to be performed and you are using an <code>init</code>, <code>install</code> or <code>uninstall</code> ps1 script - then be prepared for the fact that it may not work with ASP.NET 5 projects - and also be prepared for the fact that there may not be any workaround either. I seriously hope this is false speculation on my part - but if this does turn out to be true after ASP.NET 5 is released, I'll be left with a slightly bitter taste in my mouth.</p>
<h3 id="so-where-from-here">So where from here?</h3>
<p>I am generally really excited about ASP.NET 5. I love what the team are doing. However, I believe that the ASP.NET team really should put some guidance out there for the NuGet community, so that NuGet package authors can gain an understanding of how their packages might have to change to work in the context of ASP.NET 5 projects.</p>
<p>At a minimum, if ASP.NET 5 will indeed no longer support the running of these PowerShell scripts, then it should at least warn you that the package contains such scripts and that they will not be executed - which means the package may not behave as desired.</p>
<p>My hope is that David Fowler or someone from the ASP.NET team will offer a clarification, insight, or workaround for this issue that makes it a non issue. Fingers crossed.</p>
http://darrelltunnell.net/blog/2014/12/22/crm-plugin-generated-values-and-reducing-roundtripsCRM / Plugin Generated Values - and Reducing Roundtrips!2014-12-22T17:50:00Z<h3 id="setting-the-scene">Setting the Scene</h3>
<p>Imagine we have an application that uses the CRM SDK. It needs to:</p>
<ol>
<li>Create a new <code>account</code> entity in crm.</li>
<li>Get some value that was just generated as a result of a synchronous plugin that fires on the create. For example, suppose there is a plugin that generates an account reference number.</li>
</ol>
<h3 id="the-i-dont-care-about-network-latency-method">The "I don't care about network latency" method!</h3>
<p>The 'I don't care about network latency' way of dealing with this is to just do 2 separate requests (roundtrips) to the CRM server.</p>
<ol>
<li>Create the new <code>account</code> which returns you the ID.</li>
<li>Retrieve the <code>account</code> using that ID, along with the values that you need.</li>
</ol>
<p>This approach is sub-optimal where network latency is a concern, as it incurs the penalty of making two roundtrips across the network to the server, where one would do.</p>
<p>Let's now have a look at the "I'm running on a 56k modem method" of doing the same thing!</p>
<!--more-->
<h3 id="the-im-running-on-a-56k-modem-method-this-weeks-pro-tip">The "I'm running on a 56k modem" method - this week's pro tip!</h3>
<p>For quite some time now - as of <code>CRM 2011 Update Rollup 12 - (SDK 5.0.13)</code> you can utilise the <a href="http://msdn.microsoft.com/en-gb/library/jj863604.aspx">Execute Multiple</a> request to do this kind of thing in one roundtrip with the CRM server.</p>
<p>Here is an example of creating an account, and retrieving it, in a single round trip:</p>
<pre><code class="language-csharp"> // Create an ExecuteMultipleRequest object.
var multipleRequests = new ExecuteMultipleRequest()
{
// Assign settings that define execution behavior: continue on error, return responses.
Settings = new ExecuteMultipleSettings()
{
ContinueOnError = false,
ReturnResponses = true
},
// Create an empty organization request collection.
Requests = new OrganizationRequestCollection()
};
var entity = new Entity("account");
entity.Id = Guid.NewGuid();
entity["name"] = "experimental test";
CreateRequest createRequest = new CreateRequest
{
Target = entity
};
RetrieveRequest retrieveRequest = new RetrieveRequest
{
Target = new EntityReference(entity.LogicalName, entity.Id),
ColumnSet = new ColumnSet("createdon") // list the fields that you want here
};
multipleRequests.Requests.Add(createRequest);
multipleRequests.Requests.Add(retrieveRequest);
// Execute all the requests in the request collection using a single web method call.
ExecuteMultipleResponse responseWithResults = (ExecuteMultipleResponse)orgService.Execute(multipleRequests);
var createResponseItem = responseWithResults.Responses[0];
CreateResponse createResponse = null;
if (createResponseItem.Response != null)
{
createResponse = (CreateResponse)createResponseItem.Response;
}
var retrieveResponseItem = responseWithResults.Responses[1];
RetrieveResponse retrieveResponse = null;
if (retrieveResponseItem.Response != null)
{
retrieveResponse = (RetrieveResponse)retrieveResponseItem.Response;
}
Console.Write(retrieveResponse.Entity["createdon"]); // yup - we got the value we needed!
</code></pre>
<h3 id="what-happened">What happened?</h3>
<p>Both the CreateRequest, and the RetrieveRequest (for the created entity) are batched up into a single Request and shipped off to the CRM server for processing.</p>
<p>CRM processed them in that order, collated the responses together, and returned them in a single batch.</p>
<h3 id="caveats">Caveats</h3>
<p>One caveat of this approach is that, if you intend to grab the generated values for an entity that is being created, then you need to know in advance what the ID will be.</p>
<p>This means you have to specify the ID of the entity when you create it yourself - you can't let CRM auto create the new ID.</p>
<p>For updates / deletes this is a non-issue, as the ID is already known.</p>
<h3 id="last-thoughts-sql-optimisation">Last thoughts - SQL Optimisation</h3>
<p>I speculate that specifying your own IDs <em>might be a bad thing</em> if you don't use sequential GUIDs.</p>
<p>When CRM generates IDs for you, it generates them sequentially, and I believe there may be SQL performance benefits to this in terms of index optimisation etc. So if you are using Guid.NewGuid() to create your new IDs, you may want to check with a SQL guru first to understand any impact of using random GUIDs as IDs on the performance of the CRM tables! That said - Microsoft do support this, so it can't be too bad..</p>
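<p>For illustration, one common client-side trick is a "COMB"-style GUID, which embeds a timestamp so that values sort roughly in creation order under SQL Server's uniqueidentifier ordering. The sketch below shows the general idea - it is not the algorithm CRM itself uses, so treat it with appropriate caution:</p>
<pre><code class="language-csharp">using System;

public static class SequentialGuid
{
    // COMB-style GUID: embed a timestamp in the last 6 bytes, which SQL Server
    // treats as the most significant bytes when ordering uniqueidentifiers.
    public static Guid NewSequentialGuid()
    {
        byte[] guidBytes = Guid.NewGuid().ToByteArray();
        byte[] ticks = BitConverter.GetBytes(DateTime.UtcNow.Ticks);
        if (BitConverter.IsLittleEndian)
        {
            Array.Reverse(ticks); // big-endian, so later times compare higher
        }
        Array.Copy(ticks, 2, guidBytes, 10, 6); // drop the 2 most significant (stable) bytes
        return new Guid(guidBytes);
    }
}
</code></pre>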
http://darrelltunnell.net/blog/2014/12/14/unit-testing-crm-pluginsUnit Testing Dynamics CRM Plugins2014-12-14T17:50:00Z<h3 id="there-is-no-spoon-crm">There is no <del>Spoon</del> CRM</h3>
<p>The purpose of this post is to look at the code for a fairly typical-looking CRM plugin, and examine how to implement a unit test with the least possible effort. Reduced Effort == Reduced Person Hours == Reduced Cost.</p>
<p>Remember, this is Unit Testing, not Integration testing - so at test time - there is no CRM!</p>
<!--more-->
<h3 id="a-plugin-and-its-requirements">A plugin - and its requirements</h3>
<p>Firstly, let's look at a plugin that we will call the <code>ReclaimCreditPlugin</code>. Here are the requirements:</p>
<ul>
<li>It must run only within a transaction with the database.</li>
<li>When a Contact entity is Updated, if the contact has a parent account, and that parent account is "on hold" then set the "taketheirshoes" flag on the contact record to true.</li>
</ul>
<h3 id="developer-jon-doe">Developer Jon Doe</h3>
<p>Jon Doe immediately gets to work on writing the plugin for those requirements. He produces the following plugin.</p>
<pre><code class="language-csharp">public class ReclaimCreditPlugin : IPlugin
{
public void Execute(IServiceProvider serviceProvider)
{
var executionContext = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
// 1. We must run only within a transaction
if (!executionContext.IsInTransaction)
{
throw new InvalidPluginExecutionException("The plugin detected that it was not running within a database transaction. The plugin requires a database transaction.");
}
// 2. Get the contact, check its parent account.
if (executionContext.InputParameters.Contains("Target") && executionContext.InputParameters["Target"] is Entity)
{
// Obtain the target entity from the input parameters.
var contactEntity = (Entity)executionContext.InputParameters["Target"];
// Get the parent account id.
var parentAccountId = (EntityReference)contactEntity["parentaccountid"];
// Get the parent account entity.
var orgServiceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
var orgService = orgServiceFactory.CreateOrganizationService(executionContext.UserId);
var parentAccountEntity = orgService.Retrieve("account", parentAccountId.Id, new ColumnSet("creditonhold"));
var accountOnHold = (bool)parentAccountEntity["creditonhold"];
if (accountOnHold)
{
contactEntity["taketheirshoes"] = true;
var tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
tracingService.Trace("Have indicated that we should take the shoes from contact: {0}", contactEntity.Id.ToString());
}
}
}
}
</code></pre>
<h4 id="good-job">Good Job?</h4>
<p>Take a moment to peer review the above code. Would you vindicate Jon Doe's effort? It seems to have all the required logic in all the required places, and it appears he has covered the list of requirements - although Jon doesn't check to make sure the entity being updated is definitely a contact entity. Within the confines of this blog post, we will assume that there is no possible danger that the plugin could ever be registered against the wrong entity.</p>
<p>So.. does it actually work?</p>
<h3 id="does-it-work">Does it work?</h3>
<p>Assuming you want to start haemorrhaging people's time across the organisation, one way to find out if this code works is to immediately go through the process of deploying it to a QA environment, getting someone to test it manually, and then repeating that cycle of Dev --> Deployment --> QA as often as necessary, until the tester gives the thumbs up.</p>
<p>If you want to go that route, feel free to skip the rest of this article. Otherwise read on, where sanity awaits!</p>
<h3 id="show-me-a-unit-test-already">Show me a Unit Test Already!</h3>
<p>Bad news for you. I could.. but I won't.</p>
<h3 id="why-wont-you-show-me-a-unit-test">Why won't you show me a unit test?</h3>
<p>In short, because I value my time. Just look at that code again for crying out loud! It's littered with dependencies on things that are only provided at runtime by Dynamics CRM - things like:</p>
<ul>
<li>IServiceProvider</li>
<li>IPluginExecutionContext</li>
<li>IOrganizationServiceFactory</li>
<li>IOrganizationService</li>
<li>ITracingService</li>
</ul>
<p><strong>WHAT THE HELL DO ANY OF THESE THINGS HAVE TO DO WITH THE ACTUAL REQUIREMENTS THAT I <em>NEED</em> TO TEST???</strong></p>
<p>Listen.. I read the requirements for this plugin. I read them at least one thousand times. In fact, I wrote them. Here they are again:</p>
<blockquote class="blockquote">
<ul>
<li>It must run only within a transaction with the database.</li>
<li>When a Contact entity is Updated, if the contact has a parent account, and that parent account is "on hold" then set the "taketheirshoes" flag on the contact record to true.</li>
</ul>
</blockquote>
<p>So with that in mind, can you please show me the requirement dictating: When a contact is updated, it is of utmost importance to us as a business that the plugin looks at the <code>IPluginExecutionContext</code> and grabs the <code>IOrganizationServiceFactory</code>.</p>
<p>Or please show me where the requirements state: When a contact is updated, the plugin absolutely must interact with the <code>IServiceProvider</code>, because otherwise, you know.. our business just won't function anymore.</p>
<p>No my friends. The requirements do not say <em>any of that</em>. I am in the business of testing against the requirements.</p>
<h4 id="why-is-that-a-problem">Why is that a problem?</h4>
<p>The problem is not obvious at first glance. It is definitely technically possible to mock / fake all of those services at unit test time. You can use something like RhinoMocks or another mocking library to mock out <code>IServiceProvider</code> for the purposes of your test. You would then have to mock out all the calls made to <code>IServiceProvider</code>, so that it returns your other mocked services, such as a mock <code>IPluginExecutionContext</code>, and so on - and down the rabbit hole you go.</p>
<p>The problem is one of <em>effort</em>. This approach, although technically possible, requires significant <em>effort</em>. You would have to mock a tonne of runtime services and interactions. We have to ask ourselves: is all that effort really necessary? Sometimes it may be, but most of the time it isn't. In this instance it definitely isn't, and I will explain why.</p>
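<p>To give you a flavour of that rabbit hole, here is a rough sketch of the mock plumbing you would need before you could even reach the first line of business logic. I'm using Moq here rather than RhinoMocks, purely for illustration - the point is the sheer amount of setup chaining, not the exact API:</p>
<pre><code class="language-csharp">// Illustrative only: the plumbing needed just to fake the CRM runtime.
// Every service the plugin touches must be mocked and wired into IServiceProvider.
var context = new Mock&lt;IPluginExecutionContext&gt;();
context.SetupGet(c =&gt; c.IsInTransaction).Returns(true);
context.SetupGet(c =&gt; c.UserId).Returns(Guid.NewGuid());
context.SetupGet(c =&gt; c.InputParameters)
       .Returns(new ParameterCollection { { "Target", new Entity("contact") } });

var orgService = new Mock&lt;IOrganizationService&gt;();
orgService.Setup(s =&gt; s.Retrieve("account", It.IsAny&lt;Guid&gt;(), It.IsAny&lt;ColumnSet&gt;()))
          .Returns(new Entity("account") { ["creditonhold"] = true });

var orgServiceFactory = new Mock&lt;IOrganizationServiceFactory&gt;();
orgServiceFactory.Setup(f =&gt; f.CreateOrganizationService(It.IsAny&lt;Guid?&gt;()))
                 .Returns(orgService.Object);

var serviceProvider = new Mock&lt;IServiceProvider&gt;();
serviceProvider.Setup(sp =&gt; sp.GetService(typeof(IPluginExecutionContext)))
               .Returns(context.Object);
serviceProvider.Setup(sp =&gt; sp.GetService(typeof(IOrganizationServiceFactory)))
               .Returns(orgServiceFactory.Object);
serviceProvider.Setup(sp =&gt; sp.GetService(typeof(ITracingService)))
               .Returns(new Mock&lt;ITracingService&gt;().Object);

// ...and only NOW can you call plugin.Execute(serviceProvider.Object).
</code></pre>
<p>And that is the <em>happy path</em> - every new branch in the plugin means more setups.</p>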
<h3 id="lets-use-the-requirements-to-write-the-plugin-in-pseudo-code">Let's use the requirements to write the plugin, in pseudo code.</h3>
<p>With those requirements in mind - forget everything you know about Dynamics CRM and write the ideal pseudo code that would implement them. This is the actual logic we care about testing.</p>
<p>PSEUDO CODE:</p>
<pre><code class="language-csharp">if (!IsRunningInTransaction)
{
Throw "Plugin requires a transaction."
}
If (IsUpdateOf("contact"))
{
var contact = GetTargetEntity();
var account = GetAccountForContact(contact);
var isOnHold = (bool)account["creditonhold"];
if(isOnHold)
{
contact["taketheirshoes"] = true;
}
}
</code></pre>
<h3 id="look-at-that-pseudo-code-do-you-see-any-runtime-services">Look at that Pseudo Code - Do you see <em>any</em> runtime services?</h3>
<p>Notice how it contains only the logic we really care about testing - the logic described by the requirements. It contains no needless fluff: no <code>IServiceProvider</code>, no <code>IPluginExecutionContext</code>. It looks very simple, very basic. If we could actually write a CRM plugin like this, it would be about 1.5 million times easier to test. Well, we can.</p>
<h3 id="isolating-out-dependencies-is-the-key-to-unit-testing">Isolating out dependencies is the key to unit testing.</h3>
<p>Yes, it's true folks, you heard it here first: the fewer dependencies you use directly in your methods, the easier they are to unit test.</p>
<p>With this principle in mind, let's revisit our plugin and refactor it to remove some dependencies.</p>
<h3 id="new-and-improved-plugin">New and Improved Plugin</h3>
<pre><code class="language-csharp"> public class ReclaimCreditPlugin2 : IPlugin
{
private IServiceProvider _ServiceProvider;
public void Execute(IServiceProvider serviceProvider)
{
_ServiceProvider = serviceProvider;
Execute();
}
/// <summary>
/// This is the method containing the business logic that we want to be able to assert at unit test time.
/// </summary>
public void Execute()
{
// 1. We must run only within a transaction.
if (!IsInTransaction())
{
throw new InvalidPluginExecutionException("The plugin detected that it was not running within a database transaction. The plugin requires a database transaction.");
}
// 2. Get the contact.
var contact = GetTargetEntity();
if (contact == null)
{
return;
}
// 3. Get the parent account for the contact.
var parentAccount = GetAccountEntity(contact);
if (parentAccount == null)
{
return;
}
// 4. If credit on hold, set taketheirshoes.
var accountOnHold = (bool)parentAccount["creditonhold"];
if (accountOnHold)
{
contact["taketheirshoes"] = true;
}
}
/// <summary>
/// Returns the parent account entity for the contact.
/// </summary>
/// <param name="contact"></param>
/// <returns></returns>
protected virtual Entity GetAccountEntity(Entity contact)
{
// Get the parent account id, if the contact has one.
if (!contact.Contains("parentaccountid"))
{
return null;
}
var parentAccountId = (EntityReference)contact["parentaccountid"];
// Get an instance of the IOrganisationService.
var orgServiceFactory = (IOrganizationServiceFactory)_ServiceProvider.GetService(typeof(IOrganizationServiceFactory));
var executionContext = (IPluginExecutionContext)_ServiceProvider.GetService(typeof(IPluginExecutionContext));
var orgService = orgServiceFactory.CreateOrganizationService(executionContext.UserId);
// Get the account entity, with only the column / attribute that we need.
var parentAccountEntity = orgService.Retrieve("account", parentAccountId.Id, new ColumnSet("creditonhold"));
return parentAccountEntity;
}
/// <summary>
/// Returns the current "Target" entity that the plugin is executing against.
/// </summary>
/// <returns></returns>
protected virtual Entity GetTargetEntity()
{
var context = (IPluginExecutionContext)_ServiceProvider.GetService(typeof(IPluginExecutionContext));
if (context.InputParameters.Contains("Target") && context.InputParameters["Target"] is Entity)
{
var contactEntity = (Entity)context.InputParameters["Target"];
return contactEntity;
}
return null;
}
/// <summary>
/// Returns whether the plugin is currently enrolled within a database transaction.
/// </summary>
/// <returns></returns>
protected virtual bool IsInTransaction()
{
var context = (IPluginExecutionContext)_ServiceProvider.GetService(typeof(IPluginExecutionContext));
return context.IsInTransaction;
}
}
</code></pre>
<h3 id="what-just-happened">What just happened?</h3>
<p>I applied a technique called <a href="http://taswar.zeytinsoft.com/2009/03/08/extract-and-override-refactoring-technique/">Extract and Override</a> to remove the concrete references to all of those CRM runtime-only services from within the Execute method. They are now referenced only within virtual methods, which can be overridden at unit test time.</p>
<p>For example, rather than having the following code directly within the Execute method:</p>
<pre><code class="language-csharp">
var executionContext = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
// 1. We must run only within a transaction
if (!executionContext.IsInTransaction)
{
// shortened for brevity
}
</code></pre>
<p>It has been replaced by a call to a virtual method:</p>
<pre><code class="language-csharp"> if (!IsInTransaction())
{
}
</code></pre>
<p>Because the interactions with the various CRM runtime services now occur within virtual methods, we no longer need to mock them at unit test time. Say goodbye to having to mock up <code>IPluginExecutionContext</code>, <code>IServiceProvider</code> or <em>any</em> of the CRM runtime services. All we need to do now is override the various virtual methods that our Execute() method calls, and return appropriate values at test time.</p>
<h3 id="ok-so-now-will-you-show-me-a-unit-test">Ok so - Now will you show me a Unit Test??</h3>
<p>Certainly Sir / Madame. Now that I can write one within a few minutes as opposed to a few hours, your wish is my command:-</p>
<p>For the purpose of our unit tests, all we do is create a class that derives from our original plugin class but overrides the various virtual methods to provide different values at test time.</p>
<pre><code class="language-csharp"> public class UnitTestableReclaimCreditPlugin : ReclaimCreditPlugin2
{
public UnitTestableReclaimCreditPlugin()
{
AccountIsOnHold = false;
IsRunningInTransaction = false;
ContactEntity = new Entity("contact");
}
protected override Entity GetTargetEntity()
{
ContactEntity["parentaccountid"] = new EntityReference("account", Guid.NewGuid());
return ContactEntity;
}
protected override Entity GetAccountEntity(Entity contact)
{
var accountEntity = new Entity("account");
accountEntity["creditonhold"] = AccountIsOnHold;
return accountEntity;
}
protected override bool IsInTransaction()
{
return IsRunningInTransaction;
}
public bool AccountIsOnHold { get; set; }
public bool IsRunningInTransaction { get; set; }
public Entity ContactEntity { get; set; }
}
</code></pre>
<h3 id="and-here-are-the-unit-tests">And here are the Unit Tests</h3>
<pre><code class="language-csharp">
[TestFixture]
public class ReclaimCreditPluginUnitTests
{
public ReclaimCreditPluginUnitTests()
{
}
[Test]
[ExpectedException(typeof(InvalidPluginExecutionException),
ExpectedMessage = "The plugin detected that it was not running within a database transaction",
MatchType = MessageMatch.Contains)]
public void Should_Only_Run_Within_Transaction()
{
// arrange
var sut = new UnitTestableReclaimCreditPlugin();
sut.IsRunningInTransaction = false;
// act
sut.Execute();
}
[Test]
public void Should_Take_Shoes_When_Credit_On_Hold()
{
// arrange
var sut = new UnitTestableReclaimCreditPlugin();
sut.IsRunningInTransaction = true;
sut.AccountIsOnHold = true;
// act
sut.Execute();
//assert
Assert.That(sut.ContactEntity["taketheirshoes"], Is.EqualTo(true));
}
[Test]
public void Should_Not_Take_Shoes_When_Credit_Not_On_Hold()
{
// arrange
var sut = new UnitTestableReclaimCreditPlugin();
sut.IsRunningInTransaction = true;
sut.AccountIsOnHold = false;
// act
sut.Execute();
//assert - the attribute should never have been set (the indexer would throw if we read it).
Assert.That(sut.ContactEntity.Contains("taketheirshoes"), Is.False);
}
}
</code></pre>
<h3 id="wrapping-up">Wrapping Up</h3>
<p>I hope I have demonstrated a simple plugin with a simple set of unit tests. More importantly, I hope I have demonstrated that although it may be technically possible to write a unit test for an existing plugin by mocking up every CRM runtime service and interaction that the plugin makes, just because such a thing is possible doesn't mean you should do it. First, the work has to be justified. To justify what is necessary, examine the requirements, examine the plugin code, and be absolutely clear on what it is you want to cover in your unit tests. With that in mind, refactor the plugin code to eliminate fluff (extraneous concrete references to dependencies that are surplus to the requirements you want to test). Use techniques like <code>Extract and Override</code> to let you substitute these dependencies easily at test time. When you do this, you may be surprised at how much simpler it becomes to write unit tests. I would also recommend reading a book on unit testing; I found <a href="http://artofunittesting.com/">The Art of Unit Testing</a> very educational on this topic.</p>
<p>The purpose of this post was to look at the code for a fairly typical looking CRM plugin, and examine how to implement a unit test with the least possible effort. Reduced Effort == Reduced Person Hours == Reduced Cost.</p>
<p>Remember, this is Unit Testing, not Integration testing - so at test time - there is no CRM!</p>
<h2 id="a-proclamation"><a href="http://darrelltunnell.net/blog/2014/11/16/a-proclamation">A Proclamation</a></h2>
<p><em>2014-11-16</em></p>
<p>On this day the 16th November 2014, let it be known that Darrell's blog was rendered forth unto the internet.</p>
<!--more-->
<p>Let the knowledge and manner of its creation also be recorded, lest it be lost from mortal ken. Thus humanity, need not be stricken in ignorance and awe, and need not refer to my blogging website as "Witchcraft" or "Devilry".</p>
<h3 id="the-mechanism-of-creation">The Mechanism of Creation</h3>
<ul>
<li>Inspiration taken from JakeGinnivan's blog which is powered by OctoPress: <a href="http://jake.ginnivan.net/">http://jake.ginnivan.net/</a></li>
<li>Purchased a domain name: darrelltunnell.net</li>
<li>Followed this OctoPress documentation: <a href="http://octopress.org/docs/setup/">http://octopress.org/docs/setup/</a></li>
<li>Hosted on <a href="https://github.com/">https://github.com/</a></li>
</ul>
<h3 id="a-revelation">A revelation</h3>
<p>And so it came to pass that Darrell Tunnell's blog was incredibly useful to others. Darrell's blog is currently read by 3.1 million people. Web browser requests for Darrell's blog constitute approximately 85.2% of all browser requests made, worldwide. NetFlix was eventually absorbed into Darrell's Blog at the end of 2014, through internet osmosis.</p>