
Code Generation to Reduce Software Costs

May 25, 2014

I have developed software for many years now, and I am surprised at just how much money companies spend writing and maintaining code that would be more reliable, less costly, and more easily maintained if code generation tools were used instead. As American developers, the best way for us to compete on cost is to think smarter, unless we want to work every waking minute for less money and die young from stress-related disease. That said, code generation requires planning and careful thought, because there is a balance that must be maintained lest you incur technical debt in the form of future support costs.

If your code generation strategy does not accomplish the following objectives, you have it wrong:

  1. Code generation should reduce cost of development
  2. Code generation should reduce time of development
  3. Code generation should increase quality and reliability of code

When some people think of code generation they immediately think of software that produces the User Interface tier of an application based on a database schema. Although I am a prolific user of code generation, I will rarely use it in the UI tier (aside from ASP.NET MVC, which is technically generating HTML for you, or maintenance screens). The vast majority of the code generation I implement is for what I call “plumbing code”. In my opinion, a good candidate for this type of code generation has the following qualities:

  1. At least 90% of the time, the code follows the same repeatable pattern
  2. It requires little or no custom business logic, and little thought on the part of the developer
  3. It is part of an architecture you don’t want violated (for example, by bypassing layers)
  4. If it is in the UI tier, it is for maintenance or administrative views where usability is not a concern
  5. Other approaches wouldn’t solve your problem better

Where are some ideal places in your application to leverage code generation and who should maintain the code generation templates?

 

In many cases you will leverage code generation tools created by others, often in your data access layer, to generate entities or code that maps between entities and your database. Entity Framework and NHibernate are good examples of this. If there is an industry-accepted approach developed by a company responsible for maintaining the template, you will want to leverage their solution as often as possible and limit customization of the provided templates. That way, when you are ready to update to a newer release, the cost of labor is minimal.

 

Other areas that I focus on are the “edges” of an application, where code does little more than act as a traffic cop. I make it easy to override generated code in partial classes so that, in the rare event you must deviate from the default pattern, you are able to do so. Once you learn how to leverage code generation tools effectively, you will be tempted to abuse this power (which I admittedly have done in the past), so be selective. In short, don’t reinvent the wheel here, and don’t believe the extremists on either side who say that all code generation is bad or that everything should be generated!
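To make the override point concrete, here is a minimal sketch of the partial-class pattern. The names (CustomerRepository, Customer, OnSaving) are illustrative, not taken from any particular tool: the generated file calls an optional partial method, and hand-written logic lives in a separate file that you only create when you need to deviate.

```csharp
using System;

public class Customer
{
    public string Name { get; set; }
}

// --- CustomerRepository.generated.cs (emitted by the template; regenerated at will) ---
public partial class CustomerRepository
{
    public void Save(Customer customer)
    {
        OnSaving(customer);             // hook for hand-written logic
        // ... generated persistence code would go here ...
    }

    // Compiles to nothing unless the other half of the partial class implements it.
    partial void OnSaving(Customer customer);
}

// --- CustomerRepository.cs (hand-written; never touched by the generator) ---
public partial class CustomerRepository
{
    partial void OnSaving(Customer customer)
    {
        if (customer == null) throw new ArgumentNullException("customer");
    }
}
```

Because the generated file never changes by hand, you can re-run the template at any time without losing your customizations.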

 

My final piece of advice: unless your business is building code generation tools, do not create the tooling yourself; purchase a code generation framework instead. Your job should be adding business value for your customers, not creating a management nightmare for them for years to come. CodeSmith is a great platform that I used for years, though today I tend to use T4 because of its built-in support in Visual Studio (plus it is free and easily shared between developers). I suspect that Razor syntax will eventually replace T4, which would be great, because using MVC and Razor for code generation would be far easier and more maintainable than leveraging T4 (a template language with syntax similar to classic ASP).
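For a taste of what T4 looks like, here is a minimal template sketch. The entity names are hard-coded purely for illustration; a real template would read them from a database schema or model file. Saved as a .tt file in Visual Studio, this would emit one repository stub per entity:

```
<#@ template language="C#" #>
<#@ output extension=".cs" #>
<#
    // Illustrative only: a real template would pull these names from schema metadata.
    var entities = new[] { "Customer", "Order" };
    foreach (var entity in entities)
    {
#>
public partial class <#= entity #>Repository
{
    // Generated plumbing for <#= entity #> goes here.
}
<#
    }
#>
```

The `<# #>` control blocks and `<#= #>` expression blocks are what give T4 its classic-ASP flavor.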

 

If your company is interested in custom application development that is well architected and efficiently implemented, or if you are a developer with questions or ideas related to code generation, please submit a request for a free consultation or contact us directly at http://www.SDSElite.com.


nUnit ♥ MSTest

October 1, 2010

It’s nearly impossible to make everyone happy all of the time. When it comes to nUnit and MSTest, there are people who seem to be married to their test platform; I’ll admit that I’m rather attached to MSTest myself. Some of the reasons given seem to have more to do with people being anti-Microsoft than pro-nUnit. But the last thing I want to do is make people not want to test because they aren’t happy with the tools. The way I see it, any test, no matter what the platform, is much better than no test at all.

One thing I know about software development teams is that you’ve got to pick your battles, because you only get so many chips. I knew that I needed MSTest so I could capture metrics, conduct load testing, use built-in Visual Studio tools to get code coverage, and more easily test my code. I also knew that key players had their hearts set on nUnit. The nUnit test itself is not a bad test; I simply needed to leverage MSTest as a shell so I could do all these other great things.

Fortunately for our team, we embrace outside-the-box thinking and try to be flexible within reason. Because of the desire to solve both problems, we are now able to execute nUnit tests from both test harnesses while only needing to write one test. And it should work the opposite way as well (I haven’t tried it yet). Depicted below is proof that it works.

[Screenshots: the same test passing in nUnit and in MSTest]

 

The solution was to create a base class that abstracts the complexity of it all.  I didn’t want developers to manage tons of attributes, and I didn’t want them to have to change the way they write tests. 

In the base class I added the nUnit attributes, using custom classes I created that inherit from TearDownAttribute and SetUpAttribute. This workaround was required because the attributes weren’t being picked up in my inheriting class. By adding the AttributeUsage attribute on top of the custom attributes and setting the Inherited property to true, I was able to solve that problem.

Create the base class first

using System;
using nutest = NUnit.Framework;

[NUTestClass()] // For nUnit
public class MSTestAndNUnitCombo
{
    [NUSetup()]
    public virtual void Setup() { }

    [NUTearDown()]
    public virtual void TearDown() { }
}

// Inherited = true is what lets these attributes flow down to derived test classes.
[AttributeUsage(AttributeTargets.Method, Inherited = true, AllowMultiple = true)]
public class NUTearDown : nutest.TearDownAttribute { }

[AttributeUsage(AttributeTargets.Method, Inherited = true, AllowMultiple = true)]
public class NUSetup : nutest.SetUpAttribute { }

[AttributeUsage(AttributeTargets.Class, Inherited = true, AllowMultiple = true)]
public class NUTestClass : nutest.TestFixtureAttribute { }

Unfortunately, I was not able to abstract the MSTest attributes on the Setup and TearDown virtual methods because Microsoft, in their infinite wisdom, made those attributes sealed classes. So, although the developer will still see some attributes, we are still reducing complexity and ensuring the developer doesn’t forget two methods we always want implemented.

Next, create your unit test; Inherit from your Base Class; Alias your using statements; Add Attributes.

using mstest = Microsoft.VisualStudio.TestTools.UnitTesting;
using nutest = NUnit.Framework;

[mstest.TestClass()]
public class MSTestAndNUnitCombined : MSTestAndNUnitCombo
{
    [mstest.TestInitialize()]
    public override void Setup()
    {
        //TODO
    }

    [mstest.TestCleanup()]
    public override void TearDown()
    {
        //TODO
    }

    /// This test still works in NUnit AND MSTest.
    [mstest.TestMethod()]
    [nutest.Test()]
    public void FailingTest()
    {
        nutest.Assert.IsTrue(1 == 0);
    }

    /// This test still works in NUnit AND MSTest.
    [mstest.TestMethod()]
    [nutest.Test()]
    public void PassingTest()
    {
        nutest.Assert.IsTrue(1 == 1);
    }
}

Finally, you will need to add some XML to the first PropertyGroup section of your nUnit test project. If you added the project as an MSTest project first, this step isn’t required. Also, if this step doesn’t work (perhaps the GUIDs change in the future), simply add a new test project, look at its .csproj file, and snag its settings (right-clicking and choosing edit with Notepad works well).

Add the ProjectTypeGuids element to the first PropertyGroup

<PropertyGroup>
  <!-- ... more XML will be found here ... -->
  <ProjectTypeGuids>{3AC096D0-A1C2-E12C-1390-A8335801FDAB};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
</PropertyGroup>

There you have it. Once this is all done, you can do some really cool things with the Visual Studio Test Tools. One good tool is the Performance Wizard. Simply right-click your unit test in the MSTest window and select the Performance Test option. I have Ultimate, but I think this feature is also available in the Professional edition.

[Screenshots: Performance Wizard, screens 1–3]

After you run the profiler, you have some interesting metrics and a great baseline for measuring the true performance of your application. Obviously the simple test above is not real-world, but you can run this against any of your unit tests. I wasn’t able to gather metrics on my VM; however, the Load Test does essentially the same thing (and more), and I do have that working.

You will need to run Visual Studio as Administrator, and there are some limitations on which instrumentation you can run on a VM. If you are going to run load tests on a VM, make sure you give it as much memory as possible.

Metrics I just gathered using a Load Test

[Screenshot: metrics gathered from a Load Test]

 

That’s all for this post. If you have any questions, comments, or problems as you try this yourself, I’ll gladly answer them in the comments. I am publishing this blog to the Kindle; if you would like to subscribe, visit this link.
