
Octopus Deploy - Get Current System Time in Automated Emails

Scope

We are using Octopus Deploy 3.2.6 for our continuous deployment process. We would like to set up an automated email that notifies our stakeholders 30 minutes in advance of any deployment job. An email will look something like this.



Unfortunately this cannot be easily achieved, as Octopus Deploy does not have any system variable that prints out the current time.

Solution

Although there are no system variables readily available to us, we can make our own by using output variables.

In our deployment process, we add an extra step before the email step that initializes any variables we want.


We can use a PowerShell script to get the system time and store it as an Octopus output variable.

$now = [System.DateTime]::Now.ToString("f")
Set-OctopusVariable -name "CurrentDateTime" -value $now

In our email body, we can then reference the current time like this.

#{Octopus.Action[Init System Variables].Output.CurrentDateTime}

And the result will look like this.


Conclusion

It was a lot of effort just to get the current time. However, this approach is very flexible, as I can now use anything that is gettable via PowerShell.

Moving Jenkins from One Machine to Another Machine

Moving Jenkins

Moving Jenkins from one machine to another machine was fairly straightforward; it is pretty much just copying the folder from one machine to the other, and the process is documented in the Jenkins wiki.

java.nio.file.DirectoryNotEmptyException

The migration was smooth. I followed my previous post to move our Jenkins server to another instance, and we only got one console error related to the migration.



In the JENKINS_HOME\jobs\JOB_NAME folder, I manually deleted the builds\lastSuccessfulBuild and builds\lastStableBuild folders, and Jenkins is happily building again :)
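The manual cleanup above can also be scripted. A minimal sketch in bash; it simulates the JENKINS_HOME layout in a temp directory so it is safe to run anywhere, and MyJob is a placeholder job name. On a real server, point JOB_DIR at the affected job under your actual JENKINS_HOME (on Windows, Remove-Item does the same thing).

```shell
# Simulate JENKINS_HOME/jobs/<job>/builds in a temp dir (placeholder only);
# on a real server, set JOB_DIR to the affected job's builds folder instead.
JENKINS_HOME=$(mktemp -d)
JOB_DIR="$JENKINS_HOME/jobs/MyJob/builds"
mkdir -p "$JOB_DIR/lastSuccessfulBuild" "$JOB_DIR/lastStableBuild"

# The actual fix: delete the stale build pointers so Jenkins can recreate them.
rm -rf "$JOB_DIR/lastSuccessfulBuild" "$JOB_DIR/lastStableBuild"
```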

Sass + Compass compile css with minification and map file


Scope

I am trying to mimic the way files are generated in the Bootstrap distribution, which contains the minified CSS file along with the map files, by using Bootstrap Sass.

Compass

When I compile in development mode, I get the normal CSS; when I compile in production mode, I get the minified CSS.

In my config.rb

output_style = (environment == :production) ? :compressed : :expanded

# After each compile, rename the compressed output to *.min.css
on_stylesheet_saved do |file|
  if File.exists?(file) && (output_style == :compressed)
    filename = File.basename(file, File.extname(file))
    FileUtils.mv(file, css_dir + "/" + filename + ".min" + File.extname(file))
    puts "   minify " + css_dir + "/" + filename + ".min" + File.extname(file)
  end
end

The output will look like this between development (default) and production.



Map File

The map files can be generated via a command-line switch.

compass compile --sourcemap

Or by setting it in config.rb.

sourcemap = (environment == :production)

Setting Up Octopus Deploy for Jenkins with nopCommerce Projects

Scope

After our previous success at setting up our Jenkins build server, we are now looking into an automatic deployment process for our nopCommerce projects by using Octopus Deploy.

Octopus Deploy

Octopus Deploy installation is straightforward, and the wiki page is very detailed.

After setting up our server and tentacles, we have to bundle our artifacts into NuGet packages for Octopus Deploy to consume.

NuGet

NuGet Spec

First, we will initialize a NuGet spec file for our project by running a command from the project folder.
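For reference, the standard NuGet CLI command for this step looks like the following; Nop.Web is an example project name here, and nuget.exe is assumed to be on the PATH.

```shell
# Run from the project folder; generates a skeleton Nop.Web.nuspec to edit.
nuget spec Nop.Web
```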

NuGet Pack

Personally, I prefer to use NuGet Pack for all my packaging, as it gives me more power and better control. Unfortunately, I ran into an issue related to the way nopCommerce structures its project folders, which I cannot change.

It is unfortunate that I have to combine the build and packaging processes into one command; I would have preferred to use NuGet Pack as a separate step for finer-grained control and easier troubleshooting. Life is imperfect, and I resorted to my second option - OctoPack.

NuGet Explorer

NuGet Explorer is a tool for examining .nupkg NuGet packages. It is very helpful for troubleshooting what is inside a package.

OctoPack

From Visual Studio, we installed OctoPack via NuGet on our Web Project only. 


On our Jenkins build server, we created a new job that is only responsible for building the artifacts. It is similar to the existing build-only job, but with extra parameters in the MSBuild command.


This will trigger OctoPack to run, use the Release configuration in .NET, and upload the package to our Octopus Server package folder using an API key.

MSBuild .\NopCommerce.sln 
  /p:Configuration=Release 
  /p:RunOctoPack=true 
  /p:OctoPackEnforceAddingFiles=true 
  /p:OctoPackPublishPackageToHttp=http://yourUrl/nuget/packages 
  /p:OctoPackPublishApiKey=API-YOURAPIKEY

The packages will become available in our Octopus Library.



On our Jenkins server, we ran into an issue where every subsequent build would always fail, with an error message related to OctoPack.

Pushing Nop.Web 3.3.5791.28374 to 'http://octopus/nuget/packages'...

  Failed to process request. 'Bad Request'.

EXEC : The remote server returned an error : (400) Bad Request..

If I delete the package from the Octopus Library, the build will work again.

To resolve this, we need to increment the version number every time we build an artifact. The easiest way to do this is to add an OctoPack switch to our MSBuild command.

/p:OctoPackPackageVersion=3.3.${BUILD_NUMBER}

nopCommerce

One of the biggest challenges in this exercise is not Octopus Deploy itself but getting it to work with nopCommerce. As I mentioned before, the nopCommerce structure is not NuGet friendly, and OctoPack uses NuGet Pack under the hood anyway, so while I was able to dodge a few bullets and get things moving, I still had to resolve some known issues.

I downloaded the nopCommerce 3.6 stable version, tried to pack it, and got the following.



Noticeably, there are many missing files:

  • Administration folder
  • Nop.Admin.dll (from the bin folder)
  • Plugins folder

The reason is that we are only building Nop.Web.proj. Nop.Admin.proj is a standalone project, as are all the plugin projects.

There are different approaches to this, but I chose the easiest one just to get things going.

  • Include the Nop.Admin project as a reference in the Nop.Web project.
  • Add an extra step that builds the NopCommerce.sln. This will pre-build all the plugins into binaries.
  • Manually include the missing folders in the nuspec so they get picked up during the NuGet / OctoPack process.
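To illustrate the last point, the kind of nuspec entries involved might look like this. This is a sketch only; the src paths are assumptions based on the default nopCommerce output layout, not the exact nuspec used here.

```xml
<!-- Inside the <package> element of the .nuspec; paths are illustrative. -->
<files>
  <file src="Administration\**\*.*" target="Administration" />
  <file src="bin\Nop.Admin.dll" target="bin" />
  <file src="Plugins\**\*.*" target="Plugins" />
</files>
```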



Web Config Transformation

In Octopus Deploy, we can perform a web config transformation after we deploy. I have my Octopus project names set up to align (beautifully) with our project names in Visual Studio, so that I can use Octopus variables when performing a transformation like this.

Web.Staging#{Octopus.Project.Name}.config => Web.config



During the transformation, we ran into the following error.

No element in the source document matches 'my_config_keys'

The error is saying that a key my_config_keys is in our Web.StagingReebok.config but not in Web.config. Octopus tries to transform the key into Web.config and throws a null exception.

Multi Tenant

Multi-tenant support is not yet available in Octopus Deploy 3.2. According to the roadmap, it should be available in 3.3.

In the meantime, Octopus Deploy has documented 2 possible ways to implement a deployment strategy in a multi-tenant environment. I chose "project per brand" because I have a staging environment for each brand, and it would be messy to put all the machines into 1 project.


Octopus Deploy Process

We have set up a simple deployment process as below.
  1. Send out a notification email
  2. Take the app offline
  3. Deploy the package
  4. Clean up the redundant transformation configs
  5. Bring the site back online
  6. Send out a notification email (again)


Octopus Step Template

I have created some custom step templates that were not available in the community library.

Octopus Notification Integration

I also added a Slack integration that sends a notification to all developers upon deployment. I find notifications a better channel for communication than automated emails clogging up the inbox. As a teaser, our notification channel looks like this.


Conclusion

It is a tedious and enormous job to deploy 20+ websites in our web farm environment. Octopus Deploy was not difficult to set up, and all our deployment jobs now run automatically from our continuous deployment server. Not only are we saving time, but we are also able to eliminate possible human errors in our deployment process.

Setting Up Jenkins for GitHub


Scope

Previously, I set up Jenkins with TFS without much issue. This time I am setting up Jenkins to work with GitHub.

Git

We will need to install the GitHub Plugin on our Jenkins server. We can verify the installation from Manage Jenkins -> Manage Plugins.



Then, we will need to install the Git Executable from git-scm.

We then need to set up Git via Manage Jenkins -> Configure System -> Git -> Git Installations -> Path to Git executable.



We will create a Jenkins job and fill in the Source Code Management information similar to the following.


Test the build job to make sure we can get all the source code into the workspace of our Jenkins server before we move on to the next stage.


MSBuild

After setting up our job to successfully clone from the Git repository, we will now set up our build process as part of our continuous integration.

Manage Jenkins -> Configure System -> MSBuild



Notice there is a warning that MSBuild.exe is not a directory. However, if we omit the executable part, we will get an error like the following. I believe this is a minor bug in Jenkins. That's why it is always important to check the Console Output to see what the error actually is.


Then we set up our build job.


Troubleshooting

.Net 4.5.1

I encountered some errors about .NET 4.5.1 not being found.

C:\Windows\Microsoft.NET\Framework\v4.0.30319\Microsoft.Common.targets(983,5): warning MSB3644: The reference assemblies for framework ".NETFramework,Version=v4.5.1" were not found.

I tried to download the Targeting .Net Platforms, but I was still getting the same error.

Eventually I found this solution: I just copied the reference assemblies from my dev machine to our Jenkins server, into the exact same location.

C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.5.1

VS2012

error MSB4019: The imported project "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v11.0\WebApplications\Microsoft.WebApplication.targets" was not found.

Our projects were upgraded from VS2012 to 2013, and there seems to be an issue with the extra properties in the .csproj file.

The short fix is to specify the VisualStudioVersion parameter to MSBuild.
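On the command line, that might look like the following; 12.0 corresponds to VS2013, so adjust it to the version installed on your build server.

```shell
MSBuild .\NopCommerce.sln /p:VisualStudioVersion=12.0
```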

Manage Jenkins -> Configure System -> MSBuild



Or the long fix is to clean up the .csproj files.

NuGet

error : The build restored NuGet packages. Build the project again to include these packages in the build.

This reminds me of one of the NuGet issues we had when integrating Jenkins with VSO. However, a perfect solution does not always solve the problem in an imperfect world; that's because our current process does not Restore NuGet Packages the Right Way.

Thanks to Jenkins, there is a workaround for it. We can add a Windows batch command step before the MSBuild step.

C:\Tools\nuget.exe restore NopCommerce.sln





Conclusion

Had a fun time setting all this up (again). Plus one to our Joel Test score!



Restore NuGet Packages the Right Way

The Infamous Right Click to Enable NuGet Package Restore

I was recently engaged in a conversation about NuGet, and one of the developers asked me, "Did you right-click the solution to enable NuGet restore?" I was a little shocked and uneasy, as this is a very, very old and obsolete way of using NuGet and is not relevant to today's practice.

Let's learn the right way together!

The NuGet Team documented 3 ways to restore packages.
  1. Automatic Package Restore in Visual Studio
  2. Command-Line Package Restore
  3. MSBuild-Integrated Package Restore
As the NuGet Team explained:
"Prior to NuGet 2.7, an MSBuild-Integrated Package Restore approach was used and promoted. While this approach is still available, the NuGet team suggests using the Automatic Package Restore and Command-Line Package Restore instead."
Or as other people tweeted:
"Every Time someone enables #nuget package restore on a solution, a kitten dies. Learn the new workflow!"

Verdict

What is the best practice? Automatic Package Restore is the number one recommended method. You may, however, find yourself using Command-Line Package Restore if you are building a CI server, which is not so bad. The worst possible scenario is that you are still doing "Right Click to Enable NuGet Package Restore". Say hello to 2015 :)
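For a CI server, the command-line restore is a single call before the build. As a sketch, where the solution name is an example and nuget.exe is assumed to be on the PATH:

```shell
nuget restore MySolution.sln
```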



Mobile App - Hello World to PhoneGap Cordova + Ionic

Scope

We are going to build some apps for Android and iOS. As I have never done that before, I am going to build a really simple Hello World app to introduce myself to the world of apps.

Prerequisite

Installations

There are quite a few things to prepare before we can even say hello to the world.
There are many options when choosing an IDE. Popular ones are as follows.

Cordova

Create

First, we will create the app using the command line. The app can be made compatible with Android and iOS by adding the platforms to it.

$ cordova create HelloWorld com.helloworld HelloWorld
$ cd HelloWorld
$ cordova platform add android

Setup

Set up the ANDROID_HOME environment variable. The SDK path can be found by opening the Android SDK Manager.


Therefore, we will set it up as follows.


Build

We will use our command line interface to build our app as follows.

$ cd HelloWorld
$ cordova build android

If you are a little unlucky, you might run into an error like this: Error: Please install Android target: "android-22".


This indicates we are targeting an incorrect version of Android. We need to change our target in both of these files.

HelloWorld/platforms/android/project.properties 
HelloWorld/platforms/android/CordovaLib/project.properties
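The change itself is a one-line edit in each of the two files above; for example, to target an SDK level that is actually installed (android-19 here is only an example, so use whatever version your SDK Manager shows).

```properties
# HelloWorld/platforms/android/project.properties (and the CordovaLib copy)
target=android-19
```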

It will look like this if built successfully.

Test

As part of the Android SDK installation, it comes with the Android Virtual Device (AVD) Manager.

Choose a device definition and click Create AVD.


We will launch the AVD that we just created, and deploy our app to the emulator by running this command.

$ cordova run android

If everything ran smoothly, by default it should load up the Cordova Device Ready page from our /HelloWorld/www/index.html.


Hello World

Before we begin, we will need to pick a mobile framework to work with. I have chosen the Ionic Framework.

Firstly, we will install the packages.

$ npm install -g cordova ionic

Then, we will scaffold our site.

$ ionic start HelloIonic blank

Similar to the Cordova commands, we can build and run our app with Ionic commands.

$ cd HelloIonic
$ ionic platform add android
$ ionic build android
$ ionic run android

This time, we have a default page from the Ionic framework.



We are going to create some simple notifications. There are infinite ways of doing this; the simplest form is this.

alert('Hello World!');

However, we do not have much control over the default alerts. The next step is to utilize some of the available plugins for a bit more customization, e.g. https://github.com/apache/cordova-plugin-dialogs

function alertCallBack() {
 alert('Hello World');
}

navigator.notification.alert(
 'The World is Amazing!',  // message
 alertCallBack,            // callback invoked when dismissed
 'My App',                 // dialog title
 'Done'                    // button name
);

The first message to pop up is 'The World is Amazing!'; the callback function is then invoked when the message is dismissed, so the alert message 'Hello World' pops up.


 

USB Device

We can run the app not just on the emulator, but on a USB-connected device as well.

Firstly, we will need to setup our device for development.

For my Samsung device, I needed extra Windows drivers, which I could hardly find on the Samsung support page; with a bit of googling, I found the hidden gem on their developer site.




After installing the drivers and connecting our device to the computer via USB, we can test whether our device is available for debugging by running an adb (Android Debug Bridge) command.

$ <android-sdk>/adb.exe devices


The device is now visible to us and we are just one step away. On our device, we need to authorize our computer for USB debugging. After connecting the device to the computer, a screen similar to this should pop up on the device.


We will run our adb command again. It should now say device once it is authorized.


With our Ionic commands, we can now explicitly specify that we want to run our app on our device.

$ ionic run --device

It will install and run the app on our USB-attached device.


 


Conclusion

How about iOS? It happens that iOS builds can only be done on OS X, and I don't have a Mac. I tried to add the iOS platform in Cordova on my Windows machine, and I got the following error. There are some hackintosh ways I could get around the Apple issue, but I will leave that for another project next time.




It took me a while to get there, but it was all worth it. As a web developer, I am now able to use my existing web knowledge to start building multi-platform mobile apps.

nopCommerce - AWS Migration

Scope

We are moving our existing websites to AWS EC2. From a high-level perspective, our plan is as follows.
  • Move the websites to EC2
  • Move the database to RDS
  • Content images uploaded via the CMS will go to S3
  • Product images on the websites are moved to S3
  • Move the console apps to a t2.micro instance
We are using nopCommerce 3.3.

Implementation

Websites and Console Apps

Moving the sites and apps to cloud instances is pretty straightforward. There are no code changes required in nopCommerce.

Database

We used the SQL Database Migration Wizard tool; also pretty straightforward.

S3

Migrating to S3 required some code changes, but we did not meet much resistance, thanks to the advantages of OO and polymorphism.

Previously, we had already written a custom class that inherits from PictureService.

public class TAPictureService : PictureService, ITAPictureService, IPictureService

Now, my new AWS class can inherit from my custom class.

public class AwsPictureService : TAPictureService, ITAPictureService, IPictureService

We will then override the image read / write methods to connect to AWS. This was mentioned previously in my other post, Amazon SDK S3 with .Net.

In our DependencyRegister, we now register our new AWS Service.

builder.RegisterType<AwsPictureService>().As<IPictureService>();

Now my system can support either local or S3 images by changing the registered type via DI.

How to remove all namespaces from XML with C#?

Scope

How to remove all namespaces from XML with C# is a commonly asked question in development. Removing the namespace is not best practice; I see it as an anti-pattern to remove it rather than utilize it, but there are cases where this can be an 'acceptable' solution.

One example could be this: Converting a DTD / XSD to strongly typed C# class.

In this scenario, I was working with multiple servers whose returned namespaces were different each time, and the serializer failed to validate the namespace. Given that the returned XML is so short-lived, it was not worth the effort to implement multiple or dynamic namespaces.

Solutions

After looking it up on Stack Overflow, I was a little surprised that the first few answers seemed to have issues with attributes being removed. They did not seem to serve the purpose of what was asked: a clean, elegant and smart solution to remove namespaces from all XML elements.

Since no working solution was available to me, I attempted to write my own, as follows.

public static XElement RemoveAllNamespaces(this XElement element)
{
    return new XElement(element.Name.LocalName,
        element.HasAttributes ? element.Attributes().Select(a => new XAttribute(a.Name.LocalName, a.Value)) : null,
        element.HasElements ? element.Elements().Select(e => RemoveAllNamespaces(e)) : null,
        element.Value);
}

I have tested the code with the following XML, which has multiple levels of nodes and attributes.



Each line of the 4-line solution serves its purpose.

  1. First, it creates the root node
  2. Then it appends all the attributes
  3. Then it appends all the child elements
  4. Lastly, the value


The trick is in line 3, where it takes all the Elements() of the current element and pushes them back into the same function, thus recursively doing the same thing for all the nodes.

The magic that makes it work is the elegant way the XElement constructor works. I often find that using XElement ends up as a one-liner solution because of its constructor, so good coding style and line breaks will really help the next person who reads your code.

Does my solution look clean, elegant and smart to you? Share your thoughts :)

Converting a DTD / XSD to strongly typed C# class

We are looking to implement some CyberSource reporting functions so that our system can poll CyberSource for various reports and call different actions based on the results. After going through the documentation, we know the CyberSource portals provide us with the DTD definitions.

Prerequisite

Implementation

First, we will need to download the DTD from the CyberSource portal.


We want to generate the strongly typed model class from the DTD definition, but we will need to convert it to an XSD first. This can be done with the XML Create Schema function in Visual Studio.



We are now able to generate the C# class by using XSD.exe, which is available via the Visual Studio Developer Command Prompt.
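As a sketch of that step, the invocation looks like this; the .xsd file name is an example, and /classes tells XSD.exe to generate a C# class rather than a typed DataSet.

```shell
xsd CyberSourceReport.xsd /classes
```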


The generated file will look something like this.


Troubleshooting

[System.InvalidOperationException]: {"<xxx xmlns='some_uri'> was not expected."}

During development, we got an exception of "There is an error in XML document (0, 0)." with the inner exception "<Report xmlns='https://ebctest.cybersource.com/ebctest/reports/dtd/cdr.dtd'> was not expected."

This is because the DTD we downloaded has the namespace of the vendor's live site, so our strongly typed class has the live URI as its namespace too. When we connect to their sandbox, deserialization throws an exception because the namespaces don't match.

One approach would be to change the namespace dynamically according to the connection, but I chose an easier way.

First, I removed the namespace from the System.Xml.Serialization XML attributes in our generated strongly typed class.



We will also need to remove the namespace from the incoming XML before we do any deserialization. This can be achieved by creating an extension method that recursively removes all namespaces from all nodes.

private ConversionDetailReport GetConversionDetailReport(string xmlString)
{
    var serializer = new XmlSerializer(typeof(ConversionDetailReport));

    var xDoc = XDocument.Parse(xmlString);

    var xr = xDoc.Root.RemoveAllNamespaces().CreateReader();

    return (ConversionDetailReport)serializer.Deserialize(xr);
}