Octopus Deploy with PublishedApplications

Normally, when you install OctoPack using NuGet, the contents of the OutDir (an MSBuild variable) end up in the NuGet package it creates for you. But when running in a TFS build this will give you trouble, as mentioned in http://help.octopusdeploy.com/discussions/problems/505-all-binaries-from-tfs-build-in-nuget-package

A suggested solution was to use the PublishedApplications NuGet package to build each project to its own directory, and I blogged as much yesterday… But that is only half a solution: yes, each project is built to its own directory, but OctoPack still takes the output of the TFS binaries folder as input for the packages. I found a way around this and I will describe it here.

I had to edit the OctoPack source. I changed how the dll decides whether a project is a web project. Normally it does this by looking for a ‘web.config’ file; now you can set the TreadEveryProjectAsApplication attribute of the CreateOctoPackPackage task to ‘true’, which makes OctoPack always use the content of the OutDir as input for the package. (It will ignore the content files in the project directory.)

I also removed the line where it excluded the files in the _PublishedWebsites folder, because I explicitly need these files.

I added this PropertyGroup to the OctoPack.targets file:

<PropertyGroup>
  <OctoPackDirectoryToPack Condition="'$(ExeProjectOutputDir)' != ''">$(ExeProjectOutputDir)</OctoPackDirectoryToPack>
  <OctoPackDirectoryToPack Condition="'$(WebProjectOutputDir)' != ''">$(WebProjectOutputDir)</OctoPackDirectoryToPack>
</PropertyGroup>


It sets the OctoPackDirectoryToPack property to either ExeProjectOutputDir or WebProjectOutputDir. I then use that property as input for the OutDir attribute of the CreateOctoPackPackage task.
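In OctoPack.targets that boils down to a task call along these lines; this is only a sketch of the two attributes discussed above, and the task’s other attributes stay exactly as they are in the original targets file:

<!-- Sketch: only the attributes discussed above are shown. -->
<CreateOctoPackPackage
    TreadEveryProjectAsApplication="true"
    OutDir="$(OctoPackDirectoryToPack)" />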

Download it here: http://bloggingabout.net/media/p/578418/download.aspx or check out the code at https://github.com/dmarckmann/OctoPack

Happy Coding!

PS: I later also added a GetVersionFromAssemblyFileVersion (bool) property for when you want to take the version from the AssemblyFileVersion of the PrimaryOutputAssembly, like we do. Download it from GitHub and build locally…

From ‘A-ha’ to ‘Ka-Ching’ with Sound Of Data

This post will be posted here and on the site of Sound of Data as well (http://soundofdata.nl/en/nieuws)

As of February 25th I started as Senior Developer at Sound of Data. For those who do not know me, I’ll briefly introduce myself.

I am 37 years old and I live in Goedereede-Havenhoofd. (That’s here). Writing code has always been a hobby and 14 years ago I managed to turn my hobby into work and I’ve been hobbying ever since.

After 7.5 years working for TellUs, a leader in online (sales) lead generation, it was time for a change. I was lucky to be contacted by Sound of Data because of my affinity with CQRS and Event Sourcing.

Their entire platform has been built on this architectural pattern and they could do with an extra senior developer. I soon learned that their implementation of CQRS & ES is okay, but not yet fully complete. I hope to be able to lend a hand in completing the implementation. Then we can enjoy all the benefits of this pattern.

This isn’t my first priority, though. I saw that SOD has some issues when it comes to deployment, so I made it my mission to get some Application Lifecycle Management in place and take the first steps towards Continuous Delivery. The idea of this practice is to make the time between ‘A-ha’ (the idea) and ‘Ka-ching’ (the release to market) as short as possible by automating and standardizing releases. This will help us bring our customers closer to their customers and bring us one step closer to world domination in that area.


Happy coding!

Windows 8 – Html to RichTextBox Content

As I explained in my last posts (here and here), I want to use Diffbot to implement my offline reading feature. I want the text to look as close to the real website as I can get it. This can be done with the RichTextBlock control, but it supports only a very limited set of xaml elements. Read about it on http://msdn.microsoft.com/en-us/library/windows/apps/windows.ui.xaml.controls.richtextblock.

Luckily Diffbot can send me the text as html, so the styling isn’t lost. All I needed was a way to convert html to xaml. If you search NuGet for ‘RichTextBlock’ you get two results: RichTextBlock.Html2Xaml and WinRT Html2Xaml Converter. I tried them both, but the first uses an xslt template to parse the html, and that only works if the html can be processed as xml. If only every website were that tidy… So I ended up using the latter, which uses HtmlAgilityPack to parse the html.

WinRT.Html2Xaml

I like the general idea of this project. Check out the code on https://winrthtml2xaml.codeplex.com/. Here’s how it works:

There is an attached property for the Html in a class called Properties, so we can bind a string to it. Whenever the Html changes, the HtmlChanged event handler is called. This handler uses the converter to convert the html to xaml, sets that on a new RichTextBlock and then moves all the blocks from the new RichTextBlock into the existing one. Works like a charm.

The Html2XamlConverter is a static class with one public method, Convert2Xaml(string htmlString) (plus an overload where you can specify extra attributes), which is called from the HtmlChanged event handler. It uses TagDefinitions to specify how html tags should be translated into xaml tags.
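To make that flow concrete, here is a minimal sketch of what happens when the Html property changes. It is based on the description above rather than on the library’s exact source, and it assumes the converter’s output is a complete RichTextBlock element that XamlReader can load (names like HtmlChangedSketch and ApplyHtml are just illustrative):

using Windows.UI.Xaml.Controls;
using Windows.UI.Xaml.Documents;
using Windows.UI.Xaml.Markup;

// Sketch only: convert the html, load the resulting xaml into a temporary
// RichTextBlock and move its blocks into the RichTextBlock we are bound to.
public static class HtmlChangedSketch
{
    public static void ApplyHtml(RichTextBlock target, string html)
    {
        // Convert2Xaml is the converter's public method mentioned above.
        string xaml = Html2XamlConverter.Convert2Xaml(html ?? string.Empty);
        var temp = (RichTextBlock)XamlReader.Load(xaml);

        // A block can only have one parent, so remove each block from the
        // temporary RichTextBlock before adding it to the target.
        target.Blocks.Clear();
        while (temp.Blocks.Count > 0)
        {
            Block block = temp.Blocks[0];
            temp.Blocks.RemoveAt(0);
            target.Blocks.Add(block);
        }
    }
}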

Downside

Unfortunately, this package does not support parsing img tags into Images, and it provides no good way of extending the processing to my needs. The Html2XamlConverter is a static class with a fixed set of TagDefinitions; there is no way to get in between.

Nothing to do but fork, I guess.

Fixing extensibility

First of all, to allow you to plug in your own converter, the static Html2XamlConverter has to be changed to a non-static class. Secondly, I added an extra attached property, Converter, to the Properties class to allow setting the converter in xaml (it has to be a resource, though). Here’s how you use it:

<Page.Resources>
    <common:MyHtml2XamlConverter x:Key="MyConverter" />
</Page.Resources>

<RichTextBlock
    h2xaml:Properties.Html="{Binding Text}"
    h2xaml:Properties.Converter="{StaticResource MyConverter}" />

I can now specify which Converter I want to use. I can even bind it if I so want.

So I forked the code, refactored it and will submit a pull request soon. In the meantime, get the code here: http://winrthtml2xaml.codeplex.com/SourceControl/network/forks/dmarckmann/extendableHtml2Xaml

Incidentally, I also added support for parsing img tags.
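I won’t claim this is exactly the code in the fork, but an img handler can look roughly like the sketch below. It uses the same TagDefinition hook as the pre sample that follows and HtmlAgilityPack’s GetAttributeValue to read the src attribute; the class name is just illustrative:

using System.Text;
using HtmlAgilityPack;

// Illustrative sketch of an img handler, not the fork's actual source.
public class ImgAwareHtml2XamlConverter : Html2XamlConverter
{
    public ImgAwareHtml2XamlConverter()
    {
        tags.Add("img", new TagDefinition(parseImg));
    }

    private void parseImg(StringBuilder xamlString, HtmlNode node, bool isTop)
    {
        string src = node.GetAttributeValue("src", string.Empty);
        if (string.IsNullOrEmpty(src))
            return;

        // A RichTextBlock can host an Image through an InlineUIContainer.
        xamlString.Append("<InlineUIContainer><Image Stretch=\"Uniform\" Source=\"");
        xamlString.Append(src);
        xamlString.Append("\"/></InlineUIContainer>");
    }
}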

Here’s a little sample of a converter that parses pre tags so that C# code displays correctly:

using System.Text;
using HtmlAgilityPack;

public class MyHtml2XamlConverter : Html2XamlConverter
{
    public MyHtml2XamlConverter()
        : base()
    {
        // Register a handler for pre tags.
        tags.Add("pre", new TagDefinition(parsePre) { MustBeTop = true });
    }

    private void parsePre(StringBuilder xamlString, HtmlNode node, bool isTop)
    {
        // Monospaced span for code.
        xamlString.Append("<Span FontFamily=\"Consolas\" FontSize=\"14\">");

        // Keep the original line breaks and spaces (xaml collapses plain whitespace).
        xamlString.Append(node.InnerText.Replace("\n", "<LineBreak/>").Replace(" ", "<Run Text=\" \"/>"));

        xamlString.Append("</Span>");
    }
}

Now we’ll have to wait for someone to make us a converter that will parse pre tags with c# color coding… (Do I hear ‘challenge accepted!’? Anyone?)

Happy coding!

Windows 8 RT & Caliburn.Micro – Being a share source

If you want to know how to be a share target rather than the source, go here.

Being the source of the share action is simple.

In the ViewModel where you have the content you want to share, add this:

protected override void OnActivate()
{
    base.OnActivate();

    DataTransferManager.GetForCurrentView().DataRequested += OnDataRequested;
}

protected override void OnDeactivate(bool close)
{
    base.OnDeactivate(close);

    DataTransferManager.GetForCurrentView().DataRequested -= OnDataRequested;
}

protected void OnDataRequested(DataTransferManager sender, DataRequestedEventArgs args)
{
    var request = args.Request;
    var requestData = request.Data;

    requestData.Properties.Title = Title;
    requestData.Properties.Description = Description;
    requestData.SetUri(new Uri(Url));
}

Basically, we subscribe to the DataRequested event and fill the Request from the event arguments with the data we want to share.
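One thing to keep in mind: the share pane only lists targets that can handle the formats you put in the DataPackage, so a target that only accepts plain text will not show up if you only set a Uri. A small sketch of offering both formats, reusing the same Title, Description and Url properties as above:

protected void OnDataRequested(DataTransferManager sender, DataRequestedEventArgs args)
{
    var requestData = args.Request.Data;

    requestData.Properties.Title = Title;
    requestData.Properties.Description = Description;

    // Offer more than one format; each share target picks what it supports.
    requestData.SetUri(new Uri(Url));
    requestData.SetText(Description);
}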

Easy!

In The Pocket Privacy Policy

Privacy Policy

I recognize that your privacy is important to you.

What information does the app collect?

In the Pocket and In The Pocket Free do not collect, store, or share any of your personal information. They also do not collect any data, information, or trends, or track user movements.

Do we disclose any information to outside parties?

The app does not store, sell, trade, or otherwise transfer to outside parties any information. The app allows you to open web URLs from the application.
The URLs might be saved by internet browsers on your device. For information regarding the information stored by third party services, please read the privacy policy of the internet browsers on your device.

Windows 8 – NuGet: You are trying to install this package into a project that targets ‘.NETCore,Version=v4.5’

Today, after installing Visual Studio Express 2012 for Windows 8 on my home laptop, I ran into something I hadn’t seen since I started developing. I got this message when trying to install Caliburn.Micro in a new project:

You are trying to install this package into a project that targets ‘.NETCore,Version=v4.5’, but the package does not contain any assembly references that are compatible with that framework. For more information, contact the package author.

I remember having to Google for a while before finding the answer, so here it is: it’s down to the version of the NuGet Package Manager you have installed. NuGet 2.0 and earlier expect packages that target ‘winRT45’ or ‘NETCore45’, while as of version 2.1 NuGet expects one of the following target framework names: Windows, Windows8, win, win8.

More information can be found here: http://docs.nuget.org/docs/release-notes/nuget-2.1#Targeting_Windows_8_and_Windows_Phone_8_Projects

I’m back, starting a new blogging style

It’s been a while since I blogged. I have been busy with other stuff, but I’m back…

I am currently building a Windows 8 application for Pocket (http://www.getpocket.com). I want to share my experiences here. It won’t be a blog with long posts; instead I will try to push small messages. They will be representations of my thoughts. They can vary in size.

So, without further ado… Let’s start…

(Super cool image copied from: http://inboundmarketing.kohfa.com/blog/bid/241680/Some-Thoughts-on-Website-Redesign)

Repair a broken Replication Sql Server 2005 vs Sql Server 2008 – PART 2

NOTE OF CAUTION: In the end we decided to turn off the continue on conflict option, because too many conflicts were ignored and the tables got too far out of sync… So continue on conflict is not such a great option after all…

Part 1: http://bloggingabout.net/blogs/dries/archive/2009/04/18/repair-a-broken-replication-sql-server-2005-vs-sql-server-2008.aspx

So here’s the update: the option to continue on conflict works fine!

 

I tested it on two virtual machines that have p2p replication with a test table. I stopped replication by stopping the SQL Server Agents and then inserted identical rows on all servers. Nothing happened; at least, no error.

So I did it again with records that have the same id but different values. That also worked, but the value from the command processed first was replaced by the one processed second. So both nodes were in sync again.

I decided to go live with it and inserted the missing records from one node into one of the others. No problems at all!

A note of caution though: Not all conflicts are handled.

On Friday I marked a table for replication that did not exist on the other nodes. This did result in a replication error! Apparently the option doesn’t work for this situation. Well, no worries. I created the table on the other nodes and then the magic option did the rest: lots of conflicts, but no errors! Don’t forget to remove the table from the articles of the publication though…

 
