Importing Java code into Git on Visual Studio Online from Eclipse

Last time I wrote about Creating a Build.vNext agent on Linux. I ended that post with the idea to build a Java application on Linux. The first thing we'll need is some Java code in VSO to build. Being a Java newbie, it took me quite a bit of figuring out how to do that. I thought I'd share my experiences here for you to enjoy.



Creating a Build.vNext agent on Linux

In the past few months Microsoft has made it very clear that they are no longer a Windows-only company. Things like Xamarin integration, the new multi-platform ASP.Net 5 and the cross-platform .Net CLR are all results of that new strategy. This opens up all sorts of new possibilities for developers who were previously tied to a single platform. Exciting times are coming!

The new build agent for the Build vNext system is another example of this cross-platform strategy. It's actually based on Node.js, which means it should run on any platform that supports Node.js. This includes Linux. So, I decided to give that a try. And guess what? It works! Running a Microsoft build agent on Linux, I think that's pretty cool!

And, the new build agent is fully open-source! You can have a look at the code on the VSO agent GitHub page.

Create a build agent pool

While technically it’s not required, I think it’s nice to create a separate build agent pool for Linux build servers. Log in to your VSO account and click the gear icon in the top right to get to the settings page. Then go to the “Agent pools” tab and click “New Pool”.

Provide a name for your new pool and make sure you leave the "Auto-Provision" checkbox checked. This ensures your new pool will be available to your Team Project Collection.

Your new pool will be created. Of course, there are no agents in there yet.


Set permissions

We’ll need to set some permissions before we can add and run our new agent. Ideally, you would create a new (service) account for your agents. Unfortunately, I don’t have one available so I’m going to use my personal account. First, we need to enable alternate credentials for the account. Log in to VSO using your account and open your profile settings.

Then, go to the "Credentials" tab and enable your alternate credentials there.

Now, go back to the "Agent pools" administration page and click the little triangle in front of your newly created pool.

There are two security groups there. We'll need to add our account to both.

  • Agent Pool Administrators: allows adding agents to the pool
  • Agent Pool Service Accounts: allows the agent to listen to the build queue

Create a server

Next, we need a server! The easiest way to get one is to create one in Azure. I decided to go for the newest Ubuntu version currently available.

For now, I chose to use password authentication. If you want, you can create a key file and use that instead. It'll take a few minutes before the server is ready. To log in, we'll need an SSH client; a great one for Windows is PuTTY. Go ahead and download that if you haven't already. Start PuTTY and type the hostname of your server. Then, click "Open" to connect.

You'll get a message asking if you trust the computer you're connecting to. Click "Yes" and you'll be presented with a login prompt. Log in using the credentials you provided when creating the virtual machine.

Voilà, you're logged into your brand new Linux server!

Installing prerequisites

We'll need to install some prerequisites on our new server: Node.js and npm. Type "sudo apt-get install nodejs npm" to get things going.

Once you hit Enter, apt-get will take care of resolving dependencies. As you'll notice, there are quite a lot… Type "Y" to start the installation.

Now be patient as everything gets installed…

Because of a naming conflict with another program called “Node”, the default installation in Ubuntu installs Node.js as “nodejs”. However, the build agent script expects to find a “node” executable… To overcome this problem, the “nodejs-legacy” package was created. Install it by typing “sudo apt-get install nodejs-legacy”.

Now check the installed versions of Node.js and npm by typing "node -v && npm -v".

For the build agent to work you'll need Node.js >= v0.10 and npm >= 1.4. In my case all is good!

Install the agent

Next order of business is to install the agent. As with TFS, the total process is split into two parts: first we install the software bits, and then we run the configuration. To install the software bits we can use npm. Type "sudo npm install vsoagent-installer -g" and hit Enter. That will pull the latest version of the installer from the repository.


Configure agent

We’re almost there… The only thing left to do is configure our agent. The agent installer allows us to easily create multiple agents on a single server. Each agent will run inside its own folder. So, first we need to create a folder. Type “mkdir linuxagent1” to create one and then cd into it with “cd linuxagent1”.

Now, install the agent into this folder by typing "vsoagent-installer".

This installs the agent into the directory you just created. Now we can start it up and connect to VSO! Type "node agent/vsoagent" to start the configuration. Enter your credentials (these are your alternate credentials!), your VSO URL, an agent name (optional) and your pool name ("Linux" in my case) and the agent should start up!

Note: the agent is now running in interactive mode. This means that when you close your session, the agent will terminate. While this is fine for testing and demonstration, it's not so nice for production. Currently, the agent doesn't support running as a service on Linux. The development team is working on that. In the meantime, you can use this trick to run the agent in the background: after starting the agent, hit "Ctrl+Z". This will stop the process and display a job number (1 in my case). Then, type "disown -h %1" (where 1 is the job number). This ensures the job is not stopped when you log out. Finally, type "bg 1" (again, 1 is the job number). This will start the process again, this time running in the background.

You can now safely log out and the agent will continue running. Of course, you have to repeat these steps if you restart the server.

Verify

You can verify your new agent by going to the "Agent pools" administration page on VSO. Your new agent should show up in your pool, and it'll identify itself as a Linux server!


Next steps

In this post, I've shown you how to run a Build vNext agent on Linux and connect it to VSO. While we haven't actually done anything useful with it yet, it does demonstrate the true multi-platform direction Microsoft is headed in. In theory, this should allow us to build, for example, a Java application on Linux. While I'm thinking about this, my fingers are itching to give that a try… More on that in a later post!

Happy building!

Creating a custom Build.vNext agent

I’m always up for trying new bits of technology. And to be honest, as a consultant I think it’s an absolute necessity to know what is out there. Fortunately, exploring new technologies is fun!

Recently Microsoft has made Build.vNext publicly available on Visual Studio Online. You can read Brian Harry's blog post here. Build.vNext gives us a much simpler build system than the old Xaml workflow-based system. It's also fully extensible (more about that in a later post) and fully web-based. Also, the infrastructure has changed: instead of the Build controller – Build agent(s) architecture, there is now the concept of build agent pools. These pools can be shared across collections, so you no longer have to deal with a separate controller for each collection.

Visual Studio Online provides a hosted build agent pool. This is great because you don’t have to deal with your own infrastructure. However, if you need to customize your build agent then you’ll have to install your own agent, which you can then connect to VSO.

I thought I'd give that a go and, while I'm at it, try to build an MVC 5.0 app (which is also in preview). MVC 5.0 is included with the Visual Studio 2015 preview. In this post I'll show you how to configure a build agent and connect it to VSO. In a later post, I'll walk you through the process of creating a vNext build definition for building an MVC 5.0 app.

To get started, we need a server with Visual Studio 2015 RC installed. The fastest way to do that is to get one from Azure. Log in to http://manage.windowsazure.com and create a new virtual machine. There is a template available for the Visual Studio 2015 RC.

In the next couple of screens, choose a username and password for logging on to the machine. The other options can be left at their defaults. Once your machine is ready, connect to it using Remote Desktop.
Once logged in, I want to create a local user to run the build agent under. Go to Server Manager, Tools, Computer Management and open the "Users" node under "Local Users and Groups". Now right-click, select "New User" and fill in the details.
Now that we have the prerequisites in place, we can get on to the more interesting stuff. First, we have to download the bits for our new build agent. On your new server, open up a browser and log on to your VSO account. Click the little gear icon in the top-right to go to the VSO Control Panel, then go to the "Agent pools" tab and click the "Download agent" button.
This should give you a zip file which contains everything you need to run a VSO build agent.
Extract the zip file on the machine. I put mine in "C:\VsoAgent".
Now all we need to do is configure our agent. This is done using PowerShell. Open up a PowerShell window in Administrator mode by Shift+right-clicking the PowerShell icon and selecting "Run as Administrator".
Then, change to the directory where you extracted the bits.
Set the execution policy to "Unrestricted" so that you can run the script to configure the agent.
Now, run the configuration script for the agent. It'll ask you a bunch of questions regarding the configuration of the agent. You can customize if you want to, or just accept the defaults. In my example, I'm configuring the agent to run as a Windows Service so that it will start up when the server starts. Make sure to connect to your VSO account (don't include the "/DefaultCollection" bit in the URL!) and, if you choose to run the agent as a Windows Service, provide the local account you created earlier. If all goes well, you'll get a message telling you that the configuration of the agent succeeded.
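Put together, the steps look roughly like this (assuming you extracted the agent to "C:\VsoAgent"; the exact name of the configuration script may differ between agent versions):

Set-ExecutionPolicy Unrestricted   # allow the unsigned configuration script to run
cd C:\VsoAgent                     # the folder you extracted the agent bits to
.\ConfigureAgent.ps1               # asks the configuration questions (VSO URL, pool, run as service, ...)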
If you look in the VSO Control Panel, you'll see your shiny new agent listed, along with its capabilities.

That’s it! Your build agent is now ready for use. Happy building!

Debugging a DSC script for Release Management

When preparing a DSC script for deploying an application using Release Management, it is often handy to test and debug that script locally. Nobody writes code that works the first time, right? However, you'll quickly run into the fact that Release Management does some things slightly differently than "vanilla" DSC (for example, configuration data goes in a .ps1 file rather than a .psd1) and also provides some extras, like the $ApplicationPath variable that holds the path to the bits you're deploying and the configuration of WinRM. Because of this, I found myself using the following workflow:

  1. Write DSC script
  2. Test and debug locally
  3. Configure Release Management with this DSC script
  4. Test and debug deployment from Release Management
  5. Change DSC script to work from Release Management

I wanted to eliminate step #5 and seriously reduce the time needed for #4.

For that purpose, I have created a PowerShell wrapper script. This script takes the same input parameters that you would configure in Release Management and executes your DSC script for you, much in the same way as Release Management would. This means you can test and debug your DSC scripts locally (without using Release Management) and when you eventually configure them in Release Management you don’t have to change anything!

You can download the script from my GitHub: Deploy_RM_local.ps1. Here’s how it works.

In Release Management, you use a “Deploy Using PS/DSC” block for deploying using DSC:


The interesting variables here are:

  • PSScriptPath: the path to your DSC script. This path is relative to the location of the component you’re deploying (usually the folder from your TFS drop).
  • PSConfigurationPath: the path to the script that contains your configuration data. Again, this is a relative path.

Furthermore, Release Management provides a variable $ApplicationPath that contains the absolute path to the folder where Release Management has copied your component. We'll need these three variables.

  1. Download the bits that you want to deploy to the server you want to test on, e.g. to “C:\Deploy\SampleApp_20150527.3”.
  2. Make sure you have the “Deploy_RM_local.ps1” somewhere on the server, e.g. in “C:\Scripts”
  3. Open a PowerShell prompt and navigate to the folder containing your deployment bits:
    cd "C:\Deploy\SampleApp_20150527.3"
  4. Execute the “Deploy_RM_local.ps1” script, feeding it the three paths as input. Optionally, you can add the “-Verbose” parameter to get some more logging. For example:
    C:\Scripts\Deploy_RM_local.ps1 -ApplicationPath "C:\Deploy\SampleApp_20150527.3" -PSConfigurationPath "Deploy\DeployConfig_AzureD.ps1" -PSScriptPath "Deploy\Deploy.ps1" -Verbose
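For the curious: stripped down to its essence, the wrapper does something like this (a simplified sketch of the idea, not the actual script; see GitHub for the real thing):

param(
    [string]$ApplicationPath,
    [string]$PSScriptPath,
    [string]$PSConfigurationPath
)

# Expose $ApplicationPath globally, just like Release Management does
Set-Variable -Name ApplicationPath -Value $ApplicationPath -Scope Global

# Resolve the relative paths against the component folder
$configScript = Join-Path $ApplicationPath $PSConfigurationPath
$dscScript = Join-Path $ApplicationPath $PSScriptPath

# Dot-source the configuration data (a .ps1 file, as Release Management expects)...
. $configScript

# ...and then the DSC script itself, which compiles and applies the configuration
. $dscScript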

That’s it! Happy deploying!

Bulk update TFS work items using the web interface

Sometimes there is a much simpler solution to a problem than you originally thought. Or maybe sometimes you just like to do things the hard way (I’m sure most engineers have experience with this 😉 ). In any case, it’s good to know the simple solution, for when you really need it.

Last week I posted about bulk updating TFS work items using Powershell (original post). However, it turns out there is a really simple way to do bulk updates using the TFS Web UI. This can be very helpful when you want to update (one or multiple) field values for many work items at once. I didn’t know about it, so I thought I’d post it here.

Start by creating a query in the TFS Web UI. Here, I created a simple query that selects all Features.


Then, select the work items that you want to update. Use Ctrl+Click to select multiple items, use Shift+Click to select a range or just hit Ctrl+a to select them all. Then right-click and select “Edit selected work item(s)…”:


You'll then be presented with a screen where you can select which fields to update and the values to update them to. You can also add a comment, which will be recorded in the history field of the selected work items:


Once you click “OK”, your items will be updated! If you have a lot of items to update you’ll have a nice “Please wait” screen to stare at… Generally, it won’t take very long though.


When finished, your work items will be updated. Don’t forget to click the “Save results” button to save your work items!


Note: Using this feature it is possible to set field values which are not allowed (e.g. for the “State” field). When you do this, you’ll notice that your work items will turn red, indicating that they’re not valid:


When you do try to save, you’ll be presented with an error message:


Happy updating!

Bulk update TFS work items using PowerShell

Today I was faced with a question that persuaded me to try a bit of technology I haven't used too often yet. I had to set a specific field to a specific value for a large number of work items. Of course, there are multiple ways to achieve that, like using Excel or writing a little application using the TFS API. I decided to brush up on my PowerShell skills and write a script for this purpose.

One of the cool things about PowerShell is that it lets you load any .Net assembly by using the Add-Type cmdlet. You can then use the classes defined in that assembly inside your PowerShell session. This opens up a wide range of possibilities, one of them being that you can use the TFS API from PowerShell!

So, let’s start by loading the TFS assemblies for working with the TFS Work Item store. These are the assemblies for TFS 2013:

Add-Type -AssemblyName "Microsoft.TeamFoundation.Client, Version=12.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
Add-Type -AssemblyName "Microsoft.TeamFoundation.WorkItemTracking.Client, Version=12.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"

Tip: if you don’t know the Public Key Token, you can find out by using “sn.exe”:

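For example, from a Visual Studio command prompt in the folder that contains the assembly:

sn.exe -T Microsoft.TeamFoundation.Client.dll
# prints something like: Public key token is b03f5f7f11d50a3a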

After the TFS assemblies have been loaded, we can get a reference to the Team Project Collection (TPC):

$tfsUri = [Uri]"http://myserver:8080/tfs/DefaultCollection"   # replace with your own collection URL
$tfs = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($tfsUri)

And then the $tfs object can be used to access the required services (the work item store in this case):

$workItemStore = $tfs.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])

From then on, the rest is easy and pretty much similar to using the API as you would from C#. You can find the entire script on my GitHub: TfsBulkUpdateWi.ps1
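To give you an idea, the core of the script boils down to something like this (a simplified sketch with a made-up project, work item type and field name; see the GitHub script for the real thing):

# Query the work items to update
$wiql = "SELECT [System.Id] FROM WorkItems " +
        "WHERE [System.TeamProject] = 'MyProject' AND [System.WorkItemType] = 'Feature'"
$workItems = $workItemStore.Query($wiql)

foreach ($workItem in $workItems)
{
    $workItem.Open()                                        # open the work item for editing
    $workItem.Fields["Custom.MyField"].Value = "New value"  # set the field
    $workItem.Save()                                        # save, triggering the usual validation
}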

Happy PowerShell-ing!

Bug in xWebAdministration module in DSC resource kit on Windows 2008 / IIS7

At one of the clients I'm working for, we're doing a proof of concept on using Release Management vNext type release templates for deploying their application. We're using PowerShell Desired State Configuration (DSC) for the actual deployment script.

One of the requirements is that we should also be able to deploy to the existing environments, running Windows 2008R2 and IIS7. Since the application is a simple web application, I was using the xWebAdministration module from the DSC Resource kit. When deploying to a Windows 2012R2 machine with IIS8, everything worked like a charm. However, when deploying on a Windows 2008R2 machine with IIS7, I got all sorts of weird error messages. There were already a couple of websites running on that machine, and it seemed as if the DSC script was trying to update those. Of course, it shouldn’t touch already existing sites.

After a bit of searching, I ran into this bug on Connect. Apparently there is a problem with the "Get-Website" cmdlet on IIS7: it returns all websites, instead of only the one specified with the "-Name" parameter. Since the xWebAdministration module uses "Get-Website -Name xyz" to get a reference to the website it should modify, it'll try to update all the websites on the server. So that explains it.
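You can see the bug in action on an affected machine; asking for a site that doesn't exist still returns everything (the site name below is made up):

Get-Website -Name "SomeNonExistingSite"   # on IIS7 this returns ALL websites instead of nothing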

Now, how to solve this? Actually, the solution is not very complicated. The easiest fix is to modify your copy of the xWebAdministration module. Open the "MSFT_xWebsite.psm1" file in your favourite editor. You'll find it in the "DSCResources\MSFT_xWebsite" folder.

Then, do a “find and replace” and replace this:

$Website = Get-Website -Name $Name

With this:

$Website = Get-Website | Where { $_.Name -eq $Name }

You should find five occurrences of the above snippet. After doing the replace, save the file and run your deployment again. If all is well, it should work now 🙂

Happy deploying!