Logitech Performance MX on Linux

Everyone, I have a confession to make.

I have been sudo’ing lately. And I think I like it.

I’m not doing anything extreme, but my recent interest in PowerShell within Windows has grown from a slight fetish for auto-syntax highlighting to nearly preferring to perform as much systems administration as I can from a command line. Elementary OS (while not very hardcore, I know) has been an awesome segue for me from Windows to a lightweight, familiar, and fairly immersive (for someone with a MS background) Unix-like operating system. Going into this, I knew that I would have some hurdles with drivers and the other extensive levels of support that I have become used to with a superbly popularized OS like Windows 10, but the challenge was something that I looked forward to. So after I figured out how to get Dropbox, OneDrive, and a few other things that were fairly simple within Windows set up in Elementary, I realized through browsing the how-to articles that there was one major thing I was missing.

Support for the Logitech SetPoint software to customize the single thumb button on my Performance MX mouse. I had customized this button to send a single keystroke – CTRL + W – to be able to quickly close tabs while browsing.

I quickly realized that there isn’t a direct port of SetPoint for Linux, but as usual, there sure were a million workarounds. The following is a short walk-through of what a fairly Linux-illiterate Windows user went through in order to recreate this functionality. I’m really documenting this for my own reference in case I need to remember how to do this in the future, but I’m gladly sharing it with anyone else trying to do the same.

Prerequisites

In order to get this to work, I relied on the following:

  • xbindkeys
  • xev
  • xautomation
  • xinput

I know there are options to use a GUI alternative with the xbindkeys_config-gtk2 package or xbindkeys-config, but I preferred doing this via command line tools, editing the config file directly.

Xbindkeys is a program that enables us to bind commands to certain keys or key combinations on the keyboard. Xbindkeys works with multimedia keys and is window manager / DE independent, so if you switch much, xbindkeys is very handy.

Xev is a program that allowed me to figure out what the exact button on my Performance MX maps to. For those out there landing here from Google looking for a quick reference, it’s b:10.

Also, if you’re looking to simply recreate the default functionality of using this button as window selector, I’d advise you to reference this walkthrough found HERE.

First, get everything installed with the following:
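
On an Ubuntu-based distribution like Elementary, something along these lines should pull everything in (package names are my assumption for this family of distros: xev ships in x11-utils, and xte comes with xautomation):

sudo apt-get update
sudo apt-get install xbindkeys xautomation x11-utils xinput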

Once finished, start up xev to find out what the button you are looking to configure maps to on the back end. Starting the application will create a little white window which you can place your mouse in and execute whatever button click you’d like. In real-time, you will get a whole slew of information around what the OS is capturing, including cursor moves, so limit what you’re doing to a minimum. Alternatively, start xev with the following parameters to only capture button clicks:
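
Newer xev builds accept an event mask directly; if yours does not, piping through grep works too (both are sketches, adjust to taste):

xev -event button

# or, on older builds:
xev | grep -i button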

Review the output in your terminal window for the line items referring to a ButtonRelease event. Mine looked like the below:
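
(An abridged reconstruction – the serials, coordinates, and timestamps will differ on your machine:)

ButtonRelease event, serial 40, synthetic NO, window 0x5400001,
    root 0xca, subw 0x0, time 12241431, (55,65), root:(630,380),
    state 0x0, button 10, same_screen YES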

The above shows that the click maps to button 10 (b:10, matching the quick reference earlier). This will be our reference for the configuration file.

We now need to create the configuration file that will be used as the reference for xbindkeys.
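
xbindkeys can generate a commented template for you (the standard location is ~/.xbindkeysrc):

xbindkeys --defaults > ~/.xbindkeysrc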

This is where we will add our keystroke mapping. Open the file with gedit, nano, or whatever your favorite editor is.

We will use xte here to map the stroke out in the configuration file. Depending on what your mapping is (obviously), this is where you will list out the action. Formatting is as follows:
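
A sketch of the general shape, using the Control + left-click example described below (substitute your own button number if xev reported something other than b:10):

"xte 'keydown Control_L' 'mouseclick 1'"
  b:10

"xte 'keyup Control_L'"
  Release + b:10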

Begin the line with xte, and append the series of keystrokes to fire for the mapping when you click the mouse button. The example above will simulate pressing the CTRL key, clicking the left mouse button, and then releasing the CTRL key upon release of the mapped mouse button.

My final code block added to my xbindkeysrc file was as follows:
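
Reconstructed from the description above (the exact xte quoting in my file may have differed slightly):

# Close the current browser tab (CTRL + W) with the thumb button
"xte 'keydown Control_L' 'key w' 'keyup Control_L'"
  b:10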

These commands can be further customized to perform in-app actions, functions, and other processes rather than just executing keystrokes. I won’t go into details around this here, but there are plenty of references online for how to do this.

To apply these changes, save the file and restart the service.
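
xbindkeys isn’t a true service with an init script; killing and relaunching it is the usual way to reload the config:

killall xbindkeys
xbindkeys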

After this was all complete, I added xbindkeys to my autostart group to ensure that this worked upon reboot.

Application Launcher > System Settings > Applications > Startup tab > add “xbindkeys” as a custom command

I would also recommend installing a great package called Solaar for monitoring battery life, adjusting DPI, pairing and unpairing devices to a Unifying receiver, and enabling other features like smooth and side scrolling.

HERE is a great walkthrough that helped me get this all done as well.

Restore default Site Columns and Content Types in SharePoint Online / Office365

Through my tenure as a SharePoint Administrator focusing on SharePoint migrations, I feel like I somehow run into a new and unique issue with each new engagement. For this specific issue, I am migrating my client from an on-premises SharePoint 2007 environment to an ADFS-enabled SharePoint Online / Office 365 tenant. This is obviously a big jump, and it is going to offer a lot of great new features for my client. For an end-user, this migration will be awesome, but as an administrator, I cringe at the thought of performing a jump straight from 2007 to SPO.

In order to make things work, we had to leverage a migration tool – specifically Metalogix Content Matrix. We used this tool to migrate mostly content-only from the source to the target. We skipped all custom InfoPath forms, workflows, etc. to prevent any conflicts or degraded functionality that may be replaced by upgraded modules.

Using Content Matrix to perform this content-only migration, we found that Site Columns associated with custom InfoPath forms were still being migrated to the target. To make matters even better, my client had created some InfoPath forms with friendly names that were the same as system-default Content Type Site Columns. Content Matrix addressed this by migrating the custom InfoPath form column and having this column overwrite the system default. In our case specifically, the First Name and E-Mail Address fields (seen often in Contacts lists) were no longer viewable. The column would show in List Settings, but you could not select it to review the column settings or migrate content into these disabled columns.

This created a need to recreate the out-of-the-box Site Columns and Content Types, restoring them to what they would be on a fresh Site Collection. With this being SharePoint Online, we have many more limitations on how to do this, but I was able to find a fix.

Using the SharePoint Online Client Browser (HERE), I was able to review all Site Collection features (including the specific hidden features). The features I was looking for were:

  • Standard Column Definitions
  • Standard Content Type Definitions

Once I found the GUIDs for the above features, I was able to use the manage-features.ps1 script for SharePoint Online (HERE). I disabled both of the above features and then re-enabled them, which restored the Site Columns and the functionality of the affected lists.
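
For reference, the disable/re-enable cycle boils down to a feature Remove/Add against the site collection via CSOM. Below is a minimal sketch of that pattern, assuming the SharePoint Online Client Components SDK is installed locally; the URL is a placeholder, and the GUID shown is the commonly documented ID for the hidden “fields” feature (Standard Column Definitions) – verify both GUIDs against what the Client Browser shows in your own tenant before running anything:

# Load the CSOM assemblies (path assumes the SharePoint Online Client Components SDK)
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

$siteUrl   = "https://tenant.sharepoint.com/sites/target"   # placeholder URL
$featureId = [Guid]"ca7bd552-10b1-4563-85b9-5ed1d39c962a"   # hidden 'fields' feature - verify in the Client Browser

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$cred = Get-Credential
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($cred.UserName, $cred.Password)

$features = $ctx.Site.Features
$ctx.Load($features)
$ctx.ExecuteQuery()

# Force-remove the hidden feature, then re-add it with FeatureDefinitionScope None
$features.Remove($featureId, $true)
$ctx.ExecuteQuery()
$features.Add($featureId, $true, [Microsoft.SharePoint.Client.FeatureDefinitionScope]::None)
$ctx.ExecuteQuery()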


SharePoint 2007 File Inventory

Once again, I am reminded of how much I appreciate the PowerShell snap-ins for SharePoint after the 2010 release. Attempting to iterate through a farm or script out actions using only STSADM commands and direct access to the SharePoint assemblies ([System.Reflection.Assembly]::LoadWithPartialName(“Microsoft.SharePoint”)) is much more challenging than using the built-in PowerShell capabilities of the later versions (even SharePoint Online, with its own quirks).

I have a client who is migrating from an on-premises SharePoint 2007 environment to SharePoint Online. Yes – big jump, and yes – there will be some headaches. Many features and functions of SharePoint 2007 simply will not migrate and/or be mappable to equivalent SharePoint Online features due to the sheer difference in the way each platform is laid out. Because of this, identifying these issues proactively before a migration, rather than reacting to broken page layouts, missing site feature dependencies, and other prerequisites, is greatly beneficial. To make matters a step more complicated, the client had elected to engage in a branding effort to give their SharePoint Online instance a fresh new look, which resulted in a site that really didn’t look like SharePoint at all in the end.

So how do you migrate a bunch of SharePoint 2007 ASPX pages, based on deprecated site templates and layouts, into a branded SharePoint Online instance which utilizes a specific page layout for branding and the site home pages?

You don’t.

Luckily, this client didn’t have very customized site home pages or a large extent of customized pages throughout their farm, so manually re-creating these pages as branded SPO pages was feasible. In order to prepare for this work and delegate the ASPX page recreation out to site owners, the client wanted an inventory of all ASPX pages currently in their farm, with locations. Per my normal consulting habits, I was confident in telling the client we could figure out a way to give them this report to help them plan the work out, yet I was not 100% sure at the time how I would do it. Now it was time to search for how to get a SharePoint 2007 file inventory, and then strip out everything besides the ASPX pages from the list.

Through some research online, it was easy to find all arrows pointing to Gary LaPointe. The SharePoint MVP is THE man when it comes to STSADM commands. His website has WSP solutions for installing snap-ins with SharePoint 2010+ PowerShell commands, as well as some scripts here. Lucky for me, he had one script on his website which iterated through every web application in the local farm, drilled down through each Site Collection, subsite, and document library recursively, and output the results to either a grid view or a CSV file. View the script below, as well as a link to Gary’s personal blog, which I would highly recommend to any SharePoint consultant or legacy farm administrator.
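
Gary’s script is the one to use; purely to illustrate the recursive approach it takes, here is a stripped-down sketch of the same idea using direct object-model access (since MOSS 2007 has no SharePoint cmdlets), filtered to ASPX files:

# Not Gary's script - a minimal sketch of the same recursive inventory idea
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") | Out-Null

$results = @()
$webApps = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.WebApplications
foreach ($webApp in $webApps) {
    foreach ($site in $webApp.Sites) {
        foreach ($web in $site.AllWebs) {
            foreach ($list in $web.Lists) {
                if ($list.BaseType -eq "DocumentLibrary") {
                    foreach ($item in $list.Items) {
                        if ($item.Url -like "*.aspx") {
                            $results += New-Object PSObject -Property @{
                                WebApplication = $webApp.DisplayName
                                Web            = $web.Url
                                File           = $item.Url
                            }
                        }
                    }
                }
            }
            $web.Dispose()   # dispose SPWeb objects to avoid leaking memory
        }
        $site.Dispose()
    }
}
$results | Export-Csv -Path .\AspxInventory.csv -NoTypeInformation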

View the script on his site HERE, as well as the downloads page which features a large library of tools and scripts for completing unique tasks that aren’t quite perfectly mapped out in OoB cmdlets.

SharePoint / IIS HTTP to HTTPS URL Redirect or Rewrite

It is a no-brainer that for SharePoint, an HTTPS connection is almost always the preferred connection type. I can’t think of a single scenario where it would be preferable to set up a SharePoint Web Application to default to a non-SSL HTTP URL rather than a secure one. One problem you may find as an administrator, though, is that users aren’t aware of this small difference in the URL. Your SharePoint site may be bound to an HTTP connection on port 80, but your front end may only respond on the 443 connection that you have set up.

The easiest way to address this problem is to set up an IIS HTTPS URL redirect for SharePoint. The redirect will take any HTTP request to the SharePoint front end and rewrite the URL to HTTPS. The configuration outlined below will work for both the top-level SharePoint site and any sub-locations.

Requirements:
IIS 7.0+
Microsoft URL Rewrite Module
http://www.microsoft.com/en-us/download/details.aspx?id=7435

Set up HTTP and HTTPS bindings for all Web Applications on ALL SharePoint Front End Servers

Before you can make any configuration changes in IIS, you must first make sure that the front end servers know how to respond to the requests that they will now receive on 443.

  1. Open IIS Manager
  2. Expand the navigation tree to show the list of sites hosted on the server
  3. Select the site that you would like to set up a redirect for, and click bindings on the right under Actions
  4. In this window, ensure that you have an FQDN host name entry for your web application for both 443 and 80. Note that you will need to have a certificate to use for the 443 binding.
  5. Repeat these steps for all Front End servers in the farm

Set up the IIS Rewrite module extension

Before we can set up the rewrite, the IIS extension must be installed on all IIS instances on all front end servers. See the link above under requirements for the download from Microsoft. No restart is required after installation (always important to know as a SharePoint Administrator).

  1. Navigate to your site and in the middle pane, select the “URL Rewrite” button under the IIS section
  2. Under Actions on the right side, select Add Rule(s)…
  3. Inbound Rule > Blank Rule
  4. Name: “HTTP to HTTPS Redirect for WebApplication”
  5. Requested URL: Matches the Pattern
  6. Using: Regular Expressions
  7. Pattern: (.*)
  8. Ignore Case: checked
  9. Conditions -> Add…
  10. Condition Input: {HTTPS}
  11. Check if input string: Matches the pattern
  12. Pattern: off
  13. Action type: Redirect
  14. Redirect URL: https://{HTTP_HOST}/{R:1}
  15. Append query string: checked
  16. Redirect type: Found (302)
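
For reference, the steps above should land in the site’s web.config looking roughly like this (a sketch – compare it against what the GUI actually writes before trusting it):

<system.webServer>
  <rewrite>
    <rules>
      <rule name="HTTP to HTTPS Redirect for WebApplication" stopProcessing="true">
        <match url="(.*)" ignoreCase="true" />
        <conditions>
          <add input="{HTTPS}" pattern="off" />
        </conditions>
        <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" appendQueryString="true" redirectType="Found" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>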

SharePoint 2010 vs PowerShell 3.0 CLR

Microsoft SharePoint 2010 is not supported with version 4.0+ of the Microsoft .NET Framework. This issue can come into play if you happen to install the Windows Management Framework 3.0+, .NET Framework 4.0+, or PowerShell v3.0. All of these items run on a CLR (Common Language Runtime) of 4.0+, the default that comes with these applications. When loading the SharePoint 2010 Management Shell, you may see one of the following:

The local farm is not accessible. Cmdlets with FeatureDependencyID are not registered

or

Microsoft SharePoint is not supported with version 4.0.x

To address this, there are a couple of workarounds:

Force the SharePoint Management Shell to use PowerShell Version 2

Forcing the shell to run on PSv2 is a quick workaround that requires no downtime, restarts, or interruptions in services.

  1. Right click the icon for the SharePoint Management Shell, and click Properties
  2. Click the Shortcut tab in the Properties window
  3. Input the following into the Target value:
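
On a default SharePoint 2010 install, the stock target simply gains a -Version 2 switch, landing at something like the below (your install path may differ):

C:\Windows\System32\WindowsPowerShell\v1.0\PowerShell.exe -Version 2 -NoExit " & ' C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\CONFIG\POWERSHELL\Registration\sharepoint.ps1 ' "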

This will force the Shell to open in PowerShell v2.0, running the CLR 2.0.X. To verify this, open the SharePoint Management Shell, and type $psversiontable. This will show you the version of CLR and PowerShell that the current session is utilizing.

Force PowerShell to globally use a different CLR

  1. Create or edit a file at $pshome\powershell.exe.config
  2. Insert or edit the contents to include the line for CLR 2.0.X support, ensuring that it comes before the 4.0 callout – the order of the supported runtimes determines which the system will favor (see the example after this list)
  3. Save the file and reopen any previous PowerShell sessions
  4. Test this by opening PowerShell and running the $psversiontable command
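
The config file referenced in step 2 ends up looking something like this (version strings are from memory – verify against the CLR builds on your server):

<?xml version="1.0"?>
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <!-- v2 listed first so new sessions favor the CLR 2.0 runtime -->
    <supportedRuntime version="v2.0.50727" />
    <supportedRuntime version="v4.0.30319" />
  </startup>
</configuration>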

Azure ACS Namespace authentication with SharePoint 2013

Introduction

This document will be used to outline the process for implementing Azure’s ACS namespace as a trusted identity provider for an on-premises SharePoint environment, utilizing external identity providers such as Facebook, Yahoo, or Windows Live ID to pass manipulated claims as a standardized trusted token to SharePoint.

Preface, Considerations, Prerequisites

  • Azure ACS authentication with SharePoint is a free service as of 12/2014. The service requires a subscription to Azure, but costs no money. A $0 spending limit can be set to prevent the possibility of any charges coming through from this account.
  • The Azure ACS system can utilize any Identity Provider that can export an OpenID claim
  • SharePoint 2013 does not support SAML 2.0; SAML 1.1 must be used
  • As of 12/2014, Windows Live ID can only export a nameidentifier claim, but this can be mapped to an email address claim to use as a UPN for login
  • An x.509 SSL certificate needs to be uploaded to Azure to sign tokens, and to explicitly trust in SharePoint

Definitions/References:

Azure ACS – ACS (Access Control Service) is a free feature of Azure that allows the developer to create a namespace that can tie external Identity Providers’ incoming tokens to a single claim for a web application
Azure Namespace – the item in Azure that provides the URL for all Identity Providers, Relying Party Applications, Rule Group Settings, and Certificates and Keys to be trusted
Identity Provider – an external service, such as Yahoo!, Facebook, or Windows Live ID, that provides the inherently trusted authentication to SharePoint
Relying Party Application – these are the web applications that are connected to the namespace that receive the token that is output from the Azure ACS namespace. In this walkthrough, SharePoint 2013 is our Relying Party Application
Rule Groups – Rule groups are the list of incoming claims tied to claim issuers/identity providers, how they may be mapped when exported, as well as any customizations that need to happen for the token itself

Azure Setup

Creating the Namespace
  1. Login to the Azure management portal at https://manage.windowsazure.com with the Azure account created prior to this work. Note: Ensure that this account has a subscription set up prior to continuing any further; any work created with a trial account will expire once the trial is over and cannot be transferred to a paid subscription
  2. Once logged into the management console, click the large NEW button at the bottom left of the screen to begin the process of creating a namespace
  3. Navigate App Services > Active Directory > Access Control > Quick Create
  4. Give the namespace a name that makes sense for the authentication setup (this won’t be an actively used URL by an end-user). Select your region, and the subscription that has been set up (Pay-As-You-Go, or another valid subscription, NOT free trial). The namespace status will change from creating to active
  5. Click on the newly created namespace, and click the Manage button on the bottom ribbon; this will take you to the Silverlight-based Windows Azure Platform Access Control Service control panel
  6. Utilize the Getting Started steps that appear on the homepage of this control panel for much of the work. You will now want to set up the Identity Providers

Making Windows Live ID Work for SharePoint 2013 Authentication

Since the Windows Live ID Identity Provider is a preset default in the Azure ACS namespace, I will use it as a full example on configuring the claim issuer for authentication in SharePoint, and will later lead into how to configure additional Identity Providers.

Configure Relying Party Applications
  1. Navigate to the Relying party applications management by clicking the URL on the left-hand side of the ACS administrative panel
  2. Click the Add URL at the top left of the Relying Party Applications table
  3. On the configuration page, enter in the following information:
    Name: a display name such as “SharePoint2013” is fine for this as it is simply a display name for SharePoint’s trust with ACS
    Realm: the realm is equal to the resolvable URL for your SharePoint web application (https://winchestertonfieldville.ia.gov/ would work fine if this is your SharePoint web application URL)
    Return URL: this is the URL in which ACS will return tokens to SharePoint. This will always be the same URL as the realm defined above, with “/_trust” appended to the end (https://winchestertonfieldville.ia.gov/_trust)
    Error URL (optional): This is the URL for an optional error page in case authentication fails
    Token Format: Choose SAML 1.1, SharePoint 2013 is NOT compatible with SAML 2.0
    Token lifetime (secs): This value represents the seconds that the token is valid before expiring, or how long the user will be able to hit the realm URL repeat times and be auto-authenticated back in to the environment. The default value of 600 seconds is typically on the low side. Set this value to 3600 seconds (equal to the default value in ADFS).
    Identity Providers: the list of Identity Providers that you would like to have as optional authentication methods when users hit the realm’s login page. Each Identity Provider will show in the login drop down
    Rule Groups: leave the check box to create a new rule group as we will be doing this in the next step
    Token Signing: Choose whether to Use service namespace certificate or Use a dedicated certificate. For the purposes of this walk-through, I would recommend choosing Use a dedicated certificate and uploading the X.509 certificate .pfx file to use for signing. Browse to the file, upload, and input the password that was created when it was exported from the server.
  4. Click Save

Configure Rule Groups

The rule groups are the configuration for what claims Azure will receive from the Identity Providers, as well as how they will map these claims to the output token.

  1. Navigate to the Rule Groups page by clicking the URL on the left-hand navigation
  2. Click the “Default Rule Group for <RelyingPartyApplication>” that was automatically created when we set up the Relying Party Application
  3. On the Edit Rule Group page, feel free to edit the name for the rule group
  4. In the Rules table, click the Generate URL. You will notice that the Rules table currently shows “No rules have been added. Click Generate to generate rules automatically, or click Add to add rules manually.” Note – If additional Identity Providers are added at a later time, use this option to generate new rules for the new identity providers
  5. On the following page, click the checkbox next to Windows Live ID, and click Generate
  6. You will then see that the nameidentifier output claim from Windows Live ID is listed as a rule. Click on the nameidentifier URL.
  7. On the resulting page, you can see how to configure the input/output of this claim for how ACS will prep it for the output token that will be trusted by SharePoint. By default, nameidentifier is not a usable claim for a login in SharePoint. To address this, we will map the output claim type to email address.
  8. For everything under the If section, leave default
  9. For the “Then” section, change the select type dropdown from http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier to http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress to map the claim from the nameidentifier URI to a usable email address login
  10. You can leave all other items as default, or add information under “Rule Information” explaining that we are mapping the Windows Live ID nameidentifier claim to an emailaddress output claim

Upload Token Signing Certificate

Azure ACS uses an X.509 certificate to digitally sign the tokens it generates. You can also increase the functionality of this certificate to include encryption/decryption of the tokens themselves. Many users will use the makecert utility to create a self-signed certificate for testing purposes, but do not do this in a production environment.

  1. Click the Certificates and Keys URL on the left-hand side of the Azure ACS configuration page
  2. Click the Add URL at the top of the Token Signing table
  3. On the “Add Token-Signing Certificate or Key” page, select the correct Relying Party Application in the dropdown.
  4. Select X.509 Certificate for type
  5. Click Choose file and upload the .pfx file. Note: On this page, it also shows how to use the MakeCert utility from the Microsoft SDK to make this certificate.
  6. Click Save

SharePoint 2013 Configuration

Within SharePoint, we will move forward in this walkthrough with the assumption that the Web Application within SharePoint 2013 has already been created with a friendly URL assigned to it.

  1. Log in to the Central Admin server
  2. Open a PowerShell ISE session as an administrator and run the trust-creation commands (a sketch follows this list)
  3. I would recommend running that code line by line to ensure you can identify any errors as they happen, rather than hitting an error you can’t attribute to a single line of code when running the whole block
  4. Open Central Administration and click Manage Web Applications
  5. Click the web application that we have been using thus far, and click Authentication Providers in the ribbon.
  6. In the appropriate zone, you should be able to see empty check boxes for the newly created Trusted Identity Providers that we have set up. Check the boxes next to Windows Live ID
  7. Press Save
  8. SharePoint now knows to trust any SAML1.1 tokens that come from the specific Azure ACS namespace that we have created, and knows how to parse the claims within the token.
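
Here is a sketch of the trust-creation commands referenced in step 2. It assumes the ACS token-signing certificate was exported to C:\ACS-TokenSigning.cer, uses “yournamespace” as a stand-in for the ACS namespace created earlier, and reuses the example realm from the Relying Party Application section – adjust all three to your environment:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Trust the ACS token-signing certificate
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\ACS-TokenSigning.cer")
New-SPTrustedRootAuthority -Name "Azure ACS Token Signing" -Certificate $cert

# Map the emailaddress claim we configured in the ACS rule group
$map = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" -IncomingClaimTypeDisplayName "EmailAddress" -SameAsIncoming

$realm = "https://winchestertonfieldville.ia.gov/"
$signInUrl = "https://yournamespace.accesscontrol.windows.net/v2/wsfederation"

# Register ACS as a trusted identity token issuer for the web application
New-SPTrustedIdentityTokenIssuer -Name "Azure ACS" -Description "Azure ACS namespace trust" -Realm $realm -ImportTrustCertificate $cert -ClaimsMappings $map -SignInUrl $signInUrl -IdentifierClaim $map.InputClaimType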

You are now complete! You should now be able to hit the SharePoint Web Application URL, be redirected to an Azure ACS hosted login page, choose between Windows authentication and Windows Live ID, be redirected to the Windows Live ID login page to authenticate, and automatically be pushed back into SharePoint. With the configuration performed thus far, users will hit a “You are not authorized to view this page” SharePoint page. For security, I would recommend enabling Access Request settings for the root Site Collection, and setting the recipient to a highly available individual or a distribution group able to triage requests from prospective users as they hit your environment.

PowerShell in MOSS 2007 – Site Maps

So you have a client that’s running MOSS 2007. With the features and benefits of SharePoint 2013, it’s easy to get quickly frustrated with this old version of SharePoint. Nothing irks me more than forgetting about the lack of PowerShell, each and every time, and spending time looking up the equivalent STSADM commands to achieve the same task.

Isn’t there some way I can PowerShell SharePoint 2007?

Yes, but it kind of sucks. I would always recommend using STSADM commands natively when possible, but there really isn’t a good alternative to PowerShell sometimes. Windows SharePoint Services 3.0 and SharePoint Server 2007 do not include built-in cmdlets, but we can do the following if PowerShell is at least installed on the server:

1. Run the following to set the execution policy to allow scripts to be run locally (this may or may not be reversed/denied via group policy):
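
RemoteSigned is the common choice here; use whatever your group policy allows:

Set-ExecutionPolicy RemoteSigned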

2. Load the SharePoint API into your script with the following:
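
This is the same LoadWithPartialName call quoted in the file inventory post above:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") | Out-Null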

3. Load the MOSS 2007 Script Collection from CodePlex onto the machine HERE

4. Create a new PS1 file with the following format:
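
The exact file names in the CodePlex collection escape me, so treat these as placeholders – the point is the dot-source format, which loads each library’s functions into the session:

# Dot-source the collection's library scripts (paths/names are placeholders)
. "C:\Scripts\MOSS2007\SharePointLibrary.ps1"
. "C:\Scripts\MOSS2007\SiteMapLibrary.ps1"

# ...then call the functions they define further down in your own PS1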

The above will dot-source the two referenced PS1 files and keep their functions in memory ahead of the cmdlets being called later in the script; when they are called, the console knows where to look for them.

The following can be done via native STSADM to create the same type of output (Note: the enumallwebs command is an STSADM command that is only available in SP ’07 SP2 and above):
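
A sketch using a placeholder content database name – substitute your own:

stsadm -o enumallwebs -databasename WSS_Content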

Both of these options clearly will work, but the output will be different and some may prefer to stick to STSADM when possible in the older environments.

In the end, this is simply a workaround that allows you to run a small sub-set of your PowerShell scripts that you know and love from SP’13 and SP’10 over a MOSS2007 farm when STSADM commands just fall short.

SharePoint – The local farm is not accessible

Below is an error one may receive when attempting to run SharePoint PowerShell cmdlets through the SharePoint Management Shell:

“The local farm is not accessible. Cmdlets with FeatureDependencyID are not registered.”

OR

Get-SPContentDatabase : The farm is unavailable.
At line:1 char:1
+ Get-SPContentDatabase
+ ~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidData: (Microsoft.Share...ContentDatabase:SPCmdletGetContentDatabase) [Get-SPContentDatabase], SPException
    + FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SPCmdletGetContentDatabase

This error can occur when the user running the commands doesn’t have the proper permissions to the correct back end databases. The SecurityAdmin role is necessary to run shell commands. To resolve this error, either grant the account running the command SharePoint_Shell_Access and db_owner (at minimum on the configuration database) in SQL, or log in with the SP_Admin account and run with elevated permissions.

If you already have the account granted the above permissions in SQL, you can run the following in a PowerShell console:

Add-SPShellAdmin -UserName DOMAIN\User

Veeam File Level Restore – Restoring from a Storage Pool

Veeam is unable to restore a file from a backup where its original location was within a Storage Pool. Seriously?

I want to preface this post by stating that I have had experience with Backup Exec, Data Protection Manager, and Veeam, and that Veeam is by far the easiest solution to implement, use, and maintain. The installation of the entire infrastructure is foolproof. The error messages are easy to resolve. The Veeam support is rock solid. And the wizards used for restores are about as intuitive as they come, especially after beginning my backup/disaster recovery work with the beast which is Backup Exec.

With that said, I have been disappointed with a few restore capabilities of Veeam. The first file-level restore I had to make was a simple Word document to be restored in place to a file server. I spun up the built-in recovery wizard on the VM, and the restore continually failed and errored out. Why was a restore this simple failing?

Turns out, Veeam is not able to restore files from Storage Pools by default.

The overall process of recovering a file from a backup where it was located on a Storage Pool can be quite a hassle. Sometimes, you really have to ask if it’s worth it and if the user REALLY needs that file, and most times – you can bet your buns that it’s a life or death situation to the end user despite what you think.

So how to do it?

High level overview:

  • Spin up an Instant Recovery copy of the VM containing the file needing to be restored
  • Create a new temp disk on the sandboxed restored VM
  • Move the file to the new disk
  • Move the new disk to the prod VM (recovery target)
  • Move file from temp disk
  • Delete temp restored VM and temp disk

More detailed explanation:

Within Veeam, you’ll need to choose the Instant VM Recovery option from the Restore tab. Work through this wizard, selecting “Restore to a new location, or with different settings”. Rename the VM to something different than the original name so as not to confuse your VM host with conflicting names.

Now that your new VM is recovered (not connected to a network and not powered on), add a new hard disk to the VM. The new disk that you are adding to this VM only needs to be large enough to hold the file you need to restore. Once this is set up, power on and console into the VM, and move your file off the original Storage Pool location and onto the new disk that was added.

With your new disk containing the restored file, log into your production VM, attach the disk, pull the file off, and then clear away all the fragments: the temporarily restored VM, the temp disk, etc.

This was an issue that I battled for quite some time, and I only figured out this solution after having a case open with Veeam for a while and being told that Veeam is actually unable to perform restores from Storage Pools as of v7.0 Patch 3 (February 2014). This seems like a simple enough feature that I am more than sure it will eventually not be an issue, but for now, this is the only way I know of doing it.

If you have any comments or a better way of handling this, please leave a comment and I would love to hear your thoughts.