SharePoint 2007 File Inventory

Once again, I am reminded of how much I appreciate the PowerShell snap-ins for SharePoint after the 2010 release. Attempting to iterate through a farm or script out actions using only STSADM commands and direct access to the SharePoint assemblies ([System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")) is much more challenging than using the built-in PowerShell capabilities of the later versions (even SharePoint Online, with its own quirks).

I have a client who is migrating from an on-premises SharePoint 2007 environment to SharePoint Online. Yes – big jump, and yes – there will be some headaches. Many features and functions of SharePoint 2007 simply will not migrate, or cannot be mapped to equivalent SharePoint Online features, due to the sheer difference in the way each platform is laid out. Because of this, identifying these issues proactively before a migration is far better than reacting to broken page layouts, missing site feature dependencies, and other prerequisites after the fact. To make matters a step more complicated, the client had elected to engage in a branding effort to give their SharePoint Online instance a fresh new look, which resulted in a site that really didn't look like SharePoint at all in the end.

So how do you migrate a bunch of SharePoint 2007 ASPX pages, based on deprecated site templates and layouts, into a branded SharePoint Online instance which utilizes a specific page layout for branding and the site home pages?

You don’t.

Luckily, this client didn't have very customized site home pages or a large number of customized pages throughout their farm, so re-creating these pages as branded SPO pages manually was feasible. To prepare for this work and delegate the ASPX page recreation to site owners, the client wanted an inventory of all ASPX pages currently in their farm, along with their locations. Per my normal consulting habits, I was confident in telling the client we could figure out a way to produce this report to help them plan the work, yet I was not 100% sure at the time how I would do it. Now it was time to search for how to get a SharePoint 2007 file inventory, and then strip out everything besides the ASPX pages from the list.

Through some research online, all arrows pointed to Gary LaPointe. The SharePoint MVP is THE man when it comes to STSADM commands. His website has WSP solutions for installing snap-ins with SharePoint 2010+ PowerShell commands, as well as a number of scripts. Lucky for me, he had one script on his website which iterates through every web application in the local farm, drills down through each Site Collection, subsite, and document library recursively, and outputs the results to either a grid view or a CSV file. I would highly recommend his personal blog to any SharePoint consultant or legacy farm administrator.

View the script on his site HERE, as well as the downloads page which features a large library of tools and scripts for completing unique tasks that aren’t quite perfectly mapped out in OoB cmdlets.
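As a rough illustration of what such an inventory involves when PowerShell has to call the SharePoint 2007 object model directly, here is a minimal sketch (this is not Gary's script; the output path and selected properties are my own assumptions, and it assumes PowerShell is installed on a farm server and run under an account with farm access):

```powershell
# Load the SharePoint 2007 API directly - MOSS has no PowerShell snap-ins
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") | Out-Null

$results = @()
$webApps = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.WebApplications
foreach ($webApp in $webApps) {
    foreach ($site in $webApp.Sites) {
        foreach ($web in $site.AllWebs) {
            foreach ($list in $web.Lists) {
                # Only document libraries can contain ASPX pages
                if ($list.BaseType -eq "DocumentLibrary") {
                    foreach ($item in $list.Items) {
                        if ($item.Url -like "*.aspx") {
                            $results += New-Object PSObject -Property @{
                                WebUrl  = $web.Url
                                Library = $list.Title
                                Page    = $item.Url
                            }
                        }
                    }
                }
            }
            $web.Dispose()  # SPWeb/SPSite objects must be disposed explicitly
        }
        $site.Dispose()
    }
}

# Keep only the ASPX pages and hand the list to the site owners
$results | Export-Csv -Path "C:\Temp\AspxInventory.csv" -NoTypeInformation
```

Gary's script does considerably more (grid view output, additional metadata); this just shows the recursive shape of the problem.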

Azure ACS Namespace authentication with SharePoint 2013


This document outlines the process for implementing Azure's ACS namespace as a trusted identity provider for an on-premises SharePoint environment, utilizing external identity providers such as Facebook, Yahoo, or Windows Live ID to pass manipulated claims as a standardized trusted token to SharePoint.

Preface, Considerations, Prerequisites

  • Azure ACS authentication with SharePoint is a free service as of 12/2014. The service requires a subscription to Azure, but costs no money. A $0 spending limit can be set to prevent the possibility of any charges coming through from this account.
  • The Azure ACS system can utilize any Identity Provider that can export an OpenID claim
  • SharePoint 2013 does not support SAML 2.0; SAML 1.1 must be used
  • As of 12/2014, Windows Live ID can only export a nameidentifier claim, but this can be mapped to an email address claim to use as a UPN for login
  • An X.509 SSL certificate needs to be uploaded to Azure to sign tokens, and to be explicitly trusted in SharePoint


Azure ACS – ACS (Access Control Service) is a free feature of Azure that allows the developer to create a namespace that ties external Identity Providers' incoming tokens into a single claims-based token for a web application
Azure Namespace – the item in Azure that provides the URL for all Identity Providers, Relying Party Applications, Rule Group Settings, and Certificates and Keys to be trusted
Identity Provider – an external service, such as Yahoo!, Facebook, or Windows Live ID, that provides the inherently trusted authentication to SharePoint
Relying Party Application – these are the web applications that are connected to the namespace that receive the token that is output from the Azure ACS namespace. In this walkthrough, SharePoint 2013 is our Relying Party Application
Rule Groups – Rule groups are the list of incoming claims tied to claim issuers/identity providers, how they may be mapped when exported, as well as any customizations that need to happen for the token itself

Azure Setup

Creating the Namespace
  1. Log in to the Azure management portal with the Azure account created prior to this work. Note: Ensure that this account has a subscription set up prior to continuing any further; any work created with a trial account will expire once the trial is over, and cannot be transferred to a paid account
  2. Once logged into the management console, click the large NEW button at the bottom left of the screen to begin the process of creating a namespace. [Screenshot: Azure1]
  3. Navigate to App Services > Active Directory > Access Control > Quick Create. [Screenshot: Azure2]
  4. Give the namespace a name that makes sense for the authentication setup (this won't be an actively used URL for an end-user). Select your region, and the subscription that has been set up (Pay-As-You-Go, or another valid subscription, NOT free trial). The namespace status will change from creating to active. [Screenshot: Azure3]
  5. Click on the newly created namespace, and click the Manage button on the bottom ribbon; this will take you to the Silverlight-based Windows Azure Platform Access Control Service control panel. [Screenshot: Azure4]
  6. Utilize the Getting Started steps that appear on the homepage of this control panel for much of the work. You will now want to set up the Identity Providers:

Making Windows Live ID Work for SharePoint 2013 Authentication

Since the Windows Live ID Identity Provider is a preset default in the Azure ACS namespace, I will use it as a full example on configuring the claim issuer for authentication in SharePoint, and will later lead into how to configure additional Identity Providers.

Configure Relying Party Applications
  1. Navigate to the Relying party applications management by clicking the URL on the left-hand side of the ACS administrative panel
  2. Click the Add URL at the top left of the Relying Party Applications table
  3. On the configuration page, enter in the following information:
    Name: a display name such as “SharePoint2013” is fine for this as it is simply a display name for SharePoint’s trust with ACS
    Realm: the realm will be equal to the resolvable URL for your SharePoint web application
    Return URL: this is the URL to which ACS will return tokens to SharePoint. This will always be the same URL as the realm defined above, with "/_trust" appended to the end
    Error URL (optional): This is the URL for an optional error page in case authentication fails
    Token Format: Choose SAML 1.1, SharePoint 2013 is NOT compatible with SAML 2.0
    Token lifetime (secs): This value represents the seconds that the token is valid before expiring, or how long the user will be able to hit the realm URL repeat times and be auto-authenticated back in to the environment. The default value of 600 seconds is typically on the low side. Set this value to 3600 seconds (equal to the default value in ADFS).
    Identity Providers: the list of Identity Providers that you would like to have as optional authentication methods when users hit the realm’s login page. Each Identity Provider will show in the login drop down
    Rule Groups: leave the check box to create a new rule group as we will be doing this in the next step
    Token Signing: Choose whether to Use service namespace certificate or Use a dedicated certificate. For the purposes of this walkthrough, I would recommend choosing Use a dedicated certificate and uploading the X.509 certificate .pfx file to use for signing. Browse to the file, upload, and input the password that was created when it was exported from the server.
  4. Click Save
Configure Rule Groups

The rule groups are the configuration for what claims Azure will receive from the Identity Providers, as well as how they will map these claims to the output token.

  1. Navigate to the Rule Groups page by clicking the URL on the left-hand navigation
  2. Click the “Default Rule Group for <RelyingPartyApplication>” that was automatically created when we set up the Relying Party Application
  3. On the Edit Rule Group page, feel free to edit the name for the rule group
  4. In the Rules table, click the Generate URL. You will notice that the Rules table currently shows “No rules have been added. Click Generate to generate rules automatically, or click Add to add rules manually.” Note – If additional Identity Providers are added at a later time, use this option to generate new rules for the new identity providers
  5. On the following page, click the checkbox next to Windows Live ID, and click Generate
  6. You will then see that the nameidentifier output claim from Windows Live ID is listed as a rule. Click on the nameidentifier URL.
  7. On the resulting page, you can see how to configure the input/output of this claim and how ACS will prep it for the output token that will be trusted by SharePoint. By default, nameidentifier is not a usable claim for a login in SharePoint. To address this, we will map the output claim type to email address.
  8. For everything under the If section, leave default
  9. For the "Then" section, change the claim type dropdown from the nameidentifier URI to the emailaddress claim type, to map the claim to a usable email address login
  10. You can leave all other items as default, or add information under “Rule Information” that will explain that we are mapping the Windows Live ID nameidentifier claim to an emailaddress output claim
Upload Token Signing Certificate

Azure ACS uses an X.509 certificate to digitally sign the tokens it generates. You can also increase the functionality of this certificate to include encryption/decryption of the tokens themselves. Many users will use the makecert utility to create a self-signed certificate for testing purposes, but do not do this in a production environment.
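For reference, a makecert invocation for a self-signed, exportable test certificate looks roughly like this (the subject name and expiration date are placeholders; export it from the personal store as a .pfx afterwards):

```powershell
# -r self-signed, -pe exportable private key, -ss my = personal certificate store
makecert -r -pe -n "CN=ACSTokenSigning" -sky exchange -ss my -len 2048 -e 12/31/2017
```

Again: test environments only. Production should use a certificate from a real certificate authority.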

  1. Click the Certificates and Keys URL on the lefthand side of the Azure ACS configuration page
  2. Click the Add URL at the top of the Token Signing table
  3. On the “Add Token-Signing Certificate or Key” page, select the correct Relying Party Application in the dropdown.
  4. Select X.509 Certificate for type
  5. Click Choose file and upload the .pfx file. Note: On this page, it also shows how to use the MakeCert utility from the Microsoft SDK to make this certificate.
  6. Click Save

SharePoint 2013 Configuration

Within SharePoint, we will move forward in this walkthrough with the assumption that the Web Application within SharePoint 2013 has already been created with a friendly URL assigned to it.

  1. Log in to the Central Admin server
  2. Open a PowerShell ISE session as an administrator and run the following:
  3. I would recommend running the above code line by line so you can identify errors as they happen, rather than facing an error you can't attribute to a single line of code when running the whole block at once
  4. Open Central Administration and click Manage Web Applications
  5. Click the web application that we have been using thus far, and click Authentication Providers in the ribbon.
  6. In the appropriate zone, you should see empty check boxes for the newly created Trusted Identity Providers that we have set up. Check the box next to Windows Live ID
  7. Press Save
  8. SharePoint now knows to trust any SAML1.1 tokens that come from the specific Azure ACS namespace that we have created, and knows how to parse the claims within the token.
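The PowerShell referenced in step 2 isn't shown above, but the trust setup generally takes the following shape. Treat this as a hedged sketch: the certificate path, provider name, realm, and sign-in URL are placeholders that must match what was configured in your ACS namespace.

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# The public token-signing certificate exported from the ACS namespace (placeholder path)
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\Certs\ACSTokenSigning.cer")
New-SPTrustedRootAuthority -Name "Azure ACS Token Signing" -Certificate $cert

# Map the emailaddress claim that the ACS rule group outputs to a SharePoint identity claim
$emailClaim = New-SPClaimTypeMapping `
    -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" `
    -IncomingClaimTypeDisplayName "EmailAddress" -SameAsIncoming

# Realm and sign-in URL must match the Relying Party Application configured in ACS
$realm = "urn:sharepoint:webapp"   # placeholder - use your web application's realm
$signInUrl = "https://yournamespace.accesscontrol.windows.net/v2/wsfederation"

New-SPTrustedIdentityTokenIssuer -Name "Windows Live ID" `
    -Description "Azure ACS namespace trust" `
    -Realm $realm `
    -ImportTrustCertificate $cert `
    -ClaimsMappings $emailClaim `
    -SignInUrl $signInUrl `
    -IdentifierClaim $emailClaim.InputClaimType
```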

You are now complete! You should now be able to hit the SharePoint Web Application URL, be redirected to an Azure ACS-hosted login page, choose between Windows authentication or Windows Live ID, be redirected to the Windows Live ID login page to authenticate, and automatically be pushed back to SharePoint. With the configuration performed thus far, users will hit a "You are not authorized to view this page" SharePoint page. For security, I would recommend enabling Access Request settings for the root Site Collection, and setting the recipient to a highly available individual, or a distribution group, to triage requests from potential users as they hit your environment.

Powershell in MOSS 2007 – Site Maps

So you have a client that’s running MOSS 2007. With the features and benefits of SharePoint 2013, it’s easy to get quickly frustrated with this old version of SharePoint. Nothing irks me more than forgetting about the lack of PowerShell, each and every time, and spending time looking up STSADM equivalent cmdlets to achieve the same task.

Isn’t there some way I can PowerShell SharePoint 2007?

Yes, but it kind of sucks. I would always recommend using STSADM commands natively when possible, but sometimes there really isn't a good alternative to PowerShell. Windows SharePoint Services 3.0 and SharePoint Server 2007 do not include built-in cmdlets, but we can do the following if PowerShell is at least installed on the server:

1. Run the following to set the execution policy to allow scripts to be run locally (this may or may not be reversed/denied via group policy):
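The execution policy change would presumably be along these lines (RemoteSigned is my suggestion; your organization's policy may dictate otherwise):

```powershell
# Allow locally written scripts to run; downloaded scripts must still be signed
Set-ExecutionPolicy RemoteSigned
```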

2. Load the SharePoint API into your script with the following:
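This is the same direct assembly load mentioned at the top of the post:

```powershell
# Pull the SharePoint object model into the session - MOSS 2007 has no snap-in
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") | Out-Null
```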

3. Load the MOSS 2007 Script Collection from CodePlex onto the machine HERE

4. Create a new PS1 file with the following format:
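The format might look like the following sketch; the paths and script file names are placeholders for wherever you unpacked the CodePlex collection:

```powershell
# Dot-source the downloaded scripts so their functions are loaded into the session
. "C:\Scripts\MOSS2007\SharePoint-Functions.ps1"
. "C:\Scripts\MOSS2007\SiteMap-Functions.ps1"

# The functions defined in those files can now be called like regular cmdlets
```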

The above will run the two referenced PS1 files and keep their functions in memory prior to calling the cmdlets later in the script, so that when they are called, the console knows where to find them.

The following can be done via native STSADM to create the same type of output (Note: the enumallwebs command is an STSADM command that is only available in SP ’07 SP2 and above):
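A sketch of the STSADM equivalent (the content database name and web application URL are placeholders):

```powershell
# Enumerate every site collection and web in a given content database (SP 2007 SP2+)
stsadm -o enumallwebs -databasename WSS_Content

# Or list the site collections per web application
stsadm -o enumsites -url http://yourwebapp
```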

Both of these options clearly will work, but the output will be different and some may prefer to stick to STSADM when possible in the older environments.

In the end, this is simply a workaround that allows you to run a small sub-set of your PowerShell scripts that you know and love from SP’13 and SP’10 over a MOSS2007 farm when STSADM commands just fall short.

SharePoint – The local farm is not accessible

Below is an error one may receive when attempting to run SharePoint PowerShell cmdlets through the SharePoint Shell Admin:

“The local farm is not accessible. Cmdlets with FeatureDependencyID are not registered.”


Get-SPContentDatabase : The farm is unavailable.
At line:1 char:1
+ Get-SPContentDatabase
+ ~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (Microsoft.Share...ContentDatabase:
SPCmdletGetContentDatabase) [Get-SPContentDatabase], SPException
+ FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SPCmdletGetConte

This error can occur when the user running the commands doesn't have the proper permissions to the correct back-end content databases. The SecurityAdmin role is necessary to run Shell commands. To resolve this error, either grant the account running the command SharePoint_Shell_Access and db_owner (at minimum on the configuration database) in SQL, or log in with the SP_Admin account and run with elevated permissions.

If you already have the account granted the above permissions in SQL, you can run the following in a PowerShell console:

Add-SPShellAdmin -UserName DOMAIN\User
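Add-SPShellAdmin can also be scoped to a specific database via the pipeline if you don't want to grant access farm-wide (the database name here is a placeholder):

```powershell
# Grant Shell access on one content database only
Get-SPDatabase | Where-Object { $_.Name -eq "WSS_Content" } | Add-SPShellAdmin -UserName "DOMAIN\User"
```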

Veeam File Level Restore – Restoring from a Storage Pool

Veeam is unable to restore a file from a backup when its original location was within a Storage Pool. Seriously?

I want to preface this post by stating that I have had experience with Backup Exec, Data Protection Manager, and Veeam, and that Veeam is by far the easiest solution to implement, use, and maintain. The installation of the entire infrastructure is foolproof. The error messages are easy to resolve. The Veeam support is rock solid. And the wizards used for restores are about as intuitive as they come, especially after beginning my backup/disaster recovery work with the beast that is Backup Exec.

With that said, I have been disappointed with a few of Veeam's restore capabilities. The first file-level restore I had to make was a simple Word document to be restored in place to a file server. I spun up the built-in recovery wizard on the VM, and the restore continually failed and errored out. Why was a restore this simple failing?

Turns out, Veeam is not able to restore files from Storage Pools by default.

The overall process of recovering a file from a backup where it was located on a Storage Pool can be quite a hassle. Sometimes, you really have to ask if it’s worth it and if the user REALLY needs that file, and most times – you can bet your buns that it’s a life or death situation to the end user despite what you think.

So how to do it?

High level overview:

  • Spin up an Instant Recovery copy of the VM containing the file needing to be restored
  • Create a new temp disk on the sandboxed restored VM
  • Move the file to the new disk
  • Move the new disk to the prod VM (recovery target)
  • Move file from temp disk
  • Delete temp restored VM and temp disk

More detailed explanation:

Within Veeam, you'll need to choose the Instant VM Recovery option from the Restore tab. Work through this wizard, selecting "Restore to a new location, or with different settings". Rename the VM to something different than the original name so as not to confuse your VM host with conflicting names.

Now that your new VM is restored (not connected to a network and not yet powered on), add a new hard disk to it. The new disk only needs to be large enough to hold the file you need to restore. Once this is set up, power on and console into the VM, and move your file off the original Storage Pool location and onto the new disk that was added.

With your new disk containing the restored file, log into your production VM, attach the disk, pull the file off, and then clear away all the fragments: the temporarily restored VM, the temp disk, and so on.

This was an issue that I battled for quite some time, and I only figured out this solution after having a case open with Veeam for a while and being told that Veeam is actually unable to perform restores from Storage Pools as of v7.0 Patch 3 (February 2014). This seems like a simple enough feature that I am sure it will eventually no longer be an issue, but for now, this is the only way I know of doing it.

If you have any comments or a better way of handling this, please leave a comment and I would love to hear your thoughts.

E6530 Sound Issues


When I’m working in front of a screen day in and day out, I require some good tunes. The silence and ambient office chatter can really ruin your concentration if you’re easily distracted like myself. Without my good tunes, productivity can drop.

I was getting ready to jump into a Lync conference call and plugged in a USB headset, and my sound suddenly stopped working. If I tried to play a test sound for the default playback device on my Dell Latitude E6530, I would get an error message saying that there was an issue and the test sound couldn't be played. Something was clearly up.

[Screenshot: Test tone failure]


I reinstalled my audio drivers, which seemed to resolve the issue for a short time, but the next time I plugged in or unplugged my regular headphones, the issue came back. This was driving me crazy; I didn't want to accept that this was an issue I wouldn't be able to resolve. This laptop hadn't failed me up to this point. Here is what finally fixed it:


  1. Right click the speaker icon in your system tray and click “Playback Devices”
  2. Under the device that is failing to output audio, right click > Properties
  3. In the Properties window, click the Advanced tab
  4. Click Restore Defaults

SharePoint Feature GUID Identification

When working on a SharePoint 2007 to 2013 migration, I needed to identify SharePoint feature GUIDs in the 2007 farm to determine whether or not these items would be able to migrate to the new farm without issue. I had an application that provided documentation on each of the farms and identified all Site Collection features that were activated, but it would output the resulting data as a list of GUIDs rather than feature names.

Many features that come with SharePoint have GUIDs that you can reference with a quick Google search for SharePoint Feature GUIDs. But for custom features, I needed a way to pair these GUIDs with the feature names.


What to do?

When on the Site Collection you're examining, go to Site Settings under the gear icon at the top right, or Site Actions, depending on what version of SharePoint you're using. [Screenshot: SharePoint 2013 Site Settings]

Navigate to Site Collection Features on the Site Settings page. [Screenshot: Site Collection Features]

You will then be given a page with a list of features activated, or available to activate, on that Site Collection. To find a specific feature's GUID, click Deactivate. Note: This will NOT deactivate the feature yet; you will be prompted with a confirmation screen before the feature is actually deactivated.


The next page will be the feature deactivation confirmation screen. We are not going to deactivate any features. The trick is that the URL of this screen contains the feature GUID in a FeatureId=<FeatureGUID> section of the URL. [Screenshot: Feature GUID in the URL]
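If you have console access to the 2007 farm, the object model can also dump the GUID-to-name pairing in one shot, rather than chasing features one URL at a time. A sketch, assuming PowerShell is installed on a farm server:

```powershell
# List every installed feature definition with its GUID, name, and scope
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") | Out-Null
$farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
$farm.FeatureDefinitions |
    Sort-Object DisplayName |
    Format-Table Id, DisplayName, Scope -AutoSize
```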

Remote Access – Port Forwarding UVerse 3800HGV-B

Recently, I found a need to set up a Remote Desktop Connection to my home desktop from work. At times, I've needed to access files that I accidentally left at home, or to continue a long-running task that I'd like to have finished by the time I get home. I've explored options such as LogMeIn, but found them tacky and overkill. Additionally, I use a program called RDTabs extensively at work to RDP into different clients' environments, so why not my own? Thus, the decision to set up external RDP access to my desktop seemed simple.


Part 1: Port Forwarding the Gateway for RDP Access

This seemingly simple task proved to be quite daunting. Previous routers I have used have had much simpler UIs than this one. Want to swap out this router with one you already have? Too bad. UVerse doesn't want to let you do this. Based on some quick internet searches, setting the 3800HGV-B to bridge mode isn't an option either. How nice.

Remote Desktop Connections use port 3389. We need to set up the router to forward connections on this port to my desktop computer.

  1. Connect to the router's administrator console using the default access URL. [Screenshot: Router URL]
  2. The default username and password to connect will be printed right on the router (assuming you haven't changed them yet)
  3. Navigate to the firewall settings, and then Applications, Pinholes, and DMZ (Settings -> Firewall -> Applications, Pinholes, and DMZ). [Screenshot: Settings - Firewall]
  4. On the Firewall page, choose a profile for the device that you would like to apply the port forwarding settings to. In my case, I selected my ethernet-connected desktop named "Neptune". Notice the icons next to the devices, which can help identify your device in case you are unsure of a name. [Screenshot: Device selection]
  5. Next, ensure that "Allow individual application(s)" is selected. Select "XP Remote Desktop" in the Applications List in the center column, and then click Add. [Screenshot: XP Remote Desktop]
  6. Click Save at the bottom right. [Screenshot: Save]

Now, we have the correct port forwarding to the device for remote desktop access. One would think that they are finished, but the finicky gateway provided by AT&T will prove that the setup you have just finished isn’t quite enough. Proceed to Part 2.


Part 2: Port Forward PPTP to the Same Device

This step doesn't really make a lot of sense to anyone who knows what PPTP is. PPTP is a protocol used for VPN access, which is outside the realm of what we are doing today. Luckily, during my troubleshooting I stumbled across a blog post stating that turning on PPTP port forwarding to the same device will allow the RDP connections.

  1. Ensure you are on the “Applications, Pinholes, and DMZ” page that we were on from step 3 above.
  2. On the Firewall page, choose a profile for the device that you would like to apply the port forwarding settings to (step 4 above).
  3. Select the PPTP option in the middle Applications List (similar to step 5 above)
  4. Click Add
  5. Click Save


Part 3: Enabling Remote Access on the Device

Beyond the settings that have to be made in order to allow your router to forward these ports, you must make sure that your computer itself will allow Remote Access. Below are steps to make sure that your device will allow you to connect remotely. Note: The steps outlined below are for Windows 8.1

  1. Open File Explorer, click This PC in the left navigation, and then System Properties in the ribbon. [Screenshot: System Properties]
  2. Click Remote Settings in the left navigation on the resulting screen. [Screenshot: Remote Settings]
  3. Allow remote connections to your computer. [Screenshot: Enable Remote Access]


Part 4: Connecting to the Device

With the above steps completed, you should now be able to connect to your device using its public IP address.

  1. Find the public IP address for your network by Googling "What is my IP" or utilizing an IP lookup site
  2. Open your RDP application or open your Windows default Remote Desktop Connection application
  3. Connect using your public IP that was gathered above
  4. Enter credentials to log in
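With the built-in Windows client, steps 2 and 3 collapse into a single command from a Run prompt or console (the IP address below is a placeholder for the public IP you found):

```powershell
# Launch Remote Desktop Connection pointed at the home network's public IP
mstsc /v:203.0.113.10
```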


You’re done! You should now be able to connect to your computer. If you continue to have problems, make sure that your firewall isn’t blocking remote access connections. If you’re using the built-in Windows Firewall, allow remote access connections in your computer’s policy.