Robocopy Nested “Application Data” Glitch


So, I copied some profiles over using Robocopy. The size of these profiles on the new server was staggering! I had to expand the drive to accommodate the bloat!
Then I started looking – the data on the original profiles was not anywhere close to that big.

What happened?

Well, this is my normal Robocopy command:

Robocopy \\OldServer\c$\Users\username C:\users\username * /ZB /e /Copy:DATSO /dcopy:DAT /xo /r:0 /XD $Recycle.Bin DFSRPrivate /XF desktop.ini thumbs.db /Log:c:\temppath\username.log /np /tee

One profile got the glitch – and others were starting to before I added one little exclusion to my command:

Robocopy \\OldServer\c$\Users\username C:\users\username * /ZB /e /Copy:DATSO /dcopy:DAT /xo /r:0 /XD $Recycle.Bin "Application Data" DFSRPrivate /XF desktop.ini thumbs.db /Log:c:\temppath\username.log /np /tee

The profile that had the problem – well, I am still cleaning it up with the following command:

Robocopy c:\temppath\blank "C:\users\username\appdata\local\application data\application data" * /ZB /e /purge /Copy:DATSO /dcopy:DAT /xo /r:0 /XD $Recycle.Bin DFSRPrivate /XF desktop.ini thumbs.db /Log:c:\temppath\usernamePurge.log /np /tee

c:\temppath\blank is an empty folder. Thanks to the /purge switch, Robocopy will delete anything in the target that does not exist in the (empty) source. This also works for folder paths longer than the 260-character MAX_PATH limit that Windows can’t delete on its own.
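As a general pattern (the stubborn path below is hypothetical), the trick is: make an empty folder, mirror it into the problem directory with /purge, then remove the now-empty directory:

```
md c:\temppath\blank
robocopy c:\temppath\blank "D:\some\stubborn folder" /e /purge /r:0 /w:0
rd "D:\some\stubborn folder"
```

/r:0 /w:0 keep Robocopy from retrying locked files, so the purge finishes quickly.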

Anyway – when I started, I had 190 GB free on the drive – I am at 400 GB free on the drive now. More than 200 GB in nested “Application Data” folders – replicated by Robocopy over and over and over again.

The Cleanup in Action

24 nested folders – and this is after about 20 minutes of deleting! It is still going as I write this!

… and it just finished – 402 GB free –

Okay, I know what you’re thinking – what does copying a profile to a new system with Robocopy accomplish? It doesn’t really transfer the profile over!

That is where I have to give a shout out to the genius team at ForensIT – Profile Wiz is a life saver!

You can use Profile Wiz to literally take over a profile! Let’s say, for example, JohnSmith worked for the company for 5 years, and all his documents and such were in his profile. John gets hit by a bus and you get a new employee – Brad Cooper. You want Brad to have all of John’s information, so you use Profile Wiz to give c:\users\johnsmith to the BradCooper login. Concerned about the folder name? Change the folder name to BradCooper – and then use the “Unassigned Profiles” checkbox to assign it to Brad.

Absolutely worth the Professional Edition!

DC AD and Group Policy

In the last post, I covered setting up a new domain controller and some things to help keep your domain healthy, well organized and your IT provider happy.

In this followup, I will keep going. Now that we have a Domain Controller, a Domain and DNS, we should look at Group Policy.

Group Policy Walk-Thru

One of the reasons that we chose to create OUs instead of Containers in the last post/video is that group policy can be applied to OUs, but not Containers.

In going over Group Policy, I’d like to start with user folders. In a corporate environment, losing a file can be a very bad thing. For the most part, servers are backed up, but workstations are not. So, how do you protect users’ files? Server shared folders are one option, but I’ll cover a couple of others in this post – Folder Redirection and Home Folders. These let your users have more control over their files, as other users cannot normally access either one.

It is a good idea to make a dedicated drive for Data files, separate from the OS drive.

For Home folders, create a folder on the Data drive named something like “HomeFolders.”

Open the folder’s Properties, go to Security –> Advanced, and disable inheritance.

Remove the Users permissions – give Authenticated Users “This Folder Only” permissions to:
List folder / read data
Read attributes
Read extended attributes
Create folders / append data
Read permissions

The user “Creator Owner” should have “Subfolders and Files only” full control.

On the Sharing tab, use Advanced Sharing and share the folder as “Home$” to make it a hidden share. Give Everyone read and Authenticated Users full control of the share.
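If you would rather script the share than click through the dialog, a quick sketch with the built-in SmbShare cmdlets (the path and share name are taken from the example above) could look like:

```powershell
New-SmbShare -Name "Home$" -Path "E:\KearanCo\HomeFolders" `
    -ReadAccess "Everyone" -FullAccess "Authenticated Users"
```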

In Active Directory Users and Computers, on the Profile tab, in the Home Folder section, choose a drive letter and put a path with a folder name that matches the user’s logon name.

Clicking Apply creates the folder. If you have a lot of users and don’t want to edit every user to add the home folder, you can use PowerShell – but you will need to use PowerShell to give them permissions to the folder as well.
Below is a PowerShell script to create the folders for existing users, give the users permissions and set the home folder for all users in Active Directory.

Import-Module ActiveDirectory

# Script for updating folder permissions to give the user full access to their home folder
# as long as it's named the same as their username - so, jdoe will have full access to the jdoe folder.
# - with the "This Folder, Subfolders and Files" level.
# --- change the domain name and path
$domain = "kearan"
$hdpath = "E:\KearanCo\HomeFolders"

# --- Make Home Directories
$users = Get-ADUser -Filter *
foreach ($user in $users) {
    $nhd = Join-Path $hdpath $user.SamAccountName
    New-Item -ItemType Directory -Path $nhd
}

# --- Get the list of home folders
$folders = Get-ChildItem -Path $hdpath | Where-Object -FilterScript {
    $_.PSIsContainer -eq $true
}

# --- Set the folder permissions
foreach ($folder in $folders) {
    $path = $folder.FullName
    $acl  = Get-Acl -Path $path
    $user = $folder.Name
    $AccessRule  = New-Object System.Security.AccessControl.FileSystemAccessRule("$domain\$user","FullControl","ContainerInherit, ObjectInherit","None","Allow")
    $AccessRule1 = New-Object System.Security.AccessControl.FileSystemAccessRule("$domain\Domain Admins","FullControl","ContainerInherit, ObjectInherit","None","Allow")
    $acl.AddAccessRule($AccessRule)
    $acl.AddAccessRule($AccessRule1)
    $acl | Set-Acl $path
}

# --- Set users Home Directory in AD ---
# --- change "FileServer" to the actual file server name,
# --- Home$ to the actual share name and H: to your drive letter.
$users = Get-ADUser -Filter *
foreach ($user in $users) {
    $HomeDir = "\\FileServer\Home$\$($user.SamAccountName)"
    Set-ADUser $user -HomeDirectory $HomeDir -HomeDrive "H:"
}

You can use each section of the above script as a stand-alone script in order to do one at a time. Use the below code to change an existing home drive to a new server.

# Change Home Directory
$users = Get-ADUser -Filter {HomeDirectory -like '*Old_Server*'}
foreach ($user in $users) {
    $HomeDir = "\\NewServer\Home\$($user.SamAccountName)"
    Set-ADUser $user -HomeDirectory $HomeDir -HomeDrive "H:"
}

For Folder Redirection, create an AD group for all those you want to have redirected folders. Unless you are comfortable having all the users in an OU having the folder redirection, of course. To have more control over what accounts get the folder redirection, use the AD group method.

Create a folder, like the Home Folder above, with the same permissions. Now, go into Group Policy Management and create a new Group Policy.

Edit the group policy and go to User Configuration –> Policies –> Windows Settings –> Folder Redirection

Choose the items to redirect (see the video) and set the Security Filtering of the policy to Domain Computers (or whichever computer group you want, such as RDS Servers) and the group you want to apply it to, i.e. “Folder Redirection Group.” Link the policy to the domain, or to the target OU.

See the video for more on Group Policy and Troubleshooting.

The GPResult command from the video:


gpresult /H e:\kearanit\%username%_GPResult.htm

– Video only available through the blog – how to enable the AD Recycle Bin – restore accidentally deleted user accounts!

New Domain Controller Best Practices and Troubleshooting

A lot of guides and how-to videos out there show you the basics and the bare-bones – this is real world, git’r’done right stuff!

35 Minute Video Walk-thru of a Good Domain Controller configuration – read the rest of the blog for updates and corrections!

The first step to setting up a new domain, or creating a new domain controller for an existing domain, is, of course, to install the OS. We’ll assume that has already been done.

If you are adding a 2019 DC to an existing Domain, you will probably need to migrate the domain to use DFSR instead of FRS for syncing Active Directory and DNS – see the DFSR Migration section.

In the video above, I may do things in a different order, but here are the first steps:

A. Set a static IP address and Public DNS servers. The DNS servers you set here will become the DNS Forwarders of your new Domain Controller.

Well-known public DNS servers are (Google), (Cloudflare) and (Quad9).

B. Name the new server something those who come after you will understand. This means your organization name, year it was created and the server’s role should be in the name of the server. For a business named Kearan Company, our first domain controller could be Kearan-19-DC – created in 2019, acting as a Domain Controller. -DCFS, -RDS, -APP, -SQL, -Web, -Intranet are all possibly good role names to use. Just make sure that it is not a very long name – there are limits! NetBIOS computer names are capped at 15 characters, so keep the server name at or under that. (Changing the name requires a reboot!) (Update:) It has come to my attention that Domain Controller In-Place-Upgrades are easier and more reliable than before, so maybe having the year/OS version in the name is not a perfect idea, in case you upgrade from the 2012 to the 2019 OS. If this is something you would be comfortable with, make a naming convention – Kearan-DC-001 ; Kearan-RDS-002 ; Kearan-RDS2-003 ; etc.

C. Use Server Manager to install the Active Directory Domain Services role and DNS Server – see the above video for a walk-thru on that process.

D. With the roles installed and the server rebooted, promote to a domain controller! Document the DSRM Password!

  • Domain name is very important – keep it short and informative. For a company named Contoso Specialty Products Supply Company, a domain such as “Contoso” or “CSPSC” would be perfect. The dot-local (Contoso.local) used to be preferred, as it is the default. You can use another such as .corp or .main – I have even seen .private – but stay away from the major top level domain extensions such as .com, .net and .org as these can cause a conflict between local DNS and Public DNS. So, new info has come to light – .local and other ‘private’ extensions are being sold as TLDs (Top Level Domains) now, so the new Best Practice is to make your local domain a sub-domain of a domain you own. So, if you own example.com, you would want to make your local domain something like ad.example.com or internal.example.com. This way, you can also get SSL certs for your local domain names.
  • Document the DSRM password where it can be found in the future! Just in case.
  • NetBIOS name is just a short version of the domain name – so if you’re using contoso.local, make the NetBIOS name “contoso” – it has a 15 character limit, so be brief.
  • Reboot and Log in to your new Domain!

E. Set some important DNS settings to squash problems before they happen

  • Set Aging/Scavenging for all zones
  • Apply to existing Active Directory-integrated zones
  • Allow zone transfers to servers listed on the Name Servers tab
  • Automatically notify the servers listed on the Name Servers tab
  • Check and/or set forwarders

F. Open AD Users and Computers and create a good AD structure!

You will need to move newly added users and computers from their default “Users” and “Computers” containers into your structure, but it will make organization and group policy much easier to manage in the future.
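New accounts can also be made to land in your structure automatically by redirecting the default containers with the built-in redirusr and redircmp tools – the OU paths below are just examples from the Kearan structure, so substitute your own:

```
redirusr "OU=Employees,OU=Users,OU=_Kearan,DC=kearan,DC=local"
redircmp "OU=Workstations,OU=_Kearan,DC=kearan,DC=local"
```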

(See below for an AD User Import powershell!)

G. Copy the Administrator user and create a domain admin user based on your company – such as KearanIT. Add the new admin to the Backup Operators group. Log off of Administrator and log in with the new domain admin account. Now, DISABLE ADMINISTRATOR!
Move the new Admin account into the Service Accounts OU created as part of the good structure.
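The disable step can also be done from an admin PowerShell window – a minimal sketch, assuming you are already logged in as the new domain admin account:

```powershell
Disable-ADAccount -Identity "Administrator"
# verify - the Enabled column should now read False
Get-ADUser -Identity "Administrator" | Select-Object Name, Enabled
```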

H. Make the Domain Controller a Reliable Time Server using the commands below:

w32tm /config /manualpeerlist:"0.pool.ntp.org 1.pool.ntp.org 2.pool.ntp.org" /syncfromflags:manual /reliable:yes /update
w32tm /config /reliable:yes
net stop w32time && net start w32time
w32tm /query /peers

PowerShell, CMD Line and Troubleshooting for Domain Controllers

DFSR Migration

First, raise the Forest functional level to as high as possible – must be at least 2008 R2. Now get the Global State:

DFSRMig /GetGlobalState

Start the migration –

DFSRMig /SetGlobalState 1

Check on progress –

DFSRMig /GetMigrationState

Once the migration state says all domain controllers are synced, then go to SetGlobalState 2 – GetMigrationState until that is synced, then SetGlobalState 3 until that is synced. 3 is the Final state – you are all on DFSR now!
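Put together, the whole migration looks like this – run each /SetGlobalState only after /GetMigrationState reports that all domain controllers have reached the current state:

```
DFSRMig /GetGlobalState
DFSRMig /SetGlobalState 1
DFSRMig /GetMigrationState
DFSRMig /SetGlobalState 2
DFSRMig /GetMigrationState
DFSRMig /SetGlobalState 3
DFSRMig /GetMigrationState
```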

Powershell to Move FSMO Roles

Run the following in an admin PowerShell window on the server you want to be the new FSMO role holder:

Move-ADDirectoryServerOperationMasterRole -Identity $env:computername -OperationMasterRole 0,1,2,3,4

Don’t forget to Move the Last Two FSMO Roles using ADSIEdit.

Powershell to Import Users from a CSV file

You will need to copy the PowerShell code below into a new PowerShell ISE tab, then save it as a .ps1 and edit it for your needs. Create the CSV file with the two lines after “format of file:”

# - Imports given CSV file
# format of file:
# Firstname,Lastname,SAM,OU,Password,Description,EmailAddress
# bobby,kearan,bkearan,"OU=Employees,OU=Users,OU=_Kearan,DC=kearan,DC=local",Pl3asech@ngeme,Awesome IT Engineer,[email protected]
$csvPath = "C:\KearanCode\ADUserImport.csv" # Read-Host -Prompt 'path to the csv file'
$Servername = $env:computername # Read-Host -Prompt 'Name of the DC (servername)'
$Users = Import-Csv -Path $csvPath
foreach ($User in $Users) {
    $Displayname = $User.'Firstname' + " " + $User.'Lastname'
    $UserFirstname = $User.'Firstname'
    $UserLastname = $User.'Lastname'
    $OU = $User.'OU'
    $SAM = $User.'SAM'
    $Description = $User.'Description'
    $Password = $User.'Password'
    $Email = $User.'EmailAddress'
    New-ADUser -Name "$Displayname" -DisplayName "$Displayname" -SamAccountName $SAM -UserPrincipalName $SAM -GivenName "$UserFirstname" -Surname "$UserLastname" -Description "$Description" -EmailAddress "$Email" -AccountPassword (ConvertTo-SecureString $Password -AsPlainText -Force) -Enabled $true -Path "$OU" -ChangePasswordAtLogon $false -PasswordNeverExpires $false -Server $Servername
}

You will need to set them all to not need to change their password on first logon by using the following two commands:

Import-Module ActiveDirectory
Get-ADUser -Filter * -SearchBase "OU=_Kearan,DC=Kearan,DC=Local" | Set-ADUser -ChangePasswordAtLogon:$False

DCDiag Commands

The following does a report and saves it in a txt file on the root of C:\ (adjust to your preferred file path)

DCDiag /c /v /f:c:\dcdiag.txt

The following does a report and attempts to fix any issues it found and puts the txt file in the root of C:\

DCDiag /fix /v /f:c:\dcdiag.txt

Force a Time Zone Change

For some reason, Windows has become a bit difficult about changing the time zone. Below is a command line to see the time zone and change it. Last line outputs a list of time zone names that can be used. Open the cmd window as admin to run this.

tzutil /g
tzutil /s "Central Standard Time"
tzutil /l

Group Policy Central Store

A central store for Group Policy is a good thing to implement.  This allows all domain controllers to access the same policies no matter what version of server they are running.  To set this up, follow the below steps:

  1. Create the Central Store on a Domain Controller by creating the policy definitions folder: C:\Windows\SYSVOL\domain\Policies\PolicyDefinitions
  2. Copy all of the contents of C:\Windows\PolicyDefinitions into the newly created folder.
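Those two steps can be sketched in PowerShell on the DC (the paths are the defaults; adjust if your SYSVOL lives elsewhere):

```powershell
$store = "C:\Windows\SYSVOL\domain\Policies\PolicyDefinitions"
New-Item -ItemType Directory -Path $store
# copy the .admx files plus the language subfolders (en-US, etc.)
Copy-Item -Path "C:\Windows\PolicyDefinitions\*" -Destination $store -Recurse
```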

You now have a Central Store – as this gets replicated to all domain controllers.

Now you can download more up to date .admx files – such as Windows 10 and Windows 11 policies – or Google Chrome templates.  Extract those and then copy over to the central store.  Put the .admx files with the rest of the .admx files, and copy the language files from the “en-US” folder to the “en-US” folder in the central store.  You do not need to copy all the other language files unless you will be using them.

— what would you like to see covered next? Comment below —

The 7 FSMO Roles

5 FSMO roles? Oh, no. There are Hidden FSMO roles that they don’t tell you about!  They don’t want you to know about these until you run into a problem! There are really 7 FSMO Roles to know about.

Have you ever been unable to demote a domain controller?  It tells you that it can’t determine the fSMORoleOwner – even though a netdom query fsmo returns all 5 roles?

You may also get: “The Directory service is missing mandatory configuration information, and is unable to determine the ownership of floating single-master operation roles”

Well, there are two hidden roles: CN=Infrastructure,DC=ForestDnsZones  and CN=Infrastructure,DC=DomainDnsZones

So, the next time you are transferring FSMO roles, you need to move these two as well – before you Decom the old Role Holder!

Run ADSI Edit as admin.


Right-click on ADSI Edit and select “Connect to…” – accept the default naming context.


Click and expand the new “Default naming context” – click on the connection point, move to the right column and click Infrastructure:


Right click and select properties or double click to edit.

Scroll to fSMORoleOwner


You may see something like : CN=NTDS Settings\0ADEL:aae73bb2-d552-4b61-a6e0-7ce4e09dcc47,CN=oldservername\0ADEL:234e4831-f988-4c2a-a1ca-db0f8b2643d8

This is an already decommed DC that never got the fSMO role moved.

Double click to edit.  Change the CN to match your normal FSMO role holder.  You can copy the fSMORoleOwner value from the original “Default naming context” section – the one under DC=yourdomain,DC=tld.

Repeat for naming context “DC=ForestDnsZones,DC=yourdomain,DC=tld”


The fSMORoleOwner in each of the three “Infrastructure” sections should match.
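To review all three at once, here is a quick sketch using the ActiveDirectory PowerShell module (assumes RSAT / the AD module is available):

```powershell
Import-Module ActiveDirectory
$base = (Get-ADDomain).DistinguishedName
# the normal Infrastructure master plus the two "hidden" ones
"CN=Infrastructure,$base",
"CN=Infrastructure,DC=DomainDnsZones,$base",
"CN=Infrastructure,DC=ForestDnsZones,$base" | ForEach-Object {
    Get-ADObject -Identity $_ -Properties fSMORoleOwner |
        Select-Object DistinguishedName, fSMORoleOwner
}
```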

Printers Deployed via Group Policy

So, I ran into a situation the other day where some printers were added to some computers they were not supposed to be on. When we went to remove them – nobody could. Access denied. Even an Enterprise Admin could not remove the printer from the computer.
The culprit? Group Policy.

There are a few ways to deploy printers via group policy.
1. Click “Deploy” on your print server. Unless you want everyone and every system in the entire domain to have that printer – do not do this. You won’t know which policy it uses to deploy the printers, you won’t know where it is applied. ( probably sets a “printer” policy on the root of the domain )
2. Create a group policy using Computer Configuration –> Policies –> Windows Settings –>Printer Connections (on older DCs)  ( Don’t do this! )
3. Create a group policy using Computer Configuration –>Preferences –> Control Panel Settings –> Printers (Nobody will be able to delete these printers)
4. Create a group policy using User Configuration –> Preferences –> Control Panel Settings –> Printers (You will be able to delete these printers – and they will show back up on next reboot, unless removed from the policy)


O365 and your SPF Record

So, I recently discovered that Office 365 has a new trick up its sleeve – using SPF records WRONG.  Had several bouncing and rejection issues with some clients due to this new idiocy.

An SPF record is supposed to match your SENDING IP against the record.  But now O365 is requiring the MX record – the RECEIVING IP – to be in the SPF record.  Why is this messed up?  Well, for one, many people use third-party spam filtering services for their MX record – to filter out spam before it gets to their inbox.  So, many MX records point at spam filters – not at what is sending out the email.  Basically, O365 just opened up a huge security hole.
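For reference, an SPF record only needs to list the hosts that actually SEND mail for the domain – something like the TXT record below (the IP and include are placeholders, so substitute your real senders):

```
v=spf1 ip4: include:spf.protection.outlook.com -all
```

SPF does have an optional mx mechanism, but nothing in the spec requires the MX host to be listed unless it really sends outbound mail.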

Thanks, Microsoft.

Been Busy!

I’ve been doing a lot lately. I have stood up a Windows Server 2012 server, racked it and configured it for a client. I have updated boot code and firmware on firewalls and routers (Sonicwall, Cisco and Mikrotik). Fixed a couple “Read-only Filesystem” linux errors on Xen VMs.  Moved a few websites/email from a hosting service to our own hosting server.

Hmm, written down, it doesn’t seem like a whole lot – until you think about all the steps involved in moving a website, changing DNS in multiple places, syncing emails, doing documentation.


Labtech and MySQL to Monitor Exchange Backpressure

So, we decided we needed to monitor Exchange servers for Backpressure so we can be more proactive in preventing problems.

I wrote a script in Labtech to monitor the event log for incidents that indicate potential issues.  Check out the SQL concat!  The logic checks for existing tickets, and either creates a ticket, makes a note on an existing ticket or closes the ticket if the situation no longer exists.  I didn’t include putting time into the ticket, but that would be fairly easy as well.

See the script export below:

Resend EventLogs
SET:  @BackpressureEvent@ = SQLRESULT[SELECT Concat(eventlogs.TimeGen, " ", eventlogs.Message) As dEvent FROM eventlogs WHERE eventlogs.`Message` like '%resource pressure increased from Medium to High%' AND (timegen > DATE_SUB(NOW(), INTERVAL 1 HOUR)) AND ComputerID=%computerid% LIMIT 1]
IF  @Backpressureevent@  Contains  High  THEN  Jump to :Alert
SET:  @mysqlquery@ = SELECT COUNT(v_tickets.`TicketID`) FROM v_tickets WHERE v_tickets.`Subject` = '%locationname% / %computername% / Exchange Backpressure Detected'
SET:  @sqlresults@ = SQLRESULT[SELECT COUNT(v_tickets.`TicketID`) FROM v_tickets WHERE v_tickets.`Subject` = '%locationname% / %computername% / Exchange Backpressure Detected']
IF  @sqlresults@  >=  1  THEN  Jump to :ProcessTicket
:Alert – Label
Note: Backpressure! – need to create a ticket!
:CheckTicket – Label
SET:  @mysqlquery@ = SELECT COUNT(v_tickets.`TicketID`) FROM v_tickets WHERE v_tickets.`Subject` = '%locationname% / %computername% / Exchange Backpressure Detected'
SET:  @sqlresults@ = SQLRESULT[SELECT COUNT(v_tickets.`TicketID`) FROM v_tickets WHERE v_tickets.`Subject` = '%locationname% / %computername% / Exchange Backpressure Detected']
IF  @sqlresults@  >=  1  THEN  Jump to :UpdateTicket
:CreateTicket – Label
LOG:  Exchange Backpressure High! Creating Ticket
Create New Ticket for %clientid%\%computerid% Email:%ContactEmail% Subject:%locationname% / %computername% / Exchange Backpressure Detected
SET:  @eTicketId@ = SQLRESULT[SELECT v_tickets.`TicketID` FROM v_tickets WHERE v_tickets.`Subject` = '%locationname% / %computername% / Exchange Backpressure Detected']
Send Email To:[email protected] Subject:Exchange Backpressure – %clientname% – %computername%
:UpdateTicket – Label
SET:  @eTicketId@ = SQLRESULT[SELECT v_tickets.`TicketID` FROM v_tickets WHERE v_tickets.`Subject` = '%locationname% / %computername% / Exchange Backpressure Detected']
LOG:  Exchange Backpressure High! updating Ticket
Comment Ticket @eTicketId@ to Admin
:ProcessTicket – Label
Note: If the Ticket exists then Finish it.
SET:  @monitorticketid@ = SQLRESULT[SELECT v_tickets.`TicketID` FROM v_tickets WHERE v_tickets.`Subject` = '%locationname% / %computername% / Exchange Backpressure Detected']
IF  @monitorticketid@  =  0  THEN  Exit Script
IF [SQL SELECT COUNT(ticketid) FROM tickets WHERE ticketid=@monitorticketid@]  <  1  THEN  Exit Script
IF @monitorticketid@ Ticket Status equals Resolved  THEN  Jump to :ClearMonitorTicketID
RUN SCRIPT:  _System Automation\Functions\Load Properties – Ticketing*
Finish Ticket @monitorticketid@ to @propTicketDefaultUserID@
:ClearMonitorTicketID – Label
SET:  @monitorticketid@ = 0
SET:  [STATE @fieldname@ticketid]  =  @monitorticketid@  for computer @computerid@
:EndProcessTicket – Label
:END – Label


What’s so good about Labtech?

Labtech.  RMM tool.  (Remote Monitoring and Management).

What is so great about it?  Well, once you learn it… once you understand it… you can do anything you want with it.

Like what?  Well, you have your normal RMM things like keeping track of the computers on the network, what OS, what programs, keeping up with Microsoft patching, installing software, removing software, etc.  Then you have the monitoring – you can watch just about anything you can think of: registry entries (including installed programs), event logs, pretty much anything you can find in a readable file on the computer – and trigger alerts, or even emails and text messages, on those events.

Then you have scripting.  Some of the things I can accomplish with scripting:

Extract the backup status of a computer from the logs of the backup program and send an email if there is a failure.

Read a file version, compare to an internet site and send an alert if the two are different or off by more than two, etc.

Launch a series of powershell scripts to configure a new windows server, with variables put in when starting the script in Labtech.

With a little creativity, anything you can do via command line or powershell, can be done remotely, in the background, with Labtech.

Yes, that is very cool.

Linux Joy

My home computer has been running Ubuntu Desktop (a Debian-based Linux) for years.  It has only been recently that I have had the pleasure of working with Linux servers.  I have set up, configured and administered a few LAMP (Linux, Apache, MySQL and PHP) web servers.  For the last couple of days, I have been playing with Citrix Xen Server and HyperV to proof-of-concept virtual host NIC Bonding / Teaming and its effect on existing Virtual Machines on each platform.

I set up an environment (Xen, then HyperV) on a test server (Dell R710) and added a few Linux VMs (Got to play with Ubuntu Server 16.04 LTS and CentOS 7).  Then I did the Bonding/Teaming and watched for what happened to the existing servers and network connectivity.

Xen Server did very, very well! It had zero network disruptions as NIC Bonding was created and put into place.  Very impressive.

HyperV was less impressive.  There were multiple connectivity disruptions and panic-inducing bouncing before it settled down.

All that, to show off a screenshot:

Updating Ubuntu and CentOS