BitsTransfer Logs

I just discovered BitsTransfer and am wondering if there is a way to get a log file generated after a copy job finishes running.

Here is the code I am using:

$credential = Get-Credential -credential "domain\acctname"
$credential.Password | ConvertFrom-SecureString | Set-Content c:\temp\encrypted_password_acctname.txt

#Pass encrypted credentials to use for copying files 
$encrypted = Get-Content c:\temp\encrypted_password_acctname.txt | ConvertTo-SecureString
$credential = New-Object System.Management.Automation.PsCredential($credential.UserName, $encrypted)

#Copy File(s) from source to destination using bitstransfer 
Import-Module BitsTransfer

$sourcePath = "C:\Temp\files.*"
$destPath = "\\servername\\Test"

Start-BitsTransfer -Source $sourcePath -Destination $destPath -Credential $credential

Although I don't think others on this site are big fans of it, I always use Start-Transcript for my logging tasks. I also have a function that provides additional logging.
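
A minimal sketch of what I mean (the log path is a placeholder, and it assumes the log folder already exists):

Start-Transcript -Path 'C:\Logs\BitsCopy.log' -Append
try {
    Import-Module BitsTransfer
    # -Verbose output lands in the transcript along with any errors
    Start-BitsTransfer -Source 'C:\Temp\file1.txt' -Destination '\\servername\Test' -Verbose
}
finally {
    # Always close the transcript, even if the transfer throws
    Stop-Transcript
}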

My $.02


J,
Welcome back to the forum. :wave:t3: … long time no see. :wink:

I’m afraid that topic is way more complex than you might think it is. What problem would you like to solve with it? Assuming you’re about to transfer files in a domain network, I’d recommend using robocopy instead of BITS.

Adding credentials to a script is a bad idea most of the time, and in Active Directory environments it is mostly unnecessary.
And it will only work for the user who encrypted the password and on the computer where it was encrypted, because ConvertFrom-SecureString without a key uses the Windows Data Protection API (DPAPI). You cannot move the file with the encrypted password to another computer, nor even use it with another account on the same computer. :man_shrugging:t3:

That’s unnecessary. Properly installed modules will be loaded automatically by PowerShell if needed.

That will not work. The number of source files has to match the number of destination files. Here is the help for Start-BitsTransfer -Source for your reference.
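
To illustrate, a minimal sketch that satisfies that requirement by passing matching arrays (the file names are placeholders; $credential is the one from your script):

$sources      = 'C:\Temp\files.001', 'C:\Temp\files.002'
$destinations = '\\servername\Test\files.001', '\\servername\Test\files.002'
# One destination path per source path
Start-BitsTransfer -Source $sources -Destination $destinations -Credential $credential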

You know you have to use Complete-BitsTransfer to actually complete the transfer once the job has finished transferring all bits and bytes, don’t you? :smirk: (That applies to jobs started with -Asynchronous; a synchronous Start-BitsTransfer blocks and completes on its own.) So you have to monitor the transfer and wait until all files are transferred. Then you need to complete it with the mentioned cmdlet. Otherwise you will only see temporary files in the destination.
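
Roughly like this for a job started with -Asynchronous (the paths and polling interval are placeholders):

Import-Module BitsTransfer
$job = Start-BitsTransfer -Source 'C:\Temp\file1.txt' -Destination '\\servername\Test\file1.txt' -Asynchronous

# Poll until BITS leaves the active states
while ($job.JobState -in 'Queued', 'Connecting', 'Transferring') {
    Start-Sleep -Seconds 5
}

if ($job.JobState -eq 'Transferred') {
    # Renames the temporary files in the destination to their final names
    Complete-BitsTransfer -BitsJob $job
}
else {
    # Error or TransientError: inspect, then clean up the job
    $job | Format-List *
    Remove-BitsTransfer -BitsJob $job
}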

I highly recommend reconsidering if you really need to use BITS. In most closed networks like an Active Directory domain environment it is not necessary. As I said earlier - robocopy does a fantastic job when it comes to transferring files. And it creates really nice log files if you need them. :wink:
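
A minimal sketch of the robocopy equivalent with a log file (paths and switches are just an example):

# /Z = restartable mode, /R and /W = retry count and wait time, /LOG+ appends to the log file
robocopy C:\Temp \\servername\Test files.* /Z /R:3 /W:10 /LOG+:C:\Logs\FileCopy.log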


Yes, I am having some other issues with BitsTransfer. Basically, what I am trying to do is copy files from one server to another multiple times a day. The catch is that it needs to run using a service account.

If I could do that, then I would set up a basic Task Scheduler job to run the script daily. Unfortunately, the files that need to be copied arrive on the source server throughout the day, so the Task Scheduler job must run every 30 minutes or so.

So far, I am having problems due to the service account part. I love Robocopy, but my understanding is it doesn't let you pass credentials. I tried using New-PSSession, but that fails because the service account doesn't have the necessary access.

I could use net use or New-PSDrive, but my understanding is that maps a drive. Not a big deal usually, but if the Task Scheduler job runs that frequently, I am not sure how ideal it is to constantly be mounting and dismounting drives.
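
Something like this is what I have in mind (the account and share names are made up):

$cred = Get-Credential 'domain\svcacct'
# Map, copy, then tear the mapping down again on every run
New-PSDrive -Name Dest -PSProvider FileSystem -Root '\\servername\Test' -Credential $cred | Out-Null
Copy-Item -Path 'C:\Temp\files.*' -Destination 'Dest:\'
Remove-PSDrive -Name Dest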

What do you think?

You simply run the Task Scheduler task with the service account. :man_shrugging:t3: If that service account has the appropriate permissions on the target system, you don’t have to tinker around with any credential stuff in your script at all. :wink:
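
If you want to register the task from PowerShell, a sketch could look like this (the task name, account, and paths are placeholders):

$action  = New-ScheduledTaskAction -Execute 'robocopy.exe' -Argument 'C:\Temp \\servername\Test files.* /LOG+:C:\Logs\FileCopy.log'
# Repeat every 30 minutes; older OS versions may also require -RepetitionDuration
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 30)
# The task runs under the service account whether or not anyone is logged on
Register-ScheduledTask -TaskName 'FileCopy' -Action $action -Trigger $trigger -User 'domain\svcacct' -Password (Read-Host 'Password for domain\svcacct')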

Olaf’s approach to this is basically a one-liner. You don't even need PowerShell.

I could use net use or New-PSDrive

Again, using Olaf’s Robocopy suggestion, you simply use UNC paths for the source and destination; no need to map any drives. As long as the service account has the proper permissions, it is again a one-liner in the Task Scheduler. Robo also solves your logging; it has very robust logging. :slight_smile: I would recommend the /NP (no progress) switch if you are using logging, as the percentage output gets a bit ugly in the log with large files.

I guess this is my $.04


Hmmm, interesting … what happens if the password expires? Is it just a matter of editing the task with the new password?

That is what we do here. Not sure if there is some elegant way around that. Another REALLY nice feature of Robo is that it only copies new/changed files by default, so your initial copy may take some time, but subsequent copies should go very fast.

The password for a service account??? … really?? … something like this never expires. :man_shrugging:t3:

haha I wish that were true where I am. Those were the days!

That’s why it is a service account. It has a long, complex password. No one has to type it frequently because you set it once and it runs indefinitely. And it never expires. What a hassle it would be if all service accounts had expiring passwords … that would be an administrative nightmare … really. :man_shrugging:t3:

And BTW: expiring passwords are not recommended anymore even for people!!!

https://www.semperis.com/blog/nist-joins-microsoft-in-changing-how-we-should-think-about-passwords/#:~:text=Stop%20expiring%20passwords,unless%20there’s%20evidence%20of%20compromise.

Although I wholeheartedly agree with you Olaf, sadly, we as well are required to change our service account passwords annually, or whenever someone who knows the password leaves the company or takes on a new role. Our only saving grace is a well-documented procedure to follow.

And why don’t you use Managed Service Accounts or even Group Managed Service Accounts? :man_shrugging:t3:


This is what I was going to add. Managed service accounts (tied to a single server) or group managed service accounts (usable across multiple servers) are the cream of the crop in on-premises AD. The passwords are rotated automatically by AD, and you never have to type them into a service, task, etc. Just know that many attackers love people who store passwords in files, scheduled tasks, services, etc., because those make the passwords so simple to retrieve. Using a managed account eliminates these attack surfaces.
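
A minimal sketch of setting one up, assuming the ActiveDirectory module, an existing KDS root key, and made-up names:

# Create the gMSA and allow the file-copy servers to retrieve its password
New-ADServiceAccount -Name 'svcFileCopy' -DNSHostName 'svcFileCopy.corp.example' -PrincipalsAllowedToRetrieveManagedPassword 'FileCopyServers'

# On the server that runs the task:
Install-ADServiceAccount -Identity 'svcFileCopy'
Test-ADServiceAccount -Identity 'svcFileCopy'   # should return True

# A scheduled task can then run as the gMSA without any stored password
$principal = New-ScheduledTaskPrincipal -UserId 'CORP\svcFileCopy$' -LogonType Password
Set-ScheduledTask -TaskName 'FileCopy' -Principal $principal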


I didn’t even mention that managed service accounts have been around since Server 2008 R2, and group managed service accounts came with Server 2012.


Because we don't make the policies. They are mandated by our customer, and that's that. Can't say much more than that. I will say I concur with both of you. Thanks Olaf and Doug.

Hmmm … but if you are the IT service provider for your customers, I’d expect you to advise them accordingly. :man_shrugging:t3: Using these best practices makes administration easier (cheaper) for you or the customer and more reliable and secure at the same time. I cannot imagine a single valid argument against those benefits. :man_shrugging:t3: :man_shrugging:t3: