How to pass arguments to a program when using a variable as the path
I'm trying to start a program in a Start-Job script block using a variable for the path. Here is the line:
$using:plinkdir\plink.exe -telnet $using:ip -P $using:port | TimeStamp >> "$using:LogDir\WeightLog_$(get-date -f MM-dd-yyyy).txt"
All three variables work, and the whole line works when I use c:\plink in place of the $plink variable. It errors out on -telnet, so the arguments are not reaching plink.
Here are the variables and the job:
$LogDir = "c:\users\user"   # Log file output directory
$PlinkDir = "C:"            # plink.exe directory
$SerialIP = "1.1.1.1"       # serial device IP address
$SerialPort = 10000         # port to log

function CaptureWeight {
    Start-Job -Name WeightLog -ScriptBlock {
        # Bring variables into the job from the caller's scope
        #$LogDir = $using:LogDir
        #$PlinkDir = $using:PlinkDir
        #$SerialIP = $using:SerialIP
        #$SerialPort = $using:SerialPort

        # Set timestamp format
        filter timestamp {"$(Get-Date -Format MM/dd/yyyy_HH:mm:ss) $_"}

        # Start plink, pipe output to TimeStamp, redirect to log file
        $using:PlinkDir\plink.exe -telnet $using:SerialIP -P $using:SerialPort | TimeStamp >> "$using:LogDir\WeightLog_$(get-date -f MM-dd-yyyy).txt"
    }
}
Thanks!
Solution 1:[1]
This answer is based on some assumptions and a hunch about what might work, given what is explained in your question and in the comments.
First of all, to explain "It didn't lose any data when PowerShell was killed unexpectedly.": this is because >> (the append redirection operator, equivalent to Out-File -Append) is:
- Opening the file stream
- Appending the output to the file
- Closing the stream
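A minimal sketch of that behavior (the demo file path below is just for illustration): each >> call opens the file, appends, and closes it again, so anything written before the process dies is already on disk.

$demo = "$env:TEMP\append-demo.txt"           # hypothetical demo file
"line 1" >> $demo                             # open, append, close
"line 2" | Out-File -FilePath $demo -Append   # equivalent cmdlet form
Get-Content $demo                             # both lines are already on disk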
So, when you kill the job, whatever has already been written is still there. I did recommend that you use Set-Content, but that was before I understood what you were doing; in this case it wouldn't be an option.
The alternative proposed here is to use a StreamWriter, which is nice because we can keep the file stream open and append to the file as needed, without having to close the stream each time (this also takes care of the "blank line between every line in the output to the log file"). To get around killing the job while still saving the results to the file, we can use a try / finally statement.
$LogDir = "c:\users\user"   # Log file output directory
$PlinkDir = "C:"            # plink.exe directory
$SerialIP = "1.1.1.1"       # serial device IP address
$SerialPort = 10000         # port to log

function CaptureWeight {
    Start-Job -Name WeightLog -ScriptBlock {
        # Timestamp each line and write it through the shared StreamWriter
        filter timestamp {
            $sw.WriteLine("$(Get-Date -Format MM/dd/yyyy_HH:mm:ss) $_")
        }

        try {
            # One stream stays open for the life of the job
            $sw = [System.IO.StreamWriter]::new("$using:LogDir\WeightLog_$(Get-Date -f MM-dd-yyyy).txt")
            & "$using:PlinkDir\plink.exe" -telnet $using:SerialIP -P $using:SerialPort | TimeStamp
        }
        finally {
            # Runs even when the job is stopped; flush and release the file
            $sw.ForEach('Flush')
            $sw.ForEach('Dispose')
        }
    }
}

$job = CaptureWeight                                           # For testing, save the job
Start-Sleep -Seconds 60                                        # wait 1 minute
$job | Stop-Job                                                # kill the job
Get-Content "$LogDir\WeightLog_$(Get-Date -f MM-dd-yyyy).txt"  # Did it work?
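Two notes on the code above. First, this is also the answer to the title question: because the executable path is built from a variable, the line is not parsed as a plain command invocation, so the path string has to be invoked with the call operator &, with the arguments following as usual. A minimal stand-alone sketch, reusing the variables defined above:

& "$PlinkDir\plink.exe" -telnet $SerialIP -P $SerialPort   # call operator runs the string as a command

Second, $sw.ForEach('Flush') and $sw.ForEach('Dispose') are used instead of $sw.Flush() / $sw.Dispose() because the intrinsic .ForEach() method is safe to call even if $sw was never assigned (for example, if the StreamWriter constructor threw), so the finally block can't fail on its own.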
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
Solution | Source
---|---
Solution 1 |