ConvertTo-CSV not creating valid CSV file

by Optimaximal at 2013-03-18 05:49:39

I have the following code that is executed as part of a SQL DTS task to create a CSV of some data.

#Globals
$JDWSupplierno = "90671"

#Paths
$GlobalPath = "X:\JDWilliams\Stock"
$SCPPath = "C:\Program Files\WinSCP\WinSCP.com"
$ArchivePath = "Archive"

#Files
$InputFile = "stock_export.csv"
$OutputFile = $JDWSupplierno + "STOCK$(get-date -f yyyyMMddhhmmss).csv"

$ControlUUID = "0,$JDWSupplierno"
$Ref = "99"

# Step 1 - Modify SQL exported file to add data.

#Import CSV
$CSV = Import-Csv $($GlobalPath + $InputFile)

#Prepare the output file and export first line
$ControlUUID > $($GlobalPath + $OutputFile)

#Add the CSV to the new file
$CSV | ConvertTo-Csv -NoTypeInformation >> $($GlobalPath + $OutputFile)

#Now add the footer line
"$Ref,$($CSV.Count + 3)" >> $($GlobalPath + $OutputFile)

#End


This creates the file, and it seems structurally valid, but when opened in Excel (for example) it launches the Text Import Wizard and all the data ends up in Column A, even though the structure is ‘correct’.
Opening the file in CSVed either gives a ‘file corrupt’ error or shows random characters in Column 1, although if you view the CSV in the built-in text editor, it looks fine.

The CSV generated by SQL (product_stock.csv) is fine before it’s altered.

Help? :)
by AlexBrassington at 2013-03-18 06:32:26
It looks like you’re double converting the content. $CSV is imported as a CSV file on line 19 and then you convert this in-memory CSV into a CSV again to output it on line 25. Try using Export-CSV instead.

Failing that, how exactly is the file invalid? Try processing a subset of the data and seeing how it’s corrupted.
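A minimal sketch of that suggestion, reusing the path variables from the original script (the path concatenation and file names are taken as-is from the post above):

```powershell
# Sketch: re-emit the imported CSV with Export-Csv instead of
# round-tripping it through ConvertTo-Csv and file redirection.
$CSV = Import-Csv $($GlobalPath + $InputFile)
$CSV | Export-Csv $($GlobalPath + $OutputFile) -NoTypeInformation
```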
by Optimaximal at 2013-03-18 07:00:32
[quote="AlexBrassington"]It looks like you’re double converting the content. $CSV is imported as a CSV file on line 19 and then you convert this in-memory CSV into a CSV again to output it on line 25. At that point you haven’t modified anything in the file so you should just be able to output the CSV straight away without the Convert-CSV command.

Failing that, how exactly is the file invalid? Try processing a subset of the data and seeing how it’s corrupted.[/quote]

It seems to be ‘double converted’ because it’s being read from one file into a new one (hence $InputFile and $OutputFile). If there’s a more efficient way of writing it and I’m just missing something blindingly obvious, do share :)

As for ‘how’ it is invalid, I’m not sure. I get no specific errors from Excel, CSVed or PowerShell (when converting the file). It all just seems to be written into column A whenever it’s opened by any program. The commas are there, as are the line breaks.
by mjolinor at 2013-03-18 07:20:04
What are the column headings on your original CSV file?
by Optimaximal at 2013-03-18 07:26:24
Line Type, Product Reference & InStockIndicator.

The key issue is these are written into line 2 on the modified file. The $ControlUUID is written as line 1 before the values are dumped from $CSV into the file. Finally, the values on line #28 are written into the final row, including a total count of all rows.

Unfortunately, this is all very rigid because it’s being used to send data to a third party and I’ve got no room to manoeuvre.
by mjolinor at 2013-03-18 07:30:12
It appears you have a CSV file with 3 columns, and you’re trying to bracket that with lines that only have 2 values. That’s going to screw up a CSV. All the lines have to have the same number of values.
by Optimaximal at 2013-03-18 07:37:09
I wasn’t aware that made a difference, but that was a small bit of housekeeping I did to tidy up the code ahead of posting here. Originally there were dummy commas and null values in both the first and last rows. It makes no difference once I put them back in. :-/
by Optimaximal at 2013-03-18 08:03:49
I’ve also added some code to look for the double quotes generated by the ConvertTo-CSV cmdlet and strip them out, to see if that was the cause. Nada.

I’ve also commented out the code that adds the initial and final lines, to no avail.

$CSV | ConvertTo-Csv -NoTypeInformation >> $($GlobalPath + $OutputFile) breaks the CSV.
by Optimaximal at 2013-03-18 08:11:24
Ok, I missed the comment AlexBrassington made earlier about using Export-CSV. I did experiment with it, but I can’t remember why I went with ConvertTo-CSV. I’ll look back over my notes.

Edit - ahh, yes. Export-CSV creates the CSV header row at the top of the file based on the properties of the objects being exported.
by mjolinor at 2013-03-18 08:14:38
Have you tried just creating a pair of custom objects from the data you want to add, and creating a new object array with those objects and the imported CSV data, and then exporting the new array?
by Optimaximal at 2013-03-18 08:27:28
How is that done in PS? I’m new to the language/syntax. Is it a case of…

$CSV = @()
$CSV = $Header
$CSV += Import-Csv $($GlobalPath + $InputFile)
$CSV += $Footer

$CSV| Export-Csv $($GlobalPath + $OutputFile) -NoTypeInformation


???
by mjolinor at 2013-03-18 08:43:30
Something like this:

$first = New-Object psobject -Property @{
    'Line Type'         = &lt;some value&gt;
    'Product Reference' = &lt;some value&gt;
    InStockIndicator    = &lt;some value&gt;
}

$last = New-Object psobject -Property @{
    'Line Type'         = &lt;some value&gt;
    'Product Reference' = &lt;some value&gt;
    InStockIndicator    = &lt;some value&gt;
}

$CSV = Import-Csv $($GlobalPath + $InputFile)

$NewArray = @($first + $CSV + $last)

$NewArray |
    Select-Object 'Line Type','Product Reference',InStockIndicator |
    Export-Csv $($GlobalPath + $OutputFile) -NoTypeInformation


You’ll need to fill in the values you’re wanting to add in the field you want them in.
by Optimaximal at 2013-03-18 08:49:57
But my guess is that will create something like…

'Line Type','Product Reference','In Stock Indicator'
0,90671,
1,A56110,N
1,A56111,Y
1,A56112,N
1,A56113,N
1,A5616,Y
1,A5617,N
1,A5618,N
1,A5619,Y
1,A56210,N
99,12,


I need the first and second rows swapped.

edit - Also,
$NewArray = @($first + $CSV + $last)
fails with the error [System.Management.Automation.PSObject] doesn’t contain a method named ‘op_Addition’
by mjolinor at 2013-03-18 08:59:49
Like this?

'Line Type','Product Reference','In Stock Indicator'
1,A56110,N
0,90671,
1,A56111,Y
1,A56112,N
1,A56113,N
1,A5616,Y
1,A5617,N
1,A5618,N
1,A5619,Y
1,A56210,N
99,12,
by Optimaximal at 2013-03-18 09:02:20
No…


"0","90671",""
"Line Type","Product Reference","InStockIndicator"
"1","A56110","N"
"1","A56111","Y"
"1","A56112","N"
"1","A56113","N"
"1","A5616","Y"
"1","A5617","N"
"1","A5618","N"
"1","A5619","Y"
"1","A56210","N"
"99","12",""


…and we’re back to step one! :)

Maybe I need to do everything but the first line then open the file, position the cursor at 0,0 and insert the value then?

I’m this far from tearing this all up and using a stored procedure in SQL :-/
by mjolinor at 2013-03-18 09:10:47
If you do that, you’re going to screw up the column headings of the CSV file. The line with the column headings MUST be first. If this is what you really want to do, then you need to treat all of this as pure text data and not CSV.

$first = '"0","90671",""'
$last = '"99","12",""'

$newfile = @()

$newfile += $first
$newfile += get-content $($GlobalPath + $InputFile)
$newfile += $last
$newfile | set-content $($GlobalPath + $OutputFile)
by Optimaximal at 2013-03-18 09:17:03
Is it possible to count the ‘content’ of the imported file? The ‘12’ in the last column is/was a count of the number of rows of the CSV, with 3 added to it because there are 3 rows extra.

edit - Nope, doesn’t work (returns null/0). Creates a valid CSV though!

Looks like I’m going to be importing the CSV ‘as’ a CSV for the count.
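For what it’s worth, both counts are obtainable; a sketch, with variable names following the script above:

```powershell
# Data-row count (header excluded). @() guards against a one-row file,
# where Import-Csv returns a single object rather than an array.
$dataRows = @(Import-Csv $($GlobalPath + $InputFile)).Count

# Raw line count (header included).
$rawLines = @(Get-Content $($GlobalPath + $InputFile)).Count
```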
by mjolinor at 2013-03-18 09:19:11
This will fix the op-addition error, and produce a proper csv file:
$first = @(New-Object psobject -Property @{
    'Line Type'         = 0
    'Product Reference' = 90671
    InStockIndicator    = $null
})

$last = @(New-Object psobject -Property @{
    'Line Type'         = 99
    'Product Reference' = 12
    InStockIndicator    = $null
})

$CSV = Import-Csv $($GlobalPath + $InputFile)

$NewArray = @($first + $CSV + $last)

$NewArray |
    Select-Object 'Line Type','Product Reference',InStockIndicator |
    Export-Csv $($GlobalPath + $OutputFile) -NoTypeInformation


But it’s not going to be in the order you’ve specified.
by mjolinor at 2013-03-18 09:22:49
[quote]
Is it possible to count the ‘content’ of the imported file? The ‘12’ in the last column is/was a count of the number of rows of the CSV, with 3 added to it because there are 3 rows extra.[/quote]

It’s possible, but I’m starting to suspect this would all be a lot easier if you’d explain what the objective is. What is the purpose of this file after it’s been created?
by Optimaximal at 2013-03-18 09:23:51
It’s not graceful, but…
$CSV = Import-Csv $($GlobalPath + $InputFile)

$first = '"0","'+$JDWSupplierno+'",""'
$last = '"99","'+$($CSV.Count + 3)+'",""'

$newfile = @()

$newfile += $first
$grabCSV = get-content $($GlobalPath + $InputFile)
$newfile += $grabCSV
$newfile += $last
$newfile | set-content $($GlobalPath + $OutputFile)


…works!
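As an aside, the same logic can be sketched a little more compactly with Join-Path, which also sidesteps the missing backslash when $GlobalPath doesn’t end in one (a sketch only, untested against the real feed):

```powershell
$in  = Join-Path $GlobalPath $InputFile
$out = Join-Path $GlobalPath $OutputFile

# Footer count: data rows plus the 3 extra lines, as in the thread.
$count = @(Import-Csv $in).Count + 3

# Header row, original file content, footer row, written as plain text.
@('"0","' + $JDWSupplierno + '",""') +
    @(Get-Content $in) +
    @('"99","' + $count + '",""') |
    Set-Content $out
```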

Thanks for the help guys.
by Optimaximal at 2013-03-18 09:26:06
[quote="mjolinor"]It’s possible, but I’m starting to suspect this would all be a lot easier if you’d explain what the objective is. What is the purpose of this file after it’s been created?[/quote]The objective was to get some data, in the format specified, out of a database and sent off to a third party for interpreting. Their template cannot be altered as it’s standard across a number of companies they work with.

Frankly, if they knew the CSV files were technically invalid, they’d probably change the template, but it’s not my place to say.

Is there any video on how to fill a template like this using PowerShell?