Help with file output

Hi

I am new to PowerShell and I am hoping someone can help me with this script.

To be honest, I had help writing it since I have never interfaced with SQL Server before. The script works and gives me this output:

"product_id","mpn"
"780","18734"
"793","18729"
"830","BCDC5U"
"870","215"
"894","633"
"900","027"
"5187","LRP-1"
"5221","A82VDY"
"5230","MZW600"
"5247","27115"
"5250","LB-01"

The problem is that the first field in the output CSV is the key for the file, and when I try to import it I get an error. I am trying to find a way to make the first field numeric and the second field text. FYI, I have tested the import by removing the quotes from the first field, and the import then works with no problems.

I could not figure out a way to change my script (below) so that the first field is saved as numeric (without the quotes).

Any help or pointers would be welcome.

Thanks to everyone!!

Gary

Does this mean you get an error in Excel? Can you elaborate a bit more?

Even if I Export-Csv on an object with an explicit [Int] property, it still ends up being a quoted number in the raw .csv.
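
For example, a quick check (the temp file path here is just for illustration):

PS> [pscustomobject]@{ product_id = [int]780 } | Export-Csv C:\temp\test.csv -NoTypeInformation
PS> Get-Content C:\temp\test.csv
"product_id"
"780"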

Thank you for your response.

Sorry, I did not make that clear... I wish it were Excel :slight_smile:

I take the output of the script and import it into another database, and it is that import process that is giving me a problem.

thanks

Gary

OK, is that also a PowerShell thing, or not?
I think Export-Csv is going to keep messing this up for you. From what I’m seeing, quoting the property values is just baked in.

E.g.

$array = @(
    [pscustomobject]@{
        ProductID = 1234
        name      = "stuff"
    },
    [pscustomobject]@{
        ProductID = 321
        name      = "things"
    }
)

Now I have a quick example array. Piping it directly to ConvertTo-Csv reveals the quotes:

PS> $array | ConvertTo-Csv -NoTypeInformation
"ProductID","name"
"1234","stuff"
"321","things"

Playing around though, I looked at ConvertTo-Json and it respects the original integer type:

PS> $array | ConvertTo-Json
[
    {
        "ProductID":  1234,
        "name":  "stuff"
    },
    {
        "ProductID":  321,
        "name":  "things"
    }
]

Writing that out to a file and then re-ingesting it in PowerShell:

PS> $array | ConvertTo-Json | Out-File C:\temp\data.txt
PS> $import = Get-Content C:\temp\data.txt | ConvertFrom-Json
PS> $import

ProductID name
--------- ----
     1234 stuff
      321 things
# checking the type of one of the ProductID properties
PS> $import[0].ProductID.GetType()

IsPublic IsSerial Name                                     BaseType
-------- -------- ----                                     --------
True     True     Int32                                    System.ValueType

So, if you are using PowerShell for the import process, I would maybe consider using JSON instead of CSV to preserve the property value types.


Does your import process require a CSV? Could you take the data in PowerShell and use it directly to insert into the database? You should be able to construct a SQL INSERT query using the data from $table.
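
Something along these lines, for example. This is just a sketch; it assumes the destination is a SQL Server you can reach from PowerShell and that $table has product_id and mpn properties, and the connection string and table name below are only placeholders:

$conn = New-Object System.Data.SqlClient.SqlConnection 'Server=DestServer;Database=DestDb;Integrated Security=True'
$conn.Open()
foreach ($row in $table) {
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = 'INSERT INTO dbo.Products (product_id, mpn) VALUES (@id, @mpn)'
    # parameters carry the types directly, so there is no CSV quoting to worry about
    [void]$cmd.Parameters.AddWithValue('@id',  [int]$row.product_id)
    [void]$cmd.Parameters.AddWithValue('@mpn', [string]$row.mpn)
    [void]$cmd.ExecuteNonQuery()
}
$conn.Close()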

Thank you for thinking outside the box…

I am going from Microsoft SQL Server to BigCommerce ecommerce (I do not know what they use).

Since this seems hard, I have contacted BigCommerce support and they have done something to allow me to use the key as text in a CSV.

I have done some small tests and it seems to be working now. I will know on Sunday when I do the entire load.

thanks again!

Gary

