Approach ideas?

Looking for ideas on an approach for this.

-Directory contains ~370 XML files, each in its own subdirectory.

-Need to run a script against all of the files and append data to a single CSV after parsing elements from each file.

My thought is to use an identifier from each XML file to start a new line (they each have unique ID numbers), beginning with the first item, then append the rest of the data from that report to the same line. Then, using a ForEach loop, parse through the rest of the $items in $directory. Each value from the XML gets appended to a specific $column depending on the XML element.
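Roughly what I have in mind, as a minimal sketch (the Report/ReportID/Status/Timestamp element names and the C:\Reports path are placeholders, not the real schema):

$directory = 'C:\Reports'
$items = Get-ChildItem -Path $directory -Filter '*.xml' -Recurse -File

$rows = $items | ForEach-Object {
    # Cast the file content to [xml] so elements can be read as properties.
    [xml]$doc = Get-Content -Path $_.FullName -Raw

    # One object per file; each property becomes a CSV column.
    [PSCustomObject]@{
        ReportID  = $doc.Report.ReportID    # the unique ID per report
        Status    = $doc.Report.Status
        Timestamp = $doc.Report.Timestamp
    }
}

# Export-Csv writes the header once, then one row per object.
$rows | Export-Csv -Path (Join-Path $directory 'combined.csv') -NoTypeInformation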

Does this sound like the best approach? Can ForEach do this?
Opinions welcomed! Thanks in advance.

Sure, it’s possible. The best approach is a rather opinionated thing. All that matters is that you have something that works for your use case and meets the KPIs you need.

Any time you are working with a lot of files (or large files), you need to accept that it will take as long as it takes. There are performance tweaks you can make to anything, but a simple scan that extracts a small piece of data should not be that big of a hit, depending on how large those XML files are and how much has to be scanned to find the bits you are after.

Grabbing the content is just a matter of Select-String or RegEx matches, then appending the results to a CSV.
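For example, a minimal sketch of the Select-String route (the <ReportID> element name and the paths are placeholders; adjust the pattern to whatever element you are actually after):

Get-ChildItem -Path 'C:\Reports' -Filter '*.xml' -Recurse -File |
    ForEach-Object {
        # Take the first line in the file that matches the pattern, if any.
        $hit = Select-String -Path $_.FullName -Pattern '<ReportID>(.+?)</ReportID>' |
               Select-Object -First 1
        if ($hit) {
            [PSCustomObject]@{
                File     = $_.FullName
                ReportID = $hit.Matches[0].Groups[1].Value  # captured element text
            }
        }
    } |
    Export-Csv -Path 'C:\Reports\combined.csv' -NoTypeInformation

Keep in mind that regex against XML is fragile (attributes, line wraps, and namespaces will all break it), so for anything beyond a quick grab, casting the file to [xml] and walking the elements is the safer route.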