# CSV File Imports in .NET

I know this is a rookie question, but I'm looking for a simple solution - it feels like there should be one.

What's the best way to import a CSV file into a strongly-typed data structure? Again, simple = better.

0
2019-05-03 18:26:44
Source Share

If you are anticipating fairly complex scenarios for CSV parsing, don't even think of rolling your own parser. There are a lot of excellent tools out there, like FileHelpers, or even the ones from CodeProject.

The point is that this is a fairly common problem, and you can bet that a lot of software developers have already thought about and solved it.

0
2019-12-03 04:04:12
Source

I agree with @NotMyself. FileHelpers is well tested and handles all kinds of edge cases that you'll eventually have to deal with if you do it yourself. Take a look at what FileHelpers does, and only write your own if you are absolutely sure that either (1) you will never need to handle the edge cases FileHelpers does, or (2) you love writing this kind of thing and are going to be thrilled when you have to parse stuff like this:

1, "Bill", "Smith", "Supervisor", "No Comment"

2, 'Drake,', 'O'Malley', "Janitor,

Oops, I'm not quoted and I'm on a new line!
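For contrast, the happy path with FileHelpers is just an attributed record class and an engine. A minimal sketch, assuming the FileHelpers NuGet package; the `Employee` class and its field names are invented for illustration, not from the post:

```csharp
using FileHelpers; // NuGet: FileHelpers

// Each public field maps to one CSV column, in declaration order.
[DelimitedRecord(",")]
public class Employee
{
    public int Id;
    public string FirstName;
    public string LastName;
    public string Title;
    public string Comment;
}

public static class Importer
{
    public static Employee[] Load(string path)
    {
        var engine = new FileHelperEngine<Employee>();
        return engine.ReadFile(path); // strongly-typed records, quoting handled
    }
}
```

The library handles quoted fields, embedded delimiters, and type conversion errors, which is exactly the edge-case work you would otherwise write yourself.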

0
2019-12-03 04:03:55
Source

I had to use a CSV parser in .NET for a project this summer and settled on the Microsoft Jet Text Driver. You specify a folder using a connection string, then query a file using a SQL SELECT statement. You can specify strong types using a schema.ini file. I didn't do this at first, but then I was getting bad results where the type of the data wasn't immediately obvious, such as IP numbers or an entry like "XYQ 3.9 SP1".

One limitation I ran into is that it can't handle column names over 64 characters; it truncates them. This shouldn't be a problem, except that I was dealing with very poorly designed input data. It returns an ADO.NET DataSet.

This was the best solution I found. I would be wary of rolling my own CSV parser, since I would probably miss some of the edge cases, and I didn't find any other free CSV parsing packages for .NET out there.

EDIT: Also, there can only be one schema.ini file per directory, so I dynamically appended to it to strongly type the needed columns. It will only strongly type the columns specified, and infer for any unspecified field. I really appreciated this, as I was importing a fluid 70+ column CSV and didn't want to specify every column, only the misbehaving ones.
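To make the schema.ini approach concrete, here is a hedged sketch; the file and column names are invented for illustration, and the Jet driver itself is Windows-only. Note that the connection string points at the folder, while the SELECT names the file:

```csharp
// schema.ini, placed in the same directory as data.csv:
//
//   [data.csv]
//   ColNameHeader=True
//   Format=CSVDelimited
//   Col1=IpAddress Text
//   Col2=Version Text
//
using System.Data;
using System.Data.OleDb;

public static class JetCsvReader
{
    public static DataSet Read(string folder, string fileName)
    {
        string connect =
            @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + folder +
            @";Extended Properties=""text;HDR=Yes;FMT=Delimited""";

        using (var adapter = new OleDbDataAdapter(
            "SELECT * FROM [" + fileName + "]", connect))
        {
            var ds = new DataSet();
            adapter.Fill(ds, "csv"); // one DataTable per query
            return ds;
        }
    }
}
```

Columns not listed in schema.ini are type-inferred by the driver, which matches the "only specify the misbehaving ones" workflow described above.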

0
2019-12-03 03:33:41
Source

Brian gives a nice solution for converting it to a strongly-typed collection.

Most of the CSV parsing methods out there don't take into account escaping fields or some of the other subtleties of CSV files (like trimming fields). Below is the code I personally use. It's a bit rough around the edges and has pretty much no error reporting.

public static IList&lt;IList&lt;string&gt;&gt; Parse(string content)
{
    IList&lt;IList&lt;string&gt;&gt; records = new List&lt;IList&lt;string&gt;&gt;();

    bool inQuotedString = false;
    IList&lt;string&gt; record = new List&lt;string&gt;();
    StringBuilder fieldBuilder = new StringBuilder();
    for (int i = 0; i &lt; content.Length; i++)
    {
        char c = content[i];

        if (c == '\r' || c == '\n')
        {
            // If it's a \r\n combo consume the \n part and throw it away.
            if (c == '\r' &amp;&amp; i + 1 &lt; content.Length &amp;&amp; content[i + 1] == '\n')
            {
                i++;
            }

            if (inQuotedString)
            {
                // Line breaks inside a quoted field belong to the field.
                if (c == '\r')
                {
                    fieldBuilder.Append('\r');
                }
                fieldBuilder.Append('\n');
            }
            else
            {
                // End of record.
                record.Add(fieldBuilder.ToString());
                fieldBuilder = new StringBuilder();

                records.Add(record);
                record = new List&lt;string&gt;();

                inQuotedString = false;
            }
        }
        else if (fieldBuilder.Length == 0 &amp;&amp; !inQuotedString)
        {
            if (char.IsWhiteSpace(c))
            {
                // Trim leading whitespace before a field.
            }
            else if (c == '"')
            {
                inQuotedString = true;
            }
            else if (c == ',')
            {
                record.Add(fieldBuilder.ToString());
                fieldBuilder = new StringBuilder();
            }
            else
            {
                fieldBuilder.Append(c);
            }
        }
        else if (c == ',')
        {
            if (inQuotedString)
            {
                fieldBuilder.Append(',');
            }
            else
            {
                record.Add(fieldBuilder.ToString());
                fieldBuilder = new StringBuilder();
            }
        }
        else if (c == '"')
        {
            if (inQuotedString)
            {
                if (i + 1 &lt; content.Length &amp;&amp; content[i + 1] == '"')
                {
                    // An escaped quote ("") inside a quoted field.
                    fieldBuilder.Append('"');
                    i++;
                }
                else
                {
                    inQuotedString = false;
                }
            }
            else
            {
                fieldBuilder.Append(c);
            }
        }
        else
        {
            fieldBuilder.Append(c);
        }
    }

    // Flush the final field/record if the input didn't end with a newline.
    if (fieldBuilder.Length &gt; 0 || record.Count &gt; 0)
    {
        record.Add(fieldBuilder.ToString());
        records.Add(record);
    }

    return records;
}


Note that this doesn't handle the edge case of fields not being delimited by double quotes, but merely having a quoted string within them. See this post for a bit of a better explanation, as well as some links to some proper libraries.
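One proper-library option that ships with the framework is `Microsoft.VisualBasic.FileIO.TextFieldParser`, usable from C# by referencing the Microsoft.VisualBasic assembly. A small sketch of the same quoted-field case (the sample data is invented):

```csharp
using System;
using System.IO;
using Microsoft.VisualBasic.FileIO; // reference: Microsoft.VisualBasic

class Demo
{
    static void Main()
    {
        string csv = "1,\"Bill\",\"Smith, Jr.\"\r\n2,\"Jane\",\"Doe\"";
        using (var parser = new TextFieldParser(new StringReader(csv)))
        {
            parser.TextFieldType = FieldType.Delimited;
            parser.SetDelimiters(",");
            parser.HasFieldsEnclosedInQuotes = true;
            while (!parser.EndOfData)
            {
                // Quoted commas stay inside their field.
                string[] fields = parser.ReadFields();
                Console.WriteLine(string.Join(" | ", fields));
            }
        }
    }
}
```

This avoids maintaining hand-rolled state-machine code while still handling quoting and embedded delimiters.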

0
2019-05-22 12:25:23
Source

I was bored, so I modified some stuff I wrote. It attempts to encapsulate the parsing in an OO manner while cutting down on the number of iterations through the file; it only iterates once, at the top foreach.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            // usage:
            // note this won't run as getting streams is not implemented,
            // but it will get you started

            CSVFileParser fileParser = new CSVFileParser();

            // TODO: configure fileParser

            PersonParser personParser = new PersonParser(fileParser);

            List&lt;Person&gt; persons = new List&lt;Person&gt;();
            // if the file is large and there is a good way to limit
            // without having to reparse the whole file you can use a
            // LINQ query if you desire
            foreach (Person person in personParser.GetPersons())
            {
                persons.Add(person);
            }

            // now we have a list of Person objects
        }
    }

    public abstract class CSVParser
    {
        protected String[] delimiters = { "," };

        protected internal IEnumerable&lt;String[]&gt; GetRecords()
        {
            Stream stream = GetStream();
            StreamReader reader = new StreamReader(stream);

            String[] aRecord;
            while (!reader.EndOfStream)
            {
                aRecord = reader.ReadLine().Split(delimiters,
                    StringSplitOptions.None);

                yield return aRecord;
            }
        }

        protected abstract Stream GetStream();
    }

    public class CSVFileParser : CSVParser
    {
        // to do: add logic to get a stream from a file

        protected override Stream GetStream()
        {
            throw new NotImplementedException();
        }
    }

    public class CSVWebParser : CSVParser
    {
        // to do: add logic to get a stream from a web request

        protected override Stream GetStream()
        {
            throw new NotImplementedException();
        }
    }

    public class Person
    {
        public String Name { get; set; }
        public String Address { get; set; }
        public DateTime DOB { get; set; }
    }

    public class PersonParser
    {
        public PersonParser(CSVParser parser)
        {
            this.Parser = parser;
        }

        public CSVParser Parser { get; set; }

        public IEnumerable&lt;Person&gt; GetPersons()
        {
            foreach (String[] record in this.Parser.GetRecords())
            {
                yield return new Person()
                {
                    Name = record[0],
                    Address = record[1],
                    DOB = DateTime.Parse(record[2]),
                };
            }
        }
    }
}

0
2019-05-13 14:29:09
Source

If you can guarantee that there are no commas in the data, then the simplest way would probably be to use String.Split.

For example:

String[] values = myString.Split(',');
myObject.StringField = values[0];
myObject.IntField = Int32.Parse(values[1]);


There may be libraries you could use to help, but that's probably as simple as you can get. Just make sure you can't have commas in the data, otherwise you will need to parse it better.
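To see why that caveat matters, here is what Split does to a quoted field containing a comma; this is a deliberately broken example, not a fix:

```csharp
using System;

class SplitPitfall
{
    static void Main()
    {
        string line = "1,\"Smith, Bill\",Supervisor";
        string[] values = line.Split(',');

        // Split knows nothing about quotes, so the quoted field is
        // torn in two: 4 fields instead of the intended 3.
        Console.WriteLine(values.Length); // 4
        Console.WriteLine(values[1]);     // "Smith  (half a field)
    }
}
```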

0
2019-05-08 15:49:07
Source

There are two articles on CodeProject that provide code for a solution, one that uses StreamReader and one that imports CSV data using the Microsoft Text Driver.

0
2019-05-08 15:28:52
Source

A good simple way to do it is to open the file, and read each line into an array, linked list, data-structure-of-your-choice. Be careful about handling the first line though.

This may be over your head, but there seems to be a direct way to access the files as well using a connection string.

Why not try using Python instead of C# or VB? It has a nice CSV module to import that does all the heavy lifting for you.
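If you do go the Python route, the standard csv module handles quoting for you. A small sketch (the field names and sample data are invented):

```python
import csv
import io

# In real use you'd pass an open file; a string buffer keeps this self-contained.
data = io.StringIO('name,title\n"Smith, Bill",Supervisor\n')

# DictReader uses the header row as keys and respects quoted commas.
rows = list(csv.DictReader(data))
print(rows[0]["name"])  # Smith, Bill
```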

0
2019-05-08 14:57:30
Source