Enclosing every field in double quotes in a CSV file using C# - c#

I have found several threads about people wishing to REMOVE quotes from their CSV file, but not about adding them. And the ones I have found about adding quotes have not helped my case.
I'm using Microsoft.Office.Interop.Excel and am creating a CSV file that will be read by a program that oddly enough requires each field to be in double quotes. However when I write to a cell using, for example:
xlSheet.Cells[1,1] = "\"" + id + "\"";
my output is """id"""
Is there any fix for this? My client also wishes to be able to open the file in Excel, hence my use of Microsoft.Office.Interop.

You don't really have to write the file using Microsoft.Office.Interop.Excel; instead, just write to a file using a StreamWriter, with the name of the file as Your_File.csv. You can still open this CSV file using Excel. Remember to use proper delimiters in the CSV file. Hope this helps.
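A minimal sketch of that StreamWriter approach (the file name and row values are made up, and the Quote helper is mine, not the asker's code):

```csharp
using System;
using System.IO;
using System.Linq;

class QuotedCsvWriter
{
    // Wrap a field in double quotes, doubling any embedded quotes
    // as the CSV convention requires.
    static string Quote(string field) =>
        "\"" + field.Replace("\"", "\"\"") + "\"";

    static void Main()
    {
        var rows = new[]
        {
            new[] { "id", "name" },
            new[] { "42", "widget \"deluxe\"" }
        };

        using (var sw = new StreamWriter("Your_File.csv"))
        {
            foreach (var row in rows)
                sw.WriteLine(string.Join(",", row.Select(Quote)));
        }
    }
}
```

Opening the resulting file in Excel still works; the quotes are consumed as field qualifiers rather than appearing in the cells.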

Excel may be interpreting your "id" value as a literal string rather than a number, then adding additional quotes to it when it converts it to CSV.
Rather than adding quotes, store the value as a string instead of a number:
xlSheet.Cells[1,1] = id.ToString();

Related

in C# when I convert excel to text with the help of excel saveAs method then extra double quotes come with string(text) data

When I convert an Excel file to text in C# using Excel's SaveAs method, extra double quotes appear around the string (text) data. After some googling I found one workaround: saving as a .prn (printer) file. That resolved the quotes issue, but .prn files truncate the data, so the full data does not come through. I have not found a proper solution for this.
What is the best approach to get the full data without double quotes?
Thanks,
Kapil
I think you're converting the .csv file into a text file. As the .csv file contains commas at the end of each line, you have to remove them.
System.IO.File.WriteAllText(fileName, File.ReadAllText(fi.FullName).Replace(",", "\t"));
Note that this replaces every comma in the file with a tab, not just the trailing ones.

Export datatable to Excel asp.net : How to Format excel cells to text in Response.Write()? [duplicate]

Does anyone happen to know if there is a token I can add to my csv for a certain field so Excel doesn't try to convert it to a date?
I'm trying to write a .csv file from my application and one of the values happens to look enough like a date that Excel is automatically converting it from text to a date. I've tried putting all of my text fields (including the one that looks like a date) within double quotes, but that has no effect.
I have found that putting an '=' before the double quotes will accomplish what you want. It forces the data to be text.
eg. ="2008-10-03",="more text"
EDIT (according to other posts): because of the Excel 2007 bug noted by Jeffiekins one should use the solution proposed by Andrew: "=""2008-10-03"""
I know this is an old question, but the problem is not going away soon. CSV files are easy to generate from most programming languages, rather small, human-readable in a crunch with a plain text editor, and ubiquitous.
The problem is not only with dates in text fields, but anything numeric also gets converted from text to numbers. A couple of examples where this is problematic:
ZIP/postal codes
telephone numbers
government ID numbers
which sometimes can start with one or more zeroes (0), which get thrown away when converted to numeric. Or the value contains characters that can be confused with mathematical operators (as in dates: /, -).
Two cases I can think of where the "prepending =" solution, as mentioned previously, might not be ideal are:
where the file might be imported into a program other than MS Excel (MS Word's Mail Merge function comes to mind),
where human-readability might be important.
My hack to work around this
If one prepends or appends a non-numeric and/or non-date character to the value, the value will be recognized as text and not converted. A non-printing character would be good, as it will not alter the displayed value. However, the plain old space character (\s, ASCII 32) doesn't work for this, as it gets chopped off by Excel, and then the value still gets converted. But there are various other printing and non-printing space characters that work well. The easiest, however, is to append (add after) a simple tab character (\t, ASCII 9).
Benefits of this approach:
Available from keyboard or with an easy-to-remember ASCII code (9),
It doesn't bother the importation,
Normally does not bother Mail Merge results (depending on the template layout - but normally it just adds a wide space at the end of a line). (If this is a problem, however, look at other characters, e.g. the zero-width space (ZWSP, Unicode U+200B).)
is not a big hindrance when viewing the CSV in Notepad (etc),
and could be removed by find/replace in Excel (or Notepad etc).
You don't need to import the CSV, but can simply double-click to open the CSV in Excel.
If there's a reason you don't want to use the tab, look in an Unicode table for something else suitable.
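The tab-append hack above could be produced on the writing side like this in C# (a sketch; the field values and Protect helper name are illustrative):

```csharp
using System.IO;
using System.Linq;

class TabAppendCsv
{
    // Append a tab (\t, ASCII 9) so Excel treats the value as text
    // instead of converting it to a number or date.
    static string Protect(string field) => field + "\t";

    static void Main()
    {
        string[] fields = { "01234", "2008-10-03", "plain text" };
        File.WriteAllText("protected.csv",
            string.Join(",", fields.Select(Protect)));
    }
}
```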
Another option
might be to generate XML files; newer MS Excel versions also accept a certain XML format for import, which allows a lot more options, similar to the .XLS format, but I don't have experience with this.
So there are various options. Depending on your requirements/application, one might be better than another.
Addition
It needs to be said that newer versions (2013+) of MS Excel don't open the CSV in spreadsheet format any more - one more speedbump in one's workflow making Excel less useful... At least, instructions exist for getting around it. See e.g. this Stack Overflow question: How to correctly display .csv files within Excel 2013?
Working off of Jarod's solution and the issue brought up by Jeffiekins, you could modify
"May 16, 2011"
to
"=""May 16, 2011"""
I had a similar problem and this is the workaround that helped me without having to edit the csv file contents:
If you have the flexibility to name the file something other than ".csv", you can name it with a ".txt" extension, such as "Myfile.txt" or "Myfile.csv.txt". Then when you open it in Excel (not by drag and drop, but using File->Open or the Most Recently Used files list), Excel will provide you with a "Text Import Wizard".
In the first page of the wizard, choose "Delimited" for the file type.
In the second page of the wizard, choose "," as the delimiter, and also choose the text qualifier if you have surrounded your values with quotes.
In the third page, select every column individually and assign each the type "Text" instead of "General" to stop Excel from messing with your data.
Hope this helps you or someone with a similar problem!
2018
The only proper solution that worked for me (and also without modifying the CSV).
Excel 2010:
Create new workbook
Data > From Text > Select your CSV file
In the popup, choose "Delimited" radio button, then click "Next >"
Delimiters checkboxes: tick only "Comma" and uncheck the other options, then click "Next >"
In the "Data preview", scroll to the far right, then hold shift and click on the last column (this will select all columns). Now in the "Column data format" select the radio button "Text", then click "Finish".
Excel office365: (client version)
Create new workbook
Data > From Text/CSV > Select your CSV file
Data type detection > do not detect
Note: in Excel Office 365 (web version), as I'm writing this, you will not be able to do that.
WARNING: Excel '07 (at least) has a(nother) bug: if there's a comma in the contents of a field, it doesn't parse the ="field, contents" correctly, but rather puts everything after the comma into the following field, regardless of the quotation marks.
The only workaround I've found that works is to eliminate the = when the field contents include a comma.
This may mean that there are some fields that are impossible to represent exactly "right" in Excel, but by now I trust no-one is too surprised.
While creating the string to be written to my CSV file in C# I had to format it this way:
"=\"" + myVariable + "\""
In Excel 2010 open a new sheet.
On the Data ribbon click "Get External Data From Text".
Select your CSV file then click "Open".
Click "Next".
Uncheck "Tab", place a check mark next to "Comma", then click "Next".
Click anywhere on the first column.
While holding the shift key drag the slider across until you can click in the last column, then release the shift key.
Click the "text" radio button then click "Finish"
All columns will be imported as text, just as they were in the CSV file.
Still an issue in the Microsoft Office 2016 release - rather disturbing for those of us working with gene names such as MARC1, MARCH1, SEPT1, etc.
The solution I've found to be the most practical after generating a ".csv" file in R, that will then be opened/shared with Excel users:
Open the CSV file as text (notepad)
Copy it (ctrl+a, ctrl+c).
Paste it into a new Excel sheet - it will all paste into one column as long text strings.
Choose/select this column.
Go to Data > "Text to Columns...". In the window that opens choose "Delimited" (Next). Check that "Comma" is marked (marking it will already show the separation of the data into columns below) (Next). In this window you can choose the columns you want and mark them as Text (instead of General) (Finish).
HTH
Here is the simple method we use at work when generating the CSV file in the first place. It does change the values a bit, so it is not suitable in all applications:
Prepend a space to all values in the csv
This space will get stripped off by excel from numbers such as " 1"," 2.3" and " -2.9e4" but will remain on dates like " 01/10/1993" and booleans like " TRUE", stopping them being converted into excel's internal data types.
It also stops double quotes being zapped on read-in. So a foolproof way of making text in a CSV remain unchanged by Excel, even if it is something like "3.1415", is to surround it with double quotes AND prepend the whole string with a space, i.e. (using single quotes to show what you would type) ' "3.1415"'. Then in Excel you always have the original string, except it is surrounded by double quotes and prepended by a space, so you need to account for those in any formulas, etc.
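The space-plus-quotes combination described above could be generated like this in C# (a sketch; the Protect helper name and sample values are mine):

```csharp
using System.IO;
using System.Linq;

class SpacePrefixCsv
{
    // Prepend a space and wrap the value in quotes so Excel keeps it
    // as text; the space stops the quotes being consumed on read-in.
    static string Protect(string field) => " \"" + field + "\"";

    static void Main()
    {
        string[] fields = { "3.1415", "01/10/1993", "TRUE" };
        File.WriteAllText("protected.csv",
            string.Join(",", fields.Select(Protect)));
    }
}
```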
(Assuming Excel 2003...)
When using the Text-to-Columns Wizard, in Step 3 you can dictate the data type for each of the columns. Click on the column in the preview and change the misbehaving column from "General" to "Text."
This is the only way I know to accomplish this without messing around inside the file itself. As usual with Excel, I learned this by beating my head on the desk for hours.
Change the .csv file extension to .txt; this will stop Excel from auto-converting the file when it's opened. Here's how I do it: open Excel to a blank worksheet, close the blank sheet, then File => Open and choose your file with the .txt extension. This forces Excel to open the "Text Import Wizard", where it'll ask you questions about how you want it to interpret the file. First you choose your delimiter (comma, tab, etc...), then (here's the important part) you select each column and choose its formatting. If you want exactly what's in the file, choose "Text" and Excel will display just what's between the delimiters.
(EXCEL 2007 and later)
How to force excel not to "detect" date formats without editing the source file
Either:
rename the file as .txt
If you can't do that, instead of opening the CSV file directly in excel, create a new workbook then go to
Data > Get external data > From Text and select your CSV.
Either way, you will be presented with import options, simply select each column containing dates and tell excel to format as "text" not "general".
What I have done for this same problem was to add the following before each csv value:
"="""
and one double quote after each CSV value, before opening the file in Excel. Take the following values for example:
012345,00198475
These should be altered before opening in Excel to:
"="""012345","="""00198475"
After you do this, every cell value appears as a formula in Excel and so won't be formatted as a number, date, etc. For example, a value of 012345 appears as:
="012345"
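In C#, producing the "=""012345""" form for each value could look like this (a sketch; the AsFormula helper name is mine):

```csharp
using System.IO;
using System.Linq;

class FormulaWrappedCsv
{
    // Emit a field as "=""value""" so that, after CSV unescaping,
    // Excel sees the formula ="value" and keeps the value as text.
    static string AsFormula(string value) =>
        "\"=\"\"" + value + "\"\"\"";

    static void Main()
    {
        string[] values = { "012345", "00198475" };
        File.WriteAllText("codes.csv",
            string.Join(",", values.Select(AsFormula)));
    }
}
```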
None of the solutions offered here is a good solution. It may work for individual cases, but only if you're in control of the final display. Take my example: my work produces a list of products they sell to retail. This is in CSV format and contains part-codes, some of which start with zeroes, set by manufacturers (not under our control). Take away the leading zeroes and you may actually match another product.
Retail customers want the list in CSV format because of back-end processing programs, which are also out of our control and differ per customer, so we cannot change the format of the CSV files. No prefixed '=', no added tabs. The data in the raw CSV files is correct; it's when customers open those files in Excel that the problems start. And many customers are not really computer savvy; they can just about open and save an email attachment.
We are thinking of providing the data in two slightly different formats: one as 'Excel-friendly' (using the options suggested above, e.g. adding a TAB), the other as the 'master'. But this may be wishful thinking, as some customers will not understand why we need to do this. Meanwhile we continue to keep explaining why they sometimes see 'wrong' data in their spreadsheets.
Until Microsoft makes a proper change I see no proper resolution to this, as long as one has no control over how end-users use the files.
I have just this week come across this convention, which seems to be an excellent approach, but I cannot find it referenced anywhere. Is anyone familiar with it? Can you cite a source for it? I have looked for hours and hours but am hoping someone will recognize this approach.
Example 1: =("012345678905") displays as 012345678905
Example 2: =("1954-12-12") displays as 1954-12-12, not 12/12/1954.
Hi, I have the same issue.
I wrote this VBScript to create another CSV file. The new CSV file will have a space in front of each field, so Excel will understand it as text.
Create a .vbs file with the code below (for example Modify_CSV.vbs), save and close it. Then drag and drop your original file onto the .vbs file. It will create a new file with "SPACE_ADDED" appended to the file name in the same location.
Set objArgs = WScript.Arguments
Set objFso = CreateObject("Scripting.FileSystemObject")
Dim objTextFile
Dim arrStr  ' an array to hold the text content
Dim sLine   ' holds the text to write to the new file

' Loop through all dropped files
For t = 0 To objArgs.Count - 1
    ' Input path
    inPath = objFso.GetFile(WScript.Arguments.Item(t))
    ' Output path
    outPath = Replace(inPath, objFso.GetFileName(inPath), Left(objFso.GetFileName(inPath), InStrRev(objFso.GetFileName(inPath), ".") - 1) & "_SPACE_ADDED.csv")
    ' Read the file
    Set objTextFile = objFso.OpenTextFile(inPath)
    ' Create the new file (overwrites an existing file)
    Set aNewFile = objFso.CreateTextFile(outPath, True)
    aNewFile.Close
    ' Open the file for appending data
    Set aNewFile = objFso.OpenTextFile(outPath, 8) ' 2 = open for writing, 8 = for appending
    ' Read data and write it to the new file
    Do While Not objTextFile.AtEndOfStream
        arrStr = Split(objTextFile.ReadLine, ",")
        sLine = "" ' Clear previous data
        For i = LBound(arrStr) To UBound(arrStr)
            sLine = sLine + " " + arrStr(i) + ","
        Next
        ' Write data to the new file
        aNewFile.WriteLine Left(sLine, Len(sLine) - 1) ' Drop the extra trailing comma from the loop
    Loop
    ' Close the new file
    aNewFile.Close
Next ' Next dropped file

Set aNewFile = Nothing
Set objFso = Nothing
Set objArgs = Nothing
It's not Excel. Windows recognizes the data as a date and autocorrects it. You have to change the Windows settings:
"Control Panel" (-> "Switch to Classic View") -> "Regional and Language
Options" -> tab "Regional Options" -> "Customize..." -> tab "Numbers" -> And
then change the symbols according to what you want.
http://www.pcreview.co.uk/forums/enable-disable-auto-convert-number-date-t3791902.html
It will work on your computer, but if these settings are not changed on, for example, your customers' computers, they will see dates instead of the data.
Without modifying your CSV file you can:
Change the Excel "Format Cells" option to "Text"
Then use the "Text Import Wizard" to define the CSV cells
Once imported, delete that data
Then just paste as plain text
Excel will properly format and separate your CSV cells as text, ignoring auto date formats.
Kind of a silly workaround, but it beats modifying the CSV data before importing. Andy Baird and Richard sort of alluded to this method, but missed a couple of important steps.
In my case, "Sept8" in a csv file generated using R was converted into "8-Sept" by Excel 2013. The problem was solved by using write.xlsx2() function in the xlsx package to generate the output file in xlsx format, which can be loaded by Excel without unwanted conversion. So, if you are given a csv file, you can try loading it into R and converting it into xlsx using the write.xlsx2() function.
EASIEST SOLUTION
I just figured this out today.
Open in Word
Replace all hyphens with en dashes
Save and Close
Open in Excel
Once you are done editing, you can always open it back up in Word again to replace the en dashes with hyphens again.
A workaround using Google Drive (or Numbers if you're on a Mac):
Open the data in Excel
Set the format of the column with incorrect data to Text (Format > Cells > Number > Text)
Load the .csv into Google Drive, and open it with Google Sheets
Copy the offending column
Paste column into Excel as Text (Edit > Paste Special > Text)
Alternatively if you're on a Mac for step 3 you can open the data in Numbers.
(EXCEL 2016 and later, actually I have not tried in older versions)
Open new blank page
Go to tab "Data"
Click "From Text/CSV" and choose your csv file
Check in preview whether your data is correct.
In case some column is converted to a date, click "Edit" and then select the type Text by clicking on the calendar in the head of the column
Click "Close & Load"
If someone is still looking for an answer, the line below worked perfectly for me.
I entered =("my_value").
i.e. =("04SEP2009") displayed as 04SEP2009 not as 09/04/2009
The same worked for integers more than 15 digits. They weren't getting trimmed anymore.
If you can change the file source data
If you're prepared to alter the original source CSV file, another option is to change the 'delimiter' in the data. So if your data is '4/11' (or '4-11') and Excel converts this to 4/11/2021 (UK) or 11-4-2021 (US), then changing the '/' or '-' character to something else will thwart the unwanted Excel date conversion. Options may include:
Tilde ('~')
Plus ('+')
Underscore ('_')
Double-dash ('--')
En-dash (Alt 150)
Em-dash (Alt 151)
(Some other character!)
Note: moving to Unicode or other non-ascii/ansi characters may complicate matters if the file is to be used elsewhere.
So, '4-11' converted to '4~11' with a tilde will NOT be treated as a date!
For large CSV files, this has no additional overhead (ie: extra quotes/spaces/tabs/formula constructs) and just works when the file is opened directly (ie: double-clicking the CSV to open) and avoids pre-formatting columns as text or 'importing' the CSV file as text.
A search/replace in Notepad (or similar tool) can easily convert to/from the alternative delimiter, if necessary.
Import the original data
In newer versions of Excel you can import the data (outlined in other answers).
In older versions of Excel, you can install the 'Power Query' add-in. This tool can also import CSVs without conversion. Choose: Power Query tab/From file/From Text-CSV, then 'Load' to open as a table. (You can choose 'do not detect data types' from the 'data type detection' options).
I know this is an old thread. For those like me who still have this problem: using Office 2013 via a PowerShell COM object, you can use the OpenText method. The problem is that this method has many arguments, some of which are mutually exclusive. To resolve this issue you can use the Invoke-NamedParameter method introduced in this post.
An example would be
$ex = New-Object -com "Excel.Application"
$ex.visible = $true
$csv = "path\to\your\csv.csv"
Invoke-NamedParameter ($ex.workbooks) "opentext" @{"filename" = $csv; "Semicolon" = $true}
Unfortunately I just discovered that this method somehow breaks the CSV parsing when cells contain line breaks. This is supported by CSV but Microsoft's implementation seems to be bugged.
It also somehow did not detect German-specific characters. Giving it the correct culture did not change this behaviour. All files (CSV and script) are saved with UTF-8 encoding.
First I wrote the following code to insert the CSV cell by cell.
$ex = New-Object -com "Excel.Application"
$ex.visible = $true
$csv = "path\to\your\csv.csv"
$ex.workbooks.add()
$ex.activeWorkbook.activeSheet.Cells.NumberFormat = "#"
$data = import-csv $csv -encoding utf8 -delimiter ";"
$row = 1
$data | %{
    $obj = $_; $col = 1
    $_.psobject.properties.Name | %{
        if ($row -eq 1) { $ex.ActiveWorkbook.activeSheet.Cells.Item($row, $col).Value2 = $_ }
        $ex.ActiveWorkbook.activeSheet.Cells.Item($row + 1, $col).Value2 = $obj.$_
        $col++
    }
    $row++
}
But this is extremely slow, which is why I looked for an alternative. Apparently, Excel allows you to set the values of a range of cells with a matrix. So I used the algorithm in this blog to transform the CSV in a multiarray.
function csvToExcel($csv, $delimiter) {
    $a = New-Object -com "Excel.Application"
    $a.visible = $true
    $a.workbooks.add()
    $a.activeWorkbook.activeSheet.Cells.NumberFormat = "#"
    $data = import-csv -delimiter $delimiter $csv
    $array = ($data | ConvertTo-MultiArray).Value
    $starta = [int][char]'a' - 1
    if ($array.GetLength(1) -gt 26) {
        $col = [char]([int][math]::Floor($array.GetLength(1) / 26) + $starta) + [char](($array.GetLength(1) % 26) + $starta)
    } else {
        $col = [char]($array.GetLength(1) + $starta)
    }
    $range = $a.activeWorkbook.activeSheet.Range("a1:" + $col + "" + $array.GetLength(0))
    $range.value2 = $array
    $range.Columns.AutoFit()
    $range.Rows.AutoFit()
    $range.Cells.HorizontalAlignment = -4131
    $range.Cells.VerticalAlignment = -4160
}
function ConvertTo-MultiArray {
    param(
        [Parameter(Mandatory=$true, Position=1, ValueFromPipeline=$true)]
        [PSObject[]]$InputObject
    )
    BEGIN {
        $objects = @()
        [ref]$array = [ref]$null
    }
    Process {
        $objects += $InputObject
    }
    END {
        $properties = $objects[0].psobject.properties | %{ $_.name }
        $array.Value = New-Object 'object[,]' ($objects.Count + 1), $properties.count
        # $i = row and $j = column
        $j = 0
        $properties | %{
            $array.Value[0, $j] = $_.tostring()
            $j++
        }
        $i = 1
        $objects | %{
            $item = $_
            $j = 0
            $properties | %{
                if ($item.($_) -eq $null) {
                    $array.value[$i, $j] = ""
                } else {
                    $array.value[$i, $j] = $item.($_).tostring()
                }
                $j++
            }
            $i++
        }
        $array
    }
}
csvToExcel "storage_stats.csv" ";"
You can use the above code as is; it should convert any CSV into Excel. Just change the path to the CSV and the delimiter character at the bottom.
Okay found a simple way to do this in Excel 2003 through 2007. Open a blank xls workbook. Then go to Data menu, import external data. Select your csv file. Go through the wizard and then in "column data format" select any column that needs to be forced to "text". This will import that entire column as a text format preventing Excel from trying to treat any specific cells as a date.
This issue is still present in Mac Office 2011 and Office 2013, I cannot prevent it happening. It seems such a basic thing.
In my case I had values such as "1 - 2" and "7 - 12" within the CSV, enclosed correctly within inverted commas. These automatically convert to dates within Excel; if you subsequently try to convert them to plain text, you get a number representation of the date, such as 43768. Additionally, Excel reformats large numbers found in barcodes and EAN numbers into 123E+ notation, which again cannot be converted back.
I have found that Google Drive's Google Sheets doesn't convert the numbers to dates. The barcodes do have commas in them every 3 characters, but these are easily removed. It handles CSVs really well, especially when dealing with Mac / Windows CSVs.
Might save someone some time.
I do this for credit card numbers which keep converting to scientific notation: I end up importing my .csv into Google Sheets. The import options now allow to disable automatic formatting of numeric values. I set any sensitive columns to Plain Text and download as xlsx.
It's a terrible workflow, but at least my values are left the way they should be.
I made this VBA macro which basically formats the output range as text before pasting the numbers. It works perfectly for me when I want to paste values such as 8/11, 23/6, 1/3, etc. without Excel interpreting them as dates.
Sub PasteAsText()
' Created by Lars-Erik Sørbotten, 2017-09-17
Call CreateSheetBackup
Columns(ActiveCell.Column).NumberFormat = "#"
Dim DataObj As MSForms.DataObject
Set DataObj = New MSForms.DataObject
DataObj.GetFromClipboard
ActiveCell.PasteSpecial
End Sub
I'm very interested in knowing if this works for other people as well. I've been looking for a solution to this problem for a while, but I haven't seen a quick VBA solution to it that didn't include inserting ' in front of the input text. This code retains the data in its original form.

Find a pattern and replace an element of it

I have the following problem:
I am trying to split the rows of a CSV file but the thing is that sometimes I read the following line:
string input = "a,b,c,d,\"V=12.503,I=0.194\",e,f"
I use the following code
string[] SplittedLine= input.split(',');
The result is that I get an extra column, because the data "V=12.503,I=0.194" has a comma inside. But when I open the CSV file with Excel, I notice that Excel doesn't add an extra column, because it doesn't split that data into two different fields. How can I properly split this CSV file considering this situation?
You are encountering commas in the "cells" of your CSV, which by convention (but not by any standard) are escaped by wrapping the cell data with double quotes. You also need to be aware that the quote-escaped string can contain quote literals.
Let's say you had a name column and someone's name was
Jonathan "Jake" Smith, Jr.
That would be encoded as
"Jonathan ""Jake"" Smith, Jr."
You can certainly improve your code to handle those cases. However, that problem has been solved before. If you don't want to reinvent the wheel, there are a number of solid open source libraries that handle the headache of parsing CSV files. The one I use is
http://www.codeproject.com/Articles/9258/A-Fast-CSV-Reader
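Another quote-aware option already built into .NET (a sketch, not the linked CodeProject reader): Microsoft.VisualBasic.FileIO.TextFieldParser handles quoted fields with embedded commas and doubled quotes. The SplitCsvLine helper name is mine.

```csharp
using System;
using System.IO;
using Microsoft.VisualBasic.FileIO;

class QuoteAwareSplit
{
    // Split one CSV line, honoring double-quoted fields.
    public static string[] SplitCsvLine(string line)
    {
        using (var parser = new TextFieldParser(new StringReader(line)))
        {
            parser.SetDelimiters(",");
            parser.HasFieldsEnclosedInQuotes = true;
            return parser.ReadFields();
        }
    }

    static void Main()
    {
        string input = "a,b,c,d,\"V=12.503,I=0.194\",e,f";
        foreach (var field in SplitCsvLine(input))
            Console.WriteLine(field);
        // The quoted "V=12.503,I=0.194" stays one field, so 7 fields total.
    }
}
```

This requires a reference to the Microsoft.VisualBasic assembly.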

CSV printing issue

I have a program that writes two columns to a CSV. One is a ratio, of the form 1:100 or something similar, and the other is a score.
These values are stored in a Dictionary and printed as follows:
foreach (String s in p.normalizedScore.Keys)
{
sw.WriteLine(s + DELIM + p.normalizedScore[s]);
}
where sw is a StreamWriter and DELIM is a comma. The output is as follows:
1:10,7.498378506
0.111111111,18.46320676
0.736111111,30.08283816
1:10000 ,40.80688802
1:100000 ,51.93716854
1:1000000,62.89993635
1:10000000,73.54010349
The scores are all correct, but 2 of the ratios are printed incorrectly (it should be increasing 10 fold, so there should be a 1:100 and a 1:1000). When I enter the debugger, I find that at the time of printing, it's still reading all the ratios correctly, meaning I can't locate any place in my code where the ratios are wrong. Does anyone have an idea as to what the problem might be?
Edit: The above output was copied directly from Excel, but if I look at it in Notepad, the data seems fine, so it seems to me the problem is with Excel. (Still don't know what it is mind you.)
As per the comments - this was an Excel formatting issue, not a code issue.
Adding Cory's comment to the answer because I think it adds significant value:
If you put quotes around your ratios, Excel shouldn't try any funny
business with formatting (you won't see the quotations in Excel,
that's just part of the CSV spec for qualifying your values).
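A sketch of that quoting applied to the question's loop (the dictionary here stands in for p.normalizedScore; the Quote helper is mine and also doubles any embedded quotes per the CSV spec):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

class RatioWriter
{
    const string DELIM = ",";

    // Quote a field, doubling embedded quotes per CSV convention.
    static string Quote(string s) => "\"" + s.Replace("\"", "\"\"") + "\"";

    static void Main()
    {
        // Stand-in for p.normalizedScore from the question.
        var normalizedScore = new Dictionary<string, double>
        {
            ["1:10"] = 7.498378506,
            ["1:100"] = 18.46320676
        };

        using (var sw = new StreamWriter("scores.csv"))
        {
            foreach (string s in normalizedScore.Keys)
                sw.WriteLine(Quote(s) + DELIM + normalizedScore[s]);
        }
    }
}
```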
Do not just double-click the file and open it in Excel. Open a new worksheet and import from the text file. You will then specify it's a column of text - or you can create your CSV with a text qualifier and use that in the import as well.
Alternatively, you can add a space in front of your s variable.
sw.WriteLine(" "+ s + DELIM + p.normalizedScore[s]);
' 1:10' won't be treated as an expression.

How to delete a string from a text file?

I currently have two strings assigned: domain and subdomain.
How could I delete any matched occurrences of these strings in a text file?
string domain = "127.0.0.1 test.com";
string subdomain = "127.0.0.1 sub.test.com";
I don't think using a regex would be ideal in this situation.
How can this be done?
You need to:
Open the existing file for input
Open a new file for output
Repeatedly:
Read a line of text from the input
See if it matches your pattern (it's unclear at the moment what pattern you're looking for)
If it doesn't, write the line to the output (or if you're only trying to remove bits of lines, work out which bit you want to write out)
Close both the input and output (a using statement will do this automatically)
Optionally delete the original file and rename the new one if you want to effectively replace the original.
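The steps above can be sketched like this (file names are assumptions; exact whole-line matching is used since the strings here are whole lines):

```csharp
using System;
using System.IO;

class RemoveLines
{
    // True if the line should survive the copy.
    static bool Keep(string line, string domain, string subdomain) =>
        line != domain && line != subdomain;

    static void Main()
    {
        string domain = "127.0.0.1 test.com";
        string subdomain = "127.0.0.1 sub.test.com";

        using (var reader = new StreamReader("hosts.txt"))
        using (var writer = new StreamWriter("hosts.tmp"))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                if (Keep(line, domain, subdomain))
                    writer.WriteLine(line);
            }
        }

        // Replace the original with the filtered copy.
        File.Delete("hosts.txt");
        File.Move("hosts.tmp", "hosts.txt");
    }
}
```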
var result = Regex.Replace(File.ReadAllText("file.txt"),
    @"127\.0\.0\.1 test\.com|127\.0\.0\.1 sub\.test\.com", string.Empty);
Then write the obtained result back to the file.
