Encoding.UTF8 is actually the default encoding used by WriteAllLines and ReadAllLines. So if reading and writing with this encoding "corrupts" your data, you need to use a different one.
You need to determine what the original encoding of the file located at FilePath is, and then specify it like this:

File.ReadAllLines(FilePath, encoding);
File.WriteAllLines(FilePath, WholeFile, encoding);
A likely encoding would be Encoding.Default (Windows-1252); try it out. If that doesn't work, you have to check how the file was actually written before you append to it.
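As a minimal sketch of the round trip (the file name here is made up; the point is passing the same Encoding instance to both calls):

```csharp
using System;
using System.IO;
using System.Text;

class Program
{
    static void Main()
    {
        // Hypothetical path; substitute your actual FilePath.
        string path = "legacy.txt";

        // On .NET Framework, Encoding.Default is the system ANSI code page
        // (Windows-1252 on Western-European systems). On .NET Core/5+ it is
        // UTF-8; for true Windows-1252 there you would need
        // Encoding.GetEncoding(1252) plus the CodePages encoding provider.
        Encoding enc = Encoding.Default;

        File.WriteAllLines(path, new[] { "first", "second" }, enc);

        // Reading back with the same encoding round-trips cleanly; reading
        // a Windows-1252 file as UTF-8 is what produces the garbage characters.
        string[] lines = File.ReadAllLines(path, enc);
        Console.WriteLine(lines.Length);   // prints 2
        Console.WriteLine(lines[0]);       // prints first
    }
}
```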
However, if it contains a lot of non-character data, as your screenshots indicate, you may have to treat the file as "binary". In this case you should use ReadAllBytes / WriteAllBytes, split the file into lines manually (by searching the byte array for \r\n) and then insert the new data at the desired positions. To do this you need to convert strings to byte arrays using Encoding.GetBytes("...") (with the right encoding).
Taking some code from another linked answer, the full code for this would look like:
static class MyEnumerableExtensions
{
    // For a source containing N delimiters, returns exactly N+1 lists
    public static IEnumerable<List<T>> SplitOn<T>(
        this IEnumerable<T> source,
        T delimiter)
    {
        var list = new List<T>();
        foreach (var item in source)
        {
            if (delimiter.Equals(item))
            {
                yield return list;
                list = new List<T>();
            }
            else
            {
                list.Add(item);
            }
        }
        yield return list;
    }
}
public static void InsertLine()
{
    byte[] bytes = File.ReadAllBytes(...);
    // SplitOn drops the '\n' delimiter; with a CRLF file each line keeps its '\r'.
    List<List<byte>> lines = bytes.SplitOn((byte)'\n').ToList();

    string lineToInsert = "Insert this\r"; // trailing '\r' keeps CRLF endings consistent
    byte[] bytesToInsert = Encoding.Default.GetBytes(lineToInsert);
    lines.Insert(2, new List<byte>(bytesToInsert));

    // Re-insert the '\n' delimiters that SplitOn removed when joining back.
    byte[] output = lines
        .SelectMany((line, i) => i == 0 ? line.AsEnumerable() : line.Prepend((byte)'\n'))
        .ToArray();
    File.WriteAllBytes(..., output);
}