Why does my file sometimes disappear in the process of reading from it or writing to it?

I have an app that reads text files to determine which reports should be generated. It works as it should most of the time, but once in a while the program deletes one of the text files it reads from/writes to. An exception is then thrown ("Could not find file") and progress ceases.

Here is some pertinent code.

First, reading from the file:

List<String> delPerfRecords = ReadFileContents(DelPerfFile);

. . .

private static List<String> ReadFileContents(string fileName)
{
    List<String> fileContents = new List<string>();
    try
    {
        fileContents = File.ReadAllLines(fileName).ToList();
    }
    catch (Exception ex)
    {
        RoboReporterConstsAndUtils.HandleException(ex);
    }
    return fileContents;
}

Then, writing to the file -- it marks the record/line in that file as having been processed, so that the same report is not re-generated the next time the file is examined:

MarkAsProcessed(DelPerfFile, qrRecord);

. . .

private static void MarkAsProcessed(string fileToUpdate, string qrRecord)
{
    try
    {
        var fileContents = File.ReadAllLines(fileToUpdate).ToList();
        for (int i = 0; i < fileContents.Count; i++)
        {
            if (fileContents[i] == qrRecord)
            {
                fileContents[i] = string.Format("{0}{1} {2}",
                    qrRecord, RoboReporterConstsAndUtils.COMPLETED_FLAG, DateTime.Now);
            }
        }
        // Will this automatically overwrite the existing?
        File.Delete(fileToUpdate);
        File.WriteAllLines(fileToUpdate, fileContents);
    }
    catch (Exception ex)
    {
        RoboReporterConstsAndUtils.HandleException(ex);
    }
}
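For reference, a sketch of how MarkAsProcessed could avoid the delete-then-write gap entirely (assuming the same helper names as above): write the updated contents to a temporary file first, then swap it in with File.Replace(), so there is never a moment when the file is absent from disk. The ".tmp" suffix is just an illustrative choice.

```csharp
private static void MarkAsProcessedSafely(string fileToUpdate, string qrRecord)
{
    try
    {
        var fileContents = File.ReadAllLines(fileToUpdate).ToList();
        for (int i = 0; i < fileContents.Count; i++)
        {
            if (fileContents[i] == qrRecord)
            {
                fileContents[i] = string.Format("{0}{1} {2}",
                    qrRecord, RoboReporterConstsAndUtils.COMPLETED_FLAG, DateTime.Now);
            }
        }
        // Write the new contents to a sibling temp file, then atomically
        // replace the original; the original file is never missing.
        string tempFile = fileToUpdate + ".tmp";
        File.WriteAllLines(tempFile, fileContents);
        File.Replace(tempFile, fileToUpdate, null);
    }
    catch (Exception ex)
    {
        RoboReporterConstsAndUtils.HandleException(ex);
    }
}
```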

So I do delete the file, but immediately replace it:

File.Delete(fileToUpdate);
File.WriteAllLines(fileToUpdate, fileContents);

The files being read have contents such as this:

Opas,20170110,20161127,20161231-COMPLETED 1/10/2017 12:33:27 AM
Opas,20170209,20170101,20170128-COMPLETED 2/9/2017 11:26:04 AM
Opas,20170309,20170129,20170225-COMPLETED
Opas,20170409,20170226,20170401

If "-COMPLETED" appears at the end of a record/row/line, it is ignored and will not be processed.

Also, if the second element (at index 1) is a date in the future, it will not be processed (yet).

So, for the examples shown above, the first three have already been processed and will subsequently be ignored. The fourth will not be acted on until on or after April 9th, 2017 (at which time the data within the date range given by the last two dates will be retrieved).
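To illustrate, the two skip rules described above could be checked like this (a sketch only; ShouldProcess is a hypothetical helper, and the field layout is assumed from the sample data: unit, generate date, range start, range end):

```csharp
// Sketch of the two skip rules: ignore COMPLETED records, and defer
// records whose second field (the date to generate) is in the future.
private static bool ShouldProcess(string record)
{
    // Rule 1: a record already marked COMPLETED is ignored.
    if (record.Contains("-COMPLETED")) return false;

    // Rule 2: field at index 1 is the date the report is due;
    // a future date means it should not be processed yet.
    var fields = record.Split(',');
    var dateToGenerate = DateTime.ParseExact(
        fields[1], "yyyyMMdd", CultureInfo.InvariantCulture);
    return dateToGenerate <= DateTime.Today;
}
```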

Why is the file sometimes deleted? What can I do to prevent it from ever happening?

If helpful, in more context, the logic is like so:

internal static string GenerateAndSaveDelPerfReports()
{
    string allUnitsProcessed = String.Empty;
    bool success = false;
    try
    {
        List<String> delPerfRecords = ReadFileContents(DelPerfFile);
        List<QueuedReports> qrList = new List<QueuedReports>();
        foreach (string qrRecord in delPerfRecords)
        {
            var qr = ConvertCRVRecordToQueuedReport(qrRecord);
            // Rows that have already been processed return null
            if (null == qr) continue;
            // If the report has not yet been run, and it is due, add it to the list
            if (qr.DateToGenerate <= DateTime.Today)
            {
                var unit = qr.Unit;
                qrList.Add(qr);
                MarkAsProcessed(DelPerfFile, qrRecord);
                if (String.IsNullOrWhiteSpace(allUnitsProcessed))
                {
                    allUnitsProcessed = unit;
                }
                else if (!allUnitsProcessed.Contains(unit))
                {
                    allUnitsProcessed = allUnitsProcessed + " and " + unit;
                }
            }
        }
        foreach (QueuedReports qrs in qrList)
        {
            GenerateAndSaveDelPerfReport(qrs);
            success = true;
        }
    }
    catch
    {
        success = false;
    }
    if (success)
    {
        return String.Format("Delivery Performance report[s] generated for {0} by RoboReporter2017", allUnitsProcessed);
    }
    return String.Empty;
}

How can I ironclad this code to prevent the files from being periodically trashed?

UPDATE

I can't really test this, because the problem occurs so infrequently, but I wonder if adding a "pause" between the File.Delete() and the File.WriteAllLines() would solve the problem?

UPDATE 2

I'm not absolutely sure what the answer to my question is, so I won't add this as an answer, but my guess is that the File.Delete() and File.WriteAllLines() were executing too close together, so the delete sometimes removed the newly written copy of the file as well as the old one.

If so, a pause between the two calls may have solved the problem 99.42% of the time, but from what I found here, it seems the File.Delete() is redundant/superfluous anyway, and so I tested with the File.Delete() commented out, and it worked fine; so, I'm just doing without that occasionally problematic call now. I expect that to solve the issue.



Solution 1:[1]

// Will this automatically overwrite the existing?
File.Delete(fileToUpdate);
File.WriteAllLines(fileToUpdate, fileContents);

I would simply add an extra parameter to WriteAllLines() (which could default to false) to tell the function to open the file in overwrite mode, and not call File.Delete() at all then.

Do you currently check the return value of the file open?


Update: ok, it looks like WriteAllLines() is a .Net Framework function and therefore cannot be changed, so I deleted this answer. However now this shows up in the comments, as a proposed solution on another forum:

"just use something like File.WriteAllText where if the file exists, the data is just overwritten, if the file does not exist it will be created."

And this was exactly what I meant (while thinking WriteAllLines() was a user defined function), because I've had similar problems in the past.

So, a solution like that could solve some tricky problems (instead of deleting/fast reopening, just overwriting the file) - also less work for the OS, and possibly less file/disk fragmentation.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
