Beyond JSON, many real-world applications process CSV files (data exports, bulk imports), binary files (images, PDFs, compiled data), and need to inspect file metadata. CSV is ubiquitous in enterprise data exchange — every ERP, CRM, and analytics tool exports CSV. Binary formats appear in file uploads, report generation, and image processing. FileInfo and DirectoryInfo provide rich metadata access. Knowing how to handle these formats reliably, including edge cases like different encodings and locked files, completes the file I/O skill set.
CSV Processing
// ── Manual CSV parsing — handles simple cases ─────────────────────────────
public async Task<List<UserImport>> ParseCsvAsync(string filePath)
{
    var users = new List<UserImport>();
    using var reader = new StreamReader(filePath, Encoding.UTF8);

    // Skip header
    await reader.ReadLineAsync();

    string? line;
    int lineNum = 1; // the header was line 1
    while ((line = await reader.ReadLineAsync()) is not null)
    {
        lineNum++;
        if (string.IsNullOrWhiteSpace(line)) continue;

        var parts = line.Split(',');
        if (parts.Length < 3)
        {
            Console.WriteLine($"Warning: line {lineNum} has too few columns, skipping.");
            continue;
        }

        users.Add(new UserImport
        {
            Name = parts[0].Trim(),
            Email = parts[1].Trim(),
            Role = parts[2].Trim(),
        });
    }

    return users;
}
// ── Writing CSV ───────────────────────────────────────────────────────────
public async Task ExportToCsvAsync(IEnumerable<User> users, string filePath)
{
    await using var writer = new StreamWriter(filePath, append: false, Encoding.UTF8);
    await writer.WriteLineAsync("Id,Name,Email,CreatedAt");

    foreach (var user in users)
    {
        // Quote fields containing commas or quotes; double embedded quotes (RFC 4180)
        string safeName = user.Name.Contains(',') || user.Name.Contains('"')
            ? $"\"{user.Name.Replace("\"", "\"\"")}\""
            : user.Name;
        await writer.WriteLineAsync($"{user.Id},{safeName},{user.Email},{user.CreatedAt:O}");
    }
}
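For robust parsing of quoted fields, the CsvHelper package can replace the manual Split entirely. A minimal sketch, assuming the package is installed (dotnet add package CsvHelper) and the UserImport class from above has properties matching the header names:

```csharp
using System.Globalization;
using System.Text;
using CsvHelper;

public async Task<List<UserImport>> ParseCsvWithHelperAsync(string filePath)
{
    using var reader = new StreamReader(filePath, Encoding.UTF8);
    using var csv = new CsvReader(reader, CultureInfo.InvariantCulture);

    // GetRecordsAsync maps header columns to UserImport properties by name
    var users = new List<UserImport>();
    await foreach (var record in csv.GetRecordsAsync<UserImport>())
        users.Add(record);
    return users;
}
```

CsvHelper reads the header row itself, so there is no manual skip, and quoted fields like "Smith, John" arrive as a single value.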
Split(',') breaks on fields that contain commas (e.g., an address field like "London, UK", which is quoted in proper CSV). For production CSV processing, use the CsvHelper NuGet package (dotnet add package CsvHelper). It handles quoted fields, escaped characters, custom delimiters (e.g., tab-separated values), and automatic mapping to C# classes. The manual approach above is acceptable for controlled internal files; use CsvHelper for any user-supplied or third-party CSV.
File Uploads
When accepting uploads in ASP.NET Core, stream them with IFormFile.OpenReadStream() and CopyToAsync() — never buffer the entire upload in memory. Set a reasonable file size limit in Program.cs: builder.Services.Configure<FormOptions>(o => o.MultipartBodyLengthLimit = 10 * 1024 * 1024) (10 MB), and validate file size in the action before processing. Unbounded file uploads can exhaust server memory.
Locked Files
If another process holds a file open, FileStream throws IOException: "The process cannot access the file because it is being used by another process." Wrap file access in try/catch wherever locking is possible. For ASP.NET Core applications writing to log files or shared export directories, pass FileShare.ReadWrite to the FileStream constructor, and consider whether multiple processes may write concurrently — in which case a dedicated logging library (Serilog) or a queue-based export pipeline is more appropriate.
BinaryReader and BinaryWriter
// ── Write binary data ─────────────────────────────────────────────────────
using var writeStream = File.Create("data.bin");
using var writer = new BinaryWriter(writeStream);
writer.Write(42); // int (4 bytes)
writer.Write(3.14); // double (8 bytes)
writer.Write("Hello"); // length-prefixed string
writer.Write(new byte[] { 1, 2, 3 }); // raw bytes
// ── Read binary data ──────────────────────────────────────────────────────
using var readStream = File.OpenRead("data.bin");
using var reader = new BinaryReader(readStream);
int i = reader.ReadInt32(); // 42
double d = reader.ReadDouble(); // 3.14
string s = reader.ReadString(); // "Hello"
byte[] b = reader.ReadBytes(3); // { 1, 2, 3 }
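As the locked-files guidance above suggests, passing FileShare.ReadWrite lets other processes keep the file open while you append. A minimal sketch (the app.log filename is illustrative):

```csharp
using System;
using System.IO;
using System.Text;

// FileShare.ReadWrite: other handles may read or write this file concurrently
await using var stream = new FileStream(
    "app.log", FileMode.Append, FileAccess.Write, FileShare.ReadWrite);
await using var writer = new StreamWriter(stream, Encoding.UTF8);
await writer.WriteLineAsync($"{DateTime.UtcNow:O} export completed");
```

Note that FileShare controls what *other* handles are allowed to do, not what this one does; coordinating concurrent writers still requires application-level care (or a logging library).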
FileInfo — Rich Metadata
// FileInfo provides metadata without opening the file
var info = new FileInfo("uploads/report.pdf");
Console.WriteLine($"Name: {info.Name}");
Console.WriteLine($"Directory: {info.DirectoryName}");
Console.WriteLine($"Extension: {info.Extension}");
Console.WriteLine($"Size: {info.Length / 1024.0 / 1024.0:F2} MB");
Console.WriteLine($"Created: {info.CreationTimeUtc:O}");
Console.WriteLine($"Modified: {info.LastWriteTimeUtc:O}");
Console.WriteLine($"ReadOnly: {info.IsReadOnly}");
Console.WriteLine($"Exists: {info.Exists}");
// Common validation in file upload handlers
private static readonly HashSet<string> AllowedExtensions =
    new(StringComparer.OrdinalIgnoreCase) { ".jpg", ".jpeg", ".png", ".webp", ".pdf" };

public bool IsValidUpload(FileInfo file) =>
    file.Exists
    && file.Length <= 10 * 1024 * 1024        // max 10MB
    && AllowedExtensions.Contains(file.Extension); // comparer is already case-insensitive
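Putting the upload guidance together, here is a hedged sketch of a controller action that validates size up front and streams to disk (UploadsController, the "uploads" destination, and MaxBytes are illustrative names, not from the original):

```csharp
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/[controller]")]
public class UploadsController : ControllerBase
{
    private const long MaxBytes = 10 * 1024 * 1024; // 10MB, matching the FormOptions limit
    private const string UploadRoot = "uploads";    // illustrative destination directory

    [HttpPost]
    public async Task<IActionResult> Upload(IFormFile file)
    {
        if (file is null || file.Length == 0) return BadRequest("No file provided.");
        if (file.Length > MaxBytes) return BadRequest("File too large.");

        Directory.CreateDirectory(UploadRoot);
        var destination = Path.Combine(UploadRoot, Path.GetRandomFileName());

        // Stream the upload to disk — never buffer the whole file in memory
        await using var target = System.IO.File.Create(destination);
        await file.CopyToAsync(target);

        return Ok(new { destination });
    }
}
```

Path.GetRandomFileName() avoids trusting the client-supplied file name; validate the extension separately (e.g., with the AllowedExtensions check above) before accepting the upload.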
Common Mistakes
Mistake 1 — Splitting CSV on comma without handling quoted fields
❌ Wrong — breaks on "Smith, John",john@example.com,Admin:
var parts = line.Split(','); // splits "Smith, John" into two parts!
✅ Correct — use CsvHelper or implement RFC 4180-compliant parsing for production CSV.
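For illustration, a minimal single-line splitter in the spirit of RFC 4180 — a sketch only: it handles quoted fields and doubled quotes, but not fields that span multiple lines:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

static List<string> SplitCsvLine(string line)
{
    var fields = new List<string>();
    var sb = new StringBuilder();
    bool inQuotes = false;

    for (int i = 0; i < line.Length; i++)
    {
        char c = line[i];
        if (inQuotes)
        {
            if (c == '"' && i + 1 < line.Length && line[i + 1] == '"')
            {
                sb.Append('"'); i++;             // "" inside quotes is an escaped quote
            }
            else if (c == '"') inQuotes = false; // closing quote
            else sb.Append(c);                   // commas inside quotes are literal
        }
        else if (c == '"') inQuotes = true;      // opening quote
        else if (c == ',')
        {
            fields.Add(sb.ToString());
            sb.Clear();
        }
        else sb.Append(c);
    }
    fields.Add(sb.ToString());
    return fields;
}

var parts = SplitCsvLine("\"Smith, John\",john@example.com,Admin");
Console.WriteLine(string.Join(" | ", parts)); // Smith, John | john@example.com | Admin
```

This keeps "Smith, John" as one field; the naive Split(',') would have produced four.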
Mistake 2 — Not validating file size and type before processing uploads
❌ Wrong — no size check allows multi-GB uploads that exhaust server memory.
✅ Correct — validate file size and extension immediately on upload before any processing begins.