Log4net integration with Google BigQuery

I'm trying to capture logs using the log4net package and store them in a Google BigQuery table. I have successfully captured the logs and stored them in a file. I can read an existing table using C#, but I'm not sure how to create a new table and upload all the logs using the Google.Apis.Bigquery.v2 package.

It would be great if someone could guide me on how to solve this.

This is the documentation I'm referring to: https://developers.google.com/resources/api-libraries/documentation/bigquery/v2/csharp/latest/classGoogle_1_1Apis_1_1Bigquery_1_1v2_1_1Data_1_1JobConfigurationQuery.html#ab086707c20c7703b9c0a3d113fc71aa7

Still unclear what to tweak.

I found an alternate way to do this.

First I stored all the logs in a JSON file using log4net.Ext.Json, then passed the path of that JSON file to the method below (LoadFromFile), which writes all the logs to Google BigQuery.

If there's a better way, please advise.

    using System;
    using System.IO;
    using Google.Apis.Auth.OAuth2;
    using Google.Cloud.BigQuery.V2;

    public void LoadFromFile(
        string projectId = "project-id",
        string datasetId = "dataset-id",
        string tableId = "table-id",
        string filePath = @"C:\Users\Documents\Logs\log.json")
    {
        var jsonPath = @"C:\Users\config.json";  // Service account credentials
        var credentials = GoogleCredential.FromFile(jsonPath);

        BigQueryClient client = BigQueryClient.Create(projectId, credentials);

        // log4net.Ext.Json writes newline-delimited JSON, so use the JSON
        // load job rather than UploadCsv. With Autodetect enabled, BigQuery
        // infers the schema and creates the table if it doesn't exist.
        var uploadJsonOptions = new UploadJsonOptions
        {
            Autodetect = true
        };

        using (FileStream stream = File.Open(filePath, FileMode.Open))
        {
            // Create and run the load job; the schema argument is null
            // because Autodetect is enabled
            BigQueryJob job = client.UploadJson(
                datasetId, tableId, null, stream, uploadJsonOptions);
            job.PollUntilCompleted();  // Waits for the job to complete

            // Display the number of rows uploaded
            BigQueryTable table = client.GetTable(datasetId, tableId);
            Console.WriteLine(
                $"Loaded {table.Resource.NumRows} rows to {table.FullyQualifiedId}");
        }
    }
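Since the original question also asked how to create a new table, here is a minimal sketch using the higher-level Google.Cloud.BigQuery.V2 wrapper (which sits on top of Google.Apis.Bigquery.v2) to create a table with an explicit schema instead of relying on auto-detection. The project/dataset/table IDs, credential path, and field names below are placeholders:

```csharp
using System;
using Google.Apis.Auth.OAuth2;
using Google.Cloud.BigQuery.V2;

public void CreateLogTable(
    string projectId = "project-id",
    string datasetId = "dataset-id",
    string tableId = "table-id")
{
    var credentials = GoogleCredential.FromFile(@"C:\Users\config.json");
    BigQueryClient client = BigQueryClient.Create(projectId, credentials);

    // Create the dataset if it doesn't exist yet
    BigQueryDataset dataset = client.GetOrCreateDataset(datasetId);

    // Define an explicit schema for the log rows (field names are examples)
    TableSchema schema = new TableSchemaBuilder
    {
        { "timestamp", BigQueryDbType.Timestamp },
        { "level", BigQueryDbType.String },
        { "logger", BigQueryDbType.String },
        { "message", BigQueryDbType.String }
    }.Build();

    BigQueryTable table = dataset.CreateTable(tableId, schema);
    Console.WriteLine($"Created {table.FullyQualifiedId}");
}
```

Creating the table up front gives you fixed column types, which avoids surprises from schema auto-detection when individual log lines omit fields.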


Solution 1

Log4net cannot be integrated directly with BigQuery to store the logs. Instead, log4net can be integrated with Cloud Logging using Google.Cloud.Logging.Log4Net, a .NET client library you can install from NuGet and add to the project. Once the logs are in Cloud Logging, you can create a sink in the project to route them, with the sink's destination set to a BigQuery dataset. The logs then no longer need to be saved to separate files; they are routed to BigQuery automatically.
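For reference, the Cloud Logging route mainly needs an extra appender in the log4net configuration. A sketch based on the Google.Cloud.Logging.Log4Net appender, where the project ID, log ID, and pattern are placeholders:

```xml
<log4net>
  <appender name="CloudLogger"
            type="Google.Cloud.Logging.Log4Net.GoogleStackdriverAppender,Google.Cloud.Logging.Log4Net">
    <layout type="log4net.Layout.PatternLayout">
      <conversionPattern value="%-5level %logger - %message" />
    </layout>
    <projectId value="project-id" />
    <logId value="application-log" />
  </appender>
  <root>
    <level value="ALL" />
    <appender-ref ref="CloudLogger" />
  </root>
</log4net>
```

A sink routing these entries to BigQuery can then be created in the Google Cloud console or with `gcloud logging sinks create`, using a BigQuery dataset as the destination.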

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Shipra Sarkar