Let’s assume that you have a web app deployed as an App Service in Azure. It would be great if we could utilize Log Analytics to capture its logs. Currently it can capture various types of data: Windows metrics, infrastructure logs and so on; however, it cannot capture application logs. Fortunately, by using a WebJob or an Azure Function we can push logs to OMS manually. Let’s go.
Webapp setup
First, we need to modify the application to capture logs in storage. We need to reconfigure Trace so Azure can handle its output. In an MVC or Web API application we add a listener in the configuration file:
```xml
<system.diagnostics>
  <trace>
    <listeners>
      <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=2.8.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
           name="AzureDiagnostics">
        <filter type="" />
      </add>
    </listeners>
  </trace>
</system.diagnostics>
```
Next, we go to the Azure Portal, choose our web app, go to Diagnostic Logs and enable Application Logging (Blob). We need to configure the storage connection string, so we need to create a storage account first.
The effect: whenever you log something using Trace, the entries land in a file in storage. The file is appended to and rolled each hour.
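For instance, calls like the following (a minimal sketch; the class and method names are hypothetical) end up in the blob once the listener above is in place:

```csharp
using System.Diagnostics;

public class OrderService
{
    public void PlaceOrder(int orderId)
    {
        // Picked up by the AzureDiagnostics listener and written to the blob
        Trace.TraceInformation("Placing order {0}", orderId);
        Trace.TraceError("Order {0} failed: payment declined", orderId);
    }
}
```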
Okay, we have logs stored in Azure. Now it’s time to capture them and push them to OMS.
Capturing
In order to capture the file we can simply use a WebJob or an Azure Function. We can use a storage trigger so the function is executed each time a new file appears or an existing one is modified.
Unfortunately:
- The storage trigger is not guaranteed to fire on every change.
- There may also be delays.
- We get no information about whether the file is new or modified.
- The trigger works by scanning the whole storage container, which can slow response times if you have lots of logs. The scan results are also stored in the storage account.
A good thing is the dead-letter queue: in case of failure the trigger retries up to five times, and if we are still unable to process the file, a message about it is moved to a special queue for poisoned blobs (see the sketch below).
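As a hedged sketch (the queue name `webjobs-blobtrigger-poison` is the WebJobs SDK default; the handler name is mine), you can watch that queue and at least record what failed:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;

public class PoisonHandling
{
    // The WebJobs SDK enqueues a JSON message describing the failed blob
    // (container name, blob name, ETag) on this well-known queue.
    public static void HandlePoisonBlob(
        [QueueTrigger("webjobs-blobtrigger-poison")] string message,
        TextWriter log)
    {
        // A real handler could parse the JSON and retry or alert;
        // here we only log the failure.
        log.WriteLine($"Failed to process blob: {message}");
    }
}
```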
Here is the code for pushing a file to Log Analytics:
```csharp
using System;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Runtime.Serialization;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Newtonsoft.Json;

namespace LoggingInfrastructure
{
    public class Functions
    {
        // Update customerId to your Operations Management Suite workspace ID
        static string customerId = "OMS workspace id";

        // For sharedKey, use either the primary or the secondary Connected Sources client authentication key
        static string sharedKey = "OMS key";

        // LogName is the name of the event type that is being submitted to Log Analytics
        static string LogName = "ApplicationLog";

        // You can use an optional field to specify the timestamp from the data.
        // If the time field is not specified, Log Analytics assumes the time is the message ingestion time.
        static string TimeStampField = "";

        // Send a request to the POST API endpoint
        public static void PostData(string signature, string date, string json, TextWriter log)
        {
            string url = "https://" + customerId + ".ods.opinsights.azure.com/api/logs?api-version=2016-04-01";

            HttpClient client = new System.Net.Http.HttpClient();
            client.DefaultRequestHeaders.Add("Accept", "application/json");
            client.DefaultRequestHeaders.Add("Log-Type", LogName);
            client.DefaultRequestHeaders.Add("Authorization", signature);
            client.DefaultRequestHeaders.Add("x-ms-date", date);
            client.DefaultRequestHeaders.Add("time-generated-field", TimeStampField);

            HttpContent httpContent = new StringContent(json, System.Text.Encoding.UTF8);
            httpContent.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/json");
            var result = client.PostAsync(new Uri(url), httpContent).Result;
            log.WriteLine(result.StatusCode);
        }

        // Build the API signature
        public static string BuildSignature(string message, string secret)
        {
            var encoding = new System.Text.ASCIIEncoding();
            byte[] keyByte = Convert.FromBase64String(secret);
            byte[] messageBytes = encoding.GetBytes(message);
            using (var hmacsha256 = new System.Security.Cryptography.HMACSHA256(keyByte))
            {
                byte[] hash = hmacsha256.ComputeHash(messageBytes);
                return Convert.ToBase64String(hash);
            }
        }

        // You pass the path to the blob in the BlobTrigger attribute
        public static void ProcessWebAppBlob([BlobTrigger("logging/{name}")] Stream myBlob, string name, string blobTrigger, TextWriter log)
        {
            log.WriteLine($"Handling webapp blob:{name}");

            var message = new BlobLogEntry { Path = blobTrigger };

            string connectionString = AmbientConnectionStringProvider.Instance.GetConnectionString("AzureWebJobsStorage");
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
            CloudBlobClient client = storageAccount.CreateCloudBlobClient();
            var blob = client.GetBlobReferenceFromServer(new Uri($"{client.BaseUri}{message.Path}"));

            // Split the blob into lines and wrap each one in a LogEntry
            var entries = new StreamReader(blob.OpenRead())
                .ReadToEnd()
                .Split(new[] { Environment.NewLine }, StringSplitOptions.RemoveEmptyEntries)
                .Select(l => new LogEntry { Content = l, Path = message.Path })
                .ToArray();

            PushLogEntry(entries, log);
        }

        private static void PushLogEntry(object[] entries, TextWriter log)
        {
            var json = JsonConvert.SerializeObject(entries);
            var datestring = DateTime.UtcNow.ToString("r");
            // The signature must be built over the UTF-8 byte length of the payload,
            // not the character count, or requests with non-ASCII content will be rejected
            var contentLength = System.Text.Encoding.UTF8.GetByteCount(json);
            string stringToHash = "POST\n" + contentLength + "\napplication/json\n" + "x-ms-date:" + datestring + "\n/api/logs";
            string hashedString = BuildSignature(stringToHash, sharedKey);
            string signature = "SharedKey " + customerId + ":" + hashedString;
            PostData(signature, datestring, json, log);
        }
    }

    public class BlobLogEntry
    {
        public string Path { get; set; }
    }

    public class LogEntry
    {
        public string Path { get; set; }
        public string Content { get; set; }
    }
}
```
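Once entries arrive, the Data Collector API exposes them as a custom type: the LogName above gets a _CL suffix and string fields get a _s suffix. Assuming those defaults, a legacy OMS log search for our entries looks roughly like this:

```
Type=ApplicationLog_CL
```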
You can also introduce Service Bus and push a message there after finding a new blob. Service Bus guarantees that the message will not get lost.
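A rough sketch of that idea, assuming the WebJobs Service Bus extension and a queue named logs-to-push (the queue and function names are mine):

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;

public class ReliablePush
{
    // The blob trigger only enqueues the blob path; Service Bus
    // guarantees delivery of that message even if processing fails.
    public static void EnqueueBlobPath(
        [BlobTrigger("logging/{name}")] Stream myBlob,
        string blobTrigger,
        [ServiceBus("logs-to-push")] out string queueMessage)
    {
        queueMessage = blobTrigger;
    }

    // The actual processing is driven by the durable queue.
    public static void PushFromQueue(
        [ServiceBusTrigger("logs-to-push")] string blobPath,
        TextWriter log)
    {
        // Read the blob at blobPath and call PushLogEntry,
        // exactly as ProcessWebAppBlob does above.
        log.WriteLine($"Pushing log blob: {blobPath}");
    }
}
```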
You can also use an Azure Function instead of a WebJob — it might be simpler.
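A minimal sketch of what the trigger would look like as a precompiled Azure Function (the function name is mine; the body would reuse the parsing and push logic from the WebJob above):

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class PushWebAppLogs
{
    [FunctionName("PushWebAppLogs")]
    public static void Run(
        [BlobTrigger("logging/{name}")] Stream myBlob,
        string name,
        TraceWriter log)
    {
        log.Info($"Handling webapp blob: {name}");
        // Parse the blob and push entries to Log Analytics,
        // reusing the logic from ProcessWebAppBlob above.
    }
}
```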