What are good/important steps to take when adding Application Insights to a mature piece of software?


I have been assigned to an old application of ours, and tasked with detecting bottlenecks and improving general speed in the application.

Some context on the application stack: it's currently on .NET 7, hosted on-prem in IIS, using Entity Framework and MSSQL.

Because I have some experience with Application Insights, and there is currently no monitoring in the app, I created an Application Insights resource in Azure and did a basic configuration in our Startup.cs.
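For reference, a minimal sketch of that baseline wiring, assuming the Microsoft.ApplicationInsights.AspNetCore NuGet package; the connection string value is a placeholder:

```csharp
// Sketch of a basic Startup.cs configuration, assuming the
// Microsoft.ApplicationInsights.AspNetCore NuGet package.
public void ConfigureServices(IServiceCollection services)
{
    services.AddApplicationInsightsTelemetry(options =>
    {
        // Placeholder; normally read from appsettings.json.
        options.ConnectionString = "InstrumentationKey=<your-key>";

        // Records the SQL text of tracked database calls, not just their timings.
        options.EnableSqlCommandTextInstrumentation = true;
    });

    services.AddControllersWithViews();
}
```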

This setup gives me some basic request logging in AI, but not many of the details I am looking for.

So I wanted to ask here: what would be good steps to take from here? Mostly because I don't know all the things I don't know :) I have never set this up from scratch; I have always worked on applications that already had AI configured, so the implementation was never something I had to think about.

The main question I am asking here is how much manual work needs to be done inside the source code, or whether there are better ways to do it?

  • I want to know more about exceptions, including those caught in a try/catch/finally
  • I want to know where time is spent, split between time spent in a controller and time spent waiting for a database response.
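On the first bullet: the SDK collects unhandled exceptions automatically, but anything swallowed by a try/catch never reaches AI unless it is reported explicitly. A sketch, assuming the TelemetryClient that AddApplicationInsightsTelemetry() registers for injection (OrderService and its method are hypothetical):

```csharp
// Sketch: surfacing handled exceptions in Application Insights.
// Assumes Microsoft.ApplicationInsights; the TelemetryClient is
// registered by AddApplicationInsightsTelemetry() and can be injected.
// OrderService is a hypothetical example class.
public class OrderService
{
    private readonly TelemetryClient _telemetry;

    public OrderService(TelemetryClient telemetry) => _telemetry = telemetry;

    public void PlaceOrder()
    {
        try
        {
            // ... actual work ...
        }
        catch (Exception ex)
        {
            // Without this call, a caught exception never shows up in AI.
            _telemetry.TrackException(ex);
            throw; // or handle it, as before
        }
    }
}
```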

I am aware that I can find the most time-consuming controller actions and manually add event logging and custom timings, but there might be a better, more generic, way to do this?
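For one-off measurements, the SDK's operation API can time an arbitrary block without hand-rolled stopwatches (a sketch; "ExpensiveStep" is a made-up operation name):

```csharp
// Sketch: timing a specific block with the Application Insights
// operation API (Microsoft.ApplicationInsights.DataContracts).
// "ExpensiveStep" is a hypothetical operation name.
public void Recalculate(TelemetryClient telemetry)
{
    using (var op = telemetry.StartOperation<DependencyTelemetry>("ExpensiveStep"))
    {
        try
        {
            // ... the code being measured ...
        }
        catch (Exception ex)
        {
            op.Telemetry.Success = false;
            telemetry.TrackException(ex);
            throw;
        }
    } // duration is recorded and sent when the operation is disposed
}
```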

1 Answer

Answered by Matt:

I'd have just left a quick comment on your question but I'm reputation poor, so here we are! APM & Logging is a subject near & dear to me though.

Recommending any specific APM tool might be a bit controversial but, for your use case, I'd look into New Relic's .NET Agent. It has a lot of "out of the box" monitoring to get you started without much effort, although many others do as well.

Datadog seems to be another favorite, but they only offer a free trial and I can't speak to how their agent works. New Relic gives you 100 GB/month of data ingest free forever, with only one "full" user (three users total, but at different tiers). If it's just you getting a feel for things, that might be a good route to take. The New Relic agent will automatically monitor a lot of the common code in applications (database calls, HTTP calls, etc.) to get you started. The query language, NRQL, isn't too hard to get used to, and I believe they'll have some default charts ready for you to see what's being collected.

They also have log ingestion; you'd just need to configure a log forwarder, such as Fluentd, to send your local IIS logs to their cloud. And for bonus points: MSSQL Monitoring.

Once you've got a feel for what data is available and have identified something to focus on, you can add their API calls to the source code, or use a couple of other config options, to add more monitoring in specific places.
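As an illustration of what those API calls look like, a sketch using the agent's attribute-based API, assuming the NewRelic.Agent.Api NuGet package (namespace NewRelic.Api.Agent); the class and method names are hypothetical:

```csharp
// Sketch: custom instrumentation with the New Relic .NET agent API.
// Assumes the NewRelic.Agent.Api NuGet package and a running agent;
// ImportJob, NightlyImport, and ParseFile are hypothetical names.
using NewRelic.Api.Agent;

public class ImportJob
{
    // Starts a transaction for background work the agent
    // wouldn't otherwise see (it auto-instruments web requests).
    [Transaction]
    public void NightlyImport()
    {
        ParseFile();
    }

    // Records this method as a segment inside the surrounding
    // transaction's trace, so its time shows up separately.
    [Trace]
    private void ParseFile()
    {
        // ... work ...
    }
}
```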

There are a lot of monitoring options out there, and in the long term it's quite possible you'll find others that perform better, but the automatic monitoring and free tier seem like they'd be a good fit for you while you're getting into it.