Application Insights uses a specific folder structure for the exported JSON files: Application Insights resource name + GUID/telemetry type/date/hour/. Each filename contains a unique GUID plus the date and time. A single JSON file can hold one or more telemetry items (pageview, event, dependency, and so on).
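To make the layout concrete, here is a small Python sketch that pulls the pieces out of a blob path following that structure and parses the file contents as newline-delimited JSON (one telemetry item per line). The resource name, GUIDs, and event names below are made up for illustration.

```python
import json

# Hypothetical blob path following the export layout:
# <resource name>_<guid>/<telemetry type>/<date>/<hour>/<guid>.blob
path = ("myappinsights_0123abcd-0000-0000-0000-000000000000/"
        "Event/2016-05-01/17/fff00f0f-0000-0000-0000-000000000000.blob")

parts = path.split("/")
resource_dir, telemetry_type, date, hour, filename = parts
resource_name = resource_dir.split("_")[0]
print(resource_name, telemetry_type, date, hour)  # myappinsights Event 2016-05-01 17

# A file can hold multiple telemetry items: one JSON document per line.
blob_content = ('{"event":[{"name":"ButtonClicked"}]}\n'
                '{"event":[{"name":"PageSaved"}]}')
items = [json.loads(line) for line in blob_content.splitlines() if line]
print(len(items))  # 2
```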

There is no limit on the amount of data that is exported. The only limit is the maximum size of the Azure Storage account, and that goes up to 500 TB. It’s not likely that you will hit that limit by exporting telemetry data.

Use a Stream Analytics job to push exported telemetry data to a SQL database

The continuous export feature of Application Insights lets us keep telemetry data for longer than 90 days. All this exported telemetry data is of little use if we don’t do anything with it. Exporting is one thing, but the next step is to do something with this data. That is where an Azure Stream Analytics job steps in. A Stream Analytics job can process data in real time: the moment a new event, file, or other piece of data appears in the input data source, it is processed by the job. The job lets us define an input and an output data source; it’s basically a way of copying data efficiently between resources. It also allows us to write a query that explicitly states which data should be transferred to the output data source. A Stream Analytics job is built up of the following parts:

Input: the data source; options are Event Hubs, Blob storage, or IoT Hub.
Function: functions that can be called from the query, e.g. to transform data.
Query: a query in the Stream Analytics query language that selects the data from the input data source.
Output: the data source to push the data to; options are SQL database, Blob storage, Event Hubs, Table storage, Service Bus queue, Service Bus topic, Cosmos DB, Power BI, Data Lake Store, and Azure Functions.
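As a sketch of how these parts fit together, the query below copies exported custom events from a blob input to a SQL database output. The input/output aliases and column names are placeholders, and the property paths assume the Application Insights export schema:

```sql
-- Flatten exported custom events (one row per event) from the blob
-- input into the SQL database output. Aliases are placeholders.
SELECT
    A.internal.data.id AS [Id],
    A.context.data.eventTime AS [EventTime],
    E.ArrayValue.name AS [EventName]
INTO
    [sql-output]
FROM
    [blob-input] A
CROSS APPLY GetArrayElements(A.[event]) AS E
```

`GetArrayElements` is needed because each exported document stores its events as a JSON array; `CROSS APPLY` turns every array element into its own output row.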

Like I said earlier, telemetry data is stored as JSON files in an Azure Storage location. Below is an example of an exported custom event.