Statistics and Natively Compiled Procedures

SQL Server 2014, TSQL
Statistics are among the single most important driving factors behind the behavior of the query optimizer. The cardinality estimates stored within the statistics drive costing, and costing drives the decision making of the optimizer. So, how does this work with the new SQL Server 2014 natively compiled procedures? Differently. In-memory tables do not maintain their statistics automatically. Further, you can't run DBCC SHOW_STATISTICS to get information about those statistics, so you can't tell whether they're out of date or what the distribution of the data within them looks like. So, if I create some memory-optimized tables, skip loading any data into them, and then run a standard query against them, the estimates in the execution plan for the query show some pretty wild values: That's an estimated number of rows…
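The code sample here was lost to a plugin placeholder. As a rough, hypothetical sketch of the setup described (the table and column names are illustrative, not the post's actual code), a memory-optimized table in SQL Server 2014 and a standard interop query against it might look like this:

```sql
-- Hypothetical memory-optimized table (SQL Server 2014 syntax);
-- names are invented for illustration.
CREATE TABLE dbo.Address
(
    AddressID INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1024),
    City NVARCHAR(30) NOT NULL,
    StateProvinceID INT NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

-- A standard (interop) query against the empty table. With no data
-- loaded and no automatically maintained statistics, the optimizer's
-- row estimates for this query can be wildly off.
SELECT a.City,
       a.StateProvinceID
FROM dbo.Address AS a
WHERE a.StateProvinceID = 42;
```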
Read More

The Utility of Execution Plans in Natively Compiled Procedures

SQL Server 2014, TSQL
I'm actually having problems identifying the utility of execution plans when working with natively compiled procedures. Or, put another way: why bother? I've posted a couple of times on natively compiled procedures and SQL Server execution plans. I've found the differences interesting and enlightening, but I'm seriously questioning why I should bother, at least currently. I'm sure there will be many changes to the behavior of natively compiled procedures and their relationship with execution plans. But right now, well, let's look at an example. I have three simple tables stored in-memory, and a natively compiled procedure that queries them. When I call for an estimated plan (remember, there are no actual plans), I'll get this: If you click on it,…
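The table definition and procedure body here were lost to plugin placeholders. As a hedged sketch of the pattern described (names and columns are my own invention, not the post's), an in-memory table and a minimal natively compiled procedure in SQL Server 2014 might look like this:

```sql
-- Hypothetical in-memory table; the post's actual definition is not shown.
CREATE TABLE dbo.Parent
(
    ParentID INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1024),
    ParentName NVARCHAR(50) NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
GO

-- A minimal natively compiled procedure. SQL Server 2014 requires
-- NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS, and an ATOMIC block
-- with an isolation level and language specified.
CREATE PROCEDURE dbo.GetParent (@ParentID INT)
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH
    (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')

    SELECT p.ParentID,
           p.ParentName
    FROM dbo.Parent AS p
    WHERE p.ParentID = @ParentID;
END;
```

Requesting an estimated plan for a procedure like this (for example, via SET SHOWPLAN_XML) is the only plan you can get; natively compiled procedures do not produce actual execution plans.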
Read More