Question

I have noticed an interesting performance change that happens at around 1.5 million inserted rows. Can someone give me a good explanation why this is happening?

The table is very simple. It consists of (bigint, bigint, bigint, bit, varbinary(max)), and I have a clustered primary key index on the first three bigint columns. As the varbinary(max) data I insert only the boolean value "true".

From that point on, performance seems pretty constant.
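For reference, the measured workload is essentially repeated single-row inserts like the sketch below (the parameter names and the exact byte written into Data are my assumptions; timings are taken per 10,000 inserts for the graph):

```sql
-- Hypothetical shape of one measured insert; parameter values are supplied by
-- the application, and Data carries just a single "true" byte.
INSERT INTO dbo.TSMDataTable (DataNodeID, TS, CTS, ICT, Data)
VALUES (@DataNodeID, @TS, @CTS, 1, 0x01);
```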

Legend: Y axis = time in ms | X axis = inserts (×10K)


I am also curious about the constant, relatively small (and sometimes very large) spikes I see on the graph.

Actual execution plan from before the spikes:


Legend:
Table I am inserting into: TSMDataTable
1. BigInt DataNodeID - foreign key
2. BigInt TS - main timestamp
3. BigInt CTS - modification timestamp
4. Bit ICT - keeps record of the last inserted value (increases read performance)
5. VarBinary(max) Data - the data payload (currently just a boolean "true")
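Putting the legend together, I assume the table looks roughly like the sketch below (the clustered primary key on the first three bigints is from the question; the constraint name and NULLability are my guesses):

```sql
-- Sketch of TSMDataTable as described above; column names come from the
-- legend, constraint name and NOT NULL choices are assumptions.
CREATE TABLE dbo.TSMDataTable
(
    DataNodeID  BIGINT         NOT NULL,  -- foreign key to the data node
    TS          BIGINT         NOT NULL,  -- main timestamp
    CTS         BIGINT         NOT NULL,  -- modification timestamp
    ICT         BIT            NOT NULL,  -- marks the last inserted value
    Data        VARBINARY(MAX) NOT NULL,  -- payload (currently a "true" flag)
    CONSTRAINT PK_TSMDataTable
        PRIMARY KEY CLUSTERED (DataNodeID, TS, CTS)
);
```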

Environment
It is local.
It is not sharing any resources.
It is a fixed-size database (large enough that it does not need to expand).
Computer: 4 cores, 8 GB RAM, 7200 rpm disk, Windows 7.
SQL Server 2008 R2 DC, processor affinity (cores 1 and 2), 3 GB memory cap.
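For completeness, the memory and affinity caps described above would typically be applied with sp_configure roughly as below (the affinity mask value 6 assumes "cores 1 and 2" means the zero-based CPUs 1 and 2; this mapping is my assumption, not stated in the question):

```sql
-- Assumed configuration matching the description; values are illustrative.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 3072;  -- 3 GB cap
EXEC sp_configure 'affinity mask', 6;              -- CPUs 1 and 2 (zero-based)
RECONFIGURE;
```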

No correct solution
