I'm developing a solution on SQL Server 2012 Express and Developer edition (with the latest Service Pack).
In my database I have a table CODES that stores codes. It has a FLAG column indicating whether a code has been printed, read or dropped, and codes are grouped by another column, LEVEL. The primary key of CODES is (CODE, LEVEL).
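Roughly, the CODES table looks like this (a simplified sketch; the real table has more columns and the exact types may differ):

CREATE TABLE dbo.CODES (
    CODE  bigint  NOT NULL,   -- type approximate
    LEVEL tinyint NOT NULL,
    FLAG  tinyint NOT NULL,   -- printed / read / dropped marker
    CONSTRAINT PK_CODES PRIMARY KEY CLUSTERED (CODE, LEVEL)
);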
I update the CODES table very quickly, and if I run SELECT COUNT(CODE) FROM CODES WHERE FLAG = 1 to count the codes read, I sometimes block that table, and once there are many rows that COUNT query drives the CPU to 100%.
So I have another table, STATISTICS, to store how many codes have been printed, read or dropped. When I update a row in the CODES table, I add 1 to the corresponding counter in STATISTICS. I have tried this in two ways:
With an UPDATE statement run right after updating the CODES table:
declare @printed bigint;

set @printed = (select CODES_PRINTED from STADISTICS where LEVEL = @level);

if (@printed is null)
begin
    insert dbo.STADISTICS (LEVEL, CODES_PRINTED) values (@level, 1);
end
else
begin
    update dbo.STADISTICS set CODES_PRINTED = (@printed + 1) where LEVEL = @level;
end
With a TRIGGER on the CODES table:
ALTER trigger [dbo].[UpdateCodesStatistics] on [dbo].[CODES]
after update
as
SET NOCOUNT ON;

if UPDATE(FLAG)
BEGIN
    declare @flag as tinyint;
    declare @level as tinyint;

    set @flag = (SELECT FLAG FROM inserted);
    set @level = (SELECT LEVEL FROM inserted);

    -- If we have printed a new code
    if (@flag = 1)
    begin
        declare @printed bigint;

        set @printed = (select CODES_PRINTED from STADISTICS where LEVEL = @level);

        if (@printed is null)
        begin
            insert dbo.STADISTICS (LEVEL, CODES_PRINTED) values (@level, 1);
        end
        else
        begin
            update dbo.STADISTICS set CODES_PRINTED = (@printed + 1) where LEVEL = @level;
        end
    end
END
But in both cases I lose data. After running my program I compare the CODES table against the STATISTICS table and the numbers don't match: STATISTICS shows fewer printed and read codes than the CODES table actually contains.
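For example, to compare printed codes I run something like this (a simplified version of the check; the real comparison covers every counter):

SELECT c.LEVEL,
       COUNT_BIG(*)    AS PRINTED_IN_CODES,
       s.CODES_PRINTED AS PRINTED_IN_STATISTICS
FROM dbo.CODES c
LEFT JOIN dbo.BATCH_STATISTICS s ON s.CODE_LEVEL = c.LEVEL
WHERE c.FLAG = 1
GROUP BY c.LEVEL, s.CODES_PRINTED;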
This is the statistics table I'm using now:
CREATE TABLE [dbo].[BATCH_STATISTICS](
[CODE_LEVEL] [tinyint] NOT NULL,
[CODES_REQUESTED] [bigint] NOT NULL CONSTRAINT [DF_BATCH_STATISTICS_CODES_REQUESTED] DEFAULT ((0)),
[CODES_PRINTED] [bigint] NOT NULL CONSTRAINT [DF_BATCH_STATISTICS_CODES_PRINTED] DEFAULT ((0)),
[CODES_READ] [bigint] NOT NULL CONSTRAINT [DF_BATCH_STATISTICS_CODES_READ] DEFAULT ((0)),
[CODES_DROPPED] [bigint] NOT NULL CONSTRAINT [DF_BATCH_STATISTICS_CODES_DROPPED] DEFAULT ((0)),
[CODES_NOREAD] [bigint] NOT NULL CONSTRAINT [DF_BATCH_STATISTICS_CODES_NOREAD] DEFAULT ((0)),
CONSTRAINT [PK_BATCH_STATISTICS] PRIMARY KEY CLUSTERED
(
[CODE_LEVEL] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
By the way, I'm updating and inserting very quickly (more than 1,200 rows per minute).
Any idea what's happening, or how I can do this better?
inserted and deleted can contain multiple (or no) rows, so idioms like set @flag = (SELECT FLAG FROM inserted) are fundamentally broken. From your description, it sounds like an indexed view could work for you instead, something like the sketch below.
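A minimal sketch (the view and index names are just examples; you could also group by FLAG, or create one view per counter):

CREATE VIEW dbo.CodesPrintedStats
WITH SCHEMABINDING
AS
SELECT LEVEL, COUNT_BIG(*) AS CODES_PRINTED
FROM dbo.CODES
WHERE FLAG = 1
GROUP BY LEVEL;
GO

and then the index that materializes it:

CREATE UNIQUE CLUSTERED INDEX IX_CodesPrintedStats
    ON dbo.CodesPrintedStats (LEVEL);
GO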
And now SQL Server will (behind the scenes) maintain this data automatically, and you don't have to write any triggers or explicitly maintain a separate table.
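One caveat since you're on Express: only Enterprise edition matches indexed views automatically, so on other editions you read the counts by referencing the view directly with the NOEXPAND hint (view name as in the sketch above):

SELECT CODES_PRINTED
FROM dbo.CodesPrintedStats WITH (NOEXPAND)
WHERE LEVEL = @level;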