Let's say I have an Oracle 10g table that logs all response times on my system. How can I get a count of slow transactions by date, including zero counts for dates where no transactions were slow?
As an example:
create table response_times (system_datetime timestamp, server_name varchar2(3), response_time number(9));
insert into response_times (system_datetime, server_name, response_time)
values (to_date('01/05/2014 12:30', 'DD/MM/YYYY HH24:MI'), 'D01', 500);
insert into response_times (system_datetime, server_name, response_time)
values (to_date('01/05/2014 13:45', 'DD/MM/YYYY HH24:MI'), 'D02', 700);
insert into response_times (system_datetime, server_name, response_time)
values (to_date('01/05/2014 14:01', 'DD/MM/YYYY HH24:MI'), 'D01', 2500);
insert into response_times (system_datetime, server_name, response_time)
values (to_date('02/05/2014 07:45', 'DD/MM/YYYY HH24:MI'), 'D02', 2500);
insert into response_times (system_datetime, server_name, response_time)
values (to_date('02/05/2014 08:30', 'DD/MM/YYYY HH24:MI'), 'D02', 500);
I want to see the number of transactions where the response time was more than 2000 milliseconds, like this:
TRUNC(system_datetime)  D01  D02
======================  ===  ===
2014/05/01                1    0
2014/05/02                0    1
2014/05/03                0    0
Is there a way to retrieve this in a single SQL statement? In practice there will be millions of rows, with (hopefully) only a few slow transactions each day, and on some days there will be no slow transactions at all.
I know I could write a short PL/SQL procedure to obtain this, using a temporary table of candidate dates and a loop that issues a select count(*) for each time period, but I am hoping there is a more elegant way...
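For what it's worth, one direction I have been considering (not tested at scale, and the date range bounds are placeholders) is generating the candidate dates with a CONNECT BY row generator, outer-joining the slow transactions onto them, and pivoting the servers with conditional aggregation, something like:

```sql
-- Sketch only: generate candidate dates, left-join the slow rows,
-- and count per server with CASE (10g has no PIVOT clause).
with dates as (
  select date '2014-05-01' + level - 1 as day
  from   dual
  connect by level <= 3   -- placeholder: number of days to report
)
select d.day,
       count(case when r.server_name = 'D01' then 1 end) as d01,
       count(case when r.server_name = 'D02' then 1 end) as d02
from   dates d
left join response_times r
       on  trunc(r.system_datetime) = d.day
       and r.response_time > 2000
group by d.day
order by d.day;
```

I am not sure whether the join on trunc(system_datetime) would perform acceptably against millions of rows, or whether hard-coding one CASE per server is reasonable when servers come and go.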