tSQLt AssertEqualsTable - unexpected results when table schema doesn't match

StackOverflow https://stackoverflow.com/questions/17716307

03-06-2022

Question

I noticed the other day that you can write a test where there are more columns in the Actual table than in the Expected table, and the test will still pass if the data matches in the columns that exist in both.

Here is an example:

if exists(select * from INFORMATION_SCHEMA.ROUTINES where ROUTINE_SCHEMA='UnitTests_FirstTry' and ROUTINE_NAME='test_AssertEqualsTable_IgnoresExtraColumnsInActual')
begin
    drop procedure  UnitTests_FirstTry.test_AssertEqualsTable_IgnoresExtraColumnsInActual
end
go

create procedure UnitTests_FirstTry.test_AssertEqualsTable_IgnoresExtraColumnsInActual
as
begin

    IF OBJECT_ID(N'tempdb..#Expected') > 0 DROP TABLE [#Expected];
    IF OBJECT_ID(N'tempdb..#Actual') > 0 DROP TABLE [#Actual];

    create table #expected (a int null) --, b int null, c varchar(10) null)
    create table #actual (a int, x money null)

    insert #expected (a) values (1)
    insert #actual (a, x) values (1, 22.51)
    --insert #expected (a, b, c) values (1, 2, 'test')
    --insert #actual (a, x) values (1, 22.51)

    exec tSQLt.AssertEqualsTable '#expected', '#actual'

end
go

exec tSQLt.Run 'UnitTests_FirstTry.test_AssertEqualsTable_IgnoresExtraColumnsInActual'
go

I noticed this when I removed some columns from the Expected table of a test that no longer needed them, but forgot to remove the same columns from the Actual table, and my test still passed, which was a bit off-putting. This only happens when the Actual table has more columns; if the Expected table has more columns, an error is generated. Is this correct? Does anyone know the reasoning behind this behavior?


Solution

Although not particularly well documented in this respect, the AssertEqualsTable routine only compares the data in the tables - it does not check that the column definitions are the same. To check whether the table structures match, use AssertResultSetsHaveSameMetaData. I wrote a bit about this in this article.

You can of course run both in the same test, and the test will only pass if both checks pass.
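A minimal sketch of a test that runs both checks (the procedure name and table contents are illustrative, not from the original question):

```sql
create procedure UnitTests_FirstTry.test_DataAndMetaDataBothMatch
as
begin
    create table #expected (a int null);
    create table #actual   (a int null);

    insert #expected (a) values (1);
    insert #actual   (a) values (1);

    -- Fails if the column definitions of the two result sets differ
    exec tSQLt.AssertResultSetsHaveSameMetaData
        'select * from #expected',
        'select * from #actual';

    -- Fails if the rows differ
    exec tSQLt.AssertEqualsTable '#expected', '#actual';
end;
go
```

With both assertions in place, adding an extra column to #actual would now fail the metadata check even though the data in the shared columns still matches.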

I guess the reason for the split is that there may be rare cases where your test cares about either the data or the metadata being consistent, but not both.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow