In SQL Server, what's the best way to split multiple delimited strings into rows without a function?
16-03-2021
Question
I know there are numerous examples of how to split strings in SQL Server. However, they are often for a single delimited string. What's the best way to split multiple delimited strings without using a function (SQL Server 2016)?
Here's what I'm starting with:
GroupId     Val1      Val2
----------- --------- -------
G1          1,2,3     a,b,c
G2          4,5,6     d,e,f
G3          1,2,3,4   w,x,y,z
Here's what I would like to end up with:
GroupId Val1 Val2
------- ---- ----
G1      1    a
G1      2    b
G1      3    c
G2      4    d
G2      5    e
G2      6    f
G3      1    w
G3      2    x
G3      3    y
G3      4    z
Based on everything I've read, I think I may need to use the XML method because it allows me to specify an order, but I haven't been able to find an example of this method where multiple columns are split.
Here's the T-SQL to create the source table:
DROP TABLE IF EXISTS #TestData;
CREATE TABLE #TestData(GroupId VARCHAR(10),
Val1 VARCHAR(100),
Val2 VARCHAR(100)
);
INSERT INTO #TestData
VALUES ('G1', '1,2,3', 'a,b,c'),
('G2', '4,5,6', 'd,e,f'),
('G3', '1,2,3,4', 'w,x,y,z');
SELECT *
FROM #TestData;
Thank you in advance!
Solution
Transform each delimited string into a JSON array and use OPENJSON, which returns every array element together with its zero-based position in the key column, so element order is preserved. For example, this
select *
from openjson('["w","x","y","z"]')
outputs
key  value
---- -----
0    w
1    x
2    y
3    z
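Producing that array literal is just string surgery on the comma-separated value; a minimal sketch of the conversion the query below relies on:

```sql
-- wrap in brackets/quotes and turn every comma into "," :
-- 'w,x,y,z' -> '["w","x","y","z"]'
select concat('["', replace('w,x,y,z', ',', '","'), '"]') as JsonArray;
-- returns ["w","x","y","z"]
```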
So, applied to the sample data:
DROP TABLE IF EXISTS #TestData;
CREATE TABLE #TestData(GroupId VARCHAR(10),
Val1 VARCHAR(100),
Val2 VARCHAR(100)
);
INSERT INTO #TestData
VALUES ('G1', '1,2,3', 'a,b,c'),
('G2', '4,5,6', 'd,e,f'),
('G3', '1,2,3,4', 'w,x,y,z');
with q as
(
    -- wrap each comma-separated value in a JSON array literal: '1,2,3' -> '["1","2","3"]'
    SELECT GroupId,
           concat('["', replace(Val1, ',', '","'), '"]') Val1,
           concat('["', replace(Val2, ',', '","'), '"]') Val2
    FROM #TestData
)
select GroupId, v1.value Val1, v2.value Val2
from q
cross apply openjson(Val1) v1
cross apply openjson(Val2) v2
where v1.[key] = v2.[key]; -- pair up elements by their position in each array
outputs
GroupId Val1 Val2
------- ---- ----
G1      1    a
G1      2    b
G1      3    c
G2      4    d
G2      5    e
G2      6    f
G3      1    w
G3      2    x
G3      3    y
G3      4    z
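One caveat not covered above: the concat/replace step assumes the values never contain characters that are special in JSON (double quotes, backslashes, control characters). If they can, STRING_ESCAPE (available since SQL Server 2016) can be applied to each value first; a sketch with a deliberately awkward literal:

```sql
-- escape JSON-special characters before building the array literal
select concat('["', replace(string_escape('a"b,c\d', 'json'), ',', '","'), '"]') as JsonArray;
-- produces ["a\"b","c\\d"], which openjson reads back as a"b and c\d
```

Note also that the key-equality join pairs elements strictly by position, so if Val1 and Val2 ever hold different numbers of elements for a group, the unmatched elements are silently dropped.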
Licensed under: CC-BY-SA with attribution
Not affiliated with dba.stackexchange