Question

I'm creating a table, entrevistasTemp, inserting data into it, updating a second table with that data, and then dropping the entrevistasTemp table.

Here is an example:

CREATE TABLE [entrevistasTemp](
    [id_usuario] [int] NULL,
    [id_entrevista] [int] NULL,
    [comentarios] [varchar](300) NULL
)

INSERT [entrevistasTemp] ([id_usuario], [id_entrevista], [comentarios]) VALUES (12099, 4515, CONVERT(TEXT, N'Riesgo muy alto.  Marun Victoria, '))
INSERT [entrevistasTemp] ([id_usuario], [id_entrevista], [comentarios]) VALUES (15347, 4516, CONVERT(TEXT, N'Riesgo muy alto.  Marun Victoria, '))

UPDATE entrevistas 
    set entrevistas.comentarios = entrevistasTemp.comentarios 
    from entrevistasTemp
WHERE entrevistas.id = entrevistasTemp.id_entrevista

drop table entrevistasTemp

Is there a better way to do this?

EDIT: I'm only inserting about 4.5k rows.

Solution

Create a temporary table instead of a permanent table:

CREATE TABLE #entrevistasTemp(
    [id_usuario] [int] NULL,
    [id_entrevista] [int] NULL,
    [comentarios] [varchar](300) NULL
)

INSERT #entrevistasTemp ([id_usuario], [id_entrevista], [comentarios]) VALUES (12099, 4515, CONVERT(TEXT, N'Riesgo muy alto.  Marun Victoria, '))
INSERT #entrevistasTemp ([id_usuario], [id_entrevista], [comentarios]) VALUES (15347, 4516, CONVERT(TEXT, N'Riesgo muy alto.  Marun Victoria, '))

UPDATE entrevistas 
    SET entrevistas.comentarios = #entrevistasTemp.comentarios 
    FROM #entrevistasTemp
WHERE entrevistas.id = #entrevistasTemp.id_entrevista

DROP TABLE #entrevistasTemp
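
Note that a local temporary table (the # prefix) is dropped automatically when the session, or the stored procedure that created it, ends, so the explicit DROP is optional. If you keep it, a guarded drop avoids errors when the script is re-run in the same session (this is just a defensive sketch, not something the original code requires):

IF OBJECT_ID('tempdb..#entrevistasTemp') IS NOT NULL
    DROP TABLE #entrevistasTemp

On SQL Server 2016 and later, DROP TABLE IF EXISTS #entrevistasTemp does the same thing in one line.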

OTHER TIPS

The answer depends entirely on the size of the data being inserted and on how frequently the rows will be accessed.

If you have a large dataset, you can create a regular table, insert the data into it, add an index, use that table for any further operations, and then drop it (a sketch of the indexing step follows below).

If the data size is limited, then going with the answer by aF (the temporary table above) would be preferable.
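
For the large-dataset route, a minimal sketch of the indexing step (the index name is illustrative; the column choice assumes the UPDATE joins on id_entrevista as in the question):

-- after bulk-loading the staging table, index the join column before the UPDATE
CREATE INDEX IX_entrevistasTemp_id_entrevista
    ON entrevistasTemp ([id_entrevista])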

Even better than the temporary table (if your version of SQL Server supports it, i.e. 2005 and later) can be a table variable. Table variables tend to trigger fewer recompilations than temporary tables, involve less locking and transaction-log overhead, and are scoped to the batch, so you never need to drop them explicitly. (Despite a common misconception, they are still backed by tempdb rather than held purely in memory, and they have no statistics, so for very large row counts a temporary table can perform better.)

Code would look like this:

DECLARE @entrevistasTemp TABLE
(
    [id_usuario] [int] NULL,
    [id_entrevista] [int] NULL,
    [comentarios] [varchar](300) NULL
)

INSERT INTO @entrevistasTemp ([id_usuario], [id_entrevista], [comentarios]) VALUES (12099,
    4515, CONVERT(TEXT, N'Riesgo muy alto.  Marun Victoria, '))
INSERT INTO @entrevistasTemp ([id_usuario], [id_entrevista], [comentarios]) VALUES (15347,
    4516, CONVERT(TEXT, N'Riesgo muy alto.  Marun Victoria, '))

UPDATE entrevistas 
    SET entrevistas.comentarios = et.comentarios 
    FROM @entrevistasTemp et
WHERE entrevistas.id = et.id_entrevista
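
Since the EDIT mentions roughly 4.5k rows, it may also be worth batching the inserts: SQL Server 2008 and later accept multiple row constructors per INSERT (up to 1000 per statement), which cuts down the number of statements. A minimal sketch (the CONVERT(TEXT, ...) wrapper is dropped here because the target column is varchar(300), so a plain string literal is enough):

INSERT INTO @entrevistasTemp ([id_usuario], [id_entrevista], [comentarios])
VALUES
    (12099, 4515, N'Riesgo muy alto.  Marun Victoria, '),
    (15347, 4516, N'Riesgo muy alto.  Marun Victoria, ')
    -- ... up to 1000 row constructors per INSERT statement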

Licensed under: CC-BY-SA with attribution