I’ve got a stored procedure that uses BULK INSERT to upload a CSV file to a database table. From what I’ve read, that’s the fastest way to load CSV data into SQL Server.
Cool, but I’ve had errors when I went to deploy to production.
Here are just a few things I’ve learned.
1. The permissions used are different depending on whether a Windows login or a SQL login calls the stored procedure.
2. If you use a Windows login to call the stored procedure, the file is read with that user's permissions (no matter what account the SQL Server service is set to run as).
3. If you use a SQL login, then the account the SQL Server service is set up to run as is what gets used.
4. BULK INSERT reads the file path from the point of view of the server SQL Server is running on, not the machine you're calling from. If the file lives on another server, make sure you use a UNC path (see the sketch after this list).
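To make point 4 concrete, here's a rough sketch of pointing BULK INSERT at a file on another machine. The share and file name (\\fileserver01\imports\data.csv) are made up for illustration, and whichever account ends up doing the read (per points 2 and 3) still needs permission on that share.

-- Hypothetical example: reading from a UNC path instead of a local drive.
-- The \\fileserver01\imports share is an assumption, not from the original setup.
BULK INSERT dbtemp
FROM '\\fileserver01\imports\data.csv'
WITH (
    FIELDTERMINATOR = ',',
    FIRSTROW = 2,
    ROWTERMINATOR = '\n'
);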
After all that I still had permissions trouble.
So I had my application copy the CSV file to a directory on the SQL Server machine first; calling the procedure to do the BULK INSERT from there worked without issue.
Below is my stored procedure. You just need to pass it the path to the file you want to import.
It will load the data into a table named dbtemp.
Make sure that table has the same fields as your source file.
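The procedure below only references two columns by name (funding and initialdate), so this is just a minimal sketch of what dbtemp could look like; the actual layout has to match whatever your CSV contains.

-- Hypothetical staging table; only the funding and initialdate column names come from the procedure below.
-- Loading everything as text keeps BULK INSERT happy with the quoted values.
CREATE TABLE dbo.dbtemp (
    funding     varchar(100),
    initialdate varchar(50)
);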
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[sp_ImportFile]
    @filename varchar(1000)
AS
BEGIN
    SET NOCOUNT ON;

    declare @sql varchar(4000)

    -- Build the BULK INSERT dynamically so the file path can be passed in as a parameter.
    -- FIELDTERMINATOR = '","' splits on the quoted-CSV delimiter; FIRSTROW = 2 skips the header row.
    select @sql = '
        BULK INSERT dbtemp
        FROM ''' + @filename + '''
        WITH (
            FIELDTERMINATOR = ''","'',
            FIRSTROW = 2,
            ROWTERMINATOR = ''\n''
        )'

    exec (@sql)

    -- The first and last columns keep a stray double quote, so strip it off.
    update dbtemp
    set funding = REPLACE(funding, '"', ''),
        initialdate = REPLACE(initialdate, '"', '')
END
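Once the file is sitting somewhere the right account can read (in my case a local folder on the SQL Server machine), calling it is just an EXEC; the path below is only an example.

-- Example call; D:\Imports\funding.csv is a made-up path on the SQL Server machine.
EXEC dbo.sp_ImportFile @filename = 'D:\Imports\funding.csv';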