Original post:

I have a SQL Server master table, and a SAS dataset of transactions which are deletions to be applied. I upload my transaction dataset to SQL Server, then use explicit pass-through to execute the SQL Server MERGE statement:

    *SQL master table (sql libref points to the SQL Server database);
    *SAS transaction dataset of records to delete from the SQL data;
    *PROC SQL to upload transaction table to SQL, then execute the SQL Server merge;
      ... on (qMaster.id1=qTrans.id1 and qMaster.id2=qTrans.id2) ...

I feel like when I upload qTrans to SQL Server I should make it a temporary table, but I couldn't get it to work. What about if I made it #qTrans? I tried create table sql.'#qTrans'n but couldn't get it to work, and I don't know anything about SQL Server temp tables; would #qTrans persist long enough to be seen by my EXECUTE statement? I suppose I don't have much to gain by making it an official temporary table rather than just dropping the uploaded table at the end, but figured it was worth a shot. Any other recommendations for improvements to this approach?

Follow-up from the original poster:

Thanks, I hadn't seen that page of the docs. Looks like adding connection=shared to the libname statement was the key, and I had to use the #tableName form to get it to work. I suspect it's because I use implicit pass-through to write the SAS transaction table to SQL Server, then explicit pass-through to execute the MERGE statement, so to SQL Server I may have been two different sessions. Looks like the below works:

    libname sql odbc dsn= user="." password="." schema=dbo connection=shared;

    *SAS dataset of records to delete from the SQL data;
    *upload work.qTrans to SQL Server as a temporary table;
    create table sql.'#qTrans'n (insertbuff=1000) as ...
      ... on (qMaster.id1=#qTrans.id1 and qMaster.id2=#qTrans.id2) ...

Still happy to hear any suggested improvements.

Reply:

I was in the midst of replying to the OP when the solution(s) were posted. However, since I put some work into this, here goes; apologies if some of this is redundant.

First of all, have a look at [link]. It's a bit long, but it goes into detail about temp tables and table variables.

Second, do this in SQL Server Management Studio. In a first query window (with tempdb as the current database, so table1 lands in tempdb), create a permanent table, a temporary table, and a global temporary table; only fragments of the original statements survived, but they selected name and database_id, which points to sys.databases:

    SELECT name, database_id INTO table1   FROM sys.databases;
    SELECT name, database_id INTO #table2  FROM sys.databases;
    SELECT name, database_id INTO ##table3 FROM sys.databases;

Then, in both that window and a second query window, run:

    SELECT * FROM tempdb.INFORMATION_SCHEMA.Tables WHERE table_name LIKE '%table%'

In the second query window, you can see that table1, #table2, and ##table3 all exist, but you can only access table1 and ##table3; #table2 can only be accessed by the process ID that created it. (You can query your process ID, but it's also displayed next to your userid in SSMS.)

Now, close the first query window, and resubmit the code in the second query window. You'll see that #table2 and ##table3 have disappeared. So, temporary tables persist only for the life of the process ID that created them, and global temporary tables can be accessed by other process IDs, but again *they only exist during the life of the process ID that created them*.

With that in mind, submit the below code in SAS EG (modify as required):

    Options sastrace=',,,d' sastraceloc=saslog nostsuffix;
    Libname TMP odbc NOPROMPT="Driver=;Server=XXXXXXXX;Database=TempDB;Trusted_Connection=yes" bulkload=yes schema=dbo;

My conclusion is that the connections from SAS to SQL Server are transient, with each connection resulting in a new process ID. This is then problematic for the reasons above: a temp table created on one connection is already gone when the next statement arrives on a new connection.
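The session-scoping behavior in the SSMS experiment can be demonstrated without a SQL Server instance. The sketch below uses SQLite from Python's standard library as an analogy (not SQL Server itself): SQLite TEMP tables are private to the connection that created them, much as SQL Server #temp tables are private to a session, and a second cursor on the same connection, which is roughly what connection=shared gives the SAS libref, can still see them. All names here are made up for illustration.

```python
import sqlite3

# Two separate connections to one shared in-memory database: analogous to
# two SSMS query windows, i.e. two SQL Server process IDs.
uri = "file:tempscope_demo?mode=memory&cache=shared"
conn1 = sqlite3.connect(uri, uri=True, isolation_level=None)  # "query window 1"
conn2 = sqlite3.connect(uri, uri=True, isolation_level=None)  # "query window 2"

conn1.execute("CREATE TABLE table1 (name TEXT)")      # permanent: visible to all
conn1.execute("CREATE TEMP TABLE temp2 (name TEXT)")  # temp: this connection only

# Session 2 sees the permanent table...
rows = conn2.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()

# ...but the temp table is invisible to it, just like #table2 in the
# second query window above.
try:
    conn2.execute("SELECT * FROM temp2")
    temp_visible_elsewhere = True
except sqlite3.OperationalError:
    temp_visible_elsewhere = False

# A second cursor on the SAME connection (the effect connection=shared has
# on the SAS side: one session for both the upload and the EXECUTE) can
# use the temp table without trouble.
n = conn1.execute("SELECT count(*) FROM temp2").fetchone()[0]

print(rows, temp_visible_elsewhere, n)
```

Closing conn1 would drop temp2 entirely, which mirrors the thread's conclusion: the temp table lives exactly as long as the session that created it.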
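The overall upload-then-merge pattern from the thread can also be sketched in miniature. SQLite has no MERGE statement, so a DELETE ... WHERE EXISTS stands in for SQL Server's MERGE ... WHEN MATCHED THEN DELETE; the table and column names (qMaster, qTrans, id1, id2) follow the thread, while the data itself is invented for the example. Everything runs on a single connection, which is the point of connection=shared.

```python
import sqlite3

# One connection = one session for both the "upload" and the delete.
conn = sqlite3.connect(":memory:")

# The master table (what the sql libref points at in the thread).
conn.execute("CREATE TABLE qMaster (id1 INTEGER, id2 INTEGER, val TEXT)")
conn.executemany("INSERT INTO qMaster VALUES (?, ?, ?)",
                 [(1, 1, "a"), (1, 2, "b"), (2, 1, "c"), (2, 2, "d")])

# "Upload" the transaction keys as a temp table, the role played by
# create table sql.'#qTrans'n on the SAS side.
conn.execute("CREATE TEMP TABLE qTrans (id1 INTEGER, id2 INTEGER)")
conn.executemany("INSERT INTO qTrans VALUES (?, ?)", [(1, 2), (2, 1)])

# Apply the deletions set-based, with the same join condition as the
# MERGE in the thread.
conn.execute("""
    DELETE FROM qMaster
    WHERE EXISTS (SELECT 1 FROM qTrans
                  WHERE qMaster.id1 = qTrans.id1
                    AND qMaster.id2 = qTrans.id2)""")

remaining = conn.execute(
    "SELECT id1, id2 FROM qMaster ORDER BY id1, id2").fetchall()
print(remaining)  # [(1, 1), (2, 2)]
```

Because qTrans is a temp table on the same connection, nothing needs to be dropped afterwards; it disappears when the session ends.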