Ashok Blog for SQL Learners and Beginners and Experts

Wednesday, 16 July 2014

OUTPUT clause in INSERT/UPDATE/DELETE statements

How do I use an INSERT statement's OUTPUT clause to get the identity value?




update URLDETSAMP
   set Name = 'test'
output inserted.Name;

SQL Server 2005 introduces a new T-SQL feature that lets you easily retrieve the data affected by INSERT/UPDATE/DELETE statements. This is achieved with the OUTPUT clause, which can reference columns from the inserted and deleted tables (previously available only inside triggers) or expressions. The OUTPUT clause can return results to the client, or consume them on the server into a temporary table, table variable, or permanent table. More details on the restrictions and usage of the OUTPUT clause can be found in the SQL Server Books Online documentation.
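As a minimal sketch of the "return results to the client" form, consider the following; the table and column names here are illustrative, not taken from the examples later in this post:

```sql
-- Sketch: return the generated identity value directly to the client.
create table demo ( id int identity not null primary key, val int not null );

insert into demo ( val )
output inserted.id, inserted.val   -- streamed back to the client as a result set
values ( 10 );

drop table demo;
```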
Let us now look at a common scenario and how the OUTPUT clause can be used to solve it. Using an identity column as the primary key of a table is a fairly common practice. Additionally, if other rows reference the identity column, you need to know its value explicitly before inserting the related rows. This scenario gets complicated when multiple identity values are generated by inserting multiple rows at once. In that case, there is no easy way to determine the inserted values without, for example, using a trigger that populates a temporary table with the generated identity values from the inserted table. To demonstrate this scenario, we can first look at a trigger-based implementation that works in both SQL Server 2000 and 2005:
use tempdb;
create table itest ( i int identity not null primary key, j int not null unique );
create table #new ( i int not null, j int not null );
go
create trigger insert_itest on itest after insert
as
    insert into #new ( i, j )
    select i, j
      from inserted;
go
insert into itest ( j )
select o.id from sysobjects as o;
-- Newly inserted rows and identity values:
select * from #new;

-- #new can be used now to insert into a related table:
drop table #new, itest;

This code can be rewritten in SQL Server 2005 using the OUTPUT clause as follows:

create table itest ( i int identity not null primary key, j int not null unique )
create table #new ( i int not null, j int not null)
insert into itest (j)
output inserted.i, inserted.j into #new
select o.object_id from sys.objects as o
select * from #new
drop table #new, itest;
Now from this example, you can see the integration of OUTPUT clause with existing DML syntax.
Another common scenario is auditing of data changes in a table using triggers. In this case, the trigger uses information from the inserted and deleted tables to add rows to the audit tables. The example below instead uses the OUTPUT clause in UPDATE and DELETE statements to insert rows into an audit table.
create table t ( i int not null );
create table t_audit ( old_i int not null, new_i int null );
insert into t (i) values( 1 );
insert into t (i) values( 2 );
update t
   set i  = i + 1
output deleted.i, inserted.i into t_audit
 where i = 1;
delete from t
output deleted.i, NULL into t_audit
 where i = 2;
select * from t;
select * from t_audit;
drop table t, t_audit;
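The audit rows above could just as well be captured in a table variable rather than a permanent table. A brief sketch, using an illustrative table of the same shape as t above:

```sql
-- Sketch: capturing OUTPUT rows in a table variable instead of a permanent table.
create table t2 ( i int not null );
insert into t2 (i) values ( 5 );

declare @audit table ( old_i int not null, new_i int null );

update t2
   set i = i + 1
output deleted.i, inserted.i into @audit;

select * from @audit;   -- old_i = 5, new_i = 6
drop table t2;
```

Note that the DECLARE and the UPDATE must run in the same batch, since a table variable is scoped to its batch.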
Hope you now have a good understanding of the OUTPUT clause in SQL Server 2005. It is a powerful feature that lets you eliminate triggers in some cases, or efficiently send results to the client as part of the data modification operation itself.

Thursday, 3 July 2014

What are the components of a Data Flow?

A Data Flow includes three kinds of components:
  1. Data source(s).
  2. Transformations.
  3. Destination(s).
Component 1 - Data Flow Sources

OLE DB Source: Connects to an OLE DB data source such as SQL Server, Access, Oracle, or DB2.
Excel Source: Receives data from Excel spreadsheets.
Flat File Source: Connects to a delimited or fixed-width file.
Raw File Source: Does not use a connection manager. It reads a specialized binary file format used for data that is in transit.
XML Source: Does not use a connection manager. Retrieves data from an XML document.
ADO.NET Source: Just like the OLE DB Source, but for ADO.NET-based sources.
CDC Source: Reads data out of a table that has change data capture (CDC) enabled. Used to retrieve only rows that have changed over a period of time.
ODBC Source: Reads data out of a table by using an ODBC provider instead of OLE DB.

Component 2 - Transformations

Row Transformations: Character Map, Copy Column, Data Conversion, Derived Column, OLE DB Command.
Rowset Transformations: Aggregate, Percentage Sampling, Row Sampling.
Split and Join Transformations: Conditional Split, Lookup, Merge Join, Union All.
Business Intelligence Transformations: Data Mining Query, Fuzzy Lookup, Fuzzy Grouping, Term Extraction, Term Lookup.
Script Transformations: Script Component.
Other Transformations: Audit, Cache Transform, Export Column, Import Column, Row Count.
Component 3 - Data Flow Destinations

ADO.NET Destination: Exposes data to other external processes such as a .NET application.
DataReader Destination: Allows the ADO.NET DataReader interface to consume data, similar to the ADO.NET Destination.
OLE DB Destination: Outputs data to an OLE DB data connection such as SQL Server, Oracle, or Access.
Excel Destination: Outputs data from the Data Flow to an Excel spreadsheet.
Flat File Destination: Enables you to write data to a comma-delimited or fixed-width file.
Raw File Destination: Outputs data in a binary format that can be used later as a Raw File Source. It's usually used as an intermediate persistence mechanism.
ODBC Destination: Outputs data to an ODBC data connection such as SQL Server, Oracle, or Access.
Recordset Destination: Writes the records to an ADO recordset. Once written to an object variable, it can be looped over in a variety of ways in SSIS, such as in a Script Task or a Foreach Loop Container.
SQL Server Destination: The destination that you use to write data to SQL Server. This destination has many limitations, such as only being able to write to the SQL Server instance where the SSIS package is executing. For example, if you are running a package to copy data from Server 1 to Server 2, the package must run on Server 2. This destination exists largely for backwards compatibility and should not be used.