Checksum Transformation

by Darren Green 6 Sep 2014 15:38

The Checksum Transformation computes a hash value, the checksum, across one or more columns, returning the result in the Checksum output column. The transformation provides functionality similar to the T-SQL CHECKSUM function, but is encapsulated within SQL Server Integration Services, for use within the pipeline without code or a SQL Server connection.

As featured in The Microsoft Data Warehouse Toolkit by Joy Mundy and Warren Thornthwaite from the Kimball Group. Have a look at the book samples, especially the sample package for custom SCD handling.

All input columns are passed through the transformation unaltered; those selected are used to generate the checksum, which is passed out through a single output column, Checksum. This does not restrict the number of columns available downstream from the transformation, as columns always flow through a transformation. The Checksum output column is in addition to all existing columns within the pipeline buffer.

The Checksum Transformation uses an algorithm based on the .NET Framework GetHashCode method; it is not consistent with the T-SQL CHECKSUM() or BINARY_CHECKSUM() functions. The transformation does not support the following Integration Services data types: DT_NTEXT, DT_IMAGE and DT_BYTES.

ChecksumAlgorithm Property

The ChecksumAlgorithm property is defined with an enumeration. It was first added in v1.3.0, when the FrameworkChecksum algorithm was added. All previous algorithms are still supported for backward compatibility as ChecksumAlgorithm.Original (0).

  • Original - Original checksum function, with known issues around column separators and null columns. This was deprecated in the first SQL Server 2005 RTM release.
  • FrameworkChecksum - The hash function is based on the .NET Object.GetHashCode() method for object types, which unfortunately differs between x86 and x64 systems. For that reason we now default to the CRC32 option.
  • CRC32 - Using a standard 32-bit cyclic redundancy check (CRC), this provides a more open implementation.
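Only CRC-32 is defined independently of the .NET runtime, which is why it is architecture-stable. As an illustration only (this is not the component's actual source), a row-level checksum in the style of the CRC32 option can be sketched with Python's standard binascii module, chaining the CRC across the selected columns:

```python
import binascii

def row_checksum(values):
    """CRC-32 across a row's selected column values.

    Each value is serialized to a common string form and fed into
    the running CRC in column order, so identical data always
    yields the same 32-bit result on any processor architecture.
    """
    crc = 0
    for value in values:
        data = ("" if value is None else str(value)).encode("utf-8")
        crc = binascii.crc32(data, crc)  # continue the CRC across columns
    return crc & 0xFFFFFFFF  # force an unsigned 32-bit result

# Identical inputs give identical checksums, but column order matters.
assert row_checksum(["ABC"]) == row_checksum(["ABC"])
assert row_checksum(["aa", "AA"]) != row_checksum(["AA", "aa"])
```

Note the second assertion: because the values are hashed in sequence, the same columns selected in a different order produce a different checksum, which matches the column-ordering behaviour reported in the comments below.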

The component is provided as an MSI file; however, to complete the installation you will have to add the transformation to the Visual Studio toolbox by hand. This process is described in detail in the related FAQ entry How do I install a task or transform component? - just select Checksum from the SSIS Data Flow Items list in the Choose Toolbox Items window.

Downloads

The Checksum Transformation is available for SQL Server 2005, 2008 (R2), 2012 and 2014. Please choose the version to match your SQL Server version, or you can install multiple versions and use them side by side if you have more than one version of SQL Server installed.

Checksum Transformation for SQL Server 2005

Checksum Transformation for SQL Server 2008

Checksum Transformation for SQL Server 2012

Checksum Transformation for SQL Server 2014

Version History

SQL Server 2014

Version 4.0.0.28 – SQL Server 2014 release. Includes upgrade support for 2005, 2008 and 2012 packages to 2014.
(6 Sep 2014)

SQL Server 2012

Version 3.0.0.28 – Fixed issue which caused reuse of the 2008 UI assembly. No runtime impact, but it prevented the UI from working on a 2012-only install.
(23 Feb 2013)

Version 3.0.0.27 – SQL Server 2012 release. Includes upgrade support for both 2005 and 2008 packages to 2012.
(5 Jun 2012)

SQL Server 2008

Version 2.0.0.27 – Fix for the CRC-32 algorithm that inadvertently made it sort dependent. Fix for a race condition which sometimes led to the error Item has already been added. Key in dictionary: '79764919'. Fix for upgrade mappings between 2005 and 2008.
(19 Oct 2010)

Version 2.0.0.24 - SQL Server 2008 release. Introduces the new CRC-32 algorithm, which is consistent across x86 and x64. The default algorithm is now CRC32.
(29 Oct 2008)

Version 2.0.0.6 - SQL Server 2008 pre-release. This version was released by mistake as part of the site migration, and had known issues.
(20 Oct 2008)

SQL Server 2005

Version 1.5.0.43 – Fix for the CRC-32 algorithm that inadvertently made it sort dependent. Fix for a race condition which sometimes led to the error Item has already been added. Key in dictionary: '79764919'.
(19 Oct 2010)

Version 1.5.0.16 - Introduces the new CRC-32 algorithm, which is consistent across x86 and x64. The default algorithm is now CRC32.
(20 Oct 2008)

Version 1.4.0.0 - Installer refresh only.
(22 Dec 2007)

Version 1.4.0.0 - Refresh for minor UI enhancements.
(5 Mar 2006)

Version 1.3.0.0 - SQL Server 2005 RTM. The checksum algorithm has changed to improve cardinality when calculating multiple column checksums. The original algorithm is still available for backward compatibility. Fixed custom UI bug with Output column name not persisting.
(10 Nov 2005)

Version 1.2.0.1 - SQL Server 2005 IDW 15 June CTP. A user interface is provided, as well as the ability to change the checksum output column name.
(29 Aug 2005)

Version 1.0.0 - Public Release (Beta).
(30 Oct 2004)

Screenshot

Checksum Transformation Editor dialog

Comments (46)

11/7/2008 8:15:18 AM #

Arjan Fraaij

Great task to have.

Just one question: is it guaranteed that the returned checksum is always unique?

This was also said of the T-SQL 2000 BINARY_CHECKSUM(), but in reality this is not the case:
SELECT BINARY_CHECKSUM('aa','AA','Arjan')
-----------
4225134

SELECT BINARY_CHECKSUM('BQ','AA','Arjan')
-----------
4225134

Of course the chance that this occurs is limited by the number of columns you use in the checksum, but the chance still exists.

Kind regards,
Arjan Fraaij

Arjan Fraaij Netherlands

11/7/2008 10:35:40 AM #

Darren Green

Checksums like this that produce a 32-bit integer by their very nature cannot be guaranteed to be unique. You only have 4 bytes to store your result, which limits the number of possible values, so collisions are bound to occur. For example, using a checksum to differentiate between rows in a large table would be very risky; using it to detect changes between old and new row values is much more appropriate, but you must still cater for the risk of collisions. The use of the hash for index scenarios is also reasonable, as there the checksum is only used as part of the search criteria to improve the initial search performance.

For a practically unique hash, use a function such as MD5 or another cryptographic hash function. You will however produce a much larger value (an MD5 digest alone is 16 bytes, more once string-encoded), so storage requirements immediately increase, and search and match performance will decrease. The choice is yours.

Darren Green United Kingdom
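[Editor's note: Darren's collision point can be quantified with the standard birthday approximation. This generic sketch is not code from the component; it just shows why a 32-bit checksum collides far sooner than intuition suggests.]

```python
import math

def collision_probability(n_rows, bits=32):
    """Birthday approximation: probability that at least two of
    n_rows uniformly random hash values collide in a bits-bit space."""
    space = 2.0 ** bits
    # P(collision) ~= 1 - exp(-n(n-1) / (2 * space))
    return 1.0 - math.exp(-n_rows * (n_rows - 1) / (2.0 * space))

# Around 77,000 rows, the odds of at least one 32-bit collision
# already reach roughly 50%; by 10 million rows it is near-certain.
assert abs(collision_probability(77_000) - 0.5) < 0.01
assert collision_probability(10_000_000) > 0.99
```

This is why the advice above distinguishes use cases: detecting changes between old and new values of the *same* row is safe-ish, while distinguishing all rows of a large table by checksum alone is not.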

12/19/2008 9:17:02 PM #

K Boone

It seems that the CRC32 algorithm mistakenly considers the row number when generating its checksum.  To test this, I created a SQL table named TestTable with identity column ID and string column TestData.  I added a bunch of records with the value 'ABC' for TestData in each record.  

I used this table as an OLE DB source sorted by ID ascending and then selected only column TestData as an input column when running it through the Checksum Transformation.  Each row produced a different checksum.  Next, I sorted by ID descending in the source, and again each row produced a different checksum.  Comparing both results, from top to bottom the checksums matched.  That is, row #1 produced the same checksum both times.

Now when I changed from CRC32 to either of the other options, as expected every row got the same checksum.  

Any idea what's going on here?

K Boone United States

12/23/2008 2:36:09 PM #

Robert G.

I discovered the same "bug" myself. Since the other two options (Original and FrameworkChecksum) have known issues I want to avoid using them, but this issue with the CRC32 option makes it impossible to use. I have briefly reviewed the code (I'm not a .NET programmer by any means), but it seems the stream that you are pushing to the hash function is somehow picking up the internal row number on each row that it processes.

I can't imagine this is by design, unless I'm missing something here.

Robert G. United States

1/28/2009 3:25:33 AM #

Hendra

Hi,

Is the Checksum Transformation compatible with a 64-bit environment? Any comments please...

Thanks.

Hendra Indonesia

3/23/2009 7:31:39 PM #

Noemi

Hello,

I just wanted to share some information about the possible CRC32 bug. I'm running on SQL 2008 (x64) and I've installed the latest Checksum release for this SQL Server version. However, be aware that if you are using the Checksum with CRC32 to detect row-level changes in combination with a Conditional Split, you will see that from the moment a row is identified as different, the rest of the rows that come after that one will also be identified as different (which doesn't happen with the "Original" version). I'm not sure if this is an issue that someone is working on - does anyone know?

Noemi United Kingdom

3/26/2009 3:19:44 PM #

Ryan

I concur with Noemi.  I've tried instances of both SQL2005 and SQL2008 under Server 2003 x64 and have had the exact same experience.  As soon as one row changes, the remaining rows fail the checksum comparison through the conditional split, regardless of their update status.

Ryan United States

4/10/2009 12:56:39 AM #

craig

Problems with CRC32:  I import the exact same xml file twice.  The second time returns a different checksum number, making it appear that the row changed, when in fact it didn't.  Using SQL 2008 32 bit.


craig United States

4/16/2009 4:06:00 PM #

Tom

Glad someone took the initiative to create this component; it will be very helpful.
I would like to see a status regarding the comments on this version - a bit leery to use it at this point.
Thanks

Tom United States

4/17/2009 6:11:14 PM #

Chengshu

We have run into a checksum issue. We run the checksum package on a 64-bit production machine and we have seen this error from time to time. We are looking for a solution on this as well. Any ideas? Please help.

We used the Checksum Version "1.5.0.16" on SQL 2005 Enterprise edition.  

OnError,DBSW0212,DMZMGMT\sgalvining,Validate Transform Data,{DD72865F-C8B1-4DF4-BF33-3724865CE957},{4E0CB18F-0373-414C-9852-4F29B28BD627},4/13/2009 5:03:49 AM,4/13/2009 5:03:49 AM,-1073450910,0x,System.ArgumentException: Item has already been added. Key in dictionary: '79764919'  Key being added: '79764919'
   at System.Collections.Hashtable.Insert(Object key, Object nvalue, Boolean add)
   at System.Collections.Hashtable.SyncHashtable.Add(Object key, Object value)
   at Konesans.Dts.Component.Helpers.CRC32..ctor(UInt32 aPolynomial, Boolean cacheTable)
   at Konesans.Dts.Pipeline.ChecksumTransform.ChecksumTransform.PreExecute()
   at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPreExecute(IDTSManagedComponentWrapper90 wrapper)

OnError,DBSW0212,DMZMGMT\sgalvining,Provider,{6F1B4233-8A08-4688-827F-FB98DC7F975F},{4E0CB18F-0373-414C-9852-4F29B28BD627},4/13/2009 5:03:49 AM,4/13/2009 5:03:49 AM,-1073450910,0x,System.ArgumentException: Item has already been added. Key in dictionary: '79764919'  Key being added: '79764919'
   at System.Collections.Hashtable.Insert(Object key, Object nvalue, Boolean add)
   at System.Collections.Hashtable.SyncHashtable.Add(Object key, Object value)
   at Konesans.Dts.Component.Helpers.CRC32..ctor(UInt32 aPolynomial, Boolean cacheTable)
   at Konesans.Dts.Pipeline.ChecksumTransform.ChecksumTransform.PreExecute()
   at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPreExecute(IDTSManagedComponentWrapper90 wrapper)

Chengshu United States

5/5/2009 5:39:58 PM #

c34miller

We get the exact same error.  Anyone have any ideas?

Error: 2009-05-04 11:39:05.47
   Code: 0xC0047062
   Source: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX [102]
   Description: System.ArgumentException: Item has already been added. Key in dictionary: '79764919'  Key being added: '79764919'
   at System.Collections.Hashtable.Insert(Object key, Object nvalue, Boolean add)
   at System.Collections.Hashtable.SyncHashtable.Add(Object key, Object value)
   at Konesans.Dts.Component.Helpers.CRC32..ctor(UInt32 aPolynomial, Boolean cacheTable)
   at Konesans.Dts.Pipeline.ChecksumTransform.ChecksumTransform.PreExecute()
   at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPreExecute(IDTSManagedComponentWrapper100 wrapper)
End Error

c34miller United States

5/7/2009 5:24:49 AM #

Data Info Leaders

The order in which fields are added (ticked) to the Konesans checksum component matters. There are two checksum tasks in our standard fact table pattern. For these two tasks to produce the same checksum value on the same columns, the columns need to be added (ticked) to the checksum tasks in the same order. Our theory is that the component holds an internal array of the fields added, and fields are added in the order that they are ticked. Thus if you tick the fields in a different order you will get a different checksum.

Data Info Leaders Australia

5/7/2009 4:59:34 PM #

PedroCGD

HI Guys,
Do you suggest the use of this component?
Regards,
Pedro
www.pedrocgd.blogspot.com

PedroCGD Portugal

5/7/2009 5:51:52 PM #

Allan Mitchell

Hi Pedro.  Yes we like the component and use it often.  You have to understand the limitations that are detailed above and after you do that then it can be a very efficient way of doing comparisons.

Allan Mitchell Germany

5/18/2009 10:53:50 PM #

Aixa Roche

I have SQL Server 2005 SP2.  I installed your component and when I tried to drag the transformation into the Data Flow tab I get an error:
The component could not be added to the Data Flow task.  Please verify that this component is properly installed.  The data flow object "Konesans.dts.pipeline.chesumtransform.checksumtransform, Konesans.dts.pilepine.checksumtransform, version=1.0.0.0, culture=neutral,publickeytoken=b2ab4a111192992b" is not installed correctly on this computer.

I uninstalled and installed the product.  Same error.
Can you help?

Aixa Roche United States

5/20/2009 12:02:51 PM #

Alex Murray

Having downloaded and installed the Checksum Transform task, I'm experiencing the same issue as Aixa Roche:

The data flow object "Konesans.Dts.Pipeline.ChecksumTransform.ChecksumTransform, Konesans.Dts.Pipeline.ChecksumTransform, Version=1.0.0.0, Culture=neutral, PublicKeyToken=b2ab4a111192992b" is not installed correctly on this computer. (Microsoft.DataTransformationServices.Design)

Anyone know how to resolve?

Thanks

Alex Murray United Kingdom

5/20/2009 5:44:42 PM #

Michael Rivera

I am responding to the comments about the ordinal position of each of the columns in the checksum. Based on the earlier post, if you deselect and reselect the columns in the Checksum component, the column is placed at the bottom of the ordinal order. Well, I have tested this on a 64-bit machine and found that this is not the case. If you go back into the component, the ordinal position is the same as it was prior to the change. Ordinal position is driven by the input of the component. I would like the flexibility to change the ordinal position in this component if possible.

Michael Rivera United States

6/17/2009 5:55:22 PM #

yuqing

Can someone tell me if the CRC32 bug is fixed or not? Is this component useful or not? Thanks

yuqing

6/19/2009 4:14:35 PM #

Jean Reece

I downloaded the Checksum Transformation last week and am experiencing the same problem as described in the 3/23/09 post by Noemi. I switched to the "Original" algorithm. What are the known issues with that one?

Jean Reece United States

7/13/2009 12:23:41 PM #

Daniel Garner

I'm suffering from the same problem posted by Noemi - 64-bit SQL Server. Is there any update on this, i.e. any chance of getting the bug fixed? I would rather switch to the CRC32 checksum over the original.

Many Thanks
Dan.

Daniel Garner United Kingdom

8/12/2009 1:26:37 PM #

Ronald Kraijesteijn

I was looking for a way to speed up the 'lookup process': which rows already exist in my data warehouse and which are new.

I tried this component to generate a checksum of my business key as input for the Slowly Changing Dimension Component (toddmcdermid) to speed up the lookup. Doing a lookup over a 4-byte integer field is much faster (50%) than doing a lookup over a varchar(32), for example.

I tested all three algorithms, but in every case duplicate business keys come up. This is caused by duplicate checksums from the component.

Is there a good and safe way to generate a checksum over a data warehouse business key which is an integer field? I know I can use the MD5 hash, for example, but this is 50% slower (I tested).

Thanks.

Ronald Kraijesteijn Netherlands

10/6/2009 4:22:14 PM #

Ellen Russell

I also get the intermittent error:
System.ArgumentException: Item has already been added. Key in dictionary: `79764919`  Key being added: `79764919`    
at System.Collections.Hashtable.Insert(Object key, Object nvalue, Boolean add)    
at System.Collections.Hashtable.SyncHashtable.Add(Object key, Object value)    
at Konesans.Dts.Component.Helpers.CRC32..ctor(UInt32 aPolynomial, Boolean cacheTable)    
at Konesans.Dts.Pipeline.ChecksumTransform.ChecksumTransform.PreExecute()    
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPreExecute(IDTSManagedComponentWrapper90 wrapper)

Note the key value specified here is the same key value returned in both the other posts about this error.  I mention that this error is intermittent - it occurred more than once for a specific package, but not with every execution for that package.  In this package, I have four containers processing in parallel; each one of them includes a Data Flow task with a Checksum transform in it.  However, only one of the Data Flow tasks used the CRC32 algorithm - and this is the one that causes the package to fail.  The others used either Framework or Original.  Since this error occurs on PreExecute, always with the same Key value error, could this be caused by the SSIS engine attempting to add more than one type of Checksum object to the HashTables collections with the same key?  I'm fairly ignorant about the internals of SSIS and am not a .NET programmer, but the intermittent nature of the error screams "timing!" to me.  For others who have experienced this issue, do you have more than one Checksum transform (either CRC32 or one of the other types) potentially instantiating at the same time as a CRC32 type?  This transform is one of my can't-live-without objects, so until the CRC32 algorithm stabilizes, I'm still using the Original algorithm.

Ellen Russell United States

10/9/2009 4:50:23 PM #

Kazi Islam

I also have the same intermittent error:
System.ArgumentException: Item has already been added. Key in dictionary: '79764919'  Key being added: '79764919'
at System.Collections.Hashtable.Insert(Object key, Object nvalue, Boolean add)
at System.Collections.Hashtable.Add(Object key, Object value)
at System.Collections.Hashtable.SyncHashtable.Add(Object key, Object value)
at Konesans.Dts.Component.Helpers.CRC32..ctor(UInt32 aPolynomial, Boolean cacheTable)
at Konesans.Dts.Pipeline.ChecksumTransform.ChecksumTransform.PreExecute()
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPreExecute(IDTSManagedComponentWrapper90 wrapper)

I also have multiple containers processing in parallel, and each container has a data flow task with a Checksum Transform in it. But all of my Checksum Transforms use the 'Original' algorithm, so I don't think the CRC32 algorithm is the issue here. Interestingly, this error even occurs with an empty table, and it might occur only once for a specific package.

Kazi Islam Canada

10/19/2009 6:29:03 PM #

c34miller

Related to this error: System.ArgumentException: Item has already been added. Key in dictionary: `79764919`  Key being added: `79764919`    

We have done some significant testing on this. We encountered this a few times in production and decided we needed both a short-term and a long-term solution. In the short term, we have discovered that removing any parallel processing seems to reduce the occurrence of the error. One of our frustrations is that the error was more prevalent in production, where we have better machines with more processors, than in our test environment. This was caused by more worker packages running in parallel, each using the Konesans component. Forcing precedence constraints on these components so they only run in serial appears to have resolved the problem (at least for us; your results may vary).

In the long term we are removing the Konesans component and replacing it with an execute SQL component that calls the SQL checksum functionality directly.

c34miller United States

12/17/2009 7:10:35 AM #

Erik van Dongen

It is getting boring.... but any news on the CRC32 algorithm? We get the same error as posted several times before. As we are about to go in production I would like this solved soon...

Erik van Dongen Netherlands

1/6/2010 6:58:32 PM #

Elijah

We have the same CRC32 issue. Once a record is determined to have different current and input checksums, all remaining input records are assumed to be different. So if the 5th record in a set of 20 is the only one with new values (a new checksum value), the first 4 are deemed no change but records 5 thru 20 are considered changes. The Original algorithm is fine. Can someone explain why I should not use the Original algorithm?

Elijah Canada

1/7/2010 9:23:48 PM #

Elijah

For those wondering about a CRC32 solution, I found an alternative. There is a transform on the CodePlex website called Multiple Hash. It allows you to use MD5, SHA1 and more. It seems to work just fine. I defined my hash value on my table as Binary(16) with MD5. In the Conditional Split I identify changes as (DT_STR,50,1252)New_Checksum != (DT_STR,50,1252)Old_Checksum. Here is the link: ssismhash.codeplex.com/wikipage

Elijah Canada
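[Editor's note: the wider-digest approach Elijah describes can be sketched as follows. This is illustrative Python, not the Multiple Hash component's code; the separator byte is an assumption added so adjacent column values cannot blur together.]

```python
import hashlib

def md5_row_hash(values):
    """16-byte MD5 digest over a row's column values.

    The unit-separator byte between values keeps adjacent columns
    from blurring together, so ('ab', 'c') and ('a', 'bc') hash
    differently -- a known weakness of naive concatenation.
    """
    h = hashlib.md5()
    for value in values:
        h.update(("" if value is None else str(value)).encode("utf-8"))
        h.update(b"\x1f")  # explicit column separator
    return h.digest()

# A 128-bit digest, stored as Binary(16), versus a 4-byte checksum.
assert len(md5_row_hash(["aa", "AA", "Arjan"])) == 16
assert md5_row_hash(["ab", "c"]) != md5_row_hash(["a", "bc"])
```

The trade-off is the one Darren outlined earlier in the thread: collisions become practically impossible, but the stored value is four times larger than a 32-bit checksum and comparisons are slower.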

4/9/2010 8:49:03 PM #

srini

Hi,
In our SQL 2005 environment we have around 300 SSIS packages. In most of those 300 packages we used the Checksum and Trash Destination components. Is there any way to run these packages on SQL Server 2005 and SQL Server 2008 without modifying the packages for the SQL 2008 Checksum and Trash Destination?

Is there any way to make them compatible with both SQL 05 and 08?

Thanks in Advance
Srini

srini United States

5/17/2010 3:24:01 PM #

Sandeep

Hi,
Even I am getting the same error with same key value as reported, Is there any solution or workaround for this?


Checksum Transformation [303]
   Description: System.ArgumentException: Item has already been added. Key in dictionary: '79764919'  Key being added: '79764919'
   at System.Collections.Hashtable.Insert(Object key, Object nvalue, Boolean add)
   at System.Collections.Hashtable.SyncHashtable.Add(Object key, Object value)
   at Konesans.Dts.Component.Helpers.CRC32..ctor(UInt32 aPolynomial, Boolean cacheTable)
   at Konesans.Dts.Pipeline.ChecksumTransform.ChecksumTransform.PreExecute()
   at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPreExecute(IDTSManagedComponentWrapper100 wrapper)
End Error

Sandeep United States

9/20/2010 3:19:54 PM #

Chad

I am also interested to know if anyone is running the Konesans Checksum Transformation after an upgrade to a SQL Server 2008 R2 environment.

Chad United States

10/15/2010 4:30:20 PM #

mark brunton

I have been using the Checksum Transformation for a while and have recently noticed an issue.

This is on the 2005 Checksum Transformation using either Original or Framework.

The checksum value is different when I execute a package in Visual Studio versus executing it from a job in SQL Server. (Using the Execute Package Utility for SQL Server produces the same result as Visual Studio.)
{Although this does not happen for every row, it does for the majority of them}

This is obviously an issue, as I must test packages (in Visual Studio) before I upload them to Integration Services, from which they are then run by a SQL Server job.

This is causing the data warehouses I am working on to bloat.

Has anyone else had any experience of this? Any help appreciated.


mark brunton United Kingdom

10/19/2010 10:35:03 PM #

Darren Green

A new release is now available that corrects the bug "Item has already been added. Key in dictionary: '79764919'  Key being added: '79764919'". This was a race condition that could have occurred regardless of the algorithm used.

The CRC-32 specific bug, which made the checksum unpredictable because it inadvertently used the previous row's checksum as the seed, has also been fixed.

The 2008 installer has also been changed to address an issue with side-by-side installs, as well as the upgrade mapping used for 2005 to 2008 upgrades not always behaving as it should.

Sorry it took so long.

Darren Green United Kingdom
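[Editor's note: the recurring key 79764919 is 0x04C11DB7, the standard CRC-32 polynomial, which is why every report of this error shows the same value. The failure mode Darren describes is the classic unsynchronized check-then-add on a shared table cache. A hypothetical sketch of the pattern and its fix, not the component's actual source:]

```python
import threading

_tables = {}  # shared cache of CRC lookup tables, keyed by polynomial
_lock = threading.Lock()

def _build_table(poly):
    """Standard MSB-first CRC-32 lookup table for the given polynomial."""
    table = []
    for i in range(256):
        crc = i << 24
        for _ in range(8):
            if crc & 0x80000000:
                crc = ((crc << 1) ^ poly) & 0xFFFFFFFF
            else:
                crc = (crc << 1) & 0xFFFFFFFF
        table.append(crc)
    return table

def get_table(poly=0x04C11DB7):  # 0x04C11DB7 == 79764919
    """Thread-safe lazy initialization of the cached table.

    Without the lock, two pipeline threads in PreExecute can both pass
    the 'not in cache' check and both try to add the same key -- the
    'Item has already been added' failure seen in the thread above.
    """
    with _lock:
        if poly not in _tables:
            _tables[poly] = _build_table(poly)
        return _tables[poly]

assert 0x04C11DB7 == 79764919
assert len(get_table()) == 256 and get_table()[1] == 0x04C11DB7
```

This also matches the observations above that the error was intermittent, appeared only with multiple checksum transforms running in parallel, and went away when packages were serialized.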

10/20/2010 9:13:28 AM #

Darren Green

Mark, I recently exchanged emails with a colleague of yours. It sounds very much like this is down to the x86 vs x64 differences in the Original/Framework GetHashCode method of .NET, which is used behind the scenes. Ensuring you run under the same processor architecture should resolve the differences.

Darren Green United Kingdom

1/18/2011 10:27:39 AM #

Kasper Kamp Simonsen

I have installed Version 2.0.0.27 - when using CRC32 I don't get the correct CRC32 value back. Are there any solutions to this?

Kasper Kamp Simonsen Denmark

2/9/2011 3:28:11 AM #

Joe

I've come across this invalid checksum issue when comparing checksums from different sources (i.e. to see if data has changed).

As mentioned before, The issue was resolved by making sure that the column orders of data going INTO the check sum components were EXACTLY the same (Including data types, lengths and order) for both checksum components.

It does not matter in which order you tick the columns in the checksum component; the order in which they are processed seems to 'reset' back to the order of the columns as they arrive via the component's INPUT.
In my case, I found I had to rewrite the SQL in my data source to specify a certain column order before consistent values were returned.

Joe Australia

3/4/2011 4:57:22 PM #

Daniel Barkman

Has anyone else experienced severe performance issues with this component? I am trying to calc a checksum on a table of several hundred thousand rows and it has taken hours. Any tips or secrets?

Daniel Barkman United States

4/14/2011 1:04:54 PM #

Patrick Peters

I want to have a CRC of a file read from a File Connection Manager. I used your component, but it calculates CRC’s based on the columns (per row). Is it possible to calculate the CRC for the whole file ?

Patrick Peters Netherlands

5/13/2011 11:37:52 AM #

Matt Connolly

I'm using the framework algorithm because I find it performs much better than CRC, but I'm finding that the checksum calculated is the same regardless of which columns I select - so if I select columns A,B,C,D,E,F,G,H,I,J I get the same result as if I select columns A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T. Is there a maximum number of columns beyond which additional columns are ignored?

Matt Connolly United Kingdom

5/17/2011 9:42:51 AM #

Matt Connolly

My previous issue was not connected to the number of columns, it was caused by columns containing NULLs. It appears that when the framework algorithm encounters a NULL, it just stops processing. Having removed all columns which contain NULLs, the transformation now functions as expected.

Matt Connolly United Kingdom

12/1/2011 2:50:14 PM #

Brent Bunch

I've been trying (without success) to load the checksum transform in SQL Server Denali. Try as I might, I'm not able to get the transform to show up in the SSIS toolbox. Denali is supposed to automatically add available transforms to the SSIS toolbox instead of requiring the transforms to be manually added. However, that's not my experience.

So far, I've added the transform to C:\Program Files\Microsoft SQL Server\110\DTS\PipelineComponents (and then for good measure added it to Program Files (x86)). I've verified the transform and common library are in the GAC (and when that didn't work, reloaded them into the GAC). So far, no dice.

Has anyone had this work for them? Has the feature to autoload transforms in Denali BIDS been implemented?

Brent Bunch United States

1/16/2012 8:45:10 AM #

ahmet kuru

Will there be a version for SQL 2012?

ahmet kuru Turkey

2/3/2012 6:19:47 AM #

KOTESHCHOWDARY

I tried installing it; however, it tells me that I don't have administrator privileges...
Can someone help me with this?

thanks in advance

kotesh kavula

KOTESHCHOWDARY India

4/25/2012 9:24:05 AM #

Endre

SQL Server 2012 is released. When will the component be updated?

Is it not a good idea to open-source this small project, so that we are not utterly dependent on Konesans for updates? Then the user community could help improve it.

Endre Norway

6/5/2012 2:13:25 PM #

Darren Green

All, the 2012 version is now available, see download links above.
Kotesh, you must have administrator rights to run the install, no way round it.

Darren Green United Kingdom

6/22/2012 7:23:52 PM #

chrisRDBA

I have run the install on my 2K8 box and see the DLL in the PipelineComponents folder, but I don't see it available to add to the toolbox in the final step.

Any ideas?

Thanks!

chrisRDBA United States
