Handling different row types in the same file

by Allan Mitchell 26 May 2005 14:00

Sometimes source systems like to send us text files that are a little out of the ordinary. More and more frequently we see people being sent master and detail (or parent and child) rows in the same file. Handling this in DTS is painful; in SSIS that changes. In this article we'll take you through one file of this type and provide a solution that is more elegant than what we have today, and we hope it will give you ideas for handling your own versions of this type of file. This is a simple solution, but it could be extended to include more powerful processing, such as performing lookups to check whether any of the rows coming through the pipeline already exist in our destination. This is how our package should look when everything is finished.


The graphic below shows the text file sent to us by our source system. As you can see, the last column is the row type: an "M" indicates a master row and a "D" indicates a detail row. In this example there is one master row and three detail rows.
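The original graphic isn't reproduced here, but a file of this shape (the values and column counts are illustrative assumptions; only the trailing "M"/"D" type column comes from the article) might look like:

```text
1000,Acme Ltd,26/05/2005,M
1,Widget,5,9.99,D
2,Grommet,10,0.50,D
3,Sprocket,2,4.25,D
```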


We create a Flat File Connection Manager that points to this file. We are not going to split the lines of the text file into their respective columns yet, because the two types of rows have different numbers of columns. Instead we are going to read each line in as one string. This is how we set up the connection manager to do that.


Once we've done that we add a Data Flow task to our package, and in the data flow behind the task we add a Flat File Source adapter which points to our earlier connection manager. The only thing we're going to change is the name of the output column, as shown here.


After doing that we are going to flow the output into a Conditional Split transform. We will use this transform to interpret the row type from the final column in the text file and direct each row down the correct output. With the Conditional Split transform in our example we will have three possible outputs flowing downstream. The first two are based on whether we have decided the row is a master row or a detail row; the third we get for free, and it is where all rows that meet neither of our criteria will flow.
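The exact expressions from the article's screenshot aren't reproduced here, but conditions of roughly this shape (the SUBSTRING/LEN approach is an assumption) would route each row on the final character of the line:

```
Master : SUBSTRING(TheWholeLineColumn, LEN(TheWholeLineColumn), 1) == "M"
Detail : SUBSTRING(TheWholeLineColumn, LEN(TheWholeLineColumn), 1) == "D"
```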


Now that we've split up our rows and assigned them to the correct outputs, we need to hook each output up to a Script transform to break the line apart and create our real columns from the single input column. When you drag an output from the Conditional Split transform to the Script transform you will be asked which output you would like to attach. In the Script transform the first thing we are going to do is add some columns to the output. When the data has flowed through this component, downstream components will see these columns and it will be easier for the package designer to map them to their correct destinations. Here's how we do that.


We are only going to look at the Script transform that handles the rows flowing down the master row output, but the code can easily be adapted to handle the rows flowing through the detail row output as well. We now have one column coming into this transform and a number of newly created columns ready to take data from it, so let's write some script to do that.


All this code does is take the column that comes in (TheWholeLineColumn) and split it up on a comma delimiter. We then assign the elements of the resulting array to our output columns. All that is left now is to hook the output of the Script transform up to a destination; for ease of use we have simply pointed it at a Raw File destination adapter. We have added a data viewer to the flow of data going into the Script transform so we can see the definition of the rows going in. You can see this in the graphic below.
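The script itself lives inside the Script transform (in SSIS 2005 that means VB.NET) and is shown only in the original graphic. As a language-neutral sketch of the same split-and-assign logic — the output column names and sample line here are illustrative assumptions; only TheWholeLineColumn comes from the article:

```python
def split_master_row(the_whole_line_column):
    # Split the single incoming column on the comma delimiter,
    # just as the Script transform does.
    parts = the_whole_line_column.split(",")
    # Assign the array elements to the output columns we created.
    # These column names are illustrative assumptions.
    return {
        "InvoiceNumber": parts[0],
        "CustomerName": parts[1],
        "InvoiceDate": parts[2],
        "RowType": parts[3],
    }

row = split_master_row("1000,Acme Ltd,26/05/2005,M")
```

The detail-row transform would be the same sketch with its own column names and one entry per detail field.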


Finally, all that is left for us to do is run the package and watch the rows flow down the pipeline. What we should see is that one row flows down the master output and three rows flow down the detail output.


Comments (6) -

3/3/2009 2:14:14 PM #


This is great! I'm working on something very similar, although the destination is a multi row (header/detail) flat file. I'm struggling.

James United States

5/14/2009 4:08:08 PM #


Would there be a way to force the flow to process those master rows first followed by the child rows?

The only way I can think of doing this would be to have some kind of additional staging tables.

Andy United Kingdom

5/16/2009 11:00:32 AM #

Darren Green

Andy, there is no way to do that inside a single Data Flow so staging it is really the best option, but you may consider raw files over tables. They are very efficient, although the real world results will depend on your architecture and overall process.

Darren Green United Kingdom

8/3/2009 8:26:20 PM #


That example is great. I need to do it all in reverse: create a file with invoice details (18 fields) and at the end of each invoice print totals of only 4 fields, and all of this should be in one text file. Thank you! If someone can give me advice.


8/20/2009 1:31:31 PM #


Awesome Post, We were trying to write code in .NET for this scenario, but your post saved the time and unnecessary coding


8/26/2009 8:52:03 PM #


This is a great post but with a source data like this how are you going to relate detail rows back to the header.  It seem like you'll end up with two useless lists of headers and lines.  I guess that's why it's a simple solution for us to build on for our own purposes.  :)

Jennifer United States


Looping over files with the Foreach Loop

by Allan Mitchell 31 May 2005 14:00

In SQL Server 2000 Data Transformation Services (DTS) it was a bit of a hack to loop over files of a given type in a certain directory and import them into your destination. It involved a lot of "glue code" and a certain amount of fooling the package into going back to a previous task because it still had work to do. Thankfully, in SQL Server 2005 Integration Services (SSIS) that has all changed, and this article is going to show you how.

The image below shows how incredibly simple and clean the package will look when finished. There are some things worth pointing out at this stage. In the centre of the screen we see the Foreach Loop container, and inside it the Data Flow task which houses the pipeline. At the bottom, in the Connection Managers tray, we see our Flat File Connection Manager (My Source File) and our OLE DB Connection Manager (My Destination). The Flat File Connection Manager is the one in which we are most interested for this article. Both of these managers are used in the data flow behind the Data Flow task. We will not be detailing that pipeline in this article, but it consists of a Flat File source moving data to an OLE DB destination.


Let's begin by opening up the Foreach Loop editor and moving straight to the Collection node in the tree on the left. Below we see our information already populated.


What we see on the screen is pretty self-explanatory, but let's go through it anyway. We have chosen to enumerate over a file collection, as indicated by the value next to the Enumerator property at the top. We need to specify a folder over which to loop and which type of files to look for, and we do that in the centre of the form. At the bottom of the form we are given three options as to what is returned when the loop finds a file in the folder: the whole file name including extension and path, the name and extension, or simply the name of the file found. Because our connection manager is going to need to know exactly where to find the file and its name, we have chosen the first option. The final thing we see on this screen is the ability to traverse subfolders. In our example we do not need to do this.
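Conceptually, the enumerator with a file mask and the fully-qualified option behaves like a simple directory glob. A minimal Python sketch of that behaviour (the folder and file names are made up for illustration):

```python
import glob
import os
import tempfile

# Stand-in source folder with a few files (names are illustrative).
folder = tempfile.mkdtemp()
for name in ("a.txt", "b.txt", "c.csv"):
    open(os.path.join(folder, name), "w").close()

# Enumerate *.txt files, returning fully qualified names, much as
# the Foreach File enumerator does on each iteration of the loop.
found = sorted(glob.glob(os.path.join(folder, "*.txt")))
file_names = [os.path.basename(f) for f in found]
```

Each fully qualified name in `found` corresponds to one iteration of the loop, and it is that value which gets handed to the variable mapping described next.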

When the Foreach enumerator finds a file it needs to tell us what it found, and it does this by populating a variable. Click on the Variable Mappings node now. Our package currently has no variables able to accept the name of the file, so we are going to create a new one.


The next screen we see allows us to set the values of the variable. As we can see variables can be scoped in SSIS to certain executables in the package or to the package itself.


Here is how our variable looks with all its properties set.


Because the enumerator will only return at most one value on each iteration, we map our variable to an index of 0.


We have now configured everything as far as the Foreach enumerator is concerned. Next we need to set the properties of the Flat File Connection Manager. Highlight the manager in the tray at the bottom, right-click, and choose Properties.


The important part of this dialog is highlighted: "Expressions". Click on the ellipsis and we will be taken through to the next screen, where we can start to create the expression. In the screen that follows, drop down the list in the Property column and choose ConnectionString.


Now hit the ellipsis button to the right and we are taken through to the expression editor, where we will build the actual expression itself.


Our requirements here are pretty simple: all we want to do is retrieve the variable we defined earlier. To do this, simply drag the variable from the list at the top to the expression text box at the bottom. Property expressions can become very complex, and we shall no doubt be seeing more of them in future articles. After you have chosen the variable, click OK.
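For this example the finished expression is nothing more than a reference to the variable, along the lines of:

```
@[User::FileWeJustFound]
```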


We now see that our expression is mapped to our ConnectionString property. Click OK.

Finally we can now see our File Manager's Connection string property being mapped to an expression in the properties of the manager.


That's all there is to it. When the enumerator finds a file matching our requirements it will set the correct property on the connection manager and this will be used by the pipeline at runtime.

Comments (60) -

11/11/2008 7:31:02 AM #

kaushik saha

Excellent example, but i tried the same with sql 2008 IS and excel 2007, the same is working in sql 2005 IS and office 2003 with excel file.

I can see the provider difference between office2003(jet.oledb.4.0) and office2007(jet.oledb.12.0), rest are all same, nothing changed....

can u help to sort out why its working with sql2008/office2007..

kaushik saha Australia

11/11/2008 8:51:31 AM #

Allan Mitchell

Here is what I do

Use the Microsoft Office 12.0 Access Database Engine OLE DB Provider
ServerName is the full path to your XL file (and will be the expression you set)
On the [All] tree item I set the extended properties to Excel 12.0;HDR=YES

Looping over xlsx files in itself is not an issue

Works for me.

Allan Mitchell

12/4/2008 5:30:08 PM #

Crash Burke

Set DelayValidation = True on the connection manager and the data flow.

Crash Burke

12/10/2008 8:54:56 AM #


can any help me out on forloop and foreachloop containers
i am unable understand them

susheel India

12/17/2008 7:36:39 AM #

STeve Tahmosh

This is a fantastic example.
I am new to MS DB Tools (a long time Oracle person), and this type of capability is excellent.
I have a developer working with my direction, who has been charted to do just this as described above.
I extended the requirement that the value for the "My Source File Connection" "ConnectionString" property needs to come from a value in a Database Table.  I would think a simple transact-sql statement would do the trick.
This would be so that we can either: 1)  transparently move the folder to another location, or 2) have multiple environments on the same server, or 3) use the same code, unchanged (except for data) in dev and prod).
My developer said that this would require some ".net code", and is not a simple sql statement - the challenge is in getting the sql selected value into the property.
My question:
Is there a technique on this site which demonstrates how to make a property such as "ConnectionString" populated from a SQL statement?
I just learned of this site through a Wrox book "Professional SQL Server 2005 Integration Services"

STeve Tahmosh United States

12/17/2008 7:55:37 PM #

Allan Mitchell

OK so you read the value into a variable - easy
You now need to either look at Property Expressions or you need to look at Configurations.  You can set the property on the connection manager from there.

Allan Mitchell Germany

12/18/2008 4:55:13 AM #

STeve Tahmosh

Hi Allan,
Thanks - you are speaking to someone who knows SSIS.  I am not that person.  I will make another attempt to answer my question.
I can figure out how to query a table with a t-sql script.  That is not why I wrote this post.
I thought the original post was extremely helpful.

STeve Tahmosh United States

12/18/2008 6:05:25 AM #

STeve Tahmosh

Hi Allan,
You must be the same Allan Mitchell that co-authored the Wrox book professional sql serer 2005 integration services.
I am on chapter 1 :-)
I am simultaneously on Chapter 4 of the Wrox book on SSRS.
Basically, I would be interested in an overall view of where I should focus in order to accomplish my goal.
Given a large number of features in the tools, including scripting languages that one might assume familiarity with, how does one architect a system for flexibility (defined below)
Chapter 1 says "Variables allow you to dynamically configure a package at runtime.  Without variables, each time you wanted to deploy a package from development to production, you'd have to open the package and change all the hard-coded...  Now with variables you can just change the variables at deployment time...."
My goal is not to even have to change the variables at deployment time, but to write a package that queries database data at run time, and therefore would work in dev or production appropriately.
i.e., In one environ I have source files in c:\folder1, in a second environment, in c:\folder2, in production in d:\folder1, etc.
My original question was (perhaps better stated here):
What is the overall architectural components do I need (using your excellent tutorial on Looping over files) to accomplish this goal?
Do I need an ActiveX script (is that even the proper tool in SSIS any more) which invokes an ExecuteSQL TAsk to read the database variables and assign them to SSIS properties, and are those properties set with some kind of scripting tool?
In any case, if this question is too basic for this forum, I will continue along with Chapter 1 of the Professional Sql Server 2005 Integration Services (I'm almost done!), and may be able to answer this myself in no time.
Would welcome a pointer to a chapter in the book to focus on. (I am not looking to be spoon fed the code)
Keep up the good work.

STeve Tahmosh United States

12/18/2008 7:41:10 AM #

STeve Tahmosh

Hey, I'm continuing to plug along - it seems like the approach is in Chapter 3 of the Wrox Professional SQL Server 2005 Integration Services book.
Use an "Execute SQL" task and store the Result Set into Variables of (for a novice) Package Scope.
Use a "Script" Task to set the value of the "ConnectionString" property of the (using your example) "My Source File" connection to the value of the variable containing the Folder where files would live.
Am I getting warmer?
If so, seems I've got one thing remaining:
Dynamically Setting the ConnectionString property
Would this be done in the Script?
Is there a Script command to set, for example the ConnectionString Property to a value stored in a variable

STeve Tahmosh United States

1/8/2009 8:58:19 PM #


You are looking at combining this tutorial with the Shreding a recordset Tutorial

Ray United States

1/9/2009 8:28:13 AM #


In SQL 2008, the second image (For Each Loop Editor) looks different.
In order to see it, you have to select another 'Enumerator' type from the list (E.G: For Each Item Enumerator) and then select again the 'For Each File Enumerator'.
That will display the correct 'Enumerator Configuration' field as in this manual.
Good Luck!

lekfir Israel

1/13/2009 7:37:12 PM #


Hi there,

I did the above example and it works perfectly with text files, but I actually need to loop through tables in multiple Access files.  I repeated the same steps from above with a connection to an Access file instead of a text file and end up with this error: SISS Error Code DTS_E_CANNOTAQUIRECONNECTIONFROMCONNECTIONMANAGER.  The error comes up as soon as I set the ConnectionString property expression in the OLE DB connection manager.  The whole package runs smoothly when I set up multiple connections to all the Access files, but I cannot seem to get it to loop...any suggestions?

Thanks in advance for your help!


2/16/2009 10:02:34 PM #



I am having the same exact problem as you. Trying to do the same for each file loop, but with MS access databases. Did you ever figure this out. As soon as I change the connectionstring in the propery expression I get the same SISS Error code. SISS Error Code DTS_E_CANNOTAQUIRECONNECTIONFROMCONNECTIONMANAGER. I have tried it multiple times and have had no luck.

I hope you or someone else has a solution to this.

JesseLewis United States

2/24/2009 5:11:35 PM #


Hi!  Thanks for you advise above.

I have a related issue.  I have 4 For Each loops in series which all successfully loop through multiple Excel files of four different layouts, one loop for each layout.  My problem comes in when I don't have a file for all 4 loops.  The first, which is the only one that does not use the first row for column headers, will successfully "skip" the loop and move on without error.  The remaining three give me a "Package Validation Error ... code 0x80040E37 ... Opening rowset for "Sheet1$" failed.  Check that the object exists in the database."  All four are set to Delay Validation = True.  All four delete the input file if the remaining steps within the loop run successfully.

Is there a workaround for this?  One of my customer's main requirements is to be able to run any combination of the four file types wihtout all four necessarily being present.

Many thanks!

Kathleen United States

2/24/2009 7:25:13 PM #


Never mind:-)  We found it!  I needed to change the DelayValidation property of the Import task to True as well as the one in the Connection Manager.

Thanks anyway!

Kathleen United States

3/5/2009 7:32:04 AM #


i agree with this , which actually take all the files data and load it to single table .

but how to do if scenario is like this

i have 10 csv in a folder. so i need to load all the data of 10 csv to different table.

What i have tried is:
i used FOR EACH LOOP  container to read all the 10 csv files... but its loading all the data of 10 csv files to single table

praxy India

3/13/2009 5:43:12 PM #

Rahsaan Pringle

Amazingly enough, I am unable to get this type of loop working. I am familiar to VB.net, and new to SSIS. I have been all around the web looking for either tutorials or working code that shows me how to do what I need to do. My basic task is to loop through the files in a directory (csv or xls), pull the data (same structure in each file), and export it somewhere (csv or xls). If anyone out there could take the time to look at it, I would love to send you an example of my project, or some screen shots. The project is very small, and it only attempts to do what has been described here.

Rahsaan Pringle United States

4/22/2009 3:55:55 PM #


I have a req. where i need to loop through all the .sql files in a particular order and execute them.
So far iam successful in looping through a folder and execute all .sql files but how can i execute them one by one in a preferred order?

Any ideas? I was thinking to remane files with some numeric prefixes for eg; 1_filename, 2_filename

Now the issue is how do i loop in this order in SSIS? Im using a for loop container.

Binoj United States

4/23/2009 1:52:38 PM #


This example definitely works for me - I do however have a question about the variable that I created. I am wanting to store this filename in the database to indicate that this file has been processed, but, whenever I reference the variable, it attempts to define it as Unicode string[DT_WSTR] and Length = 0   ------- Why is that as with being length 0 - When I try and reference it in my derived column - it is blank. As it was a variable , how do I get this into my columns. Any ideas or pointers will be greatly appreciated. Thanks

Charkra United States

4/28/2009 7:38:54 PM #


hello guys can anyone let me know how to enable unicode in sql database 2)how to change oledbe connections to ado.net connection in configg files directly in the script


4/30/2009 4:26:19 PM #


Good after noon everyone.  I'm looking to get some help on a particular For Each Loop container that I'm attempting to create.  What I need is to loop through a collection of txt files, but after each enumeration I want to attach a number to the data within the file.  An example would be.

File1.txt (all records get a 1 appended to the end of each row)
File2.txt (all records get a 2 appended to the end of each row)
File3.txt (all records get a 3 appended to the end of each row)

Alvin United States

5/11/2009 3:29:46 AM #


This example was great!  Thank you so much.


5/12/2009 4:21:44 PM #

Richard Cranston

This was a great example; I was trying to use the File Watcher which had to be installed, but this was easier to use and it worked the first time..thanks

Richard Cranston United States

5/19/2009 9:15:03 AM #


This is very helpful.  We have a requirement to do something similar except I was considering using raw files.  How would the raw file source connection need to be set up to loop through the files?

gj United States

5/19/2009 1:36:02 PM #

Darren Green

GJ, The Raw File Source does not use a connection manager. The source wants the file name direct, and it can accept variables, so skip the section about setting a property expression. On your Raw File Source change the AccessMode property to be "File name from variable", and set the FileNameVariable property to be the User::FileWeJustFound variable.

Darren Green United Kingdom

5/20/2009 4:47:05 AM #


Thanks Darren.  I'm new to SSIS and I appreciate the help.  I've followed the above steps as suggested and the raw file source seems good now.  The raw file source connects to an OLE DB destination and I'd like to load each file into a different table in the ForEach Loop.  The file names are the same as the table names other than file extension.  How should the OLE DB destination be setup for this?

gj United States

5/20/2009 8:45:55 AM #

Darren Green

The OLE-DB Destination has a similar AccessMode property, select one of the "Table or view name variable" options depending if you want fast load or not. You need a second variable for this, the table name, which can be derived from the filename as you suggest. Use an expression on the variable to do this, see an example expression for getting the filename minus the extension at wiki.sqlis.com/.../ExpressionSamples.html. Similar to how you can set an expression on a connection manager in the example above, you can do this on a variable, so the value of the variable is actually from the expression. Set the EvaluateAsExpression property as true, and set the expression itself on the Expression property of your variable.
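[Editorial note: the expression logic Darren describes (file name minus path and extension) is equivalent to this Python sketch; the path shown is made up for illustration.]

```python
import ntpath  # parses Windows-style paths on any host OS

def table_name_from_file(file_we_just_found):
    # Strip the directory, then the extension, leaving the base
    # name, which here doubles as the destination table name.
    base = ntpath.basename(file_we_just_found)
    return ntpath.splitext(base)[0]

table = table_name_from_file(r"C:\SourceFiles\Customers.raw")
```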

Darren Green United Kingdom

5/20/2009 11:52:57 PM #


I created a table variable specifying the table variable expression property to the "filename minus the extension" expression.  My raw file source is set to use the FileWeJustFound variable and the OLE DB destinations is set to use the table variable.  When I try to run the package without a value for the FileWeJustFound variable, it errors saying the file name hasn't been properly specified.  If I try to run with the path to one of the files as the value for the FileWeJustFound variable, it seems to make it through the first table and fails.  

Does there need to be a value specified for the file variable?  The error messages from when I set a value for the FileWeJustFound variable are - [Raw File Source [410]] Error: The output column "colum_name" (453) is mapped to an external metadata column that does not exist.  Thanks again for your help on this Darren.

gj United States

6/2/2009 4:40:13 AM #

Abhinay B

Brilliant Article, just what i needed. I'm also using a script to check if the file exists, so i didn't have to make much changes to that, just passed in the new variable.
But one problem with this approach is that, if we want to change the configuration or column length of the file, we have to remove the expression, choose a file, make changes and put back the expression.

Abhinay B Singapore

6/4/2009 9:45:57 PM #

James Baird-Kerr

All I can say is thank you, thank you, thank you!!!
Your excelent example was exactly what I was looking for and it let me implement my solution easily.
U R great!

James Baird-Kerr United States

6/18/2009 8:45:26 PM #


I have a requirement of looping through x files as explained above. But incase the load of a particular file fails, package should not stop but rather should continue with the other files execution.
Please suggest how can I do that ?

tarun United Kingdom

7/15/2009 5:03:28 PM #


First off, thanks for the example.  It is very useful for what I need to do.

Question 1:  I haven't seen this mentioned explicitly.  It is true that the Foreach Loop will always go through the files alphanumeric order?  I have file names which have a datestamp component to them (i.e.  file20090714, file20090715, ... etc) and wish to process them in this order.

Question 2:  I have file names which have both a datestamp and timestamp component to them (i.e. newfile20090715105307).  I'm only supposed to receive and process one file per day.  If there is more than one file in the directory then I wish to abort the processing.  Any thoughts on how I could do this?  

Any help you can provide will be very much appreciated.  Thank you.

sps Canada

7/20/2009 6:21:45 PM #


Very good article. Great job!

Nacho Uruguay

8/31/2009 8:28:48 AM #



I am getting the file name  inserted into table but not in order. I am inserting 100 files.  What i need is i want to insert name of the file as one record in the the table for the each file in am inserting.  Can anybody help me?

vyas India

10/5/2009 8:32:19 AM #


Thanks man! That was really helpful!

Mike Israel

11/9/2009 11:38:07 AM #


Thanks to Allan Mitchell and (U.S.) Kathleen's accurate comment on to set DelayValidation on the Data Flow task object & Source object (inside the task object).

I was able to finally, automatically parse in submitted Excel files!  Before, I wasted hours of my life manually setting them and therefore, redoing when I made mistakes.

Thanks a lot!!!

Rostand Philippines

11/25/2009 7:04:29 PM #


Hi, this post has been very helpful, thank you. Unfortunately I still seem to be running into an issue. I have a process which loops through files in a given folder. The file name is read and validated to determine the appropriate Data Flow Task to execute (as there are currently four types of files requiring unique data flows). The process runs great for the first file. However it appears that the Foreach Loop Container is not updating the Package Variable (in relation to this post example the variable is called "FileWeJustFound") with the next file name. I have delayed all validation on the connections and the dataflow tasks with no success. Any ideas?

TennesseePaul United States

11/30/2009 3:57:16 PM #


Found it, just need some turkey. The Precedence Constraint was set to "OR" when it should have been "AND" for Constraint and Expression.

TennesseePaul United States

12/10/2009 7:14:09 PM #

John Fuhrman

How can I do the same thing but with MDB files and keep from importing records that already exist in the destination tables?

John Fuhrman United States

12/27/2009 11:19:06 AM #


Hi, I need to read the files in the loop in the ascending order of time when the files were created. Is there a way to do it?

Deepak Denmark

1/22/2010 3:39:11 PM #


This example can be further extended with the MultipleFlatFiles Connector (right click Connector window) to process multiple input files in parallel!

Create a multiFlatFile source
Set the FlatFileDatasource to the Multipath Connection
then you need to copy and paste your Dataflow to create a second instance of it

run the package and you should see it processing both Dataflows! (may need to set concurrent threads etcs on the package)

ColinR United Kingdom

1/26/2010 3:09:54 PM #

Ronald Bijlhouwer


Interesting comment about the MultipleFlatFiles Connector.
However, I think a disadvantage of this is the missing possibility of skipping to the next file when an error occurs, like you can when using a Foreach Loop Container. Or am is mistaken?

Ronald Bijlhouwer Netherlands

2/11/2010 1:55:45 PM #

Koen Verbeeck


great article. Is it possible in some way to go to the next iteration of the loop? (like the 'continue' keyword in C#)
For example:
I loop over some files and I process their data in the data flow. In the data flow, I have a component that checks the validity of the data. If errors are found, the file is skipped and the foreach loop goes to the next file.

Koen Verbeeck Belgium

3/12/2010 3:54:55 PM #


This was very helpful.  What connection string is needed for SSIS to SQL Server 2000?  

I created a source OLDDB to SQL Server 2000, then have a foreach loop changing connection.

I was using:  
Provider=sqloledb;Data Source=SQLServerName;Integrated Security=SSPI;Initial Catalog=master

Then tried:
Data Source= SQLServerName;Initial Catalog=master;Provider=SQLNCLI10.1;Integrated Security=SSPI;Auto Translate=False;


4/9/2010 11:56:40 AM #


This article is good, but as usual it did not work for me. I would like to loop through csv-files (all have the same structure) in a directory and insert the contents into a mssql table. I added a standard value for the file string variable so that I could create the mappings in the SSIS package. (The variable is used as the ConnectionString for the Flat file connection manager as shown above). Without the initial value I could not configure the mapping because it shows no file defined and the rest of the options are then deactivated. The loop works but the variable for the file name is not updated which results in the same file (standard value from variable) being imported 28 times (number of files in the dir). I don't understand why this example worked for everyone else. Or is my example differnt because of the mappings for the table? The variable is not readOnly. What do I need to change so that the File-Variable (i.e. "FileWeJustFound") is updated? Any help would be really appreciated....


P.S. I fantasize about punching everyone at Microsoft in the face....regularly. It feels so good to "say" that out loud.

LanceR Germany

4/22/2010 3:43:21 PM #



How I can specify the table target in there? I flat file config cannot find the selection of database and table to import ?

In normal import wizard there is a section where I can map file fields to table columns .


Luke Panama

6/18/2010 9:09:25 AM #


   this enough to know on iterating files by using foreachloop container


6/22/2010 10:54:05 PM #

Bob McC

When I use this task to loop through files defined as *.zip for example, it picks up files as if I defined my enumerator configuration with *.zip*

In other words it also picks files like bob.zipxxx, jim.zipaaa.  I ONLY want files that end in .zip  How can specify that?

Bob McC United States

7/1/2010 1:41:25 PM #


Excellent article, Allen. Thanks much for sharing it with the rest of us.

Shawn United States

9/1/2010 2:37:21 PM #

Ragan Martin

This article is excellent just what I needed.


Ragan Martin United States

2/2/2011 8:31:40 AM #



In the Property Expression Editor, i am not able view the connection string as mentioned above, can you guys help why it so



Bala Indonesia

2/21/2011 10:07:52 AM #

Darren Green

Bala, when you first open the Property Expression Editor there may be no properties shown. You will have to chose the one in the Property column, the cell of which is a drop-down.

Darren Green United Kingdom

3/8/2011 12:21:39 PM #


Excellent can you please provide me all transformations like this. I am eager to learn SSIS

Ravikiran India

3/10/2011 11:20:04 AM #


Excellent example. Now I am trying to import excel files to sql server db. I could setup data flow and have the data in sql table. Now, i need the excel file as a value in one of the column in the table.. how to do it?

roopa India

3/18/2011 6:00:15 AM #


Thank you for the wonderful example on the foreach loop container. In the above example, I would like to write the file names, start and end time of the data flow task to a log table. Can you please guide me how to achieve this?

Appreciate your help!!!!


Pal United States

4/14/2011 1:45:01 PM #


I'm using a container in SSIS 2008. It reads and imports multiple files .  If there is a problem while importing one file the package fails.  I'd like it to read the next file, and try to import that file.  Is there an easy way to do this?  Thanks.

bmo United States

4/16/2011 4:22:44 PM #


Thank you for the execlllent example and saved my a?? (Sorry too excited)

Appreciate your help!!!!


Jas United States

6/15/2011 2:19:02 PM #


Please can somebody explicitly attend to this?
I am using foreachloop to read 5 excel files. The method stated for the flatfiles never worked.I tried to use variable for the connectionString and/or the ExcelFilePath but I am getting this error after a lot of time-consuming troubleshooting.Please  can an expert clearly take me through the steps?
You can mail an example to me":current@gmail.com

The error

TITLE: Microsoft Visual Studio

Error at ForeachLoopExcel [Connection manager "Excel Connection Manager"]: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available.  Source: "Microsoft JET Database Engine"  Hresult: 0x80004005  Description: "The Microsoft Jet database engine cannot open the file 'C:\Documents and Settings\MyDocument\My Documents\Integration Services Script Component\Projects'.  It is already opened exclusively by another user, or you need permission to view its data.".

Error at Data Flow Task [Excel Source [1]]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009.  There may be error messages posted before this with more information on why the AcquireConnection method call failed.


Exception from HRESULT: 0xC020801C (Microsoft.SqlServer.DTSPipelineWrap)



Current United Kingdom

3/28/2012 1:30:35 PM #


Excellent document. In addition to this I would like to know how to extract the file name that has been read recently.
Any information on this is really helpful

Goutam India

6/24/2012 10:18:41 PM #

Brian Foroud

But what about loading into differnt tables from each of these DYNAMICALLY? In the other words, what about the data flow task in your initial screenshot?

Brian Foroud United States
