To illustrate this, let us walk through an example scenario at a fictitious company. Consider the following initial situation: a server (Windows Server 2016) runs SQL Server 2016, both in the English language version. The same applies to all other components, such as Visual Studio 2017, which is used to manage the SSIS packages and the SSAS databases.
Now the regional settings of the server (date format, decimal and thousands separators, etc.) are changed to German. Several SSIS packages import data from an AS400 using a data flow task, and this now produces various error messages:
- DTS_E_PRIMEOUTPUTFAILED with error 0xC0202029
- DTS_E_INDUCEDTRANSFORMFAILUREONERROR with error 0x0209072
- The value could not be converted because of a potential loss of data.
In Visual Studio, everything ran without a hitch. The data from the AS400 is retrieved via OLE DB (IBM DB2 for i IBMDA400 OLE DB Provider); the errors occurred only when the package was executed by the SQL Server Agent. The next step is to look at the data type of the field named in the error message. In this example it is numeric(18,2), and the data type of the target column is also numeric(18,2). This raises the following questions:
- Why can SQL Server read this data type correctly in Visual Studio but not under the SQL Server Agent?
- Why does this only affect fields with decimal places?
We asked our in-house database expert how he would proceed in this situation. He ran a test for us.
"First of all, I would create a test SSIS package for more complex troubleshooting and take a closer look at the source component via the advanced editor. It offers various options for making column-by-column adjustments. As an example, I will use the DWMENG column, which also appears in the error message. Note that all numeric and decimal columns with decimal places must be adjusted.
"It is just as important to make sure that the precision and scale stay the same after the conversion. If you then start a test run, the job completes without errors. However, you notice that the numbers were transferred without the decimal places being taken into account, so the data is incorrect, even though it was displayed correctly in the preview.
"Next, the regional settings are reset from German to English and the test is repeated without changing anything in the SSIS package. As expected, the numbers are now correct. So the package is reopened in Visual Studio under the English regional settings, saved and republished. After a new test run, everything is displayed correctly. We can therefore conclude that the AS400 interface and SQL Server can no longer handle the decimal places reliably once the regional settings are changed."
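The root cause described above can be illustrated outside of SSIS. The following sketch (a simplified illustration, not SSIS code; the function name and sample value are made up) shows how the same numeric string yields different values depending on which decimal separator the consumer assumes:

```python
from decimal import Decimal

def parse_numeric(raw: str, decimal_sep: str) -> Decimal:
    """Parse a numeric string under a given regional decimal-separator convention."""
    if decimal_sep == ",":
        # German convention: '.' is a thousands separator, ',' is the decimal point
        raw = raw.replace(".", "").replace(",", ".")
    return Decimal(raw)

value = "1234.56"  # numeric(18,2) value as delivered with a period as decimal point

print(parse_numeric(value, "."))  # English settings: 1234.56
print(parse_numeric(value, ","))  # German settings: period read as thousands separator, giving 123456
```

This is exactly the symptom from the test run: the magnitude changes by a factor of one hundred because the decimal places are silently dropped, even though no conversion error is raised.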
There are two possible solutions:
If you still want to benefit from the speed of the OLE DB interface, the output column has to be marked as "string [DT_STR]" in the advanced settings.
In the query against the source table, all converted fields must then be adapted to the decimal convention of the target with the REPLACE command: REPLACE(COLUMN, ',', '.') AS Column.
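The effect of that REPLACE step can be sketched as follows (an illustration in Python rather than SQL; the function name is made up):

```python
from decimal import Decimal

def to_decimal(raw: str) -> Decimal:
    """Mimic REPLACE(COLUMN, ',', '.') followed by a cast back to numeric(18,2)."""
    return Decimal(raw.replace(",", "."))

# A German-formatted value read as DT_STR becomes a correct numeric again:
print(to_decimal("1234,56"))  # 1234.56
```

Because the column now travels through the pipeline as a string, the regional settings of the executing account no longer influence how the decimal point is interpreted; the conversion happens explicitly and in one place.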
Alternatively, e.g. for a small number of records, the ADO.NET interface can be used instead of the OLE DB connector. It is similar in structure to the OLE DB interface, but it can only handle nvarchar fields, so a "Data Conversion" task has to be added, as only varchar fields have been used there so far.
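The extra step the ADO.NET route requires can be sketched as follows (a simplified illustration, not SSIS code; the function name and the assumption of code page 1252 are made up): the provider delivers Unicode strings (nvarchar, DT_WSTR in SSIS terms), while the existing pipeline expects single-byte strings (varchar, DT_STR), so each affected column must pass through a conversion step.

```python
def convert_nvarchar_to_varchar(value: str, codepage: str = "cp1252") -> bytes:
    """Mimic an SSIS Data Conversion step from DT_WSTR (nvarchar) to DT_STR (varchar):
    encode the Unicode string into a single-byte code page."""
    return value.encode(codepage)

# The numeric value still arrives as text and is re-encoded for the varchar pipeline:
print(convert_nvarchar_to_varchar("1234,56"))
```

For large data volumes this per-column conversion adds overhead, which is why this route is suggested above only for small record counts.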