Friday, February 7, 2025

SSIS: REPLACENULL does not support (DT_DBTIMESTAMP2,7)

 Using the "REPLACENULL" functionality frequently in the "Derived Column" component, the "Conditional Split" component, and other places in SSIS where formulas can be applied is common.

However, I recently encountered an issue with the "DT_DBTIMESTAMP2" data type.

The following formula produced an error:

REPLACENULL(TestDt, (DT_DBTIMESTAMP2,7)"1900-01-01 00:00:00.0000000")

Error: 0xC020902A at Test Transformation, Derived Column [2]: The "Derived Column" failed because truncation occurred, and the truncation row disposition on "Derived Column.Outputs[Derived Column Output].Columns[TestDt]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.

This error occurs because "REPLACENULL" returns a "DT_WSTR" data type. To make it work, we need to convert "DT_DBTIMESTAMP2" to "DT_WSTR" and then convert it back to "DT_DBTIMESTAMP2", like this:

(DT_DBTIMESTAMP2,7)REPLACENULL((DT_WSTR,30)TestDt,"1900-01-01 00:00:00")


Alternatively, a more elegant solution is to replace "REPLACENULL" with a conditional ("?:") expression, casting the replacement literal so both branches return "DT_DBTIMESTAMP2":

ISNULL(TestDt)?(DT_DBTIMESTAMP2,7)"1900-01-01 00:00:00":TestDt




Thursday, January 30, 2025

SSIS warning message about SSAS Cube processing: "Operation completed with XXX problems logged"

If you process SSAS cubes via SSIS packages, you might notice a vague warning like "Full Processing:Warning: Server: Operation completed with XXX problems logged."

Here is how you can see that message (if you have that problem):

1. You can open an execution report for your package and get something like this:

SSIS processing SSAS Cube Warning message


2. You can run a T-SQL script against your SSIS server:

SELECT TOP 100 message_time, message
FROM [SSISDB].[internal].[operation_messages]
WHERE message_type = 110 AND message_source_type = 40
   AND message LIKE '%Warning: Server: Operation completed with%problems logged.'
ORDER BY message_time DESC;
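
If you also want to see which execution logged the warning, you can join to the operations table. This is a sketch, assuming the standard SSISDB catalog schema ([internal].[operations] and its object_name column):

SELECT TOP 100 om.message_time, o.object_name, om.message
FROM [SSISDB].[internal].[operation_messages] AS om
INNER JOIN [SSISDB].[internal].[operations] AS o
   ON o.operation_id = om.operation_id
WHERE om.message_type = 110 AND om.message_source_type = 40
   AND om.message LIKE '%Warning: Server: Operation completed with%problems logged.'
ORDER BY om.message_time DESC;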

If you have the problem, you will see something like this:


The Problem.

1. The SSIS server does not provide any details on that warning, nor on any associated problem.
2. The SSAS server also does not report any problems associated with that cube processing.

Solution.

You can use "Extended Events" to capture these problems:

1. For that, you have to create an Extended Events session using the following script on your SSAS server:

<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ObjectDefinition>
    <Trace>
      <ID>SSAS_CubeProcessing_Stream</ID>
      <Name>SSAS_CubeProcessing_Stream</Name>
      <XEvent xmlns="http://schemas.microsoft.com/analysisservices/2011/engine/300/300">
        <event_session name="SSAS_CubeProcessing_Stream" dispatchLatency="0" maxEventSize="0" maxMemory="4096" memoryPartition="none" eventRetentionMode="AllowSingleEventLoss" trackCausality="true" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <event package="AS" name="ProgressReportError" />
          <target package="package0" name="event_stream" />
        </event_session>
      </XEvent>
    </Trace>
  </ObjectDefinition>
</Create>

2. Then you run "Watch Live Data" for that session

3. Run your SSIS Cube processing package and monitor the events.

As a result, you'll get something like this:


The problem description can be found in an event's details under the "TextData" field:


At the end, do not forget to delete your Extended Events monitoring session, as shown below.
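
A minimal sketch of the cleanup, assuming the session was created by the XMLA script above (the Delete command references the trace by its ID):

<Delete xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <TraceID>SSAS_CubeProcessing_Stream</TraceID>
  </Object>
</Delete>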

Monday, May 13, 2024

Caveats of using an Expression for SQL Script in SSRS.

Why: Most of the time, when you need flexibility in your SQL query, parameterization is enough. However, there are situations when you have to build a dynamic query. In my case, I used a SQL query within an expression to feed the exact same query to multiple data sources targeting different servers.

DataSet creation: I've created a simple dataset with a sample query:

-- That is a sample query
select top 10 * 
from sys.messages


After the dataset is created, we can add a Tablix to our report and test it:


Creation of an Expression

1. Create a new parameter called "SQLExpression"


2. Go to the "Default Values" tab select "Specify values" and choose "fx" box

3. Specify Expression value:
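
The default value was essentially the sample query wrapped into a string expression, something like this (a sketch; vbCrLf and "&" concatenation are standard VB constructs available in SSRS expressions):

="-- That is a sample query" & vbCrLf & "select top 10 * " & vbCrLf & "from sys.messages"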

Assign Expression to the DataSet: Return to the sample dataset and specify the newly created parameter as the source for the dataset expression.

Within the dataset expression, replace the query with the parameter:
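
Assuming the parameter is named "SQLExpression" as above, the dataset expression is simply a reference to it:

=Parameters!SQLExpression.Value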
Save the changes and try your report.
If you've done exactly as I did, you'll get an empty report.

Troubleshooting: If you made your parameter "Visible", you might notice that the report processor collapsed all 3 lines of our query into one. Because the first line starts with a single-line comment ("--"), the whole flattened query becomes one big comment, so the dataset returns nothing.


Other cases: My case was very simple, but in most cases you'll get a weird error message for your query, while the very same query runs fine in SSMS.

Lessons Learned: When we use SQL query expressions, we have to follow these rules:
1. Use only block comments ("/* ... */") or do not use comments at all.
2. Put at least one space or a tab before the first character on each line.
3. Use a semicolon to separate multiple SQL statements.
4. Do not use the "GO" command.

The Fix: Change the default value for the "SQLExpression" parameter to the following query:

/* That is the Fixed SQL Query
 refurbished for SSRS expression usage*/

 DECLARE @MessageID INT = 101;
 SELECT *
 FROM sys.messages
 WHERE message_id = @MessageID;

Save the parameter and re-run the report.

Enjoy the result of your SQL Query Expression:

Conclusion: Using SQL query expressions is very easy if you follow those 4 rules.

Please let me know if you hit any other situation, and I'll add it here for others to avoid.



Sunday, February 7, 2021

Monitoring Analysis Server (SSAS) using Spotlight (Quest Software)

This post just answers two simple questions:

1. Can Quest Software's Spotlight successfully monitor SQL Server Analysis Server?

2. If it can, which SSAS parameters and which database and cube details does it monitor and provide information about?


To answer these questions, I had to download, install, and configure the Spotlight software.

I've downloaded the trial version from the official Quest Software website:

https://www.quest.com/products/spotlight-on-sql-server-enterprise/


So, the answer to the first question is "Yes".

The newest version of the application, 13.3, monitors not only the Windows box and SQL Server, but Analysis Services as well!


After being fully configured on my local machine, it looks like this:


I've configured it to monitor my local SQL Server, the Windows box, and the Analysis Services server.
Since I'm not really interested in Windows or SQL Server, I click to see "Analysis Services".

It follows the common Quest Software design, so I do not have to learn how to use it.
The screen provides the following metric categories:
  • Connections
  • Memory
  • Storage
  • CPU
  • Server
When I click on a metric, a menu opens to show the details.
For instance, when I click on Connections, I get the following drop-down menu:

When I click on "Eye" sign I can see the list of current connections with all their parameters:

From there I can easily switch to the current "Sessions" tab

Or just to the list of the most recent commands. If I select a command, I can see the most recent query for a connection:

The Memory section lets me see general memory usage, caching metrics, and thread stats:

The Storage section provides information about my SSAS databases.
In this case I have only one, the AdventureWorks DB:

And from that screen I can easily switch to "Cubes"

The CPU section reports only general box-level processor metrics, such as CPU Utilization, the number of "Interrupts", "Kilobytes Transferred", and "Server Work Queues":

When I switched to "Processing" tab I could see other box metrics, such as Queue Length, Threads and Processes count, and Context Switching:

The "Server" section is least informative. It provides me only my server version and edition and status of Disk and Memory:


Besides all this semi-static data, real-time SSAS counters are supposed to be provided:
  • Queries Answered/s
  • Queries Requested/s
  • Queries From Cache Direct/s
  • Queries From File/s
  • Temp File Rows Written/s
  • Rows Read/s
  • Rows Written/s
  • Direct Hit Ratio
  • Current Latch waits
  • Current Lock waits
  • Query Pool Job Queue Length
  • Etc.

Unfortunately, I have only one database and couldn't generate enough activity to raise those numbers above zero.


Hope this info helps you answer questions similar to mine.

And stay tuned: I plan to review other SSAS monitoring tools and go beyond that.

Monday, July 27, 2020

SSRS. Use of Lookup type functions.

There are a lot of blog posts on the Internet about how to use Lookup and the other functions of this type in SSRS.
They are very straightforward and easy to use.
However, I managed to make a mistake, and I assume you have the same problem if you got to this page.

The Error

At first, here is the error I got when I tried to use the Lookup function in SSRS:
The expression used for the calculated field 'ProductName' includes an aggregate, RowNumber, RunningValue, Previous or lookup function. Aggregate, RowNumber, RunningValue, Previous and lookup functions cannot be used in calculated field expressions.

How did I get there?

First, I created two datasets, "WorkOrders" and "Products"

Then I wanted to include "Product Name" in my "Work Order Report"
Here is how I did it:

The WRONG WAY

(that is how it shouldn't be done)
Since I was using a function, I thought, obviously, that I had to use it in a calculated field

But That is Wrong

You should do it the right way:

The RIGHT WAY

It is not obvious, but you have to use a "Query Field" to make the Lookup function work:

Then you just have to specify the new field's name and insert a formula.
In my case I used the following:

=Lookup(Fields!ProductID.Value, Fields!ProductID.Value, Fields!Name.Value, "Products")
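
For reference, the documented signature of the function is:

Lookup(source_expression, destination_expression, result_expression, dataset)

In my formula, the first ProductID is evaluated in the current "WorkOrders" scope, the second is matched against the "Products" dataset, and Fields!Name.Value is returned from the matching row.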


That worked very well for me, and I hope you will struggle less after reading this.
Good Luck


Wednesday, February 12, 2020

Use of Uniqueidentifier in Persisted calculated column.

This post is for you in case you decide to use a UNIQUEIDENTIFIER column in your table and then think about including it in a persisted computed column.
You might also find it useful if you like weird or funny SQL Server behavior.

First, here is the SQL Server version I hit that problem on (after a while, Microsoft might fix the bug):
Microsoft SQL Server 2016 (SP2-CU11) (KB4527378) - 13.0.5598.27 (X64)

To demonstrate the problem in the simplest way, you can run the following script:
DROP TABLE IF EXISTS #tbl_Error_Test;
GO
CREATE TABLE #tbl_Error_Test(
       GuidOne UNIQUEIDENTIFIER,
       GuidTwo as CAST(GuidOne as CHAR(36)) PERSISTED
  );
GO
INSERT INTO #tbl_Error_Test(GuidOne) VALUES (NewID()), (NewID());
GO
SELECT * FROM #tbl_Error_Test;

GO

It will return something like this:

As you can see, columns GuidOne and GuidTwo are different.
Moreover, if you run the following command, you will get a very unpleasant error:
DBCC CHECKTABLE('#tbl_Error_Test') WITH NO_INFOMSGS, EXTENDED_LOGICAL_CHECKS;

Msg 2537, Level 16, State 106, Line 152
Table error: object ID -1485216446, index ID 0, partition ID 6052840780930088960, alloc unit ID 2738196559872000000 (type In-row data), page (4:1079768), row 0. The record check (valid computed column) failed. The values are 2 and 0.
Msg 2537, Level 16, State 106, Line 152
Table error: object ID -1485216446, index ID 0, partition ID 6052840780930088960, alloc unit ID 2738196559872000000 (type In-row data), page (4:1079768), row 1. The record check (valid computed column) failed. The values are 2 and 0.
CHECKTABLE found 0 allocation errors and 2 consistency errors in table '#tbl_Error_Test_____________________________________________________________________________________________________000000001926' (object ID -1485216446).

repair_allow_data_loss is the minimum repair level for the errors found by DBCC CHECKTABLE (tempdb.dbo.#tbl_Error_Test_____________________________________________________________________________________________________000000001926).

If you try to check what the correct value should be, it won't do any good:
SELECT GuidOne, GuidThree = CAST(GuidOne as CHAR(36))
FROM #tbl_Error_Test;


SQL Server is smart enough to recognize the same expression: instead of recalculating the value, it returns the value already stored in the table.

The only trick that works is to change the expression and convert the GUID to 37 characters instead of 36:
SELECT GuidOne, GuidFour = CAST(GuidOne as CHAR(37))
FROM #tbl_Error_Test;

Now you get the right result, but how do you fix the data in the table?

It is not easy, but possible. You just have to overwrite the GUID column with itself through a self-join:
UPDATE G1 SET GuidOne = G2.GuidOne
FROM #tbl_Error_Test as G1
INNER JOIN #tbl_Error_Test as G2
       ON G1.GuidOne = G2.GuidOne;
GO
SELECT * FROM #tbl_Error_Test;

GO
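
To verify, re-run the same DBCC check from above; it should now come back clean:

DBCC CHECKTABLE('#tbl_Error_Test') WITH NO_INFOMSGS, EXTENDED_LOGICAL_CHECKS;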


It is fixed now, and it also clears the DBCC CHECKTABLE errors.

If you see the same behavior on your SQL Server, you can vote for that bug to be fixed at the Microsoft site:
https://feedback.azure.com/forums/908035-sql-server/suggestions/39694663-use-of-guid-column-in-persisted-calculated-column

Friday, December 20, 2019

Recovering from Severe Database corruption error 9100

In my previous post I outlined a very dangerous SQL Server problem caused by the usually harmless functions "PARSE" and "TRY_PARSE": storing a "NaN" value in a REAL or FLOAT column produces a severe error.

This post is about the major issue it causes and how to fight it.

We'll start by generating the problem.

Attention: Do not run this in production!!!
Here is a script to generate the issue:
DROP TABLE IF EXISTS tbl_Real_Test;
GO
CREATE TABLE tbl_Real_Test (ID UNIQUEIDENTIFIER, R REAL, F FLOAT);
GO
INSERT INTO tbl_Real_Test (ID, R, F)
SELECT NewID(), TRY_PARSE(r as REAL), TRY_PARSE(f as FLOAT) 
FROM (VALUES
       ('0','434534776544')
       ,('0.0000323', 'NaN')
       ,('3.3881317890172E+17','')
       ,('','000000000000x000000000000')
       ,('NULL','ABCDEFG')
       ,('Null','--------------------')
       ,('null','!!!!!!!!!!!!!!!!!!!!')
       ,('ABC','345435467655665676545')
       ,('NaN','5.3881317890172E+47')
       ,('Nan','0.0000000000345')
       ,('nan','434.34543543')
) as R(r,f);
GO
SELECT * FROM tbl_Real_Test
WHERE R Is not Null ;

GO

That script will generate the following error:
Msg 9100, Level 23, State 2, Line 57
Possible index corruption detected. Run DBCC CHECKDB.

Troubleshooting:

You might try to run "DBCC CHECKDB" against that database like this:
DBCC CHECKDB('Slava_Temp_Test');

DBCC CHECKDB reported a couple of errors and, besides that, wrote an error dump into the ERRORLOG file and created four other dump files:
Msg 2570, Level 16, State 3, Line 62
Page (1:16200), slot 1 in object ID 2002106173, index ID 0, partition ID 72057594042843136, alloc unit ID 72057594048479232 (type "In-row data"). Column "F" value is out of range for data type "float".  Update column to a legal value.
Msg 2570, Level 16, State 3, Line 62
Page (1:16200), slot 8 in object ID 2002106173, index ID 0, partition ID 72057594042843136, alloc unit ID 72057594048479232 (type "In-row data"). Column "R" value is out of range for data type "real".  Update column to a legal value.
DBCC results for 'tbl_Real_Test'.
There are 9 rows in 1 pages for object "tbl_Real_Test".

CHECKDB found 0 allocation errors and 2 consistency errors in table 'tbl_Real_Test' (object ID 2002106173).

Fixing attempt:

To fix the issue, you might try the following commands:
DBCC CHECKDB ('Slava_Temp_Test' , REPAIR_REBUILD) WITH ALL_ERRORMSGS;
GO
DBCC CHECKDB ('Slava_Temp_Test' , REPAIR_ALLOW_DATA_LOSS) WITH ALL_ERRORMSGS;

GO

However, in order to perform the fix, DBCC CHECKDB requires the database to be in single-user mode:
Msg 7919, Level 16, State 3, Line 141
Repair statement not processed. Database needs to be in single user mode.

The problem is that it might not be possible to switch the database into single-user mode, because it is part of an Availability Group and is experiencing very heavy user activity. Using the conventional SQL Server tools would disrupt Production for who knows how long.

Solution:

To solve the problem, we do not have to switch the database into single-user mode or restore it from a previous backup. You just run an update script over the problematic table and columns to replace the bogus values, which SQL Server can't interpret correctly, with NULLs:
UPDATE tbl_Real_Test
SET R = CASE CAST(R as BINARY(4))
       WHEN CAST( PARSE('NaN' as REAL) as BINARY(4))
       THEN Null Else R END
       , F = CASE CAST(F as BINARY(8))
       WHEN CAST( PARSE('NaN' as FLOAT) as BINARY(8))
       THEN Null Else F END;
GO
SELECT * FROM tbl_Real_Test
WHERE R Is not Null OR F Is not Null;

GO

The result of this query should be like this:
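
Why compare BINARY casts instead of the REAL and FLOAT values directly? Comparing R to PARSE('NaN' as REAL) would force SQL Server to evaluate NaN as a floating-point value, which is exactly the operation it cannot handle here; comparing the raw byte patterns sidesteps that. A minimal sketch to inspect those patterns (the commented values are the IEEE 754 quiet-NaN encodings I'd expect PARSE to produce; treat them as an assumption and verify on your build):

SELECT CAST(PARSE('NaN' as REAL)  as BINARY(4)) as NaN_Real,   -- typically 0xFFC00000
       CAST(PARSE('NaN' as FLOAT) as BINARY(8)) as NaN_Float;  -- typically 0xFFF8000000000000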

Disclaimer and Advertisement:

There are probably a lot of other cases that might cause "Database corruption error 9100", and if this particular solution does not apply to your environment, you can contact me for a resolution.