Friday, December 20, 2013

Writing Advanced Snippets When Building Templates

Both Clarity and TM1 support the MultiDimensional Expressions (MDX) query language through snippets. Snippets are single MDX phrases that can be evaluated as part of a larger MDX query. MDX can seem intimidating at first glance, but starting with a few key functions makes it much easier to put together advanced snippets.

Let's go through an example. Let's say we have the following cube.

Our dimensions are:
  • Year
  • Time
  • Entity
  • Account
  • Measures
When building a template in ClarityStudio, I can select Member, Children, IChildren, Descendants, IDescendants and BottomLevelDescendants by right-clicking on an element in a dimension hierarchy. Let's start by explaining how these map to MDX expressions, using the Full Year member from the Time dimension.

Clarity Function: Member
Description: Show just the one element I picked from the list
MDX Snippet: [Time].[Full Year]
Members Included: Full Year

Clarity Function: Children
Description: Show just the elements directly below the one I selected
MDX Snippet: [Time].[Full Year].Children
Members Included: Q1, Q2, Q3, Q4

Clarity Function: IChildren
Description: Show the elements directly below the one I selected, plus the one I selected
MDX Snippet: {[Time].[Full Year].Children, [Time].[Full Year]}
Members Included: Q1, Q2, Q3, Q4, Full Year

Clarity Function: Descendants
Description: Show all elements anywhere below the one I selected, regardless of distance
MDX Snippet: Descendants([Time].[Full Year])
Members Included: Q1, Q2, Q3, Q4, Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, Dec

Clarity Function: IDescendants
Description: Show all elements anywhere below the one I selected, regardless of distance, including the one I selected
MDX Snippet: {Descendants([Time].[Full Year]), [Time].[Full Year]}
Members Included: Q1, Q2, Q3, Q4, Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, Dec, Full Year

Clarity Function: Bottom Level Descendants
Description: Show only the elements below the one I selected that have no children of their own. These are also known as leaf elements.
MDX Snippet: Descendants([Time].[Full Year],,LEAVES)
Members Included: Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, Dec

Now that you've seen how the functions you already know map to MDX expressions, you can experiment with other MDX functions to build exactly the sets you want for your template.
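For example, standard MDX set functions such as Except() can be combined with the snippets above. This is an illustrative snippet of my own, not one of the built-in right-click options; it removes the quarters from the set of descendants, leaving just the months:

```mdx
Except(
  Descendants([Time].[Full Year]),
  [Time].[Full Year].Children
)
```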





Thursday, November 28, 2013

Automated Regression Testing for Clarity and TM1

  One of the best things about IBM Clarity 7 and IBM Cognos TM1 is the flexibility they give you to build templates fast. This quick development time introduces a natural challenge... did we break something? Here are some examples of simple changes that break old code.

Common Challenges That Break Old Code

  • Adding new children under a node that was previously a leaf
  • Changing network share names
  • Changing server names
  • Changing paths (of anything)
  • Renaming (of anything)
  • Changing scope statements
  • Changing stored procedure calculations 

Change is Good

  One solution is to never change the system. That is equivalent to shelving the product and going back to Excel. Given that option is out the door, we have to assume change WILL happen. In fact, we want to encourage change. We want the system to be the most valuable representation of the business for decision makers.

Trust

  The system should be the trusted, single source of the truth. Numbers must be consistent, reliable, punctual and accessible. Once trust in the numbers is broken, it is hard to rebuild that trust. Fear of losing this trust hinders many organizations from making any changes at all. This latency pushes users to go back to Excel and find other ways to get the job done under time pressure. Instead of helping information workers, the system sits idle. We need a way to keep the trust of users, while still improving and evolving the system to keep up with business realities. This solution must also be transparent. If anyone wishes to question the numbers, they should be able to do so. This helps them build confidence that they understand exactly how numbers flow and calculate across the system. If they are concerned something is wrong, they are empowered to investigate and identify issues themselves, without reliance on IT.

Excel is the Common Denominator

  Whether files are created manually, with TM1, Clarity or something else, Excel is the common denominator. Excel is the de facto business standard for analysis. Once we accept this, we can steer away from proprietary comparisons of obscure file formats. To build trust, we need to be transparent to users down to the Excel level.

Generating Excel Files

  Any reporting tool can automatically export Excel files. There is always a way to export data to Excel, and there is usually a tool to automate generation of a set of Excel files. Even if this isn't available through a nice tool, there are lots of VBA code snippets to do the trick. Here is one on bulk reporting in TM1 that is quite good. So let's assume we can now automatically generate 50 Excel files with the click of a button.

Comparison

  After a full round of testing, we can take a copy of these generated files and call them our Gold Standard. From that point on, each night we generate the same set of reports and compare them to the Gold Standard version. If there is any difference between the files, we get an alert. This simple alert allows us to proactively notify users about a data challenge while we investigate. Instead of the users telling the team where the errors are, the team can tell users where the errors are. At first, this may sound like a bad thing, but it's not. Users get transparency into the deep level of regression testing that happens on a nightly basis. Over time, this transparency and notification build trust in the system. Users will be more comfortable using the data because they know we are watching our data quality very carefully. Users can invest more time doing analysis, rather than testing and data validation.

High Level Design

  Here is a quick diagram of the high level design for this process. First we build a list of the pairs of Excel files we want to compare each night. This could be as simple as comparing one directory to another. For each pair, we compare the Gold Standard Version to the one generated last night and capture the results. When reading results, if we encounter differences, we can decide what type of action to take or notification to send.


Next Steps

So, how do we build this? First, you can ask us for help (http://www.clearinsight.ca, @clearinsightfpm or info@clearinsight.ca), or follow these steps:
  • Build or buy an Excel comparison tool. I recommend Diff Doc.
  • Build or buy a method to automatically generate sets of Excel files periodically from your reporting tool. I recommend using the native scheduling tool in your reporting tool if you have one, or some VBA code for bulk reporting.
  • Build a process to compare these files and take appropriate action or send appropriate notification. I recommend using an SSIS package. Here is a link on how to start.
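To make the comparison step concrete, here is a minimal Python sketch of the idea (a simplified stand-in for a tool like Diff Doc, with made-up folder names) that compares a Gold Standard folder to last night's output by file contents:

```python
import hashlib
from pathlib import Path

def file_digest(path):
    """SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def compare_report_sets(gold_dir, nightly_dir):
    """Compare each Gold Standard report to its nightly counterpart.
    Returns a list of (filename, reason) pairs for anything that differs."""
    differences = []
    for gold_file in sorted(Path(gold_dir).glob("*.xlsx")):
        nightly_file = Path(nightly_dir) / gold_file.name
        if not nightly_file.exists():
            differences.append((gold_file.name, "missing from nightly run"))
        elif file_digest(gold_file) != file_digest(nightly_file):
            differences.append((gold_file.name, "contents differ"))
    return differences
```

Note that .xlsx files embed metadata such as save timestamps, so a raw byte comparison can flag files whose numbers are actually identical; a cell-level comparison tool such as Diff Doc avoids those false positives.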





Tuesday, November 19, 2013

What is Hierarchy Security in IBM Clarity?

In IBM Clarity, hierarchy security is the security layer that controls who can see which dimension elements in a particular cube. Hierarchy security is just one component of the security profile. Here are some key definitions we need before discussing this further.


  • Users - These are the individual users or logins for the Clarity application
  • Roles - This is a group of users. A user can be a member of multiple roles. Roles can be members of other roles.
  • Menu Items - The folders and items that users see when they log in. These can be very different from one user to the next, because of security.
  • Hierarchy Security - Controls which options are visible and/or selectable for a user when running a report or template.
  • Workflow - The sequential process of who enters numbers and who approves them for each entity.

What is Hierarchy Security and why is it important?
Let's use an example. Say we have a sales forecasting tool and each sales manager goes in to update their forecasts. Should all sales managers see everyone's information? Should they be able to update everyone's information? The most common scenario is that each sales manager can only see or update their own numbers. Regional managers can see a broader area of the business, but they may not be able to update the numbers. Hierarchy security lets administrators set these boundaries. They only need to build one template, but since each person has different security, users are still limited to only what they should be able to see.
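As a rough illustration of the idea (this is a toy model, not Clarity's internal implementation; the entity names are made up), hierarchy security amounts to granting a user one element and letting them see everything that rolls up underneath it:

```python
# Toy entity hierarchy (child -> parent); purely illustrative
PARENT = {
    "West": "All Regions",
    "East": "All Regions",
    "Seattle": "West",
    "Portland": "West",
    "Boston": "East",
}

def visible_members(granted_root):
    """Return the element granted to the user plus everything below it."""
    visible = {granted_root}
    added = True
    while added:
        added = False
        for child, parent in PARENT.items():
            if parent in visible and child not in visible:
                visible.add(child)
                added = True
    return visible
```

A sales manager granted "West" sees only West and its cities, while a regional manager granted "All Regions" sees the whole tree, all from the same template.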

What if I Don't Set It Up?
Setting up hierarchy security is not required for Clarity to work. Clarity will work fine without it, but there will be no ability to filter options depending on the particular user running the template or report.

When Should I Use It?
The most challenging part of security is getting the business teams to decide on security groups and permissions. Once those permissions are decided upon, implementing it is quick. To facilitate the security discussion, I often use the attached Excel file. Using just Excel, we start with a list of all users and then discover roles, menu security settings and hierarchy security settings. Once documented, setting it up is easy.

Sending Email from IBM Cognos TM1 using Windows Powershell

Once you start building a TM1 solution, you'll quickly notice a lack of email components or tools... That's because there aren't any out of the box. Don't sweat it though. There are several good sites that describe how to use a VBscript file to send out an email. VBscript is fine and works well, but it's a very old technology. I think the replacement for VBscript is Microsoft Windows Powershell. It's already installed on your Windows server. It just needs a few commands to allow it to start accepting script requests. Below are some instructions on how to send email from TM1 using Powershell.

Why Reinvent the Wheel?

So, if there are already good VBscript scripts that send email, why would I try to reinvent the wheel by doing the same thing a different way? Powershell can do much more than just send email. The script below does the following:
  • Attaches the error log files and the .cfg file
  • Reads the most recent error file and puts its contents directly into the body of the email
  • Shows some basic server environment variables
  • Includes the most recent lines of the TM1server.log file
  • Creates hyperlinks to the log folder, cfg folder and data folder
I encourage everyone to start with something like this and extend it, so we can build the most meaningful email feedback tool for TM1 possible. Please comment here or tweet updates to @bobgillvan.


Steps to Setup TM1 Email Using Powershell

1) Turn on Powershell
Start > All Programs > Accessories > Windows Powershell > Windows Powershell (ISE)
set-executionpolicy Unrestricted
F5 (This just runs the code in the code window)
 
2) Customize and Test the Script
Here is the script. Just save this to a .ps1 file and then you can run it with Powershell. I recommend calling it email.ps1; then you can run powershell ./email.ps1. You'll also need to enter your SMTP server information and any secure port references.

function sendMail{

     Write-Host "Sending Email"

     $pathLog = "C:\TM1Logs"
     $pathCfg = "C:\TM1Data"
     $pathTemp = "C:\Temp"

     #Wait to make sure all files from the TI process are written to disk. Sometimes there is a delay.
     #NOTE: this path must match the folder the TI writes the TempEmailFile* files to.
     $i=0
     $filesWritten = $False
     do {
          $filesWritten = test-path C:\temp\TempEmailFile*
          Start-Sleep -Milliseconds 100  # brief pause so the loop doesn't spin the CPU
          $i = $i+1
        }
     while ((-not $filesWritten) -and ($i -lt 100))
    
     $fileToEmail = Get-Content C:\temp\TempEmailFileToEmail.txt
     $fileToName = Get-Content C:\temp\TempEmailFileToName.txt
     $fileFromEmail = Get-Content C:\temp\TempEmailFileFromEmail.txt
     $fileFromName = Get-Content C:\temp\TempEmailFileFromName.txt
     $fileSubject = Get-Content C:\temp\TempEmailFileSubject.txt
     $fileBody = Get-Content C:\temp\TempEmailFileBody.txt
     $fileLinkApplicationLogin = Get-Content C:\temp\TempEmailFileLinkApplicationLogin.txt
     $fileLinkDataFolder = Get-Content C:\temp\TempEmailFileLinkDataFolder.txt
     $fileLinkErrorFolder = Get-Content C:\temp\TempEmailFileLinkErrorFiles.txt


     # Remove doublequotes
     $fileToEmail = $fileToEmail.Replace("""","")
     $fileToName = $fileToName.Replace("""","")
     $fileFromEmail = $fileFromEmail.Replace("""","")
     $fileFromName = $fileFromName.Replace("""","")
     $fileSubject = $fileSubject.Replace("""","")
     $fileBody = $fileBody.Replace("""","")
     $fileLinkApplicationLogin = $fileLinkApplicationLogin.Replace("""","")
     $fileLinkDataFolder = $fileLinkDataFolder.Replace("""","")
     $fileLinkErrorFolder = $fileLinkErrorFolder.Replace("""","")


#     $fileToEmail
#     $fileToName
#     $fileFromEmail
#     $fileFromName
#     $fileToEmail
#     $fileToName
#     $fileSubject
#     $fileBody
#     $fileLinkApplicationLogin
#     $fileLinkDataFolder
#     $fileLinkErrorFolder

     #SMTP server name
     $smtpServer = "smtp.gmail.com"


     #Creating a Mail object
     $msg = new-object Net.Mail.MailMessage

     #Creating SMTP server object
     $smtp = new-object Net.Mail.SmtpClient("smtp.gmail.com",587)
#     $smtp = new-object Net.Mail.SmtpClient("smtp.secureserver.net",465)

     $smtp.EnableSsl = $true
     $smtp.Credentials = New-Object System.Net.NetworkCredential("gmailuser@gmail.com", "securegmailapppassword");
#     $smtp.Credentials = New-Object System.Net.NetworkCredential("username", "password");


     #Email structure
     $msg.From = "TM1Administrator@somewhere.org"
     $msg.ReplyTo = "TM1Administrator@somewhere.org"
     $msg.To.Add($fileToEmail)
#     $msg.To.Add("user@somewhere.org")
     $msg.subject = $fileSubject

     #Find most recent error file
#     Get-ChildItem $pathLog | where {$_.Name -like "TM1ProcessError*.log"} | sort LastWriteTime | select -last 1 | Write-Host
     $fileLastErrorText = Get-ChildItem $pathLog | where {$_.Name -like "TM1ProcessError*.log"} | sort LastWriteTime | select -last 1 | Get-Content

     #Attach Error Files
     Get-ChildItem $pathLog| % { if ($_.Name -like "TM1ProcessError*.log") {$msg.Attachments.Add($_.FullName)} }
#     Get-ChildItem $pathLog| % { if ($_.Name -like "TM1ProcessError*.log") {Write-Host $_.FullName} }

     #Attach TM1Server Log
     Get-ChildItem $pathLog | % { if ($_.Name -like "TM1server.log") {$msg.Attachments.Add($_.FullName)} }
#     Get-ChildItem $pathLog | % { if ($_.Name -like "TM1server.log" -and $_Size -lt 1) {Write-Host $_.FullName} }

     #Attach tm1s.cfg
     Get-ChildItem $pathCfg | % { if ($_.Name -like "TM1s.cfg") {$msg.Attachments.Add($_.FullName)} }

     #Show system links
     $msg.body = $fileBody + "`n`n" + "System links:`n`tLogin`t`t" + $fileLinkApplicationLogin + "`n"
     $msg.body = $msg.body + "`tError Folder`t" + $fileLinkErrorFolder + "`n"
     $msg.body = $msg.body + "`tData Folder`t" + $fileLinkDataFolder + "`n"


      #Show details of last error
     if ($fileLastErrorText -ne "") {
         $msg.body = $msg.body + "`n`n" + "Contents of most recent error file:`n" + $fileLastErrorText
     }
    
     #Show the most recent lines of the tm1server.log file (walk the array backwards from the last line)
     $fileTM1serverLogFile = Get-Content ($pathLog + "\tm1server.log")
     $fileTM1serverLog = ""
     $y=0
     foreach( $x in ($fileTM1serverLogFile.Length-1)..0) {
        $y += 1
        if ($y -lt 20){
            $fileTM1serverLog += "`t" + $fileTM1serverLogFile[$x] + "`n"
            }
        }

     $msg.body = $msg.body + "`n`n" + "Most recent lines of the TM1Server.log file:" + $fileTM1serverLog

     #Capture Server Variables
     $msg.body = $msg.body + "`n`nServer Information:`n"
     $msg.body = $msg.body + "`tServer Name:`t`t`t" + $env:COMPUTERNAME + "`n"
     $msg.body = $msg.body + "`tDNS:`t`t`t`t" + [System.Net.DNS]::GetHostName() + "`n"

     $systemMemoryFree = "{0:N0}" -f (get-counter -counter "\Memory\Available MBytes").CounterSamples[0].CookedValue
     $msg.body = $msg.body + "`tFree Memory:`t`t`t" + $systemMemoryFree + " MB`n"


#     $msg.body

     #Save a text file with the body for debugging
#     $msg.body | out-file c:\temp\TempEmailFile_FinishedEmail.txt

     #Sending email
     $smtp.Send($msg)
 
     #Clear temp files
     remove-item c:\temp\TempEmailFile* -recurse
 
}

#Calling function
sendMail





3) Call it from a TI
This part isn't very exciting. The script reads its inputs from the temp files written below, so all we need to do is write those files, build a command line string and then send it to ExecuteCommand(). Here is code for the prologue tab that gives you a structure to work with.


vsLinkErrorFiles = '\\server\e$\tm1logs\cxmd';
vsLinkDataFolder = '\\server\F$\tm1data';
vsLinkApplicationLogin = 'http://server/tm1web/TM1WebLogin.aspx';

# Get the To for this environment, if necessary
If (psTo @='');
  psTo = 'user@somewhere.org';
EndIf;

# Get the CC for this environment, if necessary
If (psCC @='');
  psCC = 'user@somewhere.org';
EndIf;

# Get the From for this environment, if necessary
If (psFrom @='');
  psFrom = 'admin@somewhere.org';
EndIf;

# Write the contents of the email to a file for use by the script

  vsTempFile = 'f:\Temp\TempEmailFile';

  #Subject
  ASCIIOUTPUT(vsTempFile | 'Subject.txt', psSubject);

  #ToEmail
  ASCIIOUTPUT(vsTempFile | 'ToEmail.txt', psTo);

  #ToName
  ASCIIOUTPUT(vsTempFile | 'ToName.txt', 'Users');

  #FromEmail
  ASCIIOUTPUT(vsTempFile | 'FromEmail.txt', psFrom);

  #FromName
  ASCIIOUTPUT(vsTempFile | 'FromName.txt', 'TM1 Administrator');

  #Body
  ASCIIOUTPUT(vsTempFile | 'Body.txt', psBody);

  #ErrorFiles
  ASCIIOUTPUT(vsTempFile | 'LinkErrorFiles.txt', vsLinkErrorFiles);

  #DataFolder
  ASCIIOUTPUT(vsTempFile | 'LinkDataFolder.txt', vsLinkDataFolder);

  #LinkApplicationLogin
  ASCIIOUTPUT(vsTempFile | 'LinkApplicationLogin.txt', vsLinkApplicationLogin);


vCommand = 'powershell.exe f:\tm1data\cxmd\email.ps1 ';

ASCIIOUTPUT ('f:\temp\email_asciiOutput.txt', 'Send Email is running command ' | vCommand);
ExecuteCommand( vCommand,0);





Monday, November 18, 2013

Suppressing the OnSave Warning

When saving from a template, you may see the following warning message:



This program is attempting to save ....xml to your hard drive.
Are you sure you want to save ....xml to your hard drive?


  While in development, this is slightly annoying. The simple workaround is to just click 'Yes' and move on. As we roll templates out into production, this becomes REALLY annoying for users. They often click 'No', causing an error, or they feel nervous that the template is broken.

   There are a few simple things we can do to avoid this warning.

1. Check Client Browser Settings
In Internet Explorer, the following security option should be set to Enabled:


        Initialize and script ActiveX controls not marked as safe for scripting


2. Set Exporting Method to Indirect in Each .Template File
When looking at the XML code for a template, the <Features> container has an <Exporting> element with a Method property. Setting the Method property to "Indirect" also suppresses the save warning.

    <Features>
     <Exporting Method="Indirect"></Exporting>
     ...
    </Features>
  

Once these items are checked, users should not see the warning anymore. When they click the Save button in Clarity, the next dialog they see should be the Save Successful message. For a list of other client browser settings, please check section 7.5.2 of the Clarity Install guide for your version of Clarity. Here is a link to the documentation section on the IBM website.

Wednesday, October 16, 2013

Super Fast Saves with Relational Writeback for IBM Clarity

When Is It Useful?

Clarity is a budgeting and planning tool. As a budget deadline approaches, system usage skyrockets. Everyone has left their budget entry until the last possible moment. For that one day, when everyone is furiously trying to finish updating and approving their budgets, some systems grind to a halt. Heavy save activity on a Clarity system can result in locking, where one user must wait for another user to finish saving before their own changes can be saved. If there are only a few users on the system, it's not a big deal. When you have 20 people in the system all at the same time, and seconds seem like hours, this can be very frustrating and costly. Relational writeback options address this by optimizing the way data is saved back to the cube.

How Do I Do It?

There are some configuration steps to complete before using relational writeback. A special writeback stored procedure is required too. This procedure is tailored to each cube to save the correct data back to the correct place in the fastest way. At the bottom is a sample writeback stored procedure for SQL 2008.

What Are The Different Writeback Types?

Writeback is a common term for the process of sending updates back to a cube. If a value in the cube is currently 100 and we want to update it to 150, only the change (or delta) of 50 is sent back. We already have a row of data for the 100. We add a new row for 50, and that gives us the correct value of 150. Below is a list showing the various writeback methods. For anyone still using Clarity 6.x, there is a relational writeback for that version too, but it is not as reliable or fast as the one in Clarity 7.x.
  • 0 = Native Analysis Services Writeback: This lets SQL Server Analysis Services do all the work. It doesn't require any additional stored procedures, but is the slowest option.
  • 1 = Clarity custom write-back - cell by cell method: This method inserts the deltas to a relational table directly, one at a time. This is faster than the native method, but still slow because it loops through all the cells that changed, one at a time.
  • 2 = Clarity custom write-back - SQL 2005: For folks using SQL 2005, a method exists, but SQL 2005 is not officially supported in Clarity 7. This option should never be used; I list it only for reference.
  • 3 = Clarity custom write-back - SQL 2008: Last, but not least, this is the option most folks should select. This performs a fast writeback by doing an update in place. It also processes the changes as a set in one operation. This should always be the fastest option.
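The delta arithmetic described above is simple to sketch (a hypothetical helper for illustration, not Clarity code):

```python
def writeback_delta(existing_rows, new_value, key):
    """Given fact rows of (key, amount) already stored for a cell, return
    the delta row to append so the cell now sums to new_value."""
    current = sum(amount for k, amount in existing_rows if k == key)
    return new_value - current

# A cell currently summing to 100 (60 + 40) is saved as 150:
# a single delta row of 50 is appended rather than rewriting history.
```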

How Do I Know What Fields To Put In The Writeback Table, In What Order?

Clarity runs the following SQL query to get the list of fields to be saved. The 'Amount' field should be first, and then each of the dimension fields returned by this query. The ordering of the dimension fields isn't important, but the names must match exactly. Also, if there are alternate hierarchies, they need a placeholder dimension column in the writeback table as well.

SELECT DimensionName
FROM Dimensions
WHERE CubeID IN (
  SELECT CubeID FROM Cubes
  WHERE LOWER(Alias) = '<cube alias from olap manager>'
)
ORDER BY DimensionID

What are the different approaches?

Everything can be handled in a single SQL stored procedure, but another approach is to set up two views that make the stored procedure much simpler: one view unioning all non-ROLAP partitions together, and one view onto the writeback table. The grouping logic can then be encapsulated in the views, which makes the stored procedure easier to build because you don't need to repeat the grouping in each part.
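As a sketch of that view-based approach (the view names are my own inventions; the fact and writeback table names are taken from the sample stored procedure below):

```sql
--One view unioning all non-ROLAP fact partitions together
CREATE VIEW vwFactNonRolap AS
SELECT [Account], [Scenario], [Amount] FROM tblfactactual
UNION ALL
SELECT [Account], [Scenario], [Amount] FROM tblfactbudget;
GO

--One grouped view onto the writeback table, encapsulating the GROUP BY
--so the stored procedure can compute deltas without repeating it
CREATE VIEW vwRolapBudgetGrouped AS
SELECT [Account], [Scenario], SUM(COALESCE([Amount], 0.0)) AS [Amount]
FROM dbo.Rolap_Budget
GROUP BY [Account], [Scenario];
GO
```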

What If I Need Help?

It is very common to need a little help implementing this feature. I recommend contacting Clear Insight ( http://www.clearinsight.ca ) for help. They can help implement this for you and have you on your way quickly.

--Sample Writeback Stored Procedure
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[stp_ClarityWB_2008]
/*This parameter must be named @WB_Amount*/
@WB_Amount <cube name>_WBTableType READONLY
AS
BEGIN
DECLARE @Message NVARCHAR(4000);
DECLARE @MemberName NVARCHAR(100);
DECLARE @Measure NVARCHAR(100);

/*CONFIGURABLE TRANSACTION ISOLATION LEVEL*/
SET TRANSACTION ISOLATION LEVEL READ COMMITTED
BEGIN TRANSACTION

--Superset WB table for all measure groups
CREATE TABLE #WB (
    [Amount] [float] NULL,
    [Account] [varchar](80) NULL,
    [Scenario] [varchar](80) NULL,
    [Measure] [varchar](80) NULL
)
CREATE UNIQUE CLUSTERED INDEX [IX_#WB] ON [#WB] (
    [Account] ASC,
    [Scenario] ASC,
    [Measure] ASC
)

--Insert transaction data into the temporary table
INSERT INTO #WB ([Amount],[Account],[Scenario],[Measure])
SELECT SUM(COALESCE([Amount],0.0)), [Account], [Scenario], [Measure]
FROM @WB_Amount
GROUP BY [Account],[Scenario],[Measure]

--Determine the measure group
SELECT TOP 1 @Measure=[Measure] FROM #WB

IF (@Measure='Amount') BEGIN
    --Perform writeback for measure group Amount
    --Ensure only one measure from the measure group is being written to
    SET @MemberName=''
    SELECT TOP 1 @MemberName=[Measure] FROM #WB WHERE [Measure]<>'Amount'
    IF (@@ROWCOUNT>0) BEGIN
        SELECT @Message = N'Incorrect Measure - ' + @MemberName + '; Cannot write to multiple Measures from same dimension';
        ROLLBACK TRANSACTION
        RAISERROR (@Message,16,1);
        RETURN;
    END

    --Validate the Account dimension: it must exist in the dim table and be a leaf member
    SET @MemberName=''
    SELECT @MemberName=WB.[Account] FROM #WB WB
    WHERE WB.[Account] NOT IN (SELECT [Level0_MemberId] FROM tbldimaccount)
    --Check for unknown member values
    IF (@@ROWCOUNT>0) BEGIN
        SELECT @Message = N'Cannot write data to an Undefined, All or Parent Account (' + @MemberName + ')';
        ROLLBACK TRANSACTION
        RAISERROR (@Message,16,1);
        RETURN;
    END

    --Validate the Scenario dimension: it must exist in the dim table and be a leaf member
    SET @MemberName=''
    SELECT @MemberName=WB.[Scenario] FROM #WB WB
    WHERE WB.[Scenario] NOT IN (SELECT [Level0_MemberId] FROM tbldimscenario)
    --Check for unknown member values
    IF (@@ROWCOUNT>0) BEGIN
        SELECT @Message = N'Cannot write data to an Undefined, All or Parent Scenario (' + @MemberName + ')';
        ROLLBACK TRANSACTION
        RAISERROR (@Message,16,1);
        RETURN;
    END

    --Delete records from the writeback table that will be updated by Clarity
    DELETE FROM dbo.Rolap_Budget
    WHERE EXISTS (
        SELECT *
        FROM #WB WB
        WHERE dbo.Rolap_Budget.[Account]=WB.[Account]
          AND dbo.Rolap_Budget.[Scenario]=WB.[Scenario])

    --Insert the deltas for the Clarity update into the WB table
    INSERT INTO Rolap_Budget ([Amount],[Account],[Scenario],sourcedetail,dtstamp)
    SELECT SUM(WB.[Amount]) - SUM(COALESCE(FactGroup.[Amount],0.0)),
           WB.[Account], WB.[Scenario], SYSTEM_USER, GETDATE()
    FROM #WB WB
    LEFT OUTER JOIN (
        SELECT fact.[Account], fact.[Scenario], SUM(COALESCE(fact.[Amount],0.0)) AS [Amount]
        FROM (
            SELECT * FROM tblfactactual
            UNION ALL
            SELECT * FROM tblfactbudget
        ) fact
        GROUP BY fact.[Account], fact.[Scenario]
    ) FactGroup
      ON WB.[Account]=FactGroup.[Account] AND WB.[Scenario]=FactGroup.[Scenario]
    GROUP BY WB.[Account], WB.[Scenario]
    HAVING SUM(WB.[Amount]) - SUM(COALESCE(FactGroup.[Amount],0.0)) <> 0
END
ELSE IF (@Measure='Count') BEGIN
    --Perform writeback for measure group Count
    --Ensure only one measure from the measure group is being written to
    SET @MemberName=''
    SELECT TOP 1 @MemberName=[Measure] FROM #WB WHERE [Measure]<>'Count'
    IF (@@ROWCOUNT>0) BEGIN
        SELECT @Message = N'Incorrect Measure - ' + @MemberName + '; Cannot write to multiple Measures from same dimension';
        ROLLBACK TRANSACTION
        RAISERROR (@Message,16,1);
        RETURN;
    END

    --Validate the Account dimension: it must exist in the dim table and be a leaf member
    SET @MemberName=''
    SELECT @MemberName=WB.[Account] FROM #WB WB
    WHERE WB.[Account] NOT IN (SELECT [Level0_MemberId] FROM tbldimaccount)
    --Check for unknown member values
    IF (@@ROWCOUNT>0) BEGIN
        SELECT @Message = N'Cannot write data to an Undefined, All or Parent Account (' + @MemberName + ')';
        ROLLBACK TRANSACTION
        RAISERROR (@Message,16,1);
        RETURN;
    END

    --Delete records from the Count writeback table that will be updated by Clarity
    DELETE FROM dbo.Rolap_Budget_Count
    WHERE EXISTS (
        SELECT * FROM #WB WB
        WHERE dbo.Rolap_Budget_Count.[Account]=WB.[Account])

    --Insert the deltas for the Clarity update into the WB table
    INSERT INTO Rolap_Budget_Count ([Amount],[Account],sourcedetail,dtstamp)
    SELECT SUM(WB.[Amount]) - SUM(COALESCE(FactGroup.[Amount],0.0)),
           WB.[Account], SYSTEM_USER, GETDATE()
    FROM #WB WB
    LEFT OUTER JOIN (
        SELECT fact.[Account], SUM(COALESCE(fact.[Amount],0.0)) AS [Amount]
        FROM tblfactactualcount fact
        GROUP BY fact.[Account]
    ) FactGroup
      ON WB.[Account]=FactGroup.[Account]
    GROUP BY WB.[Account]
    HAVING SUM(WB.[Amount]) - SUM(COALESCE(FactGroup.[Amount],0.0)) <> 0
END
ELSE BEGIN
    SELECT @Message = N'Measure "' + @Measure + '" not defined.';
    ROLLBACK TRANSACTION
    RAISERROR (@Message,16,1);
    RETURN;
END

--Drop the stored proc temp table
DROP TABLE #WB
COMMIT TRANSACTION
END



Thursday, September 12, 2013

What If Modelling in Clarity


What If Modelling is a great feature of Clarity. The challenge is in the setup: each cube is different, and each team has a different type of modelling they want to do. At a high level, what if modelling enables power users to copy a (scenario, year) combination into another (scenario, year) combination and apply assumptions to it. "What If" revenue doubles? "What If" expenses go down by 50%? "What If" salaries go up by 50%? For companies wanting an optimistic and pessimistic version of their Forecast or Budget, What If analysis is a great fit. With the touch of a button, optimistic and pessimistic versions can be updated from the latest Forecast or Budget, with the same assumptions applied on top of those most recent numbers, providing a true comparison between these scenarios.
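At its core, the copy-and-apply-assumptions step can be sketched like this (a toy model, not Clarity's implementation; the scenario and account names are made up):

```python
def what_if_copy(cube, source, target, assumptions):
    """Copy every cell from the source (scenario, year) pair to the target
    pair, scaling accounts listed in `assumptions` by the given factor."""
    src_scenario, src_year = source
    tgt_scenario, tgt_year = target
    for (scenario, year, account), value in list(cube.items()):
        if (scenario, year) == (src_scenario, src_year):
            factor = assumptions.get(account, 1.0)
            cube[(tgt_scenario, tgt_year, account)] = value * factor

# Toy cube keyed by (scenario, year, account)
cube = {
    ("Forecast", 2013, "Revenue"): 100.0,
    ("Forecast", 2013, "Expenses"): 50.0,
}
# "What If" revenue doubles and expenses are halved?
what_if_copy(cube, ("Forecast", 2013), ("Optimistic", 2013),
             {"Revenue": 2.0, "Expenses": 0.5})
```

Rerunning the copy against the latest Forecast refreshes the Optimistic scenario with the same assumptions, which is what makes the comparison between scenarios meaningful.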

Line Item Detail in Clarity


Line Item Detail is a powerful feature, but not often used in Clarity.  Line Item Detail allows users to enter detail below the lowest grain of the cube.  The most common use is for budget planning. Most cubes have an Account dimension that matches their chart of accounts. Some users may know that they have a few separate items they want to track, that all rollup into this account. For example, let's say we are planning insurance costs for an office, and there is only one Insurance account in the chart of accounts. By using Line Item Detail, a user could enter a line item for each type of insurance on the template. They get the detail they need for tracking, and the cube does not need to be changed. This can provide great flexibility when we just need a bit more detail on a given template.
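Conceptually (a toy illustration, not Clarity's storage model; the account and item names are made up), the line items live below the cube's grain and only their total lands in the account cell:

```python
# Hypothetical line items entered below the grain of the Account dimension
line_items = {
    "Insurance": [
        ("Property insurance", 12000.0),
        ("Liability insurance", 8000.0),
        ("Vehicle insurance", 4500.0),
    ],
}

def cell_value(account):
    """Only the rolled-up total of the line items is stored in the cube cell."""
    return sum(amount for _, amount in line_items.get(account, []))
```

The user keeps the per-item detail for tracking, while the cube's Account dimension stays unchanged.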

Where is the documentation for my version of Clarity?

The IBM website can be a bit intimidating due to the size and quantity of products they support. How can we find documentation for our specific version of Clarity? Here are some steps to find the documentation for your version of Clarity. Some versions do not have full documentation. In that case, please try to find the documentation for the next lowest version. Here is a link directly to this section on the IBM site for Clarity 7.2.1FP2. If the link is broken or a newer release is available, you will need to follow the steps below to find the documentation.

Clarity 7.2.1 FP2 Documentation http://www-01.ibm.com/support/docview.wss?uid=swg27024452

Step 1 - IBM.com
From IBM.com, select the 'Support & downloads' menu and then the 'Documentation' link under Technical Support.

Step 2 - Search for 'Clarity'
On the left side of the screen, enter 'Clarity' into the Quick find search box, and press enter.

Step 3 - Add your version of Clarity
By first adding a checkmark beside Clarity, you will be able to select your particular version of Clarity and the operating system. I recommend selecting just the product version and leaving the operating system selection blank. Once you make the selection and move it to the right, you can click the Finish button.

Step 4 - Click on 'Clarity ... Documentation'
Now you should be able to see a link for the documentation for your version. On this page, you will find the server admin, server install, end user and reference guides for all Clarity products at this version.