Wednesday, December 29, 2010

Sybase SQL Anywhere Database Tracing and Profiling

While looking for a way to trace the incoming statements to one of our Sybase SQL Anywhere databases, I came across this really good post which helped me a lot. Sharing it for your reference.
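In short, the approach uses SQL Anywhere's built-in request logging. A rough sketch, assuming a connection with DBA authority (the option names below are from the request-logging feature and may vary between SQL Anywhere versions):

```sql
-- Write the SQL text of incoming statements to a log file on the server
CALL sa_server_option('RequestLogFile', 'c:\\requests.txt');
CALL sa_server_option('RequestLogging', 'SQL');

-- Turn request logging off again when finished
CALL sa_server_option('RequestLogging', 'NONE');
```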

To the Post

Wednesday, November 24, 2010

Integrating SharePoint and Silverlight 3

I'm working on developing Silverlight 3 web parts for SharePoint and found this article really interesting:

Link to Article


Tuesday, November 23, 2010

Install or Copy Assembly / dll into GAC in Vista

While doing some SharePoint work, I needed SharePoint to find a DLL used by one of my web parts, so I decided to install it in the Global Assembly Cache of my Vista PC. I went to the %WINDOWS%\assembly folder and simply dragged the DLL there, but it failed with an "Access Denied" error. Then I did what I always do with this sort of permission error: I closed Explorer and re-opened it as Administrator. Still no luck. I also tried to find gacutil.exe, but couldn't locate it in my .NET Framework folders either.

I had to do a bit of googling and found the following approach in a forum. You can use the "Microsoft .NET Framework 2.0 Configuration" utility, which lets you install/uninstall assemblies from the GAC easily. You can find this utility in the following location:
"C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\mscorcfg.msc"

When you open the utility, there is a node called My Computer in the left-hand pane. Expand it and click on the sub-node "Assembly Cache". This will give you two options in the right-hand pane, "View List of Assemblies...." and "Add an Assembly to ...". You can use the second option to add the assembly to the GAC.
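Alternatively, if you have a Windows SDK installed, gacutil.exe usually ships with it and can do the same from an elevated command prompt. The SDK path below is only an example (it varies by SDK version), and MyWebPart.dll is a placeholder for your assembly:

```
"C:\Program Files\Microsoft SDKs\Windows\v6.0A\Bin\gacutil.exe" /i MyWebPart.dll
```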

Hope this helps,

Wednesday, November 17, 2010

Import, Export, Transfer Multiple Tables with Pentaho Kettle

While working with Pentaho on our latest requirement, migrating a Sybase database to PostgreSQL, I had to find a way to import/export all the tables from Sybase to Postgres quickly and with minimum effort. If it were SQL Server, I could have used the simple Import/Export wizard and it would have been a piece of cake. But Pentaho has a similar approach, and here's how to do it:

Step 1: Create a Job in Kettle

You can do this simply by going to File --> New --> Job.

Step 2: Create Data Connections

You need to create the source and destination database connections. You can do this in the View tab, by right-clicking the Database Connections node and selecting New.

Step 3: Open the Wizard to Create the Job

Once the source and destination connections are in place, it's time to create the job that will actually copy the data. Go to Tools --> Wizards --> Copy Tables. In the first view, select the source and the destination connections and press Next. Then select the tables from the source connection that should be copied to the destination. In the third step of the wizard, provide details of where the job and the relevant transformations should be saved. Press Finish, and it will create a new job with all the create scripts and transformations.

Things to consider,

1. The wizard will not create Create Scripts for tables that already exist in the destination database.
2. Depending on the destination database product (SQL Server, PostgreSQL, etc.), some things from the source database might be missing or mistranslated, e.g., default values, or identity columns created as ordinary numeric columns. In such cases, it is better to create the destination tables manually.

There might be more considerations that should be taken care of, but these are what I came across.

Hope this helps!

Configure email in JasperServer with SMTP Authentication

This is an issue I came across while configuring JasperServer for emailing reports. Our mail server uses SMTP authentication, but the general email configuration of JasperServer doesn't provide a way to set this property. It can, however, be set in a different location.

In this case you need to modify a file called "applicationContext-report-scheduling.xml". This file can be found in the <Tomcat installation folder>\webapps\jasperserver\WEB-INF folder. You need to find the following tag: <bean id="reportSchedulerMailSender"...

and put the following property inside it:

<property name="javaMailProperties">
    <props>
        <prop key="mail.smtp.auth">true</prop>
    </props>
</property>

Restart Tomcat and things should work fine. Refer to the following forum thread, where I found the answer:

Forum Thread

Monday, November 15, 2010

Transfer / Load Data to PostgreSQL using SQL Server, SSIS

SSIS can be used to load data into Postgres databases, but you need an OLE DB driver for that. There is an OLE DB driver provided by Postgres, but it didn't work for me; maybe I need to investigate further why. In the meantime, there is a native OLE DB driver for Postgres that works really well, though it is not free. You can download a demo version of the driver from here:

PostgreSQL Native OLEDB Provider (PGNP)

Once you install the driver, you can use it in SSIS, SQL Server, etc. as an ordinary OLE DB provider to load data directly into Postgres tables.


Extracting data from Sybase SQL Anywhere using SSIS through ODBC

Recently I had to evaluate a few products for a data migration from Sybase SQL Anywhere 10 database to PostgreSQL database. SSIS was one of the tools. This post explains what I did to extract data from the Sybase database.

Step 1: Create a DSN

The first thing I did was to install SQL Anywhere on my computer so that it installs all the ODBC and OLE DB drivers. In my case, I used the ODBC driver for SQL Anywhere 10 and created a system DSN with it, using the following values for the connection properties:

In ODBC Configuration for SQL Anywhere 10,
Under ODBC Tab,
I have given a name for my Connection in "Data Source Name" field.

Under Login Tab,
I have selected the "Supply User ID and Password" option and entered the credentials there.

Under the Database Tab,
Server name: The Name of the Sybase Server you are connecting to
Database name: The Name of the database.

And finally under Network Tab,
I have selected the TCP/IP checkbox and entered the following details in the textbox next to it:
HOST=<IP address of the server>;PORT=<port the Sybase server listens on; default 2638>

Go back to the ODBC tab and click on the "Test Connection" button, which should show a success message box if everything went well.
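For reference, the same settings can also be expressed as a DSN-less ODBC connection string. This is a sketch; the keywords follow the SQL Anywhere 10 driver, and all the values are placeholders:

```
Driver={SQL Anywhere 10};UID=myuser;PWD=mypassword;ServerName=myserver;DatabaseName=mydb;CommLinks=tcpip(HOST=192.168.1.10;PORT=2638)
```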

Step 2: Create a Connection in SSIS

In SSIS, right-click in the Connection Managers section and select New ADO.NET Connection. In the Connection Manager window, you need to select "Odbc Data Provider" as the provider. Then select the system DSN you created in the "Use user or system data source name" combo box. Test the connection to ensure that it is working.

Step 3: Configure the Data Reader Source

Inside a Data Flow task, drag and drop a DataReader Source and open its properties window.

Under the Connection Managers Tab,
Select the ADO.NET connection manager we just created from the Connection Manager combo box.

Under the Component Properties Tab,
Under Custom Properties, type the SELECT query into the text box next to the "SQL Command" field.

Under the Column Mappings Tab,
Check whether the columns selected in the SELECT query are correctly mapped to the output columns. Generally these two should be identical. Press OK to close the window.

OK, now you are ready to extract data from a SQL Anywhere table. You can connect the data flow to other transformations/destinations.


Monday, November 08, 2010

Rebuild and Reorganize Indexes in SQL Server

Over time, as more and more data is inserted into a table, its indexes tend to fragment. This can critically affect query performance if not monitored and taken care of. Rebuilding the indexes from time to time will prevent this from happening. Here's how to rebuild indexes in SQL Server 2005 using T-SQL.

Rebuilding all the indexes in a table
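For example, with a hypothetical table name:

```sql
-- Rebuild every index on the table (drops and recreates the index structures)
ALTER INDEX ALL ON dbo.MyTable REBUILD;
```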


Rebuilding a specific index
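Again with hypothetical table and index names:

```sql
-- Rebuild a single, named index on the table
ALTER INDEX IX_MyTable_MyColumn ON dbo.MyTable REBUILD;
```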

Alternatively, when an index is fragmented, but not very much, you can reorganize it instead of rebuilding it. Reorganizing an index is not as costly as rebuilding, and it does not need the table to be locked during the process the way a rebuild does. Here's the T-SQL for reorganizing an index.
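For example, with the same hypothetical names:

```sql
-- Reorganize (defragment the leaf level in place); lighter than a rebuild
ALTER INDEX IX_MyTable_MyColumn ON dbo.MyTable REORGANIZE;
```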


Tuesday, November 02, 2010

SharePoint and JasperServer Integration

It is true that JasperServer can be a real pain when it comes to maintaining all the reporting schedules, especially when a particular schedule fails and you want to re-run it. Yes, there isn't a way to re-run a schedule: you either have to edit the existing schedule so that it will run again, or create a Run Now schedule with all the details copied from the old schedule.
But luckily, JasperServer comes with web services: notably the "Repository" web service, to administer the resources in the JasperServer repository, and the "ReportScheduler" web service, to work with all the scheduled jobs in JasperServer. The WSDLs for these web services can be found in the following locations, respectively:


I created a custom web part for SharePoint that has two main interfaces. The first lists all the schedules currently on JasperServer; the second, shown once you click on a particular schedule, displays the details of that schedule along with a button to re-run the job with one click.

This is simple. The ReportScheduler web service has several methods which together achieve this. The main method used in the web service to create a scheduled job is scheduleJob(). It requires an object of type Job as its input parameter. A Job object contains all the details of a JasperServer scheduled job, such as reportUnitURI, baseOutputName, Email, etc.

So, how do you re-run an existing schedule?

First you need to get the job ID of the scheduled job you want to re-run, and use the getJob() method, which takes the job ID as its input parameter. This method returns a Job object representing the scheduled job with the given ID. You then take a copy of this Job object, but with a small change. There are two types of triggers you can create: a Calendar Trigger and a Simple Trigger. Create a new object of type Simple Trigger and set its Start Date/Time and End Date/Time to null. Then set this simple trigger as the copied Job object's trigger. Now you have a Job object with a simple trigger that has null start and end dates.
Now you can call scheduleJob() and pass your Job object to it, which will create a new Run Now job with all the details of the old job. You can add a button and perform this job creation in its event handler.
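In code, the whole flow looks roughly like this. This is only a sketch in Java against the client stubs generated from the ReportScheduler WSDL; the class and method names follow the description above and may differ slightly in your generated client:

```java
// Sketch: re-run an existing JasperServer scheduled job by its id.
// 'scheduler' is a client stub generated from the ReportScheduler WSDL.
Job job = scheduler.getJob(jobId);          // fetch the existing job definition

SimpleTrigger runNow = new SimpleTrigger(); // trigger with no start/end dates
runNow.setStartDate(null);                  // null start/end => run immediately
runNow.setEndDate(null);

job.setId(0);                               // assumption: let the server assign a new id
job.setSimpleTrigger(runNow);               // replace the old trigger

scheduler.scheduleJob(job);                 // creates a new "Run Now" job
```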
This is just an overview of what I did to implement re-run functionality on SharePoint, which is not provided out of the box by JasperServer. Hope this helps, and below are some screenshots of my simple web part.


Thursday, July 01, 2010

Repository Migration from JasperServer Professional (Pro) to Community Edition (CE)

Recently I had to go through a painful server migration for JasperServer. Why was it painful? Because I had to shift from a Pro version to the free Community version. The free Community version lacks several features, so restoring a repository export taken from a Pro version is simply a nightmare.
At some point, after getting hundreds of errors, I had almost given up and was going to manually upload all the reports and data sources and recreate the schedules. But bingo, something came to mind: instead of taking a full export from the Pro version, why not take several exports of only the reporting folders? Yes, that worked out. It is not as easy as a complete export, but much, much easier than doing everything manually. I took several exports, especially of reporting schedules, avoiding resources that are not supported in the Community edition, such as analysis views, etc. After that I imported these individually into my Community edition, and it works smoothly.
So the trick is: if you ever want to transfer resources from a Pro version to a Community version of JasperServer, take several exports/backups from Pro, avoiding resources and features that are only supported in Pro. It will save your day!