Month: September 2014

Going beyond base Tableau – Unofficial Add-ons


Advanced data visualization tools such as Tableau and Spotfire bundle capabilities such as table calculations and custom expressions, respectively, making new dimensions of viewing data possible. Further, the interactivity and customization that parameters (in Tableau) and property controls (in Spotfire) bring to dashboards set these visual analytics products apart from conventional BI query-and-analysis tools.

However, with some minor tweaks these tools can go beyond their out-of-the-box functionality. One example is to embed phpGrid in a webpage (as part of a Tableau dashboard) as a user interface that shows dynamic views depending on the data entered in the grid.

Below is a simple Gantt chart with parameters to filter the data. Interactivity is limited to filtering, and the filters only work on the data from the source file.


However, there are a couple of ways to write the parameter values back to a database. The first is to use phpGrid to interact with the database via the refresh button action. The other is to pass the parameters as part of the URL (the GET method) from the dashboard's webpage.
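The GET-method idea can be sketched in plain Python. This is only an illustration of the mechanism, not Tableau code: the `task_id`/`status` parameter names, the `update.php` URL, and the `tasks` table are all assumptions standing in for whatever the server-side script (e.g. the phpGrid page) would actually receive and update.

```python
import sqlite3
from urllib.parse import urlparse, parse_qs

# Hypothetical URL produced by the dashboard webpage; the parameter
# names are illustrative, not part of Tableau's API.
url = "http://example.com/update.php?task_id=7&status=Done"

def apply_url_params(db, url):
    """Parse GET parameters from the URL and write them back to the
    table that feeds the Gantt chart."""
    params = parse_qs(urlparse(url).query)
    task_id = int(params["task_id"][0])
    status = params["status"][0]
    db.execute("UPDATE tasks SET status = ? WHERE id = ?", (status, task_id))
    return task_id, status

# In-memory stand-in for the dashboard's backing database
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, status TEXT)")
db.execute("INSERT INTO tasks VALUES (7, 'In Progress')")
apply_url_params(db, url)
print(db.execute("SELECT status FROM tasks WHERE id = 7").fetchone()[0])  # Done
```

In the real setup this parsing-and-update step would live in the server-side page that the dashboard's URL action points at.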

Related posts by Ryan Robitaille and Russell Christopher.

Getting column values from filtered data selection


A reference snippet for fetching the values of a particular column after filtering has been applied to the data table.

from Spotfire.Dxp.Data import *

# Active data table and the rows remaining after the current filtering
dataTable = Document.ActiveDataTableReference
rows = Document.ActiveFilteringSelectionReference.GetSelection(dataTable).AsIndexSet()

# Print the formatted value of the target column for each filtered row
for row in rows:
    for column in dataTable.Columns:
        if column.Name == "colName":
            print column.RowValues.GetFormattedValue(row)

Text Analytics using Natural Language Processing


Natural Language Processing (NLP) combines artificial intelligence and machine learning techniques with linguistics to process and understand human language. Using NLP, various sources of unstructured data such as social media, call (text) logs and emails can be leveraged to extract actionable insights. Applications include text processing for information retrieval, sentiment analysis, question answering and more.

The core of the problem is that natural languages are constantly evolving, with ever-growing vocabularies. In addition, inherent aspects of language such as grammar, syntax, semantics and varied writing styles add to the complexity of analysis. It is quite challenging to arrive at definitive rules when building systems that make sense of language. As a result, a practical approach to building a parsing system should lean on application-specific techniques and the domain in context.

Some of the techniques I have used:

NLP using Natural Language Toolkit (NLTK) library from Python

Using NLTK 3.0, an open-source Python library, I was able to track the trend of a set of ailments (in the medical domain). This was achieved by counting the frequencies of the ailment terms in call (text) logs for a given time period.
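The frequency-counting step can be sketched as below. The ailment vocabulary and log lines are made-up illustrative data, and a simple regex tokenizer stands in for NLTK's `word_tokenize` (and any stemming) that the real pipeline would use:

```python
import re
from collections import Counter

# Hypothetical ailment vocabulary (illustrative only)
AILMENTS = {"migraine", "nausea", "fatigue"}

def ailment_frequencies(log_lines):
    """Count how often each ailment term appears across call (text) logs.
    In the real pipeline, NLTK's word_tokenize would replace the regex."""
    counts = Counter()
    for line in log_lines:
        for token in re.findall(r"[a-z]+", line.lower()):
            if token in AILMENTS:
                counts[token] += 1
    return counts

# Made-up call-log excerpts for a given time period
logs = [
    "Caller reported a migraine and severe nausea.",
    "Follow-up: migraine persists, fatigue noted.",
]
print(ailment_frequencies(logs))  # Counter({'migraine': 2, 'nausea': 1, 'fatigue': 1})
```

Running the same count over logs bucketed by week or month then gives the trend of each ailment over time.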

Stanford NLP

In another NLP application, I used the Stanford NLP libraries to understand customer opinion; specifically, to perform sentiment analysis on Yelp reviews.