Insights into Tableau Performance Testing


Here at Spry, Tableau is one of our go-to data visualization tools. In many cases, however, we run into performance issues when loading and filtering Tableau workbooks. Although we don't always work with very large amounts of data (~5-8M rows in a typical data set), we do use a wide variety of visualization and filter types. This can result in less than optimal performance when a particular dashboard uses complex interactions.

The Experiment

To pinpoint performance bottlenecks, we set out to benchmark Tableau. At its core, the experiment is quite simple: create a dashboard "template," use increasing sizes of data sets, use various connection types, and measure performance with Tableau's performance recording feature.

Here is our experiment setup:
  • Test workbook: one from a client that used line charts, tables, maps, dashboard filter actions, and parameter filtering
  • Data set sizes: 500,000 rows, 2 million rows, 5 million rows
  • Connection types: Hortonworks Hive, Oracle Database 11g Express Edition, Tableau Data Extract

In our testing, we measured the processing time for the following:
  • Load time of the dashboard
  • Filtering a map and removing the filter
  • Parameter filters of a text table
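
The setup above amounts to a benchmark matrix: every combination of data set size, connection type, and measured interaction. A minimal sketch of that harness in Python (the `run_action` callback is hypothetical; in practice the timings came from Tableau's performance recording feature, not an external timer):

```python
import itertools
import time

# Benchmark matrix from the experiment: data set sizes x connection types x actions.
SIZES = [500_000, 2_000_000, 5_000_000]
CONNECTIONS = ["Hortonworks Hive", "Oracle 11g XE", "Tableau Data Extract"]
ACTIONS = ["dashboard_load", "map_filter_toggle", "parameter_filter"]

def benchmark(run_action):
    """Time each action for every size/connection combination.

    `run_action(size, connection, action)` is a hypothetical callback that
    performs the measured interaction; this harness only times whatever it does.
    """
    results = {}
    for size, conn, action in itertools.product(SIZES, CONNECTIONS, ACTIONS):
        start = time.perf_counter()
        run_action(size, conn, action)
        results[(size, conn, action)] = time.perf_counter() - start
    return results

# Usage with a no-op stand-in for the real Tableau interaction:
timings = benchmark(lambda size, conn, action: None)
print(len(timings))  # 27 combinations (3 sizes x 3 connections x 3 actions)
```

Running the full matrix this way keeps the comparison fair: each connection type sees the same dashboard interactions at the same data volumes.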


The Results

Our results were not surprising: working with flat files is faster than working with a live database connection, and working with extracts is faster than working with live connections. However, an extract of a CSV (or similar) isn't always an option when working with clients. In this case, the live Oracle connection performed acceptably on Tableau Server. We are still analyzing our results and will follow up in a future blog post.
