Top 10 factors to consider while implementing Tableau solution!


Assuming you found answers to the questions in my earlier blog and are now implementing a Tableau cluster as your enterprise reporting solution, I would like to explore a few more aspects of Tableau.

I am writing this blog to share, in brief, experiences we learned the hard way. Most of them are specific to Tableau, but some are generic.

New Tableau developers or architects tend to make common mistakes that may prove costly over time. If your reporting framework troubles you despite using Tableau, then something is likely wrong with its implementation.

Equations Work has rich experience with Tableau implementations for large enterprises; for more information, please contact us here.


Licensing

You must be aware of the number of users and licenses you are going to need, not only short term but also long term. A very common mistake is adopting Tableau with limited or no understanding of this. Tableau typically sells named-user and core licenses. If you are building a system where you intend to publish reports to end users of your application, then under the named-user model you must purchase a named license for every user who views a Tableau report. Named licenses are flexible, and you can use them effectively with a smaller user base. For a large user base a core license makes more sense, but it limits your processing power and it is expensive.

Handling large data

Processing large amounts of data is a challenge for every tool. Connecting Tableau directly to your large time-series table or collection is the wrong approach! You should always compute a summary first and use that in Tableau Desktop when creating reports. Choosing between Live and Extract mode is another key decision for performance.
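As a minimal sketch of the idea (pure Python here for illustration; in practice the summary would be computed by your database or ETL job), fine-grained time-series rows can be rolled up into daily aggregates before Tableau ever sees them:

```python
from collections import defaultdict
from datetime import datetime

# Raw, fine-grained time-series rows: (timestamp, value).
raw = [
    ("2023-01-01T09:00:00", 10.0),
    ("2023-01-01T17:30:00", 30.0),
    ("2023-01-02T11:15:00", 20.0),
]

# Roll up to one row per day: count, sum, average.
daily = defaultdict(list)
for ts, value in raw:
    day = datetime.fromisoformat(ts).date().isoformat()
    daily[day].append(value)

summary = {day: {"count": len(v), "sum": sum(v), "avg": sum(v) / len(v)}
           for day, v in daily.items()}
print(summary["2023-01-01"])  # {'count': 2, 'sum': 40.0, 'avg': 20.0}
```

Tableau then connects to the small `summary` table instead of scanning millions of raw rows on every render.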

Limit data points

Knowing how much data you need to visualize is a critical investigation before you choose a reporting tool. The key is to render reports from data that is as summarized as possible. Avoid visualizing thousands of data points on a report: when you apply formulas to the report or add multiple measures, the report ends up doing multiplied processing (like a Cartesian join in SQL).
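One simple way to cap the number of marks a chart has to draw is to downsample long series to a fixed budget of points. This naive stride-based sketch illustrates the idea; production downsamplers (e.g. extrema-preserving algorithms) are smarter, but the budget principle is the same:

```python
def downsample(points, max_points):
    """Keep at most max_points samples by taking every n-th point.
    Naive stride sampling, just to illustrate capping rendered marks."""
    if len(points) <= max_points:
        return list(points)
    stride = -(-len(points) // max_points)  # ceiling division
    return points[::stride]

series = list(range(10_000))        # 10,000 raw data points
plotted = downsample(series, 500)   # budget of 500 marks
print(len(plotted))                 # 500
```

A chart with 500 marks renders and filters far faster than one with 10,000, and the visual difference is usually negligible at screen resolution.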

Connecting to NoSQL

Tableau can connect to almost all NoSQL databases; you need an ODBC driver to be able to read from them. Simba is one of the market leaders in third-party bi-directional ODBC connectors. When using such drivers, you need to ensure that the volume of data you process through them stays limited to achieve better performance. You can do this by connecting them to summarized collections.
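Outside Tableau itself, the same ODBC driver can be exercised from code, e.g. with pyodbc. The sketch below only assembles the connection string; the driver name, host, and collection names are hypothetical, so substitute the exact name your Simba (or other) driver registers on your system:

```python
# Sketch of reaching a NoSQL store through an ODBC driver.
# Driver name and connection details are hypothetical placeholders.

def build_odbc_conn_str(driver, host, port, database):
    """Assemble a key=value ODBC connection string."""
    parts = {"Driver": "{%s}" % driver, "Host": host,
             "Port": str(port), "Database": database}
    return ";".join("%s=%s" % (k, v) for k, v in parts.items())

conn_str = build_odbc_conn_str(
    "Simba MongoDB ODBC Driver",   # hypothetical driver name
    "nosql.example.com", 27017, "analytics")
print(conn_str)

# With the driver installed, you would connect and query a *summarized*
# collection rather than the raw one, e.g.:
#   import pyodbc
#   conn = pyodbc.connect(conn_str)
#   rows = conn.cursor().execute("SELECT * FROM daily_summary").fetchall()
```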

Server configuration

Once you understand the data requirements, you need to choose the deployment strategy carefully. Whether you need one fat server, a 3-node cluster, or a 5-node cluster is a very important decision because it directly impacts your budget too. It also requires a good understanding of the core Tableau components, because how you deploy and separate them depends on the nature of your application, e.g. how you distribute the data server, web application, and repository across the cluster is a key decision.
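On recent Tableau Server versions, cluster topology is managed with the `tsm` command line. A sketch of shifting background processing onto a dedicated node might look like the following; the node name and process count are illustrative, and commands can change between releases, so check the documentation for your version:

```shell
# Inspect the current cluster layout.
tsm topology list-nodes -v

# Example: run two backgrounder processes on node2 (illustrative values).
tsm topology set-process --node node2 --process backgrounder --count 2

# Review and apply the pending topology changes (triggers a restart).
tsm pending-changes list
tsm pending-changes apply
```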

Trusted Connections

Secure access is an inevitable requirement of any enterprise, and Tableau offers a very good way of authenticating requests with its Trusted Authentication feature. The key is to configure it such that you won't end up reconfiguring it again and again as your company's network policies change. Understanding which IP to register, and configuring it in an environment behind a UTM or across different subnets, may be a bit tricky. Instead of manual configuration, you should always write a warm-up script that identifies and registers the relevant IP addresses with the Tableau Server.
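The handshake itself is simple: your trusted web server POSTs a username to Tableau Server's /trusted endpoint and receives a ticket back (or "-1" on failure). The sketch below only builds the request, so it can be inspected without a live server; host and user names are placeholders:

```python
from urllib.parse import urlencode

# Sketch of a Trusted Authentication ticket request. Host and user names
# below are placeholders for your environment.

def build_trusted_request(tableau_server, username, client_ip=None):
    """Return the URL and form-encoded body for a trusted-ticket request."""
    params = {"username": username}
    if client_ip:  # only needed when client-IP matching is enabled server-side
        params["client_ip"] = client_ip
    return "http://%s/trusted" % tableau_server, urlencode(params)

url, body = build_trusted_request("tableau.internal.example", "report_viewer")
print(url)   # http://tableau.internal.example/trusted
print(body)  # username=report_viewer

# To actually fetch a ticket, POST `body` to `url` (for example with
# urllib.request.urlopen(url, data=body.encode())), then embed the returned
# ticket in the view URL: /trusted/<ticket>/views/<workbook>/<view>
```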

Data accessibility

While working with one of the large investment banks, I realized that you just can't take database access for granted, even when you are accessing it with a tool like Tableau inside their secure ecosystem. Tableau has a solution for such scenarios: the Tableau API. You can write programs in languages like Python or Java to fetch data through an API and generate a TDE (Tableau Data Extract) that you can host on the Tableau Server. The downsides are the performance impact with large data and the need for an API as a prerequisite.
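A rough sketch of that pipeline is below: fetch rows from the data service, stage them in tabular form, then hand them to the extract builder. The payload here is a stand-in for a real API response; converting the staged rows into an actual .tde/.hyper file is done with Tableau's Extract/Hyper API (not shown), and the result is then published to Tableau Server.

```python
import csv, io, json

# Stand-in for a response from your data service's API.
api_response = json.loads('[{"symbol": "ABC", "close": 101.5},'
                          ' {"symbol": "XYZ", "close": 88.2}]')

# Stage the rows in a tabular form the extract builder can consume.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["symbol", "close"])
writer.writeheader()
writer.writerows(api_response)

staged = buffer.getvalue()
print(staged.splitlines()[0])  # symbol,close
print(len(api_response))       # 2
```

Because the extract is rebuilt by this program rather than queried live, schedule it to run as often as the data freshness requirement demands, and keep the fetched result set summarized to limit the performance cost.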

Monitor Tableau Server

For critical applications, it's very important to monitor all Tableau components running on the cluster, along with the Tableau Server admin console, which in turn covers all reports. Tools like Nagios can be configured to monitor resources. Having a DR site is advisable for mission-critical applications.
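Tableau Server exposes a machine-readable status document (systeminfo.xml on many versions; verify the endpoint and its exact schema for yours), which a Nagios-style plugin can poll. This sketch parses a simplified stand-in for that document and maps it to Nagios exit codes:

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for Tableau Server's status document; the real
# schema may differ by version, so adapt the parsing to what yours returns.
SAMPLE_STATUS = """<systeminfo>
  <machines>
    <machine name="node1">
      <repository status="Active"/>
      <vizqlserver status="Active"/>
    </machine>
  </machines>
  <service status="Active"/>
</systeminfo>"""

def check_status(xml_text):
    """Return a Nagios exit code: 0 = OK, 2 = CRITICAL."""
    root = ET.fromstring(xml_text)
    service = root.find("service")
    if service is None or service.get("status") != "Active":
        return 2  # CRITICAL: overall service is down
    return 0      # OK

print(check_status(SAMPLE_STATUS))  # 0
```

In a real deployment the plugin would fetch the document over HTTP from a host whitelisted for the status page, and you would extend the check to flag individual processes (repository, VizQL, backgrounder) as WARNING or CRITICAL.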

Preparing environments

Setting up the Tableau cluster may be tedious if you are doing it for the first time, so documenting the steps is advisable, and scripting them so you can reuse them is even better. If you are using the cloud, you can also save an image of the initial setup, e.g. save an AMI on AWS. Having dev, stage, and prod environments is advisable to ensure that you stay aligned with your DevOps process. Three installations are allowed with a single Tableau license (this may change over time, so please verify with Tableau support when you implement).
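On AWS, saving that baseline image is a one-liner with the CLI; the instance ID and names below are placeholders for your environment:

```shell
# Snapshot the freshly configured Tableau node as a reusable AMI.
# Instance ID and names are placeholders for your environment.
aws ec2 create-image \
    --instance-id i-0123456789abcdef0 \
    --name "tableau-server-baseline" \
    --description "Tableau Server after initial setup, before data loads"
```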

Non-functional aspects

Performance, security, and scalability are essential for every business and are sometimes the very reason for Tableau adoption. Tableau supports all three of these key non-functional aspects out of the box.
